Feels like a business opportunity to me
This is the way
You forgot poor
I feel like that’s a way to rapidly run out of spare universes
There are others above who provide warnings against bypassing paywalls (⌐■_■)
Check protondb. It sometimes has workarounds for launcher issues.
You misspelled infinifactory 🫡
This was new to me. Thanks!
Thank you for the world’s best editor, Bram. :x!
Very good to know, thanks!
I can see most individuals and SMBs going with specialist “good enough” models which they can run on-prem or locally, leaving the truly huge systems to those with compute to spare. The security model for these MaaS systems is pretty much “trust me bro”. A lot of companies will not want to, or be able to, trust such a system. PI/CID can not be left in the hands of the AI-as-a-service company. They will have to either go on-prem or stand up their own models in their private cloud. Again, this limits model size, available compute, etc. for those orgs. This points to using available models, optimised, etc. OSS FTW (I hope)
Given the pace of OSS optimisation, I fully expect the requirements for a GPT-3.5-equivalent model to be much lower in the coming year. The biggest issues right now are around training or fine-tuning; inference is cheaper, resource-wise. For truly large models, the moat is most definitely GPU compute and power constraints. Those who own their own GPU farms will be at an advantage until there is a significant increase in cloud GPU capacity - right now, cloud GPUs are at a premium, and can also involve wait times for access. I don’t expect this to change in the next year or two.
Tl;dr: the moat is real, but it’s GPU and power constraints.
Thanks for the link!
Fully agree. Secrecy in content moderation harbours agendas.
Thanks for the summary!
Looks like your ssh examples are [email protected].
Sounds like a great resource for game devs.
I game on Linux. Go check protondb for compatibility with your favourite game