So basically, it’s a poorly marketed $40 game facing a lot of free and popular competition.
For LLMs, I’ve had really good results running Llama 3 in the Open Web UI docker container on an Nvidia Titan X (12GB VRAM).
For image generation tho, I agree more VRAM is better, but the algorithms still struggle with large image dimensions, so you wind up needing to start small and iteratively upscale, which afaik works ok on weaker GPUs but will take longer. (I’ve been using the Automatic1111 mode of the Stable Diffusion Web UI docker project.)
I’m on thumbs so I don’t have the links to the git repos atm, but you basically clone them and run the docker compose files. The readmes are pretty good!
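Since I can’t grab the links right now, here’s a very rough from-memory sketch of what the Ollama + Open WebUI compose setup looks like; the image names, ports, and GPU bits may be off, so trust the actual repo’s readme over this:

```yaml
# Rough from-memory sketch, not the repo's actual compose file.
# Assumes the NVIDIA container toolkit is installed on the host.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama        # model storage persists here
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia        # hand the GPU to the container
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                 # UI should end up on http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```

Then `docker compose up -d`, `docker compose exec ollama ollama pull llama3`, and you should be set. The Stable Diffusion one is similar, except iirc it uses compose profiles, so it’s something like `docker compose --profile auto up --build` for the Automatic1111 UI.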
The implication of “leave a review!” is that they want quality feedback to improve the service; the twist is that they don’t care about that at all, they just want information about you for ad targeting.
Some servers have a c/NoStupidQuestions
I cried watching Measure of a Man. It’s a gift to humanity. But Ad Astra Per Aspera clearly surpasses it imo. It doesn’t just ask about the implications of the law, but about what the law, society, and its constituents should aspire to be. It’s less about military codes and an individual’s selfishness (Maddox), and it tells a more universally applicable story that’s still firmly entrenched in Trek lore.
Jerboa’s been working for me. I wonder if it’s a background battery / processing permission issue.
Can we call communities “lemlets”?
As a longtime Plex user, I also hate their lack of focus and tendency to prioritize bad features (like paid streaming and VR). But this one feels more like a way to re-focus on video by removing photo code from the main (video) app’s codebase, making it easier to maintain.