

Alibaba’s QwQ 32B is already incredible, and runnable on 16GB GPUs! Honestly it’s a bigger deal than DeepSeek R1, and so were many open models before it; they just didn’t get the finance-media attention DeepSeek did. And they’re releasing a new series this month.
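For anyone curious what “runnable on 16GB” actually looks like, here’s a rough sketch using transformers with 4-bit bitsandbytes quantization. The model id `Qwen/QwQ-32B` is the real repo, but the quant settings are just my assumptions; in practice most people grab a ~3-4 bit GGUF and run it through llama.cpp, with a few layers offloaded to CPU if needed.

```python
# Sketch: QwQ 32B with 4-bit (NF4) quantization via transformers + bitsandbytes.
# Note: 32B weights at 4-bit are ~18GB, so device_map="auto" may spill some
# layers to CPU RAM on a 16GB card; a lower-bit GGUF avoids that.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

quant_cfg = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained("Qwen/QwQ-32B")
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/QwQ-32B",
    quantization_config=quant_cfg,
    device_map="auto",  # split across GPU and CPU if the weights don't fit
)

messages = [{"role": "user", "content": "Explain BitNet in one paragraph."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```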
Microsoft just released a 2B BitNet model today! And that’s from their paltry, underfunded research division, not the one training “usable” models: https://huggingface.co/microsoft/bitnet-b1.58-2B-4T
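I haven’t run it myself, but loading that BitNet checkpoint looks roughly like the sketch below. Big caveat (my understanding, not verified): the actual 1.58-bit speed and memory wins need Microsoft’s dedicated bitnet.cpp runtime, and the model card may require a specific transformers version; a plain transformers load mostly just proves the weights work.

```python
# Sketch: loading the 2B BitNet model linked above with plain transformers.
# Check the model card for the required transformers version and for
# bitnet.cpp, which is what delivers the real efficiency gains.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/bitnet-b1.58-2B-4T"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "What is 1.58-bit quantization?"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```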
Local, efficient ML is coming. That’s why Altman and everyone are lying through their teeth: scaling up infinitely is not the way forward. It never was.
Another thing: people focus on the “AI vs anti-AI” argument and overlook the “open source vs closed corporate AI” war going on.
There’s a very narrow window to solidify “personal” AI before the giants capture the market and snuff everything else out. Its future is either useful tools you run on your phone/PC (or maybe in P2P swarms or among highly competitive API hosts), or it’s what you described: shitty, unethical, corporate UIs that ruin everything.
Lemmy vs Reddit is an apt analogy (and so is the way simply being ‘anti-Reddit’ obscures that distinction).
It’s why, to be blunt, the broad liberal “anti-AI” stance really annoys me. It feels like everyone shooting themselves in the foot.