
A few days ago, I wrote about how training large language models is still prohibitively expensive, but that the cost of running them is coming down like a rocket.

Today, Jan-Keno Janssen (c’t 3003) posted a YouTube video on how to get Stanford’s Alpaca/LLaMA ChatGPT clone running locally on ordinary hardware at home. It’s in German, but you can switch on English subtitles, and there’s a companion page with the full German transcript that you can feed to Google Translate (or whatever translator you work with).

As Janssen points out, the copyright situation is murky; it also became apparent yesterday that Facebook has begun taking down LLaMA repos. But as an instant countermeasure, the creator of dalai has already announced the launch of a decentralized AI model distribution platform named GOAT.

Point being, we’re approaching Humpty Dumpty territory. Once all that stuff is in the wild and runs reasonably well on ordinary hardware, the LLM business model that sat on the wall will take a great fall. And all of OpenAI’s horses and all of Facebook’s men won’t be able to put it together again.
