
Do it all locally.

I wrote a blog post[1] describing what a local-only LLM could do. The answer is: quite a lot, with today's technology. The question is whether any of the tech giants actually want to build it.
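To give a concrete sense of how little plumbing "local only" needs today, here's a minimal sketch in Python. It assumes you have something like Ollama running locally (it serves an HTTP API on localhost:11434); the model name is just an example, swap in whatever you've pulled:

    import requests  # talks to the local Ollama server; nothing leaves the machine

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",  # example model name; use whatever you have locally
            "prompt": "Summarize my meeting notes in three bullet points.",
            "stream": False,    # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    print(resp.json()["response"])

That's the whole loop: a local model behind a local port, with your data never touching a cloud endpoint.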

The locally hosted scenarios are in some ways more powerful than what you can do with cloud-hosted services, and since companies could charge customers for the inference hardware instead of paying to host everything themselves, it would likely be a net win for everyone. Sadly, companies are addicted to SaaS revenue and have forgotten how to make billions by selling actual things (with the exception of Apple).

[1] https://meanderingthoughts.hashnode.dev/lets-do-some-actual-...




I didn't say it in my prior comment, but this is what I'm hoping for: that people end up caring enough for this option to "win." Evidence suggests people will take the cheaper option, though, even if all of their info ends up in the hands of advertisers or someone far more nefarious.

You mention Apple... I feel like, of the megacorps, they're the most likely to do something like that. Between the phone, AirPods, HomePod (tethered to the phone, I guess, or a newer version of the hardware), and your car with CarPlay, the hardware already exists, so someone will build a privacy-focused LLM that Apple could plug into. At the very least, Apple could justify that role by being the hardware interface between the LLM and the user if they can't build an effective LLM of their own (which seems unlikely given their track record).

If I were feeling really crazy, I'd say Apple could buy Anthropic (right, right, they don't do big acquisitions) and turn it into their privacy-focused LLM.

Now to read your blog post...



