
Why couldn't the AI be done on-device rather than server side? Isn't that where things are going with mobile hardware?


Mobile devices don’t have the memory or the memory bandwidth to run an LLM that’s big enough to be good at much, and they have tighter battery and thermal constraints on top of that.
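The bandwidth point can be made concrete with a back-of-envelope calculation: autoregressive decoding streams every weight once per generated token, so memory bandwidth caps tokens per second. A minimal sketch, where the bandwidth and model-size figures are illustrative assumptions rather than official specs:

```python
# Rough upper bound on decode speed: each generated token must read
# every weight once, so tokens/sec <= bandwidth / model size.
def max_tokens_per_sec(bandwidth_gb_s: float, model_gb: float) -> float:
    return bandwidth_gb_s / model_gb

# Illustrative, assumed figures (not official specs):
phone_bw = 50.0      # GB/s, ballpark for a recent phone SoC
laptop_bw = 100.0    # GB/s, ballpark for an M1 MacBook Air
model_7b_4bit = 3.5  # GB, ~7B params at 4-bit quantization

print(round(max_tokens_per_sec(phone_bw, model_7b_4bit), 1))   # ~14
print(round(max_tokens_per_sec(laptop_bw, model_7b_4bit), 1))  # ~29
```

Real throughput lands below this bound (compute, KV cache reads, and thermal throttling all eat into it), but it shows why bandwidth, not raw FLOPS, is the usual bottleneck on phones.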


My M1 MacBook Air can run LLMs pretty well, and it has worse specs than the latest iPad Pro (the iPhone Pro wouldn't be too far behind).


Running them is a whole lot less resource-intensive than training them.
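The gap is roughly 30x in memory alone. A sketch using common rules of thumb (the per-parameter byte counts are assumptions for illustration, not measurements of any specific model):

```python
# Rough per-parameter memory for a transformer:
#  - inference with 4-bit quantized weights: ~0.5 bytes/param
#  - training with Adam in mixed precision: ~16 bytes/param
#    (fp16 weights + fp32 master copy + two optimizer moments + grads)
def inference_gb(params_billions: float, bytes_per_param: float = 0.5) -> float:
    return params_billions * bytes_per_param

def training_gb(params_billions: float, bytes_per_param: float = 16.0) -> float:
    return params_billions * bytes_per_param

p = 7.0  # a ~7B-parameter model
print(inference_gb(p))  # 3.5 GB -- fits on a laptop or high-end phone
print(training_gb(p))   # 112.0 GB -- needs a rack of server GPUs
```

And that training figure ignores activations and batch size, so the real gap is even wider. Which is why "my laptop runs a 7B model fine" and "phones can't do the AI server-side does" are both true.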

Unless the plan was just to build a RAG (retrieval-augmented generation) source from your personal data, in which case it would be yet another underwhelming feature.
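The on-device RAG idea itself is cheap to sketch: index personal notes, retrieve the most relevant one for a query, and hand it to the model as context. A minimal toy version, with a bag-of-words cosine similarity standing in for a real embedding model (an assumption for illustration; a real system would use a learned encoder):

```python
# Toy RAG retrieval: rank personal notes by cosine similarity of
# bag-of-words vectors. A real system would use learned embeddings.
from collections import Counter
import math

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

notes = [
    "dentist appointment tuesday at 3pm",
    "wifi password is hunter2",
    "mom's birthday is june 12",
]
print(retrieve("when is the dentist appointment", notes))
# -> ['dentist appointment tuesday at 3pm']
```

The retrieval step is trivially on-device; the underwhelming part is that the quality of the final answer still depends on how good the model consuming the retrieved context is.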


I guess that will never change huh



