
You can just set up your own local OpenAI-like API endpoints for LLMs. Most devices and apps won't be able to use them, because consumers don't run self-hosted apps, but for a simple ChatGPT-style app this is totally viable. Today.
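As a minimal sketch of what that looks like: most local servers (llama.cpp's server, Ollama, vLLM, etc.) expose an OpenAI-compatible endpoint, so the standard OpenAI client just needs its base URL pointed at localhost. The URL, port, and model name below are assumptions; use whatever your local server actually serves.

    # Point the standard OpenAI client at a local, OpenAI-compatible server.
    # base_url and model are assumptions (Ollama's default shown here).
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # assumed local endpoint
        api_key="not-needed",                  # local servers typically ignore the key
    )

    response = client.chat.completions.create(
        model="llama3",  # assumed model name; substitute the model your server hosts
        messages=[{"role": "user", "content": "Hello from my self-hosted endpoint!"}],
    )
    print(response.choices[0].message.content)

The nice part of this design is that nothing in the client code knows it isn't talking to the real OpenAI API, so a simple ChatGPT-style frontend works against it unmodified.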

