
This is really cool. I wonder how long it will be until we have GPT-4-quality models that run locally (if we ever do). That would open up a lot of possibilities.



We aren't quite there yet, but the last year has been an incredibly exciting time for the open source LLM community. If your computer is decently powerful, you might be really surprised by what's already possible. LM Studio on Apple Silicon supports GGUF quants on the GPU out of the box.
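
If you'd rather skip the GUI, here's a minimal sketch of the same idea using llama-cpp-python (Python bindings for llama.cpp, which GGUF comes from and which uses Metal on Apple Silicon). The model filename is just a placeholder for whatever GGUF quant you download:

    from llama_cpp import Llama

    # Load a quantized GGUF model; n_gpu_layers=-1 offloads all layers
    # to the GPU (Metal on Apple Silicon). The path is a placeholder.
    llm = Llama(
        model_path="./mistral-7b-instruct.Q4_K_M.gguf",
        n_gpu_layers=-1,
        n_ctx=4096,
    )

    out = llm("Q: Name the planets in the solar system. A:", max_tokens=128)
    print(out["choices"][0]["text"])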





