Which ones in particular? I have an M2 Air with 8GB, and doing some RAG development locally would be fantastic. I tried running Ollama with llama3.2 and it predictably bombed.
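To make that concrete, here is a rough sketch of the kind of local RAG loop I mean, assuming the `ollama` Python package (pip install ollama), a server on the default port, and that dropping to the 1B tag (llama3.2:1b) plus a small embedding model such as nomic-embed-text leaves enough headroom in 8GB. The corpus, prompt wording, and helper names are just placeholders, not anything Ollama ships:

    import math

    import ollama  # pip install ollama; talks to the local Ollama server

    # Toy corpus standing in for whatever documents the RAG app would index.
    DOCS = [
        "Ollama exposes a local HTTP API on port 11434.",
        "The M2 MacBook Air shares 8GB of unified memory between CPU and GPU.",
        "llama3.2 is published in 1B and 3B parameter variants.",
    ]

    def embed(text):
        # nomic-embed-text is one small embedding model available through Ollama;
        # any embedding-capable model would do here.
        return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b)

    def answer(question):
        # Retrieve the single most similar document and stuff it into the prompt.
        doc_vecs = [(doc, embed(doc)) for doc in DOCS]
        q_vec = embed(question)
        context = max(doc_vecs, key=lambda dv: cosine(dv[1], q_vec))[0]
        reply = ollama.chat(
            model="llama3.2:1b",  # 1B quantized tag, to stay well under 8GB
            messages=[{
                "role": "user",
                "content": f"Use this context to answer.\n\nContext: {context}\n\nQuestion: {question}",
            }],
        )
        return reply["message"]["content"]

    print(answer("How much memory does the M2 Air have?"))

Pulling the models ahead of time (ollama pull llama3.2:1b, ollama pull nomic-embed-text) keeps the script itself trivial; a real app would swap the list comprehension for a proper vector store.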

I would have thought that llama3.2:latest would run OK with 8GB. Same for phi3.5:latest.
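A quick way to sanity-check that is to look at how big the pulled weights actually are. This sketch just asks the local server's documented /api/tags endpoint (standard library only; it assumes the default port 11434 and that both tags have already been pulled):

    import json
    import urllib.request

    # Ask the local Ollama server which models are pulled and how large they are.
    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        models = json.load(resp)["models"]

    for m in models:
        if m["name"] in ("llama3.2:latest", "phi3.5:latest"):
            print(f'{m["name"]}: {m["size"] / 2**30:.1f} GiB on disk')

If I remember right, both default tags are roughly 2GB quantized builds, which is why I'd expect them to load on an 8GB machine with headroom, as long as the context window stays modest.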