Hacker News

Interesting. It sounds like using local LLMs (via vLLM, Ollama, etc.) with decent agentic capability might be starting to become a reality.
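As a rough sketch of what "using a local LLM" looks like in practice: Ollama serves models over a local HTTP API on port 11434. The model name ("llama3") and the assumption that a server is running are both placeholders here, not something from this thread.

```python
# Minimal sketch: query a model served locally by Ollama's HTTP API.
# Assumes Ollama is running on its default port (11434) and that a
# model such as "llama3" has already been pulled -- both assumptions.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # /api/generate takes a model name and a prompt;
    # stream=False asks for a single JSON response.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(model: str, prompt: str) -> str:
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    try:
        print(ask_local_llm("llama3", "Say hello in one word."))
    except OSError:
        print("No local Ollama server running on :11434")
```

vLLM exposes an OpenAI-compatible endpoint instead, so the same idea applies with a different URL and request shape.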

Next step: just need a shitload of VRAM. ;)

Maybe those Intel Battlematrix 48GB cards (dual-GPU Arc Pro B60) might be useful after all... :)

https://www.storagereview.com/review/intel-arc-pro-b60-battl...


