OP notes they switch between Linux and Windows.

I've not tried local LLMs on Windows, but I do loads with 'em on a three-year-old Legion running Arch.

That said, whilst small local models are nice for some use cases, I'm leaning more towards APIs these days. I like the better selection of models and being able to use them without bringing my machine to a halt.


