I've not tried local LLMs on Windows, but I do loads with 'em on a three-year-old Legion running Arch.
That said, whilst small local models are nice for some use cases, I'm leaning more towards APIs these days. I like the wider selection of models and being able to use them without grinding my machine to a halt.