
I just looked up Ollama and it doesn't look like it supports Windows. (At least not yet)



Oh, my apologies for the wild goose chase. I thought they had already added Windows support. It should be possible to run it through WSL, but I suppose that's a solid point for Nvidia in this discussion.
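For what it's worth, running it under WSL is just a couple of commands — a rough sketch, assuming you already have a WSL2 distro (e.g. Ubuntu) with GPU passthrough set up, and using Ollama's published install script and a model name like `llama3` as an example:

```shell
# From inside the WSL2 shell, not PowerShell.
# Official install script (check it before piping to sh if you're cautious):
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model and start an interactive chat session:
ollama run llama3
```

GPU acceleration depends on the Windows-side Nvidia driver exposing CUDA to WSL2; without it, Ollama falls back to CPU.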


I think there's a market among users who aren't very computer savvy but do understand how to use LLMs, and who would run a chat model on their GPU, especially if turning it on only takes a few clicks.



