Interesting. I run my local LLMs through Ollama, and it's zero trouble to get that working in OpenWebUI as long as the Ollama server is running.
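For anyone curious what "working in OpenWebUI" amounts to: OpenWebUI just talks to Ollama's local HTTP API (port 11434 by default). Here's a minimal sketch of that same flow in plain Python, not OpenWebUI's actual code; the model name "llama3" is only an example, use whatever `ollama list` shows on your machine:

    import json
    import urllib.request

    OLLAMA = "http://localhost:11434"  # Ollama's default listen address

    # 1. Confirm the server is running and list the installed models.
    with urllib.request.urlopen(f"{OLLAMA}/api/tags") as resp:
        models = json.load(resp)["models"]
        print("installed models:", [m["name"] for m in models])

    # 2. Send a one-shot prompt over the same API a web UI would use.
    #    "llama3" is just an example; substitute any model from the list above.
    req = urllib.request.Request(
        f"{OLLAMA}/api/generate",
        data=json.dumps({
            "model": "llama3",
            "prompt": "Say hello in one sentence.",
            "stream": False,
        }).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])

If the first request fails to connect, the Ollama server simply isn't running, which is the precondition mentioned above.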


I think that's the thing: just running Ollama (fiddling around with terminals) is already more complicated than the full end-to-end experience of chatting with LM Studio.

Of course, for folks used to terminals, daemons, and so on, it makes sense from the get-go, but for others it seemingly doesn't. It doesn't help that Ollama refuses to communicate what people should understand before trying to use it.



