
But isn't ollama only for local chat? Or am I missing something? I'd like to set it up as a server for use from another laptop (use it as my local AI hub) and would love to integrate it with an IDE using MCP.




No, it can listen on 0.0.0.0, or you can serve it through a proxy.
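A minimal sketch of what that looks like, assuming Ollama's documented defaults: it binds to 127.0.0.1:11434 unless OLLAMA_HOST is set (e.g. OLLAMA_HOST=0.0.0.0 before running `ollama serve`), after which other machines on the network can hit its HTTP API. The LAN address and model name below are placeholders.

    # Client-side sketch: call a remote Ollama instance over its HTTP API.
    # Assumes the server was started with OLLAMA_HOST=0.0.0.0 (or is behind
    # a reverse proxy) and already has the model pulled.
    import requests

    OLLAMA_URL = "http://192.168.1.50:11434"  # hypothetical LAN address of the Ollama host

    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={
            "model": "llama3",                       # any model already pulled on the server
            "prompt": "Say hello from another laptop.",
            "stream": False,                         # return a single JSON object, not a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])

The same base URL can be pointed at by IDE integrations that speak Ollama's API, so the "hub" laptop just needs to be reachable on that port.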


