Hacker News
zombot | 50 days ago | on: Opencode: AI coding agent, built for the terminal
It doesn't say how to configure a local Ollama model.
ethan_smith | 50 days ago
You can configure Ollama by setting OPENCODE_MODEL=ollama/MODEL_NAME and OPENCODE_BASE_URL=http://localhost:11434/api in your environment variables.
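Based on the comment above, a minimal shell setup might look like the sketch below. The variable names and URL are taken verbatim from the comment; MODEL_NAME is a placeholder for whichever model you have pulled into Ollama, and is left as a placeholder here:

```shell
# Sketch per the comment above: point Opencode at a local Ollama server.
# MODEL_NAME is a placeholder; variable names are as stated in the comment.
export OPENCODE_MODEL="ollama/MODEL_NAME"
export OPENCODE_BASE_URL="http://localhost:11434/api"  # Ollama's default local API endpoint
```

Add these to your shell profile (e.g. ~/.bashrc) if you want the configuration to persist across sessions.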
stocksinsmocks | 50 days ago
You can't edit files with Ollama-served models. Codex has the same problem. This is not an issue with Aider.