
Crush has an open issue (open for two weeks) to add Ollama support; it's in progress.


They should add "custom endpoint" support instead [0].

[0] https://github.com/microsoft/vscode/issues/249605


FYI, it already works even without this feature branch (you'll just have to add your provider and models manually):

```
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "models": [
        {
          "id": "llama3.2:3b",
          "model": "Llama 3.2 3B",
          "context_window": 131072,
          "default_max_tokens": 4096,
          "cost_per_1m_in": 0,
          "cost_per_1m_out": 0
        }
      ]
    }
  }
}
```
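If you hand-edit the config, a quick stdlib parse catches syntax typos before Crush tries to read it. A minimal sketch, assuming the field names shown in the snippet above:

```python
import json

# Sanity-check the provider config sketched above; the field names
# ("providers", "base_url", "models", ...) are taken from that snippet.
config = json.loads("""
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "models": [
        {"id": "llama3.2:3b", "model": "Llama 3.2 3B",
         "context_window": 131072, "default_max_tokens": 4096,
         "cost_per_1m_in": 0, "cost_per_1m_out": 0}
      ]
    }
  }
}
""")

ollama = config["providers"]["ollama"]
print(ollama["base_url"])         # http://localhost:11434/v1
print(ollama["models"][0]["id"])  # llama3.2:3b
```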


Why?

It's basic: edit the config file. I just downloaded it; the file is at ~/.cache/share/crush/providers.json. Add your own provider or edit an existing one.

Edit api_endpoint, done.
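That edit can also be scripted. A sketch with the Python stdlib, assuming the api_endpoint field named above; the path here is a stand-in for wherever your install keeps providers.json:

```python
import json
import os
import tempfile

# Illustrative path; the comment above mentions ~/.cache/share/crush/providers.json.
path = os.path.join(tempfile.gettempdir(), "providers.json")

# Seed a minimal file so the sketch is self-contained.
with open(path, "w") as f:
    json.dump({"providers": {"ollama": {"api_endpoint": "http://old/v1"}}}, f)

# Point the provider at a local Ollama server and write the file back.
with open(path) as f:
    cfg = json.load(f)
cfg["providers"]["ollama"]["api_endpoint"] = "http://localhost:11434/v1"
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)

print(cfg["providers"]["ollama"]["api_endpoint"])  # http://localhost:11434/v1
```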


Nice, that would be my reason to use Crush.


Me too.



