Hacker News

Maybe it's easier to just set up a proxy mirroring Anthropic's API, but pointing to whatever model you want? Genuine question.


It's not much different from your proxy idea. It's implemented as a transformation between the internal message structure (close to the Anthropic API's) and the OpenAI message spec, and vice versa. It then calls all the other models through the openai-node client, since pretty much everyone supports that now (OpenRouter, Ollama, etc.).
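A minimal sketch of that transformation, with hypothetical simplified types (the real Anthropic and OpenAI schemas carry more fields, e.g. tool calls and image blocks): Anthropic keeps the system prompt outside the `messages` array and allows content as a list of typed blocks, while the OpenAI chat spec expects a flat list of `{role, content}` messages.

```typescript
// Illustrative, simplified shape of an Anthropic-style request.
interface AnthropicRequest {
  system?: string;
  messages: {
    role: "user" | "assistant";
    content: string | { type: "text"; text: string }[];
  }[];
}

// OpenAI chat-completions message shape (simplified).
interface OpenAIMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hoist the system prompt into a leading system message and flatten
// Anthropic content blocks into plain strings.
function toOpenAIMessages(req: AnthropicRequest): OpenAIMessage[] {
  const out: OpenAIMessage[] = [];
  if (req.system) out.push({ role: "system", content: req.system });
  for (const m of req.messages) {
    const text =
      typeof m.content === "string"
        ? m.content
        : m.content
            .filter((b) => b.type === "text")
            .map((b) => b.text)
            .join("\n");
    out.push({ role: m.role, content: text });
  }
  return out;
}
```

The resulting array can then be handed to any OpenAI-compatible client; the reverse direction (wrapping an OpenAI-style completion back into Anthropic's response shape) is the mirror image.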


Cool! The only downside, then, would be keeping your code in sync with upstream.

