Hacker News

Simon, I see it via Chat Completions as well as Responses in their API platform playground.


I just tried sending an o1-pro prompt to the chat completions API in the playground and got:

  This is not a chat model and thus not supported in the
  v1/chat/completions endpoint. Did you mean to use v1/completions?
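
A client could detect this case and fall back to the Responses endpoint. A minimal sketch of parsing the error envelope — the field names follow OpenAI's standard `invalid_request_error` shape, and the message is the one quoted above; the exact body for this 404 is an assumption:

  import json

  # Hypothetical error body in the shape OpenAI's API returns for this
  # 404 (the envelope fields are assumptions; the message text is the
  # one quoted above).
  error_body = json.dumps({
      "error": {
          "message": (
              "This is not a chat model and thus not supported in the "
              "v1/chat/completions endpoint. Did you mean to use "
              "v1/completions?"
          ),
          "type": "invalid_request_error",
          "param": "model",
          "code": None,
      }
  })

  def is_unsupported_endpoint(body: str) -> bool:
      """True when the error envelope says the model can't be used on
      this endpoint, so the client could retry against /v1/responses."""
      err = json.loads(body).get("error", {})
      return (
          err.get("type") == "invalid_request_error"
          and "not supported in" in err.get("message", "")
      )

  print(is_unsupported_endpoint(error_body))  # True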


Sorry — since the Platform UI featured it as an option, I figured OpenAI might have enabled o1-pro via the chat completions endpoint. I just got around to testing it, and I get the same 404 `invalid_request_error` via both the platform UI and the API. It's such an odd, old 404 message, suggesting the legacy completions API! It's hard to believe that could be an intentional design decision. Maybe they see it as an important feature to avoid wasting (and refunding) o1-pro credits.

I noticed that their platform's dashboard queries https://api.openai.com/dashboard/ which lists a supported_methods property on models. I can't see anything similar in the huge https://raw.githubusercontent.com/openai/openai-openapi/refs... schema yet (commit ec54f88 right now), and it doesn't mention o1-pro at all. Like the whole developer-messages thing, the UX of the API feels like an afterthought.
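
If that supported_methods property were exposed in the public schema, a client could filter models by endpoint up front instead of discovering the 404 at request time. A sketch, assuming a hypothetical listing shape — the supported_methods field is the one mentioned above, everything else (model ids, method names) is made up for illustration:

  import json

  # Hypothetical shape of the dashboard's model listing; only the
  # supported_methods property comes from the thread above, the
  # rest is assumed.
  models_json = json.dumps({
      "data": [
          {"id": "gpt-4o", "supported_methods": ["chat.completions", "responses"]},
          {"id": "o1-pro", "supported_methods": ["responses"]},
      ]
  })

  def models_supporting(method: str, body: str) -> list:
      """Return ids of models whose supported_methods includes `method`."""
      return [
          m["id"]
          for m in json.loads(body)["data"]
          if method in m.get("supported_methods", [])
      ]

  print(models_supporting("chat.completions", models_json))  # ['gpt-4o']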



