
Rookie question: the OpenAI API costs extra, right? It's not something that comes with ChatGPT or ChatGPT Plus.



Correct, but I'm going to look into a locally running LLM so it would be free.


Please do. When you add support for a custom API URL, please make sure it supports HTTP Basic authentication.

That's super useful for people who run, say, Ollama with an nginx reverse proxy in front of it that adds authentication.
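For illustration, here's a minimal sketch of what a client call to such a setup could look like, assuming Ollama's OpenAI-compatible endpoint sits behind a reverse proxy that enforces HTTP Basic auth. The URL, credentials, and model name are all placeholders:

```python
import requests

# Hypothetical endpoint: Ollama's OpenAI-compatible API behind an nginx
# reverse proxy that enforces HTTP Basic auth (all values are placeholders).
url = "https://llm.example.com/v1/chat/completions"

resp = requests.post(
    url,
    auth=("user", "secret"),  # requests sends this as an HTTP Basic auth header
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```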


Please look into allowing it to connect to either an LM Studio endpoint or Ollama.
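Both tools expose OpenAI-compatible APIs on standard local ports, so supporting a configurable base URL covers both. The defaults below are the usual out-of-the-box ports; verify against your install:

```python
# Default OpenAI-compatible base URLs for local servers
# (standard ports; configurable in each tool):
LM_STUDIO_URL = "http://localhost:1234/v1"
OLLAMA_URL = "http://localhost:11434/v1"
```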


Yes


Yes, but gpt-4o-mini costs very little, so you will probably spend well under $1/month.
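A back-of-envelope check, using gpt-4o-mini's launch pricing ($0.15 per 1M input tokens, $0.60 per 1M output tokens; verify current rates on OpenAI's pricing page) and a hypothetical usage pattern:

```python
# Prices per token at launch (USD); check OpenAI's pricing page for current rates.
input_price = 0.15 / 1_000_000
output_price = 0.60 / 1_000_000

# Hypothetical usage: 50 requests/day, ~500 input + ~200 output tokens each.
daily_in, daily_out = 50 * 500, 50 * 200
monthly = 30 * (daily_in * input_price + daily_out * output_price)
print(f"${monthly:.2f}/month")  # ~$0.29, well under $1
```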


I don't think the point here should be the cost, but the fact that you are sending everything you write to OpenAI, which can train its models on your information. A local model lets you preserve the privacy of what you write. I like that.


OpenAI does not train models on data that comes in through the API.

https://openai.com/policies/business-terms/


Assuming for the moment that they aren't saying that with their fingers crossed behind their backs, that doesn't change the fact that they store the inputs they receive and promise to protect them (paraphrasing from the Content section of the link above). Even if it's not fed back into the LLM, storing the inputs anywhere for any period of time is a huge privacy risk -- after all, a breach is a matter of "when", not "if".



