Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

You upload a knowledge file explicitly so GPT can share that knowledge with users. Then call this a leak? People are stupid.


That’s fair, but 0 guardrails just entirely minimizes the incentive to create a custom GPT.

I was also thinking about when there’s paid access to certain GPTs. Someone can just download the files and spin up their own? Doesn’t seem valuable for devs. If that’s how the incentives were intended then sure. I just don’t expect many people to be enabling use cases with it.


I think the GPTs market is a bit of a novelty thing. But it might work with low costs. I've been making my own GPTs last few days and I find the capability useful but very inconsistent.

If you give a GPTs personality more complex tasks, it gets distracted and reverts to its default ChatGPT personality.

There aren't enough parameters and compute in these LLMs for the things we're pushing them to do. We need hardware to catch up and offer analog compute for neural nets, so we can scale them way up. And of course evolve architectures to make use of that capability. Only then I see this becoming truly rock solid.

Also using the context window for personalities is likely the wrong approach. You gotta finetune this on the model itself.

But yeah GPT can't keep a secret. But humans also can't, so I guess it's imitating us fine.


Is there some domain specific meaning to the "knowledge file" terminology?


Not that I know, although in this field terms are made up at a rapid rate currently. The knowledge file in GPTs is just extracted and put on the (hidden) chat log (i.e. the context window) like anything else while you talk.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: