
I work for a Fortune 100 company with 80,000+ employees. All of us are explicitly forbidden from using any sort of AI/LLM tool without written permission from the head of legal AND the CEO. In other words, nobody is going to get permission.

The concerns are twofold: 1. We might inadvertently use someone else’s intellectual property. 2. Someone else might gain access to our intellectual property.

What you are describing would help alleviate the concern about issue 2, but I’m not sure if it would help alleviate the concerns with issue 1.




It's basically the same in our company: they put a similar rule in place that prevents anyone from using e.g. ChatGPT. Little do they know that all software devs within the company are using Copilot and the company is even paying for it. It's quite a funny situation, tbh.


> Little do they know that all software devs within the company are using Copilot and the company is even paying for it.

Just like annual sexual harassment training - it's mostly corporate CYA on liability. If it ever goes to court, they'll plead ignorance and blame the employees, who should have known better since they were trained/informed on what they ought not to do.

Paying for Copilot could bite them though, so I suspect it's a case where one part of the organization isn't aware of what the other is doing.


All of your assumptions are exactly right. They (mostly managers with little to no IT background) want to cover their own asses in case shit hits the fan (an unlikely scenario if you ask me, because the company is overrating the value of its data; nobody gives a fuck about us anyway), and many parts of the company have developed their own habits. The company is just very big and I can understand why they might be afraid, but come on, nobody will take that policy seriously forever. You eventually need to put some reasonable rules in place that let you make use of such innovations.


Except at my company they block software like that. Not only do they block it, but if you try to go to it, a security person will immediately call your manager and ask what you are doing.


And people can access those systems via their own devices without the company knowing.


Well yes, but then I’d have to pay for it. That ain’t happening.


There may also be a third more important concern: AI/LLM generated works are not generally copyrightable.[1]

The Copilot lawsuit should answer concern #1 more definitively.

#2 is already solved by running your own model or using Azure OpenAI.

[1] https://www.copyright.gov/ai/ai_policy_guidance.pdf


We try to eliminate this problem by using code models trained only on permissively licensed code; then you can run them locally without sending code anywhere.


Change companies. Honestly. If a company goes as far as forbidding its partners in crime (workers, sigh..) from exploring uncharted territory at all - well, ya know, someone else might just win by not doing that.


This is particular, specifically problematic territory. I cannot imagine handing over proprietary data to a third party without a contract in place for how that data is stored and used. It’s not about innovation, it’s about using someone else’s tools without ownership. For the other case, it’s both about integrity in owning your own work, and a shield from legal consequences. These things should be very relevant to any business.

I also don’t know any professional devs who have used tools like copilot and said they were anything but a toy. I am more bullish on LLMs than most of my coworkers. I think there is a lot of potential there. I do not see that potential in the current commercial offerings, and the financial outlay to fine-tune an open-source model and run it at scale is…prohibitive.


> I also don’t know any professional devs who have used tools like copilot and said they were anything but a toy.

Really? I'm in academia now, but I find Copilot at least moderately helpful when I'm writing a library. It's pretty good at a lot of boilerplate functions, docstrings, regex, etc. I certainly don't want to go back to not using it; my code is a lot closer to production quality now and looks nicer.

Thinking back to my days in back-end, it seems like it would have been very helpful and sped things up, so I'm surprised to hear it's just a toy - but I've been out of the professional game for a while now. What's the main criticism?


That's not banning all uncharted territory, it's banning specific legally fraught territory.


Working for an 80,000+ employee company, one has already accepted a certain degree of inertia.



