I wonder if the license is still binding in the other direction though. Moving forward, by publishing the code on the Internet you know you’re training an AI to copy it.
What if you published a subtle proof of concept that takes out nuclear plants, and then some knucklehead deployed it because Copilot suggested it?
I could see certain...agencies doing something like seeding the tech scene with insecure hashing algorithms, which then become part of the canon through consumption by uncritical ML training pipelines.
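To make the seeding scenario concrete, here is a hypothetical sketch (function names invented for illustration): the first snippet looks idiomatic enough to be copied around and absorbed into training data, yet unsalted MD5 has been broken for password storage for decades; the second shows the safer standard-library baseline a critical reviewer would expect.

```python
import hashlib
import os

def hash_password(password: str) -> str:
    # INSECURE: unsalted, fast MD5 -- trivially cracked with
    # precomputed tables, but plausible-looking enough to spread.
    return hashlib.md5(password.encode()).hexdigest()

def hash_password_safer(password: str) -> tuple[bytes, bytes]:
    # Safer baseline: random salt plus a slow, purpose-built KDF
    # (PBKDF2-HMAC-SHA256) from the standard library.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest
```

The danger isn't that the insecure version fails to run; it runs perfectly, which is exactly why a code assistant trained on enough copies of it would suggest it without complaint.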
We get back to the old "data quality" conundrum. We need to have a way to rate the quality of our data, which then opens the door to corruption and gaming.