If I think you're fluent, I might conclude you're an idiot when really you just didn't understand.
If I know they struggle with English, I can simplify my vocabulary, speak more slowly and enunciate, and check in occasionally to make sure I'm communicating in a way they can follow.
If those don't apply, then as mentioned, I will ignore them if I can and judge their future communications as malicious, incompetent, inconsiderate, and/or meaningless.
There is still so much to be done with LiDAR. I myself am intrigued by the output these devices produce. The data and capabilities that are possible with this tech are amazing to me. From terrain scanning to autonomous vehicles, the list goes on. Lasers are an amazing breakthrough for mankind and for the physics involved: light that can be manipulated to yield new kinds of data and real-world applications.
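To make the "output these devices present" a bit more concrete, here's a rough, hypothetical sketch (plain NumPy, synthetic points standing in for a real scan, and an arbitrary 5 m grid cell) of how raw LiDAR returns can be binned into a crude terrain height map:

    # Sketch: turn a LiDAR point cloud into a simple elevation grid.
    # The points here are synthetic; a real scan would come from a sensor
    # or a .las/.laz file.
    import numpy as np

    rng = np.random.default_rng(0)
    # Fake point cloud over a 100 m x 100 m tile: gentle slope plus noise.
    xy = rng.uniform(0, 100, size=(10_000, 2))
    z = 0.05 * xy[:, 0] + rng.normal(0, 0.1, size=10_000)
    points = np.column_stack([xy, z])

    cell = 5.0                      # grid resolution in metres (assumed)
    nx = ny = int(100 / cell)
    ix = np.clip((points[:, 0] / cell).astype(int), 0, nx - 1)
    iy = np.clip((points[:, 1] / cell).astype(int), 0, ny - 1)

    # Keep the lowest return per cell as a crude ground estimate
    # (ignores vegetation filtering and other real-world cleanup).
    dem = np.full((ny, nx), np.nan)
    for x_idx, y_idx, height in zip(ix, iy, points[:, 2]):
        if np.isnan(dem[y_idx, x_idx]) or height < dem[y_idx, x_idx]:
            dem[y_idx, x_idx] = height

    print(dem.round(2))

Real pipelines do much more (ground classification, outlier rejection, interpolation), but even this toy version shows why the raw data is so versatile.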
100%. They are trying to get YouTube an exclusion from the list, or make the list the non-default. I already know the next step is that the "community" is going to fork the list, and the forked list is going to be heavily advertised on YouTube channels as a way to support them.
> There is a strong para-social relationship with many younger internet users, so maybe people really do feel strongly about affecting their favorite youtube star's view count?
100% this. They were even threatening that he would face the ire of social media if he didn't reopen the issue.
A law like this would obviously need some sort of sensible definition of what "AI" means in this context. Online translation tools also use ML models, and even the systems that unlock your device with your face do, so classifying all of that as "AI contributions" would make the definition completely useless.
I assume the OP was talking about things like LLMs and diffusion models which one could definitely single out for regulatory purposes. At the end of the day I don't think it would ever be realistically possible to have a law like this anyway, at least not one that wouldn't come with a bunch of ambiguity that would need to be resolved in court.
OK, so define it for us, please. Because, once again, this thread is talking about introducing laws about "AI". OP was talking about LLMs, you say - so SLMs, then, are fine? If not, where is the boundary? If they're fine, then congratulations: you have created a new industry of people pushing the boundaries of what SLMs can do, as well as of how they are defined.
Laws are built on definitions and this hand-wavy BS is how we got nonsense like the current version of the AI act.
Why are you so mad at me? I'm not even the OP you should be asking these questions. I'm also not convinced we need regulation like this in the first place, so I can't tell you where this boundary should be, but a boundary could certainly be found, and it would sit well above simple spellchecking and autocorrect.
I also don't understand why you think this would be so impossible to define. There are regulations for all kinds of areas where specific things are targeted, like chemicals or drugs, and just because some of these have incentivized people to slightly change a regulated thing into an unregulated one does not mean we don't regulate those areas at all. So how are AI systems so different that you think it'd be impossible to find an adequate definition?
Saying something is the law is never a valid argument in discussions like this. There have been multiple historical examples of things being the law until everyone realized how unethical the original law really was.
Yes, but for now that's the law. So, the pragmatic answer is: they have to do it because otherwise their stuff will be pulled from the shelves and they will be fined. This does not prevent them from discussing with the Commission, which they are already doing, or lobbying for changes in the regulations, which they are also already doing within the limits of the political system.