It really comes down to what you think of the human brain.
I don't think computers will ever come anywhere close to the human brain in terms of practical understanding and problem solving. Machine learning involves "Give a computer an awful lot of existing labeled data sets and try to get it to figure out, via un-debuggable magic, what they are." The human brain can figure out novel things without having to have had the huge dataset of previous examples, and I'll point to literally any small child as an example (I have two, and one solid rule of parenting is "They're more creative than you would ever expect").
The next question is then one of power efficiency, and there, again, even if we could do something similar to the human brain (which, see above, I don't think is at all likely), can we scale the required bits down to a car's power budget? A 10kW server rack in the trunk isn't really feasible.
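Back-of-the-envelope arithmetic makes the power point concrete. All the figures below are assumptions for illustration (the thread only gives the 10 kW rack); the exact pack size and highway draw will vary by car:

```python
# Rough, illustrative numbers -- assumptions, not measured figures.
battery_kwh = 75.0   # assumed EV pack capacity (kWh)
compute_kw = 10.0    # the hypothetical server rack from above (kW)
drive_kw = 15.0      # assumed average propulsion draw at highway speed (kW)

# How long the pack lasts if it powered only the compute:
hours_compute_alone = battery_kwh / compute_kw

# What fraction of the total draw the compute would represent:
share_of_total = compute_kw / (compute_kw + drive_kw)

print(f"Compute alone drains the pack in {hours_compute_alone:.1f} h")
print(f"Compute would be {share_of_total:.0%} of total draw")
# For comparison, the human brain runs on roughly 20 W.
```

Under these assumptions the rack alone empties the battery in 7.5 hours and accounts for 40% of total draw, i.e., it would cut highway range substantially.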
You seem to believe that anything humans can do, we can eventually design a computer to do. And I disagree entirely.
And even with all of that, humans still get confused while driving. Think about the last time you came upon a construction zone with unclear lane markings, or a traffic pattern change in an area where you drive frequently, almost unconsciously.
Absolutely, which is why it should be obvious that self-driving is impossible.
If I stuck STOP signs every 5 meters, a human would realize they can be ignored and it's a prank. Unless autonomous cars are trained for precisely this kind of malice, they will fail.
If someone decides to hunt autonomous cars, there really is nothing in their models that will save them. They aren't designed for predator/prey dynamics.
When you open up technology to the world, you open it up to the best and worst of humanity. There is no way fully autonomous cars survive.
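A toy sketch of the failure mode described above (a hypothetical rule-based planner; real driving stacks are far more complex): a controller that treats every detected stop sign as genuine has no notion of "this is a prank," so a row of cardboard signs produces a full stop at every one of them.

```python
def plan(signs_ahead):
    """Naive planner: stop for every detected stop sign.

    signs_ahead: list of distances (meters) at which a STOP sign
    was detected. A human would notice that 20 signs in 100 m is
    a prank; this planner has no such prior.
    """
    actions = []
    for distance in signs_ahead:
        actions.append(("stop", distance))  # one full stop per sign
    return actions

# A prankster's row of signs every 5 m over 100 m forces 20 stops:
prank = list(range(5, 105, 5))
print(len(plan(prank)))  # 20 stops
```

The point isn't that production systems are this crude; it's that nothing in a model trained on normal driving data encodes "an implausible density of signs means they're fake."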
I can lay a spike strip across 8 lanes of highway traffic whenever I want. It's not like it takes self-driving cars to allow people to be assholes and break the law. It's why we have laws.
Suddenly you don't need a spike strip, with its physical damage and consequences, though; just some cardboard and markers to make fake stop signs.
The bar for malice will be significantly lower, and it's the car that will be at fault, not the person with the stop sign.
Can I walk around with a stop sign on a pole? Why not? It's not an official street sign. As far as I know, we don't have laws against carrying around homemade stop signs, because any competent driver can tell they aren't official.
Maybe I should get a backpack with a stop sign on it, for when I'm riding the bike.
I mean, you can find fault with different aspects of the example all you like, as long as we agree that AI will fall for tricks and pranks that normal people won’t.
Driving data is inherently culture-biased toward the most frequent scenarios. But real-world driving is a data set with many outliers.
A few days ago I drove through an intersection but didn't turn as usual; I was confused by old lane markings that had been left there from quite a while ago. Our road environment is quite far from clean.
No, not at all. OP was implying (maybe?, could have read that wrong) that lidar was what was holding Tesla back. I was just commenting that it's obviously _possible_ without lidar because we do it every day without spinning lasers on our heads. Will it ever actually happen? I have no idea at all. Maybe not. But it won't be for lack of lidar. It will be for the lack of everything you just mentioned.
Large transformers display few-shot learning similar to children's. It's a major advancement; machine learning is much more than supervised learning nowadays.
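For readers unfamiliar with the term: "few-shot" here means the model is shown a handful of worked examples at inference time, in the prompt itself, rather than being retrained on a labeled dataset. A minimal sketch of how such a prompt is assembled (the task and examples below are made up for illustration; the model call itself is omitted):

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: a handful of worked examples
    followed by the new query, with the answer left blank for
    the model to complete."""
    lines = []
    for text, label in examples:
        lines.append(f"Input: {text}\nLabel: {label}")
    lines.append(f"Input: {query}\nLabel:")
    return "\n\n".join(lines)

# Three examples are the entire "training set" -- no gradient updates.
examples = [
    ("the road is blocked", "obstacle"),
    ("lane markings are faded", "ambiguous"),
    ("clear highway ahead", "normal"),
]
prompt = build_few_shot_prompt(examples, "construction cones everywhere")
print(prompt.endswith("Label:"))  # True
```

The contrast with the "huge labeled dataset" framing upthread is that the examples here condition the model on the fly instead of being baked in through supervised training.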
> So you really think that in 10,000 years we can't figure out and build a computer that rivals the human brain?
There might be hard limits we'll never break through. All the experts will tell you is "we know that we don't know", because when it comes to the brain we don't know shit.
There are many theories out there, and materialism is only one of them. People deify tech and underestimate the brain; I'd advise doing the opposite.
We might simply not be able to comprehend our own brain to a level that lets us emulate it. The toys we're playing with today aren't even remotely close to emulating any kind of intelligent animal; the best we can (almost, but not even fully) do is a worm with ~300 neurons [0] (vs 80+ billion for the human brain).
Technocrats told us we'd have AGI and flying cars by now; I'd stay cautious about people claiming that reaching AGI is only a question of time.
OpenWorm has been stalled for like half a decade now too.
That said, there's a huge difference between "the brain may be too hard for us" and "the brain may not even be physical." That's about on the same scale as the difference between "the pyramids were built with advanced mechanical tools" and "the pyramids were built by aliens."