What matters here is not just the concept itself (deep learning as a generic technique) but also its scalability. Not the specifics we have today, but the specifics we will have 20 years from now.
The concept is proven, all that matters now is time...
> The concept is proven, all that matters now is time...
This is a very naive point of view. You could run deep learning with a billion times more processing power and a billion times more data for 20 years, and it would not produce a general artificial intelligence. Deep learning is a set of neural-network tweaks that is insufficient to produce AGI. Within 20 years we may have enough additional tweaks to build an AGI, but I doubt that algorithm will look anything like the deep learning we have today.