Since the dawn of computing, we've seen great increases in programming productivity as we've ramped up our level of abstraction. I assume visual programming is attractive because it appears to be the next rung on this ladder. It may turn out to be a dead end, but people keep trying.
I'd argue that our next big jump in productivity is likely to come when we are able to have a conversation with an AI about the desired function of a program and have the AI write it for us. Whether visual programming will be a part of this conversation remains to be seen.
No, we haven't. What do we really have now that we didn't have 30 years ago? A web browser is just an IBM 3270 at the end of the day. The boost in productivity is an illusion hidden behind ever more elaborate bells and whistles.
The boost in productivity will come when we stop reinventing the wheel every few years.
Each new batch of programmers is so ignorant of the past that they spend all their time reinventing it (usually badly) instead of making any real progress.
It won't happen.
The essence of programming is taking a vague description of a problem and figuring out a way to solve it very, very precisely. This is just not something that can be done with AI. Any progress that has been made in AI is the opposite: put in a very precise question or description of the problem, and you may get a vague answer.
Heck, if, even two decades from now, we have some form of AI that just barely understands _existing_ code well enough to hold a simple discussion with the programmer, I will be very surprised.
Speech in conjunction with diagramming is more powerful than either alone. There's a reason most software companies' supply cupboards are stocked with dry-erase markers and notepads.
I'd be disappointed if our hypothetical AI was unable to grok simple diagrams or generate its own. If a conversation with this entity can still be called programming, I can definitely see it having a visual element.
I think the point is that programming languages fail even in that regard: even if the syntax is simple and reading is easy, writing code requires a deeper understanding, both of how to model things in code and of how to conform to the syntax ("Oh, I got 2 pages of unrelated errors because I forgot a semicolon"). Neither problem exists when reading, but both exist when writing.
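To make the semicolon complaint concrete, here's a minimal C sketch (the exact diagnostics vary by compiler, but the pattern is typical): one missing semicolon after a struct definition, and the errors show up on later, unrelated lines.

    #include <stdio.h>

    struct point {
        int x;
        int y;
    }                    /* <-- the actual mistake: missing ';' here */

    int main(void) {     /* compilers tend to complain about this line     */
        struct point p = {1, 2};
        printf("%d %d\n", p.x, p.y);
        return 0;        /* ...and about main's return type, rather than   */
    }                    /* pointing at the struct definition above        */

The parser happily folds `int main` into the unfinished struct declaration, so the reported errors describe symptoms several lines away from the cause. That gap between where the mistake is and where it's reported is exactly the kind of burden readers never face.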
How do you solve the issue you point out? Isn't using symbols to represent something machine-readable exactly what a programming language already is?