I would say that having a healthy skepticism (which is not the same thing as pessimism or cynicism) helps make it more likely for things to work out in the long run.
It's the difference between saying, "Elon Musk wants to colonize Mars, wouldn't that be cool?" and "you're going to be living on Mars someday, better get ready". It's also the difference between saying, "self-driving cars would be great, let's try to make that happen" vs. "I'm not going to buy another car until I can get a self-driving one, that's how sure I am that this is imminent".
Really it's the difference between underpromising and overdelivering vs. overpromising and underdelivering. The people who are the most enthusiastic and vocally certain about the inevitability of these things in the near future are more of a threat to making those things happen than the people who are cautiously optimistic but skeptical about them.
IMO that's the reason to be skeptical: just to be realistic and to help the general population set their expectations appropriately. It has nothing to do with bracing yourself for an emotional letdown. If overenthusiastic Mars/self-driving-car/AI people are vocal enough to convince the media and the general population that these things are revolutionary and inevitable in the near term, then after a few years of those things not happening, the public will flip in the opposite direction. Then voila, no more tax breaks or research funding or other public support, and those things go from "might happen in my lifetime" to "will never happen in my lifetime" in reality, not just perception. Right now, if you want to work on autonomous vehicles and know your stuff, there's probably a job or a grant for you someplace. If public opinion flips to "these people have been saying this stuff is just around the corner for 20 years, it's all been a waste of time and money", then boom: no more government assistance, no more venture capital, no more big companies spending lots of money on R&D for this stuff.