I've been thinking for the past year or two that we're hitting a bit of an inflection point. Previously, hardware developments seemed to move somewhat linearly: PC -> internet -> mobile. Now there is suddenly significant overlap, with multiple new disruptive hardware technologies that have significant potential consumer applications emerging simultaneously.

I might just be drawing arbitrary lines. "Mobile" depended on a number of technologies coming together. Possibly, looking back, this large number of simultaneous technologies will be described by one overarching technology category.




Your problem may be that you're using hindsight to see the PC, internet, and mobile eras while viewing the current "disruptions" with somewhat irrational exuberance. It's a pretty good bet that not all of them will pan out or have the kind of pervasive impact on society you're predicting, and that 20 years from now, again with hindsight, you'll see the one landscape-altering technology that defined the current era and fits into your linear model.

For instance, people have been predicting that VR will be the next big thing for years now. I remember a birthday party more than 30 years ago at an arcade with an expensive VR setup that, while far more limited than today's applications, was still an awkward piece of headgear that was more of a novelty than a potentially ubiquitous change to society. It's still possible that we hit some sort of inflection point where the technology improves enough that AR/VR becomes unobtrusive enough to be ubiquitous, but that's by no means a certainty.

Similarly, I think the jury is still out on IoT, drones, 3D printers, and cryptocurrencies/blockchains. If I had to place a bet, I'd say that when we look back on this time period, we'll be talking about machine learning and AI as defining this era. I could easily see the rest of the "current hotness" technologies not getting that big.


I have to wonder if some of that growth was related to Moore's Law. Namely, the number of transistors kept growing, and the uses for them grew in proportion, in a sort of 'if you build it, they will come' fashion.

Now, as I understand it, growth has slowed outside of the labs and Moore's Law appears to no longer hold. So there will likely still be growth, but it will be rarer, more difficult, and more expensive.
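Back-of-the-envelope, that compounding is dramatic. A minimal sketch in Python, assuming the classic ~2-year doubling period and using the Intel 4004's roughly 2,300 transistors as a 1971 baseline (both figures are my illustration, not from the comment above):

  # Moore's Law as pure compounding: transistor count doubling
  # every ~2 years. Baseline and cadence are illustrative assumptions.
  def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
      return base_count * 2 ** ((year - base_year) / doubling_years)

  for y in (1971, 1991, 2011, 2021):
      print(y, f"{transistors(y):,.0f}")  # ~2,300 -> ~2.4M -> ~2.4B -> ~77B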

Like your VR example, we've often had great predictions of the future and so very few of them actually pan out as expected.

I don't know, it's just a thought I've pondered.


CMOS scaling is close to dead, and it’s been a very special technology and tech enabler. It’s not the only way to continue ramping performance and function (see GPUs, for example), but that kind of tech is hard to replace.


Because wire pitch hasn’t decreased nearly as fast as effective feature size, there’s a strong bottleneck effect that limits the usable compute power of single-socket devices. A good rule of thumb, since ~2012 and for at least the next 5 years, is ~12 tera single-channel ops/sec.
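To make that ceiling concrete, a minimal sketch in Python; the clock and lane count below are invented for illustration, and only the ~12e12 figure comes from the rule of thumb above:

  # Sanity check a device estimate against the ~12 tera
  # single-channel ops/sec rule of thumb. The device parameters
  # are hypothetical, not any real part.
  RULE_OF_THUMB_OPS = 12e12

  def usable_ops_per_sec(clock_hz, lanes, ops_per_lane_per_cycle=1):
      return clock_hz * lanes * ops_per_lane_per_cycle

  est = usable_ops_per_sec(clock_hz=1.5e9, lanes=5_120)
  print(f"{est:.2e} ops/sec, {est / RULE_OF_THUMB_OPS:.0%} of the ceiling")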



