
I wouldn't rule it out, but no computing hardware platform has ever been dominant for 50 years before, so there are reasons to believe we're not about to get started with that kind of cycle now.

Mainframes, minicomputers, desktop personal computers, laptops, and now tablets and smartphones. Maybe we have reached the end (for 50 years at least), but I think the safer bet is that the next big technology 30 years from now will be something that surprises us, not something we already have or could easily predict having wide adoption.




> I wouldn't rule it out, but no computing hardware platform has ever been dominant for 50 years before, so there are reasons to believe we're not about to get started with that kind of cycle now.

"Mainframes, minicomputers, desktop personal computers, laptops, and now tablets and smartphones" -- yes, so we had like 5 generations of computing hardware platforms. And we began with so premature and begging for improvement technologies (when they started in the 50s) and so large form factors that we had Moore's law going on for decades until a few years past.

Now we're reaching physical limits in CPU shrinking, and we have components so small (e.g. the Apple Watch) that if they were any smaller you couldn't handle their buttons, or fit a camera lens, speakers, or a large enough screen in there.

My point is: just because something has been going on for 5 generations of form factors, that's nowhere near enough evidence, statistically or otherwise, to deduce it will go on forever.

A turkey is fed every day by the farmer for months on end. But after 300+ observations that lead it to think the feeding is never-ending and inevitable, there comes Thanksgiving.
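
(To put a rough number on that: under Laplace's rule of succession -- a toy model I'm using purely as illustration -- the probability that a streak continues after n straight successes is (n+1)/(n+2). A quick sketch in Python:

    # Laplace's rule of succession: P(streak continues | n straight successes).
    # Toy model for illustration only, not a real forecast of platform lifetimes.
    def p_next(n: int) -> float:
        return (n + 1) / (n + 2)

    print(p_next(5))    # ~0.857 -- five platform generations
    print(p_next(300))  # ~0.997 -- the turkey's 300 feedings

Even the turkey's ~99.7% says nothing about when the streak ends, which is the point: induction over a short run tells you almost nothing about what comes next.)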


> if they were any smaller you couldn't handle their buttons

Voice-interfaced intelligent agents like Siri/Alexa/etc. will continue to evolve.

Given a sufficiently advanced AI, buttons won't be needed for most computing tasks - any more than stirrups and a bridle were needed to drive the first automobiles.


I don't like talking to my devices, especially because most of the time I'm around other people and I don't want to share what I'm doing with them.


The next form factor is pervasive computing, of which Alexa and Google Home are the first stumbling attempts.


How do you use them in a public space, where others are trying to do the same thing?


You wear the controls to your own wearable computer/smartphone, which might or might not be linked to a better one in the environment that can do a lot more for you.


Voice Biometrics?


And who said you'd like for others to hear what you use them for?


> if they were any smaller you couldn't handle their buttons,

They will connect to our brains.

> or fit a camera lens, or speakers, or a large enough screen in there.

Not needed when they are connected to our brains.


Trust me, they won't for a while, and you don't want them to. Do some research into bio-compatibility. After that, do some research into electrolysis. You can do a fun experiment by connecting a battery to your tongue for a while. Just google it first, please, so you know what is happening.

Meanwhile, non-invasive BCIs are getting AWESOME... if you are a quadriplegic with no other options.

Now imagine we solve those problems, and the first BCI is on the market. It's safe, and it's a little bit faster than typing/reading (hey, first generation, right?). Who is going to be the early adopter who undergoes massive brain surgery? Turns out upgrading cochlear implants is so hard they are made to last 100 years (they get surrounded by bone and scar tissue...), so upgrading will be really hard. And who wouldn't want their BlackBerry implanted into their head right now... Even worse, what if the first ones use proprietary technology, and now your brainberry can only talk to other brainberries... or we go full internet-of-shit and it needs a cloud server to do anything :)


Yeah, don't hold your breath for those.

You'll be needing an oxygenated brain for when they arrive.


We will probably end up controlling our microdevices (hopefully only optionally implanted beneath our skin) with interfaces like Google's Project Soli [0], voice commands, "Morse code"-style tapping commands, etc., and also through hub devices like screen membranes underneath our skin or a device similar to a phone.

[0] https://atap.google.com/soli/


My presumption is that we will interact with most computers through wireless devices. For example, the Chromecast has no buttons on it, and I haven't felt limited by that.


The desktop PC became the dominant computing platform almost 40 years ago with the introduction of the first generation of affordable home computers (the Apple II being the primary example). Even though laptop sales overtook desktop sales around 10 years ago, the preferred form factor (15"+ screen) and intended use (sitting at a desk 90%+ of the time) of the vast majority of laptop owners mean that most laptops are nothing but desktops that happen to be easier to move to another desk.

Even though the amount of time spent using desktops and laptops in a desktop-like manner is decreasing, it's still a form factor found in every home. There may be more smartphones in that home, but in my opinion any product which is universally present in people's homes is dominant, even if it isn't used as often as other options. Another example of this would be the microwave versus the oven. I'm not sure which is used more often, but both are still dominant in terms of their universal presence in every home.


"Dominance" of the PC was more late 1980s than late 1970s, and that's if you count only office use. PCs didn't hit 50% residential penetration until ~1998:

http://farm3.static.flickr.com/2244/2199183615_c2a8acbaff.jp...

Yes, you could buy a PC in the late 1970s. But they were still pretty rare, and even a freestanding home PC didn't become hugely common until the 1990s -- networking was the missing step.

Mainframes have had buffers put between them and end-users, but are still highly utilised today. And the computerisation of the office was proceeding at mixed rates through the 1980s and 1990s (though well advanced by the early 1990s).


Well, imagine being able to project your "laptop" wherever you want using AR. Then you could use it as your home desktop-laptop and carry it everywhere with you.

Tablets are cool and you can carry them everywhere, but you can't really do serious work on them, since they are still too small.

On the other hand, even if laptops had very good battery life (which is usually not the case), they are still too big and heavy to carry around unless really needed (e.g. for a meeting or something like that).

But an augmented reality laptop that lives in your phone is something people might want to use. Not for gaming, but for normal office work, be it programming or using spreadsheets.

And, at that point, regular users might start treating desktop PCs and laptops as obsolete.


A projected AR display would only be useful for creative and business work if it could completely occlude the background. I won't be able to see the spreadsheet clearly if light leaks through it from the window in front of me. Due to the limits imposed by optics, complete background light blockage is always going to require a bulky, awkward head-mounted display. No one has proposed a way around this issue, even in theory.


Well, you can always project the virtual laptop's screen onto a pizza box or a wall or something like that. But I agree in general. I'd prefer a foldable laptop, but I doubt the technology for something like that is anywhere near, while this AR laptop could be made tomorrow. There are already virtual machines made for mobile phones; all that is left is connecting them to AR glasses, and voila.


Walls and pizza boxes aren't sufficiently smooth or reflective to use as projection screens for real work. An AR laptop couldn't be made tomorrow because the glasses exist only in limited prototype form with poor display quality.


I don't disagree with your main point, but I think it's worth saying: pen and paper is a computing hardware platform! And it ruled for much longer than 50 years.

And to no small success - they worked out quantum mechanics and a lot of other very difficult problems on this platform.

People still even make games for this platform too.


Well, pen and paper is a memory device; the actual computing was carried out by human brains. But, yes, human brains have been successful for much longer than 50 years, and still enjoy success in many domains…


Maybe slide rules/abacus/soroban? "Computing" in the most literal sense :)


Except Moore's law has (almost?) ended now, for the first time in 50 years.


Unfortunately that heuristic is useless in deciding when periods of change end. Ditto when periods of stability end.

I could say, "The manually driven, privately owned internal combustion car and truck have been dominant in transport for 100 years, so the safe bet is that they will continue to be." And that is definitely a safe bet. But the interesting work will be done by people making the unsafe bets.


I think we'll struggle for a long while to make the smartphone our main computer -- one that can interface smoothly with wireless displays and peripherals at home.

I could totally see a smartphone hooking up to another computer for gamers -- like a Thunderbolt 3 chassis allowing you to use an external GPU. Or, simply, when you walk into your house, your phone connects to a more powerful hub, giving you sensor data and processing power.

Eventually I think we'll socially "shift" to allowing electronics within our bodies to augment our abilities. We can't fit everything into a smartphone and we can't keep shrinking it. We can make things more efficient but it's more likely we'll begin building onto humans. Bodies are bigger than phones.

I'm more excited than terrified about an implant attached to my optic nerve for true augmented reality. Imagine everyone having a photographic memory. I'd love to see more done to store data on crystal - or on DNA.

Let's get freaky. :-)


You should watch Black Mirror S1E3, entitled "The Entire History of You". https://en.wikipedia.org/wiki/The_Entire_History_of_You


The next big technology revolution is AI. Should be pretty obvious at this point. The next one after that will be consumer technology integrated with human biology. All the fitness trackers are the first primitive attempts at that, but lab on a chip and brain machine interfaces are on their way. The $6 cell phone semen analyzer is a harbinger of this.


>> brain machine interfaces

I'm pretty skeptical of the common "brain machine interface". It probably requires surgery, and for what? To make people smarter in a world with AI?

No, I think the main motivation for a brain machine interface should be experience/emotion oriented. And there's a big question whether people would undergo surgery for it, or whether an external device would suffice.

One such modality is fMRI neurofeedback -- using fMRI analysis to give detailed and accurate feedback on what happens inside your brain, so you could achieve better control of it, and of your internal world.

How much better control? Well, there's some early research on treating depression/anxiety, etc., but the more interesting research is about letting people train themselves to reach (very rapidly, unlike the years/decades it takes a monk) a state very similar to Buddhist enlightenment, where you are content and free of ego. Another type of training session has taught people to increase empathy. And I'm sure you could do a lot more, since fMRI is relatively accurate and fine-grained.

This line of research is early, and very expensive to do, but the technology is improving drastically [1], and with it, so will the research.

And once you have that, once you could potentially become internally satisfied and happy, why would you risk it with brain surgery in order to have somewhat better virtual reality?

[1] MRIs are expensive, but Marie Lou-Jespen is working on a $100 MRI helmet.


Re your [1]: Mary Lou Jepsen.

https://www.maryloujepsen.com/resume?_escaped_fragment_=#! "Goal: Replace the functionality of MRI (Magnetic Resonance Imaging) with a consumer electronics wearable using novel opto-electronics to achieve comparable resolution to MRI."

(Couldn't find anything else really; https://www.engadget.com/2016/05/05/oculus-exec-mary-lou-jep... "she will focus on 'curing diseases with new display technology,' by bringing MRI machines to every doctor's office in the world."; links a TEDxYouth talk, but that doesn't discuss it.)


> The next big technology revolution is AI. Should be pretty obvious at this point.

Unless you mean strong AI, which isn't anywhere close, I don't think this AI rush has much better chances than the previous one, except for some narrow use cases.


>The next big technology revolution is AI. Should be pretty obvious at this point.

I don't think so. I think AI is just one of those fads that come up every now and then, and then get abandoned for the next one.

I'm not calling AI as in "actual working strong AI", or useful soft-AI stuff, a fad.

I'm calling "AI" as in the thing currently every VC/company is investing in and we see various helpers like Cortana and Siri based, the machine learning craze, on etc. Besides some obvious and fitting applications (like in self driving cars) it will go nowhere fast.

VR has already reached this stage, without ever making a large dent in the market, even temporarily.

Note that this is the second time AI has been promised and will get nowhere (the first was back in the '80s, with the "AI winter").


Google.com is today's AI. No reason to believe it will stop improving.


Stop? I can't wait for it to start!


And yet their search result quality subjectively seems stable at best, or slightly declining. If that's AI, then it's certainly not as smart as the spammers and black-hat SEOs. On the mobile side, the latest version of Google Assistant misunderstands what I want so often as to be worse than useless.

Still hoping to see some improvement but not holding my breath.


Well, Google search results have been declining for some years now, as many have observed.

And there's nothing more to Google.com than a simple heuristic algorithm, which can be completely blind to context and nuance.


Any pointers to such accounts?

(I've written my own, but I'm interested in others'.)

https://ello.co/dredmorbius/post/qc4ip4m_33unj0eipgyhga



