
There were only 9 years between the 1998 Comdex and the iPhone release in 2007. Of the 23 years since that Comdex, we have had iPhones for 14 of them... Time perception is sometimes strange.



The 10 year span between 2000 and 2010 also had one of the largest relative uplifts in transistor density. Going from 130nm to 28nm was HUGE.

I think people forget how quickly computers (especially in efficiency!) improved from just 2000-2005 and again from 2005-2010.

Everything has leveled out a bit until recently. I think we'll see another 5-year growth spurt once stacked designs become common, around the end of 2022. Though TSV vs. EMIB is a whole other argument/ball of wax.


The 1990s felt similarly insane. In 1991 we had Hovertank 3D, in 1993 we had Doom, in 1998 we had Half-Life, and in 2000 we had Deus Ex.

Go back nine years from today and things don't feel that different. People still buy Skyrim.


AR is probably in its "hovertank" era right now, after years of interesting but largely useless demos.

Someday we're going to wonder how anyone managed to navigate or buy things in shops without realtime vision markup.


I just want a little note that reminds me of a person's name and how I know them.


That probably has more to do with the human bottleneck in games than anything else.

A big open world is good if you can manage to fill it with stuff. But procedural generation is annoying and repetitive, while handcrafting a world is time- and labor-intensive. See: Cyberpunk 2077


> Everything has leveled out a bit until recently.

You can say that again. I'm typing this on a desktop PC powered by a Xeon that is approaching 10 years old and I don't have any complaints. The cost of upgrading it over the years has been modest.

In comparison I have a Nexus 5 that is about the same age and getting modern Android on it is a pain, the processor really struggles and the physical device itself has seen much better days.


Anecdote: I'm typing this on a 2013 Xeon (Haswell E3v3), and it's acceptable for web/productivity, but the x265 encode performance is absolutely abysmal compared to newer CPUs, especially AMD's. GPU encoding is good for ephemeral streams, but good archival encoding is all CPU-bound: https://forum.doom9.org/showthread.php?p=1908148#post1908148

"""Luckily""" for me I do mostly standard-definition encodes and can get a sustained ~11FPS with my own ICC-built x265 binaries and my own custom "SD" encoder tuning. Every time I encode an HD video I wonder why I haven't replaced this system yet. Then I look at component prices and remember why.


The big jump has been in storage. HDD to SSD is 4-10x and from SSD to NVMe is 4-10x again.


Yes, I was going to say -- the increase in IOPS is off the charts. I worked at an ad-tech startup in 2010 and that was the biggest bottleneck for what we were trying to do. At that point SSDs were a thing on the desktop, but not really in the datacenter still. Too pricey.

Kids today with their insanely high IOPS! They don't know the pain!
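
If anyone wants to see the gap for themselves, here's a minimal sketch with fio on Linux (assuming it's installed; the file name, size, and runtime are arbitrary):

  # 4K random reads, the workload where NVMe leaves HDDs in the dust
  fio --name=randread --filename=/tmp/fio.test --size=1G \
      --rw=randread --bs=4k --iodepth=32 --ioengine=libaio \
      --direct=1 --runtime=30 --time_based

A spinning disk manages on the order of a hundred IOPS on that workload; a decent NVMe drive can report hundreds of thousands.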


I remember upgrading my CPU and GPU every year because of the exciting new technology being released almost constantly.


> 28nm was HUGE

... so to speak.


It's funny how much Apple dominates our memory.

It was only a couple years between the 1998 Comdex and when I got my first wifi PalmOS device with a color screen. At some point my handheld devices acquired the ability to connect directly to a cellular network, which was nice, and then became really nice with the iPhone. But I don't think that was ever a planned feature for the WebPad in the first place.

The form factor is perhaps more interesting. We did have to wait 12 years until Apple legitimized the idea of a device along these lines that doesn't fit in a pocket.


The Palm devices were great, I had one of the originals, then Samsung had a flip-phone version, and Treos followed after. I used the heck out of all of them, and even wrote some code for the platform. It was really sad that they lost their way.

"Form factor" can be a slippery notion. The genius of the iPad design, I think, is that it _couldn't_ be mistaken for an educational device. It actually looked like it belonged in a leather folio next to some important papers, not in your kids' school bag.


Isn't that the truth. I recently came across a meme that I suspect will drive this point home for many in this community. https://www.reddit.com/r/zelda/comments/mj9mim/loz_ww_decide...


The one that gets me is that ENIAC was (mostly) live in 1946 and the IBM PC first shipped in 1981.

The upshot of this is that there's more time between now and the IBM PC than there is between the IBM PC and ENIAC. (18K vacuum tubes, filled a room, and was programmed via physical rewiring, at least initially.)

> Time perception is sometimes strange.

Agreed.


I fired up one of my old laptops from 1995 or so. It was a cutting edge machine at the time, very nice. It seems completely unusable today. The tiny disk drive, for instance.


For me it's the displays. Old laptops had terrible TN panels and are painful to look at today. It makes them near useless even for basic web browsing.



