
The performance difference between i7s of 10 years ago versus today is more than 2x per core, and the number of cores has also more than doubled in that time period.

That's over a 4x performance difference in 10 years. It's not quite the good old days of Moore's law, but it's still an exponential improvement.



That sounds like an S-curve, not exponential improvement. You can make any monotonically increasing function look exponential if you only have two points (f(n+x)/f(n)) and you get to pick both n and x. "It doubled over the past fifty years!"

If looking at all the points shows that the time it takes to double is way longer than it used to be, you're probably near the top of the S.
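To make this concrete, here's a toy sketch (my own illustrative logistic curve with made-up parameters, not real CPU data): two cherry-picked points show "growth," but the time to double keeps stretching as you climb the curve.

```python
import math

def logistic(t, L=100.0, k=0.15, t0=40.0):
    """An S-curve: looks exponential early on, then flattens near L."""
    return L / (1.0 + math.exp(-k * (t - t0)))

# Pick two convenient points and the ratio looks like healthy growth:
ratio = logistic(30) / logistic(20)  # well over 2x "in a decade"

def doubling_time(t, step=0.5, horizon=500.0):
    """How long from time t until the curve doubles its current value."""
    target = 2.0 * logistic(t)
    x = t
    while logistic(x) < target:
        x += step
        if x - t > horizon:
            return float("inf")  # never doubles again -- top of the S
    return x - t

# Doubling time grows as you move up the curve, and eventually diverges.
early = doubling_time(20)
later = doubling_time(35)
never = doubling_time(50)
```

With these parameters, `early` is about 5 time units, `later` about 9, and `never` is infinite because the curve can't double past its ceiling.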

Which, ok, we know we're headed for an S-curve eventually. There are physical limits to what we can do in silicon, right?

It's just that the CISC/Intel S-curve tops out lower than the ARM/M1 S-curve. Intel got used to leaving a margin of performance sitting on the table because next year's chip would be faster anyway. They got lazy. As process gains have slowed, those margins have started to add up.



