
However, it does fit the playbook of moving ever further from general-purpose computers toward computing appliances.

In the beginning there was the motherboard and a CPU. Then, before the homogeneous PC era, we had dedicated chips that took care of certain operations. The SID chip in the C64. The blitter chip in the Amiga (can't remember its name, sorry). Even the x87 math coprocessor in the 386/486 age!

With the advent of the PC and the megahertz wars, dedicated peripheral chips became less common - except in the SoC world. Where the x86 world went with raw processing power, the embedded world had to find ways to fit specialised chips on the board.

My experience is mostly centred around crypto accelerators, but I know from very painful experience that all Maemo devices had on-board DSP units to handle some of the audio decoding and pretty much all of the video processing. So the pendulum swings: the CPU does everything, then peripheral devices take over specific high-intensity jobs. The most commonly used ones get integrated into CPUs, making entire classes of chips irrelevant ... until the next CPU-intensive thing comes up, and the main processor is again too slow.
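To make the "integration" half of that pendulum concrete: AES encryption used to be a classic job for a dedicated crypto accelerator chip, and today it ships as an instruction set extension inside the CPU itself. A minimal sketch (assuming GCC or Clang on x86; the cpuid.h header and its bit_AES constant are compiler-provided) that probes for AES-NI support:

    /* Probe CPUID leaf 1 for the AES-NI feature bit - an example of a
     * former accelerator-chip workload absorbed into the CPU. */
    #include <cpuid.h>
    #include <stdio.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            puts("CPUID leaf 1 not supported");
            return 1;
        }
        /* bit_AES is bit 25 of ECX, signalling the AES-NI instructions. */
        if (ecx & bit_AES)
            puts("AES-NI present: crypto offload lives inside the CPU");
        else
            puts("No AES-NI: software fallback or an external accelerator");
        return 0;
    }

On an ARM SoC of the Maemo era the equivalent check would instead go through the vendor's DSP/accelerator driver stack rather than a CPU feature flag.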

Apple is banking on its ability to both predict and dictate the direction of near-future computing needs. I expect the A7/A10-era boards to ship with all kinds of integrated support chips to handle the heavier loads.

As long as those predictions are correct, all is well. Any bets on the next CPU burner that will require a dedicated ASIC to preserve even a semblance of battery longevity?



