Did adding more channels start improving performance in a significant way sometime in the past decade?

Last time I checked, single vs dual channel was like a 5-10% performance difference, mostly useful for integrated graphics (and even then latency was the bigger problem)...




It's highly workload dependent. Many things are cache friendly and won't care. An integrated GPU, on the other hand, definitely shows improvements from increased bandwidth and more channels.
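To make the workload-dependence point concrete, here's a minimal C sketch (the array sizes, the scalar, and the rep counts are all illustrative choices, not from any particular benchmark): a STREAM-style triad over arrays far larger than cache is limited by memory bandwidth, so it benefits from more channels, while the same loop over a cache-resident working set barely touches DRAM:

    /* Sketch: effective bandwidth of a triad over a working set much
       larger than cache vs one that fits in L2. Sizes are illustrative.
       Compile with: cc -O2 triad.c */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* GB/s of a[i] = b[i] + s*c[i], counting 3 doubles moved per element. */
    static double triad_gbps(double *a, const double *b, const double *c,
                             size_t n, int reps) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int r = 0; r < reps; r++)
            for (size_t i = 0; i < n; i++)
                a[i] = b[i] + (double)(r + 1) * c[i]; /* vary scalar so reps aren't folded away */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        return (3.0 * sizeof(double) * n * reps) / secs / 1e9;
    }

    int main(void) {
        size_t big = 1u << 26;   /* 64M doubles x 3 arrays = 1.5 GiB: DRAM bound */
        size_t small = 1u << 12; /* 4K doubles x 3 arrays = 96 KiB: cache bound */
        double *a = malloc(big * sizeof *a);
        double *b = malloc(big * sizeof *b);
        double *c = malloc(big * sizeof *c);
        if (!a || !b || !c) return 1;
        for (size_t i = 0; i < big; i++) { b[i] = 1.0; c[i] = 2.0; }
        printf("large set: %5.1f GB/s\n", triad_gbps(a, b, c, big, 10));
        printf("small set: %5.1f GB/s\n",
               triad_gbps(a, b, c, small, (int)(10 * (big / small))));
        free(a); free(b); free(c);
        return 0;
    }

On a typical dual-channel desktop you'd expect the large-set number to land near the platform's channel bandwidth while the small-set number comes out many times higher, which is exactly why cache-friendly code doesn't notice extra channels.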

Similarly, adding enough cores makes the extra channels helpful; that's why servers often have 4x the memory channels of desktops (8 channels vs 2). Even with 8 channels, the performance scaling from using half the cores to all the cores is often poor, at least on the top-spec chips. A sketch of how to measure that is below.
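A hedged way to see the core-scaling effect (OpenMP here; the 1.5 GiB working set and power-of-two thread counts are illustrative): run the same triad with 1, 2, 4, ... threads and watch throughput plateau once the memory channels saturate, well before the core count runs out:

    /* Sketch: triad bandwidth vs thread count. Once the channels are
       saturated, adding threads stops helping.
       Compile with: cc -O2 -fopenmp scale.c */
    #include <omp.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        size_t n = 1u << 26; /* 1.5 GiB across 3 arrays: DRAM bound */
        double *a = malloc(n * sizeof *a);
        double *b = malloc(n * sizeof *b);
        double *c = malloc(n * sizeof *c);
        if (!a || !b || !c) return 1;
        for (size_t i = 0; i < n; i++) { b[i] = 1.0; c[i] = 2.0; }
        for (int t = 1; t <= omp_get_max_threads(); t *= 2) {
            double t0 = omp_get_wtime();
            #pragma omp parallel for num_threads(t)
            for (size_t i = 0; i < n; i++)
                a[i] = b[i] + 3.0 * c[i];
            double gbps = 3.0 * sizeof(double) * n / (omp_get_wtime() - t0) / 1e9;
            printf("%2d threads: %5.1f GB/s\n", t, gbps);
        }
        free(a); free(b); free(c);
        return 0;
    }

On an 8-channel server part you'd expect the plateau to arrive at a much higher thread count than on a dual-channel desktop, which is the scaling gap described above.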

AMD seems to have realized this and their next gen will have 24 channels instead of 8.



