I'm so sad Larrabee or similar things never took off. Sure, it might not have benchmarked well against contemporary graphics cards, but I think those matrices of x86 cores could have been put to great use for cool things not necessarily related to graphics.
Intel launched Larrabee as Xeon Phi for non-graphics purposes. Turns out it wasn't especially good at those either. You can still pick one up on eBay today for not very much.
The novelty of sshing into a PCI card is nice, though. I remember trying to use them on an HPC cluster: all the convenience of wrangling GPUs, but at a fraction of the performance.
Probably not aided by the fact that conventional Xeon core counts were sneaking up on them—not quite caught up, but anybody could see the trajectory—and offered a much more familiar environment.
Yes, I agree. Still unfortunate. I think the concept was very promising. But Intel had no appetite for burning money on it to see where it would go in the long run.
That's where we have to agree to (potentially) disagree. I lament that these or similar designs didn't last longer in the market, so people could learn how to harness them.
Imagine, for instance, hard real-time tasks, each task running on its own dedicated core.
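Roughly what I mean, as a minimal sketch (Linux/glibc, core count and the trivial workload are just placeholders): spawn one worker per core and pin each with its own affinity mask, so every task effectively owns a core.

```c
/* Sketch: one worker thread pinned per online core.
 * Build with: gcc -pthread pin.c -o pin */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

static void *worker(void *arg) {
    long core = (long)arg;
    /* A real-time task loop would live here; we just report our CPU. */
    printf("task for core %ld running on CPU %d\n", core, sched_getcpu());
    return NULL;
}

int main(void) {
    long ncores = sysconf(_SC_NPROCESSORS_ONLN);
    pthread_t *threads = calloc(ncores, sizeof(pthread_t));

    for (long core = 0; core < ncores; core++) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);               /* restrict this thread to one core */

        pthread_attr_t attr;
        pthread_attr_init(&attr);
        pthread_attr_setaffinity_np(&attr, sizeof(set), &set);

        pthread_create(&threads[core], &attr, worker, (void *)core);
        pthread_attr_destroy(&attr);
    }

    for (long core = 0; core < ncores; core++)
        pthread_join(threads[core], NULL);
    free(threads);
    return 0;
}
```

On a part with dozens of small x86 cores you could dedicate whole cores like this and leave the rest of the system untouched.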
I think Intel should have made more of an effort to get cheap Larrabee dev boards onto the market; they could have used chips that didn't run at full speed or had too many broken cores to sell at full price.
Larrabee was mostly x86 cores, but it did have sampling/texturing hardware, because those particular parts of the 3D pipeline are far more efficient in dedicated hardware.