I imagine (or hope/wish) that there will remain a market for machines that don't include this. I don't see a day when I'll be using AI systems on my own machines, and I don't want to pay for hardware support that I'll never use.
I suspect Dell isn't going to be all that correct. No way that Microsoft is going to give up server-side processing and all the sweet sweet data it entails.
Actually, before finishing this comment I changed my mind. Windows will collect and send all that data to Microsoft regardless of whether it runs client-side or server-side. Client-side processing shifts the electricity and cloud bill to the user, so that's a big win for Microsoft. It's also hardware that can become obsolete and need to be replaced, which means future sales.
Most PCs already are. Intel CPUs already include a small accelerator (the GNA) for offloading minor "AI" tasks like noise cancellation, and Ryzen AI is already here as well.
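For what it's worth, applications don't usually poke at those blocks directly; something like ONNX Runtime picks whatever execution provider the machine exposes and falls back to the CPU. A rough sketch of that pattern below, where the model file, input shape, and provider priority list are all illustrative assumptions, not any vendor's actual noise-cancellation pipeline:

    # Sketch: offload a small model (e.g. a noise-suppression net) to whatever
    # accelerator is available via ONNX Runtime execution providers.
    # "denoiser.onnx" and the frame shape are made-up placeholders.
    import numpy as np
    import onnxruntime as ort

    preferred = [
        "OpenVINOExecutionProvider",  # Intel accelerators (if that build is installed)
        "DmlExecutionProvider",       # DirectML on Windows
        "CPUExecutionProvider",       # always available fallback
    ]
    providers = [p for p in preferred if p in ort.get_available_providers()]

    session = ort.InferenceSession("denoiser.onnx", providers=providers)
    frame = np.zeros((1, 480), dtype=np.float32)   # dummy 10 ms frame at 48 kHz
    input_name = session.get_inputs()[0].name
    denoised = session.run(None, {input_name: frame})[0]
    print("Ran on:", session.get_providers()[0])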
AI doesn't necessarily mean generative. There are tons of places where a little specialized accelerator chipset would have been used in decades past that maybe we can now abstract onto a little GAN+ chip, work that would be costlier to handle on other hardware.
Think more "assistive technologies to help stabilize mice or correct typing in people with motor function disabilities" than "the spyware that copied me". ...Hopefully.
>I imagine (or hope/wish) that there will remain a market for machines that don't include this. I don't see a day when I'll be using AI systems on my own machines, and I don't want to pay for hardware support that I'll never use.
I don't understand, like... why bother?
Do you check which CPU instructions or which "tiles" your CPU has in order to decide whether it has too much specialized stuff that you don't need?
Monetary cost, sure, but what about the other costs?
Regardless, if that's the way things are going, then obviously I'll deal with it. In the meantime, though, this will certainly be a factor in my purchasing decisions as long as I continue to have a choice.
You're paying for all the silicon on the board either way. It's essentially impossible to buy modern components without paying for some amount of dead silicon you'll never use, even if it's just the debug circuits.
That's only for Apple products, which is fine because I don't have to use Apple products. I'm not objecting to machines that have "AI hardware" in them. There are people who want that. I'm objecting to the idea that all machines will have such hardware in them.
To be clear, I don't think that's going to be the case, which is why I started my comment with "Dell wishes".
This is a feature that will be in every new PC once these new CPUs become the baseline.
It's a useful feature for most people. Being able to search my photo library on my Mac for "car" to find that photo of a car I found parked weirdly last week is something I do all the time, and not having to upload all my photos to Google in order to enable it is actually a good thing.
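For a sense of how that kind of on-device search can work without any cloud round-trip, here's a rough sketch using an open CLIP-style model; the model name, library choice, and photo path are my assumptions, not how Apple's Photos actually implements it:

    # Sketch: rank local photos by similarity to a text query, entirely on-device.
    from pathlib import Path

    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    def search_photos(query: str, photo_dir: str, top_k: int = 5):
        paths = sorted(Path(photo_dir).expanduser().glob("*.jpg"))
        images = [Image.open(p).convert("RGB") for p in paths]
        inputs = processor(text=[query], images=images,
                           return_tensors="pt", padding=True)
        with torch.no_grad():
            outputs = model(**inputs)
        scores = outputs.logits_per_text[0]          # query vs. each image
        best = scores.topk(min(top_k, len(paths)))
        return [(str(paths[int(i)]), float(scores[int(i)])) for i in best.indices]

    print(search_photos("car", "~/Pictures"))        # nothing leaves the machine

Running the model is exactly the kind of workload these NPU blocks are meant to absorb instead of the CPU or battery-hungry GPU.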
Intel's early integrated GPUs were practically free: their chipsets had a minimum size necessary to accommodate all the various IO PHYs around the perimeter, and they had enough free space in the interior to add a minimal GPU without much increase in total die size. After proving to be a lucrative way to kill off the competition's low-end GPUs, iGPUs started moving into the CPU package and later on-die to offer better performance.
Nowadays, we're running up against barriers from the end of Dennard scaling, which prevent chips from operating at 100% utilization and all but force us to start including more specialized units that will only be used for some tasks.