Hacker News
Dell promises 'every PC is going to be an AI PC' whether you like it or not (theregister.com)
30 points by rntn 11 months ago | 27 comments



AI is the new multimedia (yes, I'm old, I remember when it was all the hype in the '90s) - the word has already lost any meaning for consumer products anyway.

Personally I'm a little scared, because it will probably end with proprietary closed-source implementations in hardware and software that relentlessly send all of my interactions over the internet to the vendor anyway.

On the other hand, hopefully someone manages to exploit the hardware.

Having fast matrix multiplication available would be great, but I guess nobody will bother.
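
For what it's worth, the primitive itself is trivial to express - the open question is whether vendors expose the NPU for it. A minimal CPU-only sketch in numpy, just to show what "fast matrix multiplication" means here:

    import time
    import numpy as np

    n = 2048
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)

    t0 = time.perf_counter()
    c = a @ b                                    # the op NPUs are built to accelerate
    dt = time.perf_counter() - t0
    print(f"{2 * n**3 / dt / 1e9:.1f} GFLOP/s")  # 2*n^3 FLOPs per n x n matmul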


>it will probably end with proprietary closed-source implementations in hardware and software that relentlessly send all of my interactions over the internet to the vendor anyway.

Ain't the point of this to have the compute capabilities at the edge, so data doesn't have to leave the device?

Also, how would that transfer even work? Would bits magically be sent without anyone seeing them?


More like sending my prompts off to "improve the user experience" - and yes, probably not in hardware, but who knows.


Reminds me of Dell Inspirons with dedicated Home Media Centre buttons. I had one which even had a tiny remote.

I think the “every PC” bit is the obvious headline-grabbing statement from the VP. At the business end, Dell has made machines for every type of user, including the Linux-only, Ubuntu-preinstalled XPS line, which was highly recommended when it came out. This is just them covering the AI consumer base.


Fast matrix multiplication is also useful for physics and other simulations, so I'm sure someone will build SDKs or something that open up such AI hardware beyond AI workloads.


Dell wishes!

I imagine (or hope/wish) that there will remain a market for machines that don't include this. I don't see a day when I'll be using AI systems on my own machines, and I don't want to pay for hardware support that I'll never use.


I suspect Dell isn't going to be all that correct. No way that Microsoft is going to give up server-side processing and all the sweet sweet data it entails.

Actually, before finishing this comment I changed my mind. Windows will collect and send all that data to Microsoft regardless of whether processing runs client-side or server-side. Client-side shifts the electricity and cloud bill to the user, so that's a big win. It's also hardware that can become obsolete and need replacing, which means future sales.

Ignore me, I have no idea what to think.


Most PCs already are. Intel CPUs now include a chip for offloading minor "AI" tasks like noise cancellation, and Ryzen AI is here as well.

AI doesn't necessarily mean generative, either. There are tons of places where a little specialized accelerator chip would have been used in decades past that maybe now we can abstract onto a little GAN+ chip, for tasks that might be costlier on other hardware.

Think more "assistive technologies that stabilize mouse input or correct typing for people with motor-function disabilities" than "the spyware that copied me". ...Hopefully.
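
If you want to poke at what's shipping today, the usual route seems to be ONNX Runtime with an execution-provider list that falls back to CPU. A rough sketch - "model.onnx" is a stand-in, and the NPU providers only load if the vendor packages are installed:

    import onnxruntime as ort

    # providers are tried left to right, so inference lands on the
    # NPU only if its execution provider is actually available
    sess = ort.InferenceSession(
        "model.onnx",  # hypothetical model file
        providers=[
            "VitisAIExecutionProvider",   # AMD Ryzen AI NPU
            "OpenVINOExecutionProvider",  # Intel NPU/iGPU
            "CPUExecutionProvider",       # always-available fallback
        ],
    )
    print(sess.get_providers())  # which providers actually loaded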


By the current definition of "AI" (which is not actually AI), any search algorithm that uses a database is "AI". The whole thing is just dumb now.


>I imagine (or hope/wish) that there will remain a market for machines that don't include this. I don't see a day when I'll be using AI systems on my own machines, and I don't want to pay for hardware support that I'll never use.

I don't understand, like... why bother?

You check which instructions or "tiles" a CPU has in order to decide whether it has too much specialized stuff that you don't need?


It's what I said in my comment: if I'm not going to use it, I don't want to have to pay for it.

If it exists but doesn't cost more money, doesn't increase complexity, energy use, etc., then I don't care one way or another.


Well, since this is manufacturing at huge scale, there's a possibility that removing it could actually increase cost, who knows?


Monetary cost, sure, but what about the other costs?

Regardless, if that's the way things are going, then obviously I'll deal with it. In the meantime, though, this will certainly be a factor in my purchasing decisions as long as I continue to have a choice.


You're paying for all the silicon on the board either way. It's essentially impossible to buy modern components without paying for some amount of dead silicon you'll never use, even if it's just the debug circuits.



That's only for Apple products, which is fine because I don't have to use Apple products. I'm not objecting to machines that have "AI hardware" in them. There are people who want that. I'm objecting to the idea that all machines will have such hardware in them.

To be clear, I don't think that's going to be the case, which is why I started my comment with "Dell wishes".


Intel and AMD CPUs both now feature NPUs as well: https://en.wikipedia.org/wiki/Meteor_Lake#NPU https://en.wikipedia.org/wiki/List_of_AMD_Ryzen_processors#P...

This is a feature that will be in every new PC once these new CPUs become the baseline.

It's a useful feature for most people. Being able to search my photo library on my Mac for "car" to find that photo of a car I found parked weirdly last week is something I do all the time, and not having to upload all my photos to Google in order to enable it is actually a good thing.
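
For anyone curious what that looks like under the hood, here's a rough local-only sketch using an open-source CLIP model via sentence-transformers - not Apple's actual pipeline, and the photo folder is made up:

    import glob
    from PIL import Image
    from sentence_transformers import SentenceTransformer, util

    paths = glob.glob("Photos/*.jpg")              # hypothetical photo folder
    model = SentenceTransformer("clip-ViT-B-32")   # downloads once, runs locally
    img_emb = model.encode([Image.open(p) for p in paths])
    txt_emb = model.encode(["a photo of a car"])
    best = util.cos_sim(txt_emb, img_emb)[0].argsort(descending=True)
    for i in best[:5]:                             # top matches, no cloud involved
        print(paths[int(i)])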


But Microsoft and Intel (and Dell) may already be in cahoots on this. So, eventually you may have to give up on those as well.


Could AI hardware be emulated, time-shared, on some Xilinx programmable chip?


I wonder if people felt this way about integrated GPUs. I wouldn’t be surprised to see integrated AI hardware become standard in the same way.


Intel's early integrated GPUs were practically free: their chipsets had a minimum size necessary to accommodate all the various IO PHYs around the perimeter, and they had enough free space in the interior to add a minimal GPU without much increase in total die size. After proving to be a lucrative way to kill off the competition's low-end GPUs, iGPUs started moving into the CPU package and later on-die to offer better performance.

Nowadays, we're running up against barriers from the end of Dennard scaling, which prevent chips from operating at 100% utilization and all but force us to start including more specialized units that will only be used for some tasks.
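
The back-of-envelope version, for anyone who hasn't seen it (classic Dennard scaling, with the usual simplifications):

    P ~ C * V^2 * f                       (dynamic power per transistor)
    shrink lengths by k:  C -> C/k,  V -> V/k,  f -> f*k
    power per transistor: (1/k)(1/k^2)(k) = 1/k^2
    area per transistor:  1/k^2  =>  power density stays constant
    with V stuck (post ~2005): power per transistor stays ~constant,
    area still shrinks by 1/k^2  =>  density grows ~k^2, so parts of
    the die have to sit dark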


HP Offers 'That Cloud Thing Everyone Is Talking About'

https://youtube.com/watch?v=9ntPxdWAWq8


Dell, my vacuum cleaner supposedly has AI in it and it still sucks the floor the same as the one before it.


"Dell promises every PC will have graphics/FPU/USB whether you like it or not"

It's not unreasonable to expect AI hardware processing units to be standard in future CPUs or chipsets.


AI Bubble Tracker++


I'm an accelerationist when it comes to enshittification - there will be a phoenix rising out of all this shit.


And it will be taking an even bigger dump on us all. Tech isn't the issue. Tech is neither good nor bad; it all comes down to who has access to and control of the tech, and that's always rich people, and that's always good for rich people and usually no one else.



