That's very nice, but speaking for myself, I will never again voluntarily own an Nvidia graphics adaptor. I've seen too many of them overheat and fail in otherwise normal circumstances, resulting in the involuntary abandonment of a laptop before its time.
At one point I had three Nvidia-equipped laptops stacked in my closet, all essentially bricked, each eventually replaced by a laptop equipped with an ATI/AMD adaptor.
I initially had nothing against Nvidia, and Dell seemed to prefer them in its offerings, but experience has forced a change in my outlook.
Meanwhile, ATI are unable to write drivers. I've actually lost work due to that, and it's not limited to parts from a particular early lead-free process (IIRC the cause of those Nvidia problems).
> This could be more to do with the cooling design of the laptop itself rather than the nvidia chip.
Yes, perhaps, but:
1. Nvidia would need to unambiguously specify any extraordinary cooling requirements to avoid difficulties in the field. Apparently they didn't do this.
2. The graphics adaptors of others, in the same laptops, had no similar problems.
Someone may argue that ... oh, wait, you do make this argument:
> High performance graphics chips are going to get hot.
Yes, but if they reliably melt down, that fact negates their impressive specifications.
I'm imagining an advertising campaign in a parallel universe where everyone has to tell the truth -- "Nvidia -- the hottest graphics processors in existence!" Well, yes, but ...
Thermal output management is not new. If the chip overheats, it can always clock itself down or power down some cores. GPUs, being highly parallel, should make this even easier than it is with CPUs.
Not being able to do this transparently should be considered a serious design flaw.
Sure, but when your chip tends to overheat, you should design in mechanisms to reduce its thermal output when the need arises. CPUs have had similar mechanisms built in since the early 2000s (remember the videos of AMD CPUs melting down seconds after the heatsink was removed?).
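For the curious, here's roughly what such a throttling loop looks like in software. This is only a minimal sketch: the thermal zone path follows the standard Linux sysfs convention, but set_gpu_clock_mhz(), the clock ladder, and the temperature thresholds are all hypothetical placeholders; on real hardware this logic lives in the driver or firmware, not in a userspace script.

    # Minimal sketch of a software thermal-throttling loop.
    # Assumptions: a Linux sysfs thermal zone is readable, and
    # set_gpu_clock_mhz() stands in for whatever the driver/firmware
    # actually uses to program the clock generator.
    import time

    THERMAL_ZONE = "/sys/class/thermal/thermal_zone0/temp"  # millidegrees C
    THROTTLE_TEMP_C = 90   # back off above this
    RESTORE_TEMP_C = 75    # restore clocks below this
    CLOCK_STEPS_MHZ = [1200, 1000, 800, 600, 400]  # hypothetical clock ladder

    def read_temp_c() -> float:
        """Read the current temperature from the sysfs thermal zone."""
        with open(THERMAL_ZONE) as f:
            return int(f.read().strip()) / 1000.0

    def set_gpu_clock_mhz(mhz: int) -> None:
        """Placeholder: a real driver would program the clock hardware here."""
        print(f"setting GPU clock to {mhz} MHz")

    def throttle_loop() -> None:
        step = 0  # index into CLOCK_STEPS_MHZ; 0 = full speed
        while True:
            temp = read_temp_c()
            if temp > THROTTLE_TEMP_C and step < len(CLOCK_STEPS_MHZ) - 1:
                step += 1  # too hot: drop down one clock step
                set_gpu_clock_mhz(CLOCK_STEPS_MHZ[step])
            elif temp < RESTORE_TEMP_C and step > 0:
                step -= 1  # cooled off: restore one clock step
                set_gpu_clock_mhz(CLOCK_STEPS_MHZ[step])
            time.sleep(1)

    if __name__ == "__main__":
        throttle_loop()

The two separate thresholds are just hysteresis, so the clock doesn't oscillate when the temperature sits right at the limit.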
Unless you absolutely need high performance GPU/vector acceleration, I'd suggest going with Intel GPUs. I wouldn't, at this time, support companies that, to put it mildly, can't cooperate with the rest of the Linux ecosystem and that try to work against it when possible.
They make powerful GPUs, but, unless they can reliably perform their functions with the software I want to run, their products are worthless.
I think the problem may be the laptop brand. I've had a couple of Dell laptops and they both got alarmingly hot to the touch. I've now got a Thinkpad w520 (Quad core i7, Nvidia quadro chip) and I can sit with it on my lap comfortably. I don't know how Lenovo do it, but their thermal management is amazing.