> This could be more to do with the cooling design of the laptop itself rather than the Nvidia chip.
Yes, perhaps, but:
1. Nvidia would need to unambiguously specify extraordinary cooling requirements, to avoid difficulties in the field. Apparently they didn't do this.
2. Other vendors' graphics adaptors, in the same laptops, had no similar problems.
Someone may argue that ... oh, wait, you do make this argument:
> High performance graphics chips are going to get hot.
Yes, but if they reliably melt down, that fact negates their impressive specifications.
I'm imagining an advertising campaign in a parallel universe where everyone has to tell the truth -- "Nvidia -- the hottest graphics processors in existence!" Well, yes, but ...
Thermal output management is not new. If the chip overheats, it can clock itself down or power down some cores. GPUs, being highly parallel, should make this even easier than it is with CPUs (a sketch of such a throttling loop follows below).
Not being able to do this transparently should be considered a serious design flaw.
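For anyone who hasn't seen one, a throttling loop really can be this simple. Here is a minimal sketch in C with hysteresis -- all names, thresholds, and the sensor trace are hypothetical stand-ins; real GPUs implement this in firmware or the driver, not in user code:

```c
/* A minimal sketch of a thermal-throttling control loop with
 * hysteresis. All names and thresholds are hypothetical -- real
 * GPUs do this in firmware or the driver, not in user code. */
#include <stdio.h>

#define TEMP_LIMIT_C    95   /* start throttling above this */
#define TEMP_RECOVER_C  85   /* restore clocks only below this (hysteresis) */
#define CLOCK_MAX_MHZ  600
#define CLOCK_MIN_MHZ  200
#define CLOCK_STEP_MHZ  50

/* Stub standing in for a temperature sensor: replays a canned
 * heat-up / cool-down trace so the example is self-contained. */
static int read_temp_c(void)
{
    static const int trace[] = { 80, 92, 97, 99, 96, 88, 84 };
    static unsigned i = 0;
    return trace[i < 6 ? i++ : 6];
}

/* Stub standing in for the hardware clock control interface. */
static void set_clock_mhz(int mhz)
{
    printf("clock -> %d MHz\n", mhz);
}

static int clock_mhz = CLOCK_MAX_MHZ;

/* Called periodically (e.g. every few milliseconds by firmware):
 * back the clock off while hot, restore it once comfortably cool. */
static void thermal_tick(void)
{
    int temp = read_temp_c();

    if (temp > TEMP_LIMIT_C && clock_mhz > CLOCK_MIN_MHZ) {
        clock_mhz -= CLOCK_STEP_MHZ;
        set_clock_mhz(clock_mhz);
    } else if (temp < TEMP_RECOVER_C && clock_mhz < CLOCK_MAX_MHZ) {
        clock_mhz += CLOCK_STEP_MHZ;
        set_clock_mhz(clock_mhz);
    }
}

int main(void)
{
    for (int step = 0; step < 7; step++)
        thermal_tick();
    return 0;
}
```

The two thresholds are the point: throttling kicks in above one temperature but clocks are only restored below a lower one, so the chip doesn't oscillate around a single limit. Transparency to the user costs nothing here; it's a few comparisons per tick.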
Sure, but if your chip tends to overheat, you should design in mechanisms that reduce thermal output when the need arises. CPUs have had similar mechanisms built in since the early 2000s (remember the videos of AMD CPUs melting down seconds after the heatsink was removed?).