I understand what you're saying, but it's also kind of begging the question. We can't calculate a standard deviation because temperatures are rising, but we also can't test if temperatures are unusually warm because we don't have a standard deviation?
I think the message is simple: standard deviations don't provide insight when your main concern is extreme values (max/min temperatures, in this case).
I'm no expert on these data, but your example isn't a fair analog. Let's go back to the actual problem.
From a casual glance at the graphs showing past hot years (each one shows 5), NOAA appears to have records for most cities going back to at least the 1950s (probably 100 years in some cases).
So, in a set of 63 values (1950-2012), which are themselves average temperatures (and thus reflect some kind of sustained pattern, not just a single hot day), the procedure picks out the 5 highest averages.
It's reasonable to expect that the sequences corresponding to these 5 highest values will be unusual when compared to the other 58. Nothing weird about that.
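To see why that's expected, here's a minimal sketch (my own, not NOAA's method) assuming the 63 annual averages are just i.i.d. normal values with no warming trend at all; the numbers (15 C baseline, 0.5 C spread) are made up purely for illustration. Even then, the top 5 sit well apart from the other 58, simply because we selected them for being the highest.

```python
import numpy as np

rng = np.random.default_rng(0)

n_years = 63      # 1950-2012, as above
n_top = 5         # the 5 "hottest years" the graphs highlight
n_trials = 10_000

excess = []
for _ in range(n_trials):
    # Hypothetical annual mean temperatures: no trend, just i.i.d.
    # normal noise around an arbitrary 15 C baseline.
    temps = rng.normal(loc=15.0, scale=0.5, size=n_years)
    temps.sort()
    top5 = temps[-n_top:]   # the 5 highest annual means
    rest = temps[:-n_top]   # the other 58 years
    # How far the top 5 sit above the rest, in units of the
    # rest's standard deviation.
    excess.append((top5.mean() - rest.mean()) / rest.std(ddof=1))

print(f"Top-5 years average about {np.mean(excess):.1f} SDs above the other 58")
```

In other words, the 5 highest of 63 values will always look "unusually warm" next to the rest, trend or no trend; the selection guarantees it.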
Of course, if they had only 10 years of data, as in your example above, or if they weren't using month-long averages, it would be a different story.