Germany hardly waged war against the US: it never toppled their government, occupied their territory, bombed their cities, etc.
Japan did a few of those, though after much provocation to achieve exactly that and give an excuse to the US to sell the war to its public.
The US intervened in the European war (and not even decisively so, that's another myth) to ensure its improved role in the post-war environment, as the old European colonial powers were weakened by the war.
What exactly do you mean? It's just a description of Hitler's declaration of war on the US. It did not result in total war being waged on US soil, and it was largely a strategic blunder on Hitler's part.
Note the count includes actions by Japan against the US. Compare it to the European theatre of war.
> Their lack of ‘total war’ on the US is mostly the consequence of a lack of resources / more pressing concerns.
But this is irrelevant to this discussion. The fact remains that the US didn't suffer total war waged by Germany on its soil during WW2, and this might explain the comment which sparked this thread:
> It wasn't very many years earlier that Germany waged total war against the US but a grudge hasn't held out there...
It's easier to hold a grudge when you have millions dead and bombing campaigns destroying your cities, don't you think? Arguing formalities such as whether Germany and the US were technically at war seems pointless in this context, doesn't it?
No, I think you are vastly underplaying the extent to which Germany was America’s enemy. Don’t forget that Jews had escaped Germany to the US, especially prominent scientists like Einstein. The US didn’t ‘hold a grudge’ because the Cold War power struggles didn’t allow for it. West Germany needed to be an ally.
Oh, I definitely agree with this! This attitude also helped shape the narrative of WW2, especially of the Eastern Front [1], by former Wehrmacht officers in the employ of the US Army Historical Division. The Cold War made friends of former enemies, and let them tell their story in an unprecedented way -- an instance of history being told by the losers.