The difference is that decades ago all movies were about how good and just the US was, whereas now everything Hollywood produces tells the world that the US is the devil, along with other generic progressive slop. Is that really the image the US wants to project to the world?