Civil War’s Alex Garland makes it clear that his movie actually has nothing to say about America
Garland argues that “left and right are ideological arguments about how to run a state” and nothing more

From the very first trailer, it seemed like director Alex Garland’s Civil War would be a scathing indictment of something, whether the American Empire in general or more specific concerns like the rising threat of fascism or people who have the privilege to “stay out of politics,” even beyond the United States’ borders, but it all seemed a bit murky. The weirdness of the world Garland created for the film didn’t help: California and Texas are allied against (and successfully invading) almost the entire Midwest and East Coast in a new civil war, even though that doesn’t make a damn lick of sense. Until recently, though, all Garland had really said was that the logistics of the war aren’t really the point, and that the film is more about the importance of journalism (Civil War centers on Kirsten Dunst as a reporter documenting the horrors of the war).
That made a fair amount of sense, but at a recent South By Southwest panel (via The Hollywood Reporter), Garland felt the need to clarify the film’s politics by saying that it basically doesn’t have any, at least not in terms of any real-world specifics. For starters, Garland explained that Civil War isn’t really about America, because America’s problems can happen (and have happened) all over the world, despite what he refers to as America’s assumption that it is “immune to some kinds of problems.”