Is the search for word-of-mouth consensus ruining the way we talk about movies?
As long as movies have existed, predicting what movie audiences will like has been something of a fool’s errand. The best most people can do is tell you what movie audiences like right now. This month, we can say for sure that they like Gravity. Look no further than the metrics for determining audience approval: second-weekend staying power and CinemaScore grades.
Gravity dropped less than 23 percent in its second weekend, an almost unheard-of figure for a splashy studio movie that opened to more than $55 million the weekend before. For a big-budget, widely advertised film like this, a drop of 50 percent would be routine; a drop of 40 percent would be encouraging. Twenty-three percent is practically a divine miracle. Plus, to seal the deal, Gravity received a CinemaScore grade of A- from paying audiences. It’s settled, then. Everyone likes Gravity! It’s been statistically proven.
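The arithmetic behind those drop figures is simple: the percentage decline from one weekend's reported gross to the next. Here is a minimal sketch in Python, using round, illustrative figures consistent with the numbers above rather than exact studio-reported grosses:

```python
def weekend_drop(first_weekend_millions: float, second_weekend_millions: float) -> float:
    """Percentage decline from one weekend's gross to the next."""
    return (first_weekend_millions - second_weekend_millions) / first_weekend_millions * 100

# Illustrative figures only: a $55.8M opening followed by a ~$43M second
# weekend works out to the sub-23-percent drop described above.
print(round(weekend_drop(55.8, 43.1), 1))  # 22.8
```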
But hold on. A highly unscientific Twitter search of the words “Gravity” and “boring” yields endless tweets of dissatisfaction from paying customers, with very few of them talking about how boring the Newtonian scientific phenomenon is. And another recent movie received an A- grade from audiences: Battle Of The Year, which somehow failed to parlay this apparently widespread satisfaction into a phoenix-like rise from the box-office ashes in its second weekend. In fact, it fell 56 percent.
A 56 percent drop would normally be chalked up to good old-fashioned word-of-mouth—except that both second-weekend drops and CinemaScore grades are supposed to be quantifiable translations of that very concept. The latter has sold itself as the go-to source for this information, billing itself as the “industry leader in measuring movie appeal among theatre audiences.” The mysterious Vegas-based firm polls opening-night audiences, asks them for grades in order to gauge reactions to mass-appeal movies, and reports the averages to… someone. A few current movies are listed on its website, but for the most part the grades appear in articles about box office, presumably sent as press releases to the authors of those articles. The lack of a clear database makes these findings difficult to study, but years of perusing casually reported CinemaScore data confirm this much: For long stretches, few films would ever score below a C+, which meant that anything below a B or so would be considered cause for alarm—a rubric that remains part of word-of-mouth conventional wisdom today.
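CinemaScore keeps its methodology to itself, but the averaging is presumably report-card math: map each ballot's letter grade to a number, take the mean, and map the result back to a letter. The sketch below works under that assumption; the scale and the sample ballots are invented for illustration, not CinemaScore's actual formula.

```python
# Hypothetical GPA-style scale; CinemaScore's real mapping is not public.
GRADES = ["F", "D-", "D", "D+", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+"]

def average_grade(ballots: list[str]) -> str:
    """Average a batch of letter-grade ballots and round back to a letter."""
    scores = [GRADES.index(ballot) for ballot in ballots]
    return GRADES[round(sum(scores) / len(scores))]

# Invented ballots from a sharply divided opening-night crowd.
print(average_grade(["A+", "A+", "A", "C", "D", "F"]))  # prints "B-"
```

Note how a sharply split crowd washes out to a mild-looking B-, which is one reason a single averaged letter makes a poor stand-in for word of mouth.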
Anecdotally, it seems that audiences have become stricter graders in recent years, less apt to reward a movie with a gentleman’s B and perhaps realizing that doling out a C to a nice young man like Justin Timberlake will not, in fact, end his career. A quick look at the current CinemaScore site reveals that a diverse group of movies—including Runner Runner, Don Jon, The Family, and Getaway—have all scored a C+ or lower, with several more movies hitting the once-rare B-. The rarest CinemaScore grade, the F, has also turned up more often: Three of the eight movies ever to receive an F came out in 2012. (Incidentally, apart from the bad-twist horror movies that tend to garner such vitriol, an F should generally be considered a badge of honor; non-horror recipients include Killing Them Softly, The Box, and Bug, all worth seeing.) Even Gravity, with its amazing staying power, “only” earned an A-, which is actually on the low end of crowd-pleasing according to this metric. Ideally, as any marketing wonk will tell you, you want an A or an A+.
The absurdity of treating an A- as a disappointment gets to the heart of CinemaScore’s basic uselessness. Even as its long-standing grade inflation has declined, the bar for interpreting the grades seems to have risen considerably. A grade of B, once considered the standard, is now routinely described as a lukewarm reaction on sites like Box Office Mojo. Even a hearty B+ tends to be dismissed with a halfhearted shrug. As a critic, I know if I give a movie a B or a B+, I am not saying, “This movie is okay, I guess, or it at least beats talking to people for a couple of hours.” I’m actually saying, “This is a good movie. I enjoyed it. I think perhaps that you, the reader, should see it.”
In fact, this was my thinking the one time I participated in a CinemaScore poll. I had just seen Roman Polanski’s The Ghost Writer, and liked it. As such, I gave it a B, which, in my academic career, I knew to mean “above average,” and which seemed correct for a movie that was much better than most studio thrillers but didn’t quite have the extra vitality or resonance that would secure it a spot on my best-of-the-year list. But according to most CinemaScore interpreters, handing out a B is basically the reaction of someone who channel-surfed blankly for a few hours—neither a wonderful nor a notably terrible experience, just kind of there.
Granted, by paying any attention to the parsing of CinemaScores, I’m contributing to the insanity. Perhaps it’s better, then, to take the word-of-mouth question to the people, via the much-vaunted social media, which supposedly helps word about a movie spread faster and more ruthlessly than ever before. So over to Twitter, where you can find out that audiences hated Gravity because it was super-boring.
To be fair, you can find ill-considered reactions to just about any recent movie on Twitter. But the perverse process of searching for dumb tweets about movies leads to a more central question: Why, exactly, does anyone care about what “general audiences” think of a particular movie? Yes, it makes sense that studio executives would want to know; the $300 million they make from an Iron Man may not count any differently than the $300 million they make from an Iron Man 2, but they’d almost certainly rather have the goodwill associated with Iron Man, if only to make selling Iron Man 2 easier.
But reactions have become part of the cultural consensus on movies, especially as reported by hybrid news/criticism outlets like Vulture or Entertainment Weekly, which thrive on rounding cultural experiences up or down to the nearest “A” or “F.” As much as culture writers like to believe that they champion movies and TV shows and albums based on their impeccable and hard-won tastes, there’s an extra jolt of validation in finding out, however dubious the means, that moviegoers are responding similarly. It’s a larger-scale version of the warm feeling you get when you realize you agree 100 percent with a review by a giant like Roger Ebert or Pauline Kael—or the security of knowing the other three guys you saw A.I. with all thought it was pretty great, too.
The question, then, of why we want to quantify something as diverse, messy, and varied as the general public’s opinion of a movie connects to the question of why we want to find a cultural consensus at all. It’s the same instinct that causes some readers to report a movie’s Rotten Tomatoes “score” as if film critics all had a meeting and agreed to assign a particular movie a particular grade. Even those who understand the actual mechanics of the Tomatometer—it includes a lot of critics, good and bad, and it relies on a binary thumbs-up/thumbs-down verdict that doesn’t allow for much nuance—may still find themselves referring to it. (Or they’ll talk about the superior accuracy of Metacritic, which ultimately still converts specific human reactions into semi-meaningless data.) I know I do, even while understanding that a movie’s Tomatometer number has relatively little correlation with my enjoyment of it.
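To make that flattening concrete, here is a toy comparison of the two approaches, with invented review scores on a 0-100 scale and an assumed “fresh” cutoff of 60; neither aggregator works exactly this way.

```python
# Invented review scores on a 0-100 scale; the cutoff of 60 is an
# assumption for illustration, not either site's actual mechanism.
reviews = [61, 62, 63, 95, 12, 15]

# Tomatometer-style: count only whether each review cleared the bar.
tomatometer = sum(score >= 60 for score in reviews) / len(reviews) * 100

# Metacritic-style: keep the magnitudes and average them.
metascore = sum(reviews) / len(reviews)

print(round(tomatometer, 1))  # 66.7, which reads as broad approval
print(round(metascore, 1))    # 51.3, which reads as a shrug
```

Four tepid-to-rave positive reviews and two pans register as broad approval under the binary count and as a shrug under the average, which is exactly the nuance a thumbs-up/thumbs-down verdict cannot carry.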
Even the most fervent Rotten Tomatoes or exit-poll devotees know that no movie will elicit the exact same reaction and opinion from every single person who sees it. Yet that’s not the message that emanates from reports about Tomatometers or CinemaScores or second-weekend drops (which can be explained any number of ways: increased frontloading, stiff or soft competition, excited fans of a particular genre rushing out the first night). Whether or not social media has increased the speed of word of mouth, it has almost certainly increased the speed of the cultural conversation—and the accompanying desire to consider a movie settled, done, ready for the next one. So The Lone Ranger isn’t just a movie that got mixed reviews and failed to make back its budget; it’s a disaster. Gravity isn’t just a well-reviewed hit; it’s a phenomenon that might win Best Picture.
This is symptomatic of a strange cultural deflation that sometimes places the bar for success ridiculously high: A- or better on CinemaScore! 88 percent or better on Rotten Tomatoes! $200 million or bust! Accept nothing less, and report it as even less than that. When Iron Man 3 came out earlier this year, several publications referred to its predecessor as “critically panned,” despite its 73 percent score on Rotten Tomatoes. (Incidentally, all three Iron Man movies averaged a CinemaScore grade of A. That’s probably not great ammunition for defending Iron Man 2 as a riff-heavy, playfully messy dialogue comedy with a killer supporting cast, but it does suggest the lack of nuance or distinction that goes into these polls.) At a screening a few weeks ago, I overheard a fellow critic mention, sounding semi-sheepish, that he’s “one of, like, three people” who approved of Alexander Payne’s last movie, The Descendants. It seems a perfect crystallization of consensus-deflation that less than two years after its release, an $82 million-grossing, 89-on-Tomatometer-scoring, Oscar-winning movie with a CinemaScore of A- (just like Gravity!) is now designated as something that, essentially, nobody liked.
Of course, these are just anecdotes from people who are really talking more about their particular circle of peers than the culture at large. But what are numbers converted from words or grade averages converted from audience polls if not mass-scale anecdotes, masquerading as data? On their own, these scores can be amusing if misguided curiosities. But when studios or pundits or critics start leaning too heavily on that magical, elusive word of mouth, it’s as absurd (and, at worst, as moviegoing-averse) as formulating a screenplay based on box-office stats. In attempting to quantify the discussion, we muffle its content. For a true audience reaction, don’t trust CinemaScore or Box Office Mojo or even the Tomatometer. Go experience it for yourself.