I Hate Metascores, And You Should Too

Metascoring, in case you aren’t aware, is the process of gauging a movie’s quality through aggregating lots of different reviews and spitting out a score based on the percentage of positive reviews it’s gotten. Rotten Tomatoes and Metacritic are two very popular websites that metascore, and you’ve probably visited one or both at some point in your life.
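To make the distinction concrete, here's a rough Python sketch of the general idea. The six-out-of-ten 'positive' cutoff and the function names are my own illustrative stand-ins, not either site's actual formula; roughly speaking, Rotten Tomatoes counts the share of positive reviews, while Metacritic leans on a weighted average of converted scores.

    # Rough sketch of the two common aggregation styles. The 6/10 "positive"
    # threshold and these function names are illustrative assumptions, not
    # either site's real methodology.

    def percent_positive(scores, threshold=6.0):
        """Rotten Tomatoes-ish: share of reviews judged 'positive'."""
        positive = sum(1 for s in scores if s >= threshold)
        return 100 * positive / len(scores)

    def average_score(scores):
        """Metacritic-ish (minus the weighting): plain average, out of 100."""
        return sum(scores) / len(scores) * 10

    reviews = [6.5, 6.0, 7.0, 6.5, 6.0]   # five mildly warm write-ups
    print(percent_positive(reviews))      # 100.0 -- a perfect-looking 100
    print(average_score(reviews))         # 64.0  -- a very different story

Five shrugging, six-and-a-half-out-of-ten reviews come out as a perfect 100 on one scale and a middling 64 on the other, which tells you something about how much the headline number is worth.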

The entire concept is bunk.

Even just the idea of applying a ‘score’ to a movie is stupid, and yet it’s become customary for critics to tack them on at the end of their reviews for some reason. We’ve all seen the five-star system, the four-star system, the percent-out-of-a-hundred system—or out of ten, but with decimals—or maybe the most offensive, the A-to-F grading, which treats the film as though it were a high school essay on Wuthering Heights rather than a complete work of art.

The only industry that’s worse about scoring than the film industry is the video game industry—most reviewers insist upon breaking a game down into its component parts—sound, graphics, story, and gameplay—and giving each its own unique score. I used to write for a now-defunct gaming website which did this, and I hated it then too. Trying to come up with something to say about the sound design of Mega Man Zero 4 may be the most Kafkaesque moment of my life thus far.

The reason we have film criticism in the first place is that, despite what Hollywood might tell you, movies are not simple products with clear positives and negatives. They’re nuanced, and can’t easily be summed up—which is why filmmakers make them in the first place, rather than just write their intended message on a sign and hold it up in front of an audience, and why critics write reviews, rather than just hold up a sign with a number on it.

You read a critical review for words, not numbers—words which will help you form your own personal assessment and understanding of a film, whether you agree with them or not. Sure, looking at the little stars at the end of a review might be a quick and easy way to tell whether a movie is ‘good’, at least in theory. But in practice, ‘good’ to whom? Critics can be wrong. Not just wrong in the sense that you might disagree with them, but also because some movies are quite difficult to score. No critic who sees, for example, The Room can quantify their reaction numerically. A film which is technically terrible, yet still manages to be entertaining, can’t be explained through mere stars. You can find ways to be funny about it, by giving it a minus score or something, but that sort of thing is just further proof that scoring a movie is, at its core, reductive.

Now, on to metascores.

Taking these critics’ thoughts, distilling them to a mere ‘pass’ or ‘fail’, and then tossing them all into a blender does not reveal some sort of collective ‘truth’—it makes you an idiot holding a blender full of useless, meaningless, homogeneous goop.

Metascores are especially inaccurate for movies that are ‘love it or hate it’. Take Kill List, for example. I’ve seen it twice, each time with a different person. The first time, the guy I was watching it with thought it was genius and heralded the director as the second coming. The other guy hated it more than any other movie he’d ever seen (until we watched Only God Forgives). The meta-opinion I’m supposed to take from this is 50%, then, right? Half-and-half—a very average movie with flaws, but maybe a few good things hiding in there? No. Neither person thought Kill List was half-good. It was either great, or terrible.
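You can see the information loss in miniature with the same hypothetical scoring sketch from earlier (again, the numbers and the cutoff are mine, purely for illustration): a film that splits a room into raves and walk-outs and a film nobody feels anything about land on exactly the same aggregate.

    # Two very different receptions, one identical aggregate. Scores out of
    # ten; the 6.0 "positive" cutoff is the same assumption as before.

    def percent_positive(scores, threshold=6.0):
        return 100 * sum(1 for s in scores if s >= threshold) / len(scores)

    love_it_or_hate_it = [10, 1, 9, 2, 10, 1, 9, 2]   # the Kill List crowd
    lukewarm           = [6, 5, 6, 5, 6, 5, 6, 5]     # nobody cares either way

    print(percent_positive(love_it_or_hate_it))  # 50.0
    print(percent_positive(lukewarm))            # 50.0 -- same 'meta-opinion'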

Part of the beauty of film is that filmgoers can have such gloriously opposing reactions to the same work, and quantifying their reactions down to a single in-betweeny number just to satisfy some need for everything to fit on some grand, ridiculous scale grinds that beauty to a fine powder. No one is going to think Kill List is average. Giving it a fifty ruins its nuance and makes it seem like a halfway-decent, halfway-crap, lukewarm affair.

Last gripe: Revenge of the Sith, which in my opinion is the least fun Star Wars prequel to actually watch (technically better, but less entertainingly bad than the other two), has a slightly higher Rotten Tomatoes rating than that of Return of the Jedi, my favourite of the entire series (Revenge has 80%, Return has 78%). The sheer weight of this fact fills me with anger. I’ll guarantee you that very few film critics honestly think Revenge of the Sith is better than Return of the Jedi, or even in the same ballpark of quality, yet there they are, neck and neck. All these numbers really mean is that 80% of the critics thought Sith was better than the other two films in the prequel trilogy, whereas only 78% thought Jedi was as good as the other two in the original trilogy. And remember, these are just the critics Rotten Tomatoes chose. Plus, Sith’s score is based on 254 reviews, whereas Jedi’s is based on a little over 78. And even though Sith has a higher metascore, it has 52 ‘bad’ reviews to Jedi’s 15. All this to say, metascores mean nothing, and make no sense.
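One more bit of arithmetic, just to drive that home: with review pools that different in size, the two percentages don't even have the same resolution. A quick sketch, taking the counts above at face value (the 'points per review' framing is mine, not Rotten Tomatoes'):

    # One flipped review moves a percentage by 100/total points, so the
    # smaller pool is a much coarser instrument. Totals as quoted above.

    for film, total in [("Revenge of the Sith", 254), ("Return of the Jedi", 78)]:
        print(f"{film}: one review swings the score by {100 / total:.1f} points")

    # Revenge of the Sith: one review swings the score by 0.4 points
    # Return of the Jedi: one review swings the score by 1.3 points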

I give metascores no stars out of a hundred, plus two lifeless, severed thumbs down.

3 thoughts on “I Hate Metascores, And You Should Too”

  1. Fantastic essay. Great points. I honestly think that most people using Rotten Tomatoes think of the score as an average score out of 100 and not a percentage of positive reviews. I recall having the same issues when first using RT and learning of its prevalence — so if 100/100 critics agree that something is fairly good, it will have a better score than 98/100 thinking it’s a top-ten all-time movie and 2 thinking it’s shit? Great piece here.

    1. Yeah, I think Rotten Tomatoes’ popularity rides on the fact that most people don’t understand the scores. If they did, I don’t think they’d use the site.
