How one-star and ten-star votes break IMDb's user score
Last week, two fandoms started fighting over a decimal point. An episode of A Knight of the Seven Kingdoms briefly climbed into the same rarefied space as Breaking Bad's "Ozymandias". Within hours, users were mobilising. Some piled in with 10-star votes to defend the newcomer. Others countered with 1-star votes to protect an incumbent.

None of this is new. IMDb has long had to contend with coordinated vote-stuffing and defend against the claim that its official "IMDb Rating" is biased and/or broken. But how much do one- and ten-star reviews actually affect the final IMDb score? I want to take you on a journey behind the numbers to provide context for the current (and future) arguments. To do this, I looked at over 85 million votes cast on IMDb across 73,000 movies.

On the surface, the IMDb rating is a single number between 1 and 10, rounded to one decimal place. Most people use it as a quick, effective way to say whether a film is any good. For example, Terminator 2: Judgment Day has a score of 8.6 out of ten, putting it in the top 3.5% of movies on IMDb, while Terminator 3: Rise of the Machines has a score of 6.3, putting it in the bottom 52.2%.

We all like a single, specific score. The number looks precise, implies consensus, and is easy to quote. It's also useful: a quick glance at a movie's IMDb rating is often enough to nudge us towards or away from investing time in it. And yet, as soon as you apply even a modicum of scrutiny to this number, you realise that it is deeply flawed. Among its problems:

- It pretends we agree. A single number implies that there is an objectively "correct" measure of a film's quality, hiding any sense of polarisation or disagreement.
- It ignores the extremes of love and hate. Extreme and intense feelings matter. We all have movies we will defend to our last breath, and others we couldn't be paid to watch again. A single rating cannot distinguish the intense love behind a 10-out-of-10 vote from the intense dislike behind a 1. (More on this in my piece from a few weeks ago, What makes people never want to rewatch a great film?)
- What is it even measuring?! Some see the IMDb score as a signal of a film's inherent artistic quality, as in "This is a good film". It can also be read as a reflection of a film's appeal, as in "This is a popular film", or as an endorsement, as in "I support this kind of movie".
- It doesn't tell us what we actually want to know. If you're picking a movie to watch, so many factors go into your choice that a single, averaged number from the wider public is close to useless. I would guess that how entertaining a film is matters more to most people's choice than its artistic merit. Schindler's List is the 6th-highest-rated film on IMDb at 9.0, but few would casually add it to their list when looking for a movie to pass the time. (See Which types of movies make money despite receiving bad reviews from critics?)
- It ignores who is voting. People are not a single homogeneous group, and so a single score ignores age, gender, location, cinephilia, etc. Worse than that, one group may dominate: when I last crunched the data, 81.7% of IMDb votes were cast by men, and 45.0% by people aged 30-44.
- Early voting is not representative. People who see a movie early behave differently from those who see it late, but a single average cannot distinguish between them. This is especially important if the score is used to determine audience behaviour in the crucial opening week of a movie's release.
- It can be gamed. Vote-stuffing to increase or decrease a movie's score is rife. More on that in a moment.

Both the films below (Companion and Vettaiyan) have an official IMDb score of 6.9, but show very different voting patterns.
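To make "very different voting patterns, same score" concrete, here is a minimal sketch using invented vote counts (not the real Companion or Vettaiyan figures) and a plain arithmetic mean rather than IMDb's weighted one: a settled consensus and a 1-versus-10 tug-of-war land on exactly the same average.

```python
# Illustrative only: invented vote counts, and a plain (unweighted) mean,
# whereas IMDb's published score is a weighted average.

def mean_rating(votes_by_star: dict) -> float:
    """Average star rating from a {star value: vote count} histogram."""
    total_votes = sum(votes_by_star.values())
    total_stars = sum(star * count for star, count in votes_by_star.items())
    return total_stars / total_votes

# A consensus film: most voters genuinely settle around 7.
consensus = {5: 100, 6: 250, 7: 400, 8: 250, 9: 100}

# A collision film: heavy 1-star and 10-star blocs fighting over the score.
collision = {1: 300, 7: 200, 10: 600}

print(round(mean_rating(consensus), 1))  # 7.0
print(round(mean_rating(collision), 1))  # 7.0
```

Same headline number, entirely different relationships between the audience and the film.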
An uneven distribution of votes is not uncommon. This is the world in which IMDb operates when it calculates its final official IMDb Score. The raw data it deals with (i.e., the votes cast) are frequently extreme, reflecting a desire to move the needle rather than passively recording the voter's measured opinion. There are many films with voting patterns that simply cannot reflect what viewers think of the movie's quality.

There are three ways in which a film can be affected by vote-stuffing:

- Push-down, where unusually heavy low-end voting (especially 1-star) drags down the average.
- Push-up, where unusually heavy high-end voting (especially 10-star) lifts the average.
- Collision voting, in which both ends are elevated at once (lots of 1s and lots of 10s), moving the score in either direction depending on which edge is stronger.

To get a sense of how many films experience these forces, I classified each film by comparing its 1-star and 10-star vote shares against what is normal for films with similar average scores in the same country:

- If neither extreme is unusually high, it goes into normal.
- If only the 1-star share is unusually high, it is push-down.
- If only the 10-star share is unusually high, it is push-up.
- And if both extremes are unusually high at the same time, it is classified as collision voting.

This reveals that over 10% of films on IMDb are experiencing extreme edge pressures.
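The text above doesn't specify what counts as "unusually high", so here is a minimal sketch of that classification logic under stated assumptions: films are banded by rounded average score within each country, "unusually high" means a vote share more than two standard deviations above the band's mean, and the column names (country, mean_score, share_1, share_10) are hypothetical, not the actual dataset's schema.

```python
# Minimal sketch of the normal / push-down / push-up / collision classification.
# Assumptions (not from the original analysis): films are banded by rounded
# average score within each country, and "unusually high" means a vote share
# more than z_cutoff standard deviations above the band's mean.
import pandas as pd

def classify_edge_pressure(films: pd.DataFrame, z_cutoff: float = 2.0) -> pd.DataFrame:
    films = films.copy()
    # "Films with similar average scores": bucket by rounded mean score.
    films["score_band"] = films["mean_score"].round()

    # Establish what is "normal" for each country + score band.
    grouped = films.groupby(["country", "score_band"])
    for col in ("share_1", "share_10"):
        mu = grouped[col].transform("mean")
        sigma = grouped[col].transform("std")
        films[col + "_z"] = (films[col] - mu) / sigma

    high_1 = films["share_1_z"] > z_cutoff    # unusually many 1-star votes
    high_10 = films["share_10_z"] > z_cutoff  # unusually many 10-star votes

    films["category"] = "normal"
    films.loc[high_1 & ~high_10, "category"] = "push-down"
    films.loc[high_10 & ~high_1, "category"] = "push-up"
    films.loc[high_1 & high_10, "category"] = "collision"
    return films
```

A z-score cutoff is just one reasonable way to operationalise "unusual"; percentile thresholds within each band would work similarly.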
Is extreme voting on the rise?

In a word, yes. Below is a time series showing the rate of extreme voting behaviour. It reveals that since the early 2010s, the share of 10-star votes has increased massively. One-star voting saw a slight increase, followed by a return to the longer-term average in the most recent years.

Eagle-eyed readers will have already spotted two anomalous years in the graph above. The share of ten-star votes in 1994 and 2008 is strangely high. This neatly highlights another issue with movie ratings: outliers. The spike in those two years is almost entirely down to just two movies, The Shawshank Redemption (1994) and The Dark Knight (2008). 54.9% of all the votes cast for The Shawshank Redemption were ten stars. Given that it is also the site's most-rated film (3.2 million votes to date), this is enough to disrupt the whole year's ratings. Similarly, 45.6% of The Dark Knight's 3.1 million votes were full marks. If we remove just those two titles, the chart reads a little clearer (see the sketch at the end of this piece).

These push-pull effects are not distributed equally. IMDb now publishes country-level vote breakdowns, showing the vote counts for the five highest-voting countries for each movie. The differences between voters in each country are likely to be twofold:

- Cultural differences around what it means to be a fan. In some countries, vote-stuffing may be seen as a proper way to support your favourite movies or stars, while in others it may be frowned upon, no matter the fervour of your fandom.
- Movies of national relevance. Cinema has always been political (no matter what Wim Wenders says), and that's never clearer than when a film tells a story associated with a nation's identity. These are more likely to whip up pride and anger, leading to strategic voting.

Fans in India, Bangladesh, and Turkey are the most likely to leave one- or ten-star reviews online. Spanish, Swedish, and American fans are the least likely (among the countries with enough voting data).

Genres attract different kinds of audience behaviour. Perhaps appropriately, war films receive the most abnormal voting. I put this down to their high propensity to be stories of politics, identity and national history. It seems reasonable to assume that most war films make one side out to be more virtuous than the other, and so the number of people who feel offended is likely to be higher than for romance films. (I guess one advantage of having your villains be uncaring, shitty boyfriends rather than a particular national group is that the very people you are slating don't know or care that it's them being attacked.)

This highlights a point I made earlier: a 7.2 average can hide wildly different realities. For a rom-com, it might mean most people are genuinely settled around a 7. For a war…
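As promised above, here is a minimal sketch of the outlier check applied to the yearly chart: recomputing each year's share of ten-star votes with and without the two runaway titles. The `votes` DataFrame and its columns (title, year, stars_10, total_votes) are a hypothetical stand-in for a per-film vote breakdown, not the actual dataset.

```python
# Minimal sketch: yearly share of 10-star votes, with and without outliers.
# Assumes a hypothetical DataFrame `votes` with one row per film and columns
# title, year, stars_10 (count of 10-star votes) and total_votes.
import pandas as pd

def yearly_ten_star_share(votes: pd.DataFrame, exclude: tuple = ()) -> pd.Series:
    kept = votes[~votes["title"].isin(exclude)]
    by_year = kept.groupby("year")[["stars_10", "total_votes"]].sum()
    return by_year["stars_10"] / by_year["total_votes"]

outliers = ("The Shawshank Redemption", "The Dark Knight")
# raw = yearly_ten_star_share(votes)
# cleaned = yearly_ten_star_share(votes, exclude=outliers)
# The 1994 and 2008 spikes flatten once the two titles are excluded.
```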