I started researching this article in the hopes of finding a relatively recent Best Picture winner that was certified rotten (less than 60% positive reviews) on Rotten Tomatoes (RT). Unfortunately, no winners since 1970 (I wanted to keep this reasonable, so I started just forty years back) turned out rotten, though a few came close. So I had to move on to the next best concept: have the winners ever been the best reviewed films of the year? But that led to other topics, such as the practice of reviewing a movie years after it was released. All of this led me to ponder the Academy’s purpose in all of it, and if there is a through line for this article, that is it.
Comparing the reviews of Best Picture winners to those of the losers is tough, if only because searching for nearly every movie’s rating from the past forty years would be exhausting. So yes, there are probably plenty of examples I’m leaving out, but I feel I did a decent enough job with the films I looked into. This is meant to be an article, not a list, so I’m not going to post a comprehensive list of my search results. Instead, I just want to mention some of my findings.
Let’s deal with perfection first. The Godfather is the only film of the past forty years to win Best Picture and get a 100% on RT. It is not, however, the only film to get a 100%. Aliens, Toy Story, and Toy Story 2 all received 100% but did not win Best Picture.
The idea that a movie can be “perfect,” though, is ridiculous in itself. Films are subjective, and a fresh rating on RT doesn’t necessarily mean a movie is good. So why even write this? Well, I have noticed that some film journalists will refer to the RT score of a movie and decide they can probably skip it if it’s rotten. Many people see RT as a way of confirming how good a movie is. The fresh-or-rotten binary that RT reduces each review to is too simplistic, though. As a “critic” myself (my reviews are not on RT…yet), I decided to develop my own rating system (see left sidebar) because I couldn’t just give a movie a letter grade or stars, much less simply call it good or bad (or fresh or rotten). I am not of the thumbs-up-or-thumbs-down school of film criticism.
All that acknowledged, I do tend to check RT before any other site to see how a film is doing. I make a point never to read a review of a film before I write my own…but I will check its RT score before I write it. More often than not, my feelings mirror those of the RT contributors as a whole. That’s enough for me to view RT as a site worth comparing the Academy’s choices against. I suppose some people might argue that Metacritic would be the better choice for comparison, but I disagree for two reasons. First, I don’t use Metacritic. Second, someone already wrote that article, and it was filled with graphs and mathematical equations that I don’t even want to begin to understand (the article, by Michael Wallace for significancemagazine.org, can be located here). So I’m sticking with RT.
During my research I discovered another problem: revisionist reviews. Looking over reviews for Blade Runner, I noticed that one critic found the film to be “overrated.” No way was that review written in 1982. There are plenty of examples of this. (I just hope I never come across a review of The Matrix that decries its clichéd use of bullet-time.) The worst reviewed Best Picture winner, Out of Africa (sneaking by with a 63%), features reviews that read as if written by jaded critics who had seen plenty of epic romances over the years and couldn’t stomach this lengthy film from 1985. Times, and taste, change. This means it is a bit unfair to compare the Academy of forty years ago to today’s critics. But I’m going to do it anyway, because I don’t want my hour or two of research to go to complete waste.
This idea of revisionist hate bothers me, though. I have read and heard plenty of people bash Braveheart, Gladiator, and American Beauty (among others) lately. Some critics treat it as a foregone conclusion that those films didn’t really deserve Best Picture in their respective years. The Usual Suspects scored an 87%, which bests Braveheart’s 77%. Crouching Tiger, Hidden Dragon had a 97% to Gladiator’s 78%. And The Insider beats American Beauty, 96% to 89%. Does that mean any of these films deserved the Oscar more than the actual winners? It’s all up to personal taste, of course, but the point is that better reviews don’t guarantee Best Picture Oscars.
It could be that hindsight is 20/20 and the voters of the time just didn’t realize which films would have the most lasting effect. (For example, even if you didn’t care for Gangs of New York, it wouldn’t be ridiculous to claim that that film deserves a second look more than that year’s winner, Chicago.) The Academy doesn’t have hindsight; its choices are made in the moment, for better or worse. It’s interesting to look back and see what types of films were winning in the 1990s (or any decade) and realize that they would have little or no chance of winning if released today. Epics like Gladiator and Braveheart are deemed too action heavy, or too long, or they are accused of “trying too hard” or some other critique the film wasn’t receiving upon its original release. Does Rocky win today? How about Unforgiven? Who knows? All that is certain is that these films did win, no matter what we think of them now.
So what has been the point of this rambling discourse on RT and reviewing movies after the fact? I’m not even sure. I can think of a dozen points, but none of them fit perfectly with the information I looked up. I could stress that revisionist reviews are just plain wrong and unfair in the grand scheme of things. I am a firm believer in the idea that a movie should be judged with almost no outside information influencing the critic’s opinion. That means ignoring other reviews and ignoring the feedback at large before writing the review. How is it fair to judge Blade Runner in 2011? I dabbled in reviewing older films my first year or so on this site and decided to stop. I just couldn’t write very much about them or offer anything relevant, so I’ve stuck with new releases.
Obviously the Academy is sticking with new releases as well, but they aren’t really choosing the Best Picture of the year. They are choosing an idea of what the Best Picture of the moment is. Sometimes they are wrong (many would argue that they are often wrong), but their choice always says something about taste in movies at that point in history. It turns out my research into the scores of the past forty Oscar winners was kind of pointless, but then comparing other films’ reviews to those of past winners is kind of pointless anyway, especially when it suggests that Pixar films should have won every year because they are always the best reviewed. (For the record, Toy Story 3 is the best reviewed nominee this year at 99% and Inception is the worst reviewed at 86%.)
Film reviews in general will always be subjective, no matter what theory or school of criticism is being applied. There are no right or wrong answers in movie reviews, only opinions. The Academy is no different (though some would argue that politics play a bigger factor in its opinions). You will either agree or disagree with the Academy this year, but you’ll look back on this year’s winner and think, “Yes, a movie like that could win back in 2011. But it wouldn’t stand a chance in 2021.”