Tuesday 18 November 2008

Essay 2: Peer-reviewing films, what gives?

Thinking of peer production, the service that comes to mind first is Filmtipset. When I joined in 2003, I regarded it as a Swedish response to Imdb. I wanted to find out what people think about quality in movies, and I admired their pursuit of democratic information compared to mass media. Added to that, I found it interesting to go through every movie I had seen in my life and rate it for myself, then compare my ratings with my friends' or everybody else's. After doing that, I asked myself: what can I do with this? After all, they are just numbers. What do they tell us about quality? What do these grades have to do with the real product? People give any rating they want, but they are far from being as well informed, objective or responsible as a professional reviewer. Well, that's one issue. Add to that, chances are high that somebody is manipulating the numbers to increase his or her own influence.

Putting it this way might make it appear as if I'm dismissing the whole concept of peer reviews, but those objections at least do not undo the democratizing effect. In this sense, the movie ratings compare a bit with an anecdote about Amazon.com: they were criticized by publishers who claimed their products received unfairly negative reviews. The founder of Amazon countered:
"...we want to make every book available – the good, the bad, and the ugly...to let truth loose" (Spector 2000).

When I joined Filmtipset, I had the notion that mass media in Sweden had a one-sided focus on Hollywood productions. I wanted to see the unbiased reality. Simply collecting as many titles and user ratings as possible was a crude idea, yet it served this purpose more or less. Beyond the extensive size of the database, the approach certainly had its flaws. I mean, of course I can't check every movie in the world just for the sake of being less biased. I would rather let a system of some kind decide the order.

Looking at Filmtipset and Imdb today, they have different strategies. Both are very different from the excellent credibility system of Slashdot. The most apparent difference is actually transparency. Neither wants to reveal its whole strategy, as they claim that vandals would exploit the system for their own purposes. In the case of Imdb, the strategy is to use demographic data to create a weighted average rating for each film (a sketch of the idea follows below). But why do they have to hide the scheme they are using? Probably because they don't have a system for user moderation like Slashdot's. It seems like a pity, but it could serve a purpose: making it possible for users to compare one arbitrary movie with another. Unlike the readers of Slashdot, they don't need only the absolute top-ranked material.
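Since Imdb does not publish its exact formula, the following is only a minimal sketch of what a weighted rating could look like, assuming a Bayesian-style average in which films with few votes are pulled toward the overall site mean (all numbers are invented for illustration):

    def weighted_rating(votes, mean_rating, min_votes, global_mean):
        """Bayesian-style weighted average: films with few votes are pulled
        toward the global mean, films with many votes keep roughly their own
        average."""
        v, r, m, c = votes, mean_rating, min_votes, global_mean
        return (v * r + m * c) / (v + m)

    # A film with 40 votes averaging 9.1 versus one with 50,000 votes averaging 8.3
    # (site-wide mean 6.9, and 1,000 votes needed before a rating is taken at face value):
    print(weighted_rating(40, 9.1, 1000, 6.9))     # ~6.98 -- barely above the site mean
    print(weighted_rating(50000, 8.3, 1000, 6.9))  # ~8.27 -- close to its raw average

The point is that a handful of enthusiastic (or manipulative) voters cannot push an obscure title to the top, which is presumably part of why the exact weights are kept secret.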

Hypothetically, what if they did use the Slashdot system? What if everybody submitted articles about movies they considered worth mentioning, and let the community engage in a discussion? What kind of articles would make it to the front page? And not least, what kinds of movies? I dare not guess which direction they would take. More relevantly, I presume that they actually would take a direction, just as Slashdot admits it does.

Incidentally, Filmtipset has some strategies to handle that. When a user has rated a great number of films, that whole set of ratings is compared with the sets of every other member to find correlations. If two sets of ratings are closely correlated, the system assumes that the users have a similar idea of relevance. Based on that, recommendations, expected ratings and similar advice are presented (a sketch of the idea follows below). The major drawback I have found so far is that the comments and articles are not peer-reviewed at all. Maybe it's just an unimplemented feature. Maybe Slashdot has the advantage of a larger number of peers. Maybe there are qualities of films that are impossible to agree on. In reality, I don't care so much about the grades presented at Filmtipset. What is really relevant to me is still something beyond the system's control.
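Filmtipset does not publish its algorithm either, but what it describes is classic neighbourhood-based collaborative filtering. A minimal sketch, assuming Pearson correlation as the similarity measure and each user's grades stored as a dict mapping film title to a 1-5 grade, might look like this:

    from math import sqrt

    def pearson(ratings_a, ratings_b):
        """Pearson correlation over the films both users have rated."""
        common = set(ratings_a) & set(ratings_b)
        n = len(common)
        if n < 2:
            return 0.0
        a = [ratings_a[f] for f in common]
        b = [ratings_b[f] for f in common]
        mean_a, mean_b = sum(a) / float(n), sum(b) / float(n)
        cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
        var_a = sum((x - mean_a) ** 2 for x in a)
        var_b = sum((y - mean_b) ** 2 for y in b)
        if var_a == 0 or var_b == 0:
            return 0.0
        return cov / sqrt(var_a * var_b)

    def expected_grade(me, others, film):
        """Predict my grade for a film as a similarity-weighted average of the
        grades given by users whose past ratings correlate with mine."""
        total, weights = 0.0, 0.0
        for other in others:
            if film not in other:
                continue
            sim = pearson(me, other)
            if sim <= 0:  # ignore dissimilar or unrelated users
                continue
            total += sim * other[film]
            weights += sim
        return total / weights if weights else None

This is only meant to illustrate the mechanism described above: two users whose past grades correlate strongly carry a high weight in each other's expected ratings, while users with no overlap contribute nothing.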

1 comment:

Gao Jie said...

I have the same idea, that the success of Slashdot comes from its large number of peers. And I have to say that judging whether a piece of news is of good quality is not the same thing as judging the quality of a movie. Slashdot implements an automated system to select moderators from its registered users, and those moderators mark comments according to their quality. When they make such a judgement they may spend about ten seconds or more, but not nearly as long as it takes to judge a movie. Watching a movie may cost people at least an hour and a half, and the movies may be in different languages and come from different cultural backgrounds. It is complex work to grade a movie in a way that produces a result of good quality. Grading the movies, as you said, may be the easiest way to let viewers know which one is more popular. When they consider watching a movie, they may have enough time and a preference for one kind of movie. The grade system is not so good, so they offer the top ten movies in each category for viewers to choose from.