Friday, December 13, 2013

How Crowdsourcing Is Like The X Factor


This crowd at a cricket match are probably as good at
choosing a national pop icon as they are at translating.


Over the last few weeks we have, somewhat ashamedly, become addicted to the X Factor here in the UK. With the final this weekend, we will certainly be watching and hating ourselves for it. It did get us thinking, though, that the X Factor is a lot like crowdsourced translation.


First of all, if you didn't know, the X Factor starts with auditions, followed by a series of live rounds in which one contestant is eliminated each week. The two contestants with the fewest public votes compete in a sing-off, in which the panel of judges selects the better of the two to save, at least until the following week.

Although the finalists are selected by the judges from nationwide auditions, there's no guarantee that you'll like all of them. You can only hope that the public will vote to eliminate the worst contestants first.

With crowdsourced translation, the translations are provided by a large community of people, just like the public voting for contestants on the X Factor. Crowdsourcing is often criticised for producing lower-quality translations. One explanation is that there is often no prior screening of the translators' skills, just as those who vote on the X Factor are not necessarily music experts, talent scouts, or agents. They may know what they like, but that doesn't mean they know what qualities make a pop star who will be famous for years to come.

However, some crowdsourced translation efforts are edited or harmonised by a professional, much like the judges entrusted to "save" the better of the two acts with the fewest votes on the X Factor. This doesn't always work, though: the public vote may have put two acts who deserve to stay in the competition into the bottom two, or mistranslations may have permeated the crowdsourcing process and made their way through to the editor.

This comparison provides a good argument against crowdsourcing. Consider the previous X Factor winners: Steve Brookstein, Shayne Ward, Leona Lewis, Leon Jackson, Alexandra Burke, Joe McElderry, Matt Cardle, Little Mix, and James Arthur. How many have had sustained musical careers, or even won recognition from their peers through awards?

The UK isn't the only country to have the X Factor.

In the same way that not everyone is a translator, not everyone is good at choosing the best musicians or singers. Of course, the business model of the X Factor is not entirely geared towards producing stars, but rather towards generating revenue through sponsorship, marketing, TV ratings, and the huge buzz that surrounds the show in the build-up to the final. If crowdsourced translations generated the same buzz and entertainment as a national spectacle culminating in one solitary winner, then perhaps they would be worthwhile for that reason alone.

We're definitely not saying that crowdsourced translation is a lost cause that produces terrible results. Much like the X Factor, the proof's in the pudding. Whereas the X Factor has had only nine "winners", and the quality of those nine acts is debatable, crowdsourced translation has produced a huge volume of work; judging by the criticisms levelled at it, though, the quality still lags significantly behind that of an individual professional translator or a harmonised team of translators.
