Do It Yourself

Do it yourself booklet
Image © Science Museum/Science and Society Picture Library

Note: this is the first in a series of blogs looking at the history and future of Web 2.0.

From Facebook pages to viral Barack Obama speeches, the latest boom to hit the media is the rise of user-created content. Services such as Facebook and YouTube have created a new business model: rather than selling content to consumers, as media companies traditionally have done, they provide the means for consumers to make and distribute their own content (or, as one anonymous contributor put it, “You make all the content, they get all the revenue”). The resulting movement, called Web 2.0 by some to distinguish it from the older content-delivery model, has already made fortunes, stealing both employees and the cutting-edge image from companies like Google.

Exactly what user-created content is, however, remains a matter for debate. This is not surprising: with its roots in the 1970s DIY (“Do It Yourself”) culture, which itself sprang from the authenticity-obsessed punk movement, the question of whether or not something is “really” user-created content is bound to be a controversial and political issue. While an opposition to consumerism was an essential part of the early DIY movement, today’s user-created content largely stems from a desire to participate in the creation of consumer culture.

One reason for this is that today’s DIY is a hybrid, born not only from the anti-consumerist movement but also from computer culture. What makes computer culture unique among media is that, for most of its history, it has consisted largely of user-created content. The earliest computer games, such as Spacewar and Colossal Cave, were amateur products created during slack time on university mainframes and then passed around from lab to lab without hope of profit. Early home computer systems, such as the Apple II, came with programming languages like BASIC installed, which allowed users to create their own programs – much as if every TV came with a simple video camera. Many games also included “modding” tools for creating customized content; “Lode Runner,” which allowed users to design their own levels, became one of the most popular games for the Apple II. Unlike in other media, then, in computers the line between content creators and consumers was thin to nonexistent from the beginning, and nearly all creators started out as fans.

What brought the two parents of user-created content – the DIY movement and computer culture – together was the Internet. While the Bulletin Board Systems of the 1980s had been largely the province of people who identified themselves as computer hobbyists, the introduction of graphical browsers such as Mosaic – and its successor Netscape – made the Internet as user-friendly as Macintosh and Windows had made computers. While the first generation of Internet services tried to make a business of providing content to users, it soon became clear that users were at least as interested in creating that content themselves.

How much user-created content is actually out there? According to a 2006 Pew Internet and American Life survey, just over one in three Internet users have created some kind of online content, such as an online video, a Web site, a blog or a social networking profile page. (It’s worth noting that Facebook was not open to the general public until September of 2006, too late to be reflected in the study.) There is now enough user-generated content out there to support services devoted to cataloguing, sifting and exploring it, such as Digg, MetaFilter and StumbleUpon.

Neither “Web 2.0” nor “user-created content” is a term with a simple definition. Whether a Facebook profile page, for instance, truly counts as user-created content is a matter of some debate; similarly, YouTube contains as many clips of movies and TV shows as it does material created by its users. Some people feel, as well, that the whole notion of user-created content gives credibility to an outdated division between users and producers. Even the term “Web 2.0” has come under scrutiny, with some calling it nothing but a marketing device, and others saying it simply describes what the Internet has been all along. (Stephen Fry has compared social networking sites to the old “closed” online communities such as CompuServe or America Online.)

What, then, is user-created content? Who is a user, and who is a creator? Raph Koster, a designer on one of the first massively multiplayer online games, has suggested that all users are creators: even playing a simple video game involves the user in creating a narrative. Koster notes that not all activities require the same amount or level of creative input, but he’s surely right in saying that there are no passive consumers. Even someone watching TV or reading a book is involved in a collaboration with that product’s authors – interpreting characters, anticipating plot events, judging the morality of actions. The key element of user-created content is not the content itself: it is how that content is delivered to an audience that may range from the single digits to the millions. In the next instalment, we’ll look at new genres and media that have their roots in Web 2.0.

For Classroom Discussion

  • What differences do you find between traditional media products (movies, TV shows, etc.) and user-created products? What might be the cause of some of these differences?
  • Consumerism has been defined as “the theory that an increasing consumption of goods is economically beneficial.” (Princeton Wordnet) If we consider “goods” to include media products, would you say Web 2.0 is mostly consumerist or anti-consumerist? Why might this be so?
  • What effect do you think the appearance of Web 2.0 will have on more traditional media? Why?
  • Which kinds of user-created content do you think will be more successful in the long run – those that demand a lot of user effort (like blogs or videos), or relatively little (like Facebook profiles)? Why?
