The Super Bowl has long been seen as the “tent pole” of American consumer culture: an annual game that routinely pulls in viewers at a scale otherwise achieved only by one-off events like series finales and celebrity car chases. The game even drives TV sales: the Retail Advertising and Marketing Association reports that 2.5 million people plan to buy a new TV for the express purpose of watching it, part of an overall $8.7 billion in Super Bowl-related consumer spending.
If anyone still doubts that youth need to learn how to evaluate online information, those doubts should have been dispelled by a recent hoax perpetrated by the group called the Yes Men. This group, which has a history of staging fake press conferences, decided to draw attention to Canada’s position at the Copenhagen conference on climate change by creating a number of fake Web sites purporting to be, among others, the Copenhagen summit site, the Wall Street Journal, and Environment Canada’s site. While it didn’t take long for Environment Canada to make a statement exposing the hoax, by that time many journalists had already reported the story as fact and it had been widely distributed by wire services.
If you’re a parent, chances are there was at least one video game under the tree this Christmas. Your kids may be thrilled by a new title, but you may be less enthusiastic. Even those of us who grew up with Alone in the Dark may balk at the graphic, detailed violence in Modern Warfare and Fallout: New Vegas, at least when considered as fare for kids. Both of these games receive an “M” rating, which means they are considered unsuitable for players under 17; as with so many other things, though, labeling these titles “for adults only” often makes them more appealing to the unintended youth audience. Beyond the question of violence, there remain issues of meaning in video games that are harder to track but no less important. So how concerned should parents be about indulging their children’s appetite for virtual violence?
This is the second in a series of columns looking at the history and future of Web 2.0. In the last instalment we examined the origins of the Do-It-Yourself (DIY) ethic and some of the issues around the definition of “user-created content.” Turning from the theoretical to the practical, we’ll now take a look at what is actually out there and begin to examine some of the ethical and legal implications.
Malcolm Gladwell’s recent New Yorker article “Small Change” has set the blogosphere buzzing with its strongly stated argument that social networks such as Facebook and Twitter will not usher in a new age of social activism, as some digital evangelists have proposed, but that they and the relationships they foster are actually detrimental to real social change. As Gladwell puts it, “The instruments of social media are well suited to making the existing social order more efficient. They are not a natural enemy of the status quo.”
The old saying that the rich get richer and the poor get poorer applies to cyberspace, too: these maps comparing router and population density show that the developing world has a long way to go to catch up to North America, Western Europe and Japan when it comes to getting online. The One Laptop Per Child project aims to change all that, designing, constructing and distributing Internet-ready laptops to children in developing countries.
The Web is full of great online resources for teachers and students, with new material appearing every day. With the arrival of National Media Education Week, teachers may be looking for fresh ideas to bring media education into the classroom. Here’s a quick overview of recently created (or recently discovered) resources that may help:
The term “Web 2.0” was coined to describe (and, in part, predict) the rise in user-created content on the Net. Recently, two stories have shown interesting developments in Web 2.0’s evolution: bumps in the road to its anticipated convergence with television, and its rise as a form of alternative journalism.
With Christmas approaching, video games are the one media industry that seems recession-proof, with games topping many wish lists. Parents, though, can find it difficult to tell just what they’re buying for their children. They may know about Grand Theft Auto, for instance, but may wonder what kind of sins are in Sins of a Solar Empire. Of course, nobody wants to disappoint their children: if parents decide not to buy Gears of War, will little Johnny be happy with Rock Band instead? Fortunately, there are both tools and techniques at hand to help parents identify games they might find objectionable and pick titles their children will enjoy.
In the last year or two many writers and researchers have been trying to correct the common perception that young people do not care about privacy. While the public may finally be getting the message that teenagers do value their privacy – as they define it – the idea that younger children have any personal information worth protecting is still a new one. Certainly, most people would probably be surprised to learn how early children are starting to surf the Net: the average age at which children begin to use the Internet dropped from 10 in 2002 to four in 2009 (Olle Findahl, Preschoolers and the Internet, presented at the EU Kids Online conference, London, June 11, 2009); and, thanks to the iPhone and iPad, that number has probably dropped even lower.