This article was originally published in The Conversation.
The World Health Organization’s description of “gaming disorder” as an “addictive behavior disorder” includes a vague description of how much digital gaming is too much. The WHO warns that “people who partake in gaming should be alert to the amount of time they spend on gaming activities.” At what point does a leisure activity turn into an addiction?
Games researchers are no strangers to complaints about the dangers of too much game playing. Video games have been blamed for causing aggression, unemployment and even the vitamin D deficiency called rickets. Games have also, of course, been championed for improving surgical skills, encouraging pro-social behavior, aiding in cancer treatment and helping develop new AIDS medications.
New forms of popular media are often targets of public concern, going back to dime-store novels, comic books and jazz, all the way through rock ’n’ roll and rap. But those fears eventually wane, and society embraces work like “Maus,” the first graphic novel to be a National Book Award finalist, and artists like rapper Kendrick Lamar, who won a Pulitzer Prize earlier this year.
Digital video games can be exceptionally enticing and engaging. Regarding the risk of addiction, it is interesting to analyze the WHO’s warnings about excessive gaming in the wider context of leisure. As part of the Games for Change conference, I and others who study psychology, serious games and youth advocacy will be talking about the myths of games, media and technology addiction.
Leisure in history
Developmental psychologists and educators bemoan the overscheduled itineraries of American children, and “being too busy” has become a status symbol oddly juxtaposed with the idea of ultra-luxury leisure and globe-trotting vacations. Indeed, the average medieval peasant worked only 150 days a year, giving them more leisure time than the average U.S. worker today.
Leisure has always evolved alongside society. Before sports became ubiquitous, the Puritans and other political leaders fought their popularity on moral grounds, viewing them as a threat to the social fabric.
Later, the Industrial Revolution yielded new leisure pastimes that seemed decadent to prior generations – most notably travel. The new urban working class had the remarkable opportunity to temporarily escape their everyday surroundings and routine. Yet at the dawn of the tourism industry, leisure travel was considered a threat to contemporary politics and society specifically because it helped expand travelers’ experiences.
In the modern developed world, the dominant leisure activity is watching television, followed by pastimes like sports and entertaining friends. There’s no evidence that game playing is more dangerous than these other leisure activities. In fact, the academic research provides much more evidence about the dangers of television viewing.
Since the 1960s, researchers have been emphasizing television’s potential for addiction and detriments to quality of life. Beyond investigating how TV viewing supplants other leisure activities, researchers have found watching TV drains productivity, encourages obesity, boosts violent or aggressive behavior and can lead to lower life satisfaction and higher anxiety.
People watch television for far more time than they play video games. In the U.S., people watch an average of 4.5 hours of TV every day. That’s more time than they spend reading, relaxing, socializing, participating in sports, playing digital games and using computers – combined.
Television and games
The WHO seems unconcerned about the effects of TV. This is especially clear when it comes to televised sports. Consider a person who skips household and professional Sunday responsibilities to sit on the couch for hours watching pre-game shows; screaming at referees, coaches and players; and following post-game analysis – or who calls in sick to catch a game or breaks friendships over team rivalries. By the WHO’s criteria, this could qualify as “gaming disorder” – except that it’s about sports on TV, rather than video games. (That doesn’t even consider tens of thousands of sport-focused rioters.)
But sports fans aren’t players, the way gamers are. For athletes, the time commitments far exceed even the most devoted fans’ dedication. The average college athlete in the U.S., for example, spends more than 40 hours a week practicing their sport. Many student-athletes say they lack the time to be students, but we wouldn’t identify them as addicted to their sport.
There’s another way to view dedicated video-game players, too: With the rise of esports, professional gamers net millions in performance payouts, attract arena-sized audiences and even earn college scholarships. What’s the point at which a person with “gaming disorder” turns from mental patient or social pariah into a varsity star with serious professional prospects?
The challenge of measuring game addiction
It can be hard to identify addiction to an activity. Though the WHO warns against spending too much time gaming, that is not the way to measure addiction. Some studies demonstrate that some people who spend more time gaming actually exhibit fewer addictive behaviors than people who play less. In a 2009 paper, the drafters of a game addiction scale for adolescents explicitly wrote, “Time spent on games should not be used as a basis for measuring pathological behavior.” And as a leading researcher into games and behavior put it, “Some people who are depressed stay in bed all day, but we wouldn’t say that they have a bed addiction.”
In the end, humans with leisure time seek escape through weekend trips to the country, a visit with the Cleavers’ 1950s America, or exploring the vast desert of “Journey”. What people are looking for in their leisure time is a break, and just because they enjoy that break – and spend a fair amount of time doing it – doesn’t mean it’s an addiction.