Most phone booths and drive-in theaters have gone the way of the dinosaurs. The objects and ideas profiled here may be teetering on extinction, but it’s not too late to save them. The potential loss of these customs and conventions poses complex questions—ones worth considering before they are gone for good.
Between email and word processors, many of us rarely write by hand anymore, save for signing credit card receipts. Even that may be on the decline, as more establishments present us with an iPad on which to swipe a signature with a finger.
In many states, the combined effects of keyboarding and standardized tests have led schools to drop cursive from their curricula (some, like Arkansas, are beginning to add it back). But the debate over handwriting continues. In a world in which we most often communicate by typing, does cursive really matter?
Some educators say yes: children who learn handwriting learn to read more quickly, and they improve language proficiency and critical thinking. According to neuroscientists, handwriting stimulates cognitive development in a way typing does not. A 2014 Psychological Science study found that students who take notes on laptops perform worse on conceptual questions. Writing in longhand requires students to assimilate information in their own words, whereas typists typically record a speaker verbatim.
Detractors argue that cursive is simply outmoded in modern business and educational environments. Others say the real loss is not cursive, but writing—the intellectual exercise of composition—regardless of whether it happens with strokes of a pen or strikes of a keyboard.
One of cursive’s most enthusiastic supporters is Linda Shrewsbury, a Harvard alum and educator who created an efficient handwriting instruction method. She and her daughter (Time profiled the duo in June) raised money on Kickstarter to produce their workbook, CursiveLogic. Shrewsbury recently gave a talk at the National Archives, cosponsored by Fahrney’s Pens, on “Saving Cursive: New Tools in the Fight for Handwriting.”
Fahrney’s, pen supplier to such language luminaries as Washington Post columnist George F. Will, also hosts an annual handwriting contest. It’s held January 23 to mark National Handwriting Day—the birthday, appropriately, of John Hancock.
Instantaneous access to almost anything we want to know is a luxury. It may also be a step on the slippery slope to mental laziness. Consider the immense variety of questions we can pose to Google (and for which, usually, we’ll receive a satisfactory answer). From the perspective of human endeavors, not having to figure everything out for ourselves is clearly an advantage. But is there a downside to putting our brains on autopilot?
We don’t yet know what the Internet is really doing to our cognitive processes. But there is a strong consensus that even if we aren’t sure what’s happening, something is. Maybe the Internet isn’t literally destroying our attention spans, as some fear. But it is hardly reassuring that we choose not to pay attention, because the Internet has taught us there is always something more, better, different.
That’s the theory of University of Virginia psychologist Daniel Willingham in his January 2015 New York Times op-ed “Smartphones Don’t Make Us Dumb.” He does, however, caution that excessive screen time directs our attention outward, at the expense of inner reflection and creativity.
Idle daydreaming is just one casualty of our constant preoccupation. The ready availability of answers also means we rarely spend time in states of wonder, discovery, and curiosity. The Internet has even squashed our sense of serendipity. Why take a chance on a new restaurant when you can scour reviews and base your decision on the experiences of 20 diners before you?
When future historians study our era, they will undoubtedly remark upon the many innovations that emerged from our exceptional connectivity. Let us hope they do not also point to these years as the ones in which we lost our ability to ponder, speculate, and explore.
We can safely add “proper grammar” to the blame-it-on-the-smartphone pile, along with face-to-face interactions and withering attention spans. The more we rely on text, emails, and social media to communicate, the more we favor speed and efficiency. In an age of multitasking, taking time for proper grammar seems tedious, if not downright uptight. After all, who cares?
Turns out, some people do. Back in 2002, one Guardian writer decried texting as “penmanship for illiterates.” More than a decade later, grammar purists still feel their skin crawl with every “ur” instead of “your.” Although research is inconclusive, one study suggests kids’ use of abbreviated messages worsens their performance on grammar tests.
Tech CEO Kyle Wiens published a Harvard Business Review column titled “I Won’t Hire People Who Use Poor Grammar. Here’s Why.” In business, he argues, good grammar establishes credibility, demonstrates attention to detail, and provides a reasonable indicator of how one might approach other tasks. “If it takes someone more than 20 years to notice how to properly use ‘it’s,’ then that’s not a learning curve I’m comfortable with. So, even in this hyper-competitive market, I will pass on a great programmer who cannot write.”
Grammar is essentially a collection of rules by which we agree to communicate. Naomi Baron, a professor in AU’s Department of World Languages and Cultures, studies linguistics, in which rules focus on patterns rather than “correctness.” Patterns depend on consistency, and that’s what began to erode in the 1990s.
“People were becoming inconsistent in their own speech and writing. Sometimes it would be ‘between you and me’ and other times ‘between you and I,’” Baron says. “When I asked about this kind of fickle usage, the response was the equivalent of ‘Whatever!’ The issue with grammar today isn’t ignoring prescriptive rules. It’s that consistency itself has little cachet.”
There’s no shortage of speculation that civility is passé. Social observers’ hand-wringing about the loss of good manners is fueled by our deepening attachment to portable technologies. We seem to reserve our best attention for our virtual interactions, with scarcely a nod or a smile for the human being in front of us.
Americans are keenly aware of this shift, but we disagree on what constitutes rudeness in the new landscape. According to a recent Pew Research Center survey, 62 percent of adults frown on using phones in a restaurant, but 38 percent think it’s fine. While 75 percent approve of phone use on public transportation, one quarter wish commuters would put their phones away. And 5 percent of adults see nothing wrong with using phones in a quiet theater.
Technology may have spurred the most sweeping changes to our notions of polite behavior, but examples abound of etiquette gone awry with nary a cell in sight. On any given day, we are likely to encounter customers demanding exquisitely customized caffeinated beverages with no word of thanks to the barista, and drivers who run red lights and fly through pedestrian walkways. Most such behaviors come down to self-absorption, betraying our belief that we really are the most important person in the room.
It’s no accident that “civility” shares an etymological root with “civilization.” After all, these agreed-upon rules of mutual courtesy are what keep human endeavors running smoothly. In times of stress and strain—say, a jam-packed Metro car in Monday morning rush hour—social niceties can make all the difference.
Thirty years ago, Harvard biologist E. O. Wilson observed that the worst disaster humans could face would not be energy depletion, economic crisis, or even totalitarian government. The most far-reaching disaster, one that would take millions of years to repair, would be the loss of biodiversity.
“This is the folly that our descendants are least likely to forgive us,” Wilson predicted.
Biodiversity is the variety of living organisms on the planet, and protecting it is crucial to our well-being. The fragile interrelationship of species means that a loss of one has ramifications for many. Consider insects pollinating flowers, earthworms sustaining healthy soil, wetlands inhibiting floods. Disruptions to healthy ecosystems impact the spread of disease, the production of food, and global economies.
“All of life on earth is connected,” says Kiho Kim, a marine ecologist and chair of AU’s Department of Environmental Science. “When a species is lost, the integrity of our planet and the vast riches on which we depend are diminished. We also lose our personal connection to nature and the wonder and awe that the diversity of life inspires.”
According to the World Wildlife Fund’s Living Planet Index, the number of vertebrate species populations—mammals, birds, reptiles, amphibians, and fish—fell 52 percent between 1970 and 2010 due to exploitation, climate change, and habitat loss and degradation. Although threats to species such as elephants, tigers, and gorillas attract media coverage, some of the biggest losses often go unseen: marine turtles, certain seabirds, and numerous shark species. The International Union for Conservation of Nature reports steep drops in species as diverse as bison and butterflies, coral reefs and mangrove trees.
The combined effects of humans’ day-to-day consumption determine whether we help or hinder environmental equilibrium. The World Wildlife Fund offers recommendations for individual action.
Reliance on GPS navigation, which we often trust blindly in unfamiliar areas, is bound to occasionally steer us awry. In 2012, a driver following the “turn right” instruction drove into an Alaskan harbor. A Belgian woman drove 900 miles out of her way thanks to a GPS error. (When she reached Croatia two days later, she decided she might need to double-check her route.)
When GPS works, it’s great. When it doesn’t, you might feel nostalgic for the low-tech reliability of that dog-eared, road-weary Rand McNally atlas. Now that our smartphones give us up-to-date maps anywhere, the once-critical paper map has taken a back seat.
Does this mean we are losing the time-tested ability to decipher printed lines and symbols, match them to our surroundings, and use them to identify the best way from Point A to Point B? London’s Royal Institute of Navigation thinks so. It wants schools to start teaching map-reading skills because, it argues, today’s youngsters don’t have them.
Eric Gundersen, SIS/BA ’02, SIS/MA ’03, is the CEO of Mapbox, a mapping platform for developers. He predicts that in the future, paper maps will be most important for survival situations and other special cases. But he says the win/lose question is not so simple. For all their usefulness, paper maps have a major disadvantage: they limit us to the vision of the cartographer. Digital representations of data, on the other hand, have infinite capacity to represent location details.
“The individual choices people make will dictate what is lost or gained,” he says. “Blindly follow GPS directions? Sure, you’re going to lose spatial awareness. Use a great app that personalizes point-of-interest choices and displays them in a brilliant and intuitive interface? Maybe you’re now even more aware of your surroundings than locals.”
If we could travel back to an earlier century, one of the most stunning differences we’d notice would likely be the quiet. The days before car alarms, leaf blowers, and bass-thumping stereos were not only simpler, they were blissfully quiet by comparison. Noise pollution—often in the form of cell phone chatter—has even infiltrated places we once visited specifically for quiet, such as nature trails.
Groups like Noise-Free America and the Noise Pollution Clearinghouse (whose slogan is “Good neighbors keep their noise to themselves”) are fighting back. They push for stronger noise ordinances and call out violators. Noise pollution isn’t just irritating, it’s harmful. A 2014 study estimated that 104 million Americans were at risk of noise-induced hearing loss.
In the District, where the mixed-use trend makes neighbors of residents and businesses, high decibels have raised the hackles of the DC Nightlife Noise Coalition. The DC City Council this year considered a proposal requiring nightclubs to measure and report noise levels during certain hours.
Deborah Norris, director of AU’s Psychobiology of Healing Program and founder of Bethesda’s Mindfulness Center, notes that the opposite of noise pollution is not, as you’d expect, silence.
“Silence is not actually a natural state of the world around us. Before all the manmade sounds, the sounds we experienced were the sounds of nature. If you spend time truly isolated in nature, you will notice that the sounds of the birds, crickets, frogs, and other creatures are nearly constant, and loud!”
Her antidote to the noise and overstimulation that keep our nerves on edge? Mindfulness, which teaches that “true silence resides within.”
Alternatively, since we can’t time travel, you could visit the quietest place in the United States, an area in Washington State’s Olympic National Park that’s been dubbed “One Square Inch of Silence.”
Personal finance has undergone major change in recent years—dwindling use of cash and checks, huge jumps in electronic transactions, new threats to financial data security—but one of the biggest changes is almost invisible: the loss of employer-funded pensions. It’s a major shift, according to Kogod School of Business professor Larry Schrenk.
Private pension plans have been around since at least 1899. The US Revenue Acts of 1921 and 1926 were early efforts to facilitate employers’ provision of post-retirement funds, or defined benefit plans, to longtime workers. Many employees today, however, have never known this type of benefit.
“Under defined benefit plans, your employer faced all the risk: if their investments did not cover your benefits, it was their problem. If you lived longer than expected, it was their problem,” Schrenk says. “Under defined contribution plans, you now face those risks, and many people are not ready to deal with them effectively.”
According to the Employee Benefit Research Institute, more workers now rely solely on 401(k)-type plans. Even unions are having a hard time fighting the trend. Boeing last year eliminated pensions for 30,000 employees in its Washington state facility—what one columnist called “yet another nail . . . in the coffin of the defined-benefit pension in America.”
According to Schrenk, “The loss of pensions will be a momentous change in the way people live after retirement.” That change could entail working longer than planned or scaling back one’s lifestyle to protect whatever retirement savings workers manage to accumulate.
It used to be that if you did something foolish, whether from youthful indiscretion or a lapse in judgment, you might be censured by your community, but the transgression would eventually fade away. It would not become 24-hour fodder for anyone with an Internet connection, and it would not haunt you for years, anytime someone typed your name in a search engine.
In 2014, the European Court of Justice took a major step toward restoring that right to privacy when it ruled that people can ask Google to remove personal information from search results. But while Europe recognizes the “right to be forgotten,” Americans remain divided.
This year, Consumer Watchdog asked the Federal Trade Commission to investigate a similar protection for US citizens. According to one survey, 52 percent of Americans would strongly support such a measure, while 11 percent are opposed.
Privacy is one side of the issue. On the other side are free speech, censorship, and equal access to information. The Washington Post opposed Consumer Watchdog’s proposal, arguing that the government shouldn’t decide what information is available to whom.
In So You’ve Been Publicly Shamed, Jon Ronson profiled people whose bad decisions led to widespread condemnation. One is Justine Sacco, who in 2013 lost her PR exec job and drew vitriolic attacks after offending many with her tweet about Africa and AIDS. Ronson doesn’t defend his subjects, but he does consider what it’s like to be on the receiving end of that online hate.
Most of us, thankfully, won’t ever inspire outrage with our tweets and texts. But we all know how disconcerting it feels to read something about ourselves online that we didn’t realize was “out there.”
We will undoubtedly continue to wrestle with questions of what belongs online. Parents concerned about images of children, people who appear in YouTube videos without their knowledge, job applicants terrified that employers will uncover their embarrassing past—all have a vested interest in finding a solution that is fair, pragmatic, and wise.
DINNER WITHOUT DISTRACTIONS
Between working parents and overbooked kids, we might assume that family dinner has dwindled to downing fast food in the car on the way to soccer practice. In reality, Gallup reports, the number of families sharing the evening meal is holding steady. A December 2013 poll found that 53 percent of families with kids eat dinner at home six or seven nights per week.
What is changing is whether those dinners have digital devices on the menu, threatening the interaction that makes mealtime so valuable. According to a Harris Interactive survey, 56 percent of Americans feel annoyed by electronic devices at meals, and 61 percent say tech overuse affects the family negatively. Just 35 percent, though, have made any effort to limit these intrusions.
What’s more, Harris’s results don’t capture the more traditional dinner companion: television. In a Kaiser Family Foundation survey, 64 percent of children ages 8 to 18 said the TV is typically on during family meals.
Advocates for unplugging argue that screen-free dining helps kids practice social skills, learn to form and discuss opinions, and sustain communication that can encourage them to share problems and concerns. Other research points to long-term benefits in children’s relationships, schoolwork, and future health.
Some researchers propose a chicken-or-egg question: Do families who share meals have better outcomes because they eat together, or because they are already functional enough to gather themselves around the table on a regular basis? As with most aspects of modern families, there’s no simple answer.
Recording the details of our lives has become such second nature, it’s easy to forget that for centuries, historians relied on source materials that were incredibly limited. In the days before mass-produced publications, films, and photographs, written documents were one of the richest sources of information about the way people lived, loved, learned, fought, created, and died.
Handwritten letters carry particular value for historians. Unlike government missives and other official documents, letters often tell the stories of ordinary men and women. In their own words, in unguarded candor and confidence, letters reveal the day-to-day experiences of real individuals.
Even so, letters from famous people are treasures in their own right. The National Archives, for example, has a May 13, 1958, letter from Jackie Robinson, the first African American to play major league baseball, to former president Dwight D. Eisenhower, exhorting him to support civil rights. Letters can also shed an ironic light on historic figures: consider the Archives’ copy of Elvis Presley’s December 21, 1970, letter to former president Richard Nixon, asking him to credential Presley as a federal agent so he could help fight the nation’s drug war.
Jodi Boyle, CAS/MA ’07, is an archivist at the University at Albany. When she shares letters with visitors to the university’s Department of Special Collections, she sees documents that are revealing, riveting, and most of all human.
“I might select a handwritten plea from a European refugee during World War II written on every inch of a piece of paper or the musings and doodles written in marker from one giant of twentieth-century American literature to another,” Boyle says. “Handwritten letters help convey emotions and deeper context, which is often lost in today’s digital correspondence.”
One irony of the modern age is that we’re taking more photos than ever, yet printing them less and less. We can now put photos on almost anything—coffee mugs, calendars, even shower curtains—but our penchant for collecting photos into albums is fading like a Polaroid. It’s true that we carry virtual albums in our pockets, so we can enjoy our favorite images anywhere. Still, there is something special about settling in with an album as we savor and relive our memories.
Sara Neufeld, SOC/BA ’08, CAS/MAT ’10, is a professional photographer and teaches visual arts for Anne Arundel County Public Schools. She treasures her grandparents’ leather albums, which make her feel more connected to those around her.
“With all our digital tools and endless cloud storage space, we are losing these bonding moments, we are creating images that will never be seen for more than a few seconds, and most damaging, we are losing the idea that quality is exponentially more important than quantity.”
Our shifting attitude toward photography may reflect a principle of economics: scarcity increases value. Back when film came on limited-exposure rolls, shutterbugs had to be choosy about the images they captured. Now, unlimited digital capacity frees us to shoot as much as we want.
Although many people still appreciate the concept of assembling photos in an album, particularly for once-in-a-lifetime memories such as a trip to Paris or a baby’s first days, the sheer volume of digital images is often overwhelming. This is one ritual, though, that just might be worth preserving.