Innovation offers exciting progress but can yield unintended consequences. AU’s Center for Security, Innovation, and New Technology launched in February to explore the relationship between humans and technology—especially the ethics of social media, robotics, big data, and artificial intelligence. Director Audrey Kurth Cronin sought “a bigger voice for the social sciences and humanities, not just for those solely engrossed in technology.” With philanthropic support, the center can bring together government, technology companies, the hard sciences, and the humanities in a way few institutions can to help Congress, policymakers, business leaders, and consumers grapple with the full implications of innovation. “The center is dedicated to maximizing the benefits of innovation and minimizing the risks,” Cronin says. “AU has a long-standing dedication to ethics, offering a balanced perspective on new technology and the future of humanity.”
On the leading edge
What are the biggest threats to security at home and around the globe? AU experts know. American University boasts some of the world’s top authorities on terrorism and national security, cybersecurity, data privacy, emerging technology, internet governance, and artificial intelligence.
“What we know about security issues today will be dramatically different in 2050,” says Mary Clark, deputy provost and dean of faculty. “These are exciting, dynamic—not static—areas of interest that will continue to grow.”
And AU is on the leading edge of that evolution.
The School of Public Affairs’ homeland security and emergency management programs rank 14th in the nation, according to US News and World Report, while the School of International Service—the largest school of its kind in the country—offers master’s degrees in US foreign policy and national security, and in global governance, politics, and security.
At AU’s Internet Governance Lab, faculty directors Derrick Cogburn, Laura DeNardis, and Nanette Levinson—all internationally recognized experts in the field—research how to design and administer the internet at a time when more of our economies, infrastructure, industries, and even our homes are entangled in the web.
The award-winning Investigative Reporting Workshop addresses security through in-depth reports, like SOC graduate student Orion Donovan-Smith’s February Washington Post story on shifting immigration status, identity, and security.
AU’s wide-ranging expertise spans traditional realms of security—state and military interests—and emerging threats like climate change and transnational crime. Technology can empower states, groups, and individuals to be extraordinarily dangerous, giving them unprecedented reach and lethal capacity. At the same time, the rise of connected devices, self-driving cars, and appliances leaves people around the world increasingly vulnerable to hacking via the Internet of Things.
“Our future security depends upon a holistic perspective that looks at both traditional and emerging threats,” says Audrey Kurth Cronin, director of the Center for Security, Innovation, and New Technology. “That’s one of the big advantages of AU. We have people looking at everything: nuclear nonproliferation, the future of work, twenty-first century warfare, counterterrorism, transnational networks, and human trafficking. My center looks at the evolution of technology with both traditional and nontraditional threats because technology is changing both. Our goal is to channel innovation toward a more ethical, stable future.”
Philanthropic investment is essential to advance security research to safeguard our future. For information on how to support big ideas in security, contact Courtney Surls, vice president of Development and Alumni Relations, at 202-885-5900 or email@example.com.
The Human Side of Cybersecurity
Ninety-five percent of cybersecurity research focuses on technology. The Kogod Cybersecurity Governance Center (KCGC) is a rare exception, concentrating instead on what director Heng Xu calls “the weakest link”: human factors.
“Many high-profile data breaches involving billions of dollars and millions of consumers don’t occur because of complicated technology issues—it’s because people fail to [heed] any of the warning signs that are generated by our technology,” says Xu, professor of information technology and analytics. Human factors, from corporate leaders concealing data breaches to designers rushing to launch and missing security gaps, leave us vulnerable.
KCGC research focuses on social equity (to determine if cybersecurity exacerbates discrimination and disparities, or if privacy concerns impede fairness), business dynamics, and data analytics. Xu and Kogod professor Nan Zhang are currently researching the privacy decisions consumers make on their mobile devices. Funded by a $900,000 grant from the National Science Foundation, the project will identify the privacy decisions most likely to trigger regrets among Americans—75 percent of whom use smartphones—and help them rectify those mistakes to safeguard their personal information.
Xu and Zhang have also explored the trade-offs between privacy and the utility of public health data to determine whether the need for anonymity masks health concerns in minority populations that would emerge if the data included demographic identifiers like race and geolocation.
“Traditionally, the gap between academic research and industry practice is large, and the two don’t work together. Also, most cybersecurity research has focused on identifying problems, not solutions,” Xu says. “My goal is to work with practitioners and help develop solutions.”
Philanthropy is essential to advancing KCGC’s research and developing real-world solutions. Philanthropic investors—including Marriott International, Raytheon, the Financial Industry Regulatory Authority, American International Group, and Discovery—are active partners. Corporate advisors present to classes and engage with students, and more collaborations are in the works, including conferences, research projects, and internships for AU students interested in cybersecurity-related fields.