

Encryption: A Tradeoff Between User Privacy and National Security

By Pragya Jain

The long-standing encryption dispute between U.S. law enforcement agencies and tech companies centers on whether a “backdoor,” or an access key into encrypted messages, should be enabled by private companies and shared with law enforcement agencies.

The argument dates to the early 1990s, when the Internet was still in its infancy, but it has recently heated up in the wake of high-casualty terrorist attacks. Most publicly, after the 2015 terrorist attack at the Inland Regional Center in San Bernardino, California, that killed 14 people and injured 22 others, the Federal Bureau of Investigation (FBI) obtained a court order compelling Apple to help unlock the iPhone of one of the perpetrators. Apple's response was firm: the company refused to build a special version of iOS that would allow an unlimited number of passcode attempts, bypassing the safeguard that erases the device's data after a certain number of failed tries. The case drew widespread public attention to an encryption debate with a long history in the United States.

In 1993, the National Security Agency (NSA) unveiled "Skipjack," developed in response to encryption becoming increasingly common on both public and private networks. Skipjack is a block cipher, a symmetric-key algorithm that encrypts and decrypts data using the same secret key, and it was intended for use under government control. The World Wide Web was not yet widely accessible, so most sensitive communication still traveled over government and private networks. But the U.S. government feared that with open access to encryption, criminals could hide their conversations behind the same strong protections. So the Clinton administration proposed the Clipper Chip, announced in 1993 and adopted as a federal standard in 1994, which used the Skipjack algorithm to scramble telephone conversations while escrowing a copy of each device's key so that government officials could intercept and decrypt messages at will.
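The symmetry of such a scheme is exactly what made key escrow so powerful: whoever holds the key can both read and forge traffic. The toy sketch below (plain Python, not Skipjack itself, and nowhere near a vetted cipher) illustrates the core property of a symmetric-key algorithm, where one shared key reverses its own encryption:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing the key with a counter.
    A toy construction for illustration only -- not a real cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def crypt(key: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; the same call both encrypts
    and decrypts, because XOR is its own inverse."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"shared secret"
ciphertext = crypt(key, b"attack at dawn")
# The identical key recovers the plaintext -- so an escrowed copy of the
# key grants the escrow holder full access to every message.
assert crypt(key, ciphertext) == b"attack at dawn"
```

An eavesdropper without the key sees only the scrambled bytes; an escrow agent holding a copy of the key sees everything.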

Yet the Clipper Chip project failed for a number of reasons. First, it faced intense backlash from the cryptography community and privacy advocates, who opposed any effort to weaken encryption under any circumstances. Second, encryption software was quickly becoming ubiquitous, and the public was already accustomed to private communications. Finally, the hardware-based design complicated the rollout: placing a chip in each individual device was expensive and tedious. The Clipper Chip was the U.S. government's first unsuccessful attempt to weaken encryption technology, and the same basic debate has intensified over the last few years, with both sides unwilling to compromise.

The current focus of the encryption debate is end-to-end encryption, in which data is encrypted on the sender's device and can be decrypted only on the intended recipient's device, not by the service provider in between. This design means tech companies cannot both guarantee private, secure communications for users and provide law enforcement agencies with a key to decrypt the same conversations. Doing both is technically impossible.
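The reason the provider is locked out is that in end-to-end designs the decryption keys are derived on the endpoints and never leave them. A minimal sketch of one common building block, a Diffie-Hellman-style key exchange (with deliberately toy-sized parameters; real deployments use far larger groups or elliptic curves), shows why a server relaying the traffic cannot recover the key:

```python
import hashlib
import secrets

# Public, toy-sized group parameters. P is the Mersenne prime 2**127 - 1;
# production systems use 2048-bit-plus primes or elliptic curves.
P = 2**127 - 1
G = 3

# Each endpoint keeps a private exponent and shares only a public value.
a = secrets.randbelow(P - 3) + 2          # Alice's secret, never transmitted
b = secrets.randbelow(P - 3) + 2          # Bob's secret, never transmitted
A = pow(G, a, P)                          # Alice sends A through the server
B = pow(G, b, P)                          # Bob sends B through the server

# Both endpoints derive the same shared secret. The server relaying the
# messages sees only A and B, from which recovering the secret requires
# solving the discrete logarithm problem.
alice_key = hashlib.sha256(pow(B, a, P).to_bytes(16, "big")).digest()
bob_key = hashlib.sha256(pow(A, b, P).to_bytes(16, "big")).digest()
assert alice_key == bob_key
```

Because the provider never possesses the key, there is nothing it can hand over in response to a warrant, short of redesigning the system to retain one, which is precisely the backdoor at issue.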

The U.S. Department of Justice and legislators across the political spectrum are still examining ways to persuade tech companies to provide law enforcement agencies with exceptional access to track criminal activity; however, this opens up the possibility of abuse. Tech companies fear a backdoor would leave their customers unprotected from malicious actors and unwanted surveillance. Many companies also note that they do not retain users' messages for extended periods and that they themselves lack the technical ability to break their own encryption.

Additionally, there is broad concern from computer scientists and privacy advocates that providing law enforcement with the tools to intercept and decrypt messages will lead to increased security risks and grave human rights violations. It could deeply affect marginalized communities because of the threat of increased government hacking and surveillance. The security risks grow as user data crosses national borders and becomes exposed to interference from multiple governments. Encryption also protects whistleblowers and journalists around the world, allowing them to communicate securely and publish their work with less risk of exposure.

Still, the U.S. Department of Justice hopes to pressure tech companies by pointing to laws passed in the United Kingdom (the Investigatory Powers Act 2016) and Australia (the Assistance and Access Act 2018), which give their law enforcement agencies legal authority to require tech companies to build means of government access to encrypted messages. Since many corporations operate worldwide, this legal precedent may increase compliance among U.S.-based tech companies, potentially weakening end-to-end encryption and heightening data insecurity for their customers.

The debate surrounding encryption and backdoors is difficult to resolve because both law enforcement agencies and tech companies hold absolutist positions. There is evidence that criminals have used encrypted devices to "go dark" from law enforcement while planning serious crimes, such as, officials argued, the November 2015 Paris terrorist attacks. However, providing law enforcement agencies special decryption privileges could create new insecurities for individual consumers, leaving marginalized communities vulnerable to malicious actors and threatening the integrity of tech platforms. It is a "security vs. security" dilemma: bad actors exploit encryption even as everyday citizens and large institutions depend on it to protect the vast amounts of personal data being transmitted.

New capabilities on the horizon may shift the foundations of the debate. The rapid development of quantum computing could eventually allow computers to break the public-key algorithms that underpin much of today's encryption. Additionally, the likely adoption of "user-controlled" encryption would give end users total control over the keys required to recover their data. Both developments would fundamentally reshape and further complicate the encryption impasse between tech companies and law enforcement agencies.

The conversation must move past the broad and absolutist positions of each side by breaking the encryption argument down into its component parts. Focusing on smaller debates, like mobile phone encryption for data at rest, meaning data stored on a device rather than in transit between devices, could alleviate the concern that exceptional access for law enforcement would leave data vulnerable to foreign adversaries and other criminal activity. Ultimately, any technical solution and subsequent policy proposal should be tested against principles that permit narrowly limited access for law enforcement while addressing the privacy and equity concerns of consumers. The debate surrounding open encryption remains fraught, but by focusing on smaller, short-term solutions, both sides might find small compromises in this crucial privacy/security conundrum.



About the Author: 

Pragya Jain is a 2020-2021 CSINT Fellow and current undergraduate sophomore studying International Relations with a minor in Data Science. She has an interest in how emerging technologies can greatly change global relations. Her research focuses include clean energy and technology as well as the implications artificial intelligence will have on humanity.