Fri. Apr 26th, 2024

It’s Friday night. You’re walking home from a party, a little drunk, and a passing cop suspects you of public intoxication.

What if those photos of beer funnels and keg stands, not to mention some embarrassing childhood snapshots, stored safely behind the passcode of your iPhone were fair game for snooping? Would law enforcement be going too far, unlawfully violating our privacy?

These are just some of many questions behind Apple’s heated standoff with the Federal Bureau of Investigation over creating a “backdoor” to the iPhone, and the answer is complicated.

As I’m sure most of us have seen in the news by now, the Cupertino company is locked in a dispute with the FBI over unlocking the iPhone of the San Bernardino shooter, who killed 14 people and injured 22 in a terrorist attack last December.

Although the FBI holds a warrant to search the phone, agents are unable to access it because of the strong encryption protecting the device.

As a result, they’ve asked Apple to develop a “backdoor” allowing them to bypass the fundamental security measures defending the phone’s data.

Apple CEO Tim Cook, along with thousands of individuals online, has responded with disgust. In a letter published on Feb. 16, Cook called the backdoor “too dangerous to create,” citing the possibility of exploitation.

Granted, this seems like a valid concern. Plus, requiring a private company to create software that detracts from its users’ security doesn’t sound fair either.

This situation presents an interesting constitutional conundrum.

The Fourth Amendment of the Bill of Rights protects individuals from unreasonable searches and seizures unless the government obtains a warrant “particularly describing the place to be searched.”

Until recently, this provided a very effective guideline for characterizing a lawful search. If law enforcement held a valid warrant, they could break into your home, or jimmy the lock on your car door, to perform the search if they had to.

These “places,” as the Constitution calls them, are easily identifiable and can be entered with relative ease, even without the owner’s consent.
But how do we deal with the modern reality of places that physically cannot be searched, even if a warrant is held?

This is an underlying issue in the Apple vs. FBI debate, and its resolution will set an important precedent. With that said, there are already established precedents applicable to this situation.

First, the thought of Apple being required to create new firmware at the government’s request sounds off-putting, but the government imposes requirements on companies all the time, and we don’t always place such a negative connotation on it.

Take oil companies, for instance. Congress passes bills requiring that they meet certain emission standards. This means they have to alter their business practices, or even develop new technology, to meet these standards, but citizens accept this because it’s in their best interest. It protects the environment and, by extension, their right to life.

The same could be said of the FBI’s request for the creation of a backdoor.

Apple states that this would harm consumers by making them less secure, but the absence of a backdoor could hinder the right to justice promised by the same Bill of Rights that enshrines our right to privacy.

If a murderer’s phone contains incriminating information that, through a warranted search, should legally be available to law enforcement, the inability to access that data could prevent victims or their families from finding justice.

In this way, depending on which rights you most value, backdoors could be considered too dangerous not to create. This conclusion relies on the long-established concept of guarding our rights from contradiction.

Historically, rights such as religious freedom have only been upheld to the extent that they don’t interfere with our other liberties.

For example, acting on a religious belief in murder wouldn’t be protected by the government because it interferes with our right to life.

Similarly, if the right to privacy holds the potential to detract from our right to justice, and possibly even our right to life in cases when information dealing with terrorism is involved, then caveats to privacy must be accepted if we truly believe in protecting all of our rights equally.

Furthermore, when Cook says the establishment of a backdoor will “hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data,” he is insulting the intelligence of the software engineers he claims to hold in such high regard.

Surely, through collaboration between Apple, the FBI, and other top cybersecurity experts, encryption can be created that retains formidable security for users while also allowing access by law enforcement when the need arises. It may not be the most secure system possible, but it is the most necessary.

The individuals supporting Apple’s steadfast position against the FBI’s requests seem to be promoting privacy as a sort of “almighty” right.

I’ll admit privacy is a critical component of our freedom, but I certainly don’t consider it more important than our right to justice.

We cannot let the protection of one right endanger the exercise of others. If you ask me, Apple and other industry leaders need to create a safe “backdoor” to fix this dilemma.

Luckily, developing solutions to society’s problems is what tech does best.

Bryce Detweiler is a third-year student majoring in communications studies and philosophy. He can be reached at BD846487@wcupa.edu.
