Apple’s Fight With the FBI Highlights Privacy vs. Security Debate
The FBI wants evidence from a terrorist’s phone, but will that make your personal data less secure?
A legal fight between Apple and the FBI is highlighting some critically important issues in the debate between privacy and security.
After the San Bernardino attacks, an iPhone 5C belonging to one of the two shooters, Syed Rizwan Farook, was among the evidence gathered by the FBI. The FBI obtained a warrant to search the contents of the iPhone, but Farook’s iPhone, like most people’s, is protected by a passcode that encrypts the data on the phone and prevents anyone without the code from accessing it.
Apple’s security systems are designed to thwart hacking attempts like “brute-force attacks” by imposing delays after wrong passcode guesses and by offering an auto-erase function that wipes the phone’s data after ten incorrect attempts. The FBI went to court to obtain an order compelling Apple to help it get around these security features and access the data on Farook’s phone.
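To see why these features frustrate brute-force guessing, the policy described above can be sketched in a few lines of code. This is an illustrative toy model only: the escalating delay schedule and the names used here are assumptions, not Apple's actual implementation (which enforces these rules in hardware and firmware, not in application code).

```python
# Toy sketch of a passcode lockout policy like the one described above:
# escalating delays after wrong guesses, and an auto-erase after ten
# failures. The delay values are illustrative assumptions.

WIPE_THRESHOLD = 10  # tenth wrong guess triggers auto-erase

def delay_after_failures(failures: int) -> int:
    """Seconds to wait before the next attempt is allowed (assumed schedule)."""
    if failures < 5:
        return 0
    schedule = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}
    return schedule.get(failures, 3600)

class PasscodeLock:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data already erased; no guess can succeed
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= WIPE_THRESHOLD:
            self.wiped = True  # auto-erase: the encrypted data is gone
        return False
```

Even with only four-digit passcodes (10,000 possibilities), the wipe threshold means an attacker gets at most ten guesses, and the growing delays make even those ten slow. Removing these two features, as the court order contemplates, is what would make rapid brute-force guessing feasible.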
Court orders Apple to help the FBI; Apple refuses to comply
On Tuesday, Magistrate Judge Sheri Pym of the Federal District Court for the Central District of California issued an order requiring Apple to provide “reasonable technical assistance” to help the FBI unlock Farook’s iPhone, including specifically instructing Apple to allow the FBI to “bypass or erase the auto-erase function” and the required delay between passcode attempts.
Judge Pym’s order included instructions for Apple to object to the order within five business days if they believed that “compliance with this Order would be unreasonably burdensome,” and that’s exactly what Apple did.
Apple CEO Tim Cook posted a statement on Apple’s website stating their objection to what he described as the government’s “unprecedented step which threatens the security of our customers.”
Calling smartphones like the Apple iPhone an “essential part of our lives” that “store an incredible amount of personal information,” Cook wrote that Apple was “deeply committed” to safeguarding customer data:
Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.
For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
Describing what the FBI wants as a “backdoor” to the iPhone, Cook argues that Apple intentionally built the iPhone with safeguards to protect customers’ data, and that this backdoor would be “too dangerous to create.”
Should we really trust the government with this power?
That’s the question posed by National Review’s Kevin Williamson. “You know what would be better than prosecuting those who helped the San Bernardino jihadists?” wrote Williamson. The painfully obvious answer: “Stopping them.”
An arranged marriage to a Pakistani woman who spent years doing . . . something . . . in Saudi Arabia? Those two murderous misfits had more red flags on them than Bernie Sanders’s front yard on May Day, and the best minds in American law enforcement and intelligence did precisely squat to stop their rampage. Having failed to do its job, the federal government now seeks even more power — the power to compel Apple to write code rendering the security measures in its products useless — as a reward for its failure…
From the IRS to the ATF to the DEA to Hillary Rodham Clinton’s super-secret toilet e-mail server, the federal government has shown, time and again, that it cannot be trusted with any combination of power and sensitive information. Its usual range of official motion traces an arc from indifference through incompetence to malice.
Where the federal government imagines that it gets the power to order a private firm to write software to do its incompetent minions’ jobs for them is anybody’s guess. Tim Cook and Apple are right to raise the corporate middle finger to this nonsense.
What do you think? Should Apple fight the court order or should they help the FBI access the data on Farook’s iPhone? In this fight between privacy and national security, where should the limits be?
Follow Sarah Rumpf on Twitter @rumpfshaker.
Guys, this isn’t a lock, or just a case of asking for information. Apple would end up complying if it were just that. The government is saying “create something and give it to us,” not “give us a piece of information you already have.” That is a big difference. If Apple had already created the program the FBI wants, a subpoena would cover it. Instead, this is some strange kind of open-ended you-are-conscripted-and-now-work-for-us order that tells Apple to spend the creativity and man-hours to write a new program to do what the FBI wants, whether Apple wants to or not: to make a set of safe-cracking tools and hand them over. First Amendment question: can the government force speech?
The one key fact missed by most …
The killers had THREE phones.
They physically destroyed two of them. Hammered them into pieces.
Why destroy TWO and leave the THIRD?
Logic indicates it has nothing on it!
Think about it.
Would you have destroyed it, if it could be used against (you / allies / contacts / funding groups)? They had the time to do it. They had the hammer in hand. So why didn’t they do it?
This is nothing more than a by-pass of the encryption debate and public opinion.
I think this might force the use of EEPROMs or ROMs in the CPUs themselves, with no ability to modify them.
Often in an investigation, even the criminal doesn’t know what will turn out to be significant information. There’s a war on, even if we’re not fighting it like one.
Still this is overreach by the government. They could and should just subpoena the data off the phone and leave it at that.