Unlocked iPhone opens privacy debate

The FBI recently dropped a lawsuit against Apple that sought to force the phone manufacturer to unlock the iPhone used by a shooting suspect in San Bernardino, California. Before the suit was dropped, Apple said it did not have the ability to unlock the iPhone, adding that creating a program to unlock it would be arduous and would create a back door that would essentially nullify the encryption meant to protect the phone user's private information. The lawsuit was dropped because the FBI claims to have found a way into the phone, though no one at the FBI is saying how.

Reuters reports, “Apple Inc has said it would like the government to share how it cracked the iPhone security protections. But the [FBI], which has been frustrated by its inability to access data on encrypted phones belonging to criminal suspects, might prefer to keep secret the technique it used.”

Lawsuit dropped, phone unlocked, case closed, end of discussion, right? Not so fast. The San Bernardino iPhone was just one of the phones the feds wanted to access, and since Apple says it doesn't have the ability to unlock the phones, some in Congress now want to compel the company to install a back door on its devices.

UPI reports that draft legislation filed by Senators Richard Burr and Dianne Feinstein "effectively requires tech firms, in accordance with a court's order, to hack into devices involved in criminal investigations when authorities can't do it." This would most assuredly require the software on every device to be preloaded with a back door that the feds (and only the feds; don't think for a moment that hackers could access a back door clearly marked "for federal agent use only") would be able to use to access encrypted data in an "intelligible format" (i.e., encrypted data must become unencrypted data).

Kevin Bankston, the director of the New America Foundation’s Open Technology Institute, told Wired “this is easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen.”

Wired notes that the legislation is dangerous because it does not limit the legal obligation placed on tech companies "to 'reasonable assistance' but rather 'assistance as is necessary,' a term that means the bill goes beyond current laws that the government has used to try to compel tech firms to help with data access."

Bankston adds that under the bill, "license distributors [such as] Apple's app store, Google's play store, [and] any platform for software applications somehow has to vet every app to ensure they have back door[s] or little enough security to comply." He adds that the proposal "seem[s] to also be a massive internet censorship bill."

This is not the first time that Dianne Feinstein has tried to force tech companies to do the government's bidding. Luckily, her previous attempts to stifle free speech and privacy have failed, and this legislation will likely meet the same fate. Privacy advocates say the bill is so bad that it has "almost zero likelihood" of passing. And that is good news for everyone who supports the right of people to develop, sell, and use encryption technology and distributed software that allows users to protect their privacy.