Category: Security

Apple vs. FBI

Syed Farook, the perpetrator of the San Bernardino attack last December, had an iPhone which is now in the FBI’s possession. Despite its attempts, the FBI has been unable to unlock the phone or to bypass the device’s privacy and encryption features.

Apple has long taken the stance that protecting the privacy and security of their users’ data is hugely important, and they have implemented strong encryption on the iPhone in order to protect this sensitive information.

Now a court order requested by the FBI is demanding that Apple build a custom version of iOS that circumvents these security features. This would, in the words of Tim Cook, “undeniably create a backdoor” to the iPhone that “would have the potential to unlock any iPhone in someone’s physical possession”.

Apple’s public refusal to comply, along with supporting statements made by some other high-profile technology companies including Google and Microsoft, has rekindled the fierce debate over the value of strong data encryption, and whether the government has the authority to demand that companies disable or defeat it.

The case is even becoming an issue in the presidential race, as candidates weigh in with their positions on encryption and privacy.

Our Digital Privacy Is Important

With the increasing pervasiveness of smartphones and other Internet-enabled devices, we rely on technology more every day for nearly everything we do. From seemingly mundane personal communications like text messaging, to business and financial activities like signing documents, searching and applying for jobs, and online banking – creating, storing, and sending our information digitally is nearly unavoidable.

When our trust that our information will remain private is violated, the consequences can be severe. It should be no surprise to anyone designing software products today that software handling critical business communications and transactions must preserve security and privacy.

But what’s less well understood is that our personal interactions online can make us vulnerable, and personal information is increasingly the target of so-called “bad actors” who want to compromise our privacy in order to commit crimes from identity theft and fraud to stalking and burglary.

As makers of software products, we have an obligation to design and build our products to embody the principle of protecting the privacy and security of our users and customers. This is true regardless of whether the products we make are used for business or finance where security is an obvious concern, or for entertainment, gaming, or social networking where the privacy risks are either less obvious or seemingly less important.

What is Strong Encryption and What Is It Used For?

The most widely used strong encryption technologies are free, open source technologies that anyone can use. They provide key functions that protect your privacy and security, and they are integrated into many products from web browsers and mobile apps, to enterprise email systems, stock trading networks, and cellular phone communications. Encryption is everywhere.

Some of the uses of strong encryption include:

  • Preventing unauthorized third-parties from accessing information in storage devices like hard disks, smart phones and “cloud” services.
  • Protecting data from snooping in transit on the Internet or cellular data networks.
  • Verifying authenticity of messages and documents, and ensuring they have not been tampered with.
  • Proving the identity of a person or company, and preventing imposters from impersonating them in order to attack others.
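The last two items in this list, verifying authenticity and proving identity, are commonly implemented with keyed hashes or digital signatures. As a minimal sketch of the idea (the key and message below are made up for illustration), Python's standard library can compute and verify an HMAC authentication tag:

```python
import hmac
import hashlib

# Illustrative only: a secret key shared by sender and receiver.
key = b"shared-secret-key"
message = b"Transfer $100 to account 12345"

# The sender computes an authentication tag over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key, message, tag):
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                            # genuine message
print(verify(key, b"Transfer $9999 to account 666", tag))   # tampered message
```

Any change to the message (or an attacker who doesn't know the key) produces a different tag, so tampering and impersonation are both detectable.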

Of course the same encryption technologies can also be used by criminals to hide information and communication from law enforcement. This is why the FBI wants Apple (and other companies) to provide tools that work around or remove the encryption technologies integrated into their software and devices.
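It's worth noting how a short passcode can protect data at all: the passcode is not used directly as an encryption key, but is stretched through a deliberately slow key-derivation function (and, on devices like the iPhone, entangled with a hardware-bound key, which this sketch omits). A rough illustration of the stretching idea, with made-up parameters:

```python
import hashlib
import os

# Hypothetical illustration of key stretching, not any device's real scheme.
passcode = "1234"        # a short user passcode
salt = os.urandom(16)    # per-device random salt

# PBKDF2 applies the hash many times, making each passcode guess expensive.
key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
print(len(key))  # 32-byte key suitable for a symmetric cipher
```

Because each guess costs many hash iterations, brute-forcing even a short passcode is slow, and devices add retry limits on top of that. Removing those protections is exactly what makes a forced "backdoor" version of the software so dangerous.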

Why Apple is Right to Fight the FBI Order

On February 16, Apple chose to take a public stand against the FBI’s order to build software that enables decrypting data on iPhones, so that the agency can collect evidence in its investigation of the San Bernardino shooting case.

To some, Apple’s opposition to the FBI may seem wrong or even immoral. After all, why would we want to prevent law enforcement from bringing criminals and their accomplices to justice? Of course we wouldn’t.

By supplying software that decrypts data stored on the iPhone in this case, Apple would also be supplying the FBI with tools that could be used to undermine the privacy and security of any Apple device, and therefore of anyone who uses one.

Here are some of the specific problems that would arise, which are too important to ignore:

As Apple CEO Tim Cook points out, and as Apple’s follow-up FAQ reiterates, making tools that subvert or remove encryption creates a real risk that criminals or other “bad actors” could steal these tools directly or replicate the methods they employ. This puts everyone at risk, including law-abiding people who use these devices to store their most personal information.

Once these tools are in the FBI’s hands, there could be tens, hundreds, or even thousands of individual people who gain access to them over time, possibly even extending to other agencies outside the FBI. Even if decryption tools could only be used on FBI premises by authorized personnel, there is no way that the government could honestly guarantee to the public that no person with access to these tools would ever violate their privacy for criminal or political purposes.

There is also a real danger that hackers or organized criminals could get access to these tools through technical means, bribery, or extortion. Criminals could sell or even “weaponize” decryption tools for identity theft, fraud, or other criminal activities, even possibly cyber terrorism.

What’s At Stake

We’ve seen, over and over again in recent years, the damage caused to individuals and businesses when criminal hackers steal credit card numbers, Social Security numbers, and other personal information. But beyond personal information like contacts and phone numbers, the information we routinely store on our devices can easily be used to gain access to other accounts and systems.

The consequences of a widespread privacy breach on the scale of all smartphone users, or even all iPhone users, could be dire, not just for Apple and the individuals directly affected, but also for the companies and agencies they work for. The potential damage that could be caused by widespread cyber-security breaches puts our economy and our national security on the line.

The erosion of our personal privacy – which strong encryption is designed to protect – could also lead to misuse by private organizations. Imagine your health insurance provider data-mining your location, your driving habits, or who you associate with. Imagine mortgage lenders monitoring your location and discovering you enjoy casinos. What if you went to interview for a job, and the interviewer knew every other company you’d sent your resume to, and every other job you’d been turned down for?

Safeguarding our digital privacy and security is critically important. Sacrificing it in the interest of a single case, even with good intentions, is a trade-off that’s not worth making. Allowing this precedent to be set would be to sacrifice constitutional rights that our country’s founders fought fiercely to obtain, and which we should also fight to protect.



This post originally appeared on the L4 Digital blog on February 25, 2016.

Apple Security

I just finished installing an SSL certificate on my site. The main reason was to prevent impersonation and man-in-the-middle attacks while I'm editing or administering the site. I was already using SSL to connect to my WordPress admin interface, but with a self-signed certificate that produces warnings in the browser (in addition to not being as secure as it should be). Now that I have a CA-backed certificate, the warnings go away.

There are some additional benefits to this:

  1. API clients like dedicated blog editing apps that validate SSL certs (as they all should when connecting securely) should now work, though I have yet to test this.
  2. Anyone who visits my site can request the secure URL, and get an encrypted connection to protect their privacy. They can also be reasonably sure that they're actually visiting my real site and not an imposter—not that I'm actually worried about imposters.
  3. Google (at least) has started ranking sites that fully support SSL higher in their searches. Not that I'm really big on SEO for my site, but it's a “nice-to-have” feature.
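On the first point, certificate validation is the default behavior in modern TLS libraries, so well-behaved clients get it for free. As a quick illustration (using Python's standard ssl module, no network connection required), a default client context refuses certificates that don't validate and hostnames that don't match:

```python
import ssl

# The default client context validates the server's certificate chain
# against the system's trusted CAs and checks the hostname against it.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate must validate
print(ctx.check_hostname)                    # hostname must match the cert
```

A self-signed certificate fails both the browser's checks and this kind of client-side validation, which is exactly why the CA-backed certificate matters.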

See also: Embracing HTTPS (Konigsburg, Pant and Kvochko)

If you see any problems, please let me know via a comment, tweet or some-such.



The Register: “The 18-year-old allegedly hacked his way into eight banking Web sites as part of a suspected $3 million fraud. He is then alleged to have posted the details of 6,500 card holders on the Web.”

Jake's Brainpan Security
