Amid all the architectural changes in information technology, the cultural changes in enterprises, and the mindset shifts toward the delivery of customer experience, the topic of information security strangely fails to change very much from year to year.
To be honest, it’s been stuck in a rut for the past five years.
How Many Times Has This Happened to You?
When the organizers of the various sessions of next week’s RSA Conference in San Francisco find out I’m a reporter assigned to cover the entire week, they unlock the dam and flood my inbox with solicitations.
If you’re in the email marketing department of your business, you know what these emails look like already.
“Dear Stocc Fultron: You’ve read the news. There’s huge controversy and explosive allegations swirling around [insert huge security incident here]. Billions of dollars are at stake. Your [insert some part of your humanity] is at risk! [Hackers | Terrorists | Your federal government | Some other government | A major department store chain] knows who you are and can track your every move!
“Not since [insert previous huge security incident here] have things gotten this bad!
“Fortressky Stop-All Security CEO Jake F3rd is more than happy to take time from recess to comment on the steps businesses need to take now to protect themselves from the onslaught of [see catastrophe above].”
Insert “Apple,” “FBI,” and “privacy” into some of the appropriate slots, and you have the pattern for a few hundred e-mails (not an exaggeration) I’ve received this week alone.
This is what the world looks like when what should be an incentive for action becomes, instead, the trigger for the next news cycle.
Outside Our Fortress Walls
Thankfully, the RSA conference has always been a place where serious discussion happens. Whether serious action takes place as a consequence always depends upon the motivations of the respective participants, but at least the discussion is real.
Apple is the most skilled organization on the planet at saying next to nothing about its products or its operations, and having so much said about that next-to-nothing as a result. If only presidential candidates were as skilled.
The company has successfully reframed the entire discussion about the Justice Dept.’s request that Apple help disable the phone-wipe capability of a suspected terrorist’s iPhone, into an issue of whether the government has the right to break encryption, or to force manufacturers to break it, or tunnel around it, on its behalf.
As I strongly suspect most of the (real) experts attending RSA next week will affirm, the (real) issue is not the strength of encryption and how to overcome it, but rather the weakness of encryption and how to buttress it.
Yes, there is the deeper, more moral, issue of our rights as citizens to our own information, and when and where we forfeit those rights. We enjoy a reasonable right to privacy as citizens, but the nature of reason evolves more quickly than does technology.
(Sadly, the absence of Justice Antonin Scalia from this conversation henceforth will make it harder for all sides of the debate to form a cohesive thesis.)
But there is this less moral, more technological issue hanging over us: Encryption is a weak protection. We talk about whether a government should have the right to a back door around encryption.
What we fail to acknowledge — especially in those hundreds of PR e-mails — is that no government needs a back door when it can sledge through the front door with ease.
Distributed computing is a force I usually discuss with respect to application architecture, and as an improvement in the way businesses work. But I rarely talk about it in the context of raw algorithmic performance.
If this year, it takes two hours to sledge through an encryption scheme, next year — thanks to distributed computing — it will take two minutes.
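The arithmetic behind that claim is easy to sketch. Assuming a fixed per-node rate of key tests and an evenly divided keyspace (the figures below are hypothetical illustrations, not measurements of any real cipher), worst-case search time shrinks linearly with the number of machines thrown at it:

```python
# Illustrative sketch only: how parallelism shrinks brute-force search time.
# The keyspace size and per-node rate are assumptions chosen for round numbers,
# not benchmarks of any real encryption scheme.

def crack_time_seconds(keyspace: int, keys_per_sec_per_node: float, nodes: int) -> float:
    """Worst-case time to exhaust a keyspace split evenly across identical nodes."""
    return keyspace / (keys_per_sec_per_node * nodes)

KEYSPACE = 2**40        # a deliberately weak 40-bit keyspace (hypothetical)
RATE = 150_000_000      # key tests per second per node (hypothetical)

one_node = crack_time_seconds(KEYSPACE, RATE, 1)
sixty_nodes = crack_time_seconds(KEYSPACE, RATE, 60)

print(f"1 node:   {one_node / 3600:.1f} hours")    # roughly two hours
print(f"60 nodes: {sixty_nodes / 60:.1f} minutes") # roughly two minutes
```

The point of the toy numbers: nothing about the algorithm improved, only the attacker's budget. That is why a scheme that is "strong enough" this year can be routine to sledge through a year or two later.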
Apple argues that iOS 8 encryption provides a lock so strong that, once its customer has activated the lock, Apple itself cannot fashion a key to open it. This contention is a clever and effective maneuver at portraying itself as the defender of the fortress.
“Strong encryption” is a joke. To be fair, in the modern world, the effort to resolve almost any issue through the construction of a beautiful fortress wall is a joke.
But it’s our first compulsion. We build walls around what we don’t understand, and then we wander aimlessly around them, like both the heroes and the villains of some rehashed, dystopic sci-fi movie, their motives and motivations so undifferentiated that we forget who’s supposed to wear what hat.
We’ve entrusted some portion of our identities and our personalities to these small devices we’ve begun carrying in our pockets and purses. Then we build digital walls around them — or trust others with that task — and tell ourselves, at last, here’s something outside our own skulls that truly belongs to us: our mobile, personal cyber-workspaces.
We cling to this ownership as though it were a human right, bestowed upon us by some deity, or by whoever was the victor in that last sci-fi epic.
Then we see the Justice Dept. serve Apple with a warrant, we portray the attack on our beautiful wall as the siege of Fort Sumter, and we look for the next version of encryption to restore our myth of impenetrability. In the sequel, we tell ourselves, the empire takes revenge.
Regardless of whatever value you place on your own digital privacy, the solution to the problem of maintaining personal security on mobile devices and the solution to the problem of law enforcement’s access to data that could save lives are the same solution: a verifiable, evolvable, personal identity system for every citizen.
We talk about this every year at RSA. This time, let’s do more than talk about it after RSA ends.
Title image of a picture print by Currier & Ives of the Siege of Ft. Sumter in the public domain