Is it a crisis?
The latest news from the NSA snooping debacle suggests it is. If they have the means to deliberately insert vulnerabilities into well-known encryption standards and circumvent others, then what were previously thought to be secure connections – to banks, email providers and search engines – may not be secure any more.
Bruce Schneier issued something of a call to arms yesterday, asking engineers to look at how to resolve these problems and re-engineer the internet to our own needs once again, rather than to those of some faceless security services personnel somewhere. I am not at all reassured by reports that the NSA only spied on their exes a few times using the powerful technologies they have at their disposal.
This got me thinking about areas of trust, which really rest on the word of large companies. One such area is SSL, which has long been criticised for its reliance on central certificate authorities as the purveyors of trust and identity. When getting an SSL certificate for your server, if you want it to be correctly recognised by web browsers, you must have your certificate issued via a root authority, such as Symantec, Comodo or GlobalSign, or a reseller of one of these. If I were the NSA, I’d try to get my own access to root certificates, so I could mount man-in-the-middle attacks on encrypted websites. That’s notwithstanding the problems already reported in the past with the issuing of root certificates to untrusted third parties.
Although an end user may see that the website is secure, there is presently no standard validation procedure to ensure that the certificate you receive is the one you would expect to receive. This has been well publicised, and there have been several cases of commercial companies using it to their advantage – notably Nokia, in their mobile web browsers.
What this essentially means is that it isn’t technically all that difficult to trick a user into sending their ‘secure’ traffic via your proxy. All the traffic is encrypted – until it reaches your proxy – where you read it all, then forward it on to the actual website the user was trying to access. They believe they are accessing the website directly, but in reality it is all being decrypted by some third party on the way.
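One way to see what is going on is to look at the certificate your client actually receives: if a proxy is intercepting the connection, the fingerprint of the certificate it presents will not match the site’s real one. A minimal Python sketch of computing such a fingerprint – the certificate body below is a placeholder blob, not a real site’s certificate:

```python
import hashlib
import ssl

def fingerprint(pem_cert: str) -> str:
    """Return the SHA-256 fingerprint of a PEM-encoded certificate.
    An interception proxy has to present its own certificate, so its
    fingerprint will differ from the one the site really uses."""
    der = ssl.PEM_cert_to_DER_cert(pem_cert)  # strip PEM armour, get raw DER bytes
    return hashlib.sha256(der).hexdigest()

# Placeholder standing in for a real certificate, so the sketch is
# self-contained; in practice the PEM would come from the live
# connection, e.g. ssl.get_server_certificate(("example.com", 443)).
placeholder = (
    "-----BEGIN CERTIFICATE-----\n"
    "AAAA\n"
    "-----END CERTIFICATE-----\n"
)
print(fingerprint(placeholder))
```

The catch, of course, is that the fingerprint you compare against has to come from somewhere the proxy can’t tamper with – which is exactly the gap the standards below try to fill.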
There are several ways to defeat this – one is the extended validation (EV) certificate that some companies, notably banks, often use. These certificates can’t easily be spoofed – so if you are seeing a green padlock, you know the browser is verifying it in a separate way. These are all well and good, but most sites do not use them, and again, they are only as secure as the keys embedded in the browser. The green bar is also worthless in Internet Explorer, which provides a way to add your own EV certificates, for ‘convenience’. I think the engineers at Microsoft rather missed the point of these certificates entirely.
A more promising solution is the DANE standard, a companion technology to DNSSEC. DANE allows the fingerprint of an SSL certificate to be published as a DNS record. Your browser can then verify that the certificate you are receiving is the one the site owner intended you to receive, and not one issued by a third party in transit.
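For illustration, here is roughly what publishing such a fingerprint looks like. DANE uses a TLSA record; a ‘3 0 1’ record means usage 3 (match this exact end-entity certificate), selector 0 (hash the full certificate) and matching type 1 (SHA-256). A minimal sketch – the domain and the certificate body are placeholders standing in for a real site and its real certificate:

```python
import hashlib
import ssl

def tlsa_rdata(pem_cert: str) -> str:
    """Association data for a '3 0 1' TLSA record: the hex-encoded
    SHA-256 digest of the full DER-encoded certificate."""
    der = ssl.PEM_cert_to_DER_cert(pem_cert)
    return hashlib.sha256(der).hexdigest()

# Placeholder PEM standing in for the site's real certificate.
pem = "-----BEGIN CERTIFICATE-----\nAAAA\n-----END CERTIFICATE-----\n"

# The record a site owner would publish for HTTPS on port 443;
# the name encodes the port and protocol the certificate is for.
record = "_443._tcp.example.com. IN TLSA 3 0 1 " + tlsa_rdata(pem)
print(record)
```

Because the record lives in the site owner’s own zone, signed with DNSSEC, a man-in-the-middle can present a different certificate but can’t make its fingerprint match the published one.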
This standard sounds great, but as yet it’s not supported by browsers. There are some extensions that allow people to use it – but the average user certainly isn’t going to install those. DNSSEC rollout has also been slow, and most people’s domains do not yet have the keys needed to verify the validity of their DNS records either.
This is promising, though – the technologies to improve the integrity of the internet are already there; it’s just a case of using them. And there’s nothing like a major security scare to push people into implementing more secure means of communication.
If the NSA want to get into your computer, they probably can. But that’s not really what we are trying to prevent – it’s the casual snooping of data, from anyone and everyone, just because they can, which is the problem. No warrants, no court orders, just rifling through your underwear without anyone’s permission.
I can see more revelations coming out in the next few months. I am already eyeing my Android phone with suspicion – it would be easy enough for the NSA or GCHQ to write nefarious code into the operating system to track people’s locations, turn on the microphone or camera, or record the calls and texts they send. We already know they get co-operation from Google, so why not? Indeed, it was already revealed that Apple was tracking user locations in an iPhone cache file – it seems to me that this could have been just the sort of helpful security issue the NSA would be happy to exploit.
The problem is that companies we trusted to act in their customers’ best interests have now been revealed not to have done so. They often seemed to prefer the approval of the NSA to that of their own customers. If that isn’t a privacy crisis, then I don’t know what is.