Should We Compromise Security For Preventing Terrorism

After the Paris attacks, politicians have been falling all over themselves trying to be more anti-terrorist than the next.  Prior to the attacks, the odds of the CISA bill passing Congress were dicey.  Now the odds are pretty high, even though that bill will do almost nothing to prevent terrorism.

One of the big issues is encryption.  Web site encryption (like HTTPS or SSL/TLS) is really not an issue because the government cracked that years ago.  It takes them a little effort, but it doesn’t really stop them.

A bigger problem is encrypted phones – iPhones and Android phones – that Apple and Google do not have the keys to decrypt.  This means that the government has to get a judge to issue a subpoena and then go to the owner, assuming the owner hasn’t been killed, say by a drone strike, and get them to comply.  If the owner is dead or not in the U.S., that is hard to do.  Hence, the government would like to have a secure back door.
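
To make the “no keys to decrypt” point concrete, here is a minimal Python sketch – not Apple’s or Google’s actual design – of the idea: the decryption key is derived on the device from the owner’s passcode plus a per-device secret, so a vendor that never sees the passcode cannot reproduce the key.

```python
# Toy sketch (NOT the real iOS/Android key hierarchy): the key is derived
# on-device from the user's passcode plus a per-device salt.
import hashlib
import os

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    # PBKDF2 with many iterations makes brute-forcing short passcodes slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 200_000)

device_salt = os.urandom(16)   # stands in for a secret unique to the device
key = derive_key("1234-my-passcode", device_salt)

# The vendor may know the salt and the algorithm, but without the passcode
# it cannot re-derive `key` -- which is why the subpoena has to go to the owner.
print(key.hex())
```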

However, secure and back door cannot exist in the same sentence.  You can have either one – just not both.  Many noted cryptographers and computer scientists signed a letter to Congress recently stating this, so it is not just me who thinks this is not possible.

Assume the government or many private companies had a skeleton key to get in – and there would need to be tens of thousands of these keys, given the number of software vendors out there.  Given the number of breaches of both government systems and private company systems, do you really think that we could keep those skeleton keys private for many years?  I don’t think so.  And wherever those tens of thousands of keys are stored would be a super hot target for hackers.
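
Here is a toy key-escrow sketch, assuming the third-party `cryptography` package (pip install cryptography).  The names and scheme are hypothetical – it just shows why a single escrowed “skeleton key” is a single point of failure: every message key is wrapped under it, so one leak opens all of the traffic.

```python
# Toy key-escrow sketch -- not any real product's design.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()        # the "skeleton key" the feds would hold
escrow = Fernet(escrow_key)

def send_message(plaintext: bytes):
    msg_key = Fernet(Fernet.generate_key().__class__.generate_key()) if False else None  # placeholder removed below
    msg_key = Fernet.generate_key()       # fresh per-message key
    ciphertext = Fernet(msg_key).encrypt(plaintext)
    wrapped_key = escrow.encrypt(msg_key) # back-door copy for the escrow holder
    return ciphertext, wrapped_key

ct, wrapped = send_message(b"meet at the usual place")

# Anyone who steals escrow_key -- one breach, wherever it is stored --
# can unwrap the message key and read the traffic:
stolen_msg_key = Fernet(escrow_key).decrypt(wrapped)
print(Fernet(stolen_msg_key).decrypt(ct))
```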

Then you have the applications to deal with.  There are thousands, if not hundreds of thousands, of applications, many written by one-person companies in countries like Ukraine or China.

Assuming the government required a back door, do you really think a developer in China would care?  I didn’t think so.  Do you really think that you could stop a terrorist from getting that software from China or some other country?  No again.

So let’s look at the real world.

According to police reports and the Wired article, police have found cell phones next to dead terrorists – like the ones who blew themselves up in Paris – and in trash cans.  Are these phones encrypted with impenetrable encryption?  No, they are not encrypted at all.

Sure, some terrorists are using software like Telegram that is encrypted.  What we have to be VERY careful about is which software is really secure and which software only pretends to be secure.  The article gives some examples.  If you believe the FBI or NSA is going to tell you which software fits in which category, then I have a bridge for sale, just for you, in Brooklyn.

Once the feds find a phone, they can go to the carrier and get the call log from the carrier side.  That gives them text messages, phone numbers, web sites visited, etc.  Is this perfect?  No, it is not.  They used these records in Paris to launch the second raid – the one in Saint-Denis – where they killed the mastermind of the first attack.  And, while they have not said this publicly, this is likely how they captured the terrorists in Belgium.
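
As a rough illustration of how far metadata alone goes, here is a short Python sketch over made-up call-detail records (no relation to the actual Paris data).  Linking several recovered phones to one common number is exactly the kind of lead carrier-side records provide, with no message content at all.

```python
# Hypothetical call-detail records: (caller, callee, timestamp) -- metadata only.
from collections import Counter

cdr = [
    ("phone-A", "phone-X", "2015-11-13T21:02"),
    ("phone-B", "phone-X", "2015-11-13T21:05"),
    ("phone-C", "phone-X", "2015-11-13T21:07"),
    ("phone-A", "phone-Y", "2015-11-14T09:30"),
]

# Which number did several of the recovered phones all contact?
contacts = Counter(callee for _, callee, _ in cdr)
print(contacts.most_common(1))   # [('phone-X', 3)] -> a likely coordinator
```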

All that being said, would the feds love all the traffic to be unencrypted?  Sure.  Does that mean they are going blind, like they have claimed?  Nope.  Not even close.

A friend who used to be high up in one of the three-letter agencies told me he has been warning them for 10 years that this was going to be a problem and that they had better plan for it.  How much planning they have done is classified – and needs to remain that way.

Creating the smoke screen that they are going blind is a great way to lull terrorists into a false sense of security – right up until the moment the drone strike happens.  If you don’t think that they are doing this on purpose, I recommend you rethink your position.

I also talked with another very high-ranking former DHS executive about whether we should weaken the crypto; he was very emphatic that the answer is no.

This is basically a repeat of the crypto wars of the 1990s, when the FBI tried to force everyone to use a compromised crypto chip (called Clipper).  The concept didn’t work then.  Now there is software being developed in every country in the world, and if the NSA or FBI thinks they can put the genie back in the bottle, they are fooling themselves.

I recommend reading the Wired article – it will provide a different perspective on the situation.

Information for this article came from Wired.
