#RSAC: Panel – Encryption and Back Doors: The Line Between Privacy & National Security

A panel of security experts gathered at RSA Conference 2017 this week to discuss ‘Encryption and Back Doors: The Line Between Privacy and National Security’, exploring some of the main ethical, technical and business issues that currently surround the topic and what they mean for the future of the data protection community.

Getting proceedings underway, panel moderator Bree Fowler, technology writer, Consumer Reports, asked:

Can you shed a little light on the nuts and bolts of back doors and encryption, and how they work?

Will Ackerly, CTO and co-founder, Virtru: Encryption I think we all know, but in terms of what a back door actually is, there are a lot of different definitions. What one person considers a back door, another may not, but I think there are really three criteria that make up a back door: intent, consent and access. The first is intent: excluding a vulnerability that was not intentional, and focusing instead on whether somebody did something purposeful to enable unintended access to your data. The second is consent: if the capability is not disclosed to the user and is hidden, that checks the box on the consent criterion. The third is access: if the capability can provide access to your data without you knowing it, it should be considered a back door. In a lot of ways it boils down to transparency. If there's a capability in an application or a device that you are not aware of, so that you have to trust someone as a result, and they can get access to your data, then that is a back door.
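The three criteria above can be sketched as a simple decision rule. This is purely an illustrative model of the panelist's framework, not any real detection tool; the `Capability` type and `is_back_door` function are hypothetical names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Capability:
    """A hypothetical data-access capability in an application or device."""
    intentional: bool         # intent: was it built on purpose, not an accidental flaw?
    disclosed_to_user: bool   # consent: does the user know it exists?
    grants_data_access: bool  # access: can it reach the user's data?

def is_back_door(cap: Capability) -> bool:
    """A capability is a back door when it meets all three criteria:
    built intentionally, hidden from the user, and able to access
    the user's data without their knowledge."""
    return (cap.intentional
            and not cap.disclosed_to_user
            and cap.grants_data_access)

# An accidental vulnerability fails the intent criterion:
print(is_back_door(Capability(intentional=False, disclosed_to_user=False,
                              grants_data_access=True)))   # False
# An intentional, hidden access path checks all three boxes:
print(is_back_door(Capability(intentional=True, disclosed_to_user=False,
                              grants_data_access=True)))   # True
```

A disclosed capability (say, a documented admin-access feature) fails the consent criterion under this model, which matches the panelist's point that the question ultimately reduces to transparency.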

So are there legitimate reasons for [US] government to have access to back doors and what would be some examples?

Deborah Plunkett, principal, Plunkett Associates LLC: Listening to what has been publicly stated by the previous administration, and to some extent the current administration, the US government certainly seems to be very much in favor of security and strong encryption, but the other side of that, of course, is the responsibility of the government to protect the populace, which includes being able to gain access to information that might be needed to keep the US safe. Those two really form a dichotomy: how can you have strong end-to-end encryption and also have a mechanism by which, should it need it, the government has a way to get information? That is the million-dollar question. First of all, should there be a way? I think technically there could be a way, but whether there should be is another question. And if technically there is a way, how can it be done so as to instill the right level of confidence and trust, and to keep that information away from those who would use it to bad effect? That really is the challenge.

Have researchers been able to create a back door without problems? Technically is this possible right now, and is this something that scientists should be chasing from a practical perspective?

Will Ackerly: There certainly have been back doors created in the past, but those technologies have not necessarily been immune to abuse. We have to be extraordinarily careful about the precedent that we set. There are some really powerful capabilities that I think can be brought to bear so that individuals have more choice, and more transparency around whether they want to give someone access to their data under certain conditions.

So is there an opportunity to make a compromise and meet in the middle when it comes to back doors and encryption, even if it’s not from a technical standpoint?

Jedidiah Bracy, editor, The International Association of Privacy Professionals: For me, if we mandate back doors in the US, that means China, Russia, other countries around the world are going to want to do it as well. I think that would be impossible to manage.

Lastly, given the uncertainties surrounding policy that are coming up, are companies changing the way that they do business, or are they going to?

Jedidiah Bracy: You’re seeing much more of a push for default encryption; I don’t think it’s a coincidence that Apple was one of the first to do it and had a big battle with the FBI. WhatsApp has done it as well, and I think you’re going to see more companies moving towards it, especially companies who want to do business in places like Europe – it’s a business model decision.

Will Ackerly: One call to action for any cryptographer here is to innovate on the algorithms that help with the user experience.

Source: Information Security Magazine