Facebook COO Sheryl Sandberg: Crypto ban won't help trap terrorists

Tells Desert Island Discs that indie apps don't hand over metadata

Facebook's chief operating officer Sheryl Sandberg has reiterated the social network's position that weakening the encryption of messaging apps isn't going to give governments what they want.

Governments and law enforcement agencies are increasingly going public with their frustration that encryption prevents them from accessing electronic messages.

That's led to fears of a renewed “crypto-war” mirroring the 1990s, when the US government tried (and failed) to mandate the “Clipper chip”, tried (and failed) to put a stop to Phil Zimmermann's Pretty Good Privacy, and tried (and failed) to limit browser SSL encryption strength.

That has led at least two of the “Five Eyes” – the UK and Australia – to pursue legislative approaches that compel messaging services to help them out.

Last year, Westminster passed the Investigatory Powers Act, which, when implemented, will let the government issue “technical capability notices” ordering operators to remove “electronic protection … to any communications or data”.

The Australian government hasn't yet detailed its proposals, but prime minister Malcolm Turnbull has cited the UK law as a model for local legislation.

Turnbull also led the Five Eyes push against encryption, a stance Germany has since followed.

Sandberg said breaking encryption is a bad idea that will leave governments with less rather than more information, because organisations like Facebook (whose WhatsApp is frequently cited as dangerous) at least try to comply with requests for metadata.

Interviewed for the BBC's Desert Island Discs, Sandberg said it's pretty simple: “The message itself is encrypted, but the metadata is not. If people move off those encrypted applications to other applications offshore, the government has less information, not more.”
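To make the distinction concrete, here's a minimal sketch of the point Sandberg is drawing: in an end-to-end encrypted message, the payload is ciphertext, but the routing metadata stays readable to the operator. The field names and envelope shape below are illustrative assumptions, not WhatsApp's actual wire format.

```python
# Illustrative only: payload encrypted, metadata in the clear.
import time
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in practice, negotiated between the endpoints
cipher = Fernet(key)

envelope = {
    # Metadata the operator can see and hand over on request:
    "sender": "+15551234567",
    "recipient": "+15557654321",
    "timestamp": int(time.time()),
    # The body only the endpoints can decrypt:
    "payload": cipher.encrypt(b"meet at the usual place"),
}

# The server can log who talked to whom, and when...
print(envelope["sender"], "->", envelope["recipient"], "at", envelope["timestamp"])
# ...but not what was said:
print(envelope["payload"][:32], b"...")
```

That metadata, Sandberg's argument goes, is what disappears if users migrate to offshore apps whose operators ignore such requests.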

Sandberg said Facebook plans to add a further 3,000 staff to the roughly 4,500 already working to identify terrorist content and hate speech.

She also restated the collaboration between several tech companies (Facebook, Google, Twitter and others) to take down illegal content.

“If a video by a terrorist is uploaded to any of our platforms, we are able to fingerprint it for the others, so that they can't move from platform to platform,” she said.
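The scheme Sandberg describes is, in essence, a shared fingerprint database. A minimal sketch follows, assuming a plain cryptographic hash for the lookup; production systems use perceptual hashes that survive re-encoding and cropping, and every name here is hypothetical.

```python
# Sketch of a cross-platform shared fingerprint database.
import hashlib

# Fingerprints contributed by all participating platforms:
shared_hash_db: set[str] = set()

def fingerprint(video_bytes: bytes) -> str:
    """Derive a fingerprint for an uploaded video (SHA-256 for illustration)."""
    return hashlib.sha256(video_bytes).hexdigest()

def on_upload(video_bytes: bytes) -> bool:
    """Return True if the upload is allowed, False if it matches flagged content."""
    return fingerprint(video_bytes) not in shared_hash_db

def flag_as_terrorist_content(video_bytes: bytes) -> None:
    """Once one platform flags a video, share its fingerprint with the others."""
    shared_hash_db.add(fingerprint(video_bytes))
```

The design choice worth noting is that only the fingerprints are shared between companies, not the videos themselves.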

But it's clear that humans are still needed as well as automation: “Context matters. If someone puts up an ISIS flag, are they doing it to recruit or are they doing it to condemn? We absolutely don't allow the first, but we want the second.” ®
