Law Enforcement Concerned as Meta Rolls Out End-to-End Encryption

Some law enforcement advocates are expressing concern that a new policy change by Facebook parent company Meta could make it harder to track down child predators and other criminals online.

Last week, Meta announced that end-to-end encryption is now the default for messages on Facebook and its Messenger platform. The feature had already been available, but users had to opt in.

Under the change, no one other than the sender and recipient will be able to read messages: not law enforcement, not Meta itself, not criminal hackers, not anyone else. The Wall Street Journal reports that Meta-owned Instagram plans to follow suit with full encryption at a later date.

Facebook Head of Messenger Loredana Crisan hailed the change as a win for privacy and downplayed safety concerns.  

“We take our responsibility to protect your messages seriously and we’re thrilled that after years of investment and testing, we’re able to launch a safer, more secure and private service,” wrote Crisan in a blog post.  “We worked closely with outside experts, academics, advocates and governments to identify risks and build mitigations to ensure that privacy and safety go hand-in-hand.”

Meta CEO Mark Zuckerberg first mentioned the plan to move toward encrypted messages in 2019, saying that some online conversations deserve absolute privacy.  

Meta’s WhatsApp is already fully encrypted.

Law Enforcement Groups Raise Objections

For years, safety advocates and law enforcement groups have warned that end-to-end encryption could make it harder to detect and investigate child exploitation, human trafficking, hate speech, and other criminal activity. Meta’s recent announcement heightened those concerns.

The Virtual Global Taskforce, an international coalition of 15 law enforcement agencies including the Federal Bureau of Investigation (FBI), expressed opposition to Meta’s plans, calling the move a “purposeful design choice that degrades safety systems and weakens the ability to keep child users safe.”

And the National Center for Missing and Exploited Children called the move a “devastating blow to child protection,” noting that Meta had been one of the leading companies in reporting child sexual abuse, with more than 20 million incidents reported to law enforcement last year.

Meta, meanwhile, says it is working on artificial intelligence (AI) technology to detect potential predators earlier, which the company says will lead to more robust reports for law enforcement organizations.

“As a society, we should be stopping that harm from happening before it takes place,” Gail Kent, Meta’s director of messaging policy, told NBC News. “It’s a big change that we’re asking from law enforcement, I completely understand that.”

However, James Babbage, director general for threats at the United Kingdom’s National Crime Agency, sees it differently, saying: “As a result of Meta’s design choices, the company will no longer be able to see the offending occurring on their messaging platform, and law enforcement will no longer be able to obtain this evidence from them. This problem won’t go away; if anything it will likely get worse.”

