
Bad News Confirmed For 1.3 Billion Facebook Messenger Users


If you’re one of Facebook Messenger’s 1.3 billion users, you have just been given some seriously bad news. Meta has suddenly confirmed delays to critical security updates, and there are other problems that should concern you even more.

Facebook is the world’s messaging giant. WhatsApp and Messenger are the two most popular cross-platform services, with 2 billion and 1.3 billion users respectively, and there are also Instagram DMs in the mix. While WhatsApp encrypts all messages by default, the other two platforms do not. Facebook announced a grand plan to fix this in 2019, shifting to a fully encrypted, integrated back-end, but those plans are not going well.

The intent is that Facebook Messenger and Instagram DMs will come under the same level of default end-to-end encryption that now protects WhatsApp. We already knew the rollout had been pushed to 2022. Now we’ve been told it will slip another year.

“We’re taking our time to get this right,” Meta’s Global Head of Safety confirmed last week. “We don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023.”

Words are important. Facebook had already said “2022 at the earliest,” and is now talking about completion a year later. You could squint and argue this is consistent, but it is clearly taking significantly longer than expected and is now caught in the mire of government pushback on encryption and child safety.

Taken at face value, this delay is a very serious issue. Meta’s Antigone Davis also reiterated that encryption is required to keep messages “safe from hackers, fraudsters and criminals,” and that “at Meta… we know people expect us to use the most secure technology available,” citing WhatsApp as the reference model.

This isn’t new. WhatsApp’s boss, Will Cathcart, has repeatedly said that encryption should be the standard for all messaging platforms. “End-to-end encryption locks tech companies out of particularly sensitive information. Will we be able to have a private conversation, or will someone always be listening in?”

It is hard to reconcile WhatsApp’s stance, again at face value, with Facebook pushing this level of security for its other billion-plus-user messaging giant another two years down the road. That is especially true when Facebook has admitted to monitoring user content on Messenger, and after we exposed its alarming handling of private document links.

This isn’t the most serious Messenger problem, though. For that we can turn to WhatsApp again, and the eleven words in its statement on fighting child exploitation that undermine Facebook’s argument for adding full encryption to Messenger: “On WhatsApp, you cannot search for people you do not know.”

Meta’s update on Facebook Messenger security was not really intended to drop the bad news on timing, but more to reassure on child safety. Apple’s poorly conceived plan to compromise iMessage’s end-to-end encryption with on-phone AI detection of sexually explicit images sent to or received by children prompted a response.


“We believe people shouldn’t have to choose between privacy and safety,” Meta’s Davis said. “We are building strong safety measures into our plans and engaging with privacy and safety experts, civil society and governments to make sure we get this right.”

No surprises in what those measures are: metadata monitoring to mine for “suspicious patterns of activity,” preventing users who admit to being adults from contacting children, filtering incoming messages, and reporting functions.

This approach, Meta says, “already enables us to make vital reports to child safety authorities from WhatsApp.” It won’t be enough—not even close. To contact someone on WhatsApp, you need their contact details. You can’t trawl WhatsApp for strangers.

Despite this, children’s charity NSPCC tells me that “10% of child sexual offences on Facebook-owned platforms take place on WhatsApp.” But its encryption means this “accounts for less than 2% of child abuse the company reports because they can’t see the content.” So, there’s an issue even on WhatsApp, and encryption makes detection much harder—this combination is very bad news for Messenger.

The reality is that adding messaging to social media platforms is bad enough, but encrypting that messaging to mask content from review is dangerous. Facebook Messenger and Instagram DMs have added disappearing messages and media, plus limited 1:1 secret chats. It is ridiculously simple to search both sites, browse user profiles and photos, and click to message. The idea that all minors on those sites “know” all their connections and can vouch for those accounts is nonsensical.

I’m clearly an advocate for end-to-end encrypted messaging—on messaging platforms. Go use WhatsApp or Signal or iMessage. But I’m not an advocate for adding such security to social media platforms where adults share a vast public space with children. Let’s remember, there’s no certainty that any profile is real.

The final point regards Facebook itself. The concept of a private conversation on Messenger, which sits within Facebook, the world’s most surveilled “public space,” is a stretch. Yes, the specifics of your content might be protected, but the platform knows everything about you and those you message. Unlike dedicated messengers, it harvests and mines all the data it can find. It knows almost everything and can infer the rest.


Facebook assures me it can address child safety without compromising encryption, but that ignores the social media issue. WhatsApp is different; it doesn’t carry those same risks, as its own statement makes clear. “Half my day is explaining to people that WhatsApp isn’t a social network,” one platform insider told me.

The stark truth is that Facebook is not the right guardian of private messaging or content. It is in the data harvesting and monetization business. There should be strict rules around the mixing of browsable social media and messaging, and around the monitoring of virtual “public spaces” that mix adults and children. There are critical lessons here for the forthcoming metaverse, which will find it hard to present itself as a safe space.

The argument around child safety on messengers, now exposed by Apple and its well-intentioned if poorly planned child safety updates, will run and run. And while WhatsApp, Signal and others will need to defend their approach, the social media platforms need an entirely different one.

Facebook is delaying a security update it says is critical, and there are arguments that this update should not take place at all and may yet be prohibited. That’s a reason to consider shifting your private messaging to WhatsApp, and to try out Signal if you haven’t already.


