Editor’s Note: Morning Tech is a free version of POLITICO Pro Technology’s morning newsletter, which is delivered to our subscribers each morning at 6 a.m. The POLITICO Pro platform combines the news you need with tools you can use to take action on the day’s biggest stories. Act on the news with POLITICO Pro.

Social media companies have been gameplanning for months — in some cases, years — to protect the integrity of the U.S. election. Now, it’s the moment they’ve all been waiting for.

— Behind the scenes with Silicon Valley: Facebook, Twitter and Google have instituted policies on misinformation, content moderation and political ads as the election crosses the finish line. Here’s where they stand today as their years-long efforts are put to the test.

— Behind the scenes with the experts: What’s keeping mis- and disinformation gurus on the edge of their seats? Leading cybersecurity, human rights and civil society voices weigh in.

THE TIME HAS COME. WELCOME TO ELECTION DAY MORNING TECH! I’m your host, Alexandra Levine, a proud voter in the battleground state of Pennsylvania.

Got a news tip? Write to Alexandra at [email protected], or follow along @Ali_Lev and @alexandra.levine. An event for our calendar? Send details to [email protected]. Anything else? Full team info below. And don’t forget: Add @MorningTech and @PoliticoPro on Twitter.

ALL HANDS ON TECH! LET’S GET READY TO RUMBLE — Physical “war rooms” may be out of the question during a pandemic, but teams and task forces at the world’s leading tech companies are working virtually as the election concludes. (Despite President Donald Trump’s claims to the contrary, that conclusion is not likely to come tonight.) After widespread manipulation of social media rocked the 2016 cycle, the pressure’s on: Silicon Valley’s performance this week could set the tone for the next four years and shape conversation for decades.

Morning Tech’s Election Day edition kicks off with a bird’s-eye view of how the platforms are preparing. But you can only do so many fire drills before the fire actually starts — and once it does, even the best-laid plans could meet unforeseen obstacles. We’ve talked to leading mis- and disinformation experts about what they’re watching heading into Election Day, night, and the possible limbo to follow.

BEHIND THE SCENES: FACEBOOK PREP — The social network has made clear that platform interference is still a threat — disclosing as recently as one week ago that foreign adversaries were targeting the U.S. election — and for months, it has been removing nefarious networks and accounts. The giant’s Elections Operations Center, launched in October and composed of threat intelligence, data science, engineering and legal leaders from across Facebook, is serving as air traffic control for these and other issues that could arise. (Those include developments around voter suppression material or political misinformation, for example.) The team also plans to address, in real-time, concerns flagged by state election officials — who have the hub on speed dial.

— If a candidate prematurely declares victory: After polls close, election-related posts on Facebook (from all users) will be labeled with a link to the platform’s Voting Information Center, which highlights real-time results from races across the country. If a candidate or party prematurely declares a win, language in those labels will be updated accordingly (“Votes are being counted. The winner of the 2020 U.S. Presidential Election has not been projected”) and displayed at the top of Facebook and Instagram feeds. More here.

TWITTER PREP — CEO Jack Dorsey hired additional personnel this year to help handle the U.S. election, with employees around the world expected to assist with the heavy lifting. (And the company met with government officials as recently as Wednesday to strategize on potential challenges.) Twitter has banned posts with false or misleading information about the election, and in a move to slow the spread of misinformation more broadly, the platform recently put in place a prompt for users to add their own commentary before retweeting. Twitter has also taken the hardest-line approach to political advertising, banning it altogether one year ago. (Facebook and Google updated their own election ads policies in recent weeks, but neither has taken as aggressive a stance as Dorsey did.)

— If a candidate prematurely declares victory: Twitter will label those claims, which are not allowed on the platform. (Here’s what that label will look like.) But Twitter on Monday said it will consider election results official if declared by at least two outlets out of a list of seven news organizations (from CNN to Fox News) that it deems trustworthy. If Trump or Biden makes a victory claim citing one of those sources, Twitter will not apply a label.

GOOGLE PREP — The search giant is sourcing election results from the Associated Press, so those are the stats you’ll see when you query Google for the winner or ask Google Assistant for an update. (Give it a try tonight: “Hey Google, what are the current election results?”) The company’s Trust and Safety teams and its Threat Analysis Group will be tracking Google and Google-owned YouTube for suspicious activity. Speaking of YouTube: The video platform will continue taking down policy-violating videos (including those with false or misleading claims about voting) and amplifying authoritative material (including election live streams from major news outlets). More here.

Putting these policies into practice — especially at a rapid pace when the stakes are so high — is easier said than done. They’re hypothetical until they’re real. Content decisions are, in many cases, no longer a matter of “keep up” or “take down,” but rather, fall somewhere in between. Here’s what else these tech companies have tried to prepare for, and what could still go horribly wrong.

NEXT UP: ASK THE EXPERTS — In typical elections, the threat of mis- and disinformation tends to be highest in the lead-up; now, the dangers could extend for days or weeks. Across the board, experts we spoke with are worried about how viral political misinformation on social media could play a role in voter suppression and intimidation, as well as how online activity could incite real-life violence. Related challenges could stretch to the presidential inauguration and beyond.

Responses from Fadi Quran, campaign director at Avaaz; Steve Grobman, chief technology officer at McAfee; and Paul M. Barrett, deputy director of the NYU Stern Center for Business and Human Rights, have been lightly condensed.

What are you most concerned about heading into the election?

Avaaz’s Quran: “This really is a democratic emergency. Members of our team are from around the world — they have tracked misinformation from Myanmar to Brazil — and what they are seeing in the U.S. is terrifying them. The ecosystem of misinformation driven by the White House, broadcast TV, alt media web pages and Facebook has created a culture of misinformation that, until now, we’ve only really seen (at this scale) under authoritarian regimes. Given that seven out of 10 Americans use Facebook, and over half say they use it as a source of news, Americans are living in different versions of reality right now.”

McAfee’s Grobman: “One of the most effective forms of disinformation that we saw in 2016, and to some degree this year, is when information is released as part of a leak or breach. My concern is that fabricated disinformation can have enhanced credibility by being combined with legitimate information that has been stolen or that can’t be verified,” he said, citing the recent disputed narrative around Hunter Biden and the notorious laptop as an example.

What is the biggest under-the-radar threat — one that you worry might be overlooked?

NYU’s Barrett: “The mass texting to cell phones of all kinds of smears, which are almost impossible to track in real time, let alone stop. Example: a video texted primarily to voters in swing state Michigan falsely accusing Joe Biden of supporting sex-change surgery for very young children.”

Quran: “The use of private Facebook groups, veiled groups on messaging apps, and platforms like Gab and Parler to mobilize violence [or] armed aggression, target voters at the polls on Election Day, and [retaliate] after Election Day.” He added separately: “There are still not enough eyes on the Spanish-language disinformation landscape, and we are becoming increasingly worried about content we’re seeing that is meant to create a divide between the Latinx and African American community.”

What types of mis- or disinformation are you most concerned about?

Quran: “Images/memes and manipulated video are largely used by the disinfo spreaders and extremists we monitor to skirt Facebook and other platforms’ policies — disinfo and calls to violence, for example, in memes and video are harder to detect than posts/links and tend to enjoy more virality. On Election Day, we’re going to be looking for [an] 11th-hour surprise, [like] dubious ‘bombshell’ reports produced by domestic actors.”

What Avaaz’s team has seen over the last few days: claims that “ballots are being sabotaged or blocked; Democrats are planning to wage a ‘coup’ against Trump if he wins reelection; there are plans to assassinate Trump; and Trump supporters are receiving threatening letters suggesting that if Trump doesn’t concede the election they will be targets of violence.”

Grobman: Deepfakes. “One of the big differences that we have in 2020, as compared to 2016, is the state of artificial intelligence has moved to a very different level of maturity, in that anybody can now create a deepfake. … You don’t need to be an expert in AI or media technology to build a deepfake video.”

Does the threat look different for the presidential vs. Senate vs. down-ballot races?

Barrett: “The X factor at the presidential level is Trump’s Twitter-driven lie machine. Trump is willing to say anything via Twitter, and his huge audience there — plus mainstream media coverage that follows — mean that he gets attention for outrageous smears, including the supposed corruption of the Biden family.”

What do you see as the worst-case scenario?

Barrett: “I’m very concerned about real-life violence incited by online exhortations, with most of the danger related to self-styled right-wing ‘militias.’”

Quran: That “extremists (motivated by what they see or consume online and/or Trump poll watcher exhortations) will mobilize a mass armed intimidation effort,” he said, citing recent violence in Wisconsin and the Kenosha Guard case as an example. His team is also concerned about widespread claims of ballot or voter fraud and questioning of the integrity of a swing state’s voting system. “The risk of these reports triggering suppression of turnout, massive protest, or litigation in swing states is concerning to us.”

What is the most important thing for voters/the American public to keep in mind?

Grobman: “We need to be very careful of not jumping to conclusions on attribution if we do see [disinformation] activity. One of the challenges in cybersecurity is [that] attribution is tough. … We need to recognize that doing attribution takes time and requires a combination of digital forensics as well as detailed law enforcement and intelligence investigation,” he said. So if there is unusual cyber activity, whether trying to persuade voting habits or discredit the legitimacy of the election, “we just have to accept that we might not know who’s doing it for a good number of weeks; the expectation that we’ll know that it’s a particular country is just an unrealistic expectation.”

Quran: “Keep in mind that misinformation is rampant and a majority of us are being exposed to it. Its aim is to confuse us, create chaos and exacerbate anxiety. To protect our individual sanity and our society, we must cut out the noise. Find your trusted sources of information and disconnect from the rest.”

Jack Dorsey will remain CEO of Twitter, WSJ reports. … Angel Arocho of Comcast, Eric Hagerson of T-Mobile and Chris Freeman of Marshall County 911 were reelected to the Next Generation 911 Institute’s board of directors; Lynne Houserman of Motorola and Ron Bloom of Frontier were also elected as new board members; Peter Beckwith of South Sound 911 was elected as chair, and Arocho, as vice chair.

It ain’t over ‘til it’s over: “How election tech could create a recount nightmare,” via POLITICO.

Podcast OTD: The latest episode of FCC Commissioner Jessica Rosenworcel’s “Broadband Conversations” features her discussion with Kathryn de Wit, manager of the Broadband Research Initiative at The Pew Charitable Trusts. Listen on iTunes, Google Podcasts, Google Play or via the FCC.

Tips, comments, suggestions? Send them along via email to our team: Bob King ([email protected], @bkingdc), Heidi Vogt ([email protected], @HeidiVogt), Nancy Scola ([email protected], @nancyscola), Steven Overly ([email protected], @stevenoverly), John Hendel ([email protected], @JohnHendel), Cristiano Lima ([email protected], @viaCristiano), Alexandra S. Levine ([email protected], @Ali_Lev), and Leah Nylen ([email protected], @leah_nylen).

TTYL.




