
What to expect from Facebook, Twitter and YouTube on Election Day

The sites are key conduits for communication and information. Here’s how they plan to handle the challenges facing them before, on and after Tuesday.

Facebook, YouTube and Twitter were misused by Russians to inflame American voters with divisive messages before the 2016 presidential election. The companies have spent the past four years trying to ensure that this November isn’t a repeat.

They have spent billions of dollars improving their sites’ security, policies and processes. In recent months, with fears rising that violence may break out after the election, the companies have taken numerous steps to clamp down on falsehoods and highlight accurate and verified information.

We asked Facebook, Twitter and YouTube to walk us through what they were, are and will be doing before, on and after Tuesday.

Facebook

Since 2016, Facebook has poured billions of dollars into beefing up its security operations to fight misinformation and other harmful content. It now has more than 35,000 people working on this, the company said.

One team, led by a former National Security Council operative, has searched for “coordinated inauthentic behavior” by accounts that work in concert to spread false information. That team, which delivers regular reports, will be on high alert on Tuesday. Facebook has also worked with government agencies and other tech companies to spot foreign interference.

To demystify its political advertising, Facebook created an ad library so people can see what political ads are being bought and by whom, as well as how much those entities are spending. The company also introduced more steps for people who buy those ads, including a requirement that they live in the United States. To prevent candidates from spreading bad information, Facebook stopped accepting new political ads on Oct. 20.

At the same time, it has tried highlighting accurate information. In June, it rolled out a voter information hub with data on when, how and where to register to vote, and it is promoting the feature atop News Feeds through Tuesday. It also said it would act swiftly against posts that tried to dissuade people from voting, had limited forwarding of messages on its WhatsApp messaging service and had begun working with Reuters on how to handle verified election results.

Facebook has kept making changes right up to the last minute. Last week, it said it had turned off political and social group recommendations and temporarily removed a feature in Instagram’s hashtag pages to slow the spread of misinformation.

On Tuesday, an operations center with dozens of employees — what Facebook calls a “war room” — will work to identify efforts to destabilize the election. The team, which will work virtually because of the coronavirus pandemic, has already been in action and is operating smoothly, Facebook said.

Facebook’s app will also look different on Tuesday. To prevent candidates from prematurely and inaccurately declaring victory, the company plans to add a notification at the top of News Feeds letting people know that no winner has been chosen until election results are verified by news outlets like Reuters and The Associated Press.

Facebook also plans to deploy, if needed, special tools that it has used in “at-risk countries” like Myanmar, where election-related violence was a possibility. The tools, which Facebook has not described publicly, are designed to slow the spread of inflammatory posts.

After the polls close, Facebook plans to suspend all political ads from circulating on the social network and its photo-sharing site, Instagram, to reduce misinformation about the election’s outcome. Facebook has told advertisers that they can expect the ban to last for a week, though the timeline isn’t set in stone and the company has publicly been noncommittal about the duration.

“We’ve spent years working to make elections safer and more secure on our platform,” said Kevin McAlister, a Facebook spokesman. “We’ve applied lessons from previous elections, built new teams with experience across different areas and created new products and policies to prepare for various scenarios before, during and after Election Day.”

Twitter

Twitter has also worked to combat misinformation since 2016, in some cases going much further than Facebook. Last year, for instance, it banned political advertising entirely, saying the reach of political messages “should be earned, not bought.”

At the same time, Twitter started labeling tweets by politicians if they spread inaccurate information or glorify violence. In May, it added several fact-checking labels to President Trump’s tweets about Black Lives Matter protests and mail-in voting, and restricted people’s ability to share those posts.

In October, Twitter began experimenting with additional techniques to slow the spread of misinformation. The company added context to trending topics and limited users’ ability to quickly retweet content. The changes are temporary, though Twitter has not said when they will end.

The company also used push notifications and banners in its app to warn people about common misinformation themes, including falsehoods about the reliability of mail-in ballots. And it expanded its partnerships with law enforcement agencies and secretaries of state so they can report misinformation directly to Twitter.

In September, Twitter added an Election Hub where users can find curated information about polling, voting and candidates. The company has said it will remove tweets that call for interference with voters or polling places, or that seek to intimidate people out of voting.

“The whole company has really been mobilized to help us prepare for and respond to the types of threats that potentially come up in an election,” said Yoel Roth, Twitter’s head of site integrity.

On Tuesday, Twitter’s strategy is twofold: one team will use algorithms and human analysts to root out false claims and the networks of bots that spread them, while another highlights reliable information in the Explore and Trends sections of its service.

Twitter plans to add labels to tweets from candidates who claim victory before the election is called by authoritative sources. At least two news outlets will need to independently project the results before a candidate can use Twitter to celebrate his or her win, the company said.

People looking for updates on Tuesday will be able to find them in the Election Hub, Twitter said.

Twitter will eventually let people retweet again without being prompted to add their own context. But many of the changes made for the election, like the ban on political ads and the fact-checking labels, are permanent.

YouTube

For Google’s YouTube, it wasn’t the 2016 election that sounded a wake-up call about the toxic content spreading across its website. That moment came in 2017, when a group of men drove a van into pedestrians on London Bridge after being inspired by YouTube videos of inflammatory sermons from an Islamic cleric.

Since then, YouTube has engaged in an often confusing journey to police its site. It has overhauled its policies to target misinformation, while tweaking its algorithms to slow the spread of what it deems borderline content — videos that do not blatantly violate its rules but butt up against them.

It has brought in thousands of human reviewers to examine videos to help improve the performance of its algorithms. It has also created a so-called intelligence desk of former analysts from government intelligence agencies to monitor the actions of foreign state actors and trends on the internet.

Neal Mohan, YouTube’s chief product officer, said that he held several meetings a week with staff to discuss the election, but that there was no last-minute effort to rewrite policies or come up with new approaches.

“Of course, we’re taking the elections incredibly seriously,” he said in an interview. “The foundational work that will play a really major role for all of this began three years ago when we really began the work in earnest in terms of our responsibility as a global platform.”

Before Tuesday, YouTube’s home page will also feature links to information about how and where to vote.

On Tuesday, Mr. Mohan plans to check in regularly with his teams to keep an eye on anything unusual, he said. There will be no “war room,” and he expects that most decisions to keep or remove videos will be clear and that the usual processes for making those decisions will be sufficient.

If a more nuanced decision is required around the election, Mr. Mohan said, it will escalate to senior people at YouTube, and the call will be made as a group.

YouTube said it would be especially sensitive about videos that aimed to challenge the election’s integrity. YouTube does not allow videos that mislead voters about how to vote or the eligibility of a candidate, or that incite people to interfere with the voting process. The company said it would take down such videos quickly, even if one of the speakers was a presidential candidate.

As the polls close, YouTube will feature a playlist of live election results coverage from what it deems authoritative news sources. While YouTube would not provide a full list of the sources, the company said it expected the coverage to include news videos from the major broadcast networks, as well as CNN and Fox News.

Starting on Tuesday and continuing as needed, YouTube will display a fact-check information panel above election-related search results and below videos discussing the results, the company said. The information panel will feature a warning that results may not be final and provide a link to real-time results on Google with data from The A.P.

Google has said it will halt election advertising after the polls officially close. The policy, which extends to YouTube, will temporarily block any ads that refer to the 2020 election, its candidates or its outcome. It is not clear how long the ban will last.

 

— New York Times: Top Stories

— Mike Isaac, Kate Conger and


Donald Trump made them furious, and organized. Now comes the big test.

For a group of women in western Pennsylvania, 2016 was a shock and a reason to get politically involved for the first time. “Forget about taking no for an answer, they’re not even asking for permission.”

Carolyn Gibbs puts on the striped pants first, then the striped jacket. The hat is the final touch. That’s if it’s an Uncle Sam day. For Statue of Liberty, it’s a mint green dress, a foam halo and, usually, a political sign standing in as the torch.

Before Donald Trump became president, Ms. Gibbs, 59, rarely dressed up for Halloween, only occasionally for a costume party.

But for the better part of four years, she has shown up to rallies in shopping centers of suburban Pittsburgh in elaborate costumes, ready for the role of playful protester.

“I’m willing to make a fool of myself for democracy,” is how she often puts it.

Yet for all her playfulness — and it is boundless — Ms. Gibbs is driven by a sense of anger and residual shock. How could so many of her neighbors in western Pennsylvania vote for a man she saw as a threat? She still finds herself stuck on the question.

“I had begun to think we were including and serving everybody in this country,” Ms. Gibbs said. “But that’s totally not true anymore.”

For the past four years, Ms. Gibbs and half a dozen women (along with one man) have poured countless hours into Progress PA, a political group they created to get Democratic candidates elected in western Pennsylvania, a part of the state that helped fuel Mr. Trump’s victory last time. Joseph R. Biden Jr. is counting on voters like them — older, suburban dwellers — to win back Pennsylvania, where polls show him ahead. But their work is less about their enthusiasm for the former vice president than their revulsion at the current occupant of the White House.

— New York Times: Top Stories


Trump scorns his own scientists over virus data

A public scolding of the C.D.C. chief was only the latest, but perhaps the starkest, instance in which the president has rejected not just the policy advice of his public health officials but also the facts and information they provided.

 

— NYT: Peter Baker


Trump accuses judge of ‘stacking the deck’ against him in tax ruling

The president is appealing an order that allowed his tax returns and other financial records to be released to the Manhattan district attorney.

— NYT: William K. Rashbaum and Benjamin Weiser