Facebook war room to combat manipulation

Elias Hubbard
October 20, 2018

In an otherwise innocuous part of Facebook's expansive Silicon Valley campus, a locked door bears a taped-on sign that reads "War Room". On the walls are posters of the sort Facebook frequently uses to caution or exhort its employees.

Inside the war room at Facebook's Menlo Park, California, headquarters are more than two dozen experts from divisions across the company, including software engineering, threat intelligence, data science and legal, working to identify real-time threats and prevent the spread of harmful content.

CEO Mark Zuckerberg initially brushed off accusations of Russian meddling, but the company soon found thousands of ads promoting false information that traced back to Russia.

Facebook's blasé attitude shifted as criticism of the company mounted in the U.S. Congress and elsewhere.

"These employees represent and are supported by the more than 20,000 people working on safety and security across Facebook".

The "war room is something new that we're trying in terms of having a physical presence, and we'll reevaluate and see how it works here after the USA midterms to determine if this is something that we want to continue for major elections going forward", said Katie Harbath, Facebook's director of global politics and government outreach.

Samidh Chakrabarti, Facebook's director of elections and head of civic engagement, told CNN that the war room is "really the culmination of two years of massive investments we've made both in people and technology to ensure that our platforms are safe and secure for elections". Since then, the social networking giant has rolled out several initiatives to fight fake news and bring more transparency and accountability to its advertising.

The efforts are also coordinated with Facebook's fact-checking partners around the world, including media organizations such as AFP as well as university experts.

With the new ad architecture in place, people can see who paid for a particular political ad.

"It's always a challenge because there are always going to be people when you have a public debate leading up to an election that are is going to try to target them", Gleicher said. "On balance, I would say they that are still way off".

Nathaniel Gleicher, Facebook's head of cybersecurity, told CNBC that if the system it has in place now would have existed in the same form in 2016, the company could have prevented alleged Russian manipulation of the 2016 US presidential election.

Ms. McKew believes Facebook is conflicted about blocking some content it already knows is suspect "because they keep people on their platform by sparking an emotional response, so they like the controversial stuff". Facebook, for its part, argues that the war room is a prime example of its efforts.

These preparations helped a lot during the first round of Brazil's presidential elections, Facebook claimed, though it declined to elaborate. Now the company hopes to avoid a repeat of 2016 in the upcoming U.S. midterms as well as in elections across the globe. Inside the room, the walls are draped with American and Brazilian flags, alongside clocks showing different time zones and televisions blaring cable news.

The war room team, Facebook said, also helped quickly remove hate speech posts created to whip up violence against people from northeast Brazil after the first round of election results were called. While on duty, war room workers are allowed to leave the room only for short bathroom breaks or to grab food to eat at their desks.

However, it's unclear whether the war room will become a permanent fixture or just a temporary Band-Aid. "This is our new normal", the company says.
