Congress did it when Google refused to send CEO Sundar Pichai to a hearing in August, and a multi-governmental hearing in Britain into Facebook’s transgressions did it on Tuesday with Facebook CEO Mark Zuckerberg, making sure a chair with his name on it was prominently positioned at center stage for the TV and newspaper cameras.
(Facebook VP of Public Policy Richard Allan was questioned in his place.) There was also a predictable amount of grandstanding to go along with the dramatic accoutrements, with MPs and senior ministers from Britain, Canada, Argentina, and several other countries complaining about Facebook. “Our democratic institutions have been upended by frat-boy billionaires from California,” Canadian MP Charlie Angus said.
The theatrics actually began even before the UK hearing, when Damian Collins, the head of the committee looking into the spread of misinformation and the Cambridge Analytica scandal, used a little-known legal gambit to compel an American businessman, Ted Kramer, to turn over documents he had on his laptop. Kramer was escorted to the House of Commons by the Serjeant-at-Arms (the guy who usually holds the ceremonial mace) and forced to hand over the files, including emails and other documents related to a lawsuit between his company and Facebook (Kramer has since alleged that UK authorities may have been tipped off to his whereabouts by Guardian reporter Carole Cadwalladr, who has so far declined to comment). Collins said the committee may even publish parts of the documents at some point in its investigation, possibly as soon as next week, despite the fact that they have been sealed by a California court.
So after all that drama, do the documents contain any smoking gun proving Facebook’s guilt, either in the Cambridge Analytica scandal or in the proliferation of fake news? So far the biggest revelation to emerge from the files is that an unnamed Facebook engineer reportedly told his superiors at the social network in 2014 that unknown entities using Russian IP addresses were accessing as many as three billion data points a day. Allan, however, told the committee that “any information that you have seen in that cache of emails is at best partial, and at worst potentially misleading.” Facebook later provided copies of the emails showing that the engineer in question misunderstood what he was looking at: the data was being accessed by users of Pinterest, and the volume was more like six million data points a day, not billions.
Whether or not the Kramer documents reveal anything new, the issue of when and how much Facebook knew about Russian agents using the platform to spread disinformation is a fairly crucial question for regulators. An internal report put together by the social network’s security team as recently as last year stopped short of attributing any of the suspicious behavior on the platform to the Russians, except to reference, obliquely, US intelligence agency reports. Alex Stamos, the former head of security for Facebook, has said Facebook executives didn’t shut down any investigations or scrub any reports. But it is increasingly obvious that the company downplayed that influence, and only admitted to knowing about it after Zuckerberg wound up on the hot seat in front of Congress.
The British parliamentary committee is also interested in Kramer’s claim that the social network didn’t just allow apps like his to access the “friend graph” data of users, it actively promoted that access. This is less of a smoking gun than it might seem, since the API access Kramer and Cambridge Analytica had was commonplace at the time. Facebook believed that making it easier for users to find each other on various apps and services was a positive thing (part of Zuckerberg’s stated mission to make the world more connected), and that the best way to do that was to allow those services to see who else in a user’s friend group also used an app or service. Who could possibly have known that this friend data would be used to create psychographic profiles in an attempt to convince people to vote for a former reality-TV star with a terrible track record?
Here’s more on Facebook, the UK, misinformation, and Cambridge Analytica:
- Sending the cat: If you are looking to relive the UK hearing, reporters for The Guardian were live-blogging it as it occurred, including a Belgian representative explaining that when someone doesn’t show for an appointment, the Flemish say he “sent his cat.”
- Sandberg under fire: Bloomberg’s Sarah Frier says Chief Operating Officer Sheryl Sandberg is coming under fire inside Facebook for her role in some of the controversies that are hounding the social network, including a lack of action against misinformation.
- Facebook purge: Rolling Stone writer Matt Taibbi talks to a man who started a left-wing news site that got millions of visitors from Facebook, until it was suddenly shut down in a crackdown on spam and “inauthentic behavior.” But the site’s creator says he published fact-checked news and was never told by Facebook why his site was removed.
- The “war room” lives: One of the misinformation measures Facebook promoted recently was a “war room” in which staff would target fake news. After a Bloomberg piece said the war room was being disbanded, the company said it wasn’t, and that it is just trying to expand its misinformation work in other ways, or something.
- Mistakes were made: Facebook’s former head of security, Alex Stamos, wrote a Washington Post editorial in which he admitted the social network did not do enough to recognize the threat from Russia, but he said others were also to blame.