Going dark
"Democracy dies in darkness," slogans the Washington Post on its morning emails and front page. This week, a case in point surfaced with the news that during the US presidential campaign Facebook sold about $100,000 worth of ads to a Russian company with a history of pushing pro-Kremlin propaganda. A "troll farm", the article calls it. The news comes directly from Facebook, in both testimony to congressional investigators and a blog post by Alex Stamos, the company's chief security officer. Stamos says there were roughly 3,000 of these ads and that they were associated with about 470 "inauthentic" accounts. (That is, the accounts were real enough, but the people behind them weren't provably who they said they were.) "We don't allow inauthentic accounts on Facebook," he advises, adding that the accounts have been shut down.
The Post writers note that although Facebook reports in that blog post that about a quarter of these ads were geographically targeted, company officials declined to provide specifics about which areas or demographic groups were the targets. The company also declined to disclose samples of the ads in question, citing "federal law" and Facebook's own data policy as reasons why it couldn't disclose user data and content. However, a company official did say that the ads were directed at "people on Facebook who had expressed interest in subjects explored on those pages, such as the LGBT community, black social issues, the Second Amendment, and immigration". Stamos says in his blog post that the "vast majority" of these ads didn't specifically reference the election, voting, or any specific candidate, but appeared to focus on "amplifying divisive social and political messages across the ideological spectrum". This is the kind of strategy that worked so well for Iago. Games People Play author Eric Berne would call this one "Let's you and him fight". Democracy can also die in divisiveness, as in "divide and conquer".
At the Atlantic, David A. Graham considers the history of Russian interference efforts and the implications for election law: US campaign finance laws bar foreigners from spending to influence an election. When you're talking about the auditable financial accounts belonging to candidates and their campaigns, that's a manageable prospect, as the Sunlight Foundation shows. What makes the Facebook situation hard is the lack of insight into the company's inner workings: darkness deeper than that of the "dark web" because it's defended by well-paid experts.
When James Comey, the head of the FBI under the Obama administration, complained about "going dark", he meant encryption. This is the same back door argument so exercising UK Home Secretary Amber Rudd at the moment, and it elicits the same response: it's dangerous, unworkable, and counter-productive. All of this was laid out clearly in the 2015 paper Keys Under Doormats: Mandating Insecurity, written by a parliament of security experts, who noted that today's law enforcement has access to a wildly greater supply of data about all of us than at any time in history.
What is really at risk of going dark is our visibility into public life as more and more of it moves onto proprietary platforms whose inclination is to stockpile information rather than make it transparent. In another of this week's Facebook stories, Politico's Jason Schwartz talks to Facebook's new army of fact-checkers and finds that the company's refusal to share data about their results is hurting their ability to decide which stories to prioritize for fact-checking. The only feedback they apparently get is the assurance that false news is decreasing on Facebook. It's hard to know what that even means: that fewer fake stories are shared, that fewer people see fake stories, or, hardest to measure, that the stories' influence is less? Have they found any unwanted side effects, such as the disappearance of real stories? Recall that only a few months ago, Facebook's leaked training slides showed that the company's policies are already a mess of ad hoc precedents.
Along with this is the Slate story from a couple of weeks ago following the post-Charlottesville purge of hate speech, in which April Glaser finds that the "alt-right", Nazis, and white supremacists are building their own web of social media sites. This is unsurprising to anyone who's seen filmmaker Jen Senko's excellent documentary, The Brainwashing of My Dad. In studying the personal transformation of her own father under the influence of a steady diet of Fox News and Rush Limbaugh, Senko unearths the history of right-wing conservative media. Roger Ailes, a strategist for Nixon's 1968 campaign and later chairman of Fox News, played a key role in creating a media of "our own". The bubble that effort created was bad enough; now we have a group of already-alienated, angry people pushed together, even further from the mainstream, onto platforms where they can bond as "martyrs" and "refugees". What could possibly go wrong?
In lamenting the end of the Sun operating system Solaris, Bryan Cantrill says that becoming proprietary is the moment of death for software. Only open source, he writes, lives eternally. The same is true of public discourse.
Illustrations: Facebook logo; James Comey; Jen Senko's dad.
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.