I never paid for it in my life
So Jaron Lanier is back, arguing that we should be paid for our data. He was last seen in net.wars two years back, arguing that if people had started by charging for email we would not now be the battery fuel for "behavior modification empires". In a 2018 TED talk, he went on to argue that we should pay for Facebook and Google in order to "fix the Internet".
Lanier's latest disquisition goes like this: the big companies are making billions from our data. We should have some of it. That way lies human dignity and the feeling that our lives are meaningful. And fixing Facebook!
The first problem is that fixing Facebook is not the same as fixing the Internet, a distinction Lanier surely understands. The Internet is a telecommunications network; Facebook is a business. You can profoundly change a business by changing who pays for its services and how, but changing a telecommunications network that underpins millions of organizations and billions of people in hundreds of countries is a wholly different proposition. If you mean, as Lanier seems to, that what you want to change is people's belief that content on the Internet should be free, then what you want to "fix" is the people, not the network. And "fixing" people at scale is insanely hard. Just ask health professionals or teachers. We'd need new incentives.
Paying for our data is not one of those incentives. Instead of encouraging people to think more carefully about privacy, being paid to post to Facebook would encourage people to indiscriminately upload more data. It would add payment intermediaries to today's merry band of people profiting from our online activities, thereby creating a whole new class of metadata for law enforcement to claim it must be able to access.
A bigger issue is that even economists struggle to understand how to price data; as Diane Coyle asked last year, "Does data age like fish or like wine?" Google's recent announcement that it would allow users to set their browser histories to auto-delete after three or 12 months has been met by the response that such data isn't worth much three months on, though the privacy damage may still be incalculable. We already have a class of people - "influencers" - who get paid for their social media postings, and as Chris Stokel-Walker portrays some of their lives, it ain't fun. Basically, while paying us all for our postings would put a serious dent in the revenues of companies like Google and Facebook, it would also turn our hobbies into jobs.
So a significant issue is that we would be selling our data with no concept of its true value or what we were actually selling, to companies that at least know how much they can make from it. Financial experts call this "information asymmetry". Even if you assume that Lanier's proposed "MID" intermediaries that would broker such sales would rapidly amass sufficient understanding to reverse that, the reality remains that we can't know what we're selling. No one happily posting their kids' photos to Flickr 14 years ago thought that in 2014 Yahoo, which owned the site from 2005 to 2015, was going to scrape the photos into a database and offer it to researchers to train AI systems that would then be used to track protesters, spy on the public, and help China surveil its Uighur population.
Which leads to this question: what fire sales might a struggling company with significant "data assets" consider? Lanier's argument is entirely US-centric: data as commodity. This kind of thinking has already led Google to pay homeless people in Atlanta to scan their faces in order to create a more diverse training dataset (a valid goal, but oh, the execution).
In a paywalled paper for Harvard Business Review, Lanier apparently argues that data should instead be viewed as labor. That view, he claims, opens the way to collective bargaining via "data labor unions" and mass strikes.
Lanier's examples, however, are all drawn from active data creation: uploading and tagging photos, writing postings. Yet much of the data the technology companies trade in is stuff we unconsciously create - "data exhaust" - as we go through our online lives: trails of web browsing histories, payment records, mouse movements. At Tech Liberation, Will Rinehart critiques Lanier's estimates, both the amount (Lanier suggests a four-person household could gain $20,000 a year) and the failure to consider the differences between and interactions among the three classes of volunteered, observed, and inferred data. It's the inferences that Facebook and Google really get paid for. I'd also add the difference between data we can opt to emit (I don't *have* to type postings directly into Facebook knowing the company is saving every character) and data we have no choice about (passport information to airlines, tax data to governments). The difference matters: you can revise, rethink, or take back a posting; you have no idea what your unconscious mouse movements reveal and no ability to edit them. You cannot know what you have sold.
Outside the US, the growing consensus is that data protection is a fundamental human right. There's an analogy to be made here between bodily integrity and personal integrity more broadly. Even in the US, you can't sell your kidney. Isn't your data just as intimate a part of you?
Illustration: Jaron Lanier in 2017 with Luke Robert Mason (photo by Eva Pascoe).
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.