Well, the truth is that I am. At least, I sometimes write about tech. I’m sorry I am, but it pays the bills. I’ve accepted that no one will ever pay me a month’s rent for a day spent looking at and writing about art. But what I’ve earned writing data journalism has allowed me some financial freedom to do so at will. In the gig economy (or information economy, if you prefer), there’s no better avenue to freedom than to sell out bigly for only half your time. I’m aware, though, that this won’t last. My side gig is one of thousands imperiled by algorithms designed to do the same thing, cheaply.
An example might explain what all of this has to do with anything. This summer, Wired declared 2016 the year movies ceased to matter. Their evidence? No one saw this one superhero movie, or those who did weren't tweeting about it. Instead, the writer's feed was bulked up with chatter about Game of Thrones, Beyoncé's "Lemonade," and "dank memes." Besides, movies have been around forever anyway. (Never mind that so have music and TV.) Our third Michelangelo will be a Content Creator, and Amazon their Medici. Get used to it. The piece caused quite a stir, at least among those whose pageviews were registered; it was all I read about for days.
But 2016 also saw sweeping and historic changes to the Academy of Motion Picture Arts and Sciences, designed to increase diversity among its board of governors in response to the previous year's #OscarsSoWhite boycott. Brian Raftery, the writer of the Wired piece, relegates film to the list of dead media at the very moment women and people of color are being welcomed in.
This is relevant because Raftery staked a critical position on whatever information presented itself to him, claiming a foundation of objective data (as if such a thing exists!) rather than looking outward himself. In doing so, he confused information for knowledge and placed too much faith in our models. What information is or is not available is ontological: it determines a finite set of possible computations, which only might serve as accurate models of actual circumstance. The Wired piece is a small example; it would take significantly more space to parse the systemic failure of pollsters whose models, almost without exception, predicted a Clinton presidency. That failure owed, in part, to their inability to account for the social media echo chambers of misinformation and half-truths, which Facebook made vague promises to address on November 10. It's absurd to believe that a computational model could ever account for every bit of data, or even that a reductionist model could account for significant aberrations. And yet, in politics and in culture, these lately seem to be the methods of critics who, like techies and investors, value prescience over insight.
2016 was an awkward year for progress. SpaceX blew up the Internet over Cape Canaveral, and the iPhone 7 was released. So was Musk's plan to colonize Mars. Earth, too, is outdated and vulnerable to water, so leave it to the oceans along with your 6S. Oh, and better leave the Leonardos behind too. There's no room on the spaceship, and it might explode, but that's the danger we signed up for. Anyway, we'll make new art once we're there.