In the early 2000s, some philosophers got together to tackle the problem of philanthropy and came up with the idea of “effective altruism,” whereby people could use data and science to direct their philanthropic dollars to where they might do the most good. This was in the early days of the popular internet, but really an extension of 20th century data culture — as computers and networks spread, more people could measure more and more things, and that, the argument goes, makes decision-making better.
More on that in a bit.
A consequence of effective altruism was the idea that big money philanthropists can have the biggest impact on the world, so people should make decisions to maximize their earnings, which will in turn enhance their charitable potential. Sam Bankman-Fried, the founder of a collapsed cryptocurrency exchange and related hedge fund and the subject of an ongoing criminal fraud and money laundering trial in the U.S., was a huge proponent of this idea.
On the surface, it’s a very attractive notion — it justifies the pursuit of wealth while flattering our egos by praising our good intentions. It’s like praying to a god to intervene on your behalf in the Powerball drawing by promising to donate half the winnings to orphans and refugees. The effective altruism people dug deeper, though, and saw that a consequence of this belief is that people should, from an early age, pursue fields of study and career paths that will enable wealth to fund philanthropy.
Suddenly, following your passion to pursue a studio arts degree, even at the Rhode Island School of Design, becomes a moral failing. You’re not going to build a fortune on the level of Jeff Koons or Damien Hirst, after all. The data show that even most talented and successful artists wind up just scraping by, with little money left over to buy mosquito nets for Africa. Go to law school, instead! And, for goodness’ sake (really, for the sake of goodness in the world) study corporate law, don’t become a poorly paid public defender. The world needs you!
Now, data without statistical analysis or context is just data, and it can be used to say anything. For example, on X (née Twitter) this week, I saw a journalist make an interesting point about Target closing stores in San Francisco and New York City. Target says it is closing the stores due to losses from rampant theft. But, argued this journalist, the stores Target is shuttering actually reported fewer incidents to police than other locations that remain open. So, Target is up to something. Well, maybe, but maybe not. In this case, the journalist picked one data point (police reports) that doesn’t tell you a whole lot about why those stores are being closed. “What is the ratio of police reports to revenue?” is the first question that jumps to the Middlebrow mind. Because if I have a store with 50 complaints a month and one with 60 complaints a month, but the store with fewer complaints has drastically lower sales, I would close the store with fewer complaints and still blame the thefts.
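To make the ratio argument concrete, here is a minimal sketch in Python. The complaint counts (50 vs. 60 per month) come from the example above; the store names and revenue figures are entirely hypothetical, invented only to show how a lower raw complaint count can still mean a worse complaints-to-revenue ratio.

```python
# Hypothetical stores: only the complaint counts (50 vs. 60) come from
# the example in the text; the revenue numbers are made up.
stores = {
    "store_a": {"complaints_per_month": 50, "monthly_revenue": 200_000},
    "store_b": {"complaints_per_month": 60, "monthly_revenue": 900_000},
}

for name, s in stores.items():
    # Complaints per $100k of revenue: a crude proxy for how heavily
    # theft weighs against what the store actually brings in.
    ratio = s["complaints_per_month"] / (s["monthly_revenue"] / 100_000)
    print(f"{name}: {ratio:.1f} complaints per $100k revenue")
```

Under these made-up numbers, the store with fewer raw complaints (store_a) has a far worse ratio, which is the journalist’s missing denominator.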
But it’s also possible to be too clever with data and statistical analysis, or to just miss the point entirely. Our friend and soon-to-be-felon Sam Bankman-Fried complained to journalist Michael Lewis that there is no way William Shakespeare can be the greatest writer to have ever lived. Citing “Bayesian priors,” he said it’s statistically impossible that the greatest writer in the world would have been born in 1564. I’ve since seen his adherents (and Bankman-Fried still has followers) argue that humans have become so much better at everything else since 1564, from running to weightlifting to making computers, that there’s just no way Shakespeare hasn’t been surpassed by now.
If you’re a Middlebrow, you’re rolling your eyes right now. First, the “greatest writer who ever lived” idea is just stupid, and even Shakespeare would say so. We don’t read Shakespeare in 2023 because he is the greatest; we read him because his stories still largely resonate with our lived experience. It is good for contemporary readers and theatergoers to see old stories that remind us that, however much the world has changed, young people still face doubts and uncertainties, just like Hamlet, and older people face mortality and waning influence, just like King Lear. It’s nice to know, when you wake up in the middle of the night with worry, that people even as far back as the 16th century felt the same way you do. That, rather than any notion of “greatest,” is why we still revere Shakespeare and why a new translation of The Iliad is a bestseller today.
A major problem with the effective altruists, with the people you meet day-to-day who claim to be “data driven,” and with the truly insufferable people who claim that their understanding of behavioral economics shields them from cognitive errors, is that they all view the arts as frivolous because the arts contain kernels of inspiration that they cannot quantify.
Of course, people have been trying to quantify what makes the arts work for a very long time. Aristotle was consumed with the problem. In The Poetics he figured out what all the great plays of the Athenian golden age had in common and turned it into a kind of template for how to write a tragedy. This is what most “how to write a screenplay” or “how to write a novel” books do — they pick the most popular stuff, figure out what it all has in common and call it a formula. And it is a formula! For mediocrity. Because not one of the great plays has all of the elements in the exact proper places. Where the writer tweaks the formula is what makes each play distinct and memorable. Deus ex machina is weak, says Aristotle, and he’s mostly right. But sometimes, having the gods descend from Olympus lets the audience touch the divine. Just ask Tony Kushner.
Formulas help us understand art, but they are not a roadmap to success. Also, as we’ve discussed here a lot, tons of great art goes unrecognized because markets aren’t perfect. This is why the effective altruists will tell you that if you’re trying to maximize income, don’t even bother with this silly stuff.
We have much more data now than we did in Aristotle’s time. But I’m not sure it’s “better” data because all data has one flaw in common — it is backward looking. If you work in the corporate world, you have probably heard the phrase “real-time” used before words like “monitoring” or “data collection.” Well, that’s just silly. All data is backward looking, even if an artificial intelligence collects and analyzes it very quickly. Backward looking data can tell us things about patterns that might persist, but it will never, ever be determinative.
Quickly gathered data can suggest trends that aren’t there, or that are fleeting. How many heavily focus-grouped movies or products flop because they were tuned to the immediate reactions of test audiences that turned out not to represent the wider public? Did you put all your money into fidget spinners in 2017? Because if those trends continue…
The phrase “data driven” is supposed to mean “guided by the facts on the ground,” and it’s supposed to develop playbooks for our professional, financial and even creative lives. As more and more of the thought economy becomes automated, this idea will gain influence. This is meant to protect us from the flawed use of intuition and gut feelings that can ill serve us in a complex world.
But intuition has value, if it’s well-earned. Our gut misleads us when we are in unfamiliar situations or do not realize that circumstances have changed on us, but feelings are very powerful when a practiced musician picks up their instrument and improvises.
We are in a world where some very influential and powerful people believe that through a combination of data science, behavioral economics and artificial intelligence you can be told what to study, what career to pursue, how to work, how your community should be planned, what you should eat, who you should marry, how many children you should have... They truly believe no questions are off limits, and many of these people (I have worked for and with them) do not even believe in concepts like privacy or individual agency.
We artists know the world is more chaotic than that, and that chaos always worms its way around systems of control, but we are at the moment on the cusp of what some important people believe is a technocratic utopia.
“One must still have chaos in oneself to give birth to a dancing star,” said Friedrich Nietzsche, who would have howled mightily at the data fetishists running our society. But you’ll find, as people continue to embrace all that is measurable in life, that sentiments like Nietzsche’s will likely just confuse people.
Stay weird, folks. It’s the only effective form of protest.