For aspiring authors feeling the intimidation that often accompanies beginning a new project, novelist E.L. Doctorow offered the following advice: “Writing is like driving a car at night. You can only see as far as the headlights, but you make the whole trip that way.”
This little tidbit came to mind while listening to a recent episode of “Plain English,” the reliably excellent podcast by Derek Thompson, a staff writer at The Atlantic. In a conversation with Matthew Yglesias about the recent FTX scandal, the two discussed the goals of Effective Altruism, henceforth in this piece “EA,” the alleged philosophical guiding light of Sam Bankman-Fried (henceforth SBF, against my will and better judgment), CEO of the now-bankrupt FTX.
Both Thompson and Yglesias are up front about their bias towards EA, though Thompson seems slightly more skeptical about whether its stated goals are actually achievable.
For those (like myself) who lived blithely unaware of EA until five weeks ago, the philosophy seeks to make philanthropic giving do the most good for the most people possible. If you’ve got money you’re looking to bestow upon charitable outlets, EA asks that you put on the glasses of cold, clear logic and donate to the “most important” causes, the ones that will have the biggest impact on total human wellbeing.
Thompson points out that EA has notably changed in the past few years, shifting from a focus on the urgent human needs of today towards the possible human catastrophes of a very distant tomorrow.
Our intellectual and financial betters, it turns out, have been hard at work in the predictions business, hashing out “serious” arguments about whether human apocalypse is more likely to result from a pandemic or a rogue comet strike, so that we might best allocate billionaires’ resources toward preventing the potential suffering of our not-yet-born Homo sapiens kin, sometimes projecting out on a millennia-scale timeline. (If this sounds like the pretentious late-night jawing of undergrads in a smoke-filled dorm room, well… perhaps that should tell us something.)