Epistemic Status: Ranting with the fire of a thousand suns
I was on page 48 of the (so far) otherwise interesting and enjoyable Algorithms to Live By, a birthday gift from my friend Jacob who writes the blog Put a Num on It. The authors, Brian Christian and Tom Griffiths, were discussing the Explore/Exploit dynamic and the virtues of A/B testing when I came upon the following passage, which caused a strong instinct in me to say ‘until you have properly ranted about this you are not allowed to continue reading this book’:
In fact, these distinctions turn out to matter immensely, and it’s not just presidential elections and the internet economy that are at stake.
It’s also human lives.
Human lives that might come to a proximate end are not the trump card. They are not the one and only metric that determines worthiness. The world is not divided into non-overlapping magisteria, Things That Are Human Lives and Things That Might Affect Humans But Are Not Directly And Explicitly At-Risk Human Lives, with everything in the first magisterium more important than everything in the second.
You also can’t solve this problem by shifting some additional group of things from the second magisterium into the first.
You cannot say: Yes, I understand that when we talked about raising $57 million for a presidential campaign, we were talking about only politics or only money. When we were talking about the entire internet economy, we were talking about only a bit of technology or only money. All of that pales in comparison to this one marginal improvement I will show you in one tiny corner of health care, because that might save a life, and therefore I win.
If I wanted to refute this particular example, I could point to the fact that the presidential campaign in question resulted, for good or ill, in the passage of a bill that gave health insurance to tens of millions of Americans. Human lives were at stake!
If I wanted to defend the internet economy, I could point out that it enables people to get better medical information, find doctors and insurance plans, find alternate solutions, get better treatment, and also hold up our entire economy thereby allowing us to keep paying for all that health care we keep buying. Human lives were at stake!
I don’t want to do either of those things, because I shouldn’t have to.
On one level, I shouldn’t have to because those truths should be obvious. But on a more important level, I shouldn’t have to because these truths should be unnecessary.
It is good and right that we disagree on the question “Conan, what is best in life?” and most of us strongly disagree with Conan’s answer, but I would hope we can all agree that “not dying” is, at best, a terribly incomplete answer.
We (almost) all know that, of course. Our aliefs (almost) all agree. Almost every day we get up and do things other than minimize people’s chances of death that day or maximize people’s expected future lifespan. And all of this is coming from a guy who thinks of ‘living longer than the expected lifespan of the Sun’ as not only a good idea but a life goal worth working towards.
It is not enough to know this. It is not even enough to act like it when you make decisions for yourself. You have to act like it to others. You must act like it when you make decisions for everyone. You need to speak like it, write like it, debate and judge arguments like it. You must assign status like it. You need to actually vote for the torture over the dust specks if and only if that is the answer you get when you shut up and multiply. No excuses.
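The “shut up and multiply” move can be made concrete with a toy calculation. To be clear, the numbers below are invented purely for illustration (the original dust-specks thought experiment uses an incomprehensibly large number, 3^^^3, of specks); this is a sketch of the reasoning pattern, not anything from the book:

```python
# Toy "shut up and multiply" comparison: total harm is harm per person
# times the number of people affected, and we compare the products
# instead of flinching at the categories involved.

def total_harm(harm_per_person, num_people):
    return harm_per_person * num_people

# Hypothetical magnitudes, chosen only for illustration:
speck_total = total_harm(1e-9, 1e20)   # trivial harm, vast number of people
torture_total = total_harm(1e6, 1)     # enormous harm, one person

# With these made-up numbers the specks dominate; with a small enough
# population of speck-sufferers the torture would dominate instead.
print(speck_total > torture_total)
```

The point is not the particular numbers, which are arbitrary here, but that the answer is allowed to come out either way once you actually do the multiplication.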
If you hear someone say something like the quoted passage above, let alone say the exact words “but human lives are at stake!” then act the same way you would if someone had just said “think of the children.”
Let us not belittle, or shame, or price into bare subsistence with insurance costs and regulations and moral obligations, the butcher, the baker or the candlestick maker.
Now that we have this as a reference I and others can link back to, I can now resume reading the book. Good talk.