I have been reading a bit of fellow Yale alum Aaron Gertler’s blog here. Recently, it has featured a very persuasive argument for the idea that while empathy is nice, we also need to take morality beyond that and start helping people “even when we have no personal connection or emotional investment in the outcome.” A lot of what he mentions is combating the bystander effect, meaning intervening when kids are bullied, shot, etc., instead of just standing there with our teeth in our mouths, as my grandpa would say, ’cause we don’t feel like helping. This I appreciated, because being brave is a cool thing that probably doesn’t get enough mention in morality-world.
The problem is that Aaron’s article reminded me of my archenemy, which is rationality. And also helping people. And worst of all, rationality AND helping people. This is my archenemy because people like Peter Singer are always telling me that if I were rationally helping people, I would be giving $10 to the most effective charity to save the lives of people I don’t know, because rationally this would produce EVEN MORE GOOD than spending the same amount of money on going to the Indian buffet by myself, which is, of course, what I did. Except actually, when Singer tells me this, I don’t feel bad at all. I don’t know why. Maybe I have a character flaw.
Aaron reminds me of this by quoting from Against Empathy, by the infamous Yale psychology professor Paul Bloom. I actually took “Against Empathy: The Class” (aka PSYC 423/CGSC 423/PSYC 623: Cognitive Science of Good and Evil) last fall as an undergrad, which solidified my hatred of rational helping. I think Professor Bloom (who is a great professor, btw) would agree that the agenda for the class consisted mainly of him trying to persuade us to be against empathy, which inevitably pushed me to become quite a fan of the “emotion” (empathy probably isn’t an emotion; it’s an ill-defined construct). His preferred alternative to having empathy was to be a nice psychopath, or, in other words, basically a utilitarian.
Which is fine, I guess, if that’s, like, your thing. But utilitarianism is honestly kind of boring. You just pick a value, and then you compute things. Traditionally the value would be happiness, which seems like a terrible idea to me, but that’s kind of a separate issue, so let’s just say the value is happiness. The real problem is that I’m kind of not down with any kind of morality where it would be preferable for me to turn into a computer. I’ma just bring back this quote from my last post (ironically, by J.S. Mill, who was a huge utilitarian):
Supposing it were possible to get houses built, corn grown, battles fought, causes tried, and even churches erected and prayers said by machinery–by automatons in human form–it would be a considerable loss to exchange for these automatons even the men and women who at present inhabit the more civilized parts of the world, who are assuredly but starved specimens of what nature can and will produce.
It seems like utilitarianism kind of runs into the automaton issue. Mixing other thoughts and feelings with our empathy, as in Aaron’s proposal, seems good. But eliminating empathy and other human emotions in the name of morality seems like a problem. I do not consider myself an especially emotional person. Someone once told me I don’t have emotions, which was rude. But it WAS rude because it seemed like they were denying my humanity. I wouldn’t take that as a compliment, even if, according to Bloom et al., maybe I should, since most emotions are supposedly morally detrimental: they make us play favorites, punch people, spend money on buffets, etc., etc. Morality made out of “pure reason” is fine, I GUESS, but since I’M not made out of pure reason, I can’t help but feel like being turned into an emotionless automaton in human form wouldn’t be the best thing for me. I have a bias toward being the person that I am, ’cause I think individuality is cool and I don’t want to be an indistinguishable rational robot-person. So maybe that’s not THE BEST argument for not helping starving children, but it’s where I’m at right now.