This again relates to our recent post about an ethical situation we found ourselves in (you don’t need to start with that post, but you can if you wish).
After weeks of pondering, the simplest description of the whole category of ethics for us is, “Can I trust you?”
This came about from reading merely the introduction to the book Skin in the Game* by controversial thinker Nassim Nicholas Taleb. (You can read the intro for free in the online preview, which is worthwhile no matter what.)
We’ve come across so many instances of ethics questions in the real world since considering this incident. (In case you are sick of this discussion and wish we’d talk about something else by now… sorry/not sorry. This is interesting for us, and we feel it’s wholly relevant to an individual’s existence in the modern era.)
The reason why ethics matters is that it helps you make decisions — for yourself, and also decisions on who you want to involve yourself with.
If a company offered you a job with a salary higher than you’ve ever earned, are you going to accept it regardless?
Or are you going to think about what that company does, and consider the value that it brings into the world?
Here’s one piece of seriously disturbing data from a survey of programmers that the site StackExchange conducted:
Perhaps respondents misread the question, or were moving too quickly through the survey to think it through. As phrased, the question essentially says, “You have already identified that it’s unethical. What do you do?” If you’ve DECIDED that it’s unethical, then that right there is your decision. Isn’t it? You should not be involved in building software — or contributing to any task at work — if you’ve decided that it’s unethical. Yet 36.6% sounded like they were waffling, like maybe they’d go ahead and build it.
Yes we agree, there are a gazillion grey areas, but in this case it’s saying that you’ve DECIDED what the ethics are. Why is the answer then, “It depends”?
The main angle we can see in those 36.6% is that they are being honest and acknowledging that perhaps they might do it anyway. It’s quite likely that a good chunk of the 58.5% who said they would not build it actually would. In the real world there are tremendous pressures and influences, and when dealing with a hypothetical, it’s very easy to claim moralistic purity. If not building the software out of your convictions would cost you your job, we can totally see why it would be hard to stand up and speak out.

There’s also the problem of power dynamics. If you’re just one guy (or gal) on the team, and your boss and all your teammates are going along with this, and you understand it to be greenlit and approved from above, then you’re inclined to assume that others have thought through the ethics and have determined that it’s OK to do it. Most people want to assume that they work with others who also are ethical. If you’re the only one perceiving an ethical problem and everyone else is all gung-ho and excited to blast forward with it, it would be very easy to second-guess your own worries and stay silent.
But those are exactly the situations where being brave and speaking up are so important.
Sometimes there is all this momentum in an organization and nobody stops and takes a step back. Everyone assigns the responsibility for moral policing to the overall group, assuming that everyone else has handled that task — yet it’s possible no one has done it. It’s so easy to get blinders on.
It’s like the H&M hoodie ad that came out in January with a picture of an African American boy in a sweatshirt that was just — no. That was so obviously insensitive that some people even assumed it must have been intentional as a way to generate attention for the brand. Because who could have seen that and not noticed how bad it was?
If you’re sitting in a meeting with thoughts of, “Am I crazy? Is nobody else noticing this THING that is wrong here?” then you NEED TO SPEAK OUT.
And you need to put some thought to big questions like this in advance. So that you are prepared and ready to take action if (when) the time comes for such action.
If the first time you’ve ever thought about these difficult situations is the moment when you’re actually faced with one, then it’s likely you will fail to act, out of self-doubt, or nervousness, or not wanting to be wrong, or lacking the confidence to stand up for your ideas. These things matter.
There was a thing on RadioLab about new types of media editing technology that are a little bit freaky in terms of their potential to do harm — but what was even freakier was the completely laissez-faire attitude of one of the developers. Here’s direct access to the RadioLab stream if you are interested in hearing the whole piece:
Here’s a demo of an Adobe voice editing app that definitely raises concerns:
It’s basically Photoshop, but for audio.
But the chilling part in the RadioLab segment was where they talked about this technology:
And they interviewed one of the developers in this field, a CS professor at the University of Washington named Ira Kemelmacher-Shlizerman (starts around 11:15). The interviewer was basically asking, “Aren’t you freaked out about what this technology can do?” and the technologist answered with a virtual shrug of her shoulders. She said her job is to be a technologist. That’s it. She was fully unbothered, or she just had never considered the questions or her part in it before. (This specific exchange is around 20:05.)
Oh boy.
Brave Supplicant, please don’t be like that. In any context of your life, anywhere. That just hurts to hear.
Though honestly, her position of “I’m just a technologist” is probably not unethical.
These are the things that the real intellectuals are debating. It’s where Elon Musk and Mark Zuckerberg clash: the ethics of AI, and how much trouble we’re gonna be in if constraints aren’t put into place. (But hey Elon, even if they are, there will be people like this technologist who don’t see it as part of their job to worry about such trivial things as, like, how the tech they’re building can do harm.)
It all just makes the head hurt.
And in the end, we come full circle.
Are all ethical decisions completely personal? Is ethics defined by the individual only?
The technologist with the capacity-to-do-harm software is clearly not bothered. Her ethics say it’s OK.
Granted, there are BIG cultural implications in all of this; what is considered “unethical” varies TREMENDOUSLY based on what part of the world you’re from.
We’re not accusing the technologist of being unethical. We’re just raising the questions.
Any further thoughts coming from all you BSers on these topics? (If not, don’t worry – we’ll be moving along to other things very soon!!!)
*In the first draft of this post, we wrote the title as Sin in the Game, which is sort of funny as a Freudian slip. Also: That link has a referral code; if you click it, we get like a penny or something if you go on to buy. If you don’t want to make Mr Bezos pay us a penny that way, please just go direct to Amazon and search for the title yourself. You know how it works.
buffalo says
“If you’re the only one perceiving an ethical problem and everyone else is all gung-ho and excited to blast forward with it, it would be very easy to second-guess your own worries and stay silent.
But those are exactly the situations where being brave and speaking up are so important.”
I recently read “All the Light We Cannot See” (great book btw), and this reminds me of Frederick, Werner, and Madame Manec. Without ruining the plot, Frederick speaks up and disobeys orders at the Nazi academy. Frederick engages in explicit defiance. He is not treated kindly (to put it nicely). Werner, his friend, is impressed with Frederick and wants to speak up too but doesn’t. Werner attempts to help Frederick, but does not actively try to subvert the academy. Separately, Madame Manec engages in covert defiance which proves to be rather successful.
I find the contrast in techniques to be interesting. Explicit defiance conflicts with people’s ingrained tendency towards self-preservation, because it makes you an immediate target. Depending on the situation, this can be good (especially if you are trying to inspire), or it can hinder your overall efforts. Covert defiance is less likely to inspire, but may be more effective. I think that people are predisposed to favor either explicit or covert defiance, but I believe that both are necessary to bring about change. Just don’t be Werner.
essaysnark says
You’re not the first to recommend that book, but you’re the first to describe it in some detail — sounds excellent! These are now contemporary issues for all of us, whether we’re eager to be grappling with them or not (most people aren’t, but some people will do the grappling, because they realize that they must). You also raise questions that go deep, about the danger for an individual in going against social norms, and the fear of being ostracized. These are core in all of us, as they deal with survival. We are social creatures, and from an evolutionary perspective, it has been very, very dangerous for an individual to be separated from the group. That’s why the heart races and palms sweat when we speak up to authority or express an unpopular opinion. That covert defiance thing is fascinating — now we’ll have to read this book! Thank you for the recommendation, buffalo!!
buffalo says
Yes, agreed with the socialization aspect. Do read the book! It’s beautiful in its descriptions and written in an engaging way (short chapters that alternate between the two protagonists).
essaysnark says
Maybe we should do another “what are you reading?” post around here — especially as this is pretty much the only time in the admissions calendar that allows for leisure reading! We’re often impressed with the recommendations you BSers offer!