Software Companies and Ethical Responsibility
I’ve been pretty frustrated with western companies helping to enable censorship and surveillance in non-democratic countries when doing so directly conflicts with the values they otherwise profess. South Park actually touched on this issue directly with “Band in China” last year (though that episode was specifically about a corrupting influence on the entertainment industry). It came up again recently when Zoom blocked a conference organized to commemorate the victims of the Tiananmen Square massacre, shutting down several American and Hong Kong accounts in the process. Their apology was not that this was wrong, but that they were following Chinese law and their only mistake was including non-Chinese accounts in the ban. The coming fix will be to build out their infrastructure so that only those in mainland China (and presumably other places that restrict the speech of their citizens) are subjugated.
Zoom isn’t alone in this - Google had Project Dragonfly, Cisco pitched helping to build the Great Firewall itself, and even Apple makes concessions for access to the Chinese market. Apple blocks podcast apps that don’t let China restrict the available content, removes the Taiwanese flag from Chinese iPhones, and allows China to manage the iCloud infrastructure there separately. This kind of behavior is the default for western companies: short of laws in the company’s home country preventing sales (via things like sanctions), they will implement whatever authoritarian restrictions are required for access to the large market of Chinese citizens. It’s a problem of incentives, but it’s also a problem of character. The people at these companies are responsible for how their software influences the world and are complicit in its abuse. They have a responsibility to think deeply about these issues and to consider more than just the legal standard of the country they’re operating in when deciding whether to engage.
I repeatedly see the same arguments and rationalizations whenever this issue is brought up. After writing out the same arguments on Hacker News that I’ve made elsewhere, I thought it would be better to have a blog post I can reference to explain how I think about this.
I also understand that there are people who won’t agree with me (often a lot of people) about any specific policy. If people are doing what they think is ethical, and they really believe it’s the right thing, then at least that’s consistent. I might not agree - I might even think it’s terrible - but there’s genuine disagreement there, and a genuine discussion can be had about it (assuming discussion is still allowed).
What I’m arguing against is more the behavior of people who know better, or who are doing something in direct conflict with what they believe to be right. They rationalize it by offloading their ethical responsibility onto others - citing the legal framework of the country they’re operating in, saying “it’s not my place to determine policy,” and so on. They tell themselves their mere presence is helpful, or that if it wasn’t them it would be someone else who doesn’t care as much as they do.
They construct something like this in their minds for the purpose of holding the contradictory opinions that enable them to do something they think is wrong.
It’s one thing to argue that censorship of citizens is a good idea - I strongly disagree, but at least that’s a genuine position that can be argued against. Positions that allow someone to support doing something that directly conflicts with their values end up being mostly rationalized nonsense.
Evil triumphs when good people do nothing, but it also triumphs when good people do evil things while telling themselves it’s okay.
I think it’s easier to explore this topic with a provocative fake example that can really get at the underlying arguments without getting lost in the specifics of any one particular modern example or policy. By choosing something hopefully everyone thinks is wrong, it’s easier to see the rationalizations more clearly.
On The Ethics of Burning Children
Sarah is a software developer at a large, western-based international company who has recently become uncomfortable with some of its work. This is a conversation between her and an executive about that work.
Sarah: I’m pretty uncomfortable with our support of the child burnings in the country we just started operating in. I think it’s wrong and unethical, and I don’t think we should be doing it. We have a responsibility for how our software is used, and we’re complicit in the bad behavior of others that use it.
Exec: The child burnings there are entirely legal and we’re complying with all local laws. If we were to make a fuss about this, not only would we be in violation of local law, but we’d be forced to leave the region entirely. You and I may personally find the policy distasteful, but it’s not done arbitrarily - they only burn the children they determine are a risk to the stability of their country. Instability in a region can lead to a lot of conflict, which could lead to a worse outcome for everyone.
Every culture has a different view of what’s right, being uncomfortable with the burnings is just your cultural position on this issue - a symptom of the environment and country you were raised in. Imposing your cultural ideals on others is just another form of imperialism.
Sarah: But don’t the people within that country resist this policy? From what little media gets out, it looks like those protesting the child burnings are imprisoned or even killed. And even if that weren’t the case and everyone in the country was in favor of the burnings, couldn’t it still be wrong? Can’t an entire society be wrong about something (like slavery)? Why must we do something we think is wrong, just because others think it’s right? Couldn’t regional instability be used to justify historical levels of oppression and violence - basically anything short of war?
I understand the historical risk of imposing your views on another culture, but that doesn’t mean all views are equally valid, that all morality is relative. I’m also not suggesting we necessarily force our views on them, just that at a minimum we don’t support what we find to be objectionable.
Exec: We monitor the media coming out of that country to make sure there’s no abuse of the policy, and we haven’t seen much to suggest that they’re abusing our software for this purpose. Our software is also not used to burn the children directly - we just help with the general operation of the Ministry of Burnings.
Sarah: Yeah, but the policy itself is the abuse. Also, isn’t the media that comes out tightly controlled by the state? I don’t take much comfort just because we’re not the ones who explicitly carry out the burnings.
Exec: Our software is also critical for the Ministry’s other, less controversial responsibilities. This includes things like their forest preservation and environmental protection mandates. While I’d like to be able to pick and choose only the workflows I personally am comfortable with, that’s not realistic for successful contract negotiation or enforcement, and it’s again imposing our cultural views onto the customer.
Sarah: I’m willing to concede that, academically, there is some theoretical point at which the utilitarian trade-off could make the child burnings acceptable, but it certainly isn’t local forest preservation. I would also not trust our judgment in making this kind of calculation given the incentives involved and the risk of rationalizing our actions. I’d be much more comfortable with drawing a line at enabling customer workflows that violate human rights.
Exec: It’s not our place to determine policy. As a global company we have to follow the laws of the regions in which we operate. It would be extremely difficult to have to evaluate these nuanced issues ourselves. Our country has its own policies that have disenfranchised groups of people for decades, arguably even violating their human rights, yet we still operate in our home country. Isn’t this just a double standard that you’re applying to a foreign place with foreign laws? We need to operate within the legal framework of these countries and respect different cultures’ rules and regulations. What right do we have - a foreign company - to determine what is or isn’t ‘right’ for an entire population of people?
Sarah: Sure, our country has problems, but at least we have a semblance of rule of law and free speech. Western democratic republics also depend, to some extent, on the people to enact those laws. Even that isn’t enough though - I’d argue we shouldn’t engage in child burning even in countries where it has been enacted by a majority of the people. What’s legal is not the same as what’s right; legality is the bare minimum standard, and in countries without rule of law or without democratic republics it shouldn’t even be that. I’m not arguing that it’s easy. I’m arguing that we have an ethical responsibility to personally consider what we do in addition to whatever law is in place - we can’t entirely offload our ethical responsibility onto others. This includes our work at home, perhaps especially, because we have more power to influence things there.
Exec: Ok, but given that you care so much about this - isn’t it better that you’re the one involved? If it wasn’t us, someone else would take our place - someone who cares less than you do about the child burning. Then isn’t everyone worse off? With us involved, perhaps we’ll be able to influence child-burning policy, or comply in a way that makes the burning more compassionate. Local companies skin the children first, a practice we’re trying to convince the local government is antithetical to its goals - an argument it may find persuasive, since skinning is less efficient than just burning everything. Since the child burning is inevitable, the ethical thing to do is to comply so that we might improve things where we can. In a way, it’s worse to abandon the issue and leave it to those who would do more harm. You may be able to feel comfortable in your moral purity, but this is a nuanced, complex issue, and not engaging doesn’t help to solve it.
Sarah: I’m suspicious when the “ethical” decision also just happens to be what’s easiest for the company - I think you’re rationalizing continued involvement. A lot of companies have told themselves that their presence will be a positive influence on the problematic policies of a country they want to enter, but what ends up happening is that the policy is a corrupting influence on the principles of the company. If there were a direct way to actually influence this and stop the policy - then maybe. But absent that, I think we’d be better off not burning the children and instead working from the outside to help stop child burnings from happening. Not engaging in a policy we think is wrong doesn’t mean we have to do nothing.
Exec: You may be able to say we shouldn’t engage in this market at all and feel good about it, but we can’t afford not to - others will step in and gain a stronger position. Countries might avoid signing with us at all if they think we’ll walk away over our own arbitrary ethical standards.
Sarah: This seems like the cost of doing the right thing. It’s easy to do the right thing when it’s also the easy thing - the only real test is when it’s hard, when it’s something you wouldn’t otherwise do. I’d argue that companies and governments that consider our unwillingness to engage in the child burnings a mark of a bad partner are probably ones we’d want to avoid anyway.
Three Worlds Collide
It was hard for me to write this without thinking about Three Worlds Collide, a thoughtful work of fiction that touches on some of these issues in a more generalized way. I’d recommend it.
Thanks to contributions from interested readers, this post has been translated into other languages.
Russian: Russian Translation