Section 230: The Legal System's Shield for Algorithms

I am a law student just beginning to study the effect of AI on society. Tristan Harris and Yuval Noah Harari brought me here.

Section 230 is widely attacked from across the political spectrum. Some say it allows too much content moderation (infringing free speech); others say the opposite, that it allows too little (letting companies escape their social responsibilities).

But no one I have found is focused on the common problem caused by Section 230: it has been interpreted by courts to mean that content algorithms are neutral "passive conduits," rather than the society-defining architecture they actually are. Because of this statute, courts dismiss lawsuits that seek to hold internet companies responsible for user-generated content; as a result, their algorithms are kept in the dark. We don't know how much moderation is being done or what criteria go into content algorithms. This seems to me the key legal obstacle to making real progress here.

I'm wondering what can be done about this, and I like the comparison to the impact litigation of the past: to make progress in civil rights law, Thurgood Marshall designed a strategy of fighting particular legal battles over the course of several years, in hopes of pushing the courts toward more equality-minded law. I think CHT would be a perfect organization to do something like this; they seem to be the ACLU of the 21st century.

Curious what everyone's thoughts are on this. I've poked around the site and can't find anyone focused on this issue, and there are very few lawyers or scholars looking at the power of algorithms to design society.



Hi Gideon!

Glad to see someone else making noise about Section 230.

I'm not a lawyer or a scholar, but it's my favorite sub-topic at CHT, as you can see from my post history.

Section 230 created the monsters we now know as Google, Twitter, and Facebook, which have become the censorship division of the US bureaucratic regime. It can't be defeated in court because the precedent upholding it has already been set. It needs to be repealed. But it won't be, because of the EFF's (aka the Big Tech lobby's) influence on Congress. You would need some kind of counter-lobby (the problem there is money) or a grassroots movement (the problem there is education).


One more thought, if you really want to get the word out on this issue, have you thought about starting a podcast?