Section 230: The Legal System's Shield for Algorithms

I am a law student just beginning to study the effect of AI on society. Tristan Harris and Yuval Noah Harari brought me here.

Section 230 is widely attacked from across the political spectrum. Some say it allows too much content moderation (infringing free speech), while others say the opposite: that it allows too little content moderation (letting companies escape their social responsibilities).

But no one I have found is focused on the common problem caused by Section 230: courts have interpreted it to mean that content algorithms are neutral “passive conduits,” rather than the society-defining architecture they actually are. Because of this statute, courts dismiss lawsuits that seek to hold internet companies responsible for user-generated content; as a result, their algorithms are kept in the dark. We don’t know how much moderation is being done or what criteria go into content algorithms. This seems to me the key legal obstacle to making real progress here.

I’m wondering what can be done about this, and I like the comparison to the impact litigation of the past: to make progress on civil rights, Thurgood Marshall designed a strategy of fighting particular legal battles over the course of several years, in hopes of pushing the courts toward more equality-minded law. I think CHT would be a perfect organization to do something like this; they seem to be the ACLU of the 21st century.

Curious what everyone’s thoughts are on this. I’ve poked around the site and can’t find anyone focused on this issue, and there are very few lawyers/scholars looking at the power of algorithms to design society.



Hi Gideon!

Glad to see someone else making noise about Section 230.

I’m not a lawyer or scholar, but it’s my favorite sub-topic at CHT, as you can see from my post history.

Section 230 created the monsters we now know as Google, Twitter, and Facebook, that have become the censorship division of the US bureaucratic regime. It can’t be defeated in court because the precedent was already set upholding it. It needs to be repealed. But it won’t be because of the EFF’s (aka the Big Tech Lobby) influence on Congress. You would need some kind of counter-lobby (the problem there is money) or grassroots movement (the problem there is education).


One more thought, if you really want to get the word out on this issue, have you thought about starting a podcast?

Thank you for the response! I will have to look through your posts, I’m only just getting familiar with this forum.

You are exactly right about Section 230. I have indeed been thinking about starting a podcast. Lawyers don’t understand tech (though many think they do, or think they don’t need to), and tech experts don’t have the necessary experience in law to make progress in this area.

Education is the central problem, and a podcast could help bridge the gap in knowledge. A conversation with experts from each domain, or a conversation between two lawyers but moderated by someone with an understanding of the tech, could be really valuable.

I’m picturing people like Tristan Harris, Jonathan Haidt, Sam Harris, Stuart Russell, Eric Weinstein or Josh Wolfe, Shane Parrish, etc. A conversation between experts from tech, law, policy, ethics, marketing, venture capital, and so on.

I’d love any feedback or ideas on how to carry this forward. I have an idea for the name: Just Systems, or Justsys for short.

Hey @Gideon: There is a good deal of discussion around Section 230 and how it may be revised. I run an organization called All Tech Is Human that unites technologists, advocates, policymakers, academics, ethicists, artists, designers, and more to tackle thorny tech/society issues like Section 230. Lately we have been doing livestream conversations every two weeks, and we will have an event on Thurs, June 25th on Section 230 with guests Kate Klonick (law prof, expert on Section 230) and Yael Eisenstat (Cornell Tech, former FB Elections Integrity).

My background is as an attorney, so I certainly focus on Section 230 as well. As it happens, the podcast I co-host (Funny as Tech) JUST released an episode on Section 230, where I get into the difficult position platforms find themselves in right now: the more they moderate and exercise a degree of editorial control, the less likely they are to appear to be a neutral conduit. All of the platforms are also highly aware of the shifting landscape. I serve as a member of TikTok’s Content Moderation Council.

Here is a Tech & Policy discussion:

“A conversation with experts from each domain, or a conversation between two lawyers but moderated by someone with an understanding of the tech, could be really valuable.” Yep, we are doing this. Aspen Institute has also been doing a lot of conversations like this, and recently featured a few of the members of FB’s new Oversight Board.

Also, there are a bunch of other orgs focused around this issue. One that comes to mind is TheBridge, which brings together technologists, policymakers, and politicians.

Happy to connect,


Hi @DavidRyanPolgar, nice to meet you! :wave: I’ll add your podcast to my rotation.

I checked out that All Tech Is Human video. Interesting, but not very inspiring. They call for more regulation, more censorship, and a larger government role in tech. I’d like to hear their dedicated discussion on Section 230, but I couldn’t find it.

If I had to bet I’d say their take is something like, “It’s true that CDA 230 doesn’t make sense anymore, but these companies are too big to fail (i.e. too useful to us to want to get rid of), so instead we’ll use CDA 230 as a bargaining chip to load up on regulations and new agencies,” which is exactly where Trump ended up after all the grandstanding he did.

I just finished listening to your podcast, and I really liked the part about the different share classes of Facebook and Google. TIL