Dean Clough

July 26, 2023

Portico Darwin: You are in Section 230


<3 Minute Read

A hump day in the middle of summer?  What better time for a dry policy discussion!  And on the topic of Section 230 to boot!  Try to calm your excitement and pay attention, because the adjustments I propose below (really, stolen from Scott Galloway) to this regulation could largely clean up social media.  Overnight, and I'm not kidding.

OK, OK - I get that some don't even know what Section 230 is, or why it matters.  To get you up to speed, here's a quick blurb from The PBS NewsHour:


If a newspaper or news site falsely calls you a swindler, you can sue the publisher for libel.  But if someone posts that on Facebook, you can’t sue the company — just the person who posted it.

That’s thanks to Section 230 of the 1996 Communications Decency Act, which states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

That legal phrase shields companies that can host trillions of messages from being sued into oblivion by anyone who feels wronged by something someone else has posted — whether their complaint is legitimate or not.

Indeed, it is Section 230 that allowed Facebook, Twitter (now X), Instagram, and all of the rest to become what they are today.  Like when Facebook hosts and elevates nonsense and lies about vaccines.  Or when Twitter platforms Trump and his delusional and destructive fantasies.  And especially, when Instagram encourages young girls to harm themselves by boosting certain imagery.  All of that and more has been facilitated by this nearly 30-year-old law.  Considering where the world was in 1996, Section 230 could be the poster child for the term unintended consequences.

The result is that any social media platform gets to essentially blow off editing and moderating the content their users post.  Unlike, say, the publishers of a newspaper or the producers of a TV news show.  

Except in two major areas.  Here, you'll find that very little slips through the cracks.

The two categories are:

  • Child Pornography
  • Sex Trafficking

In these cases, all of the major platforms - including the shittiest ones - are miraculously able to prevent this disgusting content from getting out.  But how is that different from election misinformation?  Vaccine disinformation?  Putin's bots?  How can they stop the slime that wants to harm children, yet not the slime that wants to confuse the public regarding the efficacy of Covid vaccines?

So today, I ask you to consider an Internet landscape where the following carve-outs from Section 230 are added.  In other words, just like the social media companies CAN be held liable if they publish child porn or enable sex trafficking, they would also be held responsible for:

  • Any content that is boosted due to algorithmic processing - i.e., enrage you so as to engage you.  When a platform puts its fingers on the scales of what it shows in your feed, it has given up the right to claim it's an innocent 3rd party.

  • Likewise, any content published that is sourced from an Artificial Intelligence engine.  You may have heard that all of the major AI companies have agreed to digitally watermark what their engines produce.  Good, because the social media companies need to be held accountable for what AI puts on their sites.

  • Anything about local, state or Federal elections.  This has a precedent:  Facebook suspended all election advertising for 90 days prior to the 2020 race, as a means of tamping down on the insanity they themselves had fomented.

  • And the ol' chestnut, healthcare.  If it involves people's health, these platforms must be held accountable for what information they allow to get oxygen.  There is science, and then there are opinions.  Are scientists infallible?  Of course not - but I'll take Dr. Anthony Fauci over Robert F. Kennedy Jr. every day of the week and twice on Sundays.

That's it.  Will it happen?  There's less than zero chance, and here's the obstacle:  the big tech companies beg for regulation in their messaging to the general public, yet spend hundreds of millions of dollars every year lobbying Congress to make sure that's exactly what doesn't happen.

I believe that's called cynicism.


Did I mention Anderson Valley is an official Portico Darwin Happy Place?  These photos are from Monday, from our sparkling wine picnic at Roederer Estate, and then the Pinot Noir heaven of Goldeneye - Diamond Certified, both.


Jeez, do I love it there, and its vibe has yet to be ruined by the hordes that plague Sonoma and especially, Napa.

Thank you to anyone who is reading this newsletter.


This is just perfect for summer and has somehow not been played to date on KLUF.  Here is Morphine and the killer Like Swimming.  Few bands before or since have had such a unique, and dare I say, cool, sound.

About Dean Clough