Casey Newton, writing at his excellent Platformer newsletter:

One reason that the distinction between content and design features looks illusory to some is that they exist on a spectrum. If you changed the Instagram recommendation algorithm to promote only educational material, you are changing the content and the design of the app at the same time. 

On the other hand, I reject the argument of the 230 diehards that every design decision is a content decision. The contents of YouTube go unchanged if the videos in your feed do not play by default. Snapchat messages will read the same even if Snap is prevented from incentivizing teens to go to extreme lengths to preserve their streaks. TikTok will still be TikTok if it is prevented from sending teenagers push notifications after midnight.

By identifying and restricting mechanical design features like these, we can preserve the core of Section 230 while also limiting at least some of the harm that platforms can cause. I’m under no illusion that we can solve the teen mental health crisis simply by disabling push notifications on Instagram. But given the complexity of the problem, the least we can do is to attempt some harm reduction: and that should begin with features like those above, which have close to no value outside boosting the valuations of the world’s richest companies.

As a strong supporter and proponent of Section 230, I, too, have been concerned about what Meta and YouTube’s losses in court last month may suggest. But Newton is right here in stating that there is an important distinction between the way a platform is designed and the speech on that platform. Section 230 of the Communications Decency Act says that platforms cannot be treated as the publisher or speaker of the content their users post. It does not prevent lawsuits about the design of those platforms.

Candidly, we could see a case in the coming years where someone sues a platform for amplifying dangerous content. For instance, if Meta systematically boosted child sexual abuse material (to be clear, I am not saying it does this; this is a hypothetical), that would very much constitute a crime. Meta could not be held liable for users posting that material in the first place, but it could be held accountable for failing to take precautionary measures to stop its systems from amplifying it. One is a content problem, and the other is a design problem.

My only concern with this approach is that platforms will respond by proactively removing more content. That isn’t the right takeaway. Platforms will say they have no choice, and the plaintiffs in these cases will say the content itself is the problem, but both framings are overly simplistic. Instagram, according to the jury, is designed to put child predators in contact with children, a design that boosts Meta’s advertising revenue. YouTube is designed to be addictive. The platforms were not held accountable for the mere presence of child predators on Instagram or addictive content on YouTube, so ratcheting up content moderation to the point of absurdity is hardly a solution. Rather, the platforms should redesign their software to thwart predators and curb addiction.

I also agree with Newton’s point about “harm reduction,” another nuance in this argument. Meta can’t eliminate every predator from Instagram overnight, or even over the course of a few years, just as no number of cops will eliminate crime in the world. That’s obviously impossible. The strongest defense against abusers on Instagram is parental intervention and education in classrooms across the world. But just because this is a societal problem doesn’t mean the platforms have no role in preventing addiction and harm on their services. Harm reduction should be the modus operandi, not harm elimination, which is a fleeting, impossible goal.

Ultimately, I don’t think there’s a clash between these decisions and Section 230. The law governs the speech people post, and the decisions govern the products those people post it on. They’re two sides of the same coin.