Panelists:
Berin Szóka, President, TechFreedom
Daphne Keller, Director, Program on Platform Regulation, Stanford Law School’s Cyber Policy Center
Chris Marchese, Director, Litigation Center, NetChoice

About:
Missouri’s Attorney General has proposed a rule that would require large social media platforms to offer users a choice between the platform’s own content moderation algorithm and third-party alternatives. The proposal reflects a growing trend of state efforts to regulate content moderation, and it raises significant free speech questions. Our panel of experts will explore those concerns, along with the practical challenges of enforcing algorithmic choice mandates in today’s digital landscape.

Related Materials:

Burning the House Down to Roast the Pig: Constitutional Limits of FTC, FCC and DOJ Interference in Media and Speech (TechFreedom and Competitive Enterprise Institute seminar)

The FTC’s Quixotic Social Media Inquiry (Tech Policy Podcast ep. 411)

The FTC and Online Speech: What’s Next? (Tech Policy Podcast ep. 410)

NetChoice Comment to Federal Trade Commission Request for Information Regarding Technology Platform “Censorship”

Lawful but Awful? Control over Legal Speech by Platforms, Governments, and Internet Users, by Daphne Keller (U. of Chicago Law Review)

Brief amicus curiae of Francis Fukuyama in Moody v. NetChoice, Daphne Keller, Counsel