    The Internet Still Works: Wikipedia Defends Its Editors

By admin · February 10, 2026
[Image: cartoon of a figure holding a megaphone for another; photo of Jacob Rogers]

    Section 230 helps make it possible for online communities to host user speech: from restaurant reviews, to fan fiction, to collaborative encyclopedias. But recent debates about the law often overlook how it works in practice. To mark its 30th anniversary, EFF is interviewing leaders of online platforms about how they handle complaints, moderate content, and protect their users’ ability to speak and share information. 

A decade ago, the Wikimedia Foundation, the nonprofit that operates Wikipedia, received 304 requests to alter or remove content over a two-year period, not including copyright complaints. In 2024 alone, it received 664 such takedown requests. Only four were granted. As complaints over user speech have grown, Wikimedia has expanded its legal team to defend the volunteer editors who write and maintain the encyclopedia.

    Jacob Rogers is Associate General Counsel at the Wikimedia Foundation. He leads the team that deals with legal complaints against Wikimedia content and its editors. Rogers also works to preserve the legal protections, including Section 230, that make a community-governed encyclopedia possible. He was interviewed by Joe Mullin, a policy analyst on EFF’s Activism Team. 

Joe Mullin: What kind of content do you think would be most in danger if Section 230 were weakened?

Jacob Rogers: When you’re writing about a living person, if you get it wrong and it hurts their reputation, they will have a legal claim. So that is always a concentrated area of risk. It’s good to be careful, but I think if there were a looser liability regime, people could become too careful—so careful they couldn’t write important public information.

Current events and political history would also be in danger. Writing about images of Muhammad has been a flashpoint in different countries, because depictions are religiously sensitive and controversial in some contexts. There are different approaches to this in different languages. You might not think that writing about the history of art in your country 500 years ago would get you into trouble—but it could, if you’re in a particular country, and it’s a flashpoint.

Writing about history and culture matters to people. And it can matter to governments, to religions, to movements, in a way that can cause people problems. That’s part of why protecting editors’ pseudonymity and their ability to work on these topics is so important.

    If you had to describe to a Wikipedia user what Section 230 does, how would you explain it to them? 

    If there was nothing—no legal protection at all—I think we would not be able to run the website. There would be too many legal claims, and the potential damages of those claims could bankrupt the company. 

    Section 230 protects the Wikimedia Foundation, and it allows us to defer to community editorial processes. We can let the user community make those editorial decisions, and figure things out as a group—like how to write biographies of living persons, and what sources are reliable. Wikipedia wouldn’t work if it had centralized decision making. 

    What does a typical complaint look like, and how does the complaint process look? 

In some cases, someone is accused of a serious crime and there’s a debate about the sources. Others involve people accused of certain types of wrongdoing, or scams. There are debates about people’s politics, where someone is accused of being “far-right” or “far-left.”

The first step is community dispute resolution. At the top of every Wikipedia article there’s a button that translates to “talk.” If you click it, that gives you space to discuss how to write the article. When editors get into a fight about what to write, they should stop and discuss it with each other first.

    If page editors can’t resolve a dispute, third-party editors can come in, or ask for a broader discussion. If that doesn’t work, or there’s harassment, we have Wikipedia volunteer administrators, elected by their communities, who can intervene. They can ban people temporarily, to cool off. When necessary, they can ban users permanently. In serious cases, arbitration committees make final decisions. 

And these community dispute processes we’ve discussed are run by volunteers, with no Wikimedia Foundation employees involved? Where does Section 230 come into play?

That’s right. Section 230 helps us, because it lets disputes go through that community process. Sometimes someone’s edits get reversed, and they write an angry letter to the legal department. If we were liable for that, we would have the risk of expensive litigation every time someone got mad. Even if their claim is baseless, it’s hard to make a single filing in a U.S. court for less than $20,000. There’s a real “death by a thousand cuts” problem if enough people file lawsuits.

    Section 230 protects us from that, and allows for quick dismissal of invalid claims. 

    When we’re in the United States, then that’s really the end of the matter. There’s no way to bypass the community with a lawsuit. 

    How does dealing with those complaints work in the U.S.? And how is it different abroad? 

    In the U.S., we have Section 230. We’re able to say, go through the community process, and try to be persuasive. We’ll make changes, if you make a good persuasive argument! But the Foundation isn’t going to come in and change it because you made a legal complaint. 

    But in the EU, they don’t have Section 230 protections. Under the Digital Services Act, once someone claims your website hosts something illegal, they can go to court and get an injunction ordering us to take the content down. If we don’t want to follow that order, we have to defend the case in court. 

In one German case, the court essentially said, “Wikipedians didn’t do good enough journalism.” The court said the article’s sources weren’t strong enough. The editors used industry trade publications, and the court said they should have used something like German state media, or top newspapers in the country, not a “niche” publication. We disagreed with that.

    What’s the cost of having to go to court regularly to defend user speech? 

    Because the Foundation is a mission-driven nonprofit, we can take on these defenses in a way that’s not always financially sensible, but is mission sensible. If you were focused on profit, you would grant a takedown. The cost of a takedown is maybe one hour of a staff member’s time. 

    We can selectively take on cases to benefit the free knowledge mission, without bankrupting the company. To do litigation in the EU costs something on the order of $30,000 for one hearing, to a few hundred thousand dollars for a drawn-out case.

    I don’t know what would happen if we had to do that in the United States. There would be a lot of uncertainty. One big unknown is—how many people are waiting in the wings for a better opportunity to use the legal system to force changes on Wikipedia? 

    What does the community editing process get right that courts can get wrong? 

    Sources. Wikipedia editors might cite a blog because they know the quality of its research. They know what’s going into writing that. 

    It can be easy sometimes for a court to look at something like that and say, well, this is just a blog, and it’s not backed by a university or institution, so we’re not going to rely on it. But that’s actually probably a worse result. The editors who are making that consideration are often getting a more accurate picture of reality. 

    Policymakers who want to limit or eliminate Section 230 often say their goal is to get harmful content off the internet, and fast. What do you think gets missed in the conversation about removing harmful content? 

    One is: harmful to whom? Every time people talk about “super fast tech solutions,” I think they leave out academic and educational discussions. Everyone talks about how there’s a terrorism video, and it should come down. But there’s also news and academic commentary about that terrorism video. 

    There are very few shared universal standards of harm around the world. Everyone in the world agrees, roughly speaking, on child protection, and child abuse images. But there’s wild disagreement about almost every other topic. 

If you do take down something to comply with UK law, the takedown is global. And you’ll be taking away the rights of someone in the U.S. or Australia or Canada to see that content.

    This interview was edited for length and clarity. EFF interviewed Wikimedia attorney Michelle Paulson about Section 230 in 2012.
