    The Internet Still Works: SmugMug Powers Online Photography

By admin | April 27, 2026

[Illustration: One person holds a megaphone for another, with rainbow stripes]

    SmugMug is a family-owned photo hosting and e-commerce platform that helps professional photographers run their businesses online. Founded in 2002, the company provides tools for photographers to show their work, deliver client galleries, sell prints, and manage payments. 

    In 2018, SmugMug purchased Flickr, the long-running photo-sharing community, which added tens of millions of active hobbyist photographers to the company’s user base. 

    Ben MacAskill is President and COO of SmugMug’s parent company, Awesome, which he co-founded with his family. Awesome also includes the media network This Week in Photo and the nonprofit Flickr Foundation, which focuses on preserving publicly available photography. MacAskill has been an active voice in policy discussions around Section 230 and online platform regulation. He was interviewed by Joe Mullin, a policy analyst on EFF’s Activism Team.

Joe Mullin: How would you explain Section 230 to a SmugMug photographer who hasn’t heard of it but relies on you to share their work and run their business?

Ben MacAskill: Section 230 allows us to run our business. We are a small, family-run business. We don’t have the resources to police every single upload, every single comment, or every single engagement that happens on the site.

    That includes photographers who have comments on their sites. Anywhere there’s interaction online, Section 230 protects us. 

It doesn’t absolve us of liability. We can’t run rampant and do anything we want. It just helps protect us and makes moderation scalable so that we can run our business.

    What would you have to change if Section 230 were eliminated or significantly narrowed? 

    Honestly, there’s a high chance that it would bankrupt platforms like ours. They’re not wildly profitable. If Section 230 is done away with, we have to [check] content that goes online to make sure we’re not liable. That means policing tens of millions of uploads per day. 

    That would kill the business of a lot of photographers. Can you imagine—you just got married, and you’re waiting for your wedding photos for a week or two because they’re in some moderation queue? 

    If we don’t have legal protections, and we get one nefarious customer—if something goes sideways—then I’m liable for that. 

I don’t, and can’t possibly, know whether every single photo is appropriate or legal as it’s uploaded. We would literally have to moderate everything before it goes online. I don’t think any business can afford that, period. I guess you could have an offshore call-center type thing. Still, it would change the entire nature of the real-time internet. Imagine posting something to Instagram and having the platform say, “Cool, we’ll get back to you in 8 to 12 days.”

    What kind of content moderation do you do on SmugMug? 

If a user uploads something illegal, we will report them as soon as we find it. We’re not protecting them. We don’t condone or allow illegal behavior. We work very closely with organizations, nonprofits, and governmental agencies to detect CSAM—child sexual abuse material—and we report it to the National Center for Missing and Exploited Children. We report users and eliminate illegal content on our platforms—which is one reason we have such a low prevalence of that problem.

    But that does take effort and time to find, and there is currently no perfect solution. The tech solutions that exist can’t detect it at 100% accuracy, or anywhere close. And with tens of millions of uploads a day, going through them one by one is impossible. 

    How do you think more generally about protecting user speech and creative expression? 

    On SmugMug, we’re really focusing on professionals running their business. So we don’t have to [weigh in] on content too much. 

On Flickr, we are big proponents of expression and artistic creativity. Photographers have opinions! But we do draw the line at things like hate speech and harassment. We aggressively maintain a friendly platform. Our community guidelines are very specific: you cannot harass other customers, and you cannot upload anything classified as hate speech, threats, or anything along those lines.

    Those rules are generally policed by the community. We do have some text analysis tools, but when community members feel harassed or threatened, reports will come in. We’ll address them on a one-by-one basis and remove harassing material from our platform. 

    Our ability to moderate is one of the things that makes Flickr what it is. If we lose the ability to enforce our own moderation rules—or have that legislated for us—then it changes the entire nature of the community. And not in a good way. Losing the ability to moderate would permanently and forever change what we’ve built.

What kind of complaints or takedown requests do you receive, and how do you handle them, both in the U.S. and abroad?

    Flickr is often referred to as the friendliest community online. You know, we’re not dealing with a lot of hate. We’re not dealing with a lot of threats. Under other frameworks, like the DMCA, we do takedowns on copyrighted material. 

    We’re able to handle it with a fully internal team, and we have a great track record. But the user base and the content base is so large that, if we had to assume that those tens of millions of uploads a day are problematic, the burden would be extreme. 

    We have a robust Trust and Safety Team, and we operate in every non-embargoed country on Earth. So we are subject to a lot of different laws and regulations: “likeness” rules and privacy rules in certain countries that don’t exist here in the United States. Even state to state, there’s some varying laws. It’s a complicated framework, but we pay attention to it. 

Around the globe, the response works in much the same way that Section 230 does. That is, we operate on reports and discovery, not on pre-screening everything.

What do you think policymakers most often misunderstand about how platforms like yours operate?

    One misconception is that we are not beholden to any laws. That Section 230 absolves us of any responsibility and any liability, and we can just do whatever we want. They talk about it as “reining in tech companies,” or “holding tech companies accountable.” But I am accountable for the content on my platform. We’re not given this “get out of jail free” card. 

    And I think they assume all platforms don’t really care about this, that anything that is done is done begrudgingly. But we’re very proactive about keeping a clean, polite, and friendly community. We are already very aggressively policing our platform. 

    And even legal content gets moderated, because it might just not be appropriate for a particular community. 

We enforce our rules much the way other private, in-person businesses enforce theirs. If you start screaming hateful things at patrons in a coffee shop, they’re going to throw you out. They want a quiet, chill vibe where people can sip their lattes. We’re doing the same sort of thing.

As an independent, family-owned company, you’re in an ecosystem dominated by much larger platforms. How are these issues different for you as a smaller service?

    I think it’s a much more existential threat for middle and small tech companies. It also shuts off the next generation of these platforms. The computer science student in a dorm room right now won’t have the legal protections to launch, to even try to build something new. At least not here in the United States. 
