    Benjamin Franklin Institute
    Business

    The legal fight that could force Apple to rethink iCloud design

By Team_Benjamin Franklin Institute | February 20, 2026

    West Virginia’s attorney general filed a lawsuit against Apple on Thursday, accusing the iPhone maker of knowingly allowing its software to be used for storing and sharing child sexual abuse material. 

    John B. McCuskey, a Republican, accused Apple of protecting the privacy of sexual predators who use iOS, which can sync images to remote cloud servers through iCloud. McCuskey called the company’s decisions “absolutely inexcusable” and accused Apple of running afoul of West Virginia state law.

    “Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared,” McCuskey said. 

    The West Virginia attorney general said the state would seek “statutory and punitive damages,” changes to Apple’s child abuse imagery detection practices, and other remedies to make the company’s product designs “safer going forward.”

    In the new lawsuit, the state cites a handful of known complaints about Apple’s mostly hands-off approach to its image-hosting service. The biggest concern: Apple finds far fewer instances of online child exploitation than its peer companies do because it isn’t looking for them. 

    In a statement provided to Fast Company, Apple pointed out an iOS feature that “automatically intervenes” when nudity is detected on a child’s device. “All of our industry-leading parental controls and features . . . are designed with the safety, security, and privacy of our users at their core,” an Apple spokesperson said.

    Apple walks the privacy tightrope

    The West Virginia lawsuit isn’t the first of its kind that Apple has faced in recent years, though it is the first coming from a state. In late 2024, a group of thousands of sexual abuse survivors sued the company for more than $1 billion in damages after Apple walked away from a plan to more thoroughly scan the images it hosts for sexual abuse material. In the case, the plaintiffs’ legal team cited 80 instances in which law enforcement discovered child sexual abuse imagery on iCloud and other Apple products. 

    Most tech companies rely on a tool developed by Microsoft more than a decade ago to automatically scan images they host and cross-reference those images against digital signatures in a database of known child-abuse imagery. That tool, known as PhotoDNA, flags those images and acts as the first step in a reporting chain that leads to law enforcement. 
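The hash-and-match flow described above can be sketched in a few lines. PhotoDNA's actual algorithm is a proprietary perceptual hash that tolerates resizing and re-encoding; as an assumption for illustration only, this sketch substitutes an exact SHA-256 digest and a placeholder hash database:

```python
import hashlib

# Hypothetical database of known-image digests. A real deployment would
# query a vetted database of PhotoDNA signatures, not plain SHA-256 hashes.
KNOWN_HASHES = {
    # SHA-256 digest of the placeholder bytes b"test"
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length digest for database lookup."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    """First step of the reporting chain: flag an upload whose digest
    matches a known entry, before any human review or escalation."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The key design point, and the reason a perceptual hash is used in practice, is that an exact digest like the one above changes completely if a single pixel does, whereas PhotoDNA-style signatures match near-duplicates of the same image.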

    In the U.S., internet platforms are required by law to report any instances of suspected child sexual abuse material (CSAM) to the National Center for Missing & Exploited Children, the organization that spearheads child abuse prevention in the country. The NCMEC collects tips from online platforms through a centralized CSAM-reporting system, known as the CyberTipline, and forwards those concerns, many collected via PhotoDNA, to relevant authorities.

    In 2023, the NCMEC received only 267 reports of suspected CSAM from Apple. During the same time frame, the organization received 1.47 million reports from Google, 58,957 reports from Imgur, and 11.4 million reports from Meta-owned Instagram.

    Apple appears to know the extent of the problem. “We are the greatest platform for distributing child porn,” Apple executive Eric Friedman said in an infamous 2020 text message that surfaced in discovery during the lengthy court battle between Apple and Fortnite maker Epic Games. Friedman made the statement in a conversation about whether the company’s policies are weighted too heavily toward user privacy rather than safety. 

Apple is known for robust privacy practices that make its products famously difficult for hackers to breach. Over the years, those same encryption systems have frustrated law enforcement agencies, such as the FBI, that have sought data locked away on iPhones in the course of their investigations.

    “At Apple, protecting the safety and privacy of our users, especially children, is central to what we do,” an Apple spokesperson said. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”



