Calling Out Apple’s Failure To Curb Child Porn
Ahead of Apple’s annual meeting, here’s what you need to know about our proposal.
Heads up, this isn’t going to be an easy read.
Several weeks ago, we talked about a proposal we submitted at Apple on a client’s behalf, urging the company to provide greater transparency regarding its choice not to deploy software that would detect child sex abuse material (otherwise known as CSAM). The proposal appears in the company’s proxy statement and will be voted on at Apple’s annual meeting next Tuesday.
In addition to the proposal itself, we also filed something called a ‘Notice of Exempt Solicitation’ with the SEC in support of it. In short, this is a more in-depth explanation of why shareholders should vote in favor of the proposal (the official proposal is limited to 400 words), and a great way for shareholders to speak directly to other shareholders about specific proposals on the ballot.
The following is an (edited) version of what we filed with the SEC, and it contains some disturbing content regarding the effects of Apple’s laxity in preventing online child sex abuse. Reader discretion is advised.
-
Resolution
Bowyer Research urges Apple shareholders to vote YES on Proposal 5, “Report on Costs and Benefits of Child Sex Abuse Material‐Identifying Software & User Privacy.”
Background
No shareholder ever wants to see their company described as the “greatest platform for distributing child porn.” Yet that’s exactly how one of Apple’s top executives described the company’s relationship with online child exploitation.
In messages unearthed during the company’s 2021 legal battle with Epic Games, Eric Friedman, head of Apple’s Fraud Engineering Algorithms and Risk unit, delivered a shocking assessment of how well Apple protects kids online. Friedman acknowledged the dark side of the company’s commitment to user privacy: “we are the greatest platform for distributing child porn, etc.” The same documents referenced an upcoming trust and safety presentation at Apple that listed “child predator grooming reports” as both an “active threat” and an “under-resourced challenge.”
But not everyone needs Apple executives to explain how the company’s approach to child sexual abuse material (CSAM) has holes. “Amy” and “Jessica,” the plaintiffs in a $1.2 billion class action lawsuit against Apple, relive this reality every time photos of the sexual abuse they suffered as children turn up on another Apple device. Law enforcement notifies them whenever this CSAM, video documentation of rapes and molestation that occurred when they were children, is discovered, whether on an iCloud account or on a MacBook in Virginia.
Legal & Reputational Risk
Amy & Jessica go by pseudonyms in the lawsuit to protect the innocent — even as they raise questions about Apple’s ability to do just that. The lawsuit alleges that Apple knowingly allowed child sex abuse material to proliferate under its watch through its choice not to deploy CSAM-scanning software due to privacy concerns. The company’s 2022 decision not to deploy NeuralHash, a technology that would flag suspected CSAM for further analysis and reporting once certain thresholds were met, sparked celebration from online privacy hawks, criticism from child safety advocates, and controversy across the board. And that controversy has only grown.
Apple has appeared on the National Center on Sexual Exploitation’s ‘Dirty Dozen’ list for the second year in a row, due both to its CSAM-scanning policies and to a plethora of other points of concern, including evidence of Apple’s App Store promoting AI-powered “nudify” apps rated as appropriate for ages four and up. While this issue with the App Store differs from the choice to kibosh CSAM-scanning protocols, it, too, creates controversy, further fueling shareholder concerns that the balance between user privacy and child protection at Apple is tilting in a way that creates ever-increasing levels of reputational risk.
No Apple shareholder wants to see the company described by its own executives as the “greatest platform for distributing child porn.” No Apple shareholder wants to see articles in mainstream news outlets describing how the company “helped nix parts of a child safety bill.” No Apple shareholder wants to see lawsuits like the one stemming from Amy & Jessica’s horrific abuse as children, alleging that the company knew the risks of CSAM and refused to act. All of these things happened. The push for increased transparency around Apple’s approach to combating online exploitation, as exemplified in Proposal No. 5, is only rational for any shareholder concerned about this growing area of reputational risk.
Rebuttal to the Board’s Statement of Opposition
In direct contrast to the rational nature of this approach, Apple’s statement of opposition (SOP) contains several notable errors, ranging from the tangential to a mischaracterization so blatant that it calls into question whether Apple’s Board of Directors even understands the proposal it is opposing.
Firstly, the tangential. The SOP references Apple’s Communication Safety protocols, including the blurring of photos and videos that contain nudity, and notes in particular that such features are on by default for “accounts of children under 13.” The feature is laudable, but the Board’s statement implies that it is not on by default for accounts of users between the ages of 13 and 18. Does the company believe that children should not be protected, by default, from exposure to sexually exploitative content after their 13th birthday? Why is this feature not on by default for all users under the age of 18? If the Board’s statement is accurate, children older than 13 using Apple messaging have less default protection against being exposed to explicit sexual content than adults on many dating apps (which blur out explicit media by default for users over 18). When a 13-year-old using iMessage is less protected against sexual exploitation than a 23-year-old on Bumble, Apple’s reputational risk becomes abundantly clear.
-
Secondly, the Board mischaracterizes the essential ask of the Proposal: a cost-benefit analysis of CSAM detection software, not the implementation of any specific policy. Apple’s SOP repeatedly references the “universal surveillance suggested in the proposal,” as if the Board cannot differentiate between asking for a cost-benefit analysis of software and asking for the implementation of specific software. It may benefit Apple rhetorically to paint the implementation of CSAM detection software as universal surveillance (although Apple was certainly singing a different tune when it announced the software and described it as having a “1 in 1 trillion” error rate). But the proposal isn’t asking for a specific implementation, and the Board’s inability to distinguish between the basic and separate categories of risk analysis and implementation is absurd and only further discredits the company’s opposition to this Proposal.
Conclusion
The sophistry of the Board’s statement of opposition aside, the problem the Proposal seeks to address has not changed: Apple has a reputational problem when it comes to its stance towards online child exploitation. This isn’t mere shareholder opinion or twisted corporate activism, but an objective statement of the facts. Apple may have an honest and intelligible rationale for not deploying CSAM detection protocols – but shareholders deserve to know what that rationale actually is.
Apple has the opportunity to shed additional light on how it arrived at its decisions surrounding CSAM detection software, and by doing so to offer shareholders something tangible in the face of mounting reputational risk, rather than a request to trust the company without verification. In the name of providing such verification, we urge a vote in favor of Proposal No. 5.
-
You can read the unedited version, as submitted to the SEC, here: https://www.sec.gov/Archives/edgar/data/1996368/000109690625000152/bowy_px14a6g.htm
Thanks for being with us in this fight. It’s a privilege to be in this battle on your behalf.
Isaac Willour is a Corporate Relations Analyst at Bowyer Research.