Taking on the World's Largest Company Over Sex Trafficking
Asking Apple why it’s fighting child protection software and legislation.
A client of ours has proposed a resolution for the next Apple shareholder meeting, asking the company to explain its decision to scrap plans to roll out software that helps protect minors from sexually explicit content. The proposal recognizes that there are genuine trade-offs between detecting illegal content and protecting privacy, so it takes a balanced approach, asking the company to explain how it made this decision. Is Apple fully counting the costs and risks associated with inadequate protections for minors? We're not convinced it is.
Surprisingly, the company responded within a week, asking for a meeting. That is not how it usually works. Usually, companies fight to get proposals off the ballot (which Apple is also doing this time around) and offer a meeting only after losing those battles. Apple was strangely motivated in this case.
We wondered why.
Then, a few days later, we saw this article from the Wall Street Journal showing that Apple had successfully lobbied against provisions in Louisiana state legislation designed to protect minors from sexually explicit content. Now we understood why Apple was anxious to talk: it knew the story was in the works, and the last thing it needed was another wave of coverage of its reluctance to protect children from sexual exploitation.
One could reasonably argue that privacy concerns are important and that Apple should place a premium on privacy over protection. Except that when we and our client challenged Apple last year about its willingness to give in to Chinese pressure to micromanage its App Store, the company was unable to point to any attempt at all to push back in that case. Heavy lobbying in Louisiana… instant capitulation in China.
We want you to know that sex trafficking will be a major focus of our corporate engagement this coming year, as we work with shareholders and with the crack team at Alliance Defending Freedom. Could there be a more urgent need? We are also in conversations with state officials who want to join the fight, but we'll leave it to them to make their own announcements.
Just in case you ever wanted to know what these resolutions look like, I'm including the Apple one, written by our own Isaac Willour, below. I should tell you, however, that Apple is fighting this proposal on various legal technicalities, and we do not know whether its massive legal team will win.
We want to thank the shareholder for stepping up and putting their "John Hancock" on the line to fight the largest company in the world on behalf of trafficked children.
Report on Costs and Benefits of Child Sex Abuse Material-Identifying Software & User Privacy
Resolved: Shareholders request that Apple Inc. prepare a transparency report on the costs and benefits of the company’s decisions regarding its use of child sex abuse material (CSAM)-identifying software. This report shall be made publicly available to the company’s shareholders on the company’s website, be prepared at a reasonable cost, and omit proprietary information, litigation strategy, and legal compliance information.
Whereas: In Apple’s Human Rights Policy[1], the company asserts that “we believe in the power of technology to empower and connect people around the world—and that business can and should be a force for good.” As shareholders of the largest and most innovative tech company in the world, we believe Apple is uniquely positioned both to defend user privacy and to prevent victimization of at-risk populations.
And yet, the balance of privacy and safety at Apple has tilted in a concerning direction. In early 2024, Apple was named[2] to the National Center on Sexual Exploitation’s ‘Dirty Dozen’ list for the second year in a row, a list of major companies that facilitate or enable sexual abuse and exploitation through their platforms. In a letter sent to Apple executives, NCOSE wrote that “an increasing number of stakeholders… are taking notice and growing frustrated about Apple’s negligence around child protection.” Such instances include Apple’s decision to reverse the implementation of NeuralHash, a program designed to scan for child sexual abuse material while maintaining user privacy. The reversal drew outrage from child safety groups and anti-trafficking watchdogs, indicating to some that Apple’s willingness to prevent the distribution of illegal content came second to its desire to advance its commitments to users’ online privacy. Shareholders who care about both user privacy and child safety deserve further information on how Apple arrived at this decision.
Outside of NeuralHash, Apple still fails[3] to block sexually explicit content from being viewed or sent by users under the age of twelve and does not filter explicit content by default for teenage users on its messaging services. The App Store also recommends[4] content age-rated above an account user’s entered age, a practice that exposes underage users to sexually explicit content. Apple’s inaction has allowed children to be exposed to adult content and has facilitated, wittingly or otherwise, illegal sexual exploitation of its youngest users. This inaction is impossible to reconcile with Apple’s stated commitments to “treating everyone with dignity and respect” and its business model as a “force for good.” If a company’s corporate philosophy on human rights allows the sexualization of innocent children to fall through the cracks, such loopholes reveal either a lack of meaningful commitment to defending those rights or an inability to effectively protect them. As Apple shareholders, we know the company is capable of doing better, and of creating a world where “think different” means being the industry gold standard for protecting the most innocent among us.
[1] https://s2.q4cdn.com/470004039/files/doc_downloads/gov_docs/Apple-Human-Rights-Policy.pdf
[2] https://endsexualexploitation.org/apple/
[3] https://www.wired.com/story/apple-csam-scanning-heat-initiative-letter/#:~:text=Today%2C%20in%20a%20rare%20move,collectively%20as%20Communication%20Safety%20features.
[4] https://protectchildren.ca/en/resources-research/app-age-ratings-report/