DSA Transparency Report

App Store

April 2024


In accordance with Articles 15, 24 and 42 of the EU’s Digital Services Act (DSA), this transparency report provides information on orders and notices of illegal content received by the App Store[1] and on content moderation that the App Store has undertaken on its own initiative.[2] It covers the reporting period from 28 September 2023 to 27 March 2024 and reflects data as of 28 March 2024.

Section 1: Orders Received from EU Member States[3]

This section covers orders issued by EU Member States’ judicial or administrative authorities to act against illegal content in accordance with Article 9 of the DSA. These orders can be submitted through Content Reports (ContentReports.apple.com) — the App Store notice and action mechanism developed in compliance with Article 16 — as well as through existing communication channels.[4] The section also covers orders issued by EU Member States’ judicial or administrative authorities to provide information, pursuant to Article 10 of the DSA.

Total number of EU Member State orders to act against illegal content: 0

Total number of EU Member State orders to provide information: 111

Table 1.1: EU Member State orders to provide information categorised by the EU Member State issuing the order

EU Member State    Count of orders
Czech Republic 1
France 44
Germany 44
Italy 4
Lithuania 1
Poland 2
Spain 13
Sweden 2


EU Member State orders to provide information — median time to confirm receipt: 0 days[5]

EU Member State orders to provide information — median time to give effect to the order: 5 days

Section 2: Notices Received Through Notice and Action Mechanism

The App Store notice and action mechanism in accordance with Article 16 is available at ContentReports.apple.com or through the Report a Problem tool and can be accessed by any individual or entity with an EU IP address.

Total number of notices: 2,739

Table 2.1: Notices categorised by type of alleged illegal content concerned[6]

Type of alleged illegal content    Count of notices
Violates intellectual property rights[7] 1,988
Provides or facilitates an illegal service 249
Violates consumer protection or privacy law 148
Child sexual abuse material 104
Violates advertising law 33
Incites terrorism or violence 24
Illegal hate speech 23
Other 170


Notices submitted by trusted flaggers: 0[8]

Notices processed using automated means: 4[9]

Total number of notices on which the App Store took an action: 364[10]

  • Action taken on the basis of the law: 0

  • Action taken on the basis of terms and conditions: 364[11]

Median time to take action: 5 days[12]

Section 3: App Store-Initiated Content Moderation

The App Store moderates three types of content for compliance with applicable law and terms and conditions: live apps that are published on the App Store, ads that are live on the App Store,[13] and user ratings and reviews and public developer responses (both before and after publication).

Content moderation relating to published apps

Before they’re published on the App Store — that is, before becoming subject to any content moderation decisions — all apps and app updates are subject to two levels of ex ante review by the App Review team (App Review): automated review and human review. After apps are published, the App Store continues to monitor them and can and does take action, including app takedowns, if it identifies apps that do not comply with applicable law, with the App Review Guidelines or with other applicable terms and conditions — namely the Apple Developer Agreement or the Apple Developer Program License Agreement (DPLA). The App Store may also take action if alerted to concerns about an app by third parties, including via Report a Problem or Content Reports.

Automated tools are used to assist App Review specialists in their ongoing monitoring of apps published on the App Store. This includes automated tools that detect malware or bait-and-switch apps that change their functionality after approval by App Review. It also includes automated tools that scan user reviews of published apps to identify concerns raised by consumers that may indicate that apps contain content incompatible with applicable law or terms and conditions. These tools are continually trained and enhanced to address new and emerging threats and to factor in learning from human-based decision-making. Any apps flagged as potentially problematic by these automated tools are escalated to human App Review specialists. These specialists then determine whether the apps violate App Store terms and conditions and, if so, what action to take — for example, app removal. No such actions are taken solely on the basis of automated tools.

App Review supports all official EU Member State languages (see Table 3.1 later in this report). Each specialist receives language- and region-specific training that covers cultural and sensitivity issues as they relate to enforcing the App Review Guidelines. Specialists also participate in regular discussions on new issues or trends that arise in their particular regions.

All App Review specialists charged with the continuing review of apps published on the App Store receive comprehensive training when they join the App Review team. This training is supplemented by ongoing training to ensure that specialists remain informed of new and emerging threats and issues. They have access to senior App Review specialists for guidance and can escalate issues internally, including to the App Review Board.

Content moderation relating to published ads

Before they’re published on the App Store, ads are reviewed to ensure that they comply with Apple Search Ads Advertising Policies. The App Store will also moderate ads after they’re published if it becomes aware that they’re in breach of the Apple Search Ads Advertising Policies.

Some automated tools are used to assist with moderating ads, but final decisions regarding content moderation are taken by humans.

Personnel charged with moderating published Apple Search Ads content receive comprehensive training when they join the team. This training is supplemented by ongoing training to ensure that personnel remain informed of new and emerging threats and issues.

Content moderation relating to user ratings and reviews as well as developer responses

Customers provide ratings and reviews on the App Store to give feedback on their experience with an app and to help others decide which apps they’d like to try. App developers can respond to user reviews regarding their apps. Some submitted ratings and reviews are never published on the App Store: automated tools screen them before publication to block certain types of content, such as spam, fake reviews or profanity. Nonetheless, we include these ratings and reviews in the relevant tables that follow.

All user ratings and reviews must comply with the Submissions Guidelines in the Apple Media Services (AMS) Terms and Conditions. Ratings and reviews that do not comply with these terms and conditions can be removed from the App Store. All developer responses must comply with the Developer Code of Conduct in the App Review Guidelines, as well as with the DPLA. Responses that do not comply can also be removed from the App Store.

We use a combination of automated and human review to moderate this content. Any reviews that may violate the law or App Store terms and conditions are evaluated by human moderators. Automated tools are used to flag reviews with potential concerns for human moderators to consider; because final decisions are taken by human reviewers rather than by automated means, considerations of the accuracy or error rate of such automated tools do not apply. Human reviewers sometimes detect, or are alerted by developers and customers to, published reviews that contain new patterns of illegal content or content that doesn’t comply with the AMS Terms and Conditions. In these circumstances, human reviewers may run automated queries to detect other published reviews that contain such content.

The personnel responsible for moderating user ratings and reviews and developer responses both before and after publication on the App Store receive comprehensive training at onboarding. This training is supplemented by ongoing training to ensure that personnel remain informed of new and emerging threats and issues.

Total number of human resources dedicated to content moderation: 606

Table 3.1: Human resources dedicated to content moderation categorised by supported language[14]

In accordance with Article 42(2)(a) of the DSA, the App Store must specify the human resources that it dedicates to content moderation, broken down by each official language of the EU Member States. The App Store does not maintain separate content moderation teams per EU Member State language, so Table 3.1 instead reports the number of content moderation human resources broken down by proficiency in EU Member State languages.

Language    Count of dedicated human resources
English 605
Spanish 44
Portuguese 40
German 32
French 29
Italian 24
Swedish 16
Danish 16
Polish 13
Dutch 12
Bulgarian 11
Slovak 11
Croatian 10
Czech 10
Greek 10
Slovenian 8
Estonian 7
Romanian 7
Hungarian 6
Finnish 6
Lithuanian 6
Maltese 6
Latvian 3
Irish 0


Total number of content moderation measures taken: 10,062,658

Number of content moderation measures taken that were detected solely using automated means: 4,528,360[15]

Table 3.2: Content moderation measures taken categorised by type of restriction applied

Type of restriction applied    Count of content moderation measures taken
Ratings or reviews removed[16] 5,437,149
Accounts terminated[17] 4,581,316
Apps removed[18] 43,919
Ads removed 150
Apps restricted[19] 122
Accounts restricted[20] 2


Table 3.3: Content moderation measures taken categorised by type of illegal content or violation of terms and conditions

Type of violation    Count of content moderation measures taken
AMS Terms Section K — Prohibited Use of Service 10,017,522
App Review Guideline 4.0 — Design[21] 35,158
Apple DPLA Section 3.2(f) 8,665
App Review Guideline 4.3 — Spam 385
App Review Guideline 5.6.3 — Discovery Fraud 122
App Review Guideline 4.1 — Copycats 119
Advertising Policies 1 — Advertiser Responsibilities 95
App Review Guideline 2.5.18 — Software Requirements 84
App Review Guideline 5.6 — Developer Code of Conduct 66
App Review Guideline 2.3.1 — Accurate Metadata 62
App Review Guideline 2.1 — App Completeness 59
App Review Guideline 3.1.2 — Subscriptions 40
App Review Guideline 1.1 — Objectionable Content 37
App Review Guideline 5.2.1 — Intellectual Property — Generally 32
Advertising Policies 4.4 — Real Money Gambling 29
App Review Guideline 5 — Legal 21
App Review Guideline 3.1.1 — In-App Purchases 18
App Review Guideline 5.1.2 — Data Use and Sharing 15
App Review Guideline 3.2.2 — Unacceptable Business Model 12
App Review Guideline 4.2 — Minimum Functionality 11
Advertising Policies 4.2 — Sensitive Content or Imagery 9
App Review Guideline 5.1.1 — Data Collection and Storage 9
App Review Guideline 2.3 — Accurate Metadata 7
App Review Guideline 2.3.6 — Accurate Metadata 7
App Review Guideline 2.3.7 — Accurate Metadata 7
App Review Guideline 5.3.4 — Gaming, Gambling, and Lotteries 7
Advertising Policies 4.1 — Alcohol 5
App Review Guideline 1.4.3 — Physical Harm 5
App Review Guideline 5.2.2 — Intellectual Property — Third-Party Sites/Services 5
App Review Guideline 5.2.3 — Intellectual Property — Audio/Video Downloading 5
Advertising Policies 4.6 — Dating Services/Match Making 4
App Review Guideline 1.2 — User-Generated Content 4
Advertising Policies 5.2 — Ad Placement Eligibility Overview 3
App Review Guideline 2.5.1 — Software Requirements 3
App Review Guideline 3.1.5 — Cryptocurrencies 3
App Review Guideline 4.8 — Login Services 3
Advertising Policies 3.5 — Controlled Substances 2
App Review Guideline 1.5 — Developer Information 2
App Review Guideline 3 — Business 2
App Review Guideline 5.2.5 — Intellectual Property — Apple Products 2
App Review Guideline 5.4 — VPN Apps 2
Advertising Policies 3.4 — Adult Content 1
Advertising Policies 4.3 — Pharmaceutical and Medical 1
Advertising Policies 4.5 — Simulated Gambling 1
App Review Guideline 1.4 — Physical Harm 1
App Review Guideline 2.2 — Beta Testing 1
App Review Guideline 2.3.10 — Accurate Metadata 1
App Review Guideline 2.3.3 — Accurate Metadata 1
App Review Guideline 2.4.5 — Hardware Compatibility 1
App Review Guideline 4.10 — Monetizing Built-In Capabilities 1
App Review Guideline 5.1.5 — Location Services 1


Section 4: Complaints[22]

Total complaints received: 940[23]

Median time to take decisions: 5 days[24]

Table 4.1: Complaints received categorised by decision taken

Decision taken    Count of complaints received
Decision upheld 453
Decision reversed 225
Decision pending 262


Section 5: Out-of-Court Disputes

Number of disputes submitted to out-of-court dispute settlement bodies referred to in Article 21: 0[25]

Section 6: Suspensions for Misuse of the Service

DSA Article 23(1) provides for the suspension of users who “frequently provide manifestly illegal content.” DSA Article 23(2) provides for the suspension of users who “frequently submit notices or complaints that are manifestly unfounded.”

Under its existing content moderation practices, and in accordance with its terms and conditions, the App Store will terminate — rather than merely suspend — the accounts of any user or developer who frequently provides manifestly illegal content in the form of apps, user reviews of published apps or other forms of illegal content. The App Store may suspend or terminate users who frequently submit Content Reports notices or related complaints that are manifestly unfounded.

Total number of suspensions for provision of manifestly illegal content: 0

Total number of suspensions for submission of manifestly unfounded notices: 0

Total number of suspensions for submission of manifestly unfounded complaints: 0

Section 7: App Store Recipients of the Service

Table 7.1: Average monthly recipients of the App Store categorised by EU Member State

EU Member State    Count of monthly recipients of the App Store[26]
Austria 3 million
Belgium 4 million
Bulgaria Under 1 million
Croatia Under 1 million
Cyprus Under 1 million
Czechia 2 million
Denmark 4 million
Estonia Under 1 million
Finland 2 million
France 25 million
Germany 30 million
Greece 2 million
Hungary 2 million
Ireland 2 million
Italy 15 million
Latvia Under 1 million
Lithuania Under 1 million
Luxembourg Under 1 million
Malta Under 1 million
Netherlands 8 million
Poland 6 million
Portugal 2 million
Romania 3 million
Slovakia Under 1 million
Slovenia Under 1 million
Spain 12 million
Sweden 6 million

  1. Apple Distribution International Limited (ADI) is responsible for the provision of the App Store in the EU.

  2. Data in the report comes from EU Member State storefronts only.

  3. For the purposes of this report, an “order” is a valid request from a Member State judicial or administrative authority, made in accordance with specific mandatory powers under the laws of the Member State in question, that seeks production of information relating to the use of App Store services by one or more specific recipients of the service, or that requires action to be taken in respect of specific items of illegal content, regardless of whether it constitutes an order legally binding on ADI. ADI is an entity established in Ireland under the laws of the Republic of Ireland. As a result, court orders issued by Irish courts mandating data production by ADI are mandatory requests under current law as defined by Article 10 of the DSA. For the sake of greater transparency, we have decided to also include voluntary requests sent by official authorities in EU Member States other than Ireland that were deemed valid legal requests according to our Legal Process Guidelines for Law Enforcement and Government outside the United States. This reporting is without prejudice to the legal position of Apple Distribution International Limited with regard to the binding nature of any such order under applicable law.

  4. For more information on accessing the Content Reports portal, please see Section 2.

  5. All lengths of time are listed in calendar days.

  6. This table displays the category of illegal content selected by the notifier. The selected category may not accurately reflect the content concerned.

  7. Intellectual property rights notices are generally settled by the rights claimants themselves without requiring action from the App Store.

  8. Under DSA Article 22(5), the European Commission will publish in a publicly available database information regarding entities awarded the status of trusted flagger by the Digital Services Coordinators designated by the EU Member States in accordance with DSA Article 22(2). To date, no such information has been published.

  9. Content Reports employs a filter that automatically rejects notices that are spam or that do not include a valid URL to content hosted on the App Store.

  10. For more information on the content moderation actions that the App Store takes, see Table 3.2. Some actions on reported notices may not be included in these figures because they were pending at the time that data was collected. Additionally, some notices reviewed were not related to the reporting of illegal content and are therefore not actionable.

  11. The App Store terms and conditions — including the App Review Guidelines, the Apple Developer Program License Agreement (DPLA) and the Apple Media Services (AMS) Terms and Conditions — prohibit illegal content on the App Store, as well as the use of the App Store for illegal purposes. As such, this figure may include actions taken against content that might also be incompatible with applicable laws.

  12. The median is calculated based on decisions completed by the end of the reporting period.

  13. Apple Search Ads is a separate business from the App Store; it’s included in this report because it sells advertising media on the App Store.

  14. Table 3.1 lists human resources who are proficient in multiple languages under each applicable language.

  15. This category includes ratings and reviews detected and removed solely through automated means.

  16. This category includes ratings and reviews removed both before and after publication.

  17. This category includes terminations of App Store developers, AMS customers and Apple Search Ads advertisers.

  18. This category includes apps that were live on the App Store when the developer account was terminated.

  19. This category includes apps that were suppressed on App Store charts and in App Store search. It also includes apps blocked from being transferred to other developers.

  20. This category includes user accounts that are restricted from leaving ratings or reviews on the App Store.

  21. These removals are the result of ongoing cleanup of outdated apps.

  22. The App Store has multiple complaint-handling systems from which data is included in this section. These systems cover complaints related to Content Reports notices, review removals, App Review Board (ARB) cases, platform-to-business (P2B) cases and advertisement restrictions. For more information on redress options, please see apple.com/legal/dsa/en/redress-options.html.

  23. For future reports, the App Store will seek to give users filing complaints additional flexibility to select a basis for their complaint.

  24. The median is calculated based on decisions completed by the end of the reporting period.

  25. In accordance with Article 21(8) of the DSA, the European Commission will publish a list of out-of-court dispute settlement bodies certified by Digital Services Coordinators under DSA Article 21(3). To date, no such list has been published.

  26. These figures are approximate, based on information that is readily available to the App Store in the normal course of its business.