
Apple plans to enhance App Store discoverability by leveraging artificial intelligence (AI) tagging techniques that are now accessible in the developer beta build of iOS 26.
However, the tags do not yet appear on the public App Store, nor are they informing the App Store search algorithm on the public store.
A recent analysis by app intelligence firm Appfigures suggested that metadata extracted from an app’s screenshots is influencing its ranking.
The company theorised that the Cupertino-based tech giant was extracting text from screenshot captions. Previously, only the app’s name, keyword list, and subtitle would count towards its search ranking.
While Appfigures thought Apple was using OCR (optical character recognition) to read screenshot text, the company clarified at its Worldwide Developers Conference (WWDC 25) that it is leveraging its AI techniques to extract the relevant information from an app’s description, screenshots, and other metadata.
This AI tagging aims to surface hidden details and categorise apps more accurately without developers having to manually add extra keywords to screenshots or descriptions.
Developers will be able to choose which AI-generated tags apply to their apps.
Additionally, Apple promised that all tags would undergo human review before being shown to users, offering quality control.
Once this update reaches the public App Store, managing these AI-assigned tags will be essential for developers looking to improve their app’s visibility and search ranking.
Currently, the system is in beta testing, offering developers time to adapt before it impacts App Store search results worldwide.