
Apple: Spotlight and Siri in iOS 9 add Google-style ranking algorithms to app content

Tue 16 Jun 2015

Apple’s iOS 9 iteration of its mobile operating system will add ‘in-app search’ functionality via Spotlight and Siri, effectively turning the content ecosphere within installed apps into a mini-internet where the main players will fight for ‘rank’ away from the otherwise ubiquitous eyes of Google. Additionally, in contrast to Google and to standard practice for search networks, the new functionality will not track users.

Apple announced the innovation at the Worldwide Developers Conference in San Francisco this week, to cheers from the developer community. For the SEO community, it’s another set of rules to guess at, reverse-engineer and negotiate. But discounting the potential impact of aggregate results from apps is not an option: with 1.7 million apps available in Apple’s App Store, a user base which has purchased over 1 billion iOS devices, and a significant number of the top 100 apps offering potentially rankable content (rather than opaque services such as Skype and WhatsApp), searchable app content has enormous potential traction.

App developers can use the Core Spotlight API to make their app content searchable. As in the early days of a ‘from scratch’ search engine, initial ranking will favour in-app content which is accessed frequently. Additionally, Siri will deliver results from apps which the user has not installed, offering significant potential to drive uptake of apps whose content might be of interest to the end-user.
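By way of illustration, a minimal Swift sketch of how a developer might register a piece of content with the Core Spotlight index could look something like the following – the titles, keywords and identifiers here are entirely hypothetical:

```swift
import CoreSpotlight
import MobileCoreServices

// Describe the content; title, description and keywords drive Spotlight matching.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Porcini risotto"                                // hypothetical item title
attributes.contentDescription = "A 30-minute wild mushroom risotto recipe."
attributes.keywords = ["risotto", "mushroom", "recipe"]

// Wrap the attributes in a searchable item keyed by an app-specific identifier.
let item = CSSearchableItem(uniqueIdentifier: "recipe-42",           // hypothetical ID
                            domainIdentifier: "com.example.recipes", // hypothetical domain
                            attributeSet: attributes)

// Hand the item to the on-device index, from which Spotlight and Siri draw results.
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```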

The feature will also be available in OS X El Capitan, Apple’s forthcoming desktop OS upgrade.

The intent of the model is similar to Google’s PageRank system, which Google have downplayed in recent years without actually abandoning it, wherein a frequently-changing indexing algorithm assigns value to content on the basis of the frequency and/or quality of the sources linking to or accessing it. The ‘uprank’ brings new buoyancy in SERPs for the lucky recipient.

Apple will also index ‘deep content’ via NSUserActivity, functionality introduced last year for the Handoff feature, which allows users to maintain a consistent experience across iOS devices.

Effectively, NSUserActivity, until now a private interaction between the end-user and an application, will henceforth yield access statistics to Apple, and these will form part of the basis of the new searchable in-app environment. This generates online results from (technically) ‘offline’ activity – the equivalent of a marketer noting that you pick a particular book from your bookshelf quite often and consequently upranking that book in results.
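In code terms, an app flags an individual activity as indexable along roughly these lines – a hedged sketch, with a hypothetical activity type and title:

```swift
import Foundation

// Mark a piece of in-app activity as eligible for the Spotlight index.
// The reverse-DNS activity type and title below are hypothetical.
let activity = NSUserActivity(activityType: "com.example.recipes.viewing")
activity.title = "Porcini risotto"
activity.isEligibleForSearch = true          // index this activity on the device
activity.isEligibleForPublicIndexing = true  // opt in to Apple's aggregate, crowd-ranked index
activity.isEligibleForHandoff = true         // retain the original Handoff behaviour
activity.becomeCurrent()                     // tell the system this is now the active activity
```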

Finally, the index will also take in links from the ‘main internet’ via the Web Markup feature, which permits site owners to create informational connections between their apps and the web. Effectively this will allow Apple OS users to search for information in Safari and open in-app results natively, in the context of the app itself.
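When the user taps one of these results, the app receives the originating activity and is expected to route to the relevant screen itself. A hedged sketch of that handling, assuming a standard UIKit app delegate, might run as follows – the showContent(for:) and showPage(for:) helpers are hypothetical:

```swift
import UIKit
import CoreSpotlight

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        // A tapped Spotlight result arrives carrying the identifier used at indexing time.
        if userActivity.activityType == CSSearchableItemActionType,
           let identifier = userActivity.userInfo?[CSSearchableItemActivityIdentifier] as? String {
            showContent(for: identifier)
            return true
        }
        // A tapped web-markup result arrives as a browsing activity carrying the page URL.
        if userActivity.activityType == NSUserActivityTypeBrowsingWeb,
           let url = userActivity.webpageURL {
            showPage(for: url)
            return true
        }
        return false
    }

    // Hypothetical routing helpers: navigate to the matching in-app screen.
    private func showContent(for identifier: String) { /* ... */ }
    private func showPage(for url: URL) { /* ... */ }
}
```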

The results will be contextually interactive, depending on content: telephone numbers will be dialable, maps will be linked directly from location search results, and videos will likewise be directly accessible without interstitial interruption.
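In practice that interactivity is driven by the same attribute set used at indexing time, which can flag phone numbers, coordinates and similar data as actionable. A small hedged sketch, with hypothetical values:

```swift
import CoreSpotlight
import MobileCoreServices

// A hypothetical contact-style item whose Spotlight result offers call and directions actions.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeContact as String)
attributes.title = "Head office"
attributes.phoneNumbers = ["+44 20 7946 0000"]           // hypothetical number
attributes.supportsPhoneCall = NSNumber(value: true)     // surfaces a call button on the result
attributes.latitude = NSNumber(value: 51.5074)           // hypothetical coordinates
attributes.longitude = NSNumber(value: -0.1278)
attributes.supportsNavigation = NSNumber(value: true)    // surfaces a directions action
```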

Google’s Android mobile OS introduced similar functionality in 2012, with Android 4.1’s ‘Google Now’. In January of this year Google announced the same kind of functionality, citing what seems to be a popular example: the notion of obtaining results about cities or vacations from Airbnb.

It will be interesting to see whether Apple can forever resist tracking and exploiting user activity that takes place in the in-app content world. From a business point of view, the company is simply delivering more value to a premium product that has already been paid for, with word of mouth arguably a further enticement to increased product sales, as ‘iOS virgins’ get a look at what is potentially a sub/lateral internet in itself.

Tags: Apple, iOS, news