Apple brings its gen AI ‘Apple Intelligence’ to developers, will let Siri control apps
Apple Intelligence, Apple's new generative AI offering, won't just be a consumer-facing feature; developers will be able to take advantage of the latest technology, too. In Apple's keynote address at WWDC 2024 on Monday, the company announced that developers would be able to integrate the experience powered by Apple Intelligence into their own apps.
Apple's SDKs (software development kits) have been updated with a variety of new APIs and frameworks that will allow app makers to do things like integrate Image Playground, Apple's generative AI image creation feature, with just a few lines of code. Apple showed off how an app like Craft could use this offering to make users' documents more visual by allowing them to add AI images.
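As a rough illustration of what "a few lines of code" looks like, here is a minimal SwiftUI sketch using the ImagePlayground framework's sheet modifier; the view name and the concept string are hypothetical, and the exact API shape should be confirmed against Apple's documentation.

```swift
import SwiftUI
import ImagePlayground

// A button that opens the system Image Playground sheet seeded with a
// text concept, then receives a file URL for the generated image.
struct AddImageButton: View {
    @State private var showingPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        Button("Add AI Image") { showingPlayground = true }
            .imagePlaygroundSheet(
                isPresented: $showingPlayground,
                concept: "a watercolor mountain landscape"
            ) { url in
                // The system hands back the generated image's URL;
                // the app can then insert it into the document.
                generatedImageURL = url
            }
    }
}
```

Because the system supplies the whole generation UI, the app's only job is presenting the sheet and handling the result.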
AI-powered writing tools will also be automatically available in any app that uses the standard editable text view. For this, Apple demonstrated how an app like Bear Notes would automatically be able to allow users to rewrite, proofread and summarize their notes.
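Because Writing Tools arrive automatically in standard text views, the developer surface is mostly about tuning the behavior. A short sketch, assuming the `writingToolsBehavior` property Apple added to text views in iOS 18:

```swift
import UIKit

// Writing Tools show up automatically in standard editable text views.
// Apps can adjust how much of the experience they get:
let textView = UITextView()
textView.writingToolsBehavior = .complete  // full in-line rewriting
// .limited offers a reduced overlay experience; .none opts the view out.
```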
In addition, Apple is building more ways for developers to take actions in apps with Siri.
Developers who have already adopted SiriKit, an SDK for integrating Siri into their apps, will see immediate enhancements for many of Siri's new capabilities without any extra work on their part, Apple said. This includes areas like Lists, Notes, Media, Messaging, Payments, Restaurant reservations, VoIP calling and Workouts.
In its Developer keynote, Apple said that there are two new Siri capabilities that developers will be able to benefit from without additional work. First, Siri will be able to invoke any item from an app's menus. That means a user could say something like "show my presenter notes" while in their slide deck, or even something more conversational, like "I need to see my speaker notes."
Second, Siri will be able to access any text displayed on the page using Apple's standard text systems. This will allow users to reference and act on text on the screen. For instance, if you had a reminder or note to "wish grandpa a happy birthday," you could just say "FaceTime him" to take action on that note.
Meanwhile, Apple's App Intents framework, which lets developers expose their apps' actions to the system, will also gain access to Apple Intelligence.
Apple is defining new intents and making them available to developers, starting with a first set of categories: Books, Browsers, Cameras, Document Readers, File Management, Journals, Mail, Photos, Presentations, Spreadsheets, Whiteboards, and Word Processors.
These intents are defined and tested so they're easier for developers to adopt, Apple claims.
With the intents, a photo-editing app like Darkroom could leverage the Apply Filter intent so users could just say "Apply a cinematic preset to the photo I took of Ian yesterday" to have the app take action. More domains will be added in time.
Initially, users will be able to use app intents with the Shortcuts app, but over time, Siri will gain the ability to call the app intents in the supported domains.
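For reference, a basic app intent of the kind the Shortcuts app picks up today is a small amount of code. This is a self-contained sketch; the intent name and its behavior are hypothetical.

```swift
import AppIntents

// A minimal app intent. Once an app ships an intent like this, it
// becomes available in the Shortcuts app; per Apple, Siri will later
// be able to invoke intents in the supported domains directly.
struct SummarizeNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note"

    @Parameter(title: "Note Title")
    var noteTitle: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Hypothetical app logic would look up and summarize the note.
        return .result(dialog: "Summarizing \(noteTitle)")
    }
}
```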
Plus, Apple shared in its keynote address that apps fitting an existing SiriKit domain will be able to benefit from Siri's enhanced conversational capabilities, like responding correctly even if you stumble over your words or understanding references to an earlier part of the conversation.
Siri will also be able to search data from apps using a new Spotlight API that enables app entities to be included in its index. These entities feed into Apple Intelligence's semantic index of things like photos, messages, files, calendar events, and more.
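A sketch of how an app entity might be exposed to that index, assuming the `IndexedEntity` protocol and the Core Spotlight `indexAppEntities` call introduced alongside Apple Intelligence; the `NoteEntity` type and its fields are hypothetical.

```swift
import AppIntents
import CoreSpotlight

// An app entity marked as indexable, so Siri and Spotlight can
// find and reference it in the semantic index.
struct NoteEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

struct NoteQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [NoteEntity] { [] }
}

// Donating entities to the index might then look like:
// try await CSSearchableIndex.default().indexAppEntities(
//     [NoteEntity(id: UUID(), title: "Wish grandpa a happy birthday")])
```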
Also on Monday, the company announced its own password-manager app, AI-generated emoji called Genmoji and a Calculator app for the iPad.
This post was updated after publication with more information from the Developer keynote.