How Apple Could Help Developers With AI and LLMs

Build a semantic index (SI), and allow apps to access it via permissions granted much like the ones we grant today for the Address Book or Photos.

Maybe even make the permissions for the SI a bit more fine-grained than you normally would for other personal databases. Historical GPS locations? Scraped contents of the screen over time? Indexed contents of document folder(s)? Make each of these an option for what goes into the SI.

And of course, the same would be true for building the SI. As a user, I’d love to be able to say “sure, capture what’s on the screen and scrape the text out of that, but nope – you better not track where I’ve been over time”.
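As a rough sketch of what that per-source permission model could look like, here's some hypothetical Swift, loosely modeled on `PHPhotoLibrary.requestAuthorization(for:)`. None of these types exist today; every name below is invented for illustration, and the stub just stands in for whatever system prompt Apple would actually show.

```swift
import Foundation

// Hypothetical: the distinct buckets of data a user could opt in or out of.
enum SemanticIndexSource {
    case screenContents   // text scraped from the screen over time
    case locationHistory  // historical GPS locations
    case documentFolders  // indexed contents of chosen document folder(s)
}

enum SemanticIndexAuthorization {
    case notDetermined, denied, authorized
}

// Stub standing in for the (hypothetical) system framework.
enum SemanticIndex {
    static func requestAuthorization(
        for source: SemanticIndexSource,
        completion: (SemanticIndexAuthorization) -> Void
    ) {
        // The real system would prompt the user per source.
        completion(.notDetermined)
    }
}

// An app could then ask for exactly one slice of the index:
// screen text is granted here, while location history stays
// off-limits unless requested and granted separately.
SemanticIndex.requestAuthorization(for: .screenContents) { status in
    guard status == .authorized else { return }
    // Query only the screen-text portion of the SI here.
}
```

The point of the sketch is the granularity: authorization is requested per data source, not for the index as a whole.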

And similar to the Spotlight indexing API, developers should be able to provide data to the SI along with rich metadata. Rev the Spotlight plugin API so that it can do more, or come up with a new API.
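For comparison, here's roughly what donating an item with metadata looks like with today's Core Spotlight API, followed by a hypothetical extension of the kind a revved API might add. The first half uses real Core Spotlight calls; the commented-out properties at the end (`semanticVector`, `entities`) do not exist and are purely illustrative.

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Today: donate an item with rich metadata to the Spotlight index.
let attributes = CSSearchableItemAttributeSet(contentType: .text)
attributes.title = "Trip notes"
attributes.textContent = "Full text for the index to search…"
attributes.keywords = ["travel", "notes"]

let item = CSSearchableItem(uniqueIdentifier: "note-42",
                            domainIdentifier: "com.example.notes",
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error {
        print("Indexing failed: \(error)")
    }
}

// Hypothetical: a revved API could accept richer semantic payloads,
// e.g. a precomputed embedding from an on-device model plus
// structured entities. Neither property exists in Core Spotlight today.
// attributes.semanticVector = embedding            // [Float]
// attributes.entities = ["place": "Lisbon"]        // structured metadata
```

The existing plugin model already handles the plumbing of "apps feed the index"; the rev would be about what the index can accept and how an LLM can query it back out.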

Is this information collected for the SI going to be the most sensitive bucket of bits on your device? Yes, of course it is.

But give developers the opportunity, and customers will have something to choose from. Make the Mac and iOS the best platforms for building personalized LLMs.

— Read on shapeof.com/archives/2025/3/how_apple_could_help_developers_with_ai_and_llms.html