I would like to write a macOS application that uses on-device AI (FoundationModels).
I don’t understand how, practically, to give it access to my documents, photos, or contacts so that I can ask it a question like: “Find the document that talks about this topic.”
Do I need to manually retrieve the data and provide it in the form of a prompt? Or is FoundationModels capable of accessing it on its own?
Thanks
Hi @Julien458,
The Foundation Models framework itself does not provide a direct way to search your documents. There are two ways your app might achieve this:
- Your app could first identify pertinent documents/information by, for example, querying a database or using an API. Then, your app would include this information as part of the prompt to Foundation Models (see the first sketch below).
- Alternatively, your app could provide Foundation Models with a Tool that performs this same sort of query, and Foundation Models could then call that Tool whenever it is relevant to do so based on the prompt (see the second sketch below).
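Here's a minimal sketch of the first approach. Everything other than the Foundation Models calls is hypothetical: `searchDocuments(matching:)` stands in for whatever retrieval your app already knows how to do (a database query, a Core Spotlight search, an API call, and so on).

```swift
import FoundationModels

// Hypothetical placeholder: replace with your own retrieval
// (database query, Core Spotlight search, API call, ...).
func searchDocuments(matching query: String) -> [String] {
    []  // return the text of the matching documents/excerpts
}

func answer(_ question: String) async throws -> String {
    // 1. Your app finds the relevant content itself.
    let excerpts = searchDocuments(matching: question)

    // 2. That content is included directly in the prompt sent to the model.
    let session = LanguageModelSession(
        instructions: "Answer the user's question using only the provided document excerpts."
    )
    let prompt = """
        Question: \(question)

        Document excerpts:
        \(excerpts.joined(separator: "\n---\n"))
        """
    return try await session.respond(to: prompt).content
}
```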
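And here's a sketch of the second approach, a Tool. The tool name, argument, and search logic are again hypothetical, and the exact Tool protocol signature (for example, the output type of `call(arguments:)`) has changed between SDK seeds, so check it against the SDK you're building with.

```swift
import FoundationModels

struct FindDocumentsTool: Tool {
    let name = "findDocuments"
    let description = "Searches the user's documents for a topic and returns matching excerpts."

    @Generable
    struct Arguments {
        @Guide(description: "The topic to search the documents for")
        let topic: String
    }

    func call(arguments: Arguments) async throws -> String {
        // Hypothetical: replace with your own search (database, Spotlight, embeddings, ...).
        let excerpts: [String] = []
        return excerpts.isEmpty
            ? "No matching documents were found."
            : excerpts.joined(separator: "\n---\n")
    }
}

// The session is given the tool up front; the model decides when to call it.
func ask(_ question: String) async throws -> String {
    let session = LanguageModelSession(
        tools: [FindDocumentsTool()],
        instructions: "You help the user find information in their documents."
    )
    return try await session.respond(to: question).content
}
```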
How you go about searching for the most relevant information is up to you; it depends on how your app's data is structured, and there's no one right answer here. You might first take a look at SwiftData. Or, if you want to take it a step further and perform on-device semantic search, look into vector embeddings with Core ML or the Natural Language framework (see the sketch below).
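If you do go the embeddings route, the Natural Language framework gives you sentence embeddings on device. A rough sketch, assuming your document text is already loaded and chunked into strings (that part is up to your app):

```swift
import NaturalLanguage

// Ranks candidate document texts by semantic similarity to the query.
// Smaller cosine distance means the text is closer in meaning to the query.
func rankDocuments(_ documents: [String], against query: String) -> [String] {
    guard let embedding = NLEmbedding.sentenceEmbedding(for: .english) else {
        return documents  // embedding unavailable for this language
    }
    return documents.sorted { a, b in
        embedding.distance(between: query, and: a, distanceType: .cosine) <
        embedding.distance(between: query, and: b, distanceType: .cosine)
    }
}
```

You could then pass the top-ranked excerpts into the prompt (first approach) or return them from your Tool (second approach).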
Best,
-J