Apple Intelligence


Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.

Posts under Apple Intelligence subtopic

Post · Replies · Boosts · Views · Activity

App Intents not working with placeholders and without app name
I tried this:

struct CarShortcutsProvider: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LockCarIntent(),
            phrases: ["Lock my car with \(.applicationName)", "Lock my \(\.$car) with \(.applicationName)"],
            shortTitle: LocalizedStringResource("Lock Car"),
            systemImageName: "lock.fill"
        )
        AppShortcut(
            intent: UnlockCarIntent(),
            phrases: ["Unlock my car with \(.applicationName)", "Unlock my \(\.$car) with \(.applicationName)"],
            shortTitle: LocalizedStringResource("Unlock Car"),
            systemImageName: "lock.open.fill"
        )
    }
}

However, Siri only understands "Unlock my car", not the phrase with the placeholder. Siri then asks me for the car and understands my answer, but it cannot handle both in one sentence. Is there something wrong with my code? I also tried it without applicationName first, and then it didn't work with Siri at all. Is this a general limitation of App Intents? I thought the goal was to reduce friction; if the user has to mention the app name every time, that adds friction.
Replies: 0 · Boosts: 2 · Views: 435 · Activity: Nov ’24
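For context on the placeholder question above: parameterized App Shortcut phrases are generally matched against entity values the system already knows about, and every phrase needs to include the app name. Below is a minimal sketch of an entity that could back the \(\.$car) parameter; CarEntity, CarQuery, and the sample car are hypothetical names, not code from the post.

import AppIntents

// Hypothetical sketch: the $car parameter in the phrases above can only be
// matched in a single utterance if its values are known to the system ahead
// of time via an AppEntity and its query.
struct CarEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Car"
    static var defaultQuery = CarQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct CarQuery: EntityQuery {
    func entities(for identifiers: [CarEntity.ID]) async throws -> [CarEntity] {
        // Look up cars by identifier in the app's own storage (placeholder).
        []
    }

    func suggestedEntities() async throws -> [CarEntity] {
        // Returning the user's cars here is what lets the system learn the
        // names it should recognize inside the App Shortcut phrases.
        [CarEntity(id: "1", name: "Tesla")]
    }
}

// Whenever the set of cars changes, tell the system so the phrases are
// re-donated with the current values:
// CarShortcutsProvider.updateAppShortcutParameters()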
Image Playground not available for "Designed for iPad" apps?
I'm currently trying to add support for Image Playground to our apps. It seems that it's not working in an app that is "Designed for iPad" and runs on a Mac. The modal just shows a spinner, and the following is logged to the console:

Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
GP extension could not be loaded: Extension (platform: 2) could not be found (in update)
dealloc Query controller [C32BA176-6A3E-465D-B3C5-0F8D91068B89]

ImagePlaygroundViewController.isAvailable returns true, however. In a "real" Mac Catalyst app it works; it just fails when the app is actually an iPad app. Is this a bug?
Replies: 4 · Boosts: 2 · Views: 1.4k · Activity: Jan ’25
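For reference, a minimal UIKit sketch of the availability check and presentation being discussed, assuming an iOS/iPadOS 18.1+ deployment target; presentImagePlayground and the presenting view controller are hypothetical. It does not work around the "Designed for iPad" issue above, where isAvailable reportedly returns true even though the extension fails to load.

import ImagePlayground
import UIKit

// Minimal sketch: gate the feature on isAvailable before presenting the
// sheet. Per the report above, that check may still return true in a
// "Designed for iPad" build running on a Mac even though the remote
// extension cannot be loaded.
@MainActor
func presentImagePlayground(from presenter: UIViewController) {
    guard ImagePlaygroundViewController.isAvailable else {
        // Hide or disable the entry point when the feature is unsupported.
        return
    }
    let playground = ImagePlaygroundViewController()
    presenter.present(playground, animated: true)
}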
Do Apple Intelligence Extensions Have an API?
Hi everyone. In the "Apple Intelligence & Siri" settings there's a section titled "Extensions" that specifically mentions ChatGPT. This got me curious: does Apple provide an API or SDK for developers to create custom integrations using Apple Intelligence Extensions? Or is this currently limited to the Apple/OpenAI partnership? I appreciate any insights or links to relevant documentation. Here's a screenshot of what I mean: https://imgur.com/a/4MuQkIJ
Replies: 1 · Boosts: 2 · Views: 828 · Activity: Dec ’24
Conforming an existing AppIntent to the photos domain schema
I have an image-based app with albums, except in my app albums are known as galleries. When I tried to conform my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to rename my existing gallery parameter to target in order to fit the predefined shape of this domain.

Previously, my intent was configured to display as "Open Gallery" with the description "Opens the selected Gallery" in the Shortcuts app. After conforming to the photos domain, it displays as "Open Album" with the description "Opens the Provided Album". Shortcuts now ignores my configured title and description.

My code builds, but with the following warnings:

Parameter argument title of a required Assistant schema intent parameter target should not be overridden
Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden

Is my only option to change the concept of a Gallery inside my app into an Album? I don't want to do this. Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user.

FB16283840
Replies: 2 · Boosts: 2 · Views: 752 · Activity: Jan ’25
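For context, a rough sketch of the shape described above, assuming the poster's GalleryEntity already conforms to the matching album entity schema; the struct body here is illustrative, not the poster's code. The schema fixes the parameter name (target) and supplies the user-facing title and description, which is why the custom "Open Gallery" strings are ignored.

import AppIntents

// Rough sketch under the assumptions above; GalleryEntity and the
// navigation placeholder are assumptions, not code from the original post.
@AssistantIntent(schema: .photos.openAlbum)
struct OpenGalleryIntent: OpenIntent {
    // The .photos.openAlbum schema requires this parameter to be named `target`.
    var target: GalleryEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        // Navigate to the selected gallery here (placeholder).
        return .result()
    }
}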
Python 3.13
Hello, are there any plans to compile a Python 3.13 version of tensorflow-metal? I just got my new Mac mini, and the version of Python automatically installed by brew is Python 3.13. If I were in a hurry, I could manage to get Python 3.12 installed and use the corresponding tensorflow-metal version, but I'm not in a hurry.

Many thanks,
Alan
Replies: 3 · Boosts: 2 · Views: 875 · Activity: 17h
Missing parameter prompt for search Assistant Intents
Issue: When triggering an App Intent that uses Assistant schemas from Apple Intelligence (voice or text), the app opens without prompting for search criteria.

How to reproduce: This can be reproduced with the sample provided by Apple here: https://developer.apple.com/documentation/appintents/making-your-app-s-functionality-available-to-siri

1. Download the sample code.
2. Build and run with Xcode 16.1 beta 3.
3. Target an iPhone 15 Pro Max on iOS 18.1 beta 7.
4. Trigger Apple Intelligence.
5. Enter the prompt: "Search AssistantSchemasExample".

Expected behaviour: Apple Intelligence should prompt the user for the criteria and provide it to the app so that the experience is seamless for the end user. Otherwise, Assistant Intents are nothing more than deep links to search screens.

Notes: The example uses the @AssistantIntent(schema: .photos.search) intent, and I've found the issue is also present in other search intents: @AssistantIntent(schema: .system.search) and @AssistantIntent(schema: .browser.search).

Questions: Has anyone managed to get the prompt to appear? Will this only function on iOS 18.2?
Replies: 3 · Boosts: 1 · Views: 538 · Activity: Oct ’24
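For comparison, a minimal sketch of the .system.search shape mentioned above; SearchItemsIntent and its body are illustrative assumptions rather than the Apple sample code. The criteria parameter is where the search term should arrive, and the behaviour reported above is that Apple Intelligence opens the app without ever prompting for or filling it.

import AppIntents

// Minimal sketch of a .system.search assistant intent; the type name and
// the placeholder search handling are assumptions.
@AssistantIntent(schema: .system.search)
struct SearchItemsIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]

    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Open the in-app search UI for the requested term (placeholder).
        print("Searching for \(criteria.term)")
        return .result()
    }
}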
"failed to processImage" in videoProcessor
Hello, I’m working on a program that analyzes video files frame by frame to detect human poses in each frame. However, while reading observations from the stream, the analysis frequently stops with the following error:

[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[85]: code -18
[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[178]: code -18

The error was caught and printed using a do-catch block, and here is the output:

Error Domain=NSOSStatusErrorDomain Code=-18 "Error: failed to processImage" UserInfo={NSLocalizedDescription=Error: failed to processImage}

While the do-catch block prevents the app from crashing, the frames following the error cannot be analyzed. I’m hoping to understand the cause of this error, or to find a way to skip the problematic frames and continue analyzing the subsequent ones. My development environment is Xcode Version 16.0 (16A242d) and iOS 18.0. Thank you for your help. (Attaching my code below.)

let videoProcessor = VideoProcessor(videoURL)
let bodyPoseRequest = DetectHumanBodyPoseRequest()
let asset = AVURLAsset(url: videoURL)
let videoTrack = try await asset.loadTracks(withMediaType: .video).first
let bodyPoseStream = try await videoProcessor.addRequest(bodyPoseRequest)
videoProcessor.startAnalysis()

do {
    for try await observations in bodyPoseStream {
        guard let observation = observations.first else { continue }
        if let timeRange = observation.timeRange {
            /// do something...
        }
    }
} catch {
    print("\(error.localizedDescription)")
}
Replies: 0 · Boosts: 1 · Views: 385 · Activity: Oct ’24
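One possible direction for the "skip the problematic frames" part of the question above: iterate the stream manually and catch per frame, instead of letting a single throw end the for-try-await loop. This is only a sketch that mirrors the poster's setup; whether the underlying stream keeps producing frames after an error depends on its implementation, so the analysis may still stop early.

import AVFoundation
import Vision

// Sketch only: mirrors the poster's setup and swaps the for-try-await loop
// for manual iteration with a per-frame do-catch. If the stream itself
// finishes after an error, next() returns nil and analysis still ends.
func analyzePoses(in videoURL: URL) async throws {
    let videoProcessor = VideoProcessor(videoURL)
    let bodyPoseRequest = DetectHumanBodyPoseRequest()
    let bodyPoseStream = try await videoProcessor.addRequest(bodyPoseRequest)
    videoProcessor.startAnalysis()

    var iterator = bodyPoseStream.makeAsyncIterator()
    while true {
        do {
            guard let observations = try await iterator.next() else { break }
            guard let observation = observations.first,
                  let timeRange = observation.timeRange else { continue }
            print("Pose observed in \(timeRange)")
        } catch {
            // Log the failing frame and move on instead of aborting the loop.
            print("Skipping frame: \(error.localizedDescription)")
        }
    }
}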
Will Apple Intelligence gather feedback from users once out of beta?
I had assumed that Apple Intelligence features would not allow users to give a thumbs up or down once they are released later this year. But I recently stumbled upon new marketing material for the iPad mini (A17 Pro), and an embedded video on the marketing page shows the ability to give a thumbs up or down on an image generated with Image Wand: https://www.apple.com/ipad-mini/

Was my assumption wrong that non-beta users wouldn't be able to submit feedback on the model’s outputs, or was Apple perhaps using a screen recording of an unreleased beta and forgot to disable the feedback UI? I assume it can’t be the latter.
Replies: 1 · Boosts: 1 · Views: 430 · Activity: Oct ’24
Apple Intelligence is not available on an iPhone 15 Pro bought from China.
My iPhone 15 Pro is from Hong Kong (China). I am outside of China, and Asia in general; I have never been to China myself, and the iPhone was activated in another country, which is not in the EU. My iPhone's language, Siri, and region settings are set to US English, and it is updated to iOS 18.1 RC. But Apple Intelligence doesn't show up in the Siri settings.
Replies: 1 · Boosts: 1 · Views: 736 · Activity: Oct ’24
Apple Intelligence / Mac Mail: Summaries Unavailable
At one point, Apple added a Summarize function to Mac Mail that worked. Now when I click Summarize, I get: "Summaries Unavailable. Mail summarization is unavailable at this time. Try again later." I've rebooted, stopped and restarted Apple Intelligence, waited a day to see if it was syncing things up, etc. I'm running the latest version of macOS (Version 15.1 (24B82)). Any ideas?
Replies: 2 · Boosts: 1 · Views: 1.2k · Activity: Oct ’24
Apple, please respond with any information (Image Playground access)
Apple, I speak for the majority when I say that we are frustrated, not so much because we are unable to access the features, test them, and submit feedback, but because you are not communicating. If you can, please let us know right here whether this is a server bug or an intentional strategy to roll out the generative features to a small, limited number of people on iOS 18.2 DB1. Thank you!
Replies: 1 · Boosts: 1 · Views: 442 · Activity: Oct ’24