Is anyone else seeing their apps crash on iOS 17.4/macOS 14.4 and newer when building a project that simply includes the iOS 18 @AssistantIntent macro?
The beta 4 releases still have this problem, and I have seen nothing about it in the beta release notes. This is the crash message shown in the console when trying to run on 17.4, 17.5, 17.5.1, etc.:
dyld[21935]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP Referenced from: <F7A1FEF0-F3B0-379C-A914-D1FB0BA7C693> /Users/jonathan/Library/Developer/CoreSimulator/Devices/CA308F47-BCA8-4429-8599-1BB1CCEAB5B6/data/Containers/Bundle/Application/D7DC8E16-90DB-406A-A521-20F18326E4A7/IntentDemo.app/IntentDemo.debug.dylib Expected in: <88E18E38-24EC-364E-94A1-E7922AD247AF> /Library/Developer/CoreSimulator/Volumes/iOS_21F79/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 17.5.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/AppIntents.framework/AppIntents
Obviously, the new Apple Intelligence AssistantIntents only work on the 2024 OS releases. However, even when these new App Intents are marked with @available(iOS 18, macOS 15, *), the app crashes on any earlier OS version. But it runs just fine on iOS 18 and macOS 15...
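For context, the setup looks roughly like this (a minimal sketch with a made-up intent name, not the actual code from the sample project linked below; the required members come from the schema):
@available(iOS 18.0, macOS 15.0, *)
@AssistantIntent(schema: .system.search)
struct DemoSearchIntent: ShowInAppSearchResultsIntent {
    // The schema dictates the required members; these are placeholders.
    static let searchScopes: [StringSearchScope] = [.general]
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hand criteria.term to the app's search UI here.
        return .result()
    }
}
Even with the whole type gated behind @available like this, the app still crashes at launch on pre-18 runtimes.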
I would love for this to just be something I did wrong, but I don't think it is… Here is the sample project: https://github.com/JTostitos/FB14323923
Maybe it's a compiler issue that's failing to strip out the macro when building for older OSes, or an Xcode issue - I have no idea. I would just like to know why it's not working and how to resolve it.
Thanks in advance for anyone's help.
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
I wanted to join Apple Intelligence after I updated my iPhone 15 Pro to the iOS 18.1 beta, but it still shows that I'm on the waitlist. It has been almost a day now. Why? Is that normal?
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Xcode Version 16.0 (16A242d)
iOS 18 - Swift
There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts for a string parameter's requestValueDialog, the string is passed if the user makes a statement. If the user's reply is a question, however, the string is not sent to the AppIntent; instead, Siri attempts to answer the question itself.
Example Code:
struct MyAppNameShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskQuestionIntent(),
            phrases: [
                "Ask \(.applicationName) a question",
            ]
        )
    }
}

struct AskQuestionIntent: AppIntent {
    static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Search for \(\.$query)")
    }

    @Dependency
    private var apiClient: MockApiClient

    @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
    var query: String

    // perform is not called if the user asks a question such as "What color is the moon?"
    // in response to requestValueDialog.
    // On iOS 17, the same string is passed through.
    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("Query is: \(query)")
        let queryResult = try await apiClient.askQuery(queryString: query)
        let dialog = IntentDialog(
            full: .init(stringLiteral: queryResult.answer),
            supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
        )
        let view = SiriAnswerView(queryResult: queryResult)
        return .result(dialog: dialog, view: view)
    }
}
Given the above mock code:
iOS 17:
Hey Siri
Ask (AppName) a question
Siri responds "What would you like to ask?"
Say "What color is the moon?"
The string "What color is the moon?" is passed to the AppIntent
iOS 18:
Hey Siri
Ask (AppName) a question
Siri responds "What would you like to ask?"
Say "What color is the moon?"
Siri answers the question "What color is the moon?" itself
Follow the steps above again and instead reply "Moon"
"Moon" is passed to the AppIntent
Basically, on iOS 18 any interrogative string parameter seems to be intercepted and handled by Siri proper rather than passed to the provided AppIntent.
I'm trying to disable Writing Tools for a specific TextField using .writingToolsBehavior(.disabled), but when running the app on my iPhone 16 Pro with Apple Intelligence enabled, I can still use Writing Tools on the text box. I also see no difference with .writingToolsBehavior(.limited).
Is there something I'm doing wrong or is this a bug?
Sample code below:
import SwiftUI

struct ContentView: View {
    @State var text = ""

    var body: some View {
        VStack {
            TextField("Enter Text", text: $text)
                .writingToolsBehavior(.disabled)
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
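One way to narrow down whether this is SwiftUI-specific might be to compare against the UIKit property, assuming UITextView's writingToolsBehavior on iOS 18. This is a rough sketch for comparison only, not a confirmed workaround, and it only pushes text one way for brevity:
import SwiftUI
import UIKit

struct LegacyTextView: UIViewRepresentable {
    @Binding var text: String

    func makeUIView(context: Context) -> UITextView {
        let view = UITextView()
        view.writingToolsBehavior = .none   // UIKit counterpart of .writingToolsBehavior(.disabled)
        return view
    }

    func updateUIView(_ uiView: UITextView, context: Context) {
        // One-way update only; a Coordinator would be needed to read edits back.
        uiView.text = text
    }
}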
What are the possible KPI requirements set by Apple Intelligence for cellular networks, e.g. regarding latency, throughput, or jitter?
What is the expected effect on iPhone energy consumption?
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: Performance, Battery Life, 5G, Apple Intelligence
I don't see the Apple Intelligence tab, even with iOS 18.1, language and region set to United States, and Siri in English (US)… what can I do?
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Why does Apple sell the iPhone 16 with AI that won't work in Europe? You don't sell a car without an engine. Wouldn't it have been better to wait and bring it to Europe later?
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Why is it taking so long to download Apple Intelligence on my iPhone 15 Pro Max?
How long should the download take?
I'm on 5G, and after four weeks it is still downloading.
I have Apple Intelligence working fine with all the features on my iPhone 16 Pro Max, except for the summary feature and email summarization: they don't work unless I turn a VPN on. Why is that? I am not living in the EU, and all the other Apple Intelligence features work; only this one won't unless I have my VPN on.
Can anyone help me with this?
Every time I click "Join the waitlist," there is no reaction.
My phone just redirects me back to this page.
Can vector embeddings be used in a SwiftData model?
If so, are there resources available to learn more about it, or at least a guide on how to make it work?
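There is no built-in vector type or vector index in SwiftData that I'm aware of, but an embedding can be stored like any other Codable value. A minimal sketch, assuming NLEmbedding from the NaturalLanguage framework as the vector source (the model and property names are just illustrative):
import SwiftData
import NaturalLanguage

@Model
final class Note {
    var text: String
    var embedding: [Double]   // persisted by SwiftData as an ordinary Codable attribute

    init(text: String) {
        self.text = text
        // Sentence embedding is optional; fall back to an empty vector if unavailable.
        let model = NLEmbedding.sentenceEmbedding(for: .english)
        self.embedding = model?.vector(for: text) ?? []
    }
}
Similarity search would then have to happen in memory (e.g., cosine distance over fetched models), since SwiftData predicates cannot index or compare vectors.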
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: iOS, Swift, SwiftData, Apple Intelligence
I'm excited about the development possibilities presented by Apple Intelligence and have begun imagining retrieval-augmented generation (RAG) use cases. Writing Tools suggests that this is possible, but I have not seen any direct statements from Apple regarding the use of Apple foundation models (AFMs) for RAG applications. Have any references to APIs or sample code for RAG applications been published?
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
With Apple Intelligence enabled I am seeing a 10–15 second delay in push notifications.
This is causing issues with my alarm system, doorbells, and 2FA requests.
I'm using an iPhone 16 Pro.
I first noticed this after installing the 18.1 developer beta 5 and enabling Apple Intelligence.
It continued when beta 6 was released.
I filed FB15389900.
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: User Notifications, Notification Center
Issue
When triggering an App Intent that uses assistant schemas from Apple Intelligence (voice or text), the app opens without prompting for search criteria.
How to repeat
This can be repeated in the example provided by Apple here: https://developer.apple.com/documentation/appintents/making-your-app-s-functionality-available-to-siri
Download the sample code
Build and run on Xcode 16.1 beta 3
Target iPhone 15 Pro Max on iOS 18.1 beta 7
Trigger Apple Intelligence
Enter prompt: "Search AssistantSchemasExample"
Expected behaviour
Apple Intelligence should prompt the user for the criteria and provide it to the app so that the experience is seamless for the end user. Otherwise, Assistant Intents are nothing more than deep links to search screens.
Notes
The example uses the @AssistantIntent(schema: .photos.search) intent.
And I've found the issue is also present in other search intents:
@AssistantIntent(schema: .system.search)
@AssistantIntent(schema: .browser.search)
Questions
Has anyone managed to get the prompt to appear?
Will this only function on iOS 18.2?
Hello,
I’m working on a program that analyzes video files frame by frame to detect human poses in each frame. However, during the process of reading observations from the stream, the analysis frequently stops with the following error:
[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[85]: code -18
[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[178]: code -18
The error was caught and printed using a do-catch block, and here is the output:
Error Domain=NSOSStatusErrorDomain Code=-18 "Error: failed to processImage" UserInfo={NSLocalizedDescription=Error: failed to processImage}
While the do-catch block helps prevent the app from crashing, the frames following the error cannot be analyzed.
I’m hoping to understand the cause of this error, or find a way to skip the problematic frames and continue analyzing the subsequent ones.
My development environment is Xcode Version 16.0 (16A242d) and iOS 18.0.
Thank you for your help. (Attaching my code below.)
import AVFoundation
import Vision

// Run from an async throwing context.
let videoProcessor = VideoProcessor(videoURL)
let bodyPoseRequest = DetectHumanBodyPoseRequest()
let asset = AVURLAsset(url: videoURL)
let videoTrack = try await asset.loadTracks(withMediaType: .video).first
let bodyPoseStream = try await videoProcessor.addRequest(bodyPoseRequest)
videoProcessor.startAnalysis()

do {
    for try await observations in bodyPoseStream {
        guard let observation = observations.first else { continue }
        if let timeRange = observation.timeRange {
            /// do something...
        }
    }
} catch {
    print("\(error.localizedDescription)")
}
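One fallback I am considering is per-frame analysis, where each frame gets its own do/catch so a failing frame can be skipped rather than ending the whole stream. This is only a sketch, assuming the iOS 18 Vision API's DetectHumanBodyPoseRequest.perform(on:) and AVAssetImageGenerator's async image(at:); the function name and the choice of frame times are made up for illustration:
// Uses the same AVFoundation / Vision imports as above.
func poses(in videoURL: URL, at times: [CMTime]) async -> [[HumanBodyPoseObservation]] {
    let generator = AVAssetImageGenerator(asset: AVURLAsset(url: videoURL))
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    let request = DetectHumanBodyPoseRequest()
    var results: [[HumanBodyPoseObservation]] = []
    for time in times {
        do {
            let (image, _) = try await generator.image(at: time)
            results.append(try await request.perform(on: image))
        } catch {
            // Skip frames that fail instead of aborting the whole analysis.
            print("Skipping frame at \(time.seconds)s: \(error.localizedDescription)")
            results.append([])
        }
    }
    return results
}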
I had assumed that Apple Intelligence features would not allow users to give thumbs up or down when they are released later this year. But I recently stumbled upon new marketing material for the iPad Mini (A17 Pro), and in an embedded video on the marketing page, it shows the ability to give a thumbs up and down on an Image generated with Image Wand.
https://www.apple.com/ipad-mini/
Was my assumption wrong that non-beta users would not be able to submit feedback on the model's outputs, or did Apple perhaps take a screen recording of an unreleased beta and forget to disable the feedback UI? I assume it can't be the latter.
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Hello everyone,
I hope you’re all doing well. I’m not a developer, but I have an idea for an iOS app that I’d love to get your thoughts on. I wanted to share it here to gather feedback from this knowledgeable community and to learn from your expertise.
Idea Overview: Real-Time AI Running Coach for iOS
The concept is an iOS application that provides personalized, real-time running coaching by leveraging on-device data sources and Apple’s latest technologies. The app aims to offer an adaptive and motivating running experience while ensuring user privacy through on-device processing.
Key Features:
• Personalized Coaching:
  • Utilize real-time biometric data and personal insights to deliver AI-driven coaching tailored to the user's mental and physical state.
  • Analyze health metrics, activity data, mood check-ins, and more to provide context-based motivational feedback.
• Privacy First:
  • All data processing occurs on-device using Apple's frameworks like Core ML, ensuring no personal data leaves the device.
• Adaptive Motivation:
  • Implement Natural Language Processing to analyze user inputs like journal entries or mood check-ins.
  • Generate personalized coaching cues based on historical performance and mood trends.
• Performance Enhancement:
  • Offer dynamic adjustments to pace, route, and strategy in real time to help improve running performance.
  • Seamless integration with Apple Watch for real-time data collection and haptic feedback.
Technologies and Frameworks Involved:
• HealthKit: Access health metrics such as heart rate, distance run, VO₂ max, sleep patterns, etc.
• Core ML: On-device machine learning for real-time data analysis without latency.
• Natural Language Processing: Analyze personal inputs for better coaching personalization.
• Core Motion & Core Location: Track motion data and location services for runs.
• AVFoundation & Speech: Provide real-time voice feedback and coaching cues.
• SiriKit Integration: Allow users to initiate workouts and receive updates via Siri.
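As a concrete taste of what the HealthKit piece in the list above involves, here is a minimal sketch of reading the most recent heart-rate sample. It is illustrative only; the function name is made up, and a real coaching app would use live workout queries rather than a one-shot fetch:
import HealthKit

let healthStore = HKHealthStore()
let heartRateType = HKQuantityType(.heartRate)

func latestHeartRateBPM() async throws -> Double? {
    // Ask permission to read heart rate; no write access is requested.
    try await healthStore.requestAuthorization(toShare: [], read: [heartRateType])
    return try await withCheckedThrowingContinuation { continuation in
        let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
        let query = HKSampleQuery(sampleType: heartRateType,
                                  predicate: nil,
                                  limit: 1,
                                  sortDescriptors: [newestFirst]) { _, samples, error in
            if let error {
                continuation.resume(throwing: error)
                return
            }
            let bpm = (samples?.first as? HKQuantitySample)?
                .quantity
                .doubleValue(for: HKUnit.count().unitDivided(by: .minute()))
            continuation.resume(returning: bpm)
        }
        healthStore.execute(query)
    }
}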
Target Audience:
• Runners of all levels seeking personalized coaching that adapts to their mental and physical states.
• Users who prioritize privacy and want AI-driven insights without their data leaving the device.
• Tech-savvy fitness enthusiasts who use iOS devices and Apple wearables.
Questions for the Community:
1. Feasibility: Is this idea technically achievable using current iOS frameworks and technologies?
2. Data Access: Are there limitations in accessing and processing the necessary data on-device, especially regarding privacy and permissions?
3. Potential Challenges: What hurdles might developers face in creating such an app, and how could they be addressed?
4. Advice: As someone without a technical background, what steps would you recommend I take to move this idea forward?
I truly appreciate any feedback or insights you can provide. I’m excited about the potential of this idea but also aware there may be complexities I’m not considering. Thank you for taking the time to read this!
Best regards,
Paul
My iPhone 15 Pro is from Hong Kong (China). I am outside of China and Asia in general; I have never been to China myself, and the iPhone was activated in another country, which is not in the EU either.
My iPhone's language, Siri, and region settings are set to US English, and I have updated to the iOS 18.1 RC, but Apple Intelligence doesn't show up in the Siri settings.
I downloaded the RC beta on my MacBook and joined the waitlist. So far I haven't received any message or notification that I'm in, but I have a question. It may be a silly one, but I just want confirmation: since I already joined Apple Intelligence through my MacBook during the beta, will I have it on my iPhone whenever the official version is released? Just curious about it.
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
I'm using an iPhone 15 Pro Max and running the iOS 18.2 developer beta released today. I've already been an Apple Intelligence user and have now been able to link it with my paid ChatGPT account.
However, I'm searching for the image features everyone seems to be posting about and cannot find them anywhere.
I'm apparently supposed to sign up for beta access to the image features through some new Apple app that was supposedly included in this build, which I cannot find.
What gives?!