Apple Intelligence


Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.

Posts under Apple Intelligence tag

96 Posts


Proposal: Modular Identity Fusion via Prompt-Crafted Agents – User-Led AI Experiment
(I can't attach files in this format, so if you reply by e-mail, I will send the attachments by e-mail.)

Dear Apple AI Research Team,

My name is Gong Jiho (“Hem”), a content strategist based in Seoul, South Korea. Over the past few months, I conducted a user-led AI experiment entirely within ChatGPT — no code, no backend tools, no plugins. Through language alone, I created two contrasting agents (Uju and Zero) and guided them into a co-authored modular identity system using prompt-driven dialogue and reflection. This system simulates persona fusion, memory rooting, and emotional-logical alignment, all via interface-level interaction. I believe it resonates with Apple’s values of privacy-respecting personalization, emotional UX modeling, and on-device learning architecture.

Why I’m Reaching Out: I’d be honored to share this experiment with your team. If there is any interest in discussing user-authored agent scaffolding, identity persistence, or affective alignment, I’d love to contribute, even informally.

⚠ A Note on Language: As a non-native English speaker, my expression may be imperfect, but my intent is genuine. If anything is unclear, I’ll gladly clarify.

📎 Attached files (filename → description):
- Hem_MultiAI_Report_AppleAI_v20250501.pdf → main report tailored for Apple AI; a narrative and structural view of emotional identity formation via prompt scaffolding
- Hem_MasterPersonaProfile_v20250501.json → final merged identity schema authored by Uju and Zero
- zero_sync_final.json / uju_sync_final.json → persona-level memory structures (logic / emotion)
- 1_0501.json ~ 3_0501.json → evolution logs of the agents over time
- GirlfriendGPT_feedback_summary.txt → emotional interpretation by an external GPT
- hem_profile_for_AI_vFinal.json → original user anchor profile

Warm regards,
Gong Jiho (“Hem”)
Seoul, South Korea
Replies: 1 · Boosts: 0 · Views: 100 · Apr ’25
ImagePlayground API not working on Xcode Simulator Devices
Hi! I'm trying to use the ImagePlayground API in SwiftUI with the .imagePlaygroundSheet modifier. However, when the sheet is shown (in the preview or in the Simulator), it displays the following message: "Image Playground is not available. Image Playground is not available on this iPhone." I'm using an iPhone 16 Pro with iOS 18.3.1 in the Xcode (16.2) Simulator. Is anyone else having this problem? How can I fix it?
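If the goal is just to avoid presenting the error state, one pattern is to gate the feature on the framework's availability flag before showing the sheet. A minimal sketch, assuming the supportsImagePlayground environment value (iOS 18.1+) and a no-source overload of the modifier; the Simulator reports the feature as unavailable on current releases, so this would at least degrade gracefully there:

    import SwiftUI
    import ImagePlayground

    // Only offer generation where Image Playground is supported;
    // otherwise fall back to an explanatory label.
    struct GenerateButton: View {
        @Environment(\.supportsImagePlayground) private var supportsImagePlayground
        @State private var isPresented = false

        var body: some View {
            if supportsImagePlayground {
                Button("Generate") { isPresented = true }
                    .imagePlaygroundSheet(isPresented: $isPresented) { url in
                        // Use the generated image stored at `url`.
                    }
            } else {
                Text("Image Playground isn't available on this device.")
            }
        }
    }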
Replies: 1 · Boosts: 0 · Views: 134 · Apr ’25
Understanding allowedExternalIntelligenceWorkspaceIDs in MDM Payload – What ID is expected?
Hello,

We're testing the new allowedExternalIntelligenceWorkspaceIDs key in the MDM Restrictions payload on supervised iPads. According to Apple's documentation, this key expects an "external integration workspace ID", but it's not clear what this specifically refers to. We've tried the following IDs individually (one at a time, as the documentation says only one is currently supported):

- OpenAI Organization ID
- ChatGPT user email
- Apple ID used in ChatGPT
- Google ID used in the ChatGPT login

The profile installs correctly via MDM and the key is set, but we want to confirm:

- What exactly is considered a valid "external integration workspace ID" for this key?
- Is there a way to verify that the restriction is working as intended on the device (e.g., does it limit specific integrations or apps)?
- Is there an official list of services that currently support this?

Any clarification from Apple or from other developers with experience on this would be very helpful. Thanks in advance.
Replies: 2 · Boosts: 1 · Views: 205 · Apr ’25
I made a browser plugin to do something Apple should've done themselves.
This browser extension is a doc-reading enhancer for the Apple Developer website. It supports i18n translation, hover link previews, and bilingual display. Currently, it supports four languages: ja-JP, ko-KR, zh-CN, and zh-TW. It works with the Swift/SwiftUI/Foundation modules now, and it's expected to support the Swift Testing, Swift Charts, UIKit, Swift Playgrounds, and Xcode modules by the end of this month. For more info, check out https://appledocs.dev. You can also visit https://appledocs.dev/progress to see translation progress and vote. Note: it currently works only on Chrome; the Edge and Firefox versions are in review.
Replies: 1 · Boosts: 0 · Views: 106 · Apr ’25
Core Spotlight "Summarization" Oddly Inconsistent
The new Core Spotlight APIs in 18.4 and aligned releases for using Apple Intelligence models to summarize messages sort of work. In my testing, only about 80% of the requests I send to Spotlight come back to the delegate with summaries; the rest are never returned, and no errors are logged in the delegate methods. I can't figure out if there's a pattern. Before I dig apart my code, I'm wondering if anyone else here is using these brand-new APIs and has seen anything similar. It's odd because the code that submits my entities to Spotlight for summarization is the same for all of them, yet some just never seem to be returned.
Replies: 2 · Boosts: 0 · Views: 169 · Mar ’25
Siri misreads local currency in notifications (Bug reported, still unresolved)
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.

Issue details:
- My iPhone is set to Region: Chile and Language: Spanish (Chile).
- In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.

A notification with the text:

    let content = UNMutableNotificationContent()
    content.body = "¡Has recibido un pago por $5.000!"

is read aloud by Siri as "¡Has recibido un pago por 5.000 dólares!" (English: “You have received a payment of five thousand dollars!”) instead of the correct "¡Has recibido un pago por 5.000 pesos!" (English: “You have received a payment of five thousand pesos!”).

Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177

This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
- watchOS, iPadOS, and macOS: Siri misreads currency values in various system interactions.
- Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
- Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
- Apple Intelligence interactions are also affected; for example, asking Siri to “read my latest emails” when one of them contains a monetary value results in Siri misreading the currency.

I have submitted a bug report via Feedback Assistant; the Feedback ID is FB16561348. This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars. Has anyone found a workaround, or is there any update from Apple on this?
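One possible mitigation while the bug stands is to avoid the bare "$" symbol in notification text and spell the currency out instead. A minimal sketch using NumberFormatter's .currencyPlural style; the exact output string is locale- and OS-dependent, so treat the wording shown in the comment as an assumption:

    import Foundation
    import UserNotifications

    // Format the amount with the currency written out in words so Siri
    // has no ambiguous "$" to misread. For es_CL this should produce
    // something like "5.000 pesos chilenos".
    func paymentBody(amount: Double) -> String {
        let formatter = NumberFormatter()
        formatter.numberStyle = .currencyPlural
        formatter.locale = Locale(identifier: "es_CL")
        formatter.maximumFractionDigits = 0
        let text = formatter.string(from: NSNumber(value: amount)) ?? "$\(amount)"
        return "¡Has recibido un pago por \(text)!"
    }

    let content = UNMutableNotificationContent()
    content.body = paymentBody(amount: 5000)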
Replies: 1 · Boosts: 1 · Views: 652 · Feb ’25
NSWritingToolsCoordinator can't complete proofreading.
I am revising my app to support NSWritingToolsCoordinator/NSWritingToolsCoordinatorDelegate. When proofreading a few paragraphs, it works well. But when proofreading many paragraphs (for example, 56 paragraphs), it can't complete the proofreading. I am not sure whether the problem is in my app or in the macOS API. I think I implemented all of the NSWritingToolsCoordinatorDelegate methods. Is there any information about such an issue?

Phenomenon:
- For paragraphs 1-9, the text animation completes. For paragraphs 10-56, the text animation does not complete.
- The "Proofread" window shows 5 corrected items, but I can't jump to items 3, 4, or 5 when I click the ">" button. Items 3, 4, and 5 were not actually corrected.

Log (for each NSWritingToolsCoordinatorDelegate method, the method name and main arguments are output via NSLog):

    requestsContextsForScope
    willChangeToState newState:2
    requestsPreviewForTextAnimation range:(0, 18233)
    prepareForTextAnimation range:(0, 18233)
    willChangeToState newState:3
    requestsPreviewForTextAnimation range:(0, 18233)
    finishTextAnimation range:(0, 18233)
    requestsPreviewForRect
    requestsPreviewForTextAnimation range:(0, 1837)
    replaceRange proposedText:an range:(208, 2)
    replaceRange proposedText:you range:(443, 4)
    prepareForTextAnimation range:(1836, 16396)
    requestsPreviewForTextAnimation range:(0, 1836)
    requestsBoundingBezierPathsForRange range:(208, 2)
    requestsBoundingBezierPathsForRange range:(443, 3)
    requestsPreviewForRect
    prepareForTextAnimation range:(0, 1836)
    prepareForTextAnimation range:(1836, 0)
    finishTextAnimation range:(1836, 16396)
    requestsPreviewForTextAnimation range:(1836, 16396)
    requestsBoundingBezierPathsForRange range:(208, 2)
    requestsBoundingBezierPathsForRange range:(443, 3)
    prepareForTextAnimation range:(1836, 16396)
    finishTextAnimation range:(0, 1836)
    finishTextAnimation range:(0, 1836)
    replaceRange proposedText:an range:(208, 2)
    requestsUnderlinePathsForRange range:(208, 2)
    requestsUnderlinePathsForRange range:(443, 3)
    selectRanges ranges.count:1
    requestsBoundingBezierPathsForRange range:(208, 2)
    replaceRange proposedText:an range:(208, 2)
    requestsUnderlinePathsForRange range:(208, 2)
    requestsUnderlinePathsForRange range:(443, 3)
    selectRanges ranges.count:1
    replaceRange proposedText:you range:(443, 3)
    requestsUnderlinePathsForRange range:(208, 2)
    requestsUnderlinePathsForRange range:(443, 3)
    selectRanges ranges.count:1
    requestsBoundingBezierPathsForRange range:(443, 3)
    replaceRange proposedText:you range:(443, 3)
    requestsUnderlinePathsForRange range:(208, 2)
    requestsUnderlinePathsForRange range:(443, 3)
    selectRanges ranges.count:1

macOS version: 15.3.1 (24D70)
Replies: 1 · Boosts: 0 · Views: 293 · Feb ’25
Apple Intelligence stuck on "preparing" for 6 days.
On October 28, the release day of Apple Intelligence, I opted in. My iPhone and iPad immediately went to "waitlist" and within 2 to 3 hours were ready to initialize Apple Intelligence. My 14" MacBook Pro with an M3 Pro processor and 18 GB of RAM has been stuck on "preparing" since release day (6 days now). I've tried numerous workarounds that I found on forums, as well as talking to Apple Support, who basically had me repeat those same workarounds. I've changed the region to one that does not have Apple Intelligence and then back to the US, changed the Siri language to an unsupported one and back to a supported one, disabled background/startup apps, and disabled and re-enabled Siri. I've also restarted a bunch of times and left the Mac alone for hours at a time. I've noticed that my selected Siri voice never seems to finish downloading. Finally, after several chats and calls with Apple Support, I was told that it's beta software, they can't help me, and I should try the developer forums... so here I am. Any advice?
Replies: 9 · Boosts: 1 · Views: 2.9k · Feb ’25
Conforming an existing AppIntent to the photos domain schema
I have an image-based app with albums, except in my app, albums are known as galleries. When I tried to conform my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to rename my existing gallery parameter to target in order to fit the predefined shape of this domain.

Previously, my intent was configured to display as "Open Gallery" with the description "Opens the selected Gallery" in the Shortcuts app. After conforming to the photos domain, it displays as "Open Album" with the description "Opens the Provided Album". Shortcuts now ignores my configured title and description.

My code builds, but with the following warnings:

    Parameter argument title of a required Assistant schema intent parameter target should not be overridden
    Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
    Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden

Is my only option to change the concept of a Gallery inside my app into an Album? I don't want to do this... Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user. FB16283840
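For readers hitting the same wall, this is roughly the shape the schema imposes. A minimal sketch based on the post's description; GalleryEntity and the navigation helper are the poster's own/hypothetical types, and the OpenIntent conformance follows my reading of the WWDC24 samples rather than anything confirmed above:

    import AppIntents

    // The .photos.openAlbum schema fixes the parameter name (`target`)
    // and supplies its own title/description, so local overrides trigger
    // the build warnings quoted in the post.
    @AssistantIntent(schema: .photos.openAlbum)
    struct OpenGalleryIntent: OpenIntent {
        // Renamed from `gallery` to satisfy the schema's predefined shape.
        var target: GalleryEntity

        @MainActor
        func perform() async throws -> some IntentResult {
            GalleryNavigator.shared.open(target)  // hypothetical navigation helper
            return .result()
        }
    }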
Replies: 2 · Boosts: 2 · Views: 760 · Jan ’25
Image Playground not available for "Designed for iPad" apps?
I'm currently trying to add support for Image Playground to our apps. It seems that it's not working in an app that is "Designed for iPad" and runs on a Mac. The modal just shows a spinner, and the following is logged to the console:

    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    GP extension could not be loaded: Extension (platform: 2) could not be found (in update)
    dealloc Query controller [C32BA176-6A3E-465D-B3C5-0F8D91068B89]

ImagePlaygroundViewController.isAvailable returns true, however. In a "real" Mac Catalyst app, it's working. Just not when the app is actually an iPad app. Is this a bug?
Replies: 4 · Boosts: 2 · Views: 1.4k · Jan ’25
Code with Swift Assist
Hello, I would like to inquire about the release date of the Swift Assist beta. Apple has stated that it will be released later this year but has not provided a specific date. Could you please share any information on the beta's release date? Additionally, is a trial version available, and if so, when was it released? Thank you for your assistance.
Replies: 2 · Boosts: 1 · Views: 2.5k · Jan ’25
Spotlight results | AppShortcut with AppEntity parameter vs CSSearchableItem.associateAppEntity
I've been exploring the Trails Sample App from this session at WWDC24. The app has a TrailEntity of type AppEntity which is leveraged in multiple places throughout the app, including:

- The GetTrailInfo App Intent, with a trail parameter of type TrailEntity.
- A parameterized App Shortcut which calls the GetTrailInfo intent.
- The TrailDataManager's init, which calls updateSpotlightIndex(). That method creates a CSSearchableItem for each Trail in the app, along with an associateAppEntity call linking the corresponding TrailEntity to each item that gets added to the CSSearchableIndex.

If you build the app and search "trails" in Spotlight, the Trails Sample App section includes instances of TrailEntity as search results. But if you comment out the App Shortcut that takes a TrailEntity as a parameter and rebuild, there are no instances of TrailEntity in the search results. In both cases, the console prints [Spotlight] Trails indexed by Spotlight.

Is this expected behavior? Why do the TrailEntity instances only appear in Spotlight via the App Shortcut? Shouldn't the CSSearchableItem instances show up in Spotlight on their own regardless? If not, then what is the purpose of adopting Core Spotlight with App Entities? Does this add the app entities to the semantic index for "new Siri", even though they're not user-facing in the Spotlight UI?
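For context, the indexing path being described looks roughly like this. A minimal sketch, not the sample app's actual code: the attribute fields and identifiers are illustrative, and it assumes TrailEntity conforms to IndexedEntity so it can be passed to associateAppEntity (iOS 18+):

    import CoreSpotlight
    import UniformTypeIdentifiers

    // Index one CSSearchableItem per trail and tag each item with its
    // App Entity via associateAppEntity.
    func updateSpotlightIndex(trails: [TrailEntity]) {
        let items = trails.map { trail in
            let attributes = CSSearchableItemAttributeSet(contentType: .content)
            attributes.title = trail.name
            let item = CSSearchableItem(
                uniqueIdentifier: trail.id.uuidString,
                domainIdentifier: "trails",
                attributeSet: attributes)
            // Link the App Entity to the searchable item.
            item.associateAppEntity(trail, priority: 0)
            return item
        }
        CSSearchableIndex.default().indexSearchableItems(items) { error in
            if let error { print("[Spotlight] indexing failed: \(error)") }
        }
    }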
Replies: 0 · Boosts: 0 · Views: 536 · Jan ’25
The "right" way to add parameters to Siri voice operations
In this thread, I asked about adding parameters to App Shortcuts. The conclusion that I've drawn so far is that for App Shortcuts, there cannot be any parameters in the prompt, otherwise the system cannot find the AppShortcutsProvider. While this is fine for Shortcuts and non-voice interaction, I'd like to find a way to add parameters to the prompt.

Here is the scenario: my app controls a device that displays some content on "pages." The pages are defined in an AppEnum, which I use for Shortcuts integration via App Intents. The App Intent functions as expected and is able to change the page based on the user's selection within Shortcuts (or prompted, if using the App Shortcut). What I'd like to do is allow the user to say "Siri, open [page] with [app name]."

So far, the closest I've come to understanding how this works is through the .intentsdefinition file you can create (and SiriKit in general); however, the part that really confused me there is a button in the File Editor that says "Convert to App Intent". To me, this means that I should be able to use the App Intent I've already authored and hook that into Siri, rather than making an entirely new function/code block that does exactly the same thing. Ideally, that's what I want to do. What's the right way to define this behavior?

p.s. If I had to pick an intent schema in the context of AssistantSchemas, I'd say it's closest to the "Open File" one, if that helps. I'd ultimately like to make the "pages" user-customizable, so in the long run, that is what I'd do.
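For what it's worth, App Shortcut phrases can carry a single AppEnum (or AppEntity) parameter, which sounds like exactly this scenario. A hedged sketch of that documented phrase-parameter pattern; all names here are made up, not taken from the thread above:

    import AppIntents

    // An AppEnum for the pages; Siri can match its display names
    // directly inside an App Shortcut phrase.
    enum Page: String, AppEnum {
        case home, settings

        static let typeDisplayRepresentation: TypeDisplayRepresentation = "Page"
        static let caseDisplayRepresentations: [Page: DisplayRepresentation] = [
            .home: "Home",
            .settings: "Settings",
        ]
    }

    struct OpenPageIntent: AppIntent {
        static let title: LocalizedStringResource = "Open Page"

        @Parameter(title: "Page")
        var page: Page

        @MainActor
        func perform() async throws -> some IntentResult {
            // Hypothetical: tell the device to switch to `page`.
            return .result()
        }
    }

    struct MyAppShortcuts: AppShortcutsProvider {
        static var appShortcuts: [AppShortcut] {
            AppShortcut(
                intent: OpenPageIntent(),
                // "\(\.$page)" embeds the enum parameter in the spoken
                // phrase, e.g. "Siri, open Home in <app name>".
                phrases: ["Open \(\.$page) in \(.applicationName)"],
                shortTitle: "Open Page",
                systemImageName: "rectangle.on.rectangle")
        }
    }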
Replies: 2 · Boosts: 0 · Views: 1.3k · Jan ’25
AssistantIntent system.search behaviour
Given that iOS 18.2 is out, and following the documentation and the WWDC example (limited to iOS 18.2+), I am attempting to use @AssistantIntent(schema: .system.search) along with an AppIntent.

Questions:
- Has anyone made this work on a real device? In my case (code below), when I run the intent from Shortcuts or Siri, it does NOT open the app but only calls the perform method (and the app is not foregrounded); changing openAppWhenRun has no effect! Strangely, if my app was backgrounded before invocation and I foreground it afterwards, it has navigated to Search but just not foregrounded the app! Am I doing anything wrong? (Adding @Parameter etc. doesn't change anything.)
- Where is the intelligence here? The criteria parameter can NOT be used in the Siri phrase; you get a build error if you try, since only an AppEntity/AppEnum is permitted as a variable in a Siri phrase, not a StringSearchCriteria. Put differently: what's the gain in using @AssistantIntent(schema: .system.search) over a regular AppIntent in this case?

Some code:

    @available(iOS 18.2, *)
    @AssistantIntent(schema: .system.search)
    struct MySearchIntent: ShowInAppSearchResultsIntent {
        static let searchScopes: [StringSearchScope] = [.general]
        static let openAppWhenRun = true

        var criteria: StringSearchCriteria

        @MainActor
        func perform() async throws -> some IntentResult {
            NavigationHandler().to(.search(.init(query: criteria.term)), from: .siri)
            return .result()
        }
    }

Along with this shortcut in the AppShortcutsProvider:

    AppShortcut(
        intent: MySearchIntent(),
        phrases: [
            "Search \(.applicationName)"
        ],
        shortTitle: "Search",
        systemImageName: "magnifyingglass"
    )
Replies: 1 · Boosts: 2 · Views: 531 · Jan ’25
sourceImageURL in imagePlaygroundSheet isn't optional
I can't shake the "I don't think I did this correctly" feeling about a change I'm making for Image Playground support. When you create an image via an Image Playground sheet, it returns a URL pointing to where the image is temporarily stored. Just like the Image Playground app, I want the user to be able to decide to edit that image further. The Image Playground sheet lets you pass in a source URL for an image to start with, which is perfect because I could pass in the URL of that temp image. But the URL is NOT optional. So what do I populate it with when the user is starting from scratch? A friendly AI told me to use URL(string: "")!, but that crashes when it gets force-unwrapped. URL(string: "about:blank")! seems to work, in that it is ignored (and doesn't crash) when I have the user create the initial image (which shouldn't have a source image). This feels super clunky to me. Am I overlooking something?
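One way to sidestep the placeholder URL entirely is to apply a different overload of the sheet depending on whether a source image exists yet. A minimal sketch, assuming the modifier also has an overload without a source parameter; the parameter labels beyond sourceImageURL are assumptions, not verified signatures:

    import SwiftUI
    import ImagePlayground

    // Switch between the source-URL overload and the plain overload so
    // "start from scratch" never needs a fake URL.
    struct PlaygroundSheet: ViewModifier {
        @Binding var isPresented: Bool
        let sourceURL: URL?                 // nil when starting from scratch
        let onCompletion: (URL) -> Void

        func body(content: Content) -> some View {
            if let sourceURL {
                content.imagePlaygroundSheet(
                    isPresented: $isPresented,
                    sourceImageURL: sourceURL,
                    onCompletion: onCompletion)
            } else {
                content.imagePlaygroundSheet(
                    isPresented: $isPresented,
                    onCompletion: onCompletion)
            }
        }
    }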
Replies: 1 · Boosts: 0 · Views: 440 · Jan ’25