So when I was in the Settings app, I couldn't see it, but I updated it and I don't know why it isn't showing. Is this a glitch? Please fix it. Your friend, Isaiah.
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
I want to get a depth map when the camera zooms in or out, or switches to the telephoto lens.
I have already obtained a depth map using ARKit, which fuses the colored RGB image from the wide-angle camera with the depth readings from the LiDAR scanner.
Now I want to switch the camera to telephoto and get a new depth map.
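For context, here is a minimal sketch of how that wide-camera + LiDAR fusion is typically obtained with ARKit's scene depth; note that ARKit's depth pipeline is tied to the wide camera, so a telephoto depth map would likely require a different capture path (e.g. AVFoundation):

import ARKit

// Minimal sketch: receive the fused LiDAR + RGB depth map per frame.
final class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        // depthMap is a CVPixelBuffer of per-pixel distances in meters,
        // registered to the wide camera's image.
        print("Depth map:", CVPixelBufferGetWidth(depthMap), "x", CVPixelBufferGetHeight(depthMap))
    }
}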
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Can anyone tell me whether developers located in the European Union can access the Apple Intelligence services?
Thank you.
Hi everyone,
On the "Apple Intelligence & Siri" settings there's a section titled "Extensions" that specifically mentions ChatGPT.
This got me curious—does Apple provide an API or SDK for developers to create custom integrations or use Apple Intelligence Extensions? Or is this currently limited to the Apple/OpenAI partnership?
I appreciate any insights or links to relevant documentation.
Here's a screenshot of what I mean: https://imgur.com/a/4MuQkIJ
I have an issue with AI Writing Tools: certain applications, such as LinkedIn, work as expected, while others, like Instagram and WhatsApp, lack the Writing Tools option.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hey, I have a MacBook Pro M1 and I don't understand why, but since macOS 15.2 the Apple Intelligence download has remained stuck at 100%, with the same message telling me to stay plugged in and connected to a network.
iOS 18.2 includes a new feature called Visual Intelligence. If I hold down the Camera Control on my iPhone, I can take a photo of an object and use Google to look up items similar to what I've photographed.
Is there a way to programmatically open this interface within my app? If so, can I see which result the user selects?
I live in the EU (Ireland) and I don't have access to Apple Intelligence. I have iOS 18 running on an iPhone XR. Please make Apple Intelligence available in the EU.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hi,
I want to develop an app which makes use of Image Playground.
However, I am located in Europe, which makes this impossible for me, as Image Playground is not available here, even though I would like to distribute the app in the US.
Both the simulator and a physical device always report that Image Playground is not supported
(@Environment(\.supportsImagePlayground) private var supportsImagePlayground).
How can I set up my environment so that I can test the feature in my iOS application?
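For reference, a minimal sketch of gating UI on that environment value; it reflects runtime availability on the current device and region, and the view below is a placeholder:

import SwiftUI
import ImagePlayground

// Minimal sketch: show or hide the generation UI based on availability.
struct GenerateView: View {
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground

    var body: some View {
        if supportsImagePlayground {
            Text("Image Playground is available.")
        } else {
            Text("Image Playground is not available on this device.")
        }
    }
}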
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I’m building an app that generates images based on text input from a specific text field. However, I’m encountering a problem:
For short prompts like "a cat and a dog", the entire string is sent to the Image Playground, even when I use the extracted method. For longer inputs, the behavior is inconsistent. Sometimes it extracts keywords correctly, but other times it doesn’t extract anything at all.
Since my app relies on generating images based on the extracted keywords, this inconsistency negatively impacts the user experience in my app. How can I make sure that keywords are always extracted from the input string?
Button("Generate", systemImage: "apple.intelligence") {
    isPresented = true
}
.imagePlaygroundSheet(isPresented: $isPresented,
                      concepts: [ImagePlaygroundConcept.extracted(from: text, title: textTitle)]) { url in
    imageURL = url
}
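One possible workaround, sketched under the assumption that deterministic extraction is acceptable: pre-extract keywords with the NaturalLanguage framework and pass them as plain text concepts via ImagePlaygroundConcept.text(_:) instead of relying on .extracted(from:title:):

import NaturalLanguage
import ImagePlayground

// Hypothetical helper: collect nouns and adjectives from the input so the
// concepts passed to the sheet are deterministic.
func keywordConcepts(from text: String) -> [ImagePlaygroundConcept] {
    let tagger = NLTagger(tagSchemes: [.lexicalClass])
    tagger.string = text
    var keywords: [String] = []
    tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                         unit: .word,
                         scheme: .lexicalClass,
                         options: [.omitPunctuation, .omitWhitespace]) { tag, range in
        if tag == .noun || tag == .adjective {
            keywords.append(String(text[range]))
        }
        return true
    }
    return keywords.map { ImagePlaygroundConcept.text($0) }
}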
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hey dear developers!
This post is meant as a place for future Siri updates and improvements, and also for wishes, so that everyone in this forum can share their opinions and ideas. Please stay friendly and have fun! I had already thought about developing a demo app to demonstrate my idea for a better Siri.
My change of many:
Wish Update: Siri's language recognition capabilities have been significantly enhanced. Instead of manually setting the language, Siri can now automatically recognize the language you intend to use, making language switching much more efficient. Simply speak the language you want to communicate in, and Siri will automatically recognize it and respond accordingly. Whether you speak English, German, or Japanese, Siri will respond in the language you choose.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
iPhone
Siri Event Suggestions Markup
Siri and Voice
Apple Intelligence
Hi Apple product owners,
I am missing a unified concept that ties together the use cases for mail categories and spam filtering in the Mail app on Mac.
I need a recommendation on how to use categories in combination with the spam filter to get the most out of both.
I was looking at the use cases for these two functional areas in order to figure out how to organize my mail with as much automation as possible before I additionally start creating smart folders.
Where would you recommend I get this information? I don't want to guess, or read a lot of forum contributions that are themselves based on guesses.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hi! I'm trying to use the ImagePlayground API in SwiftUI with the .imagePlaygroundSheet modifier. However, when the sheet is shown (in the preview or in the simulator) it displays the following message: "Image Playground is not available. Image Playground is not available on this iPhone.".
I'm using an iPhone 16 Pro with iOS 18.3.1 in the Xcode (16.2) Simulator.
Anyone else having this problem? How can I fix it?
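As a runtime sanity check, a minimal sketch using the availability flag exposed by the framework; it reports false on simulators and devices that lack support:

import ImagePlayground

// Minimal sketch: query availability before presenting the sheet.
func imagePlaygroundAvailable() -> Bool {
    ImagePlaygroundViewController.isAvailable
}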
I'm sure someone has thought about this already, but let's have an ecosystem where Apple Intelligence uses your most capable (Apple) hardware first and the cloud service second.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hi Apple team,
When using AppShortcutsProvider, I hit the hard limit:
Each app may have at most 10 App Shortcuts.
This feels limiting for apps that offer multiple workflows and would benefit from deeper Siri integration.
Could this cap be raised — ideally to 30 — to support broader use of AppIntents, enhance Siri automation, and unlock more system-level capabilities?
AppShortcuts are a fantastic tool. Increasing the limit would make them even more powerful.
Thanks!
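For reference, a minimal sketch of the structure the cap applies to; OpenNotesIntent and the phrases are hypothetical placeholders:

import AppIntents

// Hypothetical intent used only to illustrate the shape of a shortcut.
struct OpenNotesIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Notes"
    func perform() async throws -> some IntentResult { .result() }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenNotesIntent(),
            phrases: ["Open my notes in \(.applicationName)"],
            shortTitle: "Open Notes",
            systemImageName: "note.text"
        )
        // ...at most 10 AppShortcut entries are accepted per app.
    }
}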
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
Shortcuts
App Intents
Apple Intelligence
iOS 26 is supported by a wider range of devices than are able to run Apple Intelligence; e.g., iPhone 12 runs iOS 26 but does not support Apple Intelligence.
How do we determine in code whether Apple Intelligence is supported on a device?
How do we determine which features use Apple Intelligence under the hood?
Thanks,
Steve.
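One way to answer the first question in code, sketched with the FoundationModels availability API on iOS 26 (this reports whether the on-device model is usable; it does not enumerate which system features rely on it):

import FoundationModels

// Minimal sketch: check on-device Apple Intelligence model availability.
func checkAppleIntelligence() {
    switch SystemLanguageModel.default.availability {
    case .available:
        print("Apple Intelligence model is available.")
    case .unavailable(.deviceNotEligible):
        print("This device does not support Apple Intelligence.")
    case .unavailable(.appleIntelligenceNotEnabled):
        print("Apple Intelligence is not enabled in Settings.")
    case .unavailable(.modelNotReady):
        print("Model assets are still downloading.")
    default:
        print("Apple Intelligence is unavailable for another reason.")
    }
}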
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Good morning all, has anyone encountered the issue of Siri reverting to her original user interface on iOS 26? I'm trying to figure out the cause. I've sent feedback via the Feedback app. Just seeing if anyone else has the same issue.
Greetings,
I've been experimenting with the new Apple Intelligence chat. I want to be able to use my custom LLM, and I made that work (I can chat back and forth with my server from the left panel), but I cannot find out how to change the editor contents the way ChatGPT does.
ChatGPT is able to change the current editor and, it seems, all files in the project. I tried to catch the call with Charles, with no success.
The OpenAI platform docs don't mention anything that could change the code shown.
Does anyone know how to achieve this? Is the Apple Intelligence documentation missing these features, and will it be completed soon? Will these features even be open to developers?
Hi all,
I'm capturing a photo using AVCapturePhotoOutput, and I've set:
let photoSettings = AVCapturePhotoSettings()
photoSettings.isDepthDataDeliveryEnabled = true
Then I create the handler like this:
let data = photo.fileDataRepresentation()
let handler = try ImageRequestHandler(data: data, orientation: .right)
Now I’m wondering:
If depth data delivery is enabled, is it actually included and used when I pass the Data to ImageRequestHandler?
Or do I need to explicitly pass the depth data using the other initializer?
let handler = try ImageRequestHandler(
    cvPixelBuffer: photo.pixelBuffer!,
    depthData: photo.depthData,
    orientation: .right
)
In short:
Does ImageRequestHandler(data:) make use of embedded depth info from AVCapturePhoto.fileDataRepresentation() — or is the pixel buffer + explicit depth data required?
Thanks for any clarification!
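For capture-side context, a sketch of the settings that control whether depth reaches the photo at all: delivery has to be enabled on the output before configuring the per-photo settings, and embedsDepthDataInPhoto controls whether depth is written into fileDataRepresentation():

import AVFoundation

// Minimal sketch: enable depth delivery and embedding for a capture.
func makePhotoSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // Requires a depth-capable device and format; otherwise this stays off.
    if output.isDepthDataDeliverySupported {
        output.isDepthDataDeliveryEnabled = true
    }

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    // Controls whether depth is embedded in fileDataRepresentation().
    settings.embedsDepthDataInPhoto = settings.isDepthDataDeliveryEnabled
    return settings
}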
Are there any guidelines for using Foundation Models to generate text for users in response to some canned queries? Should we use a special icon or text to let the user know that Apple Intelligence is generating the text? Should there be a disclaimer like "Apple Intelligence can make mistakes, please check for accuracy," etc.?
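For context, a minimal sketch of the kind of canned-query generation being asked about; the instructions and prompt below are placeholders, and any Apple Intelligence labeling or disclaimer would be added around this in the UI:

import FoundationModels

// Minimal sketch: answer a canned query with the on-device model.
func answerCannedQuery() async throws -> String {
    let session = LanguageModelSession(
        instructions: "Answer briefly and factually."
    )
    let response = try await session.respond(to: "What does this app do?")
    return response.content
}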
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence