I've been running Sequoia 15.1 since it was released. Soon after, I was taken off the waitlist and had been using Apple Intelligence until this morning.
My first hint something was wrong was that Writing Tools, which I'd been using extensively, disappeared. I tried in another app, and it wasn't there, either.
I then looked at the Siri icon in my menu bar - which looks different under Apple Intelligence - and it had been reverted to the old icon.
I then checked my Apple Intelligence settings and, sure enough, not only was it off, but I'd been returned to the waitlist.
My iOS and iPadOS devices continue working just fine with Apple Intelligence. Only my MacBook Pro is experiencing this issue.
Has anyone else seen this?
It's been over 2 hours and my connection is fine. I've already tried restarting a couple of times.
Well, hello there.
Correct me if I’m wrong: Apple Intelligence will be available only to US developers, and if you want to use it in your app you have to move to the US?
Because right now not only is its use limited to the US, but use of every new API is prohibited too.
Dear Apple Team,
I have a suggestion to enhance the Apple Watch user experience.
A new feature could provide personalized
recommendations based on weather conditions and the user’s mood.
For example, during hot weather, it could suggest drinking something cold,
or if the user is feeling down, it could offer ways to boost their mood.
This kind of feature could make the Apple Watch not just a health and fitness
tracker but also a more functional personal assistant.
“Improve communication with Apple Watch”
Feature #1: Noise detection and location
suggestions.
Imagine having your Apple Watch detect
ambient noise levels and suggest a quieter location for your call.
Feature #2: Context-aware call response options.
If you can't answer a call,
your Apple Watch could offer pre-set responses to communicate your status and
reduce missed call anxiety.
For example, if you're in a busy restaurant, your Apple Watch could suggest
moving to a quieter spot nearby for a better conversation.
Or if you’re in a movie theater, your Apple Watch could send an automatic
“I’m at the movies” text to the caller.
“Improve user experience and app management”
Automated Sleep Notifications:
The ability for the Apple Watch to automatically turn off notifications or change the watch face
when the user is sleeping would provide a more seamless experience.
For instance, when the watch detects that the user is in sleep mode, it could enable Do Not
Disturb to silence calls and alerts.
Caller Notification:
In addition, it would be great if the Apple Watch
could inform callers that the user is currently sleeping.
This could help manage expectations for those attempting to reach the user at night.
App Management to Conserve Battery:
Implementing a feature that detects
power-draining apps while the user is asleep could further enhance battery life.
The watch and the iPhone could close or pause apps that are using significant
power while the user is not active.
I believe these features could provide valuable advancements in enhancing the
Apple Watch's usability for those who prioritize a restful night's sleep.
Thank you for considering my suggestions.
Best regards,
Mahmut Ötgen
Istanbul, Turkey
Dear Apple Development Team,
I’m writing to express my concerns and request a feature enhancement regarding the ChatGPT app for iOS. Currently, the app's audio functionality does not work when the app is in the background. This limitation significantly affects the user experience, particularly for those of us who rely on the app for ongoing, interactive voice conversations.
Given that many apps, particularly media and streaming services, are allowed to continue audio playback when minimized, it’s frustrating that the ChatGPT app cannot do the same. This restriction interrupts the flow of conversation, forcing users to stay within the app to maintain an audio connection.
For users who multitask on their iPhones, being able to switch between apps while continuing to listen or interact with ChatGPT is essential. The ability to reference notes, browse the web, or even respond to messages while maintaining an ongoing conversation with ChatGPT would greatly enhance the app’s usability and align it with other background-capable apps.
I understand that Apple prioritizes resource management and device performance, but I believe there’s a strong case for allowing apps like ChatGPT to operate with background audio. Given its growing importance as a tool for productivity, learning, and communication, adding this capability would provide significant value to users.
I hope you will consider this feedback for future updates to iOS, or provide guidance on any existing APIs that could be leveraged to enable such functionality.
Thank you for your time and consideration.
Best regards,
luke
yes I used gpt to write this.
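For reference, the existing API surface for this kind of behavior is the "audio" background mode combined with an AVAudioSession. Below is a minimal sketch of that standard setup, not specific to the ChatGPT app and assuming the app manages its own playback session:

import AVFoundation

// Requires the "audio" entry in the UIBackgroundModes key of Info.plist,
// which lets playback continue when the app moves to the background.
func configureBackgroundAudio() throws {
    let session = AVAudioSession.sharedInstance()
    // .playback keeps audio alive when the screen locks or the app is backgrounded;
    // .spokenAudio hints that the content is speech rather than music.
    try session.setCategory(.playback, mode: .spokenAudio)
    try session.setActive(true)
}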
I installed macOS 15.1 on my 14-inch MacBook Pro (M2 Pro), but Apple Intelligence doesn't appear.
I have a US-based device (because I live in the US) on 18.1 beta 1, but happen to be in Italy at present. I’m getting an “Apple Intelligence is not yet available in your country” message. Is there any way around this, or will I need to wait until I’m physically back in the States to test Apple Intelligence?
Now I can turn on Apple Intelligence in Settings, but it still doesn't work. It shows that it is still downloading. I wonder if there is any way to fix it.
I am attempting to install the macOS 15.1 update along with the Apple Intelligence beta feature, to try it out and then integrate it into the application I am developing. I have a compatible MacBook Air M2 set to the United States region with US English as the language, but it was purchased in mainland China. I have downloaded 15.1, but Apple Intelligence does not appear. Could this be because the machine was bought in China?
I have read that if you install the Sequoia beta on an external drive, it will not include the Apple Intelligence functionality. Is that true? If so, that is a real bummer, because I prefer not to install the betas even on a separate internal volume if at all possible.
I am using an Insta360 Flow Pro with the iOS 18 public beta. What I would really like to see is tracking for horses. Is this a future implementation?
Kind regards, Chielio
I would be happy to beta test this if it's in development.
Hi, does anyone know if there is a country list for the iOS 18 features, specifically Apple Intelligence?
As a developer in Switzerland, it seems pretty confusing when only terms like "EU" are used without specific countries.
For example:
Apple Intelligence is not available in the EU or China for now
Third-party App Stores are only available in the EU
Switzerland has neither. So does Apple consider us part of the EU, EEA, geographically in Europe... differently on a feature-by-feature basis?
I'm aware that it can get confusing as Switzerland is not part of the EU yet has many individual agreements, and that this can make things complicated in betas. However, a clear list of feature availability by country for iOS 18 must exist somewhere - I haven't found it yet though :)
I'm trying to test an Assistant Schema intent with the Shortcuts app. However, unlike in the WWDC video (https://developer.apple.com/videos/play/wwdc2024/10133/), my intent conforming to the Assistant Schema does not appear when I search for 'AssistantSchema'. If it doesn't appear here, does that mean this intent will not work with Siri?
Hi,
I have some questions about the new AssistantSchema.CameraEnum.captureDevice introduced with iOS 18 beta 4.
Here's the context. I create an intent:
import AppIntents

@AssistantIntent(schema: .camera.setDevice)
struct SetDeviceIntent {
    var device: CaptureDevice

    func perform() async throws -> some IntentResult {
        .result()
    }
}

@AssistantEnum(schema: .camera.captureDevice)
enum CaptureDevice: String {
    case front
    case back
    case ultrawide
}
Some CaptureDevice cases are not available on some devices, e.g. CaptureDevice.ultrawide is only available on iPhone, not on iPad.
How can we make CaptureDevice dynamic? I don't think AppEnum supports @Dependency or anything similar.
Hi,
I have some questions about the new AssistantSchema.CameraEnum.captureMode and AssistantSchema.CameraEnum.captureDevice introduced with iOS 18 beta 4.
Here's the context. I create an intent:
import AppIntents

@AssistantIntent(schema: .camera.startCapture)
struct StartCaptureIntent {
    var captureMode: CaptureMode
    var timerDuration: CaptureDuration?
    var device: CaptureDevice?

    func perform() async throws -> some IntentResult {
        .result()
    }
}
And these app enums:
@AssistantEnum(schema: .camera.captureDevice)
enum CaptureDevice: String {
    case front
    case back
    case ultrawide
}

@AssistantEnum(schema: .camera.captureMode)
enum CaptureMode: String {
    case modeA
    case modeB
}
Some CaptureDevice cases are not available in some CaptureMode cases, e.g. CaptureMode.modeA only supports CaptureDevice.back and CaptureDevice.front.
In a classic AppIntent, I would create an AppEntity to represent CaptureDevice and use @IntentParameterDependency<CapturePhotoIntent>(\.$captureMode) to create a dependency between the captureMode and the captureDevice parameters.
How can we create this dependency between two @AssistantEnum types? I'm not sure this is possible, as @AssistantEnum produces AppEnums.
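For reference, here is roughly what the classic AppEntity-based approach described above looks like. CaptureDeviceEntity, CaptureDeviceQuery, and CapturePhotoIntent are hypothetical names, and the sketch assumes the device parameter is modeled as an entity on a regular AppIntent rather than as an @AssistantEnum:

import AppIntents

// Hypothetical classic intent whose captureMode parameter the query depends on.
struct CapturePhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Capture Photo"

    @Parameter(title: "Capture Mode")
    var captureMode: CaptureMode

    @Parameter(title: "Device")
    var device: CaptureDeviceEntity

    func perform() async throws -> some IntentResult {
        .result()
    }
}

// Hypothetical entity standing in for the CaptureDevice enum.
struct CaptureDeviceEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Capture Device")
    static var defaultQuery = CaptureDeviceQuery()

    var id: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(id)")
    }
}

struct CaptureDeviceQuery: EntityQuery {
    // Gives the query access to the capture mode the user already selected.
    @IntentParameterDependency<CapturePhotoIntent>(\.$captureMode)
    var capturePhoto

    func entities(for identifiers: [CaptureDeviceEntity.ID]) async throws -> [CaptureDeviceEntity] {
        try await suggestedEntities().filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [CaptureDeviceEntity] {
        // Offer only the devices that are valid for the selected mode.
        if capturePhoto?.captureMode == .modeA {
            return [CaptureDeviceEntity(id: "front"), CaptureDeviceEntity(id: "back")]
        }
        return [CaptureDeviceEntity(id: "front"),
                CaptureDeviceEntity(id: "back"),
                CaptureDeviceEntity(id: "ultrawide")]
    }
}

Whether an equivalent dependency exists between two @AssistantEnum types is exactly the open question here; the sketch only illustrates the entity-based pattern the post refers to.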
The new .photos AssistantSchema for intents allows integrating App Intents for Photos-related actions with Apple Intelligence. I was wondering if it would be possible to create intents that do not require full library access.
Our app supports loading images from Photos via the PHPicker, which doesn't require any user permission. Now we want to support the .photos.openAsset schema in an app intent to allow interactions like "Open this image in BeCasso and apply preset X".
Would that be possible without full library access?
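For context, this is roughly the permission-free PHPicker flow mentioned above. PickerPresenter and the onImage callback are hypothetical names; the real app's loading code may differ:

import PhotosUI
import UIKit

// Minimal sketch: present the system photo picker without any photo-library permission.
final class PickerPresenter: NSObject, PHPickerViewControllerDelegate {
    var onImage: ((UIImage) -> Void)?

    func present(from viewController: UIViewController) {
        var config = PHPickerConfiguration()   // no PHPhotoLibrary authorization needed
        config.filter = .images
        config.selectionLimit = 1
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        viewController.present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        guard let provider = results.first?.itemProvider,
              provider.canLoadObject(ofClass: UIImage.self) else { return }
        provider.loadObject(ofClass: UIImage.self) { [weak self] object, _ in
            if let image = object as? UIImage {
                DispatchQueue.main.async { self?.onImage?(image) }
            }
        }
    }
}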
The app crashes on iOS 16.4 when there is any usage of the ImageAnalysisInteraction API from VisionKit. It crashes before it even starts.
Here is output:
dyld[3240]: Symbol not found: _$s9VisionKit24ImageAnalysisInteractionC7subject2atAC7SubjectVSgSo7CGPointV_tYaFTu
Referenced from: <BAD7A699-FB4E-3D0E-8CD4-45CC9FC3D5E5> /Users/sereza/Library/Developer/CoreSimulator/Devices/B64EAF39-0DD9-49EC-A3F7-69675C94B8BE/data/Containers/Bundle/Application/F4E30E86-ED4D-4748-AB99-434208D55483/VisionKitChecker.app/VisionKitChecker
Expected in: <F05E3A17-D74A-3EE2-BC8D-DDCC23E48707> /Library/Developer/CoreSimulator/Volumes/iOS_20E247/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 16.4.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/VisionKit.framework/VisionKit
Here is enough code to produce this crash. Please note that this code never gets called. It is enough that it exists in the project:
import Combine
import VisionKit

@MainActor
final class LiftHelper: ObservableObject {
    // Never called; its mere presence in the project is enough to trigger the crash.
    func doSomething() async throws {
        let interaction = ImageAnalysisInteraction()
        let _ = try await interaction.image(for: [])
    }
}
How can I use the Image Playground API in my app without the API's UI? I haven't found such an API yet. Can anyone tell me?
Hey, on my iPhone 15 Pro I can't use the new Siri. It shows me the old Siri design. I'm on the iOS 18.0 developer beta. I have no clue why it doesn't work for me.