Hey everyone,
I recently installed the iOS 18.2 developer beta and connected my ChatGPT Premium account to use it within the Visual Intelligence feature. After taking a photo, I tried pressing the "Ask" button, but it’s completely unresponsive. I've tried troubleshooting by doing a hard reset and a regular restart, but no luck so far.
Has anyone else encountered this issue? If so, do you know of any workaround or fix that might help?
Thanks in advance!
Hi, I have not been able to use the new memory creation function since the 18.1 beta update.
Below is the error/response that is shown.
I’ve been waiting for over 3 days for image creation early access.
stuck on "early access requested". i feel like since we're developing testers we shouldn't have to go through this process...
I already updated a few days ago and requested access to Genmoji, Image Wand, and Image Playground, yet it is still pending. I really want it to work.
After updating to iPadOS 18.2, I requested early access to Genmoji, but I have waited a long time and my request has not been accepted. Please tell me how I can get this, Apple. Don't make me call another Apple advisor, thank you so much!
I've been stuck for 40 days. Any ideas, guys?
I have recently been having trouble with my iOS 18.2 beta update. It has been 2 weeks since I updated to the iOS 18.2 beta and joined the Genmoji and Image Playground waitlist. I am wondering how much longer I have to wait until my request is approved.
I'm stuck on "Downloading support for Image Playground... Once downloaded, this iPhone will be able to use Image Playground." Does anyone know a solution so it can continue?
Hi everyone, I work for a company called Dataloop AI, testing AI features. This is the only feature I still need to test. Could you please let me know the estimated waiting time for enrollment in this feature?
I waited a long time for confirmation and it finally seems to have arrived, but now the software for working with Image Playground and Genmoji won't download. The phone has been connected to the network for the past 8 hours; the download has sat there for more than 20 minutes with nothing happening. The strangest part is that there is no indication of how much is left to download.
Issue
When triggering an App Intent that uses assistant schemas from Apple Intelligence (voice or text), the app opens without prompting for search criteria.
How to reproduce
This can be reproduced with the sample provided by Apple here: https://developer.apple.com/documentation/appintents/making-your-app-s-functionality-available-to-siri
Download the sample code
Build and run on Xcode 16.1 beta 3
Target iPhone 15 Pro Max on iOS 18.1 beta 7
Trigger Apple Intelligence
Enter prompt: "Search AssistantSchemasExample"
Expected behaviour
Apple Intelligence should prompt the user for the criteria and pass it to the app so that the experience is seamless for the end user. Otherwise, Assistant Intents are nothing more than deep links to search screens.
Notes
The example uses @AssistantIntent(schema: .photos.search) intent.
I've found the issue is also present in other search intents:
@AssistantIntent(schema: .system.search)
@AssistantIntent(schema: .browser.search)
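For reference, here is a minimal sketch of the intent shape involved, written against the .system.search schema (the body is illustrative rather than the sample's actual code, and the same pattern applies to the photos and browser search schemas):

import AppIntents

// Sketch: an assistant-schema search intent. `criteria` is the parameter
// Apple Intelligence should fill from the prompt; in my testing it is not
// prompted for and the app simply opens.
@AssistantIntent(schema: .system.search)
struct SearchIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]

    var criteria: StringSearchCriteria

    func perform() async throws -> some IntentResult {
        // Stand-in for the app's real navigation to its search screen.
        print("Searching for: \(criteria.term)")
        return .result()
    }
}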
Questions
Has anyone managed to get the prompt to appear?
Will this only function on iOS 18.2?
Hi, while trying to diagnose why some of my Core ML models run slower when their configuration's compute units are set to .CPU_AND_GPU than when set to .CPU_ONLY, I've been attempting to create Core ML model performance reports in Xcode to identify the operations that are not compatible with the GPU. However, when I select an iPhone as the connected device and a compute unit of 'All', 'CPU and GPU', or 'CPU and Neural Engine', Xcode displays one of the following two error messages:
“There was an error creating the performance report. The performance report has crashed on device”
"There was an error creating the performance report. Unable to compute the prediction using ML Program. It can be an invalid input data or broken/unsupported model."
The performance reports are successfully generated when selecting the connected device as iPhone with compute unit 'CPU only' or Mac with any combination of compute units.
Some of the models I have seen the issue with are stateful, some are not. I have tried to replicate the issue with example models from the Core ML Tools stateful-model guide and the video "Bring your machine learning and AI models to Apple silicon". Running the performance report on a model generated from the Simple Accumulator example code succeeds with every compute unit option, but with models from the toy attention and toy attention with kv-cache examples it only succeeds with 'CPU only' when the selected device is an iPhone.
Versions I'm currently working with:
Xcode Version 16.0
macOS Sequoia 15.0.1
Core ML Tools 8.0
iPhone 16 Pro iOS 18.0.1
Is there a way to avoid these errors? Or is there another way to identify which operations within a Core ML model are supported on the iPhone's GPU/Neural Engine?
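One programmatic alternative I'm considering, in case the Xcode report keeps crashing, is the MLComputePlan API, which reports per-operation device support. A rough sketch, assuming a compiled ML Program model and the API shape shown in the recent Core ML sessions (I haven't verified it against these exact models):

import CoreML

// Sketch: list which compute devices each operation of an ML Program
// supports and which one Core ML prefers.
func reportDeviceSupport(for compiledModelURL: URL) async throws {
    let configuration = MLModelConfiguration()
    configuration.computeUnits = .all

    let plan = try await MLComputePlan.load(contentsOf: compiledModelURL,
                                            configuration: configuration)

    guard case .program(let program) = plan.modelStructure,
          let mainFunction = program.functions["main"] else {
        print("Not an ML Program, or no main function found")
        return
    }

    for operation in mainFunction.block.operations {
        let usage = plan.deviceUsage(for: operation)
        print(operation.operatorName,
              "preferred:", usage?.preferred as Any,
              "supported:", usage?.supported as Any)
    }
}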
When I import statsmodels in a Jupyter notebook, I get the following error:
ImportError: dlopen(/opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/_fblas.cpython-312-darwin.so, 0x0002): Library not loaded: @rpath/liblapack.3.dylib
Referenced from: <5ACBAA79-2387-3BEF-9F8E-6B7584B0F5AD> /opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/_fblas.cpython-312-darwin.so
Reason: tried: '/opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/../../../../liblapack.3.dylib' (no such file), '/opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/../../../../liblapack.3.dylib' (no such file), '/opt/anaconda3/bin/../lib/liblapack.3.dylib' (no such file), '/opt/anaconda3/bin/../lib/liblapack.3.dylib' (no such file), '/usr/local/lib/liblapack.3.dylib' (no such file), '/usr/lib/liblapack.3.dylib' (no such file, not in dyld cache).
What should I do?
Hi All,
I am trying to build a new iOS app by following https://developer.apple.com/videos/play/wwdc2024/10163/?time=67
When I try to remove all the legacy VN-prefixed code I get errors. I would appreciate it if someone could help me get up to speed with the new Vision API.
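For context, this is roughly what I'm aiming for, sketched from the session's Swift-only Vision API (the exact signatures are my assumption from the video and may need adjusting):

import Vision

// Sketch: text recognition with the new request types, replacing the legacy
// VNRecognizeTextRequest / VNImageRequestHandler pair.
func recognizeText(in imageURL: URL) async throws -> [String] {
    let request = RecognizeTextRequest()
    let observations = try await request.perform(on: imageURL)
    return observations.compactMap { $0.topCandidates(1).first?.string }
}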
Very simple question: is there a way to detect whether a user has Apple Intelligence enabled on their Mac?
I'd like to make some interface tweaks when it's available and enabled.
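As far as I know there is no single documented "Apple Intelligence is enabled" API, so one workaround is to key off the availability of a specific feature. A minimal sketch using SwiftUI's supportsImagePlayground environment value from the ImagePlayground framework (this is a proxy for one feature, not an official system-wide check):

import SwiftUI
import ImagePlayground

// Sketch: gate UI on Image Playground availability, which reads false when
// Apple Intelligence is off or unsupported on the machine.
@available(macOS 15.2, iOS 18.2, *)
struct IntelligenceAwareView: View {
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground

    var body: some View {
        if supportsImagePlayground {
            Label("Apple Intelligence features available", systemImage: "sparkles")
        } else {
            Text("Apple Intelligence features unavailable")
        }
    }
}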
It's been more than three days that I've been waiting for the Playgrounds app to test these Apple Intelligence features. Not to mention that I updated the system and the Playgrounds app didn't appear on my iPhone 15 Pro. Does anyone know where I can download the Playgrounds app?
Hey,
I'm EU based (Poland). I'm an Apple Developer, and I'm not able to test Apple Intelligence features and APIs.
Is it possible to use them somehow? I understand it's not available in the EU, but is there any way for members of the developer program?
What am I not understanding here?
In short, the view loads text from the JSON descriptions and should then filter out certain words, returning and displaying a list of the most-used words. Debugging shows the words being identified by the code, but they are not filtered out.
private func loadWordCounts() {
    DispatchQueue.global(qos: .background).async {
        let fileManager = FileManager.default
        guard let documentsDirectory = try? fileManager.url(for: .documentDirectory,
                                                            in: .userDomainMask,
                                                            appropriateFor: nil,
                                                            create: false) else { return }

        // Load the JSON descriptions and build a word -> count dictionary.
        let descriptions = loadDescriptions(fileManager: fileManager, documentsDirectory: documentsDirectory)
        var counts = countWords(in: descriptions)

        // Lexical classes I want excluded from the displayed list.
        let tagsToRemove: Set<NLTag> = [
            .verb,
            .pronoun,
            .determiner,
            .particle,
            .preposition,
            .conjunction,
            .interjection,
            .classifier
        ]

        // Tag each word and zero out the count of any word in tagsToRemove.
        for (word, _) in counts {
            let tagger = NLTagger(tagSchemes: [.lexicalClass])
            tagger.string = word
            let (tag, _) = tagger.tag(at: word.startIndex, unit: .word, scheme: .lexicalClass)
            if let unwrappedTag = tag, tagsToRemove.contains(unwrappedTag) {
                // Sets the count to 0 but keeps the key in the dictionary.
                counts[word] = 0
            }
        }

        // Publish the result back on the main thread.
        DispatchQueue.main.async {
            self.wordCounts = counts
        }
    }
}
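For comparison, here is a minimal sketch of a filtering step that removes the unwanted entries instead of setting their counts to 0 (if the list displays every key in wordCounts, zeroed entries still appear); tagsToRemove is the same set as above:

import Foundation
import NaturalLanguage

// Sketch: drop words whose lexical class is in tagsToRemove, so they never
// reach the displayed list.
func filteredWordCounts(_ counts: [String: Int],
                        removing tagsToRemove: Set<NLTag>) -> [String: Int] {
    let tagger = NLTagger(tagSchemes: [.lexicalClass])
    var result = counts
    for (word, _) in counts {
        tagger.string = word
        let (tag, _) = tagger.tag(at: word.startIndex, unit: .word, scheme: .lexicalClass)
        if let tag = tag, tagsToRemove.contains(tag) {
            result.removeValue(forKey: word)   // remove the entry entirely
        }
    }
    return result
}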