I have recently been having trouble with my iOS 18.2 beta update. It has been 2 weeks since I updated to the iOS 18.2 beta and joined the Genmoji and Image Playground waitlist. I am wondering how much longer I have to wait until my request is approved.
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac, and powers incredible new features to help users communicate, work, and express themselves.
Hi everyone, I work with a company called Dataloop AI, testing AI features. This is the only feature I still need to test. Could you please let me know the estimated waiting time for enrollment in this feature?
I've been waiting for confirmation for a long time, and it finally seems to have arrived, but now the software for Image Playground and Genmoji won't download. For the past 8 hours the phone has been connected to the network, and it has sat in the queue for more than 20 minutes with nothing to show for it. The strangest part is that there is no indication of progress or how much is left to download.
After updating to iPadOS 18.2, I requested early access to Genmoji, but I have waited a long time and my request has not been accepted. Please tell me how I can get access, Apple. Don't make me call another Apple advisor. Thank you so much!
Hi, I have not been able to use the new memory creation function since the 18.1 beta update.
Below is the error/response shown.
Hi guys, does anyone know how long until I'll be given permission to use the Image Playground app? I already have beta 18.2, I've been waiting for 7 days, and I would like to use the app.
Does anyone know how to resolve this, or how much time it takes to get access to Image Playground?
Has anyone else been waiting 7+ days to get off the waitlist? The original waitlist for regular Apple Intelligence accepted everyone within a few hours; this one is taking days. Is there any way to speed up the process? I am kind of disappointed with Apple in this sense. A huge company can't even deliver the features it promised. When we updated, shouldn't they already have been on our phones?
I installed beta 18.2 on the day of release and requested access to Image Playground that same day, 2 minutes after I got in. It is now a week later and I still do not have access. Am I missing something?
iPhone 16 Pro Max
Hello guys,
does anybody know how long this request needs to get approved?
I have been waiting since beta 1 of 18.2 came out.
I downloaded the iOS 18.2 dev beta to try out some of the features on my iPhone 15 Pro. iOS 18.1's features took about 3 hours to arrive, but this one, IT'S BEEN 8 DAYS. 8 DAYS, APPLE. STILL. NO. NEW. FEATURES!
I fell for the "writing power" of the AI advertising and purchased a MacBook Air, only to learn that power is limited to Notes and texts. Pages needs writing power yesterday. When will that actual writing app get AI?
I have been trying image creation for 2 hours. Genmoji and Image Playground worked well for the first 10 minutes.
After that, when I run Image Playground and start to generate an Animation-style photo, it gets stuck and no result is output. Does anyone know what happened?
Image Playground Error: Cannot find protocol declaration for 'ImageGenerationViewControllerDelegate'
import ImagePlayground // framework that declares ImagePlaygroundViewController and its Delegate protocol

@available(macCatalyst 18.1, *)
@available(iOS 18.1, *)
extension CKImageSelectionManager: ImagePlaygroundViewController.Delegate {
    // Called when the user finishes creating an image in the playground.
    public func imagePlaygroundViewController(_ imagePlaygroundViewController: ImagePlaygroundViewController, didCreateImageAt imageURL: URL) {
    }

    func presentImagePlayground() {
        let imagePlaygroundVC = ImagePlaygroundViewController()
        // Set delegate to self to receive the callback.
        imagePlaygroundVC.delegate = self
        imagePlaygroundVC.isModalInPresentation = true // Prevents dismissal with a swipe if needed.
        self.delegate?.presentImageSelectionViewController(imagePlaygroundVC)
    }
}
This generates an error in the Xcode-generated Swift header.
I am working to add Spotlight indexing for my app entities as discussed in WWDC24's video "What's New in App Intents".
That video goes over the IndexedEntity protocol and the integration with Spotlight via CSSearchableItemAttributeSet.
What I'm seeing, though, does not match the video. In the video, the presenter takes a progressive approach to getting this data into Spotlight, starting with the basics and then expanding to include more metadata depending on how much the developer wants to do.
What I'm seeing is that if you conform to IndexedEntity, your entities will appear in Spotlight using the name derived from
public var displayRepresentation: DisplayRepresentation
So, that works. Name appears... BUT the next part of the video goes into how to expand your implementation with more metadata for Spotlight via CSSearchableItemAttributeSet. The issue I'm seeing is that once that's implemented, the items disappear from Spotlight, almost like that implementation is overriding the base implementation in a way that no longer functions.
My expectation is that an item with custom attributes would use them in Spotlight as appropriate, not disappear from search, i.e. what's shown in the video should work.
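For reference, here is a minimal sketch of the two-step pattern I'm describing; the names here (TrailEntity, its properties, TrailQuery) are hypothetical stand-ins, not the ones from my sample project:

import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

// Step 1: conforming to IndexedEntity alone is enough for the entity's
// display name to appear in Spotlight.
struct TrailEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Trail")
    static var defaultQuery = TrailQuery()

    var id: UUID
    var name: String
    var details: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    // Step 2: the attributeSet override that adds richer metadata.
    // This is the part that makes the items vanish from Spotlight for me.
    var attributeSet: CSSearchableItemAttributeSet {
        let attributes = CSSearchableItemAttributeSet(contentType: .content)
        attributes.displayName = name
        attributes.contentDescription = details
        return attributes
    }
}

struct TrailQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [TrailEntity] { [] }
}

The entities are then handed to Spotlight with try await CSSearchableIndex.default().indexAppEntities(_:), which, as I understand it, is what the sample project's init() does.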
I've got a sample project here:
https://hanchor.s3.amazonaws.com/misc/IndexingTest.zip
To reproduce with the sample:
1. Build and run. Indexing is set up in the init() method, so it will just run.
2. Go to Spotlight and search for 'Huntersblau', a string included in the content set. At this point you should see a result - good!
3. Stop the app, go back, and uncomment the var attributeSet: CSSearchableItemAttributeSet implementation in IndexingTestApp.swift. This will provide custom attributes to Spotlight.
4. Repeat steps 1 and 2 - you'll see that it no longer appears in the search results: when CSSearchableItemAttributeSet is implemented, the item drops out of Spotlight.
Hi,
I'm trying to analyze images in my Photos library with the following code:
import Photos
import Vision

func analyzeImages(_ inputIDs: [String]) {
    let manager = PHImageManager.default()
    let option = PHImageRequestOptions()
    option.isSynchronous = true
    option.isNetworkAccessAllowed = true
    option.resizeMode = .none
    option.deliveryMode = .highQualityFormat

    let concurrentTasks = 1
    let clock = ContinuousClock()
    let duration = clock.measure {
        let group = DispatchGroup()
        // Limits the number of in-flight image requests.
        let sema = DispatchSemaphore(value: concurrentTasks)
        for entry in inputIDs {
            if let asset = PHAsset.fetchAssets(withLocalIdentifiers: [entry], options: nil).firstObject {
                print("analyzing asset: \(entry)")
                group.enter()
                sema.wait()
                manager.requestImage(for: asset, targetSize: PHImageManagerMaximumSize, contentMode: .aspectFit, options: option) { (result, info) in
                    if let result = result {
                        Task {
                            print("retrieved asset: \(entry)")
                            let aestheticsRequest = CalculateImageAestheticsScoresRequest()
                            let fingerprintRequest = GenerateImageFeaturePrintRequest()
                            let inputImage = result.cgImage!
                            let handler = ImageRequestHandler(inputImage)
                            let (aesthetics, fingerprint) = try await handler.perform(aestheticsRequest, fingerprintRequest)
                            // save results
                            print("finished asset: \(entry)")
                            sema.signal()
                            group.leave()
                        }
                    } else {
                        sema.signal() // signal here too, so a failed request doesn't stall the loop
                        group.leave()
                    }
                }
            }
        }
        group.wait()
    }
    print("analyzeImages: Duration \(duration)")
}
When running this code, only two requests are processed simultaneously (due to the semaphore). However, if I call the function with a large list of images (>100), memory usage balloons to over 1.6 GB and the app crashes. If I call it with a smaller number of images, the loop completes and the memory is freed.
When I use Instruments to look for memory leaks, it indicates no leaks are found, but there are 150+ VM: IOSurface allocations by CMPhoto, CoreVideo, and CoreGraphics at 35 MB each. Shouldn't each surface be released when its task completes?
Hi!
I recently updated to the latest 18.2 beta version of iOS on my iPhone 15 Pro Max. Could you please guide me on how to locate and use the image search feature powered by Apple Intelligence?
Just a little detail: I went on YouTube, and the instruction was to hold the Camera Control button on the iPhone 16 so that image search appears.
So far, I haven't been able to replicate these results on my iPhone 15 Pro Max. This is a great capability and I'd really like to try it out.
“Live long and prosper.” -Spock
-Jordan
I cannot find the hardware requirements for Image Playground documented anywhere. I'm also not sure if they are identical to devices that support Apple Intelligence.
On the App Store, the only requirement listed for Image Playground is iOS 18.2.
Not knowing the requirements is an issue because I need to be able to clearly state the requirements for the feature in my app description.
Also, I'm sure my mother's current iPad is too old, but I'm not sure what models support it if I were to buy her a new one.
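In the meantime, a runtime check can at least gate the feature inside the app. Here's a minimal sketch assuming UIKit and the ImagePlayground framework's isAvailable property; the helper name is mine:

import ImagePlayground
import UIKit

// Hypothetical helper: returns whether this device can present Image Playground.
// Per my understanding, isAvailable reports false on unsupported hardware or
// when the feature isn't currently usable (e.g. the model isn't downloaded).
func canOfferImageCreation() -> Bool {
    guard #available(iOS 18.1, *) else { return false }
    return ImagePlaygroundViewController.isAvailable
}

That still doesn't help with writing the App Store description, though, which is why documented hardware requirements matter.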
I want to get a depth map when the camera zooms in or out, or switches to telephoto.
I have already obtained a depth map using ARKit, which fuses the color RGB image from the wide-angle camera with the depth readings from the LiDAR scanner.
Now I want to switch the camera to telephoto and get a new depth map.
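For context, this is a minimal sketch of the kind of ARKit setup I'm using to get the fused depth map (assuming the standard sceneDepth frame semantics):

import ARKit

// Minimal sketch of receiving LiDAR-fused depth maps from ARKit.
final class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth fuses LiDAR readings with the wide camera's image.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = sceneDepth.depthMap // per-pixel depth in meters
        _ = depthMap
    }
}

What I can't find is a way to get an equivalent depth map once the camera is the telephoto one.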
Can anyone tell me whether developers located in the European Union can access the Apple Intelligence services?
Thank you