Explore the power of machine learning and Apple Intelligence within apps. Discuss integrating features, share best practices, and explore the possibilities for your app here.

Posts under Machine Learning & AI topic

Apple please respond with any information (Image Playgrounds access)
Apple, I speak for many when I say that we are frustrated, not so much because we are unable to access the features, test them, and submit feedback, but because you are not communicating. Please let us know here whether this is a server bug or a deliberate strategy to roll out the generative features to a small, limited group of people on iOS 18.2 DB1. Thank you!
Replies: 1 · Boosts: 1 · Views: 442 · Oct ’24
Max 16k images for Image Classifier training????
I'm hitting a limit when trying to train an Image Classifier, at about 16k images (in line with the error info). The error is:

IOSurface creation failed: e00002be parentID: 00000000 properties: {
    IOSurfaceAllocSize = 529984;
    IOSurfaceBytesPerElement = 4;
    IOSurfaceBytesPerRow = 1472;
    IOSurfaceElementHeight = 1;
    IOSurfaceElementWidth = 1;
    IOSurfaceHeight = 360;
    IOSurfaceName = CoreVideo;
    IOSurfaceOffset = 0;
    IOSurfacePixelFormat = 1111970369;
    IOSurfacePlaneComponentBitDepths = ( 8, 8, 8, 8 );
    IOSurfacePlaneComponentNames = ( 4, 3, 2, 1 );
    IOSurfacePlaneComponentRanges = ( 1, 1, 1, 1 );
    IOSurfacePurgeWhenNotInUse = 1;
    IOSurfaceSubsampling = 1;
    IOSurfaceWidth = 360;
} (likely per client IOSurface limit of 16384 reached)

I feel like I was able to use more images than this before upgrading to Sonoma, but I don't have the receipts... Is there a way around this? I have oodles of spare memory on my machine; it's using about 16 GB of 64 when it crashes. The code to create the model is:

let parameters = MLImageClassifier.ModelParameters(
    validation: .dataSource(validationDataSource),
    maxIterations: 25,
    augmentation: [],
    algorithm: .transferLearning(
        featureExtractor: .scenePrint(revision: 2),
        classifier: .logisticRegressor
    )
)
let model = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingDir.url),
    parameters: parameters
)

I have also tried the same training source in the Create ML app; it runs through 'extracting features' and crashes at about 16k images processed. Thank you
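One possible workaround, as a hedged sketch rather than a confirmed fix (the ~16,384 cap appears to be a per-process IOSurface limit, which this does not lift): sample the training set down to a fixed number of images per label before training. The function name and temporary-directory layout below are assumptions for illustration; it expects the Create ML layout of one subdirectory per label.

import Foundation

// Hypothetical helper: copies at most `maxImagesPerLabel` images per class into a
// temporary directory so the total image count stays under the apparent limit.
func makeSampledTrainingDirectory(from source: URL, maxImagesPerLabel: Int) throws -> URL {
    let fm = FileManager.default
    let sampled = fm.temporaryDirectory.appendingPathComponent("sampled-training")
    try? fm.removeItem(at: sampled)                       // start fresh on each run
    try fm.createDirectory(at: sampled, withIntermediateDirectories: true)

    let labelDirs = try fm.contentsOfDirectory(at: source, includingPropertiesForKeys: nil)
    for labelDir in labelDirs where labelDir.hasDirectoryPath {
        let dest = sampled.appendingPathComponent(labelDir.lastPathComponent)
        try fm.createDirectory(at: dest, withIntermediateDirectories: true)
        let images = try fm.contentsOfDirectory(at: labelDir, includingPropertiesForKeys: nil)
            .shuffled()                                   // random sample per label
            .prefix(maxImagesPerLabel)
        for image in images {
            try fm.copyItem(at: image, to: dest.appendingPathComponent(image.lastPathComponent))
        }
    }
    return sampled
}

Passing the returned URL to .labeledDirectories(at:) trades training-set size for stability rather than removing the underlying limit.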
Replies: 1 · Boosts: 0 · Views: 482 · Oct ’24
AFM support for RAG Applications
I'm excited at the development possibilities presented by Apple Intelligence and have begun imagining retrieval augmented generation use cases. Writing tools suggest that this is possible, but I have not seen any direct statements by Apple regarding use of AFMs for RAG applications. Have any references to APIs or sample code for RAG applications been published?
Replies: 4 · Boosts: 0 · Views: 510 · Oct ’24
NLTagger not filtering words such as "and", "to", "a", "in"
What am I not understanding here? In short, the view loads text from the JSON descriptions, should filter out common words, and then return and display a list of the most used words. Debugging shows the words being identified by the code, but they are not filtered out.

private func loadWordCounts() {
    DispatchQueue.global(qos: .background).async {
        let fileManager = FileManager.default
        guard let documentsDirectory = try? fileManager.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false) else { return }
        let descriptions = loadDescriptions(fileManager: fileManager, documentsDirectory: documentsDirectory)
        var counts = countWords(in: descriptions)
        let tagsToRemove: Set<NLTag> = [
            .verb, .pronoun, .determiner, .particle,
            .preposition, .conjunction, .interjection, .classifier
        ]
        for (word, _) in counts {
            let tagger = NLTagger(tagSchemes: [.lexicalClass])
            tagger.string = word
            let (tag, _) = tagger.tag(at: word.startIndex, unit: .word, scheme: .lexicalClass)
            if let unwrappedTag = tag, tagsToRemove.contains(unwrappedTag) {
                counts[word] = 0
            }
        }
        DispatchQueue.main.async {
            self.wordCounts = counts
        }
    }
}
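One likely culprit, as a hedged sketch (assuming the display keeps zero-count entries): the loop sets counts[word] = 0 rather than removing the entry, so the tagged words survive in the dictionary. Removing the entries, and reusing a single NLTagger instead of creating one per word, would look roughly like this; filteringFunctionWords is a hypothetical helper name.

import NaturalLanguage

// Hypothetical helper: drops function words (conjunctions, prepositions, etc.)
// from a word-count dictionary instead of zeroing their counts.
func filteringFunctionWords(_ counts: [String: Int]) -> [String: Int] {
    let tagsToRemove: Set<NLTag> = [
        .verb, .pronoun, .determiner, .particle,
        .preposition, .conjunction, .interjection, .classifier
    ]
    let tagger = NLTagger(tagSchemes: [.lexicalClass])   // reuse one tagger for all words
    var filtered = counts
    for word in counts.keys {
        tagger.string = word
        let (tag, _) = tagger.tag(at: word.startIndex, unit: .word, scheme: .lexicalClass)
        if let tag, tagsToRemove.contains(tag) {
            filtered.removeValue(forKey: word)           // remove the entry, don't just zero it
        }
    }
    return filtered
}

Note that tagging a word in isolation gives NLTagger no sentence context, so some classifications may still be off; a fixed stop-word set is a common fallback.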
Replies: 0 · Boosts: 0 · Views: 549 · Oct ’24
Issues with Statsmodels
When I import statsmodels in a Jupyter notebook, I get the following error:

ImportError: dlopen(/opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/_fblas.cpython-312-darwin.so, 0x0002): Library not loaded: @rpath/liblapack.3.dylib
  Referenced from: <5ACBAA79-2387-3BEF-9F8E-6B7584B0F5AD> /opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/_fblas.cpython-312-darwin.so
  Reason: tried: '/opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/../../../../liblapack.3.dylib' (no such file), '/opt/anaconda3/lib/python3.12/site-packages/scipy/linalg/../../../../liblapack.3.dylib' (no such file), '/opt/anaconda3/bin/../lib/liblapack.3.dylib' (no such file), '/opt/anaconda3/bin/../lib/liblapack.3.dylib' (no such file), '/usr/local/lib/liblapack.3.dylib' (no such file), '/usr/lib/liblapack.3.dylib' (no such file, not in dyld cache).

What should I do?
Replies: 1 · Boosts: 0 · Views: 684 · Oct ’24
Core ML Model Performance report errors when including GPU/Neural Engine in compute unit selection
Hi, while trying to diagnose why some of my Core ML models run slower with their configuration set to compute units .CPU_AND_GPU compared to .CPU_ONLY, I've been attempting to create Core ML model performance reports in Xcode to identify the operations that are not compatible with the GPU. However, when selecting an iPhone as the connected device and a compute unit of 'All', 'CPU and GPU' or 'CPU and Neural Engine', Xcode displays one of the following two error messages:

"There was an error creating the performance report. The performance report has crashed on device"

"There was an error creating the performance report. Unable to compute the prediction using ML Program. It can be an invalid input data or broken/unsupported model."

The performance reports are successfully generated when selecting the connected device as iPhone with compute unit 'CPU only', or as Mac with any combination of compute units. Some of the models I have found the issue to occur with are stateful, some are not. I have tried to replicate the issue with some example models from the Core ML Tools stateful-model guide and the video "Bring your machine learning and AI models to Apple silicon". Running the performance report on a model generated from the Simple Accumulator example code succeeds with all compute-unit options, but with models from the toy attention and toy attention with kv-cache examples it only succeeds with compute units 'CPU only' when choosing iPhone as the device.

Versions I'm currently working with:
Xcode Version 16.0
macOS Sequoia 15.0.1
Core ML Tools 8.0
iPhone 16 Pro, iOS 18.0.1

Is there a way to avoid these errors? Or is there another way to identify which operations within a Core ML model are supported to run on the iPhone GPU/Neural Engine?
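On the last question, one alternative to Xcode's performance reports is to query per-operation device support programmatically with MLComputePlan (available on iOS 17.4/macOS 14.4 and later). A minimal sketch, assuming an ML Program model compiled to a .mlmodelc; the function name is a placeholder:

import CoreML

// Sketch: print the preferred and supported compute devices for each operation
// in a compiled ML Program, so GPU/Neural Engine fallbacks can be spotted.
func reportDeviceUsage(for modelURL: URL) async throws {
    let configuration = MLModelConfiguration()
    configuration.computeUnits = .all
    let plan = try await MLComputePlan.load(contentsOf: modelURL, configuration: configuration)

    guard case .program(let program) = plan.modelStructure else {
        print("Not an ML Program; structure inspection differs for other model types.")
        return
    }
    for (functionName, function) in program.functions {
        for operation in function.block.operations {
            let usage = plan.deviceUsage(for: operation)
            print(functionName,
                  operation.operatorName,
                  "preferred:", usage?.preferred as Any,
                  "supported:", usage?.supported as Any)
        }
    }
}

Operations whose supported list lacks the GPU or Neural Engine are the ones forcing fallbacks, which may also hint at why the on-device report fails for those models.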
Replies: 0 · Boosts: 0 · Views: 719 · Oct ’24
Missing parameter prompt for search Assistant Intents
Issue
When triggering an App Intent using assistant schemas from Apple Intelligence (voice or text), the app opens without prompting for search criteria.

How to repeat
This can be repeated in the example provided by Apple here: https://developer.apple.com/documentation/appintents/making-your-app-s-functionality-available-to-siri
1. Download the sample code
2. Build and run on Xcode 16.1 beta 3
3. Target iPhone 15 Pro Max on iOS 18.1 beta 7
4. Trigger Apple Intelligence
5. Enter prompt: "Search AssistantSchemasExample"

Expected behaviour
Apple Intelligence should prompt the user for the criteria and provide it to the app so that the experience is seamless for the end user. Otherwise, Assistant Intents are nothing more than deep links to search screens.

Notes
The example uses the @AssistantIntent(schema: .photos.search) intent, and I've found the issue is also present in other search intents:
@AssistantIntent(schema: .system.search)
@AssistantIntent(schema: .browser.search)

Questions
Has anyone managed to get the prompt to appear? Will this only function on iOS 18.2?
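For reference, the shape of such a search intent looks roughly like this (a hedged sketch of the assistant-schema pattern, not a verbatim copy of the sample; the type name and the perform body are placeholders). The criteria value is what Apple Intelligence would need to elicit from the user:

import AppIntents

// Sketch of a .photos.search assistant intent. The schema macro fixes the
// parameter shape: `criteria` is a StringSearchCriteria supplied by the system.
@AssistantIntent(schema: .photos.search)
struct SearchAssetsIntent: AppIntent {
    static let openAppWhenRun = true

    var criteria: StringSearchCriteria   // the search term Siri should prompt for

    @MainActor
    func perform() async throws -> some IntentResult {
        // Placeholder: route `criteria.term` to the app's search UI here.
        print("Searching for: \(criteria.term)")
        return .result()
    }
}

The reported behaviour suggests the intent runs with an empty criteria rather than the system prompting to fill it, which is what the poster is asking about.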
Replies: 3 · Boosts: 1 · Views: 533 · Oct ’24
Download AI 18.2
I've been waiting a long time for confirmation, and it finally seems to have arrived, but now the software for working with Playground and Genmoji will not download. For the past 8 hours the phone has been connected to the network; the download has said it is on hold for more than 20 minutes, and nothing happens. The most frustrating part is that there is no indication of how much is left to download.
Replies: 2 · Boosts: 0 · Views: 545 · Oct ’24