Explore the power of machine learning and Apple Intelligence within apps. Discuss integrating features, share best practices, and explore the possibilities for your app here.

All subtopics

Post

Replies

Boosts

Views

Activity

tensorflow-metal problems (tf.random.normal) and disappointments
"Last year, I upgraded to an M2 Max laptop, expecting that tensorflow-metal would facilitate effective local prototyping utilizing the Apple Silicon's capabilities. It has been quite some time since tensorflow-metal was last updated, and there appear to be several unresolved issues noted by the community here. I've personally observed the following behavior with my setup: Without tensorflow-metal: import tensorflow as tf for _ in range(10): print(tf.random.normal((3,)).numpy()) [-1.4213976 0.08230731 -1.1260201 ] [ 1.2913705 -0.47693467 -1.2886043 ] [ 0.09144169 -1.0892165 0.9313669 ] [ 1.1081179 0.9865657 -1.0298151] [ 0.03328908 -0.00655857 -0.02662632] [-1.002391 -1.1873596 -1.1168724] [-1.2135247 -1.2823236 -1.0396363] [-0.03492929 -0.9228362 0.19147137] [-0.59353966 0.502279 0.80000925] [-0.82247525 -0.13076428 0.99579334] With tensorflow-metal: import tensorflow as tf for _ in range(10): print(tf.random.normal((3,)).numpy()) [ 1.0031303 0.8095635 -0.0610961] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] Given these observations, it seems there may be an issue with the randomness of tf.random.normal when using tensorflow-metal. My current setup includes MacOS 14.5, tensorflow 2.14.1, and tensorflow-macos 2.14.1. I am interested in understanding if there are known solutions or workarounds for this behavior. Furthermore, could anyone provide an update on whether tensorflow-metal is still being actively developed, or if alternative approaches are recommended for utilizing the GPU capabilities of this hardware?
1
0
57
12h
CreateML Spatial Unexpected Error
I tried to use the Create ML Spatial template, but an unexpected error occurs within 1-3 minutes. I have tried several times with the same result. Is the Spatial template not available on an M1 Mac? My development environment: Apple M1 Pro, macOS 15.0, Xcode 16.0 beta, Create ML 6.0 beta.
0
0
83
2d
Siri not recognizing the parameter in the phrase
I am trying to create an "OpenShowIntent" that allows the user to open a show from outside the app by either invoking the shortcut and provide a showName and/or by asking Siri "Open (.$showName) in (.applicationName)". Currently the shortcut works and but Siri doesn't recognize the showName and keeps on loading when I say the phrase with a show name. I was wondering where I am going wrong. Here are some of the code below: OpenShowIntent.swift: import Foundation import AppIntents import Models @available(iOS 16.0, *) struct OpenShowIntent: AppIntent { static var title: LocalizedStringResource = "Open a Show" static var openAppWhenRun = true @Parameter(title: "Show", optionsProvider: ShowEntityQuery()) var show: ShowEntity? @Parameter(title: "Show Name") var showName: String static var parameterSummary: some ParameterSummary { Summary("Open \(\.$showName) in (APP NAME)") } @MainActor func perform() async throws -> some IntentResult & ProvidesDialog { var showToOpen: ShowEntity if let show = show { showToOpen = show } else { let params = ElasticSearchParams(query: showName) let searchResults = try await IntentsHelper.getShows(searchParams: params) let entities = searchResults.map { show in ShowEntity(id: show.id, name: show.displayName, posterUrl: show.posterImageUrl) } showToOpen = try await $show.requestDisambiguation( among: entities, dialog: "Choose a show from this list?" ) } let deepLink = DeepLink( type: .show( showId: showToOpen.id, showDateStr: nil, voucher: nil, reviewModalParams: nil ) ) let url = try deepLink.createURL(source: nil) IntentsHelper.openLink(url: url) return .result(dialog: "Show '\(showToOpen.name)' opened successfully.") } } ShowEntity.swift: import Foundation import AppIntents import Models @available(iOS 16.0, *) struct ShowEntity: AppEntity { typealias DefaultQuery = ShowEntityQuery var id: Int var name: String var posterUrl: URL? static var typeDisplayRepresentation: TypeDisplayRepresentation = "Show" var displayRepresentation: DisplayRepresentation { var image: DisplayRepresentation.Image? if let imageUrl = posterUrl { image = DisplayRepresentation.Image(url: imageUrl) } return DisplayRepresentation( title: LocalizedStringResource(stringLiteral: name), subtitle: nil, image: image ) } static var defaultQuery = ShowEntityQuery() init(id: Int, name: String, posterUrl: URL?) 
{ self.id = id self.name = name self.posterUrl = posterUrl } } ShowEntityQuery.swift: import Foundation import AppIntents import Models @available(iOS 16.0, *) struct ShowEntityQuery: EntityStringQuery { func entities(for identifiers: [Int]) async throws -> [ShowEntity] { let params = ElasticSearchParams(showIds: identifiers) let searchResult = try await IntentsHelper.getShows(searchParams: params) return searchResult.map { show in ShowEntity(id: show.id, name: show.displayName, posterUrl: show.posterImageUrl) } } func suggestedEntities() async throws -> [ShowEntity] { let params = ElasticSearchParams( showIds: BookmarksManager.sharedManager.getBookmarkShowIdsAsArray() ) let searchResult = try await IntentsHelper.getShows(searchParams: params) return searchResult.map { show in ShowEntity(id: show.id, name: show.displayName, posterUrl: show.posterImageUrl) } } func entities(matching query: String) async throws -> [ShowEntity] { let params = ElasticSearchParams(query: query) print("entities(matching:) called with query: \(query)") let searchResult = try await IntentsHelper.getShows(searchParams: params) return searchResult.map { show in ShowEntity(id: show.id, name: show.displayName, posterUrl: show.posterImageUrl) } } } ShortcutsProvider.swift: import Foundation import AppIntents @available(iOS 16.0, *) struct TTShortcuts: AppShortcutsProvider { static var appShortcuts: [AppShortcut] { return [ AppShortcut( intent: OpenShowIntent(), phrases: [ "View show in \(.applicationName)", "Open \(\.$showName) in \(.applicationName)", "Show \(\.$showName) in \(.applicationName)", "Find \(\.$showName) in \(.applicationName)", "Search for \(\.$showName) in \(.applicationName)" ], shortTitle: "Open show", systemImageName: "pencil.circle" ) ] } static var shortcutTileColor: ShortcutTileColor = .blue }
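A hedged sketch of one possible direction (not verified against this project): Siri can only fill App Shortcut phrase parameters whose types conform to AppEntity or AppEnum, so a free-form String such as showName cannot be resolved from the spoken phrase. Referencing the entity-backed show parameter in the phrases instead may be what's missing:

// Hypothetical revision of the AppShortcut: the phrase embeds the AppEntity-backed
// `show` parameter rather than the free-form String `showName`.
AppShortcut(
    intent: OpenShowIntent(),
    phrases: [
        "View show in \(.applicationName)",
        "Open \(\.$show) in \(.applicationName)",
        "Find \(\.$show) in \(.applicationName)"
    ],
    shortTitle: "Open show",
    systemImageName: "pencil.circle"
)

If that is the issue, the values Siri can match by voice would come from ShowEntityQuery.suggestedEntities(), refreshed via TTShortcuts.updateAppShortcutParameters(), so only shows surfaced there would be recognized in the phrase.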
0
0
108
3d
Using MLHandActionClassifier with visionOS
How do I use either of these data sources with MLHandActionClassifier on visionOS?

MLHandActionClassifier.DataSource.labeledKeypointsDataFrame
MLHandActionClassifier.DataSource.labeledKeypointsData

visionOS ARKit hand tracking provides 27 joints with 3D coordinates, which differs from the 21 joints with 2D coordinates that these two data sources describe in their documentation.
0
0
88
4d
Use iPad M1 processor as GPU
Hello, I'm currently working on TinyML, or ML on the edge, using the Google Colab platform. Having exhausted my free compute units, I'm being prompted to pay. I've been considering leveraging the GPU capabilities of my M1 iPad and my Intel-based Mac. Both devices have Thunderbolt ports capable of sharing connections up to 30GB/s. Since I'm primarily using a classification model, extensive GPU usage isn't necessary. I'm looking for assistance or guidance on using the iPad's processor as an eGPU for my Mac, possibly through an API or Apple technology. Any help would be greatly appreciated!
0
0
107
4d
Matmul with quantized weight does not run on ANE with FP16 offset: `ane: Failed to retrieved zero_point`
Hi, the following model does not run on ANE. Inspecting with deCoreML I see the error: ane: Failed to retrieved zero_point.

import numpy as np
import coremltools as ct
from coremltools.converters.mil import Builder as mb
import coremltools.converters.mil as mil

B, CIN, COUT = 512, 1024, 1024 * 4

@mb.program(
    input_specs=[
        mb.TensorSpec((B, CIN), mil.input_types.types.fp16),
    ],
    opset_version=mil.builder.AvailableTarget.iOS18
)
def prog_manual_dequant(
    x,
):
    qw = np.random.randint(0, 2 ** 4, size=(COUT, CIN), dtype=np.int8).astype(mil.mil.types.np_uint4_dtype)
    scale = np.random.randn(COUT, 1).astype(np.float16)
    offset = np.random.randn(COUT, 1).astype(np.float16)
    # offset = np.random.randint(0, 2 ** 4, size=(COUT, 1), dtype=np.uint8).astype(mil.mil.types.np_uint4_dtype)
    dqw = mb.constexpr_blockwise_shift_scale(data=qw, scale=scale, offset=offset)
    return mb.linear(x=x, weight=dqw)

cml_qmodel = ct.convert(
    prog_manual_dequant,
    compute_units=ct.ComputeUnit.CPU_AND_NE,
    compute_precision=ct.precision.FLOAT16,
    minimum_deployment_target=ct.target.iOS18,
)

Whereas if I use an offset with the same dtype as the weights (uint4 in this case), it does run on ANE. Tested on coremltools 8.0b1, on macOS 15.0 beta 2/Xcode 15 beta 2, and macOS 15.0 beta 3/Xcode 15 beta 3.
0
0
131
6d
Div calculation issue in Metal
Hi, all. I've been writing various computational functions using Metal. However, in the following function, unlike + and *, there is an accuracy issue with the / operation. The function divides a matrix of shape [n, x, y] by a scalar [1]. Compared to numpy or torch, if I change the operator to * or + instead of /, I get exactly the same results, but with / there is a difference in the mean of more than 1e-5. (For reference, this was written with reference to the Metal kernel code in llama.cpp.)

kernel void kernel_div_single_f16(
        device const half * src0,
        device const half * src1,
        device       half * dst,
        constant int64_t & ne00,
        constant int64_t & ne01,
        constant int64_t & ne02,
        constant int64_t & ne03,
        uint3 tgpig[[threadgroup_position_in_grid]],
        uint3 tpitg[[thread_position_in_threadgroup]],
        uint3 ntg[[threads_per_threadgroup]]) {
    const int64_t i03 = tgpig.z;
    const int64_t i02 = tgpig.y;
    const int64_t i01 = tgpig.x;
    const uint offset = i03*ne02*ne01*ne00 + i02*ne01*ne00 + i01*ne00;
    for (int i0 = tpitg.x; i0 < ne00; i0 += ntg.x) {
        dst[offset + i0] = src0[offset + i0] / *src1;
    }
}

My machine is a MacBook Pro (16-inch, 2021) / macOS 12.5 / Apple M1 Pro. Are there any known issues related to division? Thanks in advance for your reply.
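A hedged guess at the cause, with an illustrative tweak (not a confirmed fix): Metal shaders are compiled with fast math by default, which allows a division to be rewritten as a multiply by a reciprocal and can cost precision in half-precision arithmetic; either disabling fast math on MTLCompileOptions or widening the arithmetic to float may remove the ~1e-5 drift. The loop below is the same as in the post, with only the division done in float:

// Sketch: perform the divide in float, then narrow the result back to half.
for (int i0 = tpitg.x; i0 < ne00; i0 += ntg.x) {
    const float a = (float) src0[offset + i0];
    const float b = (float) (*src1);
    dst[offset + i0] = (half) (a / b);
}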
0
0
114
6d
Training a Segmentation model
Hi there, I'm a Computer Science student with a 2019 MacBook Pro, and I'm thinking of buying a new Mac, either a Mac Studio or a MacBook Pro, to use for ML. I'm currently building a segmentation model and wondering whether I could use Core ML or the Apple Neural Engine in the new M3 chips to train it. Right now I'm using Colab and TensorFlow to create the model, but it's not doing the job; I keep running out of CUDA memory. Thanks :)
0
0
143
1w
Chaining app intents in code
I would like to split my intents into smaller intents with more atomic pieces of functionality, so that I can call one intent from another. For example:

struct SumValuesIntent: AppIntent {
    static var title: LocalizedStringResource { "Sum Values" }

    let a: Int
    let b: Int

    init(a: Int, b: Int) {
        self.a = a
        self.b = b
    }

    init() {
        self.init(a: 0, b: 0)
    }

    func perform() async throws -> some IntentResult {
        let sum = a + b
        print("SumValuesIntent:", sum)
        return .result(value: sum)
    }
}

struct PrintValueIntent: AppIntent {
    static var title: LocalizedStringResource { "Print Value" }

    let string: String

    init(string: String) {
        self.string = string
    }

    init() {
        self.init(string: "")
    }

    func perform() async throws -> some IntentResult {
        print("PrintValueIntent:", string)
        return .result()
    }
}

What is the best way to chain intents like these? I tried

.result(opensIntent: PrintValueIntent(string: String(describing: sum)))

as the return value of SumValuesIntent.perform, but that doesn't seem to work. Then I tried

try await PrintValueIntent(string: String(describing: sum)).perform()

instead, and that works, but I'm not sure it's the correct way to do it.
0
0
157
1w
EntityPropertyQuery with property from related entity
Hi, I am working on creating an EntityPropertyQuery for my app entity. I want the user to be able to use Shortcuts to search by a property of a related entity, but I'm struggling with the syntax. I know the documentation for EntityPropertyQuery suggests that this should be possible with a different initializer for the QueryProperty that takes an entityProvider, but I can't figure out how it works. For example, my CJPersonAppEntity has 'emails', which is of type CJEmailAppEntity, which has a property 'emailAddress'. I want the user to be able to find the person by looking up an email address. When I try to provide this as a Property to filter by, inside CJPersonAppEntityQuery, I get a syntax error:

static var properties = QueryProperties {
    Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
        person.emails // error
    }) {
        EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
        ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
    }
}

The error says "Cannot convert value of type '[CJPersonEmailAppEntity]' to closure result type 'CJPersonEmailAppEntity'". So it's not expecting an array but an individual email item. But how do I provide that without running the predicate query specified in the closure? I then tried something like this, just returning something without worrying about correctness:

Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
    person.emails.first ?? CJPersonEmailAppEntity() // satisfy compiler
}) {
    EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
    ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
}

That built the app, but it failed on the 'Extracting app intents metadata' build step:

error: Entity CJPersonAppEntity does not contain a property named emailAddress. Ensure that the property is wrapped with an @Property property wrapper

So I'm not sure what the correct syntax for handling this case is, and I can't find any other examples of how it's done. Would love some feedback on this.
0
0
180
1w
Muting Siri via AppIntents
We're using App Intents to launch and control our app via Siri. Siri's responses have been fairly random: some show a "Done" popup, others give a verbal confirmation, others say "I'm sorry, there's been a problem". The latter is bogus and doesn't look good to potential investors when the app is actually working fine. I haven't found any way in code to tell Siri to keep quiet and let us handle our own errors. Otherwise, is there a way to supply Siri with a dictionary of stored messages that could be triggered inside the app?
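A hedged sketch of one approach (the intent and helper names below are hypothetical, not from the post): have perform() catch failures itself and always return an explicit dialog, so the only spoken or "Done"-style feedback is text the app supplies rather than Siri's own error phrasing.

import AppIntents

struct StartWorkoutIntent: AppIntent {                // hypothetical intent
    static var title: LocalizedStringResource = "Start Workout"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        do {
            try await WorkoutManager.shared.start()   // hypothetical app call
            return .result(dialog: "Workout started.")
        } catch {
            // Returning a dialog instead of rethrowing avoids the generic
            // "I'm sorry, there's been a problem" response.
            return .result(dialog: "Couldn't start the workout. Open the app to finish setting up.")
        }
    }
}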
1
0
192
1w
Missing GPU implementation Op:StatelessRandomGetKeyCounter for the Embedding layer in tensorflow-metal
The Keras Embedding layer cannot be calculated on Metal because of the missing Op:StatelessRandomGetKeyCounter, as shown in this error message:

tensorflow.python.framework.errors_impl.InvalidArgumentError: Could not satisfy device specification '/job:localhost/replica:0/task:0/device:GPU:0'. enable_soft_placement=0. Supported device types [CPU]. All available devices [/job:localhost/replica:0/task:0/device:GPU:0, /job:localhost/replica:0/task:0/device:CPU:0]. [Op:StatelessRandomGetKeyCounter]

A workaround is to enable soft placement, but this obviously is slower:

tf.config.set_soft_device_placement(True)

Reporting it here as recommended by the TensorFlow Plugin Metal team.
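A hedged alternative to global soft placement (an untested sketch; the model shown is purely illustrative): scope only the model construction, and therefore the stateless-RNG weight-initializer ops, to the CPU, so training itself can still run on the Metal device.

import tensorflow as tf
from tensorflow.keras import layers, models

with tf.device("/CPU:0"):
    # Building the model here places the initializer ops (the ones that need
    # StatelessRandomGetKeyCounter) on the CPU.
    model = models.Sequential([
        layers.Embedding(input_dim=10_000, output_dim=64),
        layers.GlobalAveragePooling1D(),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.build(input_shape=(None, 128))

model.compile(optimizer="adam", loss="binary_crossentropy")
# model.fit(...) then executes on the GPU as usual.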
0
0
186
1w
DisplayRepresentation.Image and custom SF Symbol
Hi, I have an AppEnum and I try to use a custom SF Symbol as the DisplayRepresentation.Image, but it's not working: I get a blank image when the AppEnum picker appears. The custom SF Symbol is stored in the target's asset catalog and it works inside the target (e.g. Image(named: "custom_sfsymbol")). The issue occurs when I try to use it in a DisplayRepresentation:

static var caseDisplayRepresentations: [Self: DisplayRepresentation] = [
    .sample: DisplayRepresentation(
        title: "sample_title",
        image: DisplayRepresentation.Image(named: "custom_sfsymbol")
    ),
]

Can we use a custom SF Symbol in a DisplayRepresentation.Image?
1
0
141
1w
SiriKit Extension still needed with AppIntents?
I have followed the SoupChef example in migrating Custom Intents from SiriKit to AppIntents. However, we only require one iOS release back, so we can require iOS 17. Thus, I eliminated everything that was strictly for backwards compatibility, most notably the SiriKit Extension that required enormous amounts of code to try to coordinate with the real app which worked poorly anyway. I tested for example that an NFC tag Automation created in Shortcuts works to execute an AppIntent while the app is backgrounded. I am now receiving a beta report that indicates someone trying to execute one of our migrated AppIntents from their HomePod is not working, and they say it used to work sometimes (not all the time). I'm sure most such cases used to require the SiriKit Extension in the old SiriKit world. I am terrified that I may need to rebuild that monster once again when the new (to me) AppIntent API seemed so beautiful without it and seemed to work without it. The AppIntent API documentation seems to indicate that SiriKit Extensions are no longer related or required. What is the truth here? Do I need to re-implement everything twice in the SiriKit Extension like a barbarian, or can we live in the new world with AppIntents? Thank you.
1
0
202
1w