iOS is the operating system for iPhone.

Posts under iOS tag

200 Posts

Post

Replies

Boosts

Views

Activity

Unfinished transactions not being emitted on start of app
I'm using the iOS Simulator with a StoreKit configuration file. I can see that transactions have been made while the app was closed, but when I open the app from a cold start, my StoreKit 2 listener is never called with those updates so that it can finish them.

I've added a listener in application(_:didFinishLaunchingWithOptions:) like this:

func startObservingTransactions() {
    task = Task(priority: .background) {
        for await result in Transaction.updates {
            if case .verified(let transaction) = result {
                await transaction.finish()
            }
        }
    }
}

But the Transaction.updates loop never runs (I've added breakpoints to check). It is only ever entered when a purchase is made, or on subsequent transaction renewals while the app is open; only then does it also receive the previously unfinished transactions.

Steps to reproduce:

1. Create an app with a StoreKit config file (with sped-up transactions) to purchase an item.
2. Make a purchase, then quit the app.
3. Wait a bit so more transactions are generated while the app is closed.
4. Open the app from a cold start: none of the transactions are finished by the listener in your app.
5. Cancel the subscription via the transaction manager.
6. Close and reopen the app from a cold start: the first transaction is finished by the listener, but none of the others are.

Apple's docs say: "If your app has unfinished transactions, the listener receives them immediately after the app launches." Why is this not the case?
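For reference, a minimal workaround sketch (not from the original post): StoreKit 2 also exposes Transaction.unfinished, which can be drained explicitly at launch in addition to the Transaction.updates listener above. Whether this changes the Simulator behaviour described in the post is not confirmed here.

import StoreKit

// Sketch: walk the currently unfinished transactions once at launch.
func finishOutstandingTransactions() async {
    for await result in Transaction.unfinished {
        if case .verified(let transaction) = result {
            // Deliver the purchased content for `transaction` here before finishing it.
            await transaction.finish()
        }
    }
}

It could be kicked off from a Task early in the launch path, alongside the long-lived updates listener.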
15
2
5.7k
Feb ’25
On iOS 18, Mandarin is read aloud as Cantonese
Please include the line below in follow-up emails for this request.
Case-ID: 11089799

When using AVSpeechUtterance with the voice set to Mandarin, the text is spoken in Cantonese on iOS 18 if Siri is set to Cantonese. There is no such issue on iOS 17 or 16.

1. Set up the utterance:

let utterance = AVSpeechUtterance(string: textView.text)
let voice = AVSpeechSynthesisVoice(language: "zh-CN")
utterance.voice = voice

2. In the phone settings, Siri is set to Cantonese.
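As a hedged diagnostic sketch (not from the original report): enumerating the installed voices and pinning the utterance to a specific zh-CN voice by identifier can help confirm whether a Mandarin voice is being resolved at all on iOS 18. The function name and the fallback are illustrative.

import AVFoundation

// Sketch: list the installed zh-CN voices and pick one explicitly by identifier
// instead of relying on language-code lookup alone.
func mandarinVoice() -> AVSpeechSynthesisVoice? {
    let mandarinVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.language == "zh-CN" }
    mandarinVoices.forEach { print($0.identifier, $0.name) }
    return mandarinVoices.first ?? AVSpeechSynthesisVoice(language: "zh-CN")
}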
3
1
556
Feb ’25
Enabling the Just-In-Time compiler for "Emulators" on the App Store and/or enabling the hypervisor on the iPad
Hello, why won't Apple add a Just-In-Time compiler for "Emulators" in the App Store, and/or the hypervisor for newer devices? I feel like UTM (which is a PC emulator) and other apps that emulate need JIT to work properly; they would consume significantly less battery while emulating/virtualizing and would have noticeably better performance than without JIT. By the way, JIT is already being used on iPadOS/iOS 18.3/18.3.1 and on newer/older versions of it, so having it enabled at the choice of the app's developer would be more convenient than doing it with tools.

And why won't Apple let emulators on iPads and newer iPhones use the hypervisor? It's better than JIT but requires a good CPU, so it could be made available to people with newer/more powerful devices. The hypervisor is better than JIT by a lot, and removing it in iPadOS/iOS 18.4 was an unnecessary choice, because it had better potential for virtualization instead of emulation. I feel like enabling it on M1-M2 iPads and A14-A18 Pro and newer devices is just better than having it disabled; to unlock the fullest potential of the iPad, it needs apps like this to run instead of just high-graphics games or ordinary apps.
2
0
1.6k
Feb ’25
Nested method calls in `context.perform` with Swift 6
I'm calling a method with the context as a parameter, from within the context's perform block – is this really not legal in Swift 6?

actor MyActor {
    func bar(context: NSManagedObjectContext) { /* some code */ }

    func foo(context: NSManagedObjectContext) {
        context.performAndWait {
            self.bar(context: context)
            // WARN: Sending 'context' risks causing data races; this is an error in the Swift 6 language mode
            // 'self'-isolated 'context' is captured by a actor-isolated closure. actor-isolated uses in closure may race against later nonisolated uses
            // Access can happen concurrently
        }
    }
}

The warning appears when I call a method that takes a context parameter from within the performAndWait block.

Background: in my app I have methods that take in API data, and I need to call the same methods from multiple places with the same context to store it; I do not want to copy-paste the code and have hundreds of lines of duplicates. Is there a well-known "this is how you should do it" for situations like this?

This is related to a previous post I made, but it's a bit flimsy and got no response: https://developer.apple.com/forums/thread/770605
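One restructuring that is often suggested for this kind of duplication is to move the shared import logic into an extension on NSManagedObjectContext, so the perform closure only uses the context it already runs on. This is a hedged sketch, not a confirmed fix for this exact Swift 6 diagnostic; importPayload and the payload type are illustrative.

import CoreData

// Sketch: shared Core Data import logic lives on the context itself, so multiple
// call sites reuse it without duplicating the code.
extension NSManagedObjectContext {
    func importPayload(_ payload: [String: String]) {
        // Create or update managed objects here, using `self` as the context.
    }
}

func store(_ payload: [String: String], in context: NSManagedObjectContext) {
    context.performAndWait {
        context.importPayload(payload)
    }
}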
1
1
916
Feb ’25
OTP autocomplete not working as expected
I have a UITextField with its textContentType set to oneTimeCode. It works as expected if the incoming message is in English and contains the keyword "OTP". It doesn't work if the message is in Greek and the "OTP" keyword is also translated into Greek. Is the OTP keyword really required? Is there any alternative? Which keywords are recognized in each case? Are these keywords English-only? Thanks in advance!
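For reference, a minimal sketch of the field setup being described (the field name is illustrative):

import UIKit

// Sketch: a text field configured for one-time-code AutoFill from Messages.
let codeField = UITextField()
codeField.textContentType = .oneTimeCode
codeField.keyboardType = .numberPad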
3
0
917
Feb ’25
Detect change to app's Screen Time access
I'm creating an app that gamifies Screen Time reduction. I'm running into an issue with Apple's Screen Time settings: the user can disable my app's "Screen Time access" and thereby get around losing the game. Is there a way to detect when this setting is disabled for my app? I've tried using AuthorizationCenter.shared.authorizationStatus, but that didn't do the trick. Does anyone have any ideas?
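For reference, a hedged sketch of the FamilyControls check mentioned above, e.g. run when the app returns to the foreground. Whether it reflects the Settings toggle in this scenario is exactly what the post reports as unreliable, so treat it as an illustration of the API rather than a solution.

import FamilyControls

// Sketch: inspect the current Screen Time authorization state.
func logScreenTimeAuthorization() {
    switch AuthorizationCenter.shared.authorizationStatus {
    case .approved:
        print("Screen Time access is approved")
    case .denied:
        print("Screen Time access was denied or revoked")
    case .notDetermined:
        print("Screen Time access has not been requested yet")
    @unknown default:
        print("Unknown authorization state")
    }
}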
0
0
420
Feb ’25
Communicating between app & UI test runner
I'd like to set up a communication mechanism between the UI test runner and my iOS app. The purpose is to collect some custom performance metrics in addition to standard ones like scrollingAndDecelerationMetric. Let's say we measure some specific intervals in our code using signposts, then serialize the result into a structured payload and report it back to the runner.

So, are there any good options for that kind of IPC? The primary concern is running on the Simulator. However, since this is not a regular UI test but more of a performance UI test, and it is usually recommended to run those on a real device with release optimizations/flags in place, I wonder whether it is feasible to have it for the device too.
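One simple option that tends to work on the Simulator (a hedged sketch; the environment key, path, and assertions are illustrative): the runner passes a result-file path via launchEnvironment, the app serializes its signpost-derived payload there, and the test reads it back. On a physical device the runner and the app have separate sandboxes, so this particular approach generally does not carry over; a different channel (for example a loopback connection) would likely be needed there.

import XCTest

final class CustomPerfMetricsTests: XCTestCase {
    func testScrollingWithCustomMetrics() throws {
        // The runner chooses a file path and hands it to the app under test.
        let resultURL = FileManager.default.temporaryDirectory
            .appendingPathComponent("custom-metrics.json")

        let app = XCUIApplication()
        app.launchEnvironment["CUSTOM_METRICS_PATH"] = resultURL.path
        app.launch()

        app.swipeUp() // drive the scenario under measurement

        // The app side reads ProcessInfo.processInfo.environment["CUSTOM_METRICS_PATH"]
        // and writes its serialized metrics there; in real code, wait for the app to
        // signal completion (e.g. an on-screen marker) before reading the file.
        let payload = try Data(contentsOf: resultURL)
        XCTAssertFalse(payload.isEmpty)
    }
}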
0
0
418
Feb ’25
RealityView iOS Navigation
I have a visionOS app that I'm adding iOS support to, and I would like to keep using RealityView. I know there are the following modifiers to add some camera navigation:

.realityViewCameraControls(.orbit)
.realityViewCameraControls(.dolly)
.realityViewCameraControls(.pan)

But how can I add more than one? For example, I would like to orbit with one finger, pan with two fingers and dolly by pinching. Is this possible, and if so, can someone share some sample code on how to achieve that?

Thanks,
Guillermo
0
1
452
Feb ’25
Page Freeze Caused by Gesture
When pushing a page onto the navigation stack, changing the state of interactivePopGestureRecognizer causes the page to freeze, like this:

#import "ViewController.h"

@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    CGFloat red = (arc4random_uniform(256) / 255.0);
    CGFloat green = (arc4random_uniform(256) / 255.0);
    CGFloat blue = (arc4random_uniform(256) / 255.0);
    CGFloat alpha = 1.0;
    // self.view.backgroundColor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];

    UIButton *btn = [UIButton buttonWithType:UIButtonTypeCustom];
    btn.frame = CGRectMake(0, 0, 100, 44);
    btn.backgroundColor = [UIColor redColor];
    btn.center = self.view.center;
    [btn setTitle:@"push click" forState:UIControlStateNormal];
    [btn addTarget:self action:@selector(click:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:btn];
}

- (void)click:(id)sender {
    [self.navigationController pushViewController:[ViewController new] animated:YES];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    self.navigationController.interactivePopGestureRecognizer.enabled = YES;
}

@end
0
0
342
Feb ’25
Live caller id lookup - Which secret key is used for PIR database HE encryption on test env
Hi, I'm trying to set up a PIR service for Live Caller ID Lookup (in Python, but based on the Swift example: https://github.com/apple/live-caller-id-lookup-example). The Swift example provides utilities for database setup and encryption, but I can't find any specification of which key is used for database encryption, or of how the iOS system knows about this key in order to be able to construct the PIR requests. So my question is: how does the PIR service communicate the secret key to the iOS system, or vice versa? (Specific to the test environment, before onboarding.)
0
0
293
Feb ’25
[Error] 18.2.1 HealthKit steps
Starting from the date the app was updated, the step count data retrieved from HealthKit has been strange. On the other hand, step count data from dates before the update is imported normally. After deleting and reinstalling the app, the problem did not occur. Has anyone experienced something like this, or does anyone know the cause?
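The post doesn't show how the steps are read; for context, a typical cumulative-sum statistics query for a single day looks roughly like this (a hedged sketch; authorization, store setup, and error handling are assumed to be in place).

import HealthKit

// Sketch: sum the step count for a given day via HKStatisticsQuery.
func fetchSteps(on day: Date, from store: HKHealthStore,
                completion: @escaping (Double) -> Void) {
    let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!
    let start = Calendar.current.startOfDay(for: day)
    let end = Calendar.current.date(byAdding: .day, value: 1, to: start)!
    let predicate = HKQuery.predicateForSamples(withStart: start, end: end, options: .strictStartDate)

    let query = HKStatisticsQuery(quantityType: stepType,
                                  quantitySamplePredicate: predicate,
                                  options: .cumulativeSum) { _, statistics, _ in
        completion(statistics?.sumQuantity()?.doubleValue(for: .count()) ?? 0)
    }
    store.execute(query)
}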
2
0
694
Feb ’25
Using Flutter with CallKit
When using CallKit in my Flutter app, audio (both mic and speaker) stops working. When not using CallKit to answer calls, the app works fine. I am using the flutter_callkit_incoming plugin for CallKit and flutter_webrtc for the telephony. flutter_callkit_incoming's boilerplate code includes sections to uncomment when using WebRTC, and I have seen multiple suggested fixes saying to make sure to configure the shared audio session before the CallKit call is reported. Neither of these approaches seems to have worked.
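For reference, a hedged sketch of the native-side pattern those fixes usually describe: configure AVAudioSession for voice chat up front, and only start WebRTC audio once CallKit activates the session in the CXProviderDelegate callback. The RTCAudioSession calls shown in comments assume the Google WebRTC iOS framework; the Flutter plugin wiring around them is not shown.

import AVFoundation
import CallKit

// Sketch: audio session setup commonly used for VoIP + CallKit flows.
func configureAudioSessionForCall() {
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playAndRecord, mode: .voiceChat,
                             options: [.allowBluetooth, .defaultToSpeaker])
}

// In the CXProviderDelegate, start/stop the audio engine only on (de)activation:
// func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
//     // e.g. RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
// }
// func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
//     // e.g. RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
// }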
2
0
430
Feb ’25
Resolving URL from bookmark data doesn't automatically mount SMB volume on iOS
On macOS, the Finder allows connecting to a server and storing the login credentials. When creating a bookmark to a file on a server and resolving it again, the server is mounted automatically (unless I provide the option URL.BookmarkResolutionOptions.withoutMounting).

I just tried connecting to my Mac from my iPad via SMB in the Files app and storing a bookmark to a file on the server, but after disconnecting the server, trying to resolve the bookmark throws the following error (the text is translated into English from Italian):

Error Domain=NSFileProviderErrorDomain Code=-2001 "No file provider was found with the identifier "com.apple.SMBClientProvider.FileProvider"'" UserInfo={NSLocalizedDescription=No file provider was found with the identifier "com.apple.SMBClientProvider.FileProvider"., NSUnderlyingError=0x302a1a340 {Error Domain=NSFileProviderErrorDomain Code=-2013 "(null) "}}

Every time I disconnect and reconnect to the server, selecting the same file returns a different path. The first time I got

/private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/WtFD3Ausername/path/to/file.txt

The next time WtFD3A changed to EqHc2g, and so on. Is it not possible to automatically mount a server when resolving a bookmark on iOS?

The following code allows reproducing the issue:

struct ContentView: View {
    @State private var isPresentingFilePicker = false
    @AppStorage("bookmarkData") private var bookmarkData: Data?
    @State private var url: URL?
    @State private var stale = false
    @State private var error: Error?

    var body: some View {
        VStack {
            Button("Open") {
                isPresentingFilePicker = true
            }
            if let url = url {
                Text(url.path)
            } else if bookmarkData != nil {
                Text("couldn't resolve bookmark data")
            } else {
                Text("no bookmark data")
            }
            if stale {
                Text("bookmark is stale")
            }
            if let error = error {
                Text("\(error)")
                    .foregroundStyle(.red)
            }
        }
        .padding()
        .fileImporter(isPresented: $isPresentingFilePicker, allowedContentTypes: [.data]) { result in
            do {
                let url = try result.get()
                if url.startAccessingSecurityScopedResource() {
                    bookmarkData = try url.bookmarkData()
                }
            } catch {
                self.error = error
            }
        }
        .onChange(of: bookmarkData, initial: true) { _, bookmarkData in
            if let bookmarkData = bookmarkData {
                do {
                    url = try URL(resolvingBookmarkData: bookmarkData, bookmarkDataIsStale: &stale)
                } catch {
                    self.error = error
                }
            }
        }
    }
}
2
0
436
Feb ’25
Notification when mounting volume on iOS
On macOS we have didMountNotification, but there doesn't seem to be an equivalent on iOS. Is there a way to be notified when a volume is mounted on iOS? I would like to use it in the iOS app I'm currently porting from macOS, which starts a synchronization from the volume (previously selected in an NSOpenPanel) as soon as it is mounted.
3
0
390
Feb ’25
How Can I Enable NFC Functionality in an Apple Wallet Pass?
Hello, I am working on an Apple Wallet pass with NFC functionality but have been facing issues getting it to work. The pass gets added to Wallet, but the NFC feature does not seem to activate. Could someone provide a detailed, step-by-step process to properly enable NFC in an Apple Wallet pass?

Here is what I have done so far:

1. Set up a Pass Type ID and certificates: I have registered a Pass Type ID in my Apple Developer account and generated and installed the required certificates (Pass Type ID certificate and WWDR certificate).

2. Added the NFC field: I added the following nfc field to my pass.json file:

{
  "formatVersion": 1,
  "passTypeIdentifier": "pass.com.example.mypass",
  "serialNumber": "123456",
  "teamIdentifier": "TEAMID12345",
  "webServiceURL": "https://example.com/api/passes",
  "authenticationToken": "my_secure_token",
  "nfc": {
    "message": "Tap to unlock door",
    "encryptionPublicKey": "MY_ENCRYPTION_PUBLIC_KEY",
    "payload": "encrypted_nfc_payload"
  },
  "organizationName": "My Company",
  "description": "NFC-Enabled Access Pass",
  "logoText": "My NFC Pass",
  "foregroundColor": "rgb(255, 255, 255)",
  "backgroundColor": "rgb(0, 0, 0)",
  "barcode": {
    "format": "PKBarcodeFormatQR",
    "message": "https://example.com",
    "messageEncoding": "iso-8859-1"
  }
}

3. Tested the pass: The pass is added to Wallet, but NFC functionality is not working. When the nfc field is removed, the pass works fine without NFC.

Questions:

1. Could you provide a comprehensive list of the required steps to enable NFC in an Apple Wallet pass, including specific details on encryption, payload, and public key formatting?
2. Are there any additional configurations or settings that I might be missing?
3. Is there any official documentation or specific tool recommended for testing NFC-enabled passes?

Any guidance or solutions to enable NFC in this pass would be greatly appreciated. Thank you.
1
0
1k
Feb ’25
Content blocker not removing content
I have a content blocker that generally works correctly, but I need to block an element that contains certain text. For example, <span id="theId">Some text</span> is easy enough to block because I can target the id, but what if there is no id, or the id is completely random? What if it's just <span>Some text</span>? How do I block that?

Let's say this is my only content blocker rule:

[
  {
    "action": {
      "type": "css-display-none",
      "selector": ":has-text(/Some text/i)"
    },
    "trigger": {
      "url-filter": ".*"
    }
  }
]

No errors are seen when the rule is loaded, so it's syntactically correct; it just doesn't block the HTML. I gather this is because :has-text() works on attributes, not contents, so it works on alt, href, aria-label etc., but not on the contents of the element itself. How do I block "Some text" in my example above? Thanks!
2
1
648
Feb ’25
[iOS, SwiftUI] UIRefreshControl.tintColor is not the same as the given one
Hello! I'm trying to set a UIRefreshControl tint color:

.onAppear {
    UIRefreshControl.appearance().tintColor = UIColor.systemBlue
}

But instead of that color I get a different one: the color in the second picture is a high-contrast version of the first one. I can't understand why it works this way.

I also tried the following:

UIRefreshControl.appearance().tintColor = UIColor(red: 0, green: 0.478, blue: 1, alpha: 1) // doesn't work
UIRefreshControl.appearance().tintColor = UIColor(named: "RefreshControlColor") // doesn't work; here "High contrast" is set on and Universal.systemBlueColor is indicated

Perhaps I missed something?
0
0
199
Feb ’25
Issue Integrating C++ SDK
Hello Apple Team,

I'm trying to import the Autodesk FBX SDK into my Objective-C iOS project. The SDK is written in C++, but has support for the iOS and iOS Simulator architectures.

I've added the path to the include folder in the Header Search Paths. I've also added the paths to libfbxsdk.a in the Library Search Paths. Finally, I've added the libfbxsdk.a file to Link Binary With Libraries.

However, when I build the project, I get the following error:

building for 'iOS', but linking in object file (/Users/Lond/Documents/v2/Autodesk/iOS/2020.3.7/lib/ios/debug/libfbxsdk.a[28](fbxalloc.cxx.o)) built for 'macOS'

In the terminal, if I type the command

lipo -info libfbxsdk.a

I get the message

Non-fat file: libfbxsdk.a is architecture: arm64

confirming that I'm using the library for the correct architecture. Do I need to add any other configuration option (like another linker flag or something else)? I'm quite new to C++, and integrating a C++ SDK into iOS is not easy.

I'm using macOS Sonoma 14.6.1
Tested on Xcode 15.4 and 16.2
Target device: iPhone 13 Pro (iOS 17.6.1)
iOS FBX SDK version: 2020.3.7
Link to the SDK if needed: https://aps.autodesk.com/developer/overview/fbx-sdk

Any help would be greatly appreciated. Thank you
4
0
721
Feb ’25