Posts under App & System Services topic

AppleScript "do shell script": how do I write the "test" command in Script Editor?
Hello. How do I write this command correctly in Script Editor on a MacBook, so that clicking "Run" reports the result: if there is no folder, report that there is no folder, and if the folder exists, report that it exists?

do shell script "test -d 'Users/user/Desktop/New folder'"

At the moment, if the folder exists, an empty string ("") is returned, and if the folder does not exist, Script Editor reports that an error occurred. In short, my task is to write a script that checks whether a folder exists.
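A minimal sketch of one way to get there, assuming the intended path is /Users/user/Desktop/New folder (note the leading slash, which the quoted command omits): `test` reports its result through its exit status rather than its output, and a non-zero status surfaces in AppleScript as an error that can be trapped.

```applescript
-- Check for a folder with `test -d`; a missing folder makes the shell
-- command exit non-zero, which AppleScript raises as a catchable error.
set folderPath to "/Users/user/Desktop/New folder" -- assumed path
try
	do shell script "test -d " & quoted form of folderPath
	display dialog "The folder exists."
on error
	display dialog "There is no folder."
end try
```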
14 replies · 0 boosts · 1.6k views · last active 1w ago
BGContinuedProcessingTask expiring unpredictably
I've adopted the new BGContinuedProcessingTask in iOS 26, and it has mostly been working well in internal testing. However, in production I'm getting reports of tasks failing when the app is put into the background.

A bit of info on what I'm doing: I need to download a large amount of data (around 250 files) and process the files as they come down. File sizes vary: for some tasks each file might be around 10MB, for others 40MB. The processing is relatively lightweight, but the volume of data means a task can take over an hour on slower internet connections (up to 10GB of data). I set the totalUnitCount based on the number of files to be downloaded, and I increment completedUnitCount each time a file is completed.

After some experimentation, I've found that smaller tasks (e.g. 3GB, 10MB per file) seem to be okay, but larger tasks (e.g. 10GB, 40MB per file) seem to fail, usually just a few seconds after the task is backgrounded (and without even opening any other apps). I think I've even observed a case where the task expired while the app was foregrounded!

I'm trying to understand what the rules are with BGContinuedProcessingTask, and I can see at least four possibilities that might be relevant:

1. Is it necessary to provide progress updates at some minimum rate? For my larger tasks, where each file is ~40MB, there might be 20 or 30 seconds between progress updates. Does this make it more likely that the task will be expired?

2. For larger tasks, the total time to complete can be 60–90 minutes on slower internet connections. Is there some maximum amount of time the task can run for? Does the system attempt some kind of estimate of the overall time to complete and expire the task on that basis?

3. The processing on each file is relatively lightweight, so most of the time the async stream is awaiting the next file to come down. Does the OS monitor the intensity of the workload and suspend the task if it appears to be idle?

4. I've noticed that the task UI sometimes displays a message, something along the lines of "Do you want to continue this task?" with "Continue" and "Stop" options. What happens if the user simply ignores or doesn't see this message? Even if I tap "Continue", the task still seems to fail sometimes.

I've read the docs and watched the WWDC video, but there's not a whole lot of information on the specific issues I mention above. It would be great to get some clarity on this, and I'd also appreciate any advice on alternative ways I could approach my specific use case.
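For concreteness, here is a minimal sketch of the task-driving side, with one progress unit per file. The API names follow the iOS 26 BackgroundTasks additions as presented at WWDC, but treat the exact shapes as assumptions; the identifier and the download helpers are hypothetical, and submitting the BGContinuedProcessingTaskRequest is elided.

```swift
import BackgroundTasks

let taskIdentifier = "com.example.bulkDownload" // hypothetical

func registerContinuedProcessing() {
    _ = BGTaskScheduler.shared.register(forTaskWithIdentifier: taskIdentifier,
                                        using: nil) { task in
        guard let task = task as? BGContinuedProcessingTask else { return }
        run(task)
    }
}

func run(_ task: BGContinuedProcessingTask) {
    var expired = false
    task.expirationHandler = { expired = true }

    let files = pendingFiles()
    task.progress.totalUnitCount = Int64(files.count)

    Task {
        for file in files where !expired {
            await downloadAndProcess(file)
            task.progress.completedUnitCount += 1 // one unit per finished file
        }
        task.setTaskCompleted(success: !expired)
    }
}

// Stubs so the sketch is self-contained; real implementations elided.
struct RemoteFile {}
func pendingFiles() -> [RemoteFile] { Array(repeating: RemoteFile(), count: 250) }
func downloadAndProcess(_ file: RemoteFile) async {}
```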
1 reply · 0 boosts · 75 views · last active 1w ago
Missing logs in OSLogStore
Hello, I need to monitor the device for an activity that is not supported by the ES framework. I can reliably monitor it with the correct filters using log stream, for example:

sudo log stream --info --style compact --predicate 'category = "X"'

But I need to provide that functionality through my application, so I created an instance of a log store, hoping to retrieve the necessary information that way. The problem is that the messages sometimes appear and sometimes don't. The log level I am interested in is info, about which the docs say: "The system stores info-level messages in memory buffers and, without a configuration change, purges the oldest messages as those buffers fill up." If I understand that correctly, the info messages are written to the buffer and not to the store, and only sometimes reach it. But that should be modifiable with a configuration change? How could I make such a change so that info logs are always saved to the store, and then retrieve them?
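For reference, a minimal sketch of the retrieval side, assuming the process is allowed to open the local store (which on macOS requires appropriate privileges) and using the hypothetical category "X"; it does not address the buffering/persistence question itself.

```swift
import Foundation
import OSLog

// Read the last five minutes of entries for one category from the
// local log store. Whether info-level messages ever reached the store
// is exactly the problem described above.
func recentEntries() throws -> [OSLogEntryLog] {
    let store = try OSLogStore.local()
    let position = store.position(date: Date().addingTimeInterval(-300))
    let predicate = NSPredicate(format: "category == %@", "X")
    return try store.getEntries(at: position, matching: predicate)
        .compactMap { $0 as? OSLogEntryLog }
}
```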
3 replies · 0 boosts · 311 views · last active 1w ago
Network devices may not be able to connect to Personal Hotspot.
Explanation of the issue: when tethering is enabled and a wireless connection is established, there are instances where an IP address is not assigned.

Steps to reproduce: enable iPhone tethering and connect wirelessly using 802.11ax.

Expected result: the iPhone assigns an IP address, enabling network connectivity.

Actual result: DHCP negotiation fails. After the client attempts communication with the DHCP server via DHCP Discover, a DHCP Offer is returned from the iPhone. If that Offer is missed, the client retries with another DHCP Discover; however, the iPhone then does not issue a DHCP Offer no matter how many times the client retries. The IP address is not assigned unless the wireless connection is disconnected and reconnected. If the response to the initial Discover is missed, does this invalidate subsequent Offer retries?

The above issue has been confirmed on the iPhone 17 Pro and iPhone 16. It does not appear to occur on the iPhone 15.
3 replies · 0 boosts · 54 views · last active 1w ago
Apple Pay Test cards not added to Wallet
For Apple Pay testing, I have tried the following:

1. Sign in to the sandbox account via developer settings (Settings > Developer > Sandbox Account), keeping my main Apple ID for everything else.
2. Add test cards to Wallet: try adding the test card numbers Apple provides in its documentation (MasterCard and Visa Debit, as we support only those).

Unfortunately, none of them can be added to Wallet; every attempt ends with 'Could Not Add Card'. I tried on devices with iOS 18+. Can anyone advise on this? Thanks
1 reply · 0 boosts · 78 views · last active 1w ago
The iPhone 17 series is unable to connect to the BK3432 Bluetooth module.
When our Bluetooth device is scanned and a connection is initiated through the app on an iPhone 17, the air log shows that the iPhone sends an LL_LENGTH_REQ to execute the Data Length Update Procedure. However, our peripheral does not support the Bluetooth LE Data Length Extension, so it responds with an LL_UNKNOWN_RSP PDU with the UnknownType field set to LL_LENGTH_REQ.

After receiving the LL_UNKNOWN_RSP, the iPhone 17 does not proceed with the subsequent Bluetooth LE service discovery. The connection is maintained until the peripheral actively disconnects. Once the peripheral disconnects and continues advertising, the iPhone 17 repeatedly tries to connect to the peripheral and repeats the process above, even if the app has been terminated.

According to the Bluetooth 4.2 core specification ([Vol. 6] Part B, Section 5.1.9), which can be found at https://www.bluetooth.com/specifications/specs/core-specification-amended-4-2/, the iPhone should accept the LL_UNKNOWN_RSP, terminate the Data Length Update Procedure, and proceed with the subsequent operations using the default minimum parameters.

Phenomenon: when the app calls the - (void)connectPeripheral:(CBPeripheral *)peripheral options:(nullable NSDictionary<NSString *, id> *)options method, the connection result callback is never received. After approximately 10 seconds, it fails with a callback carrying the message: central did fail to connect to peripheral: <TY : 45E4A697-31AE-9B5A-1C38-53D7CA624D8C, Error Domain=CoreBLEErrorDomain Code=400 "(null)">.
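For context, a minimal sketch of the app-side calls involved; everything except the CoreBluetooth API itself is hypothetical, and the failure callback below is the one that fires after roughly 10 seconds.

```swift
import CoreBluetooth

// Initiate a connection and observe the failure callback described above.
final class ConnectionObserver: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Scanning/retrieval of the peripheral elided.
    }

    func connect(_ peripheral: CBPeripheral) {
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didConnect peripheral: CBPeripheral) {
        print("connected: \(peripheral.identifier)")
    }

    func centralManager(_ central: CBCentralManager,
                        didFailToConnect peripheral: CBPeripheral,
                        error: Error?) {
        // On iPhone 17 this reportedly fires after ~10 s with
        // CoreBLEErrorDomain code 400.
        print("failed: \(String(describing: error))")
    }
}
```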
3 replies · 0 boosts · 196 views · last active 1w ago
Question about including all project classes in the ofClasses parameter when using NSKeyedUnarchiver.unarchivedObject(ofClasses:from:)
Hello, I have a question about data deserialization using NSKeyedUnarchiver in iOS SDK development.

Current situation:
Previously, we were using the NSKeyedUnarchiver.unarchiveObject(with:) function.
We have changed to the NSKeyedUnarchiver.unarchivedObject(ofClasses:from:) method to deserialize complex objects stored in UserDefaults.
We need to include all types in the ofClasses parameter, including Swift primitive types as well as the various custom classes and structs within the project.

Questions:
1. Implementation approach: is it a correct pattern to include all classes defined in the project in the ofClasses array? Is this approach recommended?
2. Runtime stability: when using this approach, is there a possibility of runtime crashes? Are there any performance issues?
3. Alternative methods: if the current approach is not the correct pattern, what alternatives should we consider?

Current code structure:
All model classes conform to the NSSecureCoding protocol.
We use the requiringSecureCoding: true parameter.
We use a whitelist approach, explicitly listing only allowed classes.

I would like to know if this structure is appropriate, or if we should consider a different approach. Thank you.
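As an illustration of the allowlist pattern mentioned under "Current code structure", here is a minimal sketch with a single hypothetical model class; the usual guidance is to list only the classes a given archive can legitimately contain, rather than every class in the project.

```swift
import Foundation

// Hypothetical NSSecureCoding-conforming model.
final class MyModel: NSObject, NSSecureCoding {
    static var supportsSecureCoding: Bool { true }
    let name: String

    init(name: String) { self.name = name }

    required init?(coder: NSCoder) {
        guard let name = coder.decodeObject(of: NSString.self,
                                            forKey: "name") as String? else { return nil }
        self.name = name
    }

    func encode(with coder: NSCoder) {
        coder.encode(name as NSString, forKey: "name")
    }
}

func decodeModel(from data: Data) throws -> MyModel? {
    // Allow only the classes this particular archive can contain.
    let allowed: [AnyClass] = [MyModel.self, NSString.self, NSArray.self, NSDictionary.self]
    return try NSKeyedUnarchiver.unarchivedObject(ofClasses: allowed,
                                                  from: data) as? MyModel
}
```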
4 replies · 0 boosts · 107 views · last active 1w ago
CallKit fails to activate the audio session more often after upgrading to iOS 18.4.1
Hi, we've noticed that this issue occurs more frequently after upgrading to iOS 18.4.1, and it can result in one-way audio. Our app uses CallKit with WebRTC to establish VoIP connections. However, on iOS 18.4.1, CallKit no longer triggers:

func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession)

We're currently comparing the occurrence rate across different iOS versions to better understand the impact. Could you please help analyze the root cause of this issue?
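For context, a minimal sketch of the delegate pairing involved; the comments mark where a (hypothetical) WebRTC audio unit would be started and stopped, and the report above is that didActivate sometimes never fires.

```swift
import CallKit
import AVFAudio

final class CallManager: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {
        // Tear down any in-flight calls here.
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Safe point to start WebRTC audio I/O.
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // Stop audio I/O; the session is no longer ours.
    }
}
```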
31 replies · 1 boost · 1.4k views · last active 1w ago
Regarding "Overview of app transfer"
My iPhone VoIP app, which I'm developing, uses the Apple Push Notification service (APNs). I have a question regarding the following statement in "Overview of app transfer > Apps using push notifications":

"You must manually reestablish push notification services if transferring an app that uses the Apple Push Notifications service (APNs). The recipient must create a new client SSL certificate using their developer account, as associated client SSL certificates, TLS certificates, and authentication tokens aren’t transferred."

Question: say the recipient of the app transfer creates new SSL certificates, TLS certificates, and authentication tokens. Afterward, we need to verify that APNs works correctly when the transferred app is combined with these new certificates and tokens. However, until the recipient finishes that verification, the transferor wants to keep the app available for download as before and remain able to use APNs. Is this possible?

More specifically, can the recipient test the app to be transferred on TestFlight before the transfer is completed, combining it with the new SSL certificates, TLS certificates, and authentication tokens? Reading "Initiate an app transfer", it mentions a "Pending App Transfer" status:

"After you initiate the transfer, the app stays in its previous status, with the Pending App Transfer status added, until the recipient accepts it or the transfer expires after 60 days."

During this "Pending App Transfer" status, can the recipient test the app on TestFlight? Also, if there are any documents describing these procedures, I would appreciate it if you could share them. Thank you very much.
2 replies · 0 boosts · 119 views · last active 1w ago
How do I make my app’s data persist across devices, updates, and reinstalls by storing it in the cloud?
I want to save data like images, text, and map views with SwiftUI. Right now the data is saved locally, but if you delete the app or buy a new iPhone, everything is lost. How can I make the information saved in my app survive updating the app, deleting the app, or moving to another iPhone, using SwiftUI? I have watched YouTube videos and I'm still confused. Please help.
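As one illustration of the cloud-backed direction the title suggests, here is a minimal sketch using iCloud key-value storage, which survives reinstalls and syncs across a user's devices. It requires the iCloud key-value store capability and only suits small values; images and map data would need CloudKit or iCloud Drive instead.

```swift
import Foundation

// Cloud-backed key-value storage; the "note" key is hypothetical.
let store = NSUbiquitousKeyValueStore.default

func saveNote(_ text: String) {
    store.set(text, forKey: "note")
    _ = store.synchronize() // request an upload soon
}

func loadNote() -> String? {
    store.string(forKey: "note")
}
```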
1 reply · 0 boosts · 66 views · last active 1w ago
AppleScript for the Music app no longer supports the current track property
AppleScript for the Music app no longer supports the current track property. Before macOS Tahoe, running the following script in Script Editor would return the current track information:

tell application "Music"
	return name of current track
end tell

However, when I run this script on a device with macOS 26 Tahoe, I receive this error:

error "Music got an error: Can’t get name of current track." number -1728 from name of current track

I've tested this extensively, and here are my findings:

Going to the "Songs" tab and playing something from there makes everything work.

Playing any song directly will make it work with current track UNLESS the song is NOT in your Music library (either added through Apple Music or uploaded). If you play a song not in your library, current track is not updated even if you clicked on it specifically.

Playing an album (in your library, obviously) makes all the tracks within it appear in current track until autoplay takes over. Any autoplayed track won't appear in current track even if it is in your library (unless: see the last bullet point).

Music played through the "Songs" tab always appears in current track, even once autoplay kicks in. I assume this is because this tab is an iTunes legacy (visually and under the hood) and doesn't use the modern autoplay. This tab also won't play non-library songs, unlike the "Albums" tab, which seems to use the modern autoplay and suffers the same symptoms as the "Recently Added", "Home", "Radio", etc. tabs.

Is this a bug, or has Apple simply deprecated this functionality?
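For anyone hitting the raw -1728 error, a small defensive wrapper is sketched below; it only avoids the error dialog and does not bring back the missing track information.

```applescript
-- Return the current track name, or a fallback when Music cannot
-- resolve `current track` (error -1728 on macOS 26, as described above).
tell application "Music"
	try
		return name of current track
	on error errMsg number errNum
		return "current track unavailable (" & errNum & ")"
	end try
end tell
```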
10 replies · 4 boosts · 599 views · last active 1w ago
Support for custom Matter endpoints, clusters and attributes
I am working on an app for a home automation device. If I were using HomeKit exclusively, I could add custom services, or custom characteristics on standard services, and these would all be reported to my app via HomeKit. There is sample code from Apple that demonstrates how to do this.

When a Matter device is commissioned using HomeKit, you might expect custom clusters and/or custom attributes in a standard cluster to be translated to appropriate HomeKit services and characteristics, but this doesn't appear to be the case. Is there a way to have HomeKit do this?

If not, it seems I would need to use Matter directly, rather than via HomeKit, to access the custom features. But if I commission the device using Matter in my app, then I understand a new fabric is created and the device would not show in the Home app. Maybe the user needs to commission the device twice, once with my custom app and once with the Home app? That seems like a poor user experience to me. Perhaps that is the price paid for using a cross-platform standard?

Is there a better way to get the same level of customization using Matter that I am able to get using HomeKit?
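To illustrate the HomeKit-side customization being compared against, here is a minimal sketch that reads a vendor-specific characteristic through HomeKit; the UUID string is hypothetical, and the surrounding app plumbing (an HMHomeManager, the HomeKit entitlement) is assumed.

```swift
import HomeKit

// Hypothetical vendor-specific characteristic type.
let customCharacteristicType = "0E6D24A1-5A44-4D3A-9F46-123456789ABC"

func readCustomCharacteristic(in home: HMHome) {
    for accessory in home.accessories {
        for service in accessory.services {
            for characteristic in service.characteristics
            where characteristic.characteristicType == customCharacteristicType {
                characteristic.readValue { error in
                    if error == nil {
                        print("custom value: \(String(describing: characteristic.value))")
                    }
                }
            }
        }
    }
}
```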
16 replies · 0 boosts · 2.4k views · last active 1w ago
Can you include an `alert` with a sound within the `end` event?
Q1. Can you place a sound on an end event? That doesn't seem to work for us.

Q2. Additionally, is there any way, after you send the end event, to still have the Live Activity remain in the Dynamic Island until the dismissal-date? Currently, when an end event is sent, the activity is abruptly removed from the Dynamic Island without any sound, and users are confused until minutes or hours later, when they see their Lock Screen.
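For reference, a sketch of the payload shape in question, following the documented ActivityKit remote push format; the content-state keys, alert text, and timestamps are hypothetical, and whether the sound is honored when event is end is exactly what Q1 asks.

```json
{
  "aps": {
    "timestamp": 1700000000,
    "event": "end",
    "dismissal-date": 1700003600,
    "content-state": {
      "status": "finished"
    },
    "alert": {
      "title": "Delivery complete",
      "body": "Your order has arrived.",
      "sound": "default"
    }
  }
}
```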
0 replies · 0 boosts · 64 views · last active 1w ago
BGContinuedProcessingTask: what's the point?
Hi, this post comes from the frustration of working with BGContinuedProcessingTask for almost two weeks, trying to get it to actually complete after the app is backgrounded. My process will randomly finish or not finish, and I have no idea why.

I'm properly using and setting:

task?.progress.totalUnitCount = [some number]
task?.progress.completedUnitCount = [increment as processed]

I know this, because it all looks proper as long as the app isn't backgrounded: the task will ALWAYS complete. So it's not a progress issue. The device has full power, as it is plugged in while I run from within Xcode, so it's not a power issue either.

Yes, the process takes a few minutes, but I thought that is the purpose of BGContinuedProcessingTask in iOS 26: long-running work that a user can place in the background and leave the app, assuming the work will actually finish. Why bother introducing a feature that only works with short tasks that don't actually need long running time in the first place?
7 replies · 0 boosts · 118 views · last active 1w ago
How to allocate contiguous memory in DriverKit?
We want to allocate a block of contiguous memory (≤1MB) for an audio ring buffer used for DMA, but we haven't found any explicit method in the DriverKit documentation for allocating contiguous memory. I'm aware that IOBufferMemoryDescriptor::Create can be used in DriverKit to allocate memory and share it with user space. However, is the allocated memory physically contiguous? Can it guarantee that when I subsequently call PrepareForDMA on an IODMACommand, there will be only one segment? Could you please help review this? Thank you!
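The docs don't appear to state a contiguity guarantee either way, but it can be checked at runtime. Below is a sketch under the assumption that the IOBufferMemoryDescriptor/IODMACommand signatures match the DriverKit headers; error paths are abbreviated.

```cpp
#include <DriverKit/IOService.h>
#include <DriverKit/IOBufferMemoryDescriptor.h>
#include <DriverKit/IODMACommand.h>
#include <os/log.h>

// Allocate the 1 MB ring, map it for DMA, and inspect the segment
// count; verify segmentsCount == 1 instead of assuming contiguity.
static constexpr uint64_t kRingSize = 1024 * 1024;

kern_return_t AllocateAudioRing(IOService *device)
{
    IOBufferMemoryDescriptor *ring = nullptr;
    kern_return_t ret = IOBufferMemoryDescriptor::Create(
        kIOMemoryDirectionInOut, kRingSize, 4096 /* alignment */, &ring);
    if (ret != kIOReturnSuccess) {
        return ret;
    }

    IODMACommandSpecification spec = {};
    spec.options = kIODMACommandSpecificationNoOptions;
    spec.maxAddressBits = 64;

    IODMACommand *dma = nullptr;
    ret = IODMACommand::Create(device, 0 /* options */, &spec, &dma);
    if (ret != kIOReturnSuccess) {
        ring->release();
        return ret;
    }

    uint64_t flags = 0;
    uint32_t segmentsCount = 8;          // in: capacity of `segments`
    IOAddressSegment segments[8] = {};
    ret = dma->PrepareForDMA(0 /* options */, ring, 0, kRingSize,
                             &flags, &segmentsCount, segments);
    if (ret == kIOReturnSuccess && segmentsCount != 1) {
        // Not physically contiguous: the buffer spans several segments.
        os_log(OS_LOG_DEFAULT, "ring split into %u segments", segmentsCount);
    }
    return ret;
}
```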
2 replies · 0 boosts · 86 views · last active 1w ago
iOS BLE Connection Timeout Policy: Per-Peripheral or Per-App Radio Session?
Hi Core Bluetooth team, I'm seeking official clarification on iOS's system-level BLE connection management policy, specifically regarding idle timeouts.

Question: when iOS disconnects a BLE peripheral with the error "The connection has timed out unexpectedly.", is this timeout enforced per peripheral (each connection has its own idle timer), or per app / per Bluetooth radio session (one idle timer for all BLE connections in the app)? In other words: does a single GATT operation (e.g., a write or notification) on any one peripheral reset the idle timeout for all other BLE connections in the same app?

Context (general, no app code): this is about system behavior, not a bug in any specific app. It applies to the foreground and to the bluetooth-central background mode, with no state restoration involved, on iOS 17–18 and all modern devices.

Why this matters: if the timeout is per app, one keep-alive can protect multiple peripherals, which simplifies design (see the sketch below). If it is per peripheral, each connection needs independent activity, which means higher power use.

Reference: the Core Bluetooth docs recommend "regular GATT operations" but don't specify scope, and WWDC sessions mention a "shared Bluetooth radio" but not timeout granularity.

Official answer requested: is the BLE idle timeout per peripheral or per app/radio session in iOS? Thank you for the authoritative guidance.
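A sketch of the keep-alive idea referenced above, assuming a hypothetical characteristic that is cheap to read; whether one such read on one peripheral refreshes the idle timer for the app's other connections is exactly the open question.

```swift
import Foundation
import CoreBluetooth

// Periodically issue a lightweight GATT read so the link is not idle.
final class KeepAlive {
    private var timer: Timer?

    func start(peripheral: CBPeripheral, characteristic: CBCharacteristic) {
        timer = Timer.scheduledTimer(withTimeInterval: 20, repeats: true) { _ in
            peripheral.readValue(for: characteristic)
        }
    }

    func stop() {
        timer?.invalidate()
    }
}
```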
0 replies · 0 boosts · 40 views · last active 1w ago
Issues calling the sandbox environment interface to verify receipts while developing the payment function
We are currently using the Apple V2 interface to develop our payment function. We generate a transaction ID on the mobile client and obtain the receipt. Now, when we call the Apple sandbox environment interface to verify the receipt, we always get a 404 error. The above is the request and response. The token used in it is the original data after decoding, and there is no problem with it. Could you help me figure out what other possible reasons there might be?
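For comparison, a minimal sketch of a Get Transaction Info call against the documented App Store Server API sandbox host; the transaction ID is hypothetical and generating the signed JWT is elided. An environment/transaction mismatch is one plausible source of 404s, though that is an assumption about this case.

```swift
import Foundation

// Hypothetical values; the JWT must be a valid ES256-signed App Store
// Server API token for the app's issuer and key.
let transactionID = "2000000123456789"
let jwt = "<signed token>"

func fetchTransactionInfo() async throws {
    let url = URL(string:
        "https://api.storekit-sandbox.itunes.apple.com/inApps/v1/transactions/\(transactionID)")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(jwt)", forHTTPHeaderField: "Authorization")

    let (data, response) = try await URLSession.shared.data(for: request)
    print((response as? HTTPURLResponse)?.statusCode ?? -1,
          String(data: data, encoding: .utf8) ?? "")
}
```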
0 replies · 0 boosts · 18 views · last active 1w ago
Can't manage my sandbox account
Hi, I can't get into the "Manage" sandbox account screen. Either I reach a screen asking for my password, where there is no way to click "next" or "login" (pressing "enter" on my keyboard has no effect), or I land directly on a "Cannot Connect" page. I have tried this two days in a row. I have tried turning the device off and on again, and logging out and back in. Device: iPhone 13 Pro Max, iOS 16.0.3 (also tried the version before this).
77 replies · 12 boosts · 35k views · last active 1w ago