Background Tasks


Request the system to launch your app in the background to run tasks using Background Tasks.

Posts under Background Tasks tag

153 Posts

Post · Replies · Boosts · Views · Activity

Sounds (beeps) & haptics not working anymore when in 'wrist down' mode (always on display)
Our watch app, Regatta Timer, is a specialised countdown timer for sailing competitions. It is crucial that the beeps and haptics continue while the wrist is down on Always On displays. We tried to enable this by adding a background mode, but that only works in the Xcode Apple Watch simulator, not on an actual device with an Always On display. Any idea how we can get this working on the Apple Watch hardware as well? In ContentView.swift we currently call WKInterfaceDevice.current().play(sound), but that doesn't work, regardless of whether we gate it on phase == .active or not.
STEPS TO REPRODUCE
Install on an actual device with an Always On display.
Start the countdown timer: beeps and sounds are OK (each minute, ...).
Lower the wrist: the countdown timer continues on the dimmed display, but the sounds and haptics stop working until you raise your wrist to wake the display.
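A minimal sketch of one direction worth testing on hardware, not a confirmed fix: wrapping the countdown in a WKExtendedRuntimeSession, which is designed to keep a watch app running (and its haptics firing) while the screen is dimmed. The class and delegate methods below are standard WatchKit; whether one of the supported session types (configured on the WatchKit target, not in code) covers a regatta timer is an assumption to verify.

    import WatchKit

    final class TimerSessionController: NSObject, WKExtendedRuntimeSessionDelegate {
        private var session: WKExtendedRuntimeSession?

        func beginCountdown() {
            let session = WKExtendedRuntimeSession()
            session.delegate = self
            session.start()                     // keeps the app active while the wrist is down
            self.session = session
        }

        func playMinuteBeep() {
            // With the session running, haptics should continue on the dimmed display.
            WKInterfaceDevice.current().play(.notification)
        }

        func endCountdown() {
            session?.invalidate()
            session = nil
        }

        // MARK: WKExtendedRuntimeSessionDelegate
        func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
        func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
        func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                    didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                    error: Error?) {
            session = nil
        }
    }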
Replies: 1 · Boosts: 0 · Views: 833 · Jan ’25
App being launched while device is locked
DESCRIPTION OF PROBLEM
Logs and data from our application indicate various errors that strongly suggest our application is being launched in a state in which the device is likely locked. We are looking for guidance on how to identify, debug, reproduce, and fix these cases. Our application does not use any of the common mechanisms for background activity, such as Background App Refresh, Navigation, or Audio.
Errors in our logs, such as "authorization denied (code: 23)" when trying to access a file in our app's container on disk (a simple disk cache for data our application uses), strongly suggest that the device is operating in a state, such as being locked, where our application lacks the permissions it would normally have during operation. Furthermore, attempts to access authentication information stored in the keychain also fail; we use kSecAttrAccessibleWhenUnlocked when accessing items we store in the keychain. We have investigated prewarming, as well as our notification extension that helps process incoming push notifications, but cannot find any way to recreate this behavior. Are there any steps Apple engineers can recommend to triage and debug this?
Some additional questions that would help us:
What are all of the symptoms we can look for if prewarming escapes the intended execution context?
What are all of the circumstances in which we would be unauthorized to access the app's documents/file directories even though access works correctly in normal operation?
STEPS TO REPRODUCE
Unfortunately, we are unable to forcibly reproduce this behavior in our application, so we're looking for guidance on how we might simulate it in Xcode / Instruments.
Are there tools that Apple provides that would allow us to simulate certain behaviors, like prewarming, to verify our application's functionality?
Are there other reasons our application might be launched while the device is locked?
Are there other reasons we would receive security errors when accessing the keychain or disk that are unrelated to the device being locked?
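Not an answer to why the launches happen, but a defensive sketch that may make them observable and harmless: gate access to protected files and kSecAttrAccessibleWhenUnlocked keychain items on protected-data availability, and defer the work until the keys become available. The cache-loading closure is a hypothetical stand-in for the app's own disk-cache read.

    import UIKit

    final class ProtectedWorkScheduler {
        private var observer: NSObjectProtocol?

        // Runs `work` immediately if the data protection keys are available,
        // otherwise waits for protectedDataDidBecomeAvailableNotification.
        func runWhenProtectedDataAvailable(_ work: @escaping () -> Void) {
            if UIApplication.shared.isProtectedDataAvailable {
                work()
                return
            }
            // Log this branch: it is direct evidence of running while the device is locked.
            observer = NotificationCenter.default.addObserver(
                forName: UIApplication.protectedDataDidBecomeAvailableNotification,
                object: nil,
                queue: .main
            ) { [weak self] _ in
                work()
                if let token = self?.observer {
                    NotificationCenter.default.removeObserver(token)
                    self?.observer = nil
                }
            }
        }
    }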
Replies: 1 · Boosts: 1 · Views: 567 · Jan ’25
Inquiry About Background Permission Issue in My App
I am writing to address a concern regarding the background permission functionality in my app, which is critical for ensuring user safety as they navigate various terrains. This feature also enables users to smoothly record their navigation tracks for review after their activities. Recently, I've noticed that this functionality is not working as seamlessly as before. Additionally, I observed that the app is not categorized under 'health and fitness'—could reclassifying it improve background activity? Before I delve into a detailed code review, I wanted to check if this issue might be related to sync or settings on the App Store side, such as permission configurations, app updates, or other related factors. Or, is it more likely an issue stemming from the app’s codebase?
Replies: 1 · Boosts: 0 · Views: 482 · Jan ’25
Tracking User Activity (Driving/Walking) Periodically with Background Execution (Background state & Terminated state)
I'm working on an application where, once the user starts driving, I need to periodically check their activity every 2.5 minutes (150 seconds) to determine whether they are still driving, walking, or have stopped. If they are still driving, I want to keep rescheduling the task until the user is no longer in a driving state. Currently, I'm using startMonitoringSignificantLocationChanges to wake the app when the user's location changes, which works as expected. However, the activity detection stops after the first result is received, and I can't continue tracking the user's activity in the background (or after the app is killed from the app switcher).
Here's my approach:
After receiving a significant location change, I start tracking the user's activity to check if they are driving or have stopped.
I reschedule this task every 2.5 minutes as long as the user remains in a driving state.
I need this process to run even when the app is in the background or terminated by the user.
Question: Is it possible to keep activity detection running periodically after receiving a location change, even when the app is in the background or terminated? What is the recommended way to implement this? I would appreciate any suggestions or best practices for achieving this functionality.
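A sketch of one pattern that avoids long-lived timers (which do not survive suspension or termination), not necessarily the recommended way: on every significant-change wake-up, query the last 150 seconds of recorded motion activity with CMMotionActivityManager instead of waiting for a live update. Authorization handling and persistence are omitted.

    import CoreMotion
    import CoreLocation

    final class DrivingMonitor: NSObject, CLLocationManagerDelegate {
        private let locationManager = CLLocationManager()
        private let activityManager = CMMotionActivityManager()

        func start() {
            locationManager.delegate = self
            locationManager.requestAlwaysAuthorization()
            // Relaunches the app into the background for significant movement,
            // even after the user terminates it, as long as authorization allows.
            locationManager.startMonitoringSignificantLocationChanges()
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            guard CMMotionActivityManager.isActivityAvailable() else { return }
            // Look back over the last 150 s of recorded activity instead of
            // waiting for a live update that may never arrive before suspension.
            let now = Date()
            activityManager.queryActivityStarting(from: now.addingTimeInterval(-150),
                                                  to: now,
                                                  to: .main) { activities, _ in
                let stillDriving = activities?.contains { $0.automotive && $0.confidence != .low } ?? false
                // Persist the result; the "next check" effectively happens at the
                // next significant location change rather than on a fixed timer.
                print("still driving:", stillDriving)
            }
        }
    }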
Replies: 1 · Boosts: 0 · Views: 602 · Jan ’25
Playing Timed Sound Effects in Background
Hi, I'm relatively new to iOS development and kindly ask for some feedback on a strategy to achieve this desired behavior in my app.
My question: What would be the best strategy for sound effect playback with precise timing while the app is in the background? Is this even possible?
Context: I created a basic countdown timer app (targeting iOS 17 with Swift/SwiftUI). Countdown sessions can last 30-60 minutes. When the timer is started, it progresses through a series of sub-intervals and plays a short sound for each one. I used AVAudioPlayer and everything works fine when the app is in the foreground. I'm considering switching to AVAudioEngine because precise timing is very important, and the AI tools I've asked suggest it would have better precision. I'm already setting "App plays audio or streams audio/video using AirPlay" in my Info.plist, and have configured AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: .mixWithOthers). Curiously, when testing on my iPhone 13 mini, sounds sometimes still play when the app is in the background, but not always.
What I've considered:
Background Tasks: Would they make any sense for this use case? Seemingly not, since the allowed time is short and limited by the system.
Pre-scheduling all sounds: Not sure this would even work, and it seems like a lot of memory would be needed (there could be hundreds of intervals).
ActivityKit alerts: Works, but with a ~50 ms delay, which is too long for my purposes.
Pre-rendering all SFX to one large audio file: Seems like a lot of work and processing time, and probably not worth it.
I hope there's a better solution. I'd really appreciate any feedback.
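A minimal sketch of the AVAudioEngine direction, assuming the "audio" background mode is enabled and the session stays active for the whole countdown (an engine that keeps rendering is what typically keeps an audio app alive in the background). The file URL and interval list are placeholders, and the player-timeline assumption noted in the comments should be verified against the AVAudioTime documentation.

    import AVFoundation

    final class IntervalBeeper {
        private let engine = AVAudioEngine()
        private let player = AVAudioPlayerNode()

        func start(beepURL: URL, intervalSeconds: [Double]) throws {
            let file = try AVAudioFile(forReading: beepURL)
            guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                                frameCapacity: AVAudioFrameCount(file.length)) else { return }
            try file.read(into: buffer)

            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers])
            try AVAudioSession.sharedInstance().setActive(true)

            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
            try engine.start()
            player.play()

            // Schedule every interval beep up front at a sample-accurate offset.
            // Times are expressed in the player node's own timeline, which starts
            // near zero when play() is called (verify nodeTime/playerTime details).
            let sampleRate = file.processingFormat.sampleRate
            for offset in intervalSeconds {
                let when = AVAudioTime(sampleTime: AVAudioFramePosition(offset * sampleRate),
                                       atRate: sampleRate)
                player.scheduleBuffer(buffer, at: when, options: [], completionHandler: nil)
            }
        }
    }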
Replies: 1 · Boosts: 0 · Views: 1.2k · Dec ’24
App cannot fetch any resource after a while
Hi everyone, we've run into an issue where, in some scenarios, our app cannot fetch any resources from the device (Photos and Contacts). One case we caught is putting the app in the background, spending time in other commonly used apps, and then coming back to our app; there is also a small chance of hitting the issue while actively using the application. In a cell, we fetch the image like this:
    imageFetchTask = Task {
        let image = await CompositionRoot.shared.photosManager.image(requestType: .imageCollections, forId: photoAsset.photoId)
        self.photoImageView.image(image)
    }
In the inner layers of this code we get the PHAsset and request the image:
    PHAsset.firstAsset(for: id)
    let manager = PHImageManager.default()
    manager.requestImage(for: asset, targetSize: request.targetSize, contentMode: .aspectFill, options: request.options) { (image, info) in
        continuation.resume(returning: image)
    }
We figured out that the issue is not limited to Photos; it also affects Contacts and any web request. Any help with this situation is much appreciated. Thanks
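Not the root cause of the broader authorization problem, but one detail worth checking in this code path: with the default .opportunistic delivery mode, requestImage can call its handler more than once (a degraded image first), which would resume a checked continuation twice. A sketch of a single-callback wrapper; the identifier-based fetch below stands in for the post's custom firstAsset(for:) helper.

    import Photos
    import UIKit

    func fetchImage(localIdentifier: String, targetSize: CGSize) async -> UIImage? {
        guard let asset = PHAsset.fetchAssets(withLocalIdentifiers: [localIdentifier],
                                              options: nil).firstObject else { return nil }
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat   // aim for a single result callback
        options.isNetworkAccessAllowed = true
        return await withCheckedContinuation { continuation in
            PHImageManager.default().requestImage(for: asset,
                                                  targetSize: targetSize,
                                                  contentMode: .aspectFill,
                                                  options: options) { image, _ in
                continuation.resume(returning: image)
            }
        }
    }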
Replies: 0 · Boosts: 0 · Views: 472 · Dec ’24
WidgetCenter.shared.reloadAllTimelines() doesn't work when the app is in the background
I have an iOS app, a watchOS app, and an iOS widget that shows the most recent data in the database. The watch app sends data to the iOS app over the WCSession, and it is received in session(didReceiveMessage, replyHandler). After that data is processed, reloadAllTimelines() is called. When running in the Simulator or on a device attached to the debugger, it works: the widget updates when the app is closed (in the background, even if force quit). But when running a TestFlight or App Store build, the data is still processed and saved to Core Data (I open the app and it's there), but the widget doesn't update. It seems that reloadAllTimelines only works when the app is in the foreground (at least in non-debug builds). I don't have an iOS 17 device to check, but I think this is a recent bug with iOS 18.
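For reference, a sketch of the receive path with the two details most often implicated when reloads only seem to work under the debugger: persist to storage the widget extension can actually read before requesting the reload, and request the reload on the main queue. The app-group suite name and message key are placeholders, not the poster's actual code.

    import WatchConnectivity
    import WidgetKit
    import Foundation

    final class SessionHandler: NSObject, WCSessionDelegate {
        func session(_ session: WCSession,
                     activationDidCompleteWith activationState: WCSessionActivationState,
                     error: Error?) {}
        func sessionDidBecomeInactive(_ session: WCSession) {}
        func sessionDidDeactivate(_ session: WCSession) {}

        func session(_ session: WCSession,
                     didReceiveMessage message: [String: Any],
                     replyHandler: @escaping ([String: Any]) -> Void) {
            // 1. Write to the shared app-group container so the widget process sees it.
            let shared = UserDefaults(suiteName: "group.com.example.myapp")
            shared?.set(message["latestValue"], forKey: "latestValue")

            // 2. Only after the data is durable, ask WidgetKit to rebuild the timelines.
            DispatchQueue.main.async {
                WidgetCenter.shared.reloadAllTimelines()
            }
            replyHandler(["status": "ok"])
        }
    }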
Replies: 2 · Boosts: 2 · Views: 606 · Dec ’24
App rejected guidelines performance 2.5.4 - UIBackgroundModes
Hi everyone, I'm encountering a recurring issue with my app submission, and I'd appreciate your insights. My app has been rejected due to Guideline 2.5.4 with the following feedback:
Guideline 2.5.4 - Performance - Software Requirements
The app continues to declare support for location in the UIBackgroundModes key in your Info.plist file but we are unable to locate any features besides employee tracking that require persistent location. Using the location background mode for the sole purpose of tracking employees is not appropriate. Please note we located the features of the app but the location background tracking of employees is not appropriate with this guideline.
Next Steps
If the app has a feature besides tracking employees that requires persistent location, reply to this message and let us know how to locate this feature. Otherwise, it would be appropriate to revise the app to include additional features for your users that require the persistent use of real-time location updates while the app is in the background.
My App's Use Case: The app is designed to support events where users can check in and check out. Persistent location tracking is essential for the following:
1. During Events: Tracking users' real-time location ensures they remain within the event boundaries. If a user exits the designated area, the system logs the occurrence for compliance and security purposes.
2. Workforce Monitoring: For work events, the app records working hours based on their presence within the event area. This ensures accurate logging of attendance and work durations.
Steps I've Taken:
Limited Scope of Tracking: Persistent location tracking is active only during event check-in and check-out periods. Outside of these periods, tracking is disabled.
User Consent: I've implemented clear permission requests and a privacy policy to explain how location data is used.
Info.plist Configuration: I've declared the UIBackgroundModes key with location to support background tracking.
Despite these measures, my app continues to be rejected with the feedback above. I believe my app's features align with the guidelines, as the location tracking is directly tied to event functionality and user benefit.
Questions:
1. How can I better explain this use case to Apple's review team to demonstrate compliance?
2. Are there any additional features or adjustments I should consider to ensure my app meets the guidelines?
3. Has anyone faced a similar issue with persistent location tracking, and how did you resolve it?
Thank you for your guidance and support!
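A minimal sketch of the "tracking only between check-in and check-out" behavior described above, assuming the location background mode and Always authorization are already in place. The check-in/check-out method names are placeholders for the app's own event flow, and this illustrates the code side only; it is not a guaranteed way to satisfy App Review.

    import CoreLocation

    final class EventLocationTracker: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        override init() {
            super.init()
            manager.delegate = self
            manager.desiredAccuracy = kCLLocationAccuracyHundredMeters
        }

        func eventDidCheckIn() {
            manager.allowsBackgroundLocationUpdates = true      // requires the "location" UIBackgroundModes entry
            manager.showsBackgroundLocationIndicator = true     // keeps the ongoing tracking visible to the user
            manager.startUpdatingLocation()
        }

        func eventDidCheckOut() {
            manager.stopUpdatingLocation()
            manager.allowsBackgroundLocationUpdates = false
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            // Compare against the event boundary and log exits for compliance here.
        }
    }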
Replies: 2 · Boosts: 0 · Views: 647 · Dec ’24
Speech synthesis from Safari app extension
I'm making a Safari extension for learning languages. I need speech synthesis for any language the user chooses to learn. I initially tried to make this work within JavaScript, but Safari 18 doesn't reliably list voices for all languages on the web SpeechSynthesis API as described here: https://stackoverflow.com/questions/79179072/how-do-you-use-a-japanese-voice-with-speechsynthesis-in-safari-ios-18 As a workaround, I've had to use AVSpeechSynthesizer in SafariWebExtensionHandler (NSExtensionRequestHandling implementation for the extension). This works in the simulator but not on a real device. I've found this note from Apple in a StackOverflow reply: "Safari extensions are very short-lived, hence not fit for audio playback or speech synthesis. Not being able to validate an app extension in Xcode with a manually-added plist entry for background audio is the designed behavior. The general recommendation is to synthesize speech using JavaScript in conjunction with the Web Speech API." Unfortunately, the suggestion to use the Web Speech API is unsuitable as I just explained. Is there a way to set up a background process in the host app that can do speech synthesis? The app extension would need a way to communicate with this process, and start it if it's not running. Is that possible?
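A sketch of the synthesis side only, in the containing app: iOS gives a Safari web extension no general way to launch or wake its host app on demand, so the inter-process part of the question (starting that process from the extension) is left open here, and the background-audio assumption is noted in the comments.

    import AVFoundation

    final class LanguageSpeaker: NSObject {
        private let synthesizer = AVSpeechSynthesizer()

        func speak(_ text: String, languageCode: String) {
            // A playback session lets speech continue if the app is backgrounded
            // mid-utterance (assumes the "audio" background mode is enabled).
            try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio)
            try? AVAudioSession.sharedInstance().setActive(true)

            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = AVSpeechSynthesisVoice(language: languageCode) // e.g. "ja-JP"
            synthesizer.speak(utterance)
        }
    }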
Replies: 0 · Boosts: 0 · Views: 606 · Dec ’24
How can I keep my app up to date with the server without throttling
I am trying to build a chat app. I am using FCM to deliver messages to my app, accompanied by some custom data such as the new message_data, the deleted message_id, and so on; each message needs to run the app in the background to do some processing and local database syncing. This continuous background processing is clearly not acceptable, as APNs imposes a per-device limit on background push notifications. How can I push message and action payloads without being throttled?
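One commonly suggested pattern, sketched here with placeholder payload keys: deliver user-visible chat pushes with "mutable-content": 1 and do the per-message work in a notification service extension, which runs on alert pushes rather than against the background-push budget, keeping silent (content-available) pushes for rare housekeeping. The sync helper is hypothetical, and the extension only gets a short, bounded slice of time.

    import UserNotifications

    final class NotificationService: UNNotificationServiceExtension {
        private var contentHandler: ((UNNotificationContent) -> Void)?
        private var bestAttempt: UNMutableNotificationContent?

        override func didReceive(_ request: UNNotificationRequest,
                                 withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
            self.contentHandler = contentHandler
            bestAttempt = request.content.mutableCopy() as? UNMutableNotificationContent

            // Sync the new or deleted message into a shared app-group store so the
            // main app picks it up on next launch (syncMessage is a hypothetical helper).
            if let messageData = request.content.userInfo["message_data"] as? [String: Any] {
                // syncMessage(messageData)
                bestAttempt?.body = (messageData["text"] as? String) ?? bestAttempt?.body ?? ""
            }
            contentHandler(bestAttempt ?? request.content)
        }

        override func serviceExtensionTimeWillExpire() {
            // Hand back whatever we have before the system cuts the extension off.
            if let contentHandler = contentHandler, let bestAttempt = bestAttempt {
                contentHandler(bestAttempt)
            }
        }
    }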
Replies: 2 · Boosts: 0 · Views: 444 · Dec ’24