We have received some information that, with the release of iOS 18, there have been notable changes in how this API behaves. Can the Apple team shed some light on this? On iOS 17 this worked without many issues; what has changed in iOS 18?
General
Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.
I’m trying to integrate Screen Time usage data into my iOS app.
The goal is to fetch the total time a user spends on their device (daily or weekly), and store this locally for analysis.
So far, I’ve explored the DeviceActivity and FamilyControls frameworks:
1. DeviceActivityReport works but seems tied to extensions that show reports, not directly fetching raw values inside the main app.
2. I haven't found a way to simply retrieve the total screen-on time (similar to what Settings → Screen Time shows).
My questions:
1. Is there any public API that allows retrieving the user's total Screen Time (like the one shown in Settings)?
2. If yes, what's the correct approach: should I use DeviceActivityMonitorExtension, FamilyActivitySelection, or another framework?
3. If not, is it expected that this data is only available in the Settings app and not exposed to developers?
Any guidance or official confirmation would be really helpful.
Thanks in advance!
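For reference, here is a minimal sketch of the host-app side of this (the context name, view name, and filter are illustrative); as far as I can tell, the totals themselves can only be rendered by a DeviceActivityReportExtension, which is why there is no documented way to read the raw values directly in the main app:

import SwiftUI
import DeviceActivity

// Host-app side only: the durations are computed and rendered by a separate
// DeviceActivityReportExtension and are not directly readable here.
struct ScreenTimeSummary: View {
    // Must match the context declared in the report extension.
    private let context = DeviceActivityReport.Context("Total Activity")

    var body: some View {
        DeviceActivityReport(
            context,
            filter: DeviceActivityFilter(
                segment: .daily(
                    during: Calendar.current.dateInterval(of: .day, for: .now)!
                ),
                users: .all,
                devices: .init([.iPhone])
            )
        )
    }
}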
We are a research team conducting a study that collects subjects' SensorKit speech data, and we've encountered some questions we couldn't resolve ourselves or by looking up the online SensorKit documentation:
Microphone activation: In general, how is the microphone turned on to capture a speech session, and how is each session determined to be an independent session?
Negative values: In the speech classification data, there are entries where some of the start and end values are negative (see screenshot below). How should we interpret and handle these values? Is it safe to filter them out?
Duplicated sessions: From the same screenshot you can see there are multiple session identifiers linked to the same subject with the same timestamp. What does this represent?
More negative values: The same question for the speech recognition data's average pause duration: what does the -1 mean, and should we remove those entries as well?
(Note that these screenshots omit subject IDs for privacy purposes, but each screenshot is from one subject.)
We greatly appreciate your time and help.
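While waiting for an authoritative answer, one conservative way to handle these entries is to treat negative start/end values and the -1 average pause duration as sentinels for "not measured" and keep them separate from the analysis set. The sketch below illustrates that filtering; the row type is a stand-in for however the export is actually structured, and the sentinel interpretation is an assumption, not documented behavior.

import Foundation

// Stand-in for one exported speech-classification / speech-recognition row;
// the real export schema will differ.
struct SpeechRow {
    let sessionIdentifier: String
    let start: TimeInterval
    let end: TimeInterval
    let averagePauseDuration: TimeInterval
}

// Assumption: negative start/end values and an averagePauseDuration of -1 are
// "not measured" sentinels rather than real measurements, so they are flagged
// instead of being silently mixed into the analysis.
func partitionRows(_ rows: [SpeechRow]) -> (valid: [SpeechRow], flagged: [SpeechRow]) {
    var valid: [SpeechRow] = []
    var flagged: [SpeechRow] = []
    for row in rows {
        let hasNegativeBounds = row.start < 0 || row.end < 0 || row.end < row.start
        let hasSentinelPause = row.averagePauseDuration == -1
        if hasNegativeBounds || hasSentinelPause {
            flagged.append(row)
        } else {
            valid.append(row)
        }
    }
    return (valid, flagged)
}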
Hello, I am building an iMessage extension for my app and I am struggling to figure out how to test it. The extension allows users to send their friends an interactive widget and the recipient experience is very important to test.
I tried to do it in the simulators, but simulators do not support iMessage. I got a second iPhone and created a sandbox account, but I cannot install TestFlight with the sandbox account, as this is not supported.
Reddit, Stack Overflow, ChatGPT, and Apple Developer Support also did not help. Can someone share their experience with testing the recipient experience of an iMessage extension?
I'm currently experimenting with AlarmKit. When I configure an alarm with a Relative schedule that never repeats, I find that it is still scheduled in the AlarmManager after it fires, even though the documentation says an alarm with no repeat schedule will be removed from the AlarmManager after firing.
I see this behavior both in my project and with the sample code AlarmKit-ScheduleAndAlert.
To reproduce: create a non-repeating alarm; after it fires, it will still be in the list, marked as Scheduled.
Am I doing something wrong or is this a bug?
Hello.
I've implemented the Live Caller ID Lookup feature in my app, but sometimes I get a weird error. When I call LiveCallerIDLookupManager.shared.refreshPIRParameters(...) from my app, it sometimes throws an error:
Error Domain=com.apple.CipherML Code=1100 "Unable to query status due to errors: The resource could not be loaded because the App Transport Security policy requires the use of a secure connection." UserInfo={NSLocalizedDescription=Unable to query status due to errors: The resource could not be loaded because the App Transport Security policy requires the use of a secure connection., NSUnderlyingError=0x118f65740 {Error Domain=NSURLErrorDomain Code=-1022 "The resource could not be loaded because the App Transport Security policy requires the use of a secure connection." UserInfo={NSLocalizedDescription=The resource could not be loaded because the App Transport Security policy requires the use of a secure connection., NSErrorFailingURLKey=http://www.example.com/config}}}
What does this error mean? And where did the example.com part come from? What should I do to get rid of this error? My Service URL is hardcoded in the Live Caller ID Lookup Extension of my app and it is definitely not example.com.
In release mode, the values are missing, and I don't know what's wrong. If I install the debug version first and then overwrite it with the release build, I can retrieve the values stored in debug mode. But if I uninstall completely and install the release version directly, it's empty.
I am developing a multi-timer app that works in background mode.
At first, I could run multiple timers in the background by using the 'audio' background mode with a silent WAV file.
However, the app was rejected: the 'audio' background mode should not be used by an app that is not an audio app.
I want to know how to develop a timer app that works in the background on iOS.
The native iOS timer alerts us at the scheduled time; I want to develop that kind of app.
Sincerely,
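For what it's worth, the usual App Store-friendly approach is to schedule a local notification for the moment the timer expires instead of keeping the app alive in the background. A minimal sketch (the identifier and text are placeholders):

import UserNotifications

// Schedule a local notification that fires when the timer expires, so nothing
// has to keep running in the background. Identifier and copy are placeholders.
func scheduleTimerNotification(after seconds: TimeInterval, id: String) {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
        guard granted else { return }
        let content = UNMutableNotificationContent()
        content.title = "Timer finished"
        content.body = "Your \(Int(seconds))-second timer is done."
        content.sound = .default
        let trigger = UNTimeIntervalNotificationTrigger(timeInterval: seconds, repeats: false)
        let request = UNNotificationRequest(identifier: id, content: content, trigger: trigger)
        center.add(request)
    }
}

// Example: scheduleTimerNotification(after: 300, id: "timer-1") for a 5-minute timer.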
I’m currently developing a spam number blocking app using CallKit.
I’ve confirmed that up to iOS 26 beta 5, there is a bug where number blocking doesn’t work.
In my current tests, the ringtone doesn’t sound and the blocking works fine, but the call still appears in the missed calls list, which is bothersome.
If the bug is fixed in future versions (as it was in previous versions), is there a way to block the number so that it also does not appear in missed calls?
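For context, blocking at the system level goes through a Call Directory extension; a minimal sketch of that extension follows (the class name and numbers are placeholders, and this on its own does not control whether blocked calls show up in the missed-call list):

import CallKit

// Minimal Call Directory extension sketch. Blocked numbers must be provided in
// ascending order as CXCallDirectoryPhoneNumber (Int64) values with country code.
final class CallDirectoryHandler: CXCallDirectoryProvider {
    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        let blockedNumbers: [CXCallDirectoryPhoneNumber] = [821012345678, 821087654321]
        for number in blockedNumbers.sorted() {
            context.addBlockingEntry(withNextSequentialPhoneNumber: number)
        }
        context.completeRequest()
    }
}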
Hi everyone,
I’m building a sports performance app for Apple Watch that uses the onboard IMU to analyze swings and impacts in sports like tennis and golf. The goal is to estimate club/racket head speed, ball speed, and shot quality in real time from wrist motion data.
With Core Motion, I can currently get deviceMotion updates at ~100 Hz. While this is fine for general movement tracking, the actual ball impact happens much faster — 5–10 ms in tennis and ~0.5 ms in golf. Many of the high-frequency vibration/impact components are missed at 100 Hz, making it hard to directly measure or more accurately estimate certain performance metrics.
Questions for Apple / community:
1. Is there a way to access raw accelerometer and gyroscope data at higher sampling rates (e.g., 500–1000 Hz) on Apple Watch?
2. If not, is this due to hardware limitations or an API/software constraint?
3. Are there any research, partner, or beta programs that allow deeper sensor access for sports-science use cases?
Even modest increases in IMU sampling could unlock more accurate ball-speed estimates, impact force analysis, and strike-quality detection without needing external sensors — making Apple Watch a best-in-class wearable for precision sports analytics.
Happy to share more about the current approach, sample data, and potential use cases if helpful.
Thanks,
Max
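For reference, a minimal sketch of the kind of Core Motion setup described above (names are illustrative); Core Motion clamps the requested interval to what the hardware and OS actually deliver, so this path tops out around 100 Hz in practice:

import CoreMotion

// Request the fastest documented device-motion rate (~100 Hz) and read the
// gravity-removed acceleration plus rotation rate for swing analysis.
let motionManager = CMMotionManager()

func startIMUUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0   // request 100 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion else { return }
        let a = motion.userAcceleration   // user acceleration, gravity removed (in g)
        let w = motion.rotationRate       // angular velocity (rad/s)
        // Feed (a, w) into the swing/impact analysis pipeline here.
        _ = (a, w)
    }
}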
Hi,
On iOS 26 beta, calls can no longer be reported by swiping left on a call.
This is unfortunately a breaking change.
I submitted a report on this in June with Feedback Assistant: FB17893517.
I hope it will get some more exposure by posting here.
Hello, I am trying to display basic Screen Time data on my main screen. On the initial load of the screen, the DeviceActivityReport renders correctly and is visible, but after the app has been in the background and I come back to it, the whole view is just blank. I don't think I'm doing anything special. Is this a known bug?
@main
struct MyActivityReportExtension: DeviceActivityReportExtension {
    var body: some DeviceActivityReportScene {
        // Create a report for each DeviceActivityReport.Context that your app supports.
        TotalActivityReport { totalActivity in
            TotalActivityView(totalActivity: totalActivity)
        }
        // Add more reports here...
    }
}

extension DeviceActivityReport.Context {
    // If your app initializes a DeviceActivityReport with this context, then the system will use
    // your extension's corresponding DeviceActivityReportScene to render the contents of the
    // report.
    static let totalActivity = Self("Total Activity")
}

struct TotalActivityReport: DeviceActivityReportScene {
    // Define which context your scene will represent.
    let context: DeviceActivityReport.Context = .totalActivity

    // Define the custom configuration and the resulting view for this report.
    let content: (String) -> TotalActivityView

    func makeConfiguration(representing data: DeviceActivityResults<DeviceActivityData>) async -> String {
        // Reformat the data into a configuration that can be used to create
        // the report's view.
        let formatter = DateComponentsFormatter()
        formatter.allowedUnits = [.day, .hour, .minute]
        formatter.unitsStyle = .abbreviated
        formatter.zeroFormattingBehavior = .dropAll

        let totalActivityDuration = await data.flatMap { $0.activitySegments }.reduce(0, {
            $0 + $1.totalActivityDuration
        })
        return formatter.string(from: totalActivityDuration) ?? "No activity data"
    }
}

struct TotalActivityView: View {
    let totalActivity: String

    var body: some View {
        VStack(alignment: .center, spacing: 4) {
            Text("Screen Time")
                .font(.system(size: 14, weight: .regular))
                .foregroundColor(.secondary)
                .frame(maxWidth: .infinity,   // stretch to the full cell width
                       alignment: .center)
            Text(totalActivity)
                .font(.system(size: 18, weight: .medium))
                .foregroundColor(.primary)
        }
    }
}
And I am using it in my main view:
private var analyticsSection: some View {
    HStack(spacing: 24) {
        // Some View
        DeviceActivityReport(
            DeviceActivityReport.Context(rawValue: "Total Activity"),
            filter: DeviceActivityFilter(
                segment: .weekly(
                    during: Calendar.current.dateInterval(
                        of: .weekOfYear, for: .now
                    )!
                ),
                users: .all,
                devices: .init([.iPhone, .iPad])
            )
        )
        .frame(maxWidth: .infinity)
        // another view
    }
    .frame(maxWidth: .infinity, maxHeight: showAnalytics ? 58 : 0)
    .padding(.horizontal, showAnalytics ? 24 : 0)
    .opacity(showAnalytics ? 1.0 : 0.0)
    .clipped()
}
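A workaround that sometimes helps with the blank-after-background symptom (an assumption on my part, not a documented fix) is to give the report view a new identity when the scene becomes active again, which forces SwiftUI to rebuild the extension-rendered content:

import SwiftUI
import DeviceActivity

// Hypothetical workaround: recreate the DeviceActivityReport when the app
// returns to the foreground by changing the view's identity.
struct AnalyticsSection: View {
    @Environment(\.scenePhase) private var scenePhase
    @State private var reportID = UUID()

    var body: some View {
        DeviceActivityReport(
            DeviceActivityReport.Context(rawValue: "Total Activity"),
            filter: DeviceActivityFilter(
                segment: .weekly(
                    during: Calendar.current.dateInterval(of: .weekOfYear, for: .now)!
                ),
                users: .all,
                devices: .init([.iPhone, .iPad])
            )
        )
        .frame(maxWidth: .infinity)
        .id(reportID)   // a new identity forces a fresh render
        .onChange(of: scenePhase) { _, newPhase in
            if newPhase == .active { reportID = UUID() }
        }
    }
}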
I have a TVTopShelfContentProvider that implements
func loadTopShelfContent() async -> (any TVTopShelfContent)?
When running on Xcode 26 b5 I am seeing the following error in swift 6 mode.
Non-Sendable type '(any TVTopShelfContent)?' cannot be returned from nonisolated override to caller of superclass instance method 'loadTopShelfContent()'
I'm not sure exactly what's changed here, as it used to compile just fine, but it's unclear now how I can work around this error or how the API is supposed to be used.
The following definition is enough to trigger the error in Swift 6 language mode.
import TVServices

class ContentProvider: TVTopShelfContentProvider {
    override func loadTopShelfContent() async -> (any TVTopShelfContent)? {
        return nil
    }
}
I can "fix" it by adding @preconcurrency to the TVServices import but it seems like this API is unusable currently? Or maybe it's user error on my part?
I have been using Universal Links since January of this year.
As of January, it was working fine, but when I checked its operation in August, it was no longer working properly.
After investigating, I believe it is not working because our firewall is blocking the Apple CDN's requests for our AASA file.
Our firewall blocks communication from outside Japan, and Apple's IP address (17.0.0.0/8) is whitelisted.
Does anyone know the hostname or IP address that is used to check AASA files?
If you know, please let me know.
Can't I just add up all of the accelerations from the accelerometer and then use this physics equation to get distance?
d = v(i) x t + (1/2) x a x t^2
In this:
v(i) would be 0
t = 1 second
a = all accelerometer readings added together over 1 second
Can't I just use this equation to get vertical velocity? A lot of people have said it is impossible, but it has been done with variometer apps. I can't figure out the code. Can anyone guide me in the right direction?
v(f) = v(i) + a x t
v(i) = 0
a = y-axis acceleration for 1 second
t = 1 second
Please let me know if this is possible.
Thank you so much for your help.
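In principle yes; a minimal sketch of that integration with Core Motion follows (an illustration, not a production variometer). It projects the gravity-removed user acceleration onto the gravity vector to get the vertical component and accumulates v(f) = v(i) + a x dt sample by sample; in practice the raw integral drifts quickly, which is why real variometer apps also use the barometer (CMAltimeter) or heavy filtering.

import CoreMotion

let motionManager = CMMotionManager()
var verticalVelocity = 0.0   // m/s, starting from v(i) = 0
let gravityConstant = 9.81   // m/s^2 per 1 g, since Core Motion reports acceleration in g

func startVerticalVelocityUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 0.01   // ~100 Hz, so dt is about 0.01 s
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion else { return }
        let dt = motionManager.deviceMotionUpdateInterval
        // Vertical acceleration = user acceleration projected onto the gravity vector,
        // negated so that "up" is positive, converted from g to m/s^2.
        let g = motion.gravity
        let ua = motion.userAcceleration
        let aVertical = -(ua.x * g.x + ua.y * g.y + ua.z * g.z) * gravityConstant
        // v(f) = v(i) + a x t, accumulated every sample.
        verticalVelocity += aVertical * dt
    }
}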
I'm trying to work with the beta version of the Declared Age Range framework based on an article's tutorial but am getting the following error:
[C:1-3] Error received: Invalidated by remote connection.
and AgeRangeService.Error.notAvailable is being thrown on the call to requestAgeRange. I'm using Xcode 26 beta 5 and my simulator is running the 26.0 beta. The iCloud account that I have signed into the simulator has a DOB set as well.
This is my full ContentView where I'm trying to accomplish this.
struct ContentView: View {
    @Environment(\.requestAgeRange) var requestAgeRange
    @State var advancedFeaturesEnabled = false

    var body: some View {
        VStack {
            Button("Advanced Features") {}
                .disabled(!advancedFeaturesEnabled)
        }
        .task {
            await requestAgeRangeHelper()
        }
    }

    func requestAgeRangeHelper() async {
        do {
            let ageRangeResponse = try await requestAgeRange(ageGates: 16)
            switch ageRangeResponse {
            case let .sharing(range):
                if let lowerBound = range.lowerBound, lowerBound >= 16 {
                    advancedFeaturesEnabled = true
                }
            case .declinedSharing:
                // Handle declined sharing
                break
            default:
                break
            }
        } catch AgeRangeService.Error.invalidRequest {
            // Handle invalid request (e.g., age range < 2 years)
            print("Invalid request")
        } catch AgeRangeService.Error.notAvailable {
            // Handle device configuration issues
            print("Not available")
        } catch {
            print("Other")
        }
    }
}
In beta 5 the custom sound configuration now works and it actually plays a sound when the alarm goes off, but the sound is played only once. Has anyone figured out how to put it on repeat, or do I have to wait on this for another couple of weeks 💀
I'm using the new AlarmKit framework to build a Swift app that lets users schedule multiple repeating alarms.
The goal is to allow users to stop all alarms for today if they wake up early, but the alarms should still ring on their scheduled days in the future (for example, every Monday).
What I tried:
When the user chooses to stop alarms for today, I delete all alarms and re-add them. However, this doesn't work as expected.
If today is Monday and I delete and re-add the alarm with .weekday = .monday, it still rings today. That means re-adding the alarm doesn't skip today's instance, even though it's repeating.
What I want to achieve:
Skip or suppress today's alarms when the user stops them manually
Keep the same alarms active for their scheduled days in the future
Questions:
Is there a way in AlarmKit to prevent a repeating alarm from ringing today if it was just re-added, or is there a better alternative to this problem?
Is the only workaround to delay re-adding until after today's alarms would have fired (see the sketch after this list)?
What is the best approach to achieve this?
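On the second question, here is a sketch of the date math for the "delay the re-add" workaround (pure Foundation; the helper name is hypothetical and nothing here is an AlarmKit API): find the next fire date strictly after now for the alarm's weekday and time, and only delete-and-re-add the repeating alarm once today's fire time has passed, so the re-added alarm cannot ring again today.

import Foundation

// Next fire date strictly after `reference` for a weekly alarm.
// weekday uses Calendar numbering: 1 = Sunday ... 7 = Saturday.
func nextFireDate(weekday: Int,
                  hour: Int,
                  minute: Int,
                  after reference: Date = .now,
                  calendar: Calendar = .current) -> Date? {
    var components = DateComponents()
    components.weekday = weekday
    components.hour = hour
    components.minute = minute
    // .nextTime returns today's occurrence only if it is still upcoming;
    // once today's time has passed, it returns next week's occurrence.
    return calendar.nextDate(after: reference,
                             matching: components,
                             matchingPolicy: .nextTime)
}

// Example: for a Monday 07:00 alarm, calling this at Monday 06:30 returns today
// 07:00; calling it at Monday 07:01 returns next Monday 07:00, so re-adding the
// alarm after today's 07:00 has passed skips today's ring.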
We are currently developing a VoIP application that supports the Local Push extension.
We discovered an issue in this app where the performEndCallAction response to reportCallWithUUID is occasionally slow (see below for details).
It usually works without any issues, so we believe there is no problem with the app's processing flow.
The issue occurs only very rarely, but each time it does there is a delay of about 60 seconds, which leads us to suspect some kind of problem on the iOS side, with fail-safe processing kicking in after 60 seconds.
Do you know of a workaround for this issue?
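For reference, a minimal sketch of the kind of end-call request being timed here (the subsystem string, names, and logging are illustrative, not our actual code):

import CallKit
import os

let callController = CXCallController()
let log = Logger(subsystem: "com.example.voip", category: "callkit")

// Request an end-call action and log how long CallKit takes to complete it,
// to capture the ~60 second outliers described above.
func performEndCallAction(callUUID: UUID) {
    let start = Date()
    let endAction = CXEndCallAction(call: callUUID)
    callController.request(CXTransaction(action: endAction)) { error in
        let elapsed = Date().timeIntervalSince(start)
        if let error {
            log.error("End call failed after \(elapsed) s: \(error.localizedDescription)")
        } else {
            log.info("End call completed in \(elapsed) s")
        }
    }
}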
A user of my AppKit document-based app brought to my attention that, after setting it as the default app to open a certain file with the extension .md (by choosing "File > Open With > Other" in the Finder, then selecting my app and enabling "Always open with"), trying to open the file with a double-click displays the warning "Apple could not verify [file] is free of malware that may harm your Mac or compromise your privacy".
This is what happens for me:
When keeping the default app for a .md file (Xcode in my case), the file opens just fine.
When choosing my app in the "File > Open With" menu, the file opens just fine in my app.
But when setting my app as the default app (see above), the warning is displayed.
From that moment on, choosing my app in the "File > Open With" menu doesn't work anymore. Selecting Xcode doesn't work either.
Only setting Xcode again as the default app allows me to open it in Xcode, but my app still isn't allowed to open it.
Is this a macOS issue, or can I do anything in my app to prevent it? Where should I start looking for the issue in my code?