Since the release of iOS 18.1, our users have been experiencing problems with our app, which relies on strobing the device torch.
We have narrowed this down to devices with an adaptive True Tone flash and have submitted a Radar: FB15787160.
The issue seems to be tied to ambient light levels. In a dark room, the torch strobes exactly as effectively as on previous iOS versions. In a bright room, outdoors, or near a window, the strobe runs for ~1 s, then the torch gets stuck on for around half a second (less frequently it gets stuck off), then it strobes again for ~1 s, and this behaviour repeats indefinitely.
If we move to a darker environment and background and then foreground the app (this is required), the issue is resolved until we move to an area with higher ambient light levels again. After a lot of debugging we also discovered that turning off "Auto-Brightness" in Settings -> Accessibility -> Display & Text Size resolves the issue.
We have also reviewed Console.app logs from when the issue occurs, and the ambient light readings are quite erratic at that point: they jump from ~100 lux to ~8000 lux right as the issue starts (seemingly because the rear sensor is picking up the torch itself). With "Auto-Brightness" turned off, these readings appear to stay at lower levels.
This renders the primary use case of our app essentially useless, so it would be great to get to the bottom of it. We can't even reliably detect the condition in-app, since, as I understand it, SensorKit access is restricted to research applications and requires a review process with Apple.
Edit: it's worth noting that other apps with strobe functionality are affected in exactly the same way.
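For context, a torch strobe of this kind is typically driven with the standard AVCaptureDevice torch API; the sketch below is illustrative only (the interval and torch level are example values, not our exact implementation):

import AVFoundation

// Minimal torch strobe sketch. Interval and level are illustrative values.
final class TorchStrober {
    private var timer: Timer?
    private var torchOn = false

    func start(interval: TimeInterval = 0.05, level: Float = 1.0) {
        guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            guard let self else { return }
            do {
                try device.lockForConfiguration()
                defer { device.unlockForConfiguration() }
                if self.torchOn {
                    device.torchMode = .off
                } else {
                    try device.setTorchModeOn(level: level)
                }
                self.torchOn.toggle()
            } catch {
                print("Torch configuration failed: \(error)")
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}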
Hi there.
We are trying to integrate SensorKit into our app to explore the quality of accelerometer data recorded even while the app is terminated. So far we have managed to get everything working, including the fetch, except that the SRSensorReaderDelegate never seems to reach
func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, didFetchResult result: SRFetchResult<AnyObject>) -> Bool { ... }
Any clue as to what we need to adjust in our code to get the FetchResult?
import Foundation
import SensorKit
import CoreMotion
import os.log
class SensorKitDataManager: NSObject, SRSensorReaderDelegate {
static let shared = SensorKitDataManager()
// Sensor Readers
let accelerometerReader = SRSensorReader(sensor: .accelerometer)
let rotationRateReader = SRSensorReader(sensor: .rotationRate)
let deviceUsageReader = SRSensorReader(sensor: .deviceUsageReport)
let phoneUsageReader = SRSensorReader(sensor: .phoneUsageReport)
let wristUsageReader = SRSensorReader(sensor: .onWristState)
var startTime: CFTimeInterval = CFTimeInterval(Date().timeIntervalSince1970)
var endTime: CFTimeInterval = CFTimeInterval(Date().timeIntervalSince1970)
override init() {
super.init()
configureSensorReaders()
}
// Configure sensor readers and set delegate
private func configureSensorReaders() {
if SRSensorReader(sensor: .accelerometer).authorizationStatus == .authorized {
accelerometerReader.delegate = self
}
...
}
func sensorReaderWillStartRecording(_ reader: SRSensorReader) {
print("\(reader.description) Delegate starts recording")
}
func sensorReader(_ reader: SRSensorReader, startRecordingFailedWithError error: Error)
{
print("\(reader.description) Delegate failed recording")
}
func sensorReader(_ reader: SRSensorReader, didChange authorizationStatus: SRAuthorizationStatus) {
if reader.sensor == .accelerometer {
if authorizationStatus == SRAuthorizationStatus.authorized {
accelerometerReader.startRecording()
}
else if authorizationStatus == SRAuthorizationStatus.denied {
accelerometerReader.stopRecording()
}
}
...
}
// Request SensorKit Authorization
func requestAuthorization() {
if UserDefaults.standard.bool(forKey: "JTrack_accelerometerEnabled") && accelerometerReader.authorizationStatus == .notDetermined {
SRSensorReader.requestAuthorization(sensors: [.accelerometer]) { error in
if let error = error {
os_log("Authorization denied: %@", log: OSLog.default, type: .error, error.localizedDescription)
} else {
os_log("Authorization granted for accelerometer sensor", log: OSLog.default, type: .info)
}
}
}
...
self.startRecordingIfAuthorized()
}
// Start recording for each authorized sensor
private func startRecordingIfAuthorized() {
if accelerometerReader.authorizationStatus == .authorized {
accelerometerReader.startRecording()
}
...
}
func fetchAllDataSinceJoined(from startTime: CFTimeInterval, to endTime: CFTimeInterval) {
self.startTime = startTime
self.endTime = endTime
if accelerometerReader.authorizationStatus == .authorized {
accelerometerReader.fetchDevices()
}
....
}
func stopAllRecordings() {
if accelerometerReader.authorizationStatus == .authorized {
accelerometerReader.stopRecording()
}
...
}
func sensorReader(_ reader: SRSensorReader, didFetch devices: [SRDevice]) {
let now = CFTimeInterval(Date().timeIntervalSince1970)
// Ensure the data is at least 24 hours old
let holdingPeriod: CFTimeInterval = 24 * 60 * 60 // 24 hours in seconds
let earliestFetchTime = now - holdingPeriod
// Adjust the start time if it's within the holding period
let adjustedStartTime = min(startTime, earliestFetchTime)
// If adjustedStartTime is after endTime, no data is available for fetching
guard adjustedStartTime < endTime else {
print("No data available to fetch as it falls within the 24-hour holding period.")
return
}
let fetchRequest = SRFetchRequest()
fetchRequest.from = SRAbsoluteTime(adjustedStartTime)
fetchRequest.to = SRAbsoluteTime(endTime)
// Log information about the devices that contributed data
for device in devices {
print("Device model: \(device.model), OS version: \(device.systemVersion), Identifier: \(device.description)")
if device.model == "iPhone" {
fetchRequest.device = device
}
}
if accelerometerReader.authorizationStatus == .authorized {
accelerometerReader.fetch(fetchRequest)
}
...
}
// SensorKit Delegate Methods
func sensorReader(_ reader: SRSensorReader, didCompleteFetch fetchRequest: SRFetchRequest) {
os_log("Fetch completed for sensor: %@", log: OSLog.default, type: .info, reader.sensor.rawValue)
}
func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
if reader.sensor == .accelerometer {
....
}
....
}
func sensorReaderDidStopRecording(_ reader: SRSensorReader) {
print("\(reader.description) Delegate stops recording")
}
func sensorReader(_ reader: SRSensorReader, stopRecordingFailedWithError error: Error) {
print("\(reader.description) Delegate failed stopping")
}
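For reference, here is a stripped-down version of what I believe the fetch path needs to look like. Two hedged assumptions are baked in: the callback must use the generic SRFetchResult<AnyObject> form, and the request times are converted with SRAbsoluteTime.fromCFAbsoluteTime(_cf:) rather than built directly from a Unix timestamp (as in fetchAllDataSinceJoined above, which may not land in the window you expect). Please correct me if either assumption is wrong:

import CoreMotion
import SensorKit

// Sketch only: generic fetch-result callback plus a fetch window built with
// SRAbsoluteTime.fromCFAbsoluteTime(_cf:), which takes a CFAbsoluteTime
// (seconds since 2001-01-01, i.e. timeIntervalSinceReferenceDate). Verify the
// exact spelling against the SDK headers; it is an assumption here.
final class MinimalAccelerometerFetcher: NSObject, SRSensorReaderDelegate {
    let reader = SRSensorReader(sensor: .accelerometer)

    func fetch(from start: Date, to end: Date, on device: SRDevice) {
        reader.delegate = self
        let request = SRFetchRequest()
        request.device = device
        request.from = SRAbsoluteTime.fromCFAbsoluteTime(_cf: start.timeIntervalSinceReferenceDate)
        request.to = SRAbsoluteTime.fromCFAbsoluteTime(_cf: end.timeIntervalSinceReferenceDate)
        reader.fetch(request)
    }

    // Accelerometer samples arrive as CMRecordedAccelerometerData.
    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        if let sample = result.sample as? CMRecordedAccelerometerData {
            print("x: \(sample.acceleration.x), y: \(sample.acceleration.y), z: \(sample.acceleration.z)")
        }
        return true
    }
}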
I'm working in the health field with a few apps. According to science, one of the most important parts of staying healthy is everyday walking.
But it is not about walking slowly. You need to reach a certain speed (not running or even jogging), enough to raise your pulse. This is when you get the "real health effect". In general it is around 6 km/h.
It would be great if Apple could make this info available to us developers. I think lots of developers would be happy to use this to make better apps and help more people live healthier lives.
Looking forward to getting some feedback on this.
Thank you!
Cheers
Peter
Hello, I am currently developing an application using SensorKit to retrieve visit data. While the data retrieval works smoothly on one iPhone (iPhone 14, iOS 18.0.1), it fails on other devices, including:
iPhone 15 Pro Max with iOS 18.1 Beta
Another iPhone 14 with iOS 18.0
I’ve verified that the entitlements are configured properly, and the app has the necessary SensorKit visit permissions across all devices. Despite these steps, only one of the phones is able to retrieve the visit data correctly.
Is there any minimum hardware requirement or compatibility issue with certain models or configurations that I should be aware of for using SensorKit visits?
Any guidance or insight would be greatly appreciated!
Thank you.
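When the same build behaves differently across devices like this, one quick comparison is to log what the visits reader itself reports on each phone: its authorization status and the devices it can see. A sketch (assuming the standard SRSensorReader API for the .visits sensor; this only diagnoses, it doesn't fix anything):

import SensorKit

// Sketch: log what the visits reader reports on a given device under test.
final class VisitsDiagnostics: NSObject, SRSensorReaderDelegate {
    let reader = SRSensorReader(sensor: .visits)

    func run() {
        reader.delegate = self
        print("Visits authorization status: \(reader.authorizationStatus.rawValue)")
        reader.fetchDevices() // didFetch devices lists the devices that can contribute data
    }

    func sensorReader(_ reader: SRSensorReader, didFetch devices: [SRDevice]) {
        for device in devices {
            print("Visits data source: \(device.name), \(device.model), \(device.systemVersion)")
        }
    }

    func sensorReader(_ reader: SRSensorReader, fetchDevicesDidFailWithError error: Error) {
        print("fetchDevices failed: \(error.localizedDescription)")
    }
}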
The delegate method 'didFetchResult' for fetching SensorKit data older than 24 hours is not being called. I have confirmed that full SensorKit access has been granted and that ambient light values are being recorded under the device's privacy settings -> Research Sensor & Usage Data.
It is possible to export that data to an lz4 file there. I want the app to receive the data once it is past the 24-hour mark, but while the other delegate methods are called, only the delegate that delivers the illuminance values is never called. Is it correct that only data recorded after startRecording() is called can be fetched once it is more than 24 hours old?
If so, in order to receive data older than 24 hours, does the app have to keep recording the illuminance values in the background for more than 24 hours before the ambient light values become available?
import Foundation
import SensorKit
import UIKit
final class SensorKitManager: NSObject, ObservableObject, SRSensorReaderDelegate {
static let shared = SensorKitManager()
private let ambientReader = SRSensorReader(sensor: .ambientLightSensor)
var availableDevices: [SRDevice] = []
@Published var ambientLightData: [AmbientLightDataPoint] = []
var isFetching = false
var isRecordingAmbientLight = false
private override init() {
super.init()
setupReaders()
checkAndRequestAuthorization()
}
private func setupReaders() {
ambientReader.delegate = self
}
// MARK: - Permission Request
func requestAuthorization() {
SRSensorReader.requestAuthorization(sensors: [.ambientLightSensor]) { [weak self] error in
DispatchQueue.main.async {
guard let self = self else {
print("Permission request aborted")
return
}
if let error = error {
print("Permission request failed: \(error.localizedDescription)")
} else {
print("Permission request succeeded")
self.startRecordingAmbientLightData()
}
}
}
}
func checkAndRequestAuthorization() {
let status = ambientReader.authorizationStatus
switch status {
case .authorized:
print("Ambient light sensor access granted")
startRecordingAmbientLightData()
case .notDetermined:
print("Ambient light sensor access undetermined, requesting permission")
requestAuthorization()
case .denied:
print("Ambient light sensor access denied or restricted")
@unknown default:
print("Unknown authorization status")
}
}
// MARK: - Ambient Light Data Logic
func startRecordingAmbientLightData() {
guard !isRecordingAmbientLight else {
print("Already recording ambient light data.")
return
}
print("Starting ambient light data recording")
isRecordingAmbientLight = true
ambientReader.startRecording()
fetchAmbientLightData()
fetchAmbientDeviceData()
}
func fetchAmbientLightData() {
print("Fetching ambient light data")
let request = SRFetchRequest()
let now = Date()
let fromTime = now.addingTimeInterval(-72 * 60 * 60)
let toTime = now.addingTimeInterval(-25 * 60 * 60)
request.from = SRAbsoluteTime(fromTime.timeIntervalSinceReferenceDate)
request.to = SRAbsoluteTime(toTime.timeIntervalSinceReferenceDate)
print("Fetch request: \(fromTime) ~ \(toTime)")
ambientReader.fetch(request)
}
private func displayAmbientLightData(sample: SRAmbientLightSample) {
print("Ambient light: \(sample.lux.value) lux")
print("Current ambientLightData content:")
for data in ambientLightData {
print("Timestamp: \(data.timestamp), Lux: \(data.lux)")
}
}
// MARK: - Device Data Logic
private func fetchAmbientDeviceData() {
print("Fetching device information")
let request = SRFetchRequest()
let now = Date()
let fromDate = now.addingTimeInterval(-72 * 60 * 60)
let toDate = now.addingTimeInterval(-24 * 60 * 60)
request.from = SRAbsoluteTime(fromDate.timeIntervalSinceReferenceDate)
request.to = SRAbsoluteTime(toDate.timeIntervalSinceReferenceDate)
if availableDevices.isEmpty {
print("No devices available")
ambientReader.fetchDevices()
} else {
for device in availableDevices {
print("Starting data fetch (Device: \(device))")
request.device = device
ambientReader.fetch(request)
print("Fetch request sent (Device: \(device))")
}
}
}
// MARK: - SRSensorReaderDelegate Methods
func sensorReader(_ reader: SRSensorReader, didFetch devices: [SRDevice]) {
availableDevices = devices
for device in devices {
print("Fetched device: \(device)")
}
if !devices.isEmpty {
fetchAmbientDeviceData()
}
}
func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
print("sensorReader(_:fetching:didFetchResult:) method called")
if let ambientSample = result.sample as? SRAmbientLightSample {
let luxValue = ambientSample.lux.value
let timestamp = Date(timeIntervalSinceReferenceDate: result.timestamp.rawValue)
// Check for duplicate data and add it
if !ambientLightData.contains(where: { $0.timestamp == timestamp }) {
let dataPoint = AmbientLightDataPoint(timestamp: timestamp, lux: Float(luxValue))
ambientLightData.append(dataPoint)
print("Added ambient light data: \(luxValue) lux, Timestamp: \(timestamp)")
} else {
print("Duplicate data, not adding: Timestamp: \(timestamp)")
}
// Output data
self.displayAmbientLightData(sample: ambientSample)
}
return true
}
func sensorReader(_ reader: SRSensorReader, didCompleteFetch fetchRequest: SRFetchRequest) {
print("Data fetch complete")
if ambientLightData.isEmpty {
print("No ambient light data within 24 hours.")
} else {
print("ambientLightData updated")
for dataPoint in ambientLightData {
print("Added ambient light data: \(dataPoint.lux) lux, Timestamp: \(dataPoint.timestamp)")
}
}
}
}
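For completeness, here is the error-handling piece the manager above doesn't implement yet; adding something like this at least surfaces failures instead of letting them look like "no data" (a sketch only, not a confirmed cause of the problem):

import SensorKit

// Sketch: surface fetch and recording failures rather than failing silently.
extension SensorKitManager {
    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      failedWithError error: Error) {
        print("Fetch failed: \(error.localizedDescription)")
    }

    func sensorReader(_ reader: SRSensorReader, startRecordingFailedWithError error: Error) {
        print("startRecording failed: \(error.localizedDescription)")
    }

    func sensorReader(_ reader: SRSensorReader, fetchDevicesDidFailWithError error: Error) {
        print("fetchDevices failed: \(error.localizedDescription)")
    }
}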
Hello,
I am currently developing an iOS application using SensorKit. I encountered an issue when attempting to fetch SensorKit data in the background using background tasks (appRefresh, processing). The following error occurs:
In the delegate function func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, failedWithError error: any Error) {}, I receive the error:
SRErrorDataInaccessible.
Specifically, in code:
Start and handle the background fetch (appRefresh):
func handleAppRefreshTask(task: BGAppRefreshTask) {
logger.logWithServer(level: .default, message: "background fetch start", category: String(describing: BackgroundTaskManager.self))
scheduleBackgroundFetch()
let queue = OperationQueue()
queue.maxConcurrentOperationCount = 1
let fetchOperation = FetchOperation()
queue.addOperation(fetchOperation)
task.expirationHandler = {
self.logger.logWithServer(level: .error, message: "background fetch expirated", category: String(describing: BackgroundTaskManager.self))
queue.cancelAllOperations()
}
fetchOperation.completionBlock = {
task.setTaskCompleted(success: !fetchOperation.isCancelled)
}
}
Background fetch operation class
class FetchOperation: Operation {
override func main() {
guard !isCancelled else { return }
Task {
// this function will execute fetch request for all user allowed sensorReader, 'func fetch(_ request: SRFetchRequest)'
await SensorkitManager.shared.startFetchAndUpload()
}
}
}
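One thing worth noting about FetchOperation above (a hedged observation, separate from the error itself): main() returns as soon as the Task is created, so the operation and its completionBlock can finish before the asynchronous fetch does. A sketch of an operation that stays "executing" until the awaited work completes, assuming startFetchAndUpload is async as in the snippet above:

import Foundation

// Sketch: an asynchronous Operation that does not finish until the awaited work
// completes, so completionBlock (and setTaskCompleted) fire at the right time.
final class AsyncFetchOperation: Operation {
    private var _executing = false
    private var _finished = false

    override var isAsynchronous: Bool { true }
    override var isExecuting: Bool { _executing }
    override var isFinished: Bool { _finished }

    override func start() {
        guard !isCancelled else { finish(); return }
        willChangeValue(forKey: "isExecuting")
        _executing = true
        didChangeValue(forKey: "isExecuting")

        Task {
            // Same async entry point as in the snippet above (an assumption).
            await SensorkitManager.shared.startFetchAndUpload()
            self.finish()
        }
    }

    private func finish() {
        willChangeValue(forKey: "isExecuting")
        willChangeValue(forKey: "isFinished")
        _executing = false
        _finished = true
        didChangeValue(forKey: "isExecuting")
        didChangeValue(forKey: "isFinished")
    }
}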
I have the following questions:
Is it possible to fetch SensorKit data in the background?
If it is possible, why does the above error occur?
If it is possible, could you provide the solution code and the correct workflow to avoid this error?
Thank you.
Hi! I'm working on an app that records x, y, z accelerometer values, and when subjecting the watch to extreme acceleration (a swing) I notice the x and y acceleration seems to get stuck at around ~30 G.
Is this a hardware limitation of the watch sensors? I have an Apple Watch Series 7.
Below is the chart of the acceleration recording session.
Appreciate your insights!
Joao
Compared with the 14, there honestly isn't much of an improvement.
We have developed an app based on driver behaviour using location and sensor data.
It works in both the background and the foreground using region monitoring.
It consumes more battery than comparable apps: roughly a 5% drop for every 5 to 10 minutes of use.
For our app's functionality we are using:
Location (Always) - every 1 sec
Sensor - Accelerometer - 20 samples per second
Background - Region monitoring
Timer action - 1 sec, 20 times
Is there any possible way to reduce battery consumption? If so, please share your suggestions.
Thanks in advance for your comments.
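One direction sometimes suggested for this kind of setup (a sketch with illustrative values, not a guaranteed fix): let Core Location pause and coalesce updates rather than polling every second, and batch the accelerometer on a background queue:

import CoreLocation
import CoreMotion

// Sketch: configuration knobs that typically trade latency/accuracy for battery.
// Values are illustrative only, not tuned recommendations.
final class DrivingSensors: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()
    private let motionQueue = OperationQueue()

    func start() {
        locationManager.delegate = self
        locationManager.desiredAccuracy = kCLLocationAccuracyNearestTenMeters // instead of Best
        locationManager.distanceFilter = 10                                   // metres; drop redundant updates
        locationManager.activityType = .automotiveNavigation                  // lets the system optimise for driving
        locationManager.pausesLocationUpdatesAutomatically = true             // pause when the vehicle stops
        locationManager.allowsBackgroundLocationUpdates = true
        locationManager.startUpdatingLocation()

        // 20 samples per second, delivered off the main thread and buffered for batch processing.
        motionManager.accelerometerUpdateInterval = 1.0 / 20.0
        motionManager.startAccelerometerUpdates(to: motionQueue) { data, _ in
            guard let data else { return }
            _ = data.acceleration // buffer and process in batches rather than per sample
        }
    }
}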
Is it possible to get access to the class SRWristDetection of the SensorKit Framework in a watchOS app? As part of a project, we need to be able to log in to a service and as soon as the watch is taken off, the user must be logged out.
I wanted to reach out for some assistance with troubleshooting my watch app not detecting collisions. I have set up a few breakpoints and determined that it is not running the game-over struct even when it should have detected the collision. I have also tried to generate a log file when a collision is detected, and that does not work either. I considered that the objects might not be on the same layer, so they are now in the same ZStack.
I am working on an application design where the iPhone would be running an app such as Apple Maps, and while driving I would like my Apple Watch to invoke a screenshot of the current map view showing my location, without having to use the iPhone.
So the iPhone would be active in a car holder, displaying my location on the map.
Once I reach a route point, such as a waypoint, I would like to press a button in my watch app to take a screenshot.
My second thought is: could I press a button in my Apple Watch app to fire a request to my own iOS application using WatchConnectivity, capturing my location to a data file? (See the sketch at the end of this post.)
My third thought is: could pressing a button in my Apple Watch app interact with an active app on my iPhone, such as a reminders app, and create a new reminder note in the app currently in focus, even though it is not a component of my iPhone app?
Thanks for any insight into how I could make an Apple Watch app send button presses reliably to my active iPhone application, so that I could avoid having to touch my iPhone.
Thanks in advance for any guidance.
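Regarding the second thought above, a minimal sketch of a watch-side button firing a WatchConnectivity message to the paired iPhone (the "captureLocation" action name and payload are illustrative, not an existing API):

import WatchConnectivity

// Watch-side sketch: send a message to the paired iPhone when a button is pressed.
final class WatchBridge: NSObject, WCSessionDelegate {
    override init() {
        super.init()
        if WCSession.isSupported() {
            WCSession.default.delegate = self
            WCSession.default.activate()
        }
    }

    func requestLocationSnapshot() {
        guard WCSession.default.isReachable else { return } // the iPhone app must be reachable
        WCSession.default.sendMessage(["action": "captureLocation"], replyHandler: { reply in
            print("Phone replied: \(reply)")
        }, errorHandler: { error in
            print("sendMessage failed: \(error.localizedDescription)")
        })
    }

    // On the iPhone side, implement session(_:didReceiveMessage:replyHandler:)
    // to capture the location, append it to a data file, and reply.
    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {}
}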
Trying to find someone who can tell me how to get the accelerometer data from an iPhone 14 that was involved in a vehicular homicide. Running out of avenues and hoping for some guidance here.
Hello all, I am building an application that involves the usage of accelerometer data. How I handle the data will change based on the accelerometer's specifications. I have been looking all over for the accelerometer specifications for the iPhone 13 (the iPhone I am testing my application on), but I cannot find them anywhere. Are they public knowledge?
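Core Motion doesn't publish range or resolution figures, so one workaround is to characterise the sensor empirically on the device under test; a rough sketch (just min/max bookkeeping, not an official specification):

import CoreMotion

// Sketch: log the observed min/max acceleration magnitude to get an empirical
// feel for the sensor on this particular device.
final class AccelProbe {
    private let motion = CMMotionManager()
    private var minG = Double.greatestFiniteMagnitude
    private var maxG = -Double.greatestFiniteMagnitude

    func start() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 100.0
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self, let a = data?.acceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            self.minG = min(self.minG, magnitude)
            self.maxG = max(self.maxG, magnitude)
            print(String(format: "|a| = %.4f g (min %.4f, max %.4f observed)", magnitude, self.minG, self.maxG))
        }
    }

    func stop() { motion.stopAccelerometerUpdates() }
}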
Hello, I'm currently locked out of an old iPhone 13 Pro Max purchased at an Apple Store. I've forgotten the iCloud account on the device, so I cannot locate its serial or IMEI number, and I've lost my proof of purchase; the store employee cannot access the proof of purchase either. The iPhone 13 Pro Max is in great condition. When I turn the phone on it shows an unknown display. How could I bypass the unknown display or recycle the phone? The phone is also running an iOS version from almost 2 years ago, and I paid full price for the device.
Hi all, I would like to develop a watch app which captures IMU data (9 axes) at 10 Hz and uses an ML model to check whether an event has occurred. My app should run in the background at all times. The ML model could also be placed in an iPhone app. I need a suggestion on whether I can keep an app running on the watch at all times to capture the IMU data. Thanks
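For what it's worth, the usual way a watch app keeps sampling while in the background is to run inside a workout session; below is a sketch of the motion side at 10 Hz, with the workout session and the required HealthKit authorization treated as assumptions rather than shown in full:

import CoreMotion
import HealthKit

// Sketch: 10 Hz device-motion sampling on watchOS. Keeping this running in the
// background is assumed to rely on an active HKWorkoutSession (HealthKit
// authorization must already have been granted; error handling trimmed).
final class IMURecorder {
    private let motion = CMMotionManager()
    private let healthStore = HKHealthStore()
    private var session: HKWorkoutSession?

    func start() throws {
        let config = HKWorkoutConfiguration()
        config.activityType = .other
        session = try HKWorkoutSession(healthStore: healthStore, configuration: config)
        session?.startActivity(with: Date())

        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 10.0 // 10 Hz
        motion.startDeviceMotionUpdates(to: OperationQueue()) { dm, _ in
            guard let dm else { return }
            // userAcceleration + rotationRate + attitude give the fused IMU view;
            // feed these into the ML model (on the watch, or forwarded to the phone).
            print(dm.userAcceleration, dm.rotationRate, dm.attitude)
        }
    }
}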
I am working on an application that monitors users' driving patterns. It intends to give distraction alerts and warnings.
I saw that SensorKit has a lot of supporting features, but it is stated to be for research purposes only. Can I use it if my app provides end-user notifications and is not just for research?
Thanks in advance.
Hi folks!
We are developing a watchOS companion app which records accelerometer data. We call CMSensorRecorder.recordAccelerometer(forDuration: _) and retrieve the data by calling CMSensorRecorder.accelerometerData(from: _, to: _). So far so good. But the issue arises when we have data for, say, 2 hours. We get the accelerometer data in a SwiftUI Task or on GCD's background queue, reading it in a loop that enumerates the CMSensorDataList. The loop runs while the app is active, but as soon as the watch app becomes inactive, say when applicationWillResignActive is called, the loop is suspended. I know this is the default behaviour of the OS, but it makes our data parsing so slow that it can take an hour or so to read 2 hours of data, which makes our app not user friendly. Is there a way we can keep our app alive while the data is being processed? Or, for faster processing, could we send the raw data to the phone using WCSession so that it gets there as soon as recording stops?
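On the WCSession idea, a sketch of writing the enumerated samples to a file and handing it to transferFile(_:metadata:), which queues and keeps going in the background (WCSession is assumed to be activated already; the file format and names are illustrative):

import CoreMotion
import WatchConnectivity

// CMSensorDataList only exposes NSFastEnumeration, so add Sequence to allow for-in.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        NSFastEnumerationIterator(self)
    }
}

// Sketch: dump the recorded samples to a CSV file and hand it to WCSession's
// file transfer instead of parsing them in-process on the watch.
func shipRecordedData(from start: Date, to end: Date) {
    let recorder = CMSensorRecorder()
    guard let list = recorder.accelerometerData(from: start, to: end) else { return }

    var csv = "timestamp,x,y,z\n"
    for item in list {
        if let sample = item as? CMRecordedAccelerometerData {
            csv += "\(sample.startDate.timeIntervalSince1970),\(sample.acceleration.x),\(sample.acceleration.y),\(sample.acceleration.z)\n"
        }
    }

    let url = FileManager.default.temporaryDirectory.appendingPathComponent("accelerometer.csv")
    do {
        try csv.write(to: url, atomically: true, encoding: .utf8)
        WCSession.default.transferFile(url, metadata: ["from": start.timeIntervalSince1970])
    } catch {
        print("Failed to write or transfer file: \(error)")
    }
}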
Thanks
Is it possible to get the height (Z position) of an iOS device based on sensor data? I've tried continuously integrating the accelerometer data to infer position, but this produces a cascading error, and the inferred position "drifts" even when the phone is held still. I want to build an app where users measure the height of something by first zeroing the phone on the floor and then raising it to the height of the object, for example by placing it on the table.
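Double-integrating accelerometer data drifts for exactly the reason described: every bias error integrates twice into position. A commonly suggested alternative for floor-to-table height changes is the barometer's relative altitude, which doesn't drift over short intervals but is fairly coarse (roughly decimetre-level, so it may or may not be precise enough for this use case); a sketch:

import CoreMotion

// Sketch: zero on the floor, then read the barometric relative altitude as the
// phone is raised. Values are metres relative to the moment updates started.
final class HeightMeasurer {
    private let altimeter = CMAltimeter()

    func startMeasuring(onChange: @escaping (Double) -> Void) {
        guard CMAltimeter.isRelativeAltitudeAvailable() else { return }
        // Starting updates defines the zero point (e.g. phone resting on the floor).
        altimeter.startRelativeAltitudeUpdates(to: .main) { data, _ in
            guard let data else { return }
            onChange(data.relativeAltitude.doubleValue) // metres above the zero point
        }
    }

    func stop() { altimeter.stopRelativeAltitudeUpdates() }
}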