Hi everyone,
I’m building a sports performance app for Apple Watch that uses the onboard IMU to analyze swings and impacts in sports like tennis and golf. The goal is to estimate club/racket head speed, ball speed, and shot quality in real time from wrist motion data.
With Core Motion, I can currently get deviceMotion updates at ~100 Hz. That is fine for general movement tracking, but the actual ball impact happens much faster — 5–10 ms in tennis and ~0.5 ms in golf. At 100 Hz, most of the high-frequency vibration and impact content is simply missed, which makes it hard to measure these events directly or to estimate the related performance metrics accurately.
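For context, this is roughly how the updates are requested today (a minimal sketch; the closure body is a stand-in for the actual feature extraction):

import CoreMotion

let motionManager = CMMotionManager()

func startSwingCapture() {
    guard motionManager.isDeviceMotionAvailable else { return }
    // Ask for 100 Hz; this is the ceiling we currently see on Apple Watch.
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion, error == nil else { return }
        // userAcceleration is in g and rotationRate in rad/s, sampled at ~10 ms
        // intervals — too coarse to resolve a 0.5–10 ms impact transient.
        let a = motion.userAcceleration
        let w = motion.rotationRate
        // ...feed (a, w) into the swing/impact estimator...
        _ = (a.x, a.y, a.z, w.x, w.y, w.z)
    }
}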
Questions for Apple / community:
1. Is there a way to access raw accelerometer and gyroscope data at higher sampling rates (e.g., 500–1000 Hz) on Apple Watch?
2. If not, is this due to hardware limitations or an API/software constraint?
3. Are there any research, partner, or beta programs that allow deeper sensor access for sports-science use cases?
Even modest increases in IMU sampling could unlock more accurate ball-speed estimates, impact force analysis, and strike-quality detection without needing external sensors — making Apple Watch a best-in-class wearable for precision sports analytics.
Happy to share more about the current approach, sample data, and potential use cases if helpful.
Thanks,
Max
Can't I just add up all of the accelerations of the accelerometer and then use this physics equation to get distance?
d = v(i) x t + (1/2) x a x t^2
In this:
v(i) would be 0
t = 1 second
a = all accelerometer readings added together over 1 second
Can't I just use this equation to get vertical velocity? A lot of people have said it is impossible, but it has been done with variometer apps. I can't figure out the code — can anyone guide me in the right direction?
v(f) = v(i) + a x t
v(i) = 0
a = y-axis acceleration for 1 second
t = 1 second
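To make the question concrete, here is roughly the integration I have in mind (a sketch, not a working solution — it projects userAcceleration onto the gravity axis rather than using the raw y-axis, and the integrated velocity drifts within seconds without an external reference, which I understand is why variometer apps usually lean on the barometer):

import CoreMotion

let motionManager = CMMotionManager()
var verticalVelocity = 0.0           // m/s, starting from v(i) = 0
var lastTimestamp: TimeInterval?

func startVerticalVelocityEstimate() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 0.01
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        defer { lastTimestamp = motion.timestamp }
        guard let last = lastTimestamp else { return }
        let dt = motion.timestamp - last
        // userAcceleration is in g and already has gravity removed. Projecting it
        // onto the gravity axis isolates the vertical component regardless of how
        // the device is held. Verify the sign empirically — Core Motion's sign
        // conventions are easy to get backwards.
        let a = motion.userAcceleration
        let g = motion.gravity           // vector of ~1 g magnitude along gravity
        let verticalG = -(a.x * g.x + a.y * g.y + a.z * g.z)
        verticalVelocity += verticalG * 9.81 * dt   // v(f) = v(i) + a * dt, in m/s
    }
}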
Please let me know if this is possible.
Thank you so much for your help.
During regular use, RealityKit generates an excessive amount of internal logging that is not actionable by third-party developers. When developing an iOS RealityKit/ARKit app, this makes the Xcode console difficult to use for everyday work.
(FB19173812)
See screenshots below.
Xcode does have an option for filtering out logging from specific SDKs, but enabling this feature to suppress the logging of RealityKit and related SDKs like PHASE is something developers have to do dozens of times each day. After a year of developing a RealityKit app, this process becomes frustrating.
If SDKs like Foundation, UIKit, and SwiftUI generated as much logging as RealityKit and related SDKs, Xcode's console would be unusable.
Is there any way to disable the logging of RealityKit and PHASE permanently?
Thank you for any help you provide.
I'm testing a micro-app contained in an IPFS folder. I use a web3 website that is used to view NFTs and their IPFS files. The app has gyro controls, which are enabled through a confirmation gesture.
On iOS 18.5, when I press the "Request Permission" button I get the popup asking to allow the app to access motion and orientation. On iOS 26, pressing the button does nothing. Keep in mind that this only happens through the website, which uses iframes; when I load the IPFS file from a direct link, the popup appears with no issue.
I think this might be because iOS 26 uses WebGPU, or it might be a bug since iOS 26 is still in beta.
I'm developing an activity classifier that I'd like to feed with Core Motion data in JSON format.
I am getting the error:
Unable to parse /Users/DewG/Downloads/Testing/Step1/Testing.json. It does not appear to be in JSON record format. A SequenceType of dictionaries is expected
I've verified that the file is valid JSON via various JSON validators, so I expect I'm just holding it wrong. Is there an example of a JSON file with Core Motion data that I can model mine after?
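From the error message, it appears the parser wants a top-level array of per-sample dictionaries rather than a single dictionary — something like this, with placeholder field names:

[
  { "x": 0.012, "y": -0.004, "z": 0.981 },
  { "x": 0.015, "y": -0.002, "z": 0.978 },
  { "x": 0.011, "y": -0.006, "z": 0.983 }
]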
Hello,
I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework.
However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows visual peaks that report double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth magnetic field average. The peaks on iOS are also asymmetric.
I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated.
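For what it's worth, one way to check whether sensor fusion is involved is to log the raw magnetometer stream next to the calibrated field and compare the peak shapes — a minimal sketch:

import CoreMotion

let motionManager = CMMotionManager()

func startMagneticFieldComparison() {
    // Raw magnetometer: no hard-iron / soft-iron correction applied.
    motionManager.magnetometerUpdateInterval = 0.01
    motionManager.startMagnetometerUpdates(to: .main) { data, _ in
        guard let raw = data?.magneticField else { return }
        print("raw        x:\(raw.x) y:\(raw.y) z:\(raw.z)")   // microtesla
    }
    // Calibrated field as delivered through device motion (sensor fusion applied).
    motionManager.deviceMotionUpdateInterval = 0.01
    motionManager.startDeviceMotionUpdates(using: .xArbitraryCorrectedZVertical, to: .main) { motion, _ in
        guard let calibrated = motion?.magneticField else { return }
        print("calibrated x:\(calibrated.field.x) y:\(calibrated.field.y) z:\(calibrated.field.z) " +
              "accuracy:\(calibrated.accuracy.rawValue)")
    }
}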
Thank you!
There is a crash when running the project in Xcode 16.2. The project has used CMPedometer and Core Motion since 2020 without an NSMotionUsageDescription key, so I'm wondering: why is it now mandatory to add this key?
“This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSMotionUsageDescription key with a string value explaining to the user how the app uses this data.”
The crash stack is as follows, and this crash only occurs on this model.
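For reference, the entry the crash message is asking for looks like this in Info.plist (the description string below is just a placeholder):

<key>NSMotionUsageDescription</key>
<string>Motion data is used to count your steps and activity.</string>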
I want a way to keep tracking the user from the moment they start driving until they park.
I have tried several approaches, including significant location changes, silent push notifications, and background tasks, but none of them worked as expected.
I need the app to stay active from the time the user starts driving until they park the car.
I'm using CoreMotion and CoreLocation.
The challenge is keeping this working when the app is not active, i.e., killed or suspended.
How can I do this? Is it even possible?
Hi,
I have received the following report after an app termination. I have researched online but cannot determine the root cause; any tips or ideas would be appreciated.
Could it be Location Services, UserNotification Services, or Network Requests?
Thank you,
Brendan
Translated Report (Full Report Below)
Incident Identifier: 6CD59A17-15B1-4F4E-AE84-0286F22893A4
CrashReporter Key: 3d12fb7359053239708afd24c7eed0267a9cc601
Hardware Model: iPhone13,3
Process: AnchorNet3 [5605]
Path: /private/var/containers/Bundle/Application/5EA7F893-D562-45B8-8995-5EAB15F85A7E/AnchorNet3.app/AnchorNet3
Identifier: com.sailsecrets.AnchorNet3
Version: 3.17 (3.17)
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Coalition: com.sailsecrets.AnchorNet3 [1443]
Date/Time: 2025-02-06 00:12:03.6136 +0100
Launch Time: 2025-02-05 22:11:19.4220 +0100
OS Version: iPhone OS 18.2 (22C5131e)
Release Type: Beta
Baseband Version: 5.20.03
Report Version: 104
Exception Type: EXC_RESOURCE (SIGKILL)
Exception Codes: 0x0000000000020000, 0x0000000000000000
Termination Reason: PORT_SPACE 14123288431434006528 (Limit 131072 ports) Exceeded system-wide per-process Port Limit
Triggered by Thread: 3
Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0:
0 libsystem_kernel.dylib 0x1e27414e4 kevent_id + 8
1 libdispatch.dylib 0x198f51b40 _dispatch_kq_poll + 228
2 libdispatch.dylib 0x198f51080 _dispatch_event_loop_poke + 340
3 QuartzCore 0x192d4631c CA::Context::commit_transaction(CA::Transaction*, double, double*) + 17164
4 QuartzCore 0x192cb8d58 CA::Transaction::commit() + 648
5 QuartzCore 0x192cb8764 CA::Transaction::flush_as_runloop_observer(bool) + 88
6 UIKitCore 0x193a3fd14 _UIApplicationFlushCATransaction + 52
7 UIKitCore 0x193a3d1e0 __setupUpdateSequence_block_invoke_2 + 332
8 UIKitCore 0x193a3d054 UIUpdateSequenceRun + 84
9 UIKitCore 0x193a3f984 schedulerStepScheduledMainSection + 172
10 UIKitCore 0x193a3d5a0 runloopSourceCallback + 92
11 CoreFoundation 0x1911f1f3c CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION + 28
12 CoreFoundation 0x1911f1ed0 __CFRunLoopDoSource0 + 176
13 CoreFoundation 0x1911f4b30 __CFRunLoopDoSources0 + 244
14 CoreFoundation 0x1911f3d2c __CFRunLoopRun + 840
15 CoreFoundation 0x191246274 CFRunLoopRunSpecific + 588
16 GraphicsServices 0x1de34d4c0 GSEventRunModal + 164
17 UIKitCore 0x193d8f480 -[UIApplication run] + 816
18 UIKitCore 0x1939b5410 UIApplicationMain + 340
19 SwiftUI 0x195b43e30 closure #1 in KitRendererCommon(:) + 168
20 SwiftUI 0x195b43d60 runApp(:) + 100
21 SwiftUI 0x195b43c44 static App.main() + 180
22 AnchorNet3.debug.dylib 0x1025e97bc static MainApp.$main() + 40
23 AnchorNet3.debug.dylib 0x1025eaacc __debug_main_executable_dylib_entry_point + 12
24 dyld 0x1b7352de8 start + 2724
Thread 1 name: com.apple.CoreMotion.MotionThread
Thread 1:
0 libsystem_kernel.dylib 0x1e2741788 mach_msg2_trap + 8
1 libsystem_kernel.dylib 0x1e2744e98 mach_msg2_internal + 80
2 libsystem_kernel.dylib 0x1e2744db0 mach_msg_overwrite + 424
3 libsystem_kernel.dylib 0x1e2744bfc mach_msg + 24
4 CoreFoundation 0x1911f47f4 __CFRunLoopServiceMachPort + 160
5 CoreFoundation 0x1911f3ea0 __CFRunLoopRun + 1212
6 CoreFoundation 0x191246274 CFRunLoopRunSpecific + 588
7 CoreFoundation 0x191259814 CFRunLoopRun + 64
8 CoreMotion 0x19e89cc5c 0x19e88d000 + 64604
9 libsystem_pthread.dylib 0x21bcfb7d0 _pthread_start + 136
10 libsystem_pthread.dylib 0x21bcfb480 thread_start + 8
Thread 2 name: com.apple.uikit.eventfetch-thread
Thread 2:
0 libsystem_kernel.dylib 0x1e2741788 mach_msg2_trap + 8
1 libsystem_kernel.dylib 0x1e2744e98 mach_msg2_internal + 80
2 libsystem_kernel.dylib 0x1e2744db0 mach_msg_overwrite + 424
3 libsystem_kernel.dylib 0x1e2744bfc mach_msg + 24
4 CoreFoundation 0x1911f47f4 __CFRunLoopServiceMachPort + 160
5 CoreFoundation 0x1911f3ea0 __CFRunLoopRun + 1212
6 CoreFoundation 0x191246274 CFRunLoopRunSpecific + 588
7 Foundation 0x18fdc8338 -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
8 Foundation 0x18ff24e24 -[NSRunLoop(NSRunLoop) runUntilDate:] + 64
9 UIKitCore 0x193e22a74 -[UIEventFetcher threadMain] + 420
10 Foundation 0x18feb4194 NSThread__start + 724
11 libsystem_pthread.dylib 0x21bcfb7d0 _pthread_start + 136
12 libsystem_pthread.dylib 0x21bcfb480 thread_start + 8
Thread 3 name: com.apple.SwiftUI.AsyncRenderer
Thread 3 Crashed:
0 libsystem_kernel.dylib 0x1e274162c _kernelrpc_mach_port_allocate_trap + 8
1 libsystem_kernel.dylib 0x1e2748478 mach_port_allocate + 36
2 QuartzCore 0x192d4552c CA::Context::commit_transaction(CA::Transaction*, double, double*) + 13596
3 QuartzCore 0x192cb8d58 CA::Transaction::commit() + 648
4 QuartzCore 0x192cb8764 CA::Transaction::flush_as_runloop_observer(bool) + 88
5 CoreFoundation 0x19119f894 CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION + 36
6 CoreFoundation 0x19119f3e8 __CFRunLoopDoObservers + 552
7 CoreFoundation 0x1912462c0 CFRunLoopRunSpecific + 664
8 Foundation 0x18fdc8338 -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
9 Foundation 0x18fdc4500 -[NSRunLoop(NSRunLoop) run] + 64
10 SwiftUI 0x195c276d8 specialized static DisplayLink.asyncThread(arg:) + 792
11 SwiftUI 0x195c273a8 @objc static DisplayLink.asyncThread(arg:) + 72
Note: I have had issues with CMAltimeter since what seems to have been a major undocumented change in iOS 17.4.
I'm using CMAltimeter's absolute altitude delivery.
Sometimes the altimeter seems to be in an uncalibrated state, and no altitude updates are delivered.
Is there a way to be informed of this state? Currently it just doesn't work and I can't tell the user what is going on; they just think the app is broken.
What should I tell users to do to speed up calibration so that CMAltimeter starts working again?
Users have also reported that CMAltimeter can temporarily stop delivering altitude updates, even though it should.
So I guess my question comes down to this:
What is the best practice for handling an uncalibrated CMAltimeter?
Thanks!
I'm developing a watchOS app for Apple Watch Ultra 2 that implements water detection using CMWaterSubmersionManager.
I would like the app to appear in the Auto-Launch settings menu, but it is not showing up there (Settings > General > Auto-Launch > When Submerged > Selected App).
What additional steps should I take to make this work?
Environment
Device: Watch Ultra 2
watchOS: 11.2
Xcode: 16.0
Implementation
I have implemented the following as per documentation:
Added the Shallow Depth and Pressure capability and Entitlement.
Added the "Shallow Depth and Pressure" capability
Confirmed entitlement "com.apple.developer.submerged-shallow-depth-and-pressure" was automatically added
Note: I initially thought I should use "com.apple.developer.submerged-depth-and-pressure" (without "-shallow") since I'm targeting a maximum depth of 6 meters, but this resulted in compilation errors.
ref: https://developer.apple.com/forums/thread/740083
ref: https://developer.apple.com/forums/thread/735296
Added NSMotionUsageDescription and WKBackgroundModes
<key>NSMotionUsageDescription</key>
<string>Required for water detection</string>
<key>WKBackgroundModes</key>
<array>
<string>underwater-depth</string>
</array>
According to the documentation:
"It also adds your app to the list of apps that the system can autolaunch when the wearer submerges the watch."
What additional steps are needed to make the app appear in Auto-Launch settings? Has anyone successfully implemented this feature?
I'm working with the Apple Watch's acceleration data and have referred to the document Identify the coordinate axes of the device. According to this document, the X-axis points to the right side of the watch, the Y-axis points towards the top side, and the Z-axis points towards the user.
However, when I place the watch on a flat surface and move it horizontally to the right, I observe that the X-axis acceleration is negative. Similarly, when I move the watch vertically upwards, the Y-axis acceleration also shows a negative value.
Is this expected behavior, or am I misunderstanding something about the coordinate system or acceleration readings?
I am working on an iOS app where I need to detect when a user starts and stops driving using the Apple Core Motion framework. I've implemented the following MotionActivityManager class to handle activity updates and display the detected states in a SwiftUI view.
While I can accurately detect "Stationary" and "Walking" states, detecting the "Driving" (Automotive) state has been unreliable. The accuracy often fails, and the framework frequently misclassifies driving as other states like "Unknown" or "Walking."
Here's the implementation:
import SwiftUI
import Combine
import CoreMotion

// Assumed declarations (not shown in the post): the class wrapper, the activity
// manager instance, and the date formatter used below.
class MotionActivityManager: ObservableObject {
private let motionActivityManager = CMMotionActivityManager()
private let dateFormatter: DateFormatter = {
let formatter = DateFormatter()
formatter.dateStyle = .medium
formatter.timeStyle = .medium
return formatter
}()
@Published var motionStates: [MotionState] = []
@Published var startDate: String = ""
@Published var confidence: String = ""
init() {
setupDefaultStates()
startActivityUpdates()
}
private func setupDefaultStates() {
motionStates = [
MotionState(label: "Stationary", value: false),
MotionState(label: "Walking", value: false),
MotionState(label: "Running", value: false),
MotionState(label: "Automotive", value: false),
MotionState(label: "Cycling", value: false),
MotionState(label: "Unknown", value: false)
]
}
func startActivityUpdates() {
guard CMMotionActivityManager.isActivityAvailable() else {
print("Motion activity is not available.")
return
}
motionActivityManager.startActivityUpdates(to: .main) { [weak self] motion in
guard let self = self, let motion = motion else { return }
DispatchQueue.main.async {
self.updateProperties(with: motion)
}
}
}
private func updateProperties(with motion: CMMotionActivity) {
motionStates = [
MotionState(label: "Stationary", value: motion.stationary),
MotionState(label: "Walking", value: motion.walking),
MotionState(label: "Running", value: motion.running),
MotionState(label: "Automotive", value: motion.automotive),
MotionState(label: "Cycling", value: motion.cycling),
MotionState(label: "Unknown", value: motion.unknown)
]
startDate = dateFormatter.string(from: motion.startDate)
switch motion.confidence {
case .low:
confidence = "Low"
case .medium:
confidence = "Medium"
case .high:
confidence = "High"
@unknown default:
confidence = "Unknown"
}
}
}
struct MotionState: Identifiable {
let id = UUID()
let label: String
let value: Bool
}
// Minimal LabelView assumed by ContentView below (not shown in the post).
struct LabelView: View {
let label: String
let value: String
var body: some View {
HStack {
Text(label)
Spacer()
Text(value)
}
}
}
struct ContentView: View {
@StateObject private var motionManager = MotionActivityManager()
var body: some View {
ScrollView {
VStack(spacing: 16) {
ForEach(motionManager.motionStates) { state in
LabelView(label: state.label, value: state.value ? "True" : "False")
}
LabelView(label: "Confidence", value: motionManager.confidence)
}
.padding()
}
.onAppear {
UIApplication.shared.isIdleTimerDisabled = true
motionManager.startActivityUpdates()
}
.navigationTitle("Motion Activity")
}
}
Issues:
The motion.automotive state is often not detected accurately.
The confidence level remains low for the automotive state, even when the device is clearly in a car.
How can I improve the detection accuracy of the "Driving" state using the Core Motion framework?
I'm working on an application where, once the user starts driving, I need to periodically check their activity every 2.5 minutes (150 seconds) to determine whether they are still driving, walking, or have stopped. If they are still driving, I want to keep rescheduling the task until the user is no longer in a driving state.
Currently, I'm using startMonitoringSignificantLocationChanges to wake the app when the user's location changes, which works as expected. However, the activity detection stops after the first result is received, and I can't continue tracking the user's activity in the background (or after the app is killed from the app switcher).
Here's my approach:
After receiving a significant location change, I start tracking the user’s activity to check if they are driving or have stopped.
I reschedule this task every 2.5 minutes as long as the user remains in a driving state.
I need this process to run even when the app is in the background or terminated by the user.
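A sketch of the activity-check step described above, assuming the app has just been woken by a significant location change (the confidence threshold is an arbitrary choice):

import Foundation
import CoreMotion

let activityManager = CMMotionActivityManager()

/// Called after a significant-location-change wake-up: look at the last 150 s of
/// activity and report whether the user still appears to be driving.
func checkRecentActivity(completion: @escaping (Bool) -> Void) {
    guard CMMotionActivityManager.isActivityAvailable() else {
        completion(false)
        return
    }
    let now = Date()
    let windowStart = now.addingTimeInterval(-150)       // 2.5-minute look-back window
    activityManager.queryActivityStarting(from: windowStart, to: now, to: .main) { activities, error in
        guard let activities = activities, error == nil else {
            completion(false)
            return
        }
        // Treat the user as "still driving" if any recent sample is automotive with
        // at least medium confidence.
        let stillDriving = activities.contains { $0.automotive && $0.confidence != .low }
        completion(stillDriving)
    }
}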
Question: Is it possible to keep activity detection running periodically after receiving a location change, even when the app is in the background or terminated? What is the recommended way to implement this?
I would appreciate any suggestions or best practices for achieving this functionality.
Since upgrading to iOS 18, RoomPlan crashes have become more frequent.
Crash Txt
I want to get the barometric pressure reading from the built-in barometer to display in my iOS app. When I use CMAltimeter.startRelativeAltitudeUpdates(), my app receives no relative altitude update events, which means I can't read the barometric pressure from the sensor (that value is only contained in CMAltitudeData, not CMAbsoluteAltitudeData). If I use CMAltimeter.startAbsoluteAltitudeUpdates(), I get absolute altitude update events every second or so. NSMotionUsageDescription is set.
I have tried the following things, all of which haven't worked:
Only calling startRelativeAltitudeUpdates() and not startAbsoluteAltitudeUpdates()
Calling CMSensorRecorder.recordAccelerometer(forDuration: 0.1), as suggested in this thread
Calling CMMotionActivityManager.queryActivityStarting(from: .now, to: .now, to: .main), as suggested here
Physically moving my iPhone up and down about 200 feet using an elevator; I see absolute altitude updates which are in line with what's expected, but still receive no relative altitude update events
Calling the same APIs in a watchOS app on an Apple Watch Series 10; I see much less frequent absolute altitude updates, and still no relative altitude updates
I know the barometer sensor is working, because when I move my iPhone up and down even a foot or two indoors, I see an immediate change in the absolute altitude reading that I know wouldn't come from GPS.
This example code, when run on my iPhone 16 Pro running iOS 18.1.1, prints "updates started" and then "updating absolute" every second, but doesn't print anything else. The absolute altitude, accuracy, and authorization status fields update (the status shows 3, indicating .authorized), but the relative altitude and pressure fields remain "--".
import SwiftUI
import CoreMotion

struct ContentView: View {
@State private var relAlt: String = "--"
@State private var relPressure: String = "--"
@State private var absAlt: String = "--"
@State private var precision: String = "--"
@State private var accuracy: String = "--"
@State private var status: String = "--"
var body: some View {
VStack {
Text("Altitude: \(relAlt) m")
.font(.title3)
Text("Pressure: \(relPressure) kPa")
.font(.title3)
Text("Altitude (absolute): \(absAlt) m")
.font(.title3)
Text("Precision: \(precision) m")
.font(.title3)
Text("Accuracy: \(accuracy) m")
.font(.title3)
Text("Auth status: \(status)")
.font(.title3)
}
.padding()
.onAppear {
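// Note: altimeter is a local constant here; for anything beyond a quick test it
// should be stored somewhere longer-lived (e.g. in a model object), since updates
// may stop once it is deallocated.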
let altimeter = CMAltimeter()
startRelativeBarometerUpdates(with: altimeter)
startAbsoluteBarometerUpdates(with: altimeter)
status = CMAltimeter.authorizationStatus().rawValue.formatted()
print("updates started")
}
}
private func startRelativeBarometerUpdates(with altimeter: CMAltimeter) {
guard CMAltimeter.isRelativeAltitudeAvailable() else {
relAlt = "nope"
relPressure = "nope"
return
}
altimeter.startRelativeAltitudeUpdates(to: .main) { data, error in
if let error = error {
print("Error: \(error.localizedDescription)")
return
}
if let data = data {
print("updating relative")
relAlt = String(format: "%.2f", data.relativeAltitude.doubleValue)
relPressure = String(format: "%.2f", data.pressure.doubleValue)
} else {
print("no data relative")
}
}
}
private func startAbsoluteBarometerUpdates(with altimeter: CMAltimeter) {
guard CMAltimeter.isAbsoluteAltitudeAvailable() else {
absAlt = "nope"
print("no absolute available")
return
}
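// Note: the next line creates a second CMAltimeter instance, shadowing the one
// passed in, so the absolute updates run on a different object than the relative
// updates above.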
let altimeter = CMAltimeter()
altimeter.startAbsoluteAltitudeUpdates(to: .main) { data, error in
if let error = error {
print("Error: \(error.localizedDescription)")
return
}
if let data = data {
print("updating absolute")
absAlt = String(format: "%.2f", data.altitude)
precision = String(format: "%.2f", data.precision)
accuracy = String(format: "%.2f", data.accuracy)
}
}
}
}
Is this behavior expected? How can I trigger delivery of relative altitude updates to my app?
We have a web-based viewer which makes use of device orientation.
The permission box only appears on a “touch” and not on a swipe or pinch. Is this normal?
We've already tried different listeners (start, end, etc.), but the permission request is still only initiated by a physical touch/tap on the screen.
Any ideas / suggestions would be gratefully received.
Many thanks in advance.