Hi, we have developed an application that streams data from two BLE peripherals at a rate of 14.5kbps per peripheral. Until now, our devices streamed in near real time with no lag on all Apple devices with Bluetooth 5.0 or greater. Since the release of the iPhone 17 series and the iPad A16, we have reports from users of the data being streamed at significantly lower rates than expected.
Any help here would be greatly appreciated as our customers are being affected by this change.
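For reference, this is roughly how we measure the effective rate on our side (a simplified sketch; the class name and the one-second window are just for illustration):
import CoreBluetooth

final class ThroughputMonitor: NSObject, CBPeripheralDelegate {
    private var byteCount = 0
    private var windowStart = Date()

    // Data arrives via characteristic notifications; count the bytes per window.
    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        guard let data = characteristic.value else { return }
        byteCount += data.count
        let elapsed = Date().timeIntervalSince(windowStart)
        if elapsed >= 1.0 {
            // Effective bits per second over the last window.
            print("Throughput: \(Double(byteCount) * 8.0 / elapsed) bps")
            byteCount = 0
            windowStart = Date()
        }
    }
}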
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
I have an iOS/iPadOS app and I'm trying to communicate with a USB smart card reader using CryptoTokenKit on all platforms (iOS/iPadOS/macOS).
Minimal Repro Code
import CryptoTokenKit
import SwiftUI
struct ContentView: View {
    @State var status = ""

    var body: some View {
        VStack {
            Text("Status: \(status)")
        }
        .padding()
        .onAppear {
            let manager = TKSmartCardSlotManager.default
            if manager != nil {
                status = "Initialized"
            } else {
                status = "Unsupported"
            }
        }
    }
}
And my entitlement file has only one key:
com.apple.security.smartcard = YES
Behavior
• iPadOS (on device): status = "Initialized" ✅
• macOS (native macOS app, with the required CryptoTokenKit entitlement): status = "Initialized" ✅
• macOS (Designed for iPad, regardless of CryptoTokenKit entitlement): status = "Unsupported" → TKSmartCardSlotManager.default is nil ❌
Expectation
Given that the same iPadOS build initializes TKSmartCardSlotManager, I expected the iPad app running in Designed for iPad mode on an Apple silicon Mac to behave the same (or to have a documented limitation).
Questions
Is CryptoTokenKit (and specifically TKSmartCardSlotManager) supported for iPad apps running on Mac in Designed for iPad mode?
If support exists, what entitlements / capabilities are required for USB smart-card access in this configuration?
If not supported, is Mac Catalyst the correct/only path on macOS to access USB smart-card readers via CryptoTokenKit?
Are there recommended alternatives for iPad apps on Mac (Designed for iPad) to communicate with USB smart-card readers (e.g., ExternalAccessory, DriverKit, etc.), or is this scenario intentionally unsupported?
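In the meantime, we gate the feature at runtime with a check along these lines (a sketch only; it hides the feature rather than addressing the limitation, and the function name is purely illustrative):
import CryptoTokenKit
import Foundation

// Sketch: report whether smart-card slots are usable in the current environment,
// noting when we are running as an iPad app on a Mac ("Designed for iPad").
func smartCardSupportStatus() -> String {
    let onMac = ProcessInfo.processInfo.isiOSAppOnMac
    guard let manager = TKSmartCardSlotManager.default else {
        return onMac ? "Unsupported (iPad app on Mac)" : "Unsupported"
    }
    return "Initialized, slots: \(manager.slotNames)"
}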
Thanks!
Hi,
We are facing an issue where commissioning our Matter device to Google Home through an iOS device fails 100% of the time.
Here is our test summary regarding the issue:
TestCase1 [OK]: Commissioning our Matter 1.4.0 device to Google Nest Hub 2 by Android device (see log DoorWindow_2.0.1_Google_Success.txt)
TestCase2 [NG]: Commissioning Matter 1.4.0 device to Google Nest Hub 2 by iPhone13 or iPhone16 (see log DoorWindow_2.0.1_Google_by_iOS_NG.txt)
TestCase3 [OK]: Commissioning our Matter 1.3.0 device to Google Nest Hub 2 by iPhone 13
In TestCase2, we noticed that the device was first commissioned to iOS (Apple Keychain); iOS then opened a commissioning window again to commission it into Google's ecosystem, and the device failed at that second step. So we also tried the following:
Commissioning the device to Apple Home works as expected; next, sharing the device to the Google Home app on iOS also fails.
Commissioning the device to Apple Home works as expected; next, sharing the device to the Google Home app on Android works as expected, and the device also pops up in Google Home on iOS.
Could you help check what the issue is in TestCase2?
Our test environment:
NestHub 2 version
Google Home app version
We would like to be able to distinguish between iPhones and Apple Watches when scanning for devices using a Laird BLE module.
We know that we can identify an Apple device from the manufacturer data returned in the scan report. 0x004C is the registered identifier for Apple.
Is it possible to identify the device type from the remaining data returned?
We note that empirically, 4C001005 seems to correlate to an Apple Watch. How reliable is this?
It is useful for us, because it means we do not need to connect to this device to see if it is advertising a service that we own.
Connecting over BLE is of course an expensive operation.
Here is a simple snippet of a Swift App doing a similar thing, to illustrate the question:
func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral, advertisementData: [String : Any], rssi RSSI: NSNumber)
{
    guard
        let manufData: Data = advertisementData[CBAdvertisementDataManufacturerDataKey] as? Data
    else
    {
        return
    }

    let hexEncodedManufData: String = manufData.map { String(format: "%02hhx", $0) }.joined()
    print("Manufacturer Data: \(hexEncodedManufData): ")
    // Manufacturer Data: 4c001007351ff9f9036238: Apple device
    // Manufacturer Data: 4c001006331ec0640f88: Apple device
    // Manufacturer Data: 4c0010052b18804eb1: Apple watch?
    // Manufacturer Data: 4c0010052b18804eb1: Apple watch?
}
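And here is the heuristic we are considering, based purely on the empirical prefix above (a sketch; the 4C 00 10 05 prefix is undocumented, so we treat it as unreliable and subject to change):
// Hedged heuristic only: treat manufacturer data that starts with 4C 00 10 05
// as "possibly an Apple Watch". Undocumented, so it may change or match other devices.
func looksLikeAppleWatch(_ manufData: Data) -> Bool {
    let prefix: [UInt8] = [0x4C, 0x00, 0x10, 0x05]
    return manufData.count >= prefix.count && Array(manufData.prefix(prefix.count)) == prefix
}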
I have a device. After pairing, it shows as connected in the phone's Bluetooth settings. Before iOS 18 I could retrieve it with the retrieveConnectedPeripherals method, but since iOS 18 I cannot. What is the reason? Because once the device is connected it no longer sends advertising packets, how else can I retrieve it?
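For reference, this is roughly the retrieval call being made (a sketch; the service UUID below is only a placeholder for the device's real service):
import CoreBluetooth

// Sketch: fetch peripherals the system already considers connected for a given service.
func fetchConnectedPeripherals(central: CBCentralManager) -> [CBPeripheral] {
    let serviceUUID = CBUUID(string: "180A") // placeholder: Device Information service
    return central.retrieveConnectedPeripherals(withServices: [serviceUUID])
}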
I am working on an iOS application that relies on CoreMotion Attitude, and I need clarification regarding the behavior of reference frames.
According to the documentation, when setting the attitude reference frame to CMAttitudeReferenceFrameXTrueNorthZVertical:
The Z-axis of the reference frame is vertical (aligned with gravity).
The X-axis of the reference frame points to the geographic North Pole (True North).
When a device’s orientation matches this reference frame, the roll, pitch, and yaw values reported by CMAttitude should be (0,0,0).
However, in my testing:
When I align the device’s position with the CMAttitudeReferenceFrameXTrueNorthZVertical reference frame by orienting the screen (device Z-axis) upward and the right side (device X-axis) toward north, the yaw value reported by CMAttitude is 90 degrees instead of the expected 0 degrees.
To have CMAttitude report yaw as 0, I must instead orient the top side (device Y-axis) toward north.
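For reference, this is essentially how I start the updates and read yaw (a minimal sketch of my setup):
import CoreMotion

// Minimal sketch: start device motion with the X-true-north reference frame and log yaw.
let motionManager = CMMotionManager()
if CMMotionManager.availableAttitudeReferenceFrames().contains(.xTrueNorthZVertical) {
    motionManager.deviceMotionUpdateInterval = 0.1
    motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical, to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // I expect ~0° when the device X-axis points to true north, but I observe ~90°.
        print("yaw: \(attitude.yaw * 180 / .pi)°")
    }
}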
This seems to contradict my understanding that aligning the device's X-axis with True North (with the Z-axis vertical) should match the reference frame and produce roll, pitch, and yaw values of (0, 0, 0).
What am I missing?
Thank you for your time and assistance.
Topic:
App & System Services
SubTopic:
Hardware
What do I do? Is it going down too fast?
Battery health reduced to 89 from 98 within 2 months on iPhone 15 Pro and Cycle Count is just 314.
Is it the software update doing this?
Hello,
I am currently working on a USB HID-class device and I wanted to test communications between various OSes and the device.
I was able to communicate through standard USB with the device on other OSes such as Windows and Linux, through their integrated kernel modules and generic HID drivers. As a last test, I wanted to test macOS as well.
This is my code, running in a Swift-based command line utility:
import Foundation
import CoreHID
let matchingCriteria = HIDDeviceManager.DeviceMatchingCriteria(vendorID: 0x1234, productID: 0x0006) // This is the VID/PID combination that the device is actually listed under
let manager = HIDDeviceManager()
for try await notification in await manager.monitorNotifications(matchingCriteria: [matchingCriteria]) {
    switch notification {
    case .deviceMatched(let deviceReference):
        print("Device Matched!")
        guard let client = HIDDeviceClient(deviceReference: deviceReference) else {
            fatalError("Unable to create client. Exiting.") // crash on purpose
        }
        let report = try await client.dispatchGetReportRequest(type: .input)
        print("Get report data: [\(report.map { String(format: "%02x", $0) }.joined(separator: " "))]")
    case .deviceRemoved(_):
        print("A device was removed.")
    default:
        continue
    }
}
The client.dispatchGetReportRequest(...) line always fails, and if I turn the try expression into a forced one (try!), the code, unsurprisingly, crashes.
The line throws a CoreHID.HIDDeviceError.unknown() error with a seemingly meaningless IOReturn code (the last time I tried, I got an IOReturn value of -536870211).
My first instinct is to blame my own custom USB device for not working properly, but the code doesn't cooperate with ANY USB device currently connected: not a keyboard (with permissions granted), not a controller, nothing.
I did make sure to enable USB device access in the entitlements (when I tried to run this code in a simple Cocoa app) as well.
...What am I doing wrong here? What does the IOReturn code mean?
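For what it's worth, converting the raw code to hex makes it easier to look up against IOReturn.h (a small sketch):
// The raw IOReturn is easier to look up in <IOKit/IOReturn.h> as hex.
let rawCode: Int32 = -536870211
print(String(format: "IOReturn: 0x%08X", UInt32(bitPattern: rawCode)))
// Prints "IOReturn: 0xE00002BD"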
Thanks in advance for anybody willing to help out!
Topic:
App & System Services
SubTopic:
Hardware
Hello everyone,
I want to send haptics to a PS4 controller.
Both CHHapticPatternPlayer and CHHapticAdvancedPatternPlayer work fine on iPhone.
On the PS4 controller, everything works with CHHapticPatternPlayer, but with CHHapticAdvancedPatternPlayer I get an error. I want to use CHHapticAdvancedPatternPlayer for its additional settings, and I haven't found any information on how to fix this:
CHHapticEngine.mm:624 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation couldn't be completed. (com.apple.CoreHaptics error -4811.)'
The engine stopped because a system error occurred.
AVHapticClient.mm:1228 -[AVHapticClient getSyncDelegateForMethod:errorHandler:]_block_invoke: ERROR: Sync XPC call for 'loadAndPrepareHapticSequenceFromEvents:reply:' (client ID 0x21) failed: Couldn't communicate with a helper application.
Failed to create or play the pattern: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics" UserInfo={NSDebugDescription=connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics}
My haptics class:
import Foundation
import CoreHaptics
import GameController

protocol HapticsControllerDelegate: AnyObject {
    func didConnectController()
    func didDisconnectController()
    func enginePlayerStart(value: Bool)
}

final class HapticsControllerManager {
    static let shared = HapticsControllerManager()

    private var isSetup = false
    private var hapticEngine: CHHapticEngine?
    private var hapticPlayer: CHHapticAdvancedPatternPlayer?

    weak var delegate: HapticsControllerDelegate? {
        didSet {
            if delegate != nil {
                startObserving()
            }
        }
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }

    private func startObserving() {
        guard !isSetup else { return }
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidConnect),
            name: .GCControllerDidConnect,
            object: nil
        )
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidDisconnect),
            name: .GCControllerDidDisconnect,
            object: nil
        )
        isSetup = true
    }

    @objc private func controllerDidConnect(notification: Notification) {
        delegate?.didConnectController()
        self.createAndStartHapticEngine()
    }

    @objc private func controllerDidDisconnect(notification: Notification) {
        delegate?.didDisconnectController()
        hapticEngine = nil
        hapticPlayer = nil
    }

    private func createAndStartHapticEngine() {
        guard let controller = GCController.controllers().first else {
            print("No controller connected")
            return
        }
        guard controller.haptics != nil else {
            print("Haptics not supported on this controller")
            return
        }
        hapticEngine = createEngine(for: controller, locality: .default)
        hapticEngine?.playsHapticsOnly = true
        do {
            try hapticEngine?.start()
        } catch {
            print("Failed to start the haptic engine: \(error)")
        }
    }

    private func createEngine(for controller: GCController, locality: GCHapticsLocality) -> CHHapticEngine? {
        guard let engine = controller.haptics?.createEngine(withLocality: locality) else {
            print("Failed to create engine.")
            return nil
        }
        print("Successfully created engine.")
        engine.stoppedHandler = { reason in
            print("The engine stopped because \(reason.message)")
        }
        engine.resetHandler = {
            print("The engine reset --> Restarting now!")
            do {
                try engine.start()
            } catch {
                print("Failed to restart the engine: \(error)")
            }
        }
        return engine
    }

    func startHapticFeedback(haptics: [CHHapticEvent]) {
        do {
            let pattern = try CHHapticPattern(events: haptics, parameters: [])
            hapticPlayer = try hapticEngine?.makeAdvancedPlayer(with: pattern)
            hapticPlayer?.loopEnabled = true
            try hapticPlayer?.start(atTime: 0)
            self.delegate?.enginePlayerStart(value: true)
        } catch {
            self.delegate?.enginePlayerStart(value: false)
            print("Failed to create or play the pattern: \(error)")
        }
    }

    func stopHapticFeedback() {
        do {
            try hapticPlayer?.stop(atTime: 0)
            self.delegate?.enginePlayerStart(value: false)
        } catch {
            self.delegate?.enginePlayerStart(value: true)
            print("Failed to stop haptic playback: \(error)")
        }
    }
}

extension CHHapticEngine.StoppedReason {
    var message: String {
        switch self {
        case .audioSessionInterrupt:
            return "the audio session was interrupted."
        case .applicationSuspended:
            return "the application was suspended."
        case .idleTimeout:
            return "an idle timeout occurred."
        case .systemError:
            return "a system error occurred."
        case .notifyWhenFinished:
            return "playback finished."
        case .engineDestroyed:
            return "the engine was destroyed."
        case .gameControllerDisconnect:
            return "the game controller disconnected."
        @unknown default:
            return "an unknown error occurred."
        }
    }
}
Custom haptic events:
static func changeVibrationPower(power: HapricPower) -> [CHHapticEvent] {
    let continuousEvent = CHHapticEvent(eventType: .hapticContinuous, parameters: [
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0),
        CHHapticEventParameter(parameterID: .hapticIntensity, value: power.value)
    ], relativeTime: 0, duration: 0.5)
    return [continuousEvent]
}
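As a temporary workaround I can fall back to the basic player, which does work on the controller, but then I lose the advanced options such as loopEnabled. A sketch of that fallback, added alongside startHapticFeedback above (the method name is just illustrative):
func startBasicHapticFeedback(haptics: [CHHapticEvent]) {
    do {
        // Same pattern as above, played with the basic player, which works on the
        // controller; loopEnabled and playback-rate control are not available here.
        let pattern = try CHHapticPattern(events: haptics, parameters: [])
        let player = try hapticEngine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    } catch {
        print("Failed to create or play the basic pattern: \(error)")
    }
}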
I have an iPhone XS Max and the battery health has degraded to 69.
I noticed that whenever I put it on charge it just restarts, and it keeps doing that until I start using it or keep the screen on while it charges.
Is it my charger, or is it because the battery health has degraded to 69?
Topic:
App & System Services
SubTopic:
Hardware
The battery level value is inaccurate: the API returns the battery percentage in multiples of 5. For example, the actual battery percentage is 42 but the API returns 40. Please fix this if possible; devices running iOS versions below 17 appear to work fine.
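For reference, this is how the value is being read (a minimal sketch, nothing unusual):
import UIKit

// Minimal sketch of how the battery level is read.
UIDevice.current.isBatteryMonitoringEnabled = true
let level = UIDevice.current.batteryLevel   // 0.0 ... 1.0, or -1.0 if unknown
print("Battery level: \(Int(level * 100))%")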
I'm having an issue with OneDrive that is affecting our company iPads. Users used to be able to drag and drop any folder or file over, and now they can't. They are on the latest updates for both OneDrive and iOS. Can someone look at this? I also reached out to Microsoft, and they said nothing has changed on their end.
When are you guys going to fix the CarPlay issues with this new update? I use this for work and it’s really an issue. Nothing is working and it takes up entirely too much space.
After updating to iOS 18.4, the 3D scanning function, including the AR features in Apple Clips, cannot be used. Does anyone else have the same problem?
Topic:
App & System Services
SubTopic:
Hardware
Every time after I download an app, this window opens and never closes. How do I close it?
Topic:
App & System Services
SubTopic:
Hardware
My audio and MIDI sequencer application consumes about 600 % of CPU power with 10 different instruments during playback; while idle it consumes approximately 100 %.
What is the maximum CPU power an application can consume? Are there any limits, and could they be modified?
I am asking because when I add more instruments, the real-time behaviour degrades at around 700 % of CPU power.
I have got following HW:
MacBook Pro
14-inch, Nov 2024
Apple M4 Pro
24 GB
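For context, the CPU figures above are per core (100 % per fully busy core), so 600 % is roughly six busy cores. A quick sketch to print the nominal ceiling on this machine:
import Foundation

// The nominal ceiling is roughly 100 % per logical core, although efficiency
// cores give less real-time headroom than performance cores.
let cores = ProcessInfo.processInfo.activeProcessorCount
print("Active cores: \(cores), nominal CPU ceiling: \(cores * 100) %")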
Hello,
Since updating to iOS 18.3.1, the rear camera on my iPhone 13 Pro Max has not been functioning properly. The Camera app displays a black screen and becomes unresponsive.
I analyzed the crash logs and found that the issue is related to the cameracaptured process, which handles image and video capture on iOS. Here are the key details from the crash log:
📌 Memory Error: "Address size fault"
📌 Impacted Thread: com.apple.coremedia.capturesession.workerQueue
The "Address size fault" error suggests a memory access issue, likely causing the cameracaptured process to crash. This could be due to a bug in the video capture thread management introduced in the update.
What do you think?
name":"cameracaptured","timestamp":"2025-03-12 10:37:31.00 +0100","app_version":"1.0","slice_uuid":"cc45251e-92fc-329d-a3e9-d1c8c019e59e","build_version":"587.82.13","platform":2,"share_with_app_devs":0,"is_first_party":1,"bug_type":"309","os_version":"iPhone OS 18.3.2 (22D82)","roots_installed":0,"incident_id":"E97F5B3A-345F-42A6-97E8-28D175C8C5A9","name":"cameracaptured"}
{
"uptime" : 820,
"procRole" : "Unspecified",
"version" : 2,
"userID" : 501,
"deployVersion" : 210,
"modelCode" : "iPhone14,3",
"coalitionID" : 75,
"osVersion" : {
"isEmbedded" : true,
"train" : "iPhone OS 18.3.2",
"releaseType" : "User",
"build" : "22D82"
},
"captureTime" : "2025-03-12 10:37:30.1093 +0100",
"codeSigningMonitor" : 2,
"incident" : "E97F5B3A-345F-42A6-97E8-28D175C8C5A9",
"pid" : 68,
"translated" : false,
"cpuType" : "ARM-64",
"roots_installed" : 0,
"bug_type" : "309",
"procLaunch" : "2025-03-12 10:04:03.7137 +0100",
"procStartAbsTime" : 225890551,
"procExitAbsTime" : 19918403953,
"procName" : "cameracaptured",
"procPath" : "/usr/libexec/cameracaptured",
"bundleInfo" : {"CFBundleVersion":"587.82.13","CFBundleShortVersionString":"1.0"},
"parentProc" : "launchd",
"parentPid" : 1,
"coalitionName" : "com.apple.cameracaptured",
"crashReporterKey" : "137125638e43c62173057ae3dc983089b1f083cf",
"appleIntelligenceStatus" : {"state":"unavailable","reasons":["siriAssetIsNotReady","selectedLanguageIneligible","selectedLanguageDoesNotMatchSelectedSiriLanguage","notOptedIn","deviceNotCapable","selectedSiriLanguageIneligible","countryLocationIneligible","unableToFetchAvailability","assetIsNotReady"]},
"wasUnlockedSinceBoot" : 1,
"isLocked" : 0,
"throttleTimeout" : 5,
"codeSigningID" : "com.apple.cameracaptured",
"codeSigningTeamID" : "",
"codeSigningFlags" : 570434305,
"codeSigningValidationCategory" : 1,
"codeSigningTrustLevel" : 7,
"instructionByteStream" : {"beforePC":"BgCA0hUnFpTgAxOqIaSGUiFLu3KJJBaU4AMTqqfYDZTozSGQAFEC+Q==","atPC":"IAAg1KiDW/jJkB+QKd1B+SkBQPk/AQjrAQEAVP17Uqn0T1Gp9ldQqQ=="},
"bootSessionUUID" : "33672FC1-99EC-48FC-8BCD-2B96DF170CC3",
"basebandVersion" : "4.20.03",
"exception" : {"codes":"0x0000000000000001, 0x00000001a93909f0","rawCodes":[1,7134054896],"type":"EXC_BREAKPOINT","signal":"SIGTRAP"},
"termination" : {"flags":0,"code":5,"namespace":"SIGNAL","indicator":"Trace/BPT trap: 5","byProc":"exc handler","byPid":68},
"os_fault" : {"process":"cameracaptured"},
"faultingThread" : 4,
"threads" : [{"id":1699,"threadState":{"x":[{"value":268451845},{"value":21592279046},{"value":8589934592},{"value":28600187224064},{"value":0},{"value":28600187224064},{"value":2},{"value":4294967295},{"value":18446744073709550527},{"value":2},{"value":0},{"value":0},{"value":0},{"value":6659},{"value":0},{"value":0},{"value":18446744073709551569},{"value":6677212688,"symbolLocation":56,"symbol":"clock_gettime"},{"value":0},{"value":4294967295},{"value":2},{"value":28600187224064},{"value":0},{"value":28600187224064},{"value":6126594600},{"value":8589934592},{"value":21592279046},{"value":21592279046},{"value":4412409862}],"flavor":"ARM_THREAD_STATE64","lr":{"value":7911718552},"cpsr":{"value":4096},"fp":{"value":6126594448},"sp":{"value":6126594368},"esr":{"value":1442840704,"description":" Address size fault"},"pc":{"value":7911704456},pc":{"value":7911704456},"far":{"value":0}},"queue":"com.apple.main-thread","frames":[{"imageOffset":6024,"symbol":"mach_msg2_trap","symbolLocation":8,"imageIndex":10},{"imageOffset":20120,"symbol":"mach_msg2_internal","symbolLocation":80,"imageIndex":10},{"imageOffset":19888,"symbol":"mach_msg_overwrite","symbolLocation":424,"imageIndex":10},{"imageOffset":19452,"symbol":"mach_msg","symbolL
Topic:
App & System Services
SubTopic:
Hardware
Ever since the last update I have issues with my camera app. Sometimes when I open the app the front-facing cameras don't work and it's just a black screen. I also get a warning that I may not have genuine iPhone parts installed. I have to reboot the phone every time just to have the app function again. It's annoying. Please fix this. I never had any issues with the camera or its app until after the update.
We just updated our ATS to the latest 8.3.0 version and tried to run the iAP2 Session Test via the BPA100 Bluetooth Analyzer, and we are experiencing this EXC_BAD_INSTRUCTION crash. The same test still seems to work on ATS version 6. Please advise.
Process: ATS [1782]
Path: /private/var/folders/*/ATS.app/Contents/MacOS/ATS
Identifier: com.apple.ATSMacApp
Version: 8.3.0 (1826)
Build Info: ATSMacApp-1826000000000000~2 (1A613)
Code Type: X86-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2025-01-27 11:05:21.1334 -0800
OS Version: macOS 15.2 (24C101)
Report Version: 12
Bridge OS Version: 9.2 (22P2093)
Anonymous UUID: 098E2BB5-CB98-CA1C-CEFE-188AF6EFE8CF
Time Awake Since Boot: 9700 seconds
System Integrity Protection: enabled
Crashed Thread: 2 com.apple.ATSMacApp.FrontlineFrameworkInterface
Exception Type: EXC_BAD_INSTRUCTION (SIGILL)
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 4 Illegal instruction: 4
Terminating Process: exc handler [1782]
Topic:
App & System Services
SubTopic:
Hardware
Tags:
Developer Tools
External Accessory
Testing
Core Bluetooth