I updated my iPhone to iOS 26, and when I went to check whether it was up to date, I came across this.
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
Will native UVC support come to the iPhone as well?
Using external cameras with the iPad is already greatly beneficial, but on the iPhone it could turn the device into a production powerhouse!
So, have there been discussions about bringing UVC support to the iPhone as well? And if so, what were your conclusions?
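For context, on iPadOS 17 and later, UVC cameras surface through AVFoundation as external capture devices; a minimal discovery sketch (the open question is whether this same path will light up on iPhone):

import AVFoundation

// Discover externally connected (e.g. UVC) cameras; on iPadOS 17+
// they appear under the .external device type.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
for camera in discovery.devices {
    print("External camera: \(camera.localizedName)")
}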
Hi everyone,
while testing HKWorkoutSession with HKLiveWorkoutBuilder on iOS 26 Beta (cycling workout), I noticed the following behavior:
– Starting a cycling HKWorkoutSession automatically connects to my Bluetooth heart rate monitor and records HR into HealthKit ✅
– However, my Bluetooth cycling power meter and cadence sensor (standard BLE Cycling Power & CSC services) are not connected automatically, and no data is recorded into HealthKit ❌
On Apple Watch, when starting a cycling workout, these sensors do connect automatically and their data is written to HealthKit — which is exactly what I would expect on iOS as well.
Question:
Is this by design, or is support for power and cadence sensors planned for iOS in the same way as on watchOS?
Or do we, as developers, need to implement the BLE Cycling Power and CSC profiles ourselves (via CoreBluetooth) if we want these metrics?
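For reference, here is roughly what implementing it ourselves would look like: a minimal CoreBluetooth sketch, assuming a sensor exposing the standard Cycling Power service (0x1818) and Cycling Power Measurement characteristic (0x2A63). This is an illustration, not production code.

import CoreBluetooth

final class PowerMeterReader: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private let powerService = CBUUID(string: "1818")      // Cycling Power service
    private let powerMeasurement = CBUUID(string: "2A63")  // Cycling Power Measurement
    private var central: CBCentralManager!
    private var sensor: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [powerService], options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        sensor = peripheral  // keep a strong reference or the connection is dropped
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([powerService])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] where service.uuid == powerService {
            peripheral.discoverCharacteristics([powerMeasurement], for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for characteristic in service.characteristics ?? [] where characteristic.uuid == powerMeasurement {
            peripheral.setNotifyValue(true, for: characteristic)  // subscribe to notifications
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard let data = characteristic.value, data.count >= 4 else { return }
        // Per the spec, the payload starts with flags (uint16, little-endian),
        // followed by instantaneous power (sint16, little-endian, watts).
        let raw = UInt16(data[2]) | (UInt16(data[3]) << 8)
        let watts = Int16(bitPattern: raw)
        print("Power: \(watts) W")
    }
}

On top of this, the samples would still have to be written to HealthKit by hand (e.g. as HKQuantitySample values of type .cyclingPower), which is exactly the duplication I am hoping to avoid.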
Environment:
– iOS 26 Beta
– HKWorkoutSession & HKLiveWorkoutBuilder (cycling)
– Bluetooth HRM connects automatically
– BLE power & cadence sensors do not
This feature would make it much easier to develop cycling apps with full HealthKit integration, and also create a more consistent user experience compared to watchOS.
Thanks for any insights!
Topic:
App & System Services
SubTopic:
Hardware
Tags:
Health and Fitness
HealthKit
Core Bluetooth
WorkoutKit
Hello,
The application I'm working on must report new hardware connections. To retrieve information about connected displays and monitor new connections, I'm using the Core Graphics framework (see the recommendation in https://developer.apple.com/forums/thread/779945).
The monitoring logic relies on a callback function that is invoked when the local display configuration changes (kCGDisplayAddFlag / kCGDisplayRemoveFlag).
#import <Cocoa/Cocoa.h>

static void displayChanged(CGDirectDisplayID displayID, CGDisplayChangeSummaryFlags flags, void *userInfo)
{
    uint32_t vendor = CGDisplayVendorNumber(displayID);
    if (flags & kCGDisplayAddFlag)
    {
        if (vendor == kDisplayVendorIDUnknown)
        {
            NSLog(@"I/O Kit cannot identify the monitor. kDisplayVendorIDUnknown. displayId = %u", displayID);
            return;
        }
        NSLog(@"%u connected. vendor(%u)", displayID, vendor);
    }
    if (flags & kCGDisplayRemoveFlag)
    {
        NSLog(@"%u disconnected", displayID);
    }
}

int main(int argc, const char * argv[])
{
    @autoreleasepool
    {
        CGDisplayRegisterReconfigurationCallback(displayChanged, NULL);
        NSApplicationLoad();
        CFRunLoopRun();
    }
    return 0;
}
The test environment is a Mac mini with an external display connected via HDMI. Everything works correctly until the system enters sleep mode. Upon wakeup, the app reports two displays: the first with vendor ID kDisplayVendorIDUnknown and the second with the expected vendor ID.
Why does Core Graphics report two connections during wakeup? Is there any way to avoid this?
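One mitigation I am experimenting with, sketched below in Swift (the one-second settle delay is my own guess, not a documented value): defer handling of an add event briefly, then re-check whether the display is still active and whether its vendor has resolved.

import CoreGraphics
import Foundation

let unknownVendor: UInt32 = 0x756E_6B6E  // kDisplayVendorIDUnknown ('unkn')

func handleDisplayAdded(_ displayID: CGDirectDisplayID) {
    // Wait for the post-wake reconfiguration to settle, then re-query.
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        guard CGDisplayIsActive(displayID) != 0 else { return }  // transient entry went away
        let vendor = CGDisplayVendorNumber(displayID)
        guard vendor != unknownVendor else { return }            // still unidentified; skip
        print("\(displayID) connected, vendor \(vendor)")
    }
}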
Thank you in advance.
Dear Apple Developer / MFi Program Support,
I am exploring technical possibilities for screen sharing and remote interaction between iOS devices and external hardware (e.g., embedded systems, in-vehicle systems) for a prototype we are currently developing.
I have reviewed the public iOS developer documentation, but I would appreciate your guidance and clarification on the following advanced use cases, particularly in the context of MFi or enterprise-level integrations:
Full-Screen Sharing of iOS Device
Is it possible to mirror or stream the entire iOS screen, even when the app is running in the background or not in the foreground?
Does ReplayKit or any other framework under the MFi or enterprise entitlements allow full-device screen capture outside the app context?
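For context on this question: the only public mechanism we are aware of for whole-screen capture is a ReplayKit Broadcast Upload Extension, which can capture the entire device screen across apps, but only after the user explicitly starts the broadcast. A minimal sketch of the extension's sample handler:

import ReplayKit
import CoreMedia

// Broadcast Upload Extension entry point; receives screen content
// system-wide once the user starts the broadcast.
class SampleHandler: RPBroadcastSampleHandler {
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Encode and forward video frames to the external system here.
            break
        case .audioApp, .audioMic:
            break
        @unknown default:
            break
        }
    }
}

Our question is whether anything beyond this user-initiated model is available under MFi or enterprise entitlements.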
Remote Touch Injection and Control
Is there any officially supported mechanism, under MFi or otherwise, that allows external systems to remotely control an iOS device’s touch interface (e.g., simulate gestures, taps, swipes)?
Are any of the following permitted under special entitlements:
Access to IOHIDEventSystem or similar private APIs for input injection?
Communication over USB or network to relay control commands that simulate direct user interaction?
Hardware-Level Integration and Entitlements
Does the MFi Program allow:
Use of private frameworks or entitlements to build low-level integrations for iOS device control or mirroring?
Communication over USB/Lightning/USB-C to enable bi-directional interaction (streaming out, commands in)?
What are the specific APIs or entitlements available under MFi that enable these use cases?
Can you provide references to documentation, SDKs, or prerequisites for companies seeking such capabilities?
Eligibility and Certification Process
What are the criteria to be approved for the MFi program with access to such advanced capabilities?
Can PoC or early-stage research prototypes be eligible, or is MFi access restricted to commercial production intent?
How long does it typically take to gain access to these entitlements (assuming NDA and certification requirements are met)?
Alternative Pathways
If MFi access is not feasible in the short term, is there any Apple-supported alternative path (e.g., test device provisioning, enterprise signing, custom profiles) that permits more advanced capabilities for prototyping purposes?
We are not looking to publish this as a general App Store app at this stage, but rather to demonstrate feasibility as part of an innovation prototype that may lead to further OEM-level engagement in the future.
Thank you for your support and guidance.
Best regards,
Topic:
App & System Services
SubTopic:
Hardware
Hello,
I would like to discuss the expiration behavior of NFCPresentmentIntentAssertion (tested on iOS 18.5).
In the documentation we have:
The intent assertion expires if any of the following occur:
The intent assertion object deinitializes
Your app goes into the background
15 seconds elapse
But in fact, only the first rule is applied.
The expiration seems to be random after using CardSession, which makes it difficult to give the user a good experience.
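For reference, this is the flow I am using, reduced to a minimal sketch (the CardSession work itself is elided; the isValid check is simply how I detect the early expiration):

import CoreNFC

var presentmentIntent: NFCPresentmentIntentAssertion?

func beginPresentment() async {
    do {
        // Hold a strong reference; releasing it is the only expiry rule
        // that fires reliably in my testing.
        presentmentIntent = try await NFCPresentmentIntentAssertion.acquire()
        // ... CardSession work here ...
        if presentmentIntent?.isValid == false {
            // Expired early: well before 15 seconds, app still in the foreground.
        }
    } catch {
        print("Failed to acquire presentment intent: \(error)")
    }
}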
Has anyone faced the same kind of issue, or can anyone give an explanation?
Regards,
François
I am working on an app that requires the usage of CoreBluetooth – using both its CBPeripheralManager and CBCentralManager classes. Our app works with other phones and hardware peripherals to exchange data – so we wanted to explore adding AccessorySetupKit to streamline the hardware connection process.
AccessorySetupKit has been integrated (while CBPeripheralManager is turned off) and works great, but even with ASK merely declared in our app's Info.plist and not in use, CBPeripheralManager fails with the error: Cannot create a CBPeripheralManager while using AccessorySetupKit framework.
Is there any workaround or suggested path forward here? We'd still really like to use ASK while keeping our existing functionality, but are not seeing a clear way to do so.
I am developing an app that communicates with an external BLE device over GATT. The device has a secure-read characteristic exposing some of its data, and it requires pairing/bonding in order to communicate with it.
I was able to pair and connect with the device using AccessorySetupKit and the .bluetoothPairingLE option:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
In this case, when setting up the accessory, I was prompted to compare passkeys, and after confirming I can read the characteristic, etc.
Then I tried adding .confirmAuthorization to the picker item, and the problems started:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
pickerItem.setupOptions = [.confirmAuthorization]
When setting up, I can see a passkey to be confirmed, but once confirmed the setup UI gets stuck in a loading state. Under the hood, in the logs, I can see that my app has connected to the peripheral and was able to read the characteristic.
I am unsure why the UI is stuck in the loading state in this case. What is the difference when using the .confirmAuthorization option, and what is the proper flow of events to set up an accessory and then access a protected characteristic?
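For completeness, this is how I present the picker, reduced to a minimal sketch (pickerItem is built as above; error handling trimmed):

import AccessorySetupKit

let session = ASAccessorySession()

func startSetup(with pickerItem: ASPickerDisplayItem) {
    // Activate the session first; events arrive on the given queue.
    session.activate(on: .main) { event in
        switch event.eventType {
        case .activated:
            // Session is ready; present the picker now.
            session.showPicker(for: [pickerItem]) { error in
                if let error { print("Picker failed: \(error)") }
            }
        case .accessoryAdded:
            // This is where I start my CoreBluetooth work.
            print("Accessory added: \(String(describing: event.accessory))")
        default:
            break
        }
    }
}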
I have some logic which requires NFC support on the device. This is what I'm using to make sure that it's available:
isNFCMissing = !NFCNDEFReaderSession.readingAvailable && !NFCTagReaderSession.readingAvailable && !NFCVASReaderSession.readingAvailable
Is it possible for isNFCMissing to be true even if the device has an NFC chip?
The minimum iOS version for the application is 16, which is only supported on devices that have an NFC chip to begin with.
Hello,
I am a developer planning to build an application using Apple's new SpeechTranscriber technology.
I am facing an issue where SpeechTranscriber is not available on my iPad Pro (11-inch, 2nd generation, model number: MXDC2J/A), even though I have updated it to iPadOS 26. I was under the impression that SpeechTranscriber would be available on any device running iPadOS 26. Could you please clarify if this is incorrect?
Furthermore, I am planning to purchase a new iPad with an A16 chip for the development and deployment of this application. Can you confirm if SpeechTranscriber will be fully functional on an iPad equipped with the A16 chip?
Thank you for your assistance.
Hello,
We are testing Wallet passes with iBeacons in iOS 26 Beta.
In earlier iOS releases, when a device was in proximity to a registered beacon, the corresponding pass would surface automatically.
In iOS 26 Beta, this behavior no longer occurs, even if the pass is already present in Wallet. I have not found documentation of this change in the iOS 26 release notes.
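For reference, the passes register beacons in the standard way via the beacons key in pass.json; a representative fragment (the UUID is a placeholder):

{
  "beacons": [
    {
      "proximityUUID": "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0",
      "major": 1,
      "minor": 1,
      "relevantText": "You're near the kiosk"
    }
  ]
}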
Could you please confirm whether this is expected in iOS 26, or if it may be a Beta-specific issue? Any pointers to updated documentation would be appreciated.
Thank you.
There are CarPlay resources from Apple (https://developer.apple.com/documentation/carplay/) but it seems entirely geared towards iOS app development.
Are there any resources or documentation that help with CarPlay integration on the vehicle/head unit side of things? I believe you need to download the CarPlay Communication Plug-In and integrate that with the head unit platform and applications, but there isn't much documentation that comes along with the Plug-In .zip download.
Is there a different place where resources for head unit developers are located?
I keep seeing this in the Xcode console when I launch my app:
-[NFCTagReaderSession beginSessionWithConfig:]:468 error:Error Domain=NFCError Code=203 "System resource unavailable" UserInfo={NSLocalizedDescription=System resource unavailable}, errorCode: 0xcb
Initially, the device was working with NFC. Now, across multiple complete restarts of my app, the system resource is still not available. If I lock the screen and unlock it again, the problem persists.
Rebooting the device appears to get it working again. I suspect this is only happening on iPhone 15 and iOS 18.0.1; my other devices do not exhibit the problem. I'd wager some service in the OS has crashed?
There also appears to be a limitation on renewing NFC sessions. I see similar problems on iPhone 16 to what I experienced on iPhone 15. I don't know what changed with the iPhone 15 and 16, but NFC has really deteriorated for longer sessions; the newer Apple hardware seems worse at NFC than my iPhone 11. Just providing feedback.
The device also has a 100% charge. Issues seem more prevalent at lower battery charge, but that is not the case here.
I've noticed an issue when using an iPhone or iPad that uses a USB-C port (such as the iPhone 15 Pro or iPad Pro 12.9" 6th generation) and a USB-C to USB-C cable on ARM Macs (such as the M2 Mini). After rebooting the Mac, the iOS device is no longer recognized despite the iOS device continuing to charge.
I can temporarily resolve the issue by reseating the USB-C cable, which allows the device to be recognized again by applications like Finder. However, this isn't a practical solution due to the number of M2 Minis we have (each with an attached iPhone for testing) and the Mini's frequent automatic reboots throughout the day. Using a USB-A to USB-C cable (with USB-A connected to the Mac Mini) seems to avoid this problem altogether, as the iOS device remains consistently recognized after a reboot.
As the title suggests, this issue appears to be specific to ARM-based Macs. We've encountered it on both the M2 Mini and a 2021 MacBook Pro with the M1 Max chip. Interestingly, we haven't been able to reproduce this behavior on Intel-based Macs (tested on an 8,1 Mac Mini and a 2019 16" MacBook), where the iOS devices remain connected after a reboot when using a USB-C to USB-C cable. Here are some additional details:
iOS Devices & Versions:
iPhone 15 Pro: Issue persists on both iOS 17.1 and iOS 18.0.1
iPad Pro 12.9" (6th generation): Issue persists on both iPadOS 17.6.1 and iPadOS 18.0.1
Cables:
Apple's 60W USB-C Charge Cable (USB 2.0): Issue occurs
Generic/Third-party USB-C cable (USB 3): Issue occurs
I have been looking at the new feature in iOS 18 where it is possible to pair Matter accessories without a hub. Using the Home app I can successfully commission and control a Matter Thread Light Bulb directly (without a home hub in the network). I have an iPhone 15 Pro which includes the thread hardware.
I then tried to commission the same device in my own app using the MatterSupport framework. In this case the same user interface is displayed as when using the Home App but an error is displayed - "Thread Border Router Required."
Is it also possible to connect directly to a Thread Matter device when using MatterSupport, or does this only work when using the Home app?
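For reference, this is the commissioning call I am using, reduced to a minimal sketch ("My Ecosystem" and "Default Home" are placeholder names):

import MatterSupport

func commissionAccessory() async {
    let home = MatterAddDeviceRequest.Home(displayName: "Default Home")
    let topology = MatterAddDeviceRequest.Topology(ecosystemName: "My Ecosystem", homes: [home])
    let request = MatterAddDeviceRequest(topology: topology)
    do {
        try await request.perform()  // presents the system commissioning UI
        print("Commissioning flow completed")
    } catch {
        print("Commissioning failed: \(error)")
    }
}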
I'm trying to browse my HomeKit accessories and show each accessory's communication protocol, i.e. Wi-Fi, Thread, Zigbee, or Z-Wave!
Zigbee and Z-Wave I can hard-code by model, because I have so few!
But… I can easily find all accessories that are Wi-Fi attached by using:
isBridged == false and
uniqueIdentifiersForBridgedAccessories == nil
However, I can see that some of the accessories are Thread enabled (I have read the documentation for those accessories), yet they still end up in that list.
So, my questions:
Are there any attributes on the accessory that are unique to a Thread accessory?
for accessory in accessories
{
    if accessory.isBridged == false &&
       accessory.uniqueIdentifiersForBridgedAccessories == nil
    {
        // Can be a Wi-Fi device, but more checks are needed
        // requiresThreadRouter???
        if true
        {
            wifiAccessories.append(accessory)
        }
    }
}
I am developing a virtual Bluetooth HID keyboard device on my Windows desktop that connects to my iPad over Bluetooth and advertises itself as a keyboard to control the iPad.
It already works very well on Android, but not on iOS. I can see in Packet Logger that the device reads well as a HID device: iOS reads the report map and HID information correctly, and the data is all valid. But iOS never subscribes to the report's Client Characteristic Configuration; it just silently quits, and the keyboard does not work.
I can post more information if needed, but my question in short is: what are the requirements for iOS to accept a HID-over-GATT peripheral as a keyboard? I feel like I am close.
I have a question about Apple certification.
We are planning a card reader connected via HID (human interface device) for iPads that support USB-C.
The iPad will receive data using the HID protocol.
In this case, do I have to get a certification (for example, MFi), as for an Apple USB accessory?
Topic:
App & System Services
SubTopic:
Hardware
I am developing the electronics of a product that includes Find My functionality. I have seen forum posts saying that testing the Find My feature requires a CSR and a testing token. Can anyone explain how to apply for the CSR and testing token, step by step?
Thank you very much.
Best regards
Sam Ng
Hello. I am building a BLE device for activity and fitness tracking, and I would like a "system level" BLE connection on watchOS using an ESP32 (I have built a test of this on the firmware side). That is, I do not want my iOS app to hand the BLE connection off to the watchOS app. App-level connections do not seem to get as many background updates as a system-level connection, and they also require the watchOS app to be launched in order to connect to the BLE device.
A system-level BLE connection (watchOS Settings > Bluetooth > Health Devices) allows auto-connection in the background and provides more reliable background communication between the BLE device and the Apple Watch.
The Apple MFi page only mentions iOS. From the page, under "Who does NOT need to join":
Developers and manufacturers of accessories that connect to an Apple device using only Bluetooth Low Energy, Core Bluetooth, or standard Bluetooth profiles supported by iOS
Does this apply to watchOS as well?
So, if I am making a BLE device that is activity based and advertises one of the allowable Health Device UUIDs, is the system-level BLE connection allowed using any BLE chip, including, say, an ESP32?
I have built test BLE firmware that advertises a Health Device UUID, and watchOS sees it as a health device.
Is this fine, then? No need for an MFi application, and also no need to worry about which BLE chip is used?
thanks