With the same firmware, OTA testing on the DCL test network was successful in September 2025, and the Home app was able to deliver software update notifications. Since the beginning of 2026, however, the Home app no longer delivers software update notifications.
This is bug number:
FB21922369
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
Hi everyone,
We are currently exploring ways to implement a frictionless Wi-Fi setup for our hardware devices without requiring a dedicated third-party application. We are interested in leveraging Apple's WAC (Wireless Accessory Configuration) to sync Wi-Fi credentials directly from iOS devices. However, we have struggled to find comprehensive technical documentation or specifications regarding the WAC service. Could anyone point us to the official source for these materials?
Additionally, we have a couple of technical questions:
1. We are testing WAC provisioning and found that the Home app can discover our device and successfully get it online. However, the flow always ends with a "Failed to add accessory" message.
Does WAC support imply that a device should be addable via the Home app? If not, why is the Home app able to discover and start the setup for a non-HomeKit WAC device?
2. Our device is already Apple AirPlay certified. Does implementing WAC require additional standalone certification, or is it covered under the existing MFi/AirPlay certification umbrella?
Any insights or guidance would be greatly appreciated. Thank you!
My MacBook Pro M5 running macOS Tahoe 26.3 beta fails to detect two identical ASUS ROG Swift OLED PG32UCDM monitors simultaneously. Only one display is recognized at a time.
One potential root cause might be that both monitors report identical binary EDID serial numbers (0x01010101), and the MacBook Pro M5 appears to use this value exclusively for display identity rather than combining it with other more detailed information (e.g., port, or alphanumeric serial number).
I've verified that the monitors' binary EDID serial numbers are in fact identical; their alphanumeric serial numbers, however, are not.
NOTE: This behavior is specific to the MacBook Pro M5 — when connecting both monitors via USB-C to a Mac Mini M4 Pro running the same macOS Tahoe 26.3 beta, the monitors work fine. The OS detects both and assigns them different names (PG32UCDM (1) and PG32UCDM (2)).
NOTE: I could be wrong about this root cause, I don't have a way to disprove it, though the fact the monitors work fine on a Mac Mini is suspicious.
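For anyone who wants to reproduce the identity check, here is a minimal sketch of one way to read the per-display identifiers the window server reports (illustrative only):
#include <stdio.h>
#import <CoreGraphics/CoreGraphics.h>

// Sketch: print vendor, model, and serial numbers for every online display.
// If the identity hypothesis is right, both PG32UCDM monitors report the same serial.
CGDirectDisplayID displays[8];
uint32_t displayCount = 0;
CGGetOnlineDisplayList(8, displays, &displayCount);
for (uint32_t i = 0; i < displayCount; i++) {
    printf("display %u: vendor=0x%x model=0x%x serial=0x%x\n",
           (unsigned)displays[i],
           CGDisplayVendorNumber(displays[i]),
           CGDisplayModelNumber(displays[i]),
           CGDisplaySerialNumber(displays[i]));
}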
What I have tried:
Connecting the two monitors using different monitor ports (one on DisplayPort, another on HDMI, etc.), and different MacBook ports (one on HDMI, another on USB-C, etc.)
Bumping down the resolution on the monitors to "1920x1080 (low resolution)" and 30Hz to rule out bandwidth issues.
Connecting one, or both, monitors to CalDigit TS5 Plus dock. Neither alternate configuration yields the device recognizing both screens.
Using BetterDisplay to import a manually-edited EDID for the screen, with a different binary EDID value, manufacturer name, etc.
I've also verified that if I plug in my Apple Studio Display as one of the monitors, the MacBook recognizes one of the PG32UCDM monitors and the Studio Display at the same time. The issue seems to occur only when both monitors plugged into it are the same PG32UCDM model.
When I have both monitors plugged into my MacBook, each time I disconnect the cable to whichever monitor is currently recognized, it immediately recognizes the other monitor. Plugging the cable for the disconnected monitor back in has no effect.
I'm at a loss.
Has anyone run into this issue and found a successful workaround that is not one of the approaches I've described above?
My HomePod mini is now on version 16.4, so the temperature and humidity sensors are enabled. The data properly shows up in the Home app on my various devices.
In my HomeKit iPad app running on Mac Catalyst, however, the data does not show up. I would expect the HomePod mini to appear in HMHome.accessories with a service of type HMServiceTypeTemperatureSensor. I see all of my other HomeKit accessories, just not the HomePod mini.
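For reference, the check I'm doing boils down to something like this (a minimal sketch; homeManager stands in for my already-initialized HMHomeManager, queried after homeManagerDidUpdateHomes: has fired):
// Minimal sketch: list accessories whose services include a temperature sensor.
for (HMAccessory *accessory in homeManager.primaryHome.accessories) {
    for (HMService *service in accessory.services) {
        if ([service.serviceType isEqualToString:HMServiceTypeTemperatureSensor]) {
            NSLog(@"Temperature sensor found on %@", accessory.name);
        }
    }
}
On Mac Catalyst the HomePod mini never shows up in this enumeration, even though every other accessory does.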
I have tried with the latest Xcode (14.3) and highest available iOS Target and Minimum Deployment (16.4), macOS version 13.3. I have not, as of this writing, upgraded my HomeKit architecture, however.
Note that I haven't tried the app on an actual iPad (and the iOS simulator doesn't expose my HomeKit environment.)
I am developing a standard UAC 2.0 device and encountered an issue where the channel names do not update according to the iChannelNames field in the Class Specific AS Interface Descriptor when switching between different channel counts.
For example:
AS1 (6 channels) is configured with the following channel names:
ADAT 1, ADAT 2, ADAT 3, ADAT 4, HP L, HP R
AS2 (4 channels) is configured with:
ADAT 1, ADAT 2, HP L, HP R
However, when switching from AS1 (6 channels) to AS2 (4 channels), the channel names displayed in Audio MIDI Setup do not reflect the change as expected. The actual result is:
ADAT 1, ADAT 2, ADAT 3, ADAT 4
The system simply hides the last two channels; the names of the remaining channels are not updated.
Initial Topology
My original topology was as follows:
Later, I discovered that macOS uses the iChannelNames field from the Input Terminal to display channel names. Therefore, I modified the USB device descriptors and updated the topology to the following:
To distinguish the channel names for different channel counts, each Input Terminal is assigned a unique iChannelNames value.
This method worked perfectly on macOS 15. However, after updating to macOS 26, this topology no longer displays the correct channel names.
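As a debugging aid, the names macOS currently exposes can be read back per channel element through the HAL (a hedged sketch; it queries the default input device, so the interface under test would need to be the default input, or its AudioObjectID obtained another way):
#import <Foundation/Foundation.h>
#import <CoreAudio/CoreAudio.h>

// Sketch: print the name CoreAudio reports for input channel 1 of the default input device.
// Element 0 is the main element; elements 1..N correspond to individual channels.
AudioObjectID deviceID = kAudioObjectUnknown;
AudioObjectPropertyAddress defaultInput = {
    kAudioHardwarePropertyDefaultInputDevice,
    kAudioObjectPropertyScopeGlobal,
    kAudioObjectPropertyElementMain
};
UInt32 size = sizeof(deviceID);
AudioObjectGetPropertyData(kAudioObjectSystemObject, &defaultInput, 0, NULL, &size, &deviceID);

AudioObjectPropertyAddress elementName = {
    kAudioObjectPropertyElementName,
    kAudioObjectPropertyScopeInput,
    1 // channel 1; repeat for each channel of the active configuration
};
CFStringRef name = NULL;
size = sizeof(name);
if (AudioObjectGetPropertyData(deviceID, &elementName, 0, NULL, &size, &name) == noErr && name != NULL) {
    NSLog(@"Channel 1 name: %@", name);
    CFRelease(name);
}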
Question
On macOS 26, what is the correct method to ensure that the channel names update dynamically when switching between different audio channel configurations?
My team has developed an app with a brief Matter commissioner feature using the Matter framework and the MatterSupport extension.
Our app supports both iOS and Android. However, we ran into a problem: the control certificate generated by the iOS app cannot control the device from the Android side, and the control certificate generated by the Android app cannot control the device from the iOS side.
The Matter library used on Android is compiled from connectedhomeip.
Has anyone run into the same problem? How can we solve it?
Thank you
Are there some undocumented (or, well, documented, but overlooked by me) prerequisites to the readValueWithCompletionHandler: method?
The reason I ask is that occasionally I am getting the Read/Write operation failed error in the callback, even in cases where direct, non-deferred reading of the value worked properly. It seems to happen very consistently with some accessories and characteristics, not randomly; thus, it is not likely a temporary quirk in the communication with the device.
Probably I am overlooking something important, but it does not make good sense to me. My code (is it right, or can you see anything wrong in there?)
// in an HMCharacteristic category
if ([self.properties containsObject:HMCharacteristicPropertyReadable]) {
    id val = self.value, ident = [NSString stringWithFormat:@" [%@] %@ (%@)", self.uniqueIdentifier, self.localizedDescription, self.service.accessory.name];
    NSLog(@"nondeferred '%@'%@", val, ident);
    if (self.service.accessory.reachable) {
        [self readValueWithCompletionHandler:^(NSError * _Nullable error) {
            if (error) NSLog(@"deferred ERROR %@ -> %@", ident, error);
            else NSLog(@"deferred '%@'%@", self.value, ident);
        }];
    }
}
for most accessories/characteristics works properly, but for some of them I am consistently getting results like
nondeferred '70.5' [64998F70-9C11-502F-B8B4-E99DC5C3171B] Current Relative Humidity (Vlhkoměr TH)
deferred '70.5' ERROR [64998F70-9C11-502F-B8B4-E99DC5C3171B] Current Relative Humidity (Vlhkoměr TH) -> Error Domain=HMErrorDomain Code=74 "Read/Write operation failed." UserInfo={NSLocalizedDescription=Read/Write operation failed.}
Do I do something wrong in my code, or is that normal with some devices?
If the latter, is there perhaps a way to know beforehand that I should not use readValueWithCompletionHandler: (since it is bound to fail anyway) and should simply use self.value non-deferred instead? For some time it seemed to me that it happens with bridged accessories, but that hypothesis was disproved by further testing.
Thanks!
G'day.
At my office the doors are locked with an NFC reader. We carry around little NFC tags on our key chains; the reader reads out a number from the tag and opens the door if that number matches one in the database.
I am tired of carrying the tag around; people keep losing it or forgetting it, and it would be nice to open the door with a phone, which we tend to always have on us.
So I used an NFC-enabled credit card, read out its NFC information, added that number to the database, and can now open doors with my credit card. This is pretty cool. If I forget my keys (most likely they will be on the desk, but silly me left the desk without them), I may still have my wallet with me.
Then I tried Wallet.app on my iPhone and selected the same credit card. However, the door doesn't open. Looking in the door software, I noticed that the tags always transmit the same number, and so does my physical credit card. Wallet.app, however, reads out 4 readings (or maybe just one very long one), and they are always different, so I cannot make them match against the door database.
Any ideas how to make this work? Can I give somehow wallet.app an NFC number which I can then add to my door database? Or how come the credit card and the very same one in wallet.app don't match?
Thanks for your help! It would be neat if I could make this work; it would make a lot of people happy at my office!
Cheers!
Which HomeKit API provides the Home app's scene (HMActionSet) functionality "Remove from Home View" and "Add to Home View"?
There must be a public API for this, since at least one third-party application shows/hides scenes according to how they are set up in Home; nevertheless, whatever I try, I can't find the API.
Thanks!
Summary
On Mac Studio systems (no built-in camera), macOS does not initialize camera services after a normal reboot if no physical camera is present. As a result, Continuity Camera does not appear anywhere in the system.
Observed behavior
System Information → Camera reports “No video capture devices were found.”
Continuity Camera (iPhone) is completely absent from camera lists.
Plugging in any USB UVC webcam immediately initializes camera services and causes both the USB camera and the iPhone (Continuity Camera) to appear.
The USB camera can then be unplugged and Continuity Camera continues working until the next reboot.
Reproduction steps
Use a Mac Studio (no built-in camera) on recent macOS.
Ensure no USB webcam or external camera is connected.
Reboot the Mac normally.
After login, open System Information → Camera.
Expected
Camera services should initialize even when no physical camera is present, allowing Continuity Camera to be available as the primary camera.
Actual
No camera devices are present unless a physical USB camera is connected at least once after boot.
This reproduces 100% of the time on Mac Studio and appears to be a camera service bootstrap issue where Continuity Camera cannot be the first camera device.
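The absence is also visible programmatically; a minimal sketch for checking which capture devices the system currently exposes (the exact device-type constants and their macOS availability are my assumption):
#import <AVFoundation/AVFoundation.h>

// Sketch: enumerate every video capture device the system exposes right now.
// After a clean reboot of the Mac Studio this lists nothing; after a USB UVC camera
// has been plugged in once, both it and the iPhone (Continuity Camera) appear.
AVCaptureDeviceDiscoverySession *discovery =
    [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeExternal,
                                                                       AVCaptureDeviceTypeContinuityCamera]
                                                            mediaType:AVMediaTypeVideo
                                                             position:AVCaptureDevicePositionUnspecified];
for (AVCaptureDevice *device in discovery.devices) {
    NSLog(@"Camera: %@ (%@)", device.localizedName, device.deviceType);
}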
Issue has been filed via Feedback Assistant.
I followed the instructions on the page https://mfi.apple.com/en/help/login-help/How-to-Register-Your-Existing-Apple-ID.html to apply for the MFi Program. According to step 7 of the guide: "You have now created and registered your Apple Account. You will be automatically directed to the MFi Portal to begin the enrollment process," I should have been taken to the enrollment process after logging in.
However, instead of accessing the enrollment page, a pop-up message appears stating: "The Apple Account you signed in with does not have permission to view this page. If you believe your company is currently enrolled in the MFi Program, please contact your company’s Account Administrator to request access to the MFi Portal. If your company is not currently enrolled in the MFi Program, please click here to learn about the program and start the enrollment process."
This has created an endless loop—I cannot proceed to the enrollment process as instructed, and the pop-up only redirects me to information that leads back to the same login and permission issue. Could you please provide guidance on how to resolve this and successfully access the MFi Program enrollment process?
Hello Apple Forums,
We are developing an iOS application that connects to a custom BLE accessory and sends control commands to it.
Our system architecture is as follows:
A separate hardware device collects data and sends it to our backend server via Wi-Fi.
The backend evaluates state changes and determines when the BLE accessory should update its display.
The iOS app acts purely as a BLE command executor for this accessory.
Our goal is to:
Maintain a BLE connection with the accessory while the app is in the background.
Receive state-change events from our backend server.
Upon receiving such events, send a BLE command to the accessory to update its state.
We understand that iOS does not allow arbitrary background execution. We would like to confirm whether there is any supported mechanism, entitlement, or program that allows:
Long-running background execution for BLE control, or
Server-originated events (other than APNs) to trigger background BLE actions.
If this is not supported, we would appreciate confirmation that APNs (silent push) is the only supported way to trigger such background BLE actions, or guidance on any recommended alternative architectures.
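For context, the client-side pattern we are evaluating looks roughly like this (a sketch only; it assumes the bluetooth-central UIBackgroundModes key plus Core Bluetooth state restoration, and peripheral, commandData, and controlCharacteristic are placeholders for our accessory objects):
#import <CoreBluetooth/CoreBluetooth.h>

// Sketch: create the central with a restore identifier so the system can relaunch
// the app to handle Bluetooth events while it is in the background.
// Requires "bluetooth-central" in UIBackgroundModes; the identifier string is arbitrary.
self.centralManager =
    [[CBCentralManager alloc] initWithDelegate:self
                                         queue:nil
                                       options:@{ CBCentralManagerOptionRestoreIdentifierKey : @"com.example.ble-restore" }];

// Later, when a backend-originated event (e.g. delivered via push) arrives and the
// peripheral is connected, the command itself is an ordinary GATT write.
[peripheral writeValue:commandData
     forCharacteristic:controlCharacteristic
                  type:CBCharacteristicWriteWithResponse];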
Thank you for your guidance.
Hello everyone,
I am developing an iOS application that relies on accelerometer data for precise motion and reaction-time measurements.
Based on practical testing, it appears that third-party iOS applications receive accelerometer data at a maximum rate of approximately 100 Hz, regardless of hardware capabilities or requested update intervals.
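For reference, the request looks like this (a minimal sketch; the 200 Hz interval is just an example request, and in practice samples arrive at roughly 100 Hz):
#import <CoreMotion/CoreMotion.h>

// Sketch: request a 200 Hz accelerometer stream and log what actually arrives.
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.accelerometerUpdateInterval = 1.0 / 200.0;
[motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                     withHandler:^(CMAccelerometerData *data, NSError *error) {
    if (data != nil) {
        NSLog(@"t=%.4f x=%.3f y=%.3f z=%.3f",
              data.timestamp, data.acceleration.x, data.acceleration.y, data.acceleration.z);
    }
}];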
I would like to ask for clarification on the following points:
Is there an officially supported way for third-party iOS apps to access accelerometer data at sampling rates higher than ~100 Hz?
If the hardware supports higher sampling rates, is this limitation intentionally enforced at the iOS level for third-party applications?
Are there any public APIs, entitlements, or documented approaches that allow access to higher-frequency sensor data, or is this restricted to system/internal components only?
Thank you in advance for any clarification.
I want to add a Matter device to my own fabric, not to HomeKit via the Home app.
I implemented a demo that adds a MatterSupport extension, and that part succeeds. But when I use MTRDeviceController to commission the device, it goes wrong. Below is the log:
Couldn't read values in CFPrefsPlistSource<0x1062ec100> (Domain: group.wxx.MatterTest, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd
<<5 [E:46634i S:0 M:188511265] (U) Msg Retransmission to 0:0000000000000000 failure (max retries:4)
PASESession timed out while waiting for a response from the peer. Expected message type was 33
controller(:commissioningSessionEstablishmentDone:) error = nil
Error on commissioning step 'AttestationVerification': 'src/controller/CHIPDeviceController.cpp:1288: CHIP Error 0x000000AC: Internal error'
Failed verifying attestation information. Now checking DAC chain revoked status.
Failed in verifying 'Attestation Information' command received from the device: err 101. Look at AttestationVerificationResult enum to understand the errors
Error on commissioning step 'AttestationRevocationCheck': 'src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error'
Failed to send Solitary ack for MessageCounter:265529558 on exchange 46643i:src/messaging/ExchangeContext.cpp:99: CHIP Error 0x00000002: Connection aborted
Creating NSError from src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error (context: (null))
controller(:commissioningComplete:nodeID:metrics:) error = Optional(Error Domain=MTRErrorDomain Code=1 "General error: 172" UserInfo={NSLocalizedDescription=General error: 172, errorCode=172})
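For context, my commissioning call is roughly the following (a sketch; the attestation-delegate property and exact parameter names are my reading of the public Matter.framework headers, so treat them as assumptions, and attestationDelegate is a placeholder object conforming to MTRDeviceAttestationDelegate):
// Sketch: establish the PASE session from the scanned payload, then commission the node.
// `controller` is the MTRDeviceController on my own fabric; `payload` is the parsed MTRSetupPayload.
NSError *error = nil;
[controller setupCommissioningSessionWithPayload:payload newNodeID:@(1234) error:&error];

// Then, once controller(_:commissioningSessionEstablishmentDone:) reports success:
MTRCommissioningParameters *params = [[MTRCommissioningParameters alloc] init];
// Assumption: attaching a device attestation delegate lets the app inspect (and decide
// how to handle) a failed DAC check, e.g. for a development device with a test DAC.
params.deviceAttestationDelegate = attestationDelegate;
[controller commissionNodeWithID:@(1234) commissioningParams:params error:&error];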
Does anyone have any suggestions for this issue?
Hello everybody,
I have a never-ending issue with App Store review and need quick help!
I am submitting a new app (oral training) for iPhones only.
I disabled other devices (such as iPads) via Xcode.
In the App Store information form, it is mandatory to provide iPad screenshots, so I provided screenshots showing the iPhone experience.
The App Store review team asked me to remove them because I don't support iPads. But if I remove those screenshots, the form cannot be submitted.
I don't understand how to proceed.
Thanks for the help
Regards
Jean
Hi, we have developed an application that streams data from two BLE peripherals at a rate of 14.5 kbps per peripheral. Until now, our devices streamed in near real time with no lag on all Apple devices with Bluetooth 5.0 or greater. Since the release of the iPhone 17 series and the iPad A16, we have reports from users of the data being streamed at significantly lower rates than expected.
Any help here would be greatly appreciated as our customers are being affected by this change.
Hi,
I’m developing a Matter commissioning flow and would like to clarify Apple Home’s support for concatenated (multi-device) QR codes.
In my implementation, I generate a single QR code that contains multiple Matter onboarding payloads (concatenated payloads), intended to commission multiple devices in one scan, similar to a multi-pack / multi-accessory flow.
What I’ve tested:
Standard single-device Matter QR codes work as expected in the Apple Home app
A concatenated QR code (multiple Matter payloads combined into one QR) does not get recognized / commissioned by Apple Home
My questions:
Does Apple Home officially support commissioning via concatenated or multi-device Matter QR codes?
If yes, is there a specific payload format or delimiter that Apple Home expects?
If not, is this a known limitation or something planned for future iOS/Home releases?
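As a side note, one way to check whether a given string is even accepted at the framework level is to parse it with MTRSetupPayload (a sketch; whether the Home app uses the same parser for concatenated payloads is an assumption on my part):
// Sketch: try to parse a scanned onboarding string; a nil result plus an error
// indicates the framework does not accept the string as a single payload.
NSString *scanned = @"MT:EXAMPLE-PAYLOAD"; // placeholder for the scanned QR string
NSError *error = nil;
MTRSetupPayload *payload = [MTRSetupPayload setupPayloadWithOnboardingPayload:scanned error:&error];
NSLog(@"parsed payload: %@, error: %@", payload, error);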
My MacBook speakers have started crackling on every sound since macOS 26 Beta 1, and the problem is still the same on Beta 9.
It happens especially when Simulator is open.
Hello.
I am attempting to wrap the C library libnfc as a Swift library. This is not for use on macOS - it's mainly for use on Linux (Raspberry Pi).
I have a USB reader and my code appears to work so far, however the code/test/debug cycle is suboptimal if I'm running the code on the Pi.
As I use a Mac for day-to-day coding, I'd prefer to use Xcode and my Mac for development. macOS appears to claim the NFC hardware for its own frameworks, and attempting to open a connection to the USB device gives an "Unable to claim USB interface (Permission denied)" error.
ioreg shows that the hardware is claimed by an Apple framework:
"UsbExclusiveOwner" = "pid 10946, com.apple.ifdbun"
Is there a way to temporarily override that and use the hardware myself? I've tried Googling, but most of the replies are out of date, and Claude's advice of launchctl unload /System/Library/LaunchDaemons/com.apple.ifdreader.plist doesn't appear to work...
I'm wary of disabling SIP; is there a simple way to get access to the hardware myself?
Thanks.