Hello. I am building a fitness/activity BLE device and would like a "system-level" BLE connection on watchOS using an ESP32 (I have built a test of this on the firmware side). That is, I do not want my iOS app to hand the BLE connection off to the watchOS app. App-level connections do not seem to get as many background updates as a system-level connection, and they also require the watchOS app to be launched in order to connect to the BLE device.
The system-level BLE connection (watchOS Settings > Bluetooth > Health Devices) allows auto-connection in the background and provides more reliable background communication between the BLE device and the Apple Watch.
The Apple MFi page only mentions iOS. Under "Who does NOT need to join", it says:
Developers and manufacturers of accessories that connect to an Apple device using only Bluetooth Low Energy, Core Bluetooth, or standard Bluetooth profiles supported by iOS
Does this apply to watchOS as well?
So, if I am making an activity-based BLE device that advertises one of the allowable health-device service UUIDs, is the system-level BLE connection allowed with any BLE chip, including, say, an ESP32?
I have built test BLE firmware that advertises a health-device service UUID, and watchOS sees it as a health device.
Is this fine then: no need for an MFi application, and no need to worry about which BLE chip is used?
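As a sanity check that the accessory really is advertising a standard health service, here is a minimal Core Bluetooth sketch of the app-level path; the Heart Rate service UUID 0x180D below is only an example of a standard health-device service and may not match your firmware:
import CoreBluetooth

// A minimal scan for a standard health service; 0x180D (Heart Rate) is an
// example UUID and may differ from the one the test firmware advertises.
final class HealthDeviceScanner: NSObject, CBCentralManagerDelegate {
    private let heartRateService = CBUUID(string: "180D")
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: .main)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [heartRateService])
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("Found health accessory: \(peripheral.name ?? "unknown"), RSSI \(RSSI)")
    }
}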
thanks
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
HomePod Mini running 18.6 build 22M5054/b - will not update to HomePod OS26
I have tried un-enrollment, reset, removal, etc., with no luck. Is anyone else seeing this? Any known workarounds?
The iPad is running iPadOS 26 Release 2 (23A5276f).
I am writing to report an issue I’m facing after updating my iPhone 11 Pro Max to iOS 26.
I have been using the Hollyland Lark M2 external microphone via the Lightning port, and it was working perfectly before the update. However, after upgrading to iOS 26, the iPhone no longer detects it correctly. The device now recognizes the mic as a pair of wired earphones, and it fails to capture any audio input.
The microphone itself works flawlessly on other devices, so this appears to be an iOS-specific issue.
Could you please confirm:
• Whether this is a known issue in iOS 26?
• If there are any settings or steps I can take to resolve this?
• Whether a fix is planned in an upcoming iOS patch?
I would appreciate any guidance or solution you can provide.
Thank you for your support.
MacBook Pro M4 - will not accept any power adapter after beta update
iPhone 16 Pro - same exact problem
The devices are dead.
I have tried multiple chargers; the Watch and iPad appear to be taking a charge for now.
I bought an iPhone 16 Pro Max on October 28 (9 months ago) and rarely let the battery drop below 20%. I also set the charge limit to 80% to preserve battery health. I am not a heavy multitasking user, and I used a fan to keep the phone cool during charging. But today I updated to the iOS 26 public beta, and battery health dropped to 99% at only 88 cycles, which is quite a low cycle count. Many other users see their battery health drop to 99% only after 120-130+ cycles of daily usage. Why did mine drop after updating? I am quite unhappy with it, and iOS 26 is very jittery on my phone.
Hello,
We are testing Wallet passes with iBeacons in iOS 26 Beta.
In earlier iOS releases, when a device was in proximity to a registered beacon, the corresponding pass would surface automatically.
In iOS 26 Beta, this behavior no longer occurs, even if the pass is already present in Wallet. I have not found documentation of this change in the iOS 26 release notes.
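To rule out the beacon side, a quick Core Location ranging check can confirm the device still detects the beacon at all; a minimal sketch, where the UUID below is a placeholder for the proximity UUID registered in the pass:
import CoreLocation

// A sketch to verify the device ranges the beacon; replace the placeholder
// UUID with the one listed in the pass's beacons array.
final class BeaconChecker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager, didRange beacons: [CLBeacon],
                         satisfying beaconConstraint: CLBeaconIdentityConstraint) {
        for beacon in beacons {
            print("Ranged beacon \(beacon.uuid), proximity \(beacon.proximity.rawValue)")
        }
    }
}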
Could you please confirm whether this is expected in iOS 26, or if it may be a Beta-specific issue? Any pointers to updated documentation would be appreciated.
Thank you.
Hello,
The Apple Pencil Pro brought with it the UICanvasFeedbackGenerator API, which lets us trigger haptic feedback on discrete events initiated by the pencil. That works fine.
My question then: is it possible / are we "allowed" to trigger haptic feedback on events that weren't initiated by the pencil?
For example, say the user is dragging a slider with a finger of their left hand while holding the pencil in their right hand: would it be possible to make the pencil vibrate to indicate that the dragged slider knob has reached a certain point?
Or is the rule that vibration is only possible/allowed when the pencil itself generated a touch?
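For what it's worth, nothing in the API signature itself prevents calling the generator from a non-pencil event; here is a sketch of that experiment (the slider wiring is hypothetical), with the open question being whether the system actually routes the haptic to the Pencil in that case:
import UIKit

// A sketch: fire UICanvasFeedbackGenerator from a slider's value-changed event
// (a finger touch, not a Pencil interaction) and observe whether the Pencil vibrates.
final class CanvasViewController: UIViewController {
    private lazy var feedback = UICanvasFeedbackGenerator(view: view)
    private let slider = UISlider()

    override func viewDidLoad() {
        super.viewDidLoad()
        slider.frame = CGRect(x: 20, y: 100, width: 280, height: 44)
        slider.addTarget(self, action: #selector(sliderChanged(_:)), for: .valueChanged)
        view.addSubview(slider)
    }

    @objc private func sliderChanged(_ sender: UISlider) {
        // Trigger feedback when the knob crosses the midpoint.
        if abs(sender.value - 0.5) < 0.01 {
            feedback.alignmentOccurred(at: slider.center)
        }
    }
}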
Thanks!
Hello,
I am a developer planning to build an application using Apple's new SpeechTranscriber technology.
I am facing an issue where SpeechTranscriber is not available on my iPad Pro (11-inch, 2nd generation, model number: MXDC2J/A), even though I have updated it to iPadOS 26. I was under the impression that SpeechTranscriber would be available on any device running iPadOS 26. Could you please clarify if this is incorrect?
Furthermore, I am planning to purchase a new iPad with an A16 chip for the development and deployment of this application. Can you confirm if SpeechTranscriber will be fully functional on an iPad equipped with the A16 chip?
Thank you for your assistance.
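I haven't found a published device list either; the quickest check I know of is to probe availability directly on each device. A minimal sketch, assuming the static SpeechTranscriber.isAvailable property (treat the exact API surface as an assumption and verify against the current Speech framework headers):
import Speech

// A minimal probe; assumes the static SpeechTranscriber.isAvailable property.
func logTranscriberSupport() {
    if SpeechTranscriber.isAvailable {
        print("SpeechTranscriber reports available on this device")
    } else {
        print("SpeechTranscriber reports unavailable (likely unsupported hardware or OS)")
    }
}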
We have recently encountered an App crash, as shown in the picture.
We call this API as follows:
let session = AVAudioSession.sharedInstance()
guard session.currentRoute.outputs.isEmpty == false else { return false }
TestFlight caught this crash; the iOS device information is attached:
Do you have any suggestions to avoid this crash?
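Without the crash log text it is hard to say what is failing, but one pattern that avoids querying currentRoute synchronously at every call site is to cache route availability from the route-change notification instead. A sketch, not a confirmed fix for this specific crash:
import AVFoundation

// A sketch: cache whether any output is present by observing route changes,
// instead of reading currentRoute at each call site.
final class RouteMonitor {
    private(set) var hasOutputs = !AVAudioSession.sharedInstance().currentRoute.outputs.isEmpty
    private var observer: NSObjectProtocol?

    init() {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { [weak self] _ in
            self?.hasOutputs = !AVAudioSession.sharedInstance().currentRoute.outputs.isEmpty
        }
    }
}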
We are preparing to implement document signing using USB tokens on iOS and macOS. Several other applications already support this feature.
From my testing and development efforts, I've been unable to reliably access or utilize certificates stored on a smartcard through the iOS APIs. Here are the specifics:
Environment
iOS: 15 and later
Xcode: Versions 18 and 26
Smartcard/Token: ePass 2003 (eMudhra), Feitien token (Capricorn)
Observed Issue:
The token is recognized at the system level, with certificates visible in Keychain Access.
However, programmatic access to the private keys on the smartcard from within the app is not working.
Signing attempts result in Error 6985 and CACC errors.
Approaches Tried:
Updated provisioning profiles with the following entitlements:
com.apple.developer.smartcard
com.apple.security.device.usb
TKSmartCard
Employed TKSmartCard and TKSmartCardSession for interaction.
The token is detected successfully.
A session can be established, but there's no straightforward method to leverage it for certificate-based signing.
Access to signing functions is unavailable; operations yield Error 6985 or CACC errors.
if let smartCard = TKSmartCard(slot: someSlot) {
    smartCard.openSession { session, error in
        if let session = session {
            let command: [UInt8] = [0x00, 0xA4, 0x04, 0x00]
            session.transmit(Data(command)) { response, error in
                print("Response: \(String(describing: response))")
                print("Error: \(String(describing: error))")
            }
        }
    }
}
TokenKit (macOS/iOS)
- Utilized TKTokenWatcher to identify available tokens on macOS (not available on iOS).
watcher.setInsertionHandler { tokenID in
    print("Token detected: \(tokenID)")
}
CryptoKit / Security Framework
- Attempted to retrieve SecCertificate using SecItemCopyMatching queries, which succeeded on macOS but failed on iOS.
let query: [CFString: Any] = [
    kSecClass: kSecClassCertificate,
    kSecReturnRef: true,
    kSecMatchLimit: kSecMatchLimitAll
]
var items: CFTypeRef?
let status = SecItemCopyMatching(query as CFDictionary, &items)
print("Status: \(status)") // macOS succeeds, iOS fails
ExternalAccessory Framework (EAAccessory)
* Investigated using EAAccessory and EASession for external token communication, but it did not function as expected.
This functionality is critical for my project. Has anyone successfully implemented smartcard-based signing on iOS? Any guidance, sample code, or references to relevant Apple documentation would be greatly appreciated.
Tags:
iOS
Apple CryptoKit
USBDriverKit
CryptoTokenKit
I have a small furnace that heats a sample of my product, a cylinder measuring 20mm x 100mm. As it is heated there is thermal expansion, and at the moment I read a micrometer dial gauge by eye and type the value, in hundredths of a millimeter, into my app. My source code uses Quartz to draw a PDF graph that represents this thermal expansion.
I would like to enhance my setup by having my app collect the linear displacement from my dilatometer furnace directly. One possibility I have been trying to implement is to have my app collect DC millivolts from an instrument, such as a voltmeter reading the output of an LVDT (Linear Variable Differential Transformer), but the model I have requires the Modbus RTU protocol, which has little support among macOS developers. It also requires a USB-serial converter, which complicates things and makes the setup less reliable.
At the moment I have to sit next to my furnace for 3 hours, typing the temperature and linear displacement into my app. I would like to read some comments and suggestions from you or an Apple engineer on how this could be achieved.
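If you do stay with the LVDT and voltmeter route, reading the USB-serial converter directly from the app is possible with plain POSIX calls, and the Modbus RTU framing (CRC-16 and so on) can then be handled in your own code. A minimal sketch, assuming a hypothetical /dev/cu.usbserial-1410 device path and 9600 8N1 settings taken from the meter's manual:
import Darwin

// A minimal sketch for reading raw bytes from a USB-serial adapter on macOS.
// The device path is hypothetical; look yours up with `ls /dev/cu.*`.
func openSerialPort(path: String = "/dev/cu.usbserial-1410", baud: speed_t = speed_t(B9600)) -> Int32? {
    let fd = open(path, O_RDWR | O_NOCTTY | O_NONBLOCK)
    guard fd >= 0 else { return nil }

    var settings = termios()
    guard tcgetattr(fd, &settings) == 0 else { close(fd); return nil }

    cfmakeraw(&settings)                                 // raw mode: no line editing or translation
    cfsetspeed(&settings, baud)                          // Modbus RTU meters are often 9600 or 19200 baud
    settings.c_cflag |= tcflag_t(CS8 | CLOCAL | CREAD)   // 8 data bits, ignore modem control lines
    settings.c_cflag &= ~tcflag_t(PARENB | CSTOPB)       // no parity, 1 stop bit (check the meter's manual)

    guard tcsetattr(fd, TCSANOW, &settings) == 0 else { close(fd); return nil }
    return fd
}

// Usage: poll the port and hand complete Modbus RTU frames to your own parser.
if let fd = openSerialPort() {
    var buffer = [UInt8](repeating: 0, count: 256)
    let n = read(fd, &buffer, buffer.count)
    if n > 0 {
        print("received \(n) bytes:",
              buffer[0..<n].map { String(format: "%02X", $0) }.joined(separator: " "))
    }
    close(fd)
}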
Here is a video of my dilatometer furnace:
https://www.correiofacil.com/video/IMG_8888.MOV
Here is a PDF I generated using my APP:
https://www.correiofacil.com/PDF/117.pdf
Here is a video showing how I type data manually into my app:
https://www.correiofacil.com/video/recording2.mov
Here is a video that shows my app drawing a graph:
https://www.correiofacil.com/video/recording1.mov
These two images show my attempts to use the ModBusRtuMaster app to collect data from the voltmeter, which currently fail:
https://www.correiofacil.com/PDF/8f9be971dee32a58c36b307dbab3dc2d.png
https://www.correiofacil.com/PDF/4ac20c2ea762fd42ed340a7a2b567900.png
You will notice that although the instrument is displaying 0.000 volts, the data collected changes with each message I send to the instrument, which tells me the Modbus RTU setup is not working.
Here is a link to the manual of the DC voltmeter I am using:
https://www.correiofacil.com/PDF/manual1.pdf
Here is an image of the LVDT I am trying to use:
https://www.correiofacil.com/PDF/e0349986512f9273b26b209baa3ba5c7.png
Here is the manual for the LVDT:
https://www.correiofacil.com/PDF/GGD19.pdf
I suspect there could be a completely different approach to collecting the data and getting it into my app, maybe Bluetooth?
Maybe using a Mitutoyo gauge such as this one:
https://www.mitutoyo.com/Images/CatalogUS-1003/resources/_pdfs_/US-1003_Catalog_251.pdf
And a cable with SPC support like this:
https://www.gw-style.com/product-p-942977.html
I am very grateful for your comments
For which iPhone and iPad models running iOS 26 is SpeechTranscriber.isAvailable true?
When our Bluetooth device is scanned and a connection is initiated through the app on the iPhone 17, the air log shows that the iPhone sends an LL_LENGTH_REQ to execute the Data Length Update Procedure. However, our peripheral does not support the Bluetooth LE Data Length Extension, so it responds with an LL_UNKNOWN_RSP PDU with the UnknownType field set to LL_LENGTH_REQ.
After receiving the LL_UNKNOWN_RSP, the iPhone 17 does not proceed with the subsequent Bluetooth LE service discovery process. The connection is maintained until the peripheral actively disconnects.
Once the peripheral disconnects and continues broadcasting Bluetooth signals, the iPhone 17 repeatedly tries to connect to the peripheral and executes the aforementioned process, even if the app has been terminated.
According to the Bluetooth 4.2 core specification ([Vol. 6] Part B, Section 5.1.9), which can be found here: https://www.bluetooth.com/specifications/specs/core-specification-amended-4-2/, the iPhone should accept the LL_UNKNOWN_RSP and terminate the Data Length Update Procedure after receiving it, proceeding with the subsequent operations using the default minimum parameters.
Phenomenon: when the app calls the - (void)connectPeripheral:(CBPeripheral *)peripheral options:(nullable NSDictionary<NSString *, id> *)options method, the connection result callback is never received. After approximately 10 seconds, a failure callback arrives with the message: central did fail to connect to peripheral: <TY : 45E4A697-31AE-9B5A-1C38-53D7CA624D8C, Error Domain=CoreBLEErrorDomain Code=400 "(null)">.
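This does not address the link-layer behavior itself, but as an app-level mitigation you can stop the system from retrying a stuck connection by adding your own timeout; a sketch (Core Bluetooth connect requests do not time out on their own):
import CoreBluetooth

// A sketch of a manual connection timeout: if didConnect/didFailToConnect has not
// fired within a few seconds, cancel the pending connection so the system stops
// retrying this peripheral in the background.
final class ConnectionManager: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private var pendingTimeout: DispatchWorkItem?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: .main)
    }

    func connect(_ peripheral: CBPeripheral, timeout: TimeInterval = 10) {
        central.connect(peripheral, options: nil)
        let work = DispatchWorkItem { [weak self] in
            // Still pending: give up instead of letting the system keep trying.
            self?.central.cancelPeripheralConnection(peripheral)
        }
        pendingTimeout = work
        DispatchQueue.main.asyncAfter(deadline: .now() + timeout, execute: work)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) { }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        pendingTimeout?.cancel()
    }

    func centralManager(_ central: CBCentralManager, didFailToConnect peripheral: CBPeripheral, error: Error?) {
        pendingTimeout?.cancel()
    }
}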
Since 17.4 Dev Beta 2, I have been having Bluetooth issues.
I had hoped it would have cleared up but even in 17.4.1 it continues.
Airpod and Echo Auto are the only 2 audio devices I have.
The audio becomes choppy, rubber-bands, or sounds robotic, and sometimes disconnects completely.
While driving, it occurs on both audio devices.
Sometimes the issue occurs while I'm stopped at a red light.
The phone is less than 3 feet from the device at all times.
I have read the forums and removed and re-added the devices, but that did not help.
I really do not want to have to reset my phone since my 2FA apps do not recover in a restore.
Anyone have any suggestions?
Hi everyone,
I'm trying to understand MatterSupport and how to get the onboardingPayload from the commissionDevice function.
I followed the docs at https://developer.apple.com/documentation/mattersupport/adding-matter-support-to-your-ecosystem
I also added the Matter extension, added NSBonjourServices, and called .perform(). The UI shows up correctly and I can scan the QR code, which offers to pair the device to your ecosystem.
I launched the extension via Xcode in my application, but the RequestHandler isn't triggering.
Did I miss something? Can someone point me into the right direction please?
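For reference, the onboardingPayload is delivered to the extension's request handler, not to the host app process, so breakpoints in the app target won't hit it; attach the debugger to the running extension process instead. A minimal sketch of the extension side (class and method names follow the standard MatterSupport extension template; persisting the payload for the host app via an App Group is my own assumption):
import Foundation
import MatterSupport

// The system calls this handler inside the Matter extension process during
// the MatterAddDeviceRequest.perform() flow.
final class RequestHandler: MatterAddDeviceExtensionRequestHandler {
    override func commissionDevice(in home: MatterAddDeviceRequest.Home?,
                                   onboardingPayload: String,
                                   commissioningID: UUID) async throws {
        // The payload lands here, in the extension. Hand it to your own
        // commissioning stack, or persist it (for example in an App Group
        // container) so the host app can read it afterwards.
        NSLog("Received onboarding payload: \(onboardingPayload)")
    }
}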
I have a 2019 iMac 5K with an Apple 2TB internal, which I've been using since June 2019. DriveDx says the drive is working correctly and has 94% of lifetime remaining. When I switched from Sonoma to Sequoia, I backed everything up several ways, reformatted the internal, installed 15.0 from a USB drive, copied all of my data back and then installed each app one by one.
For years I generally ran Disk Utility every couple of weeks and never had any problems.
Because I've had lots of software problems running Sequoia, I've tried just about every version of the OS--release, public beta and developer beta. Right now I'm running 15.3 beta 2.
When I installed this version, I reformatted the internal drive, installed 15.3 beta 2 and restored my programs and data from Time Machine. After that I ran Disk Utility. At the top level, Apple SSD SM2048L..., showed no errors and neither did any of the drives below the top. But after a few days, if I run Disk Utility, Container disk3 shows errors, as does the bottom Macintosh HD. See below. And this happens after every fresh install. Good and clean for a couple of days, then the errors in Disk Utility start showing up.
These errors have apparently caused no problems, but I'd like to get them fixed. How do I do it as Disk Utility is not fixing them? And because I'm having these errors in DU is my internal going bad?
I know Disk Utility shows it's performing repairs and the disk is OK afterward, but it's apparently not as I get the same result every time I run Disk Utility.
Thank you for your help.
/Users/imac4/Desktop/Disk Utility/Screenshot 2025-01-16 at 8.42.58 AM.jpg
Hi everyone,
I am seeking clarification regarding the communication capabilities between an ESP32 microcontroller and Apple's latest devices, specifically the iPhone 16 Pro Max and iPad Pro, both equipped with USB-C ports.
Background:
MFi Certification: Historically, establishing communication between external devices and iOS devices required MFi (Made for iPhone/iPad) certification, but I recall this being necessary mainly in the Lightning-to-USB era.
With the introduction of USB-C ports in recent iPhone and iPad models, there are indications that MFi certification may no longer be necessary for certain peripherals. Perhaps I'm not confident about the terminology here: https://mfi.apple.com/en/who-should-join
Project Requirements: I am working on a sensor research project that necessitates the collection of low-latency time-series data from an ESP32 microcontroller, which features a USB-C port. The data needs to be transmitted to an iPhone 16 Pro Max or iPad Pro. Bluetooth communication has proven insufficient due to its limited data transfer rates (~1.2 Mbps with L2CAP). While NEHotspot could be an alternative, it restricts the iPad's internet connectivity. Therefore, establishing a direct USB-C connection between the ESP32 and the iOS device appears to be the most viable solution.
Questions:
MFi Certification Necessity: Is MFi certification still required for an ESP32 microcontroller to communicate with iPhone 16 Pro Max or iPad Pro via USB-C?
USB-C Communication Support: Do the iPhone 16 Pro Max and iPad Pro natively support serial communication over USB-C with microcontrollers like the ESP32? If not, are there recommended protocols or interfaces to facilitate this communication?
App Development Considerations: Would developing a custom iOS application be necessary to handle data transmission from the ESP32 over USB-C? If so, are there specific APIs or frameworks provided by Apple to support this functionality?
Data Transfer Rates: Considering the need for high-speed data transfer, are there any limitations or considerations regarding the data transfer rates achievable through a USB-C connection between the ESP32 and iOS devices?
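Regarding questions 2-4 above, one MFi-free direction worth evaluating (my own assumption, not something confirmed by Apple's pages cited above) is to have the ESP32 enumerate as a USB network-adapter class device (CDC-ECM/NCM), so that iOS brings up an IP interface over the cable, and then exchange UDP datagrams with a custom app via the Network framework. A minimal sketch of the app side, with a placeholder host and port:
import Network

// Assumes the ESP32 presents itself as a USB network adapter and assigns the
// iPhone/iPad an address; 192.168.7.1:5000 is a placeholder for whatever
// address and port the firmware actually uses.
final class SensorLink {
    private let connection = NWConnection(host: "192.168.7.1", port: 5000, using: .udp)

    func start() {
        connection.stateUpdateHandler = { state in
            print("UDP link state: \(state)")
        }
        connection.start(queue: .global(qos: .userInitiated))
        receiveLoop()
    }

    private func receiveLoop() {
        connection.receiveMessage { [weak self] data, _, _, error in
            if let data { print("Received \(data.count) bytes of sensor data") }
            if error == nil { self?.receiveLoop() }   // keep reading datagrams
        }
    }

    func send(_ payload: Data) {
        connection.send(content: payload, completion: .contentProcessed { error in
            if let error { print("Send failed: \(error)") }
        })
    }
}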
Thank you!
We are experiencing problems with the USB port on the iPad Pro 11 inch (M4), model number MVW13NF/A. Our custom peripheral device (based on a Raspberry Pi Pico + the tinyUSB stack) is configured as a network-adapter class device and communicates with our app over UDP. Our device also acts as a DHCP server, providing the IP address for the iPad.
The problem can be described as a "bus stall" or "bus hold" after sleep mode. To reproduce it, we simply put the iPad to sleep using the power button; the USB bus on the M4 goes into the suspended state and never resumes when we wake the iPad up.
The problem has occurred since the upgrade to iOS 18.2.1 and has not been observed before on the previously installed iOS 17 on the same iPad Pro M4.
Also, the problem does not happen on the iPad Pro 11 inch (3rd gen with M1) model number MHW73FD/A, with the same iOS 18.2.1 installed. The problem also does not arise, if we connect our device via USB hub to the same iPad Pro M4.
We have tested different versions of the tinyUSB stack (both the one included in the RPi Pico SDK and the native, unpatched one). The problem is independent of the library version. It always occurs if our device is connected directly to the USB port of the iPad Pro (M4) with iOS 18, and it persists after upgrading to the latest iOS 18.3 (beta).
The attached logs contain debug output (reduced for clarity) from the tinyUSB library about events on the USB bus. These logs were captured via RTT debugging output using a Segger J-Link debugger, so the logging process does not affect the timings on the USB bus.
There are three logs attached, for cases 1: "iPad Pro M4 + iOS18" (i.e. problematic case), 2: "iPad Pro M1 + iOS18", and 3: "iPad Pro M4 + iOS18 + external USB hub" (they are non-problematic cases).
case1_usbd_log.txt
case2_usbd_log.txt
case3_usbd_log.txt
This was already posted as feedback with id FB16366509
Hello,
What is the best, Apple-recommended way to get a display's name and vendor information?
The CoreGraphics framework provides only ModelNumber and VendorNumber.
IOKit does not appear to provide any documented way at all.
Is there a daemon-safe way to get this information?
Thank you in advance,
Pavel
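For the app/agent case, here is a minimal sketch that pairs NSScreen's localized name with the CoreGraphics vendor and model numbers; note the daemon caveat in the comments (I am not aware of a documented daemon-safe equivalent):
import AppKit
import CoreGraphics

// A sketch that pairs each display's human-readable name with its vendor/model numbers.
// Caveat for the daemon case: NSScreen needs a GUI login session, so this only works
// from an app or launch agent running in the user's session, not from a system daemon.
for screen in NSScreen.screens {
    guard let number = screen.deviceDescription[NSDeviceDescriptionKey("NSScreenNumber")] as? NSNumber else { continue }
    let displayID = CGDirectDisplayID(number.uint32Value)

    let name = screen.localizedName                   // e.g. the marketing name, macOS 10.15+
    let vendor = CGDisplayVendorNumber(displayID)     // same values CoreGraphics reports
    let model = CGDisplayModelNumber(displayID)

    print("\(name): vendor 0x\(String(vendor, radix: 16)), model 0x\(String(model, radix: 16))")
}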
We have an application built for communication among emergency first responders. Our app streams video from emergency responders' mobile devices to other responders; however, when the app moves into the background or the screen locks, the stream is terminated. Is there a way to allow the stream to persist?