We are developing an application with the Core Bluetooth framework. We connect to a device over BLE and open two L2CAP channels, which transfer data over their streams. However, when we close the second L2CAP channel, the first L2CAP channel is always closed as well.
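For illustration, a minimal sketch of the flow in question (the PSM constants and the channels array are placeholders, not our exact code):
import CoreBluetooth

var channels: [CBL2CAPChannel] = []

// Open two channels on the connected peripheral, using the PSMs published by the accessory.
peripheral.openL2CAPChannel(firstPSM)
peripheral.openL2CAPChannel(secondPSM)

// CBPeripheralDelegate
func peripheral(_ peripheral: CBPeripheral, didOpen channel: CBL2CAPChannel?, error: Error?) {
    guard let channel = channel, error == nil else { return }
    channel.inputStream.open()
    channel.outputStream.open()
    channels.append(channel)   // keep a strong reference to each channel
}

// Closing only the second channel: we close its streams and drop our reference,
// yet the first channel gets closed as well.
func close(_ channel: CBL2CAPChannel) {
    channel.inputStream.close()
    channel.outputStream.close()
    channels.removeAll { $0 === channel }
}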
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
I'm trying to browse my HomeKit accessories and show each accessory's communication protocol, i.e. Wi-Fi, Thread, Zigbee, or Z-Wave.
Zigbee and Z-Wave I can hard-code by model, because I have so few of them.
But I can easily find all accessories that are attached over Wi-Fi by using:
isBridged == false and
uniqueIdentifiersForBridgedAccessories == nil
However, I can see that some of the accessories are Thread-enabled (I have read the documentation for the accessory), yet they still end up in that list.
So, my question:
Are there any attributes on the accessory that are unique to a Thread accessory?
for accessory in accessories {
    if accessory.isBridged == false &&
       accessory.uniqueIdentifiersForBridgedAccessories == nil {
        // Can be a Wi-Fi device, but needs more checks
        // requiresThreadRouter???
        if true {
            wifiAccessories.append(accessory)
        }
    }
}
I am developing a virtual Bluetooth HID keyboard on my Windows desktop that connects to my iPad over Bluetooth and advertises itself as a keyboard to control the iPad.
It already works very well on Android, but not on iOS. In Packet Logger I can see that it is read correctly as a HID device: the iPad reads the report map and HID Information, and the data is all valid. However, the iPad never subscribes to the report characteristic's Client Characteristic Configuration descriptor; it just silently stops, and the keyboard does not work.
I can post more information if needed, but my question, in short, is: what are the requirements for iOS to accept a HID-over-GATT keyboard peripheral? I feel like I am close.
I have a question about Apple certification.
We are planning a card reader that connects over HID (human interface device) to iPads with USB-C.
The iPad will receive data using the HID protocol.
In this case, do I have to obtain a certification (for example, MFi), as with an Apple USB accessory?
Topic: App & System Services, SubTopic: Hardware
I am developing the electronic part of a product that includes Find My functionality. I have seen in some forum posts that testing the Find My feature requires a CSR and a testing token. Can anyone explain, step by step, how to apply for the CSR and the testing token?
Thank you very much.
Best regards
Sam Ng
Hello. I am building a BLE device that is activity/fitness based, and I would like a "system level" BLE connection on watchOS using an ESP32 (I have built a test of this on the firmware side). That is, I do not want my iOS app to hand the BLE connection over to the watchOS app. App-level connections do not seem to get as many background updates as a system-level connection, and they also require the watchOS app to be launched in order to connect to the BLE device.
The system-level BLE connection (watchOS Settings > Bluetooth > Health Devices) allows automatic connection in the background and gives more reliable background communication between the BLE device and the Apple Watch.
On the Apple MFi page, only iOS is mentioned.
From the Apple MFi page:
:: Who does NOT need to join -
Developers and manufacturers of accessories that connect to an Apple device using only Bluetooth Low Energy, Core Bluetooth, or standard Bluetooth profiles supported by iOS
Does this apply to watchOS as well?
So, if I am making a BLE device that is activity based and advertises one of the allowable health-device UUIDs, is the system-level BLE connection allowed with any BLE chip, including, say, an ESP32?
I have built test BLE firmware that advertises a health-device UUID, and watchOS sees it as a health device.
Is this fine then? No need for an MFi application, and no need to worry about which BLE chip is used?
thanks
HomePod mini running 18.6 build 22M5054/b will not update to HomePod OS 26.
I have tried un-enrollment, reset, removal, etc. - no dice. Is anyone else seeing this? Are there any known workarounds?
The iPad is running iPadOS 26 Release 2 (23A5276f).
MacBook Pro M4 - will not accept any power adapter after the beta update.
iPhone 16 Pro - same exact problem.
The devices are dead.
I have tried multiple chargers - the Watch and iPad appear to be taking a charge for now.
I am writing to report an issue I’m facing after updating my iPhone 11 Pro Max to iOS 26.
I have been using the Hollyland Lark M2 external microphone via the Lightning port, and it was working perfectly before the update. However, after upgrading to iOS 26, the iPhone no longer detects it correctly. The device now recognizes the mic as a pair of wired earphones, and it fails to capture any audio input.
The microphone itself works flawlessly on other devices, so this appears to be an iOS-specific issue.
Could you please confirm:
• Whether this is a known issue in iOS 26?
• If there are any settings or steps I can take to resolve this?
• Whether a fix is planned in an upcoming iOS patch?
I would appreciate any guidance or solution you can provide.
Thank you for your support.
Topic: App & System Services, SubTopic: Hardware
Hello,
The Apple Pencil Pro brought with it the UICanvasFeedbackGenerator API, which lets us trigger haptic feedback on discrete events initiated by the pencil. That works fine.
My question then: is it possible / are we "allowed" to trigger haptic feedback on events that weren't initiated by the pencil?
For example, say the user is dragging a slider with a left-hand finger while holding the pencil in their right hand. Would it be possible to make the pencil vibrate to indicate that the dragged slider knob reached a certain point?
Or is the rule that vibration is only possible/allowed when the pencil itself generated a touch?
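For concreteness, a minimal sketch of the kind of call in question, using UICanvasFeedbackGenerator (iOS 17.5+); the slider handler and threshold are purely illustrative:
import UIKit

class CanvasViewController: UIViewController {
    // Generator attached to the view the Pencil interacts with.
    private lazy var pencilFeedback = UICanvasFeedbackGenerator(view: view)

    @objc func sliderChanged(_ slider: UISlider) {
        // This touch comes from a finger on the slider, not from the Pencil.
        // Question: can this legitimately produce haptics on the Pencil Pro?
        if slider.value >= 0.5 {
            pencilFeedback.alignmentOccurred(at: slider.center)
        }
    }
}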
Thanks!
We have recently encountered an app crash, as shown in the picture.
We call this function as follows:
let session = AVAudioSession.sharedInstance()
guard session.currentRoute.outputs.isEmpty == false else { return false }
TestFlight caught this issue; the iOS device information is attached:
Do you have any suggestions for avoiding this crash?
Topic: App & System Services, SubTopic: Hardware
This is a regression since iOS 13. Is there no-one at Apple interested in fixing this?
FB9856371
I have a small furnace that heats a sample of my product, a cylinder measuring 20 mm x 100 mm. As it is heated, there is thermal expansion, and at the moment I read (with my eyes) a micrometer dial gauge and type the value, in hundredths of a millimeter, into my app. My source code uses Quartz to draw a PDF graph that represents this thermal expansion.
I would like to enhance my setup by having my app collect the linear displacement from my dilatometer furnace directly. One possibility I have been trying to implement is to have my app collect DC millivolts from an instrument, such as a voltmeter reading the output of an LVDT (Linear Variable Differential Transformer). However, the model I have requires the MODBUS-RTU protocol, which is not well supported by macOS developers, and it needs a USB-serial converter, which complicates things and makes the setup less reliable.
At the moment I have to sit next to my furnace for 3 hours, typing the temperature and linear displacement into my app. I would like to read some comments and suggestions from you or an Apple engineer on how this could be achieved.
Here is a video of my dilatometer furnace:
https://www.correiofacil.com/video/IMG_8888.MOV
Here is a PDF I generated using my APP:
https://www.correiofacil.com/PDF/117.pdf
Here is a video showing how I type data manually into my app:
https://www.correiofacil.com/video/recording2.mov
Here is a video that shows my app drawing a graph:
https://www.correiofacil.com/video/recording1.mov
These two images show my attempts to use the app ModBusRtuMaster to collect data from the voltmeter, which at the moment fails:
https://www.correiofacil.com/PDF/8f9be971dee32a58c36b307dbab3dc2d.png
https://www.correiofacil.com/PDF/4ac20c2ea762fd42ed340a7a2b567900.png
You will notice that although the instrument is displaying 0.000 volts, the data collected changes with each message I send to the instrument, which proves to me that the Modbus RTU setup is not working.
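For illustration, here is a minimal Swift sketch of the Modbus RTU "Read Holding Registers" request involved; slave ID 1, register 0x0000, and the single-register count are placeholders, not values taken from my meter's manual:
import Foundation

// CRC-16 as used by Modbus RTU (polynomial 0xA001, initial value 0xFFFF).
func modbusCRC16(_ bytes: [UInt8]) -> UInt16 {
    var crc: UInt16 = 0xFFFF
    for byte in bytes {
        crc ^= UInt16(byte)
        for _ in 0..<8 {
            crc = (crc & 1) != 0 ? (crc >> 1) ^ 0xA001 : crc >> 1
        }
    }
    return crc
}

// "Read Holding Registers" (function 0x03) request frame.
func readHoldingRegistersFrame(slave: UInt8, start: UInt16, count: UInt16) -> Data {
    var frame: [UInt8] = [slave, 0x03,
                          UInt8(start >> 8), UInt8(start & 0xFF),
                          UInt8(count >> 8), UInt8(count & 0xFF)]
    let crc = modbusCRC16(frame)
    frame.append(UInt8(crc & 0xFF))   // Modbus RTU sends the CRC low byte first
    frame.append(UInt8(crc >> 8))
    return Data(frame)
}

let request = readHoldingRegistersFrame(slave: 1, start: 0x0000, count: 1)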
Here is a link to the manual of the DC voltmeter I am using:
https://www.correiofacil.com/PDF/manual1.pdf
Here is an image of the LVDT I am trying to use:
https://www.correiofacil.com/PDF/e0349986512f9273b26b209baa3ba5c7.png
Here is the manual for the LVDT:
https://www.correiofacil.com/PDF/GGD19.pdf
I suspect that there could be a completely different approach to collecting the data and getting it into my app - maybe Bluetooth?
Maybe using a Mitutoyo gauge such as this one:
https://www.mitutoyo.com/Images/CatalogUS-1003/resources/_pdfs_/US-1003_Catalog_251.pdf
And a cable with SPC support like this:
https://www.gw-style.com/product-p-942977.html
I am very grateful for your comments
Hello,
We are testing Wallet passes with iBeacons in iOS 26 Beta.
In earlier iOS releases, when a device was in proximity to a registered beacon, the corresponding pass would surface automatically.
In iOS 26 Beta, this behavior no longer occurs, even if the pass is already present in Wallet. I have not found documentation of this change in the iOS 26 release notes.
Could you please confirm whether this is expected in iOS 26, or if it may be a Beta-specific issue? Any pointers to updated documentation would be appreciated.
Thank you.
When our Bluetooth device is scanned and a connection is initiated through the app on an iPhone 17, the over-the-air (sniffer) log shows that the iPhone sends an LL_LENGTH_REQ to execute the Data Length Update Procedure. However, our peripheral does not support the Bluetooth LE Data Length Extension, so it responds with an LL_UNKNOWN_RSP PDU with the UnknownType field set to LL_LENGTH_REQ.
After receiving the LL_UNKNOWN_RSP, the iPhone 17 does not proceed with the subsequent Bluetooth LE service discovery process. The connection is maintained until the peripheral actively disconnects.
Once the peripheral disconnects and continues advertising, the iPhone 17 repeatedly tries to reconnect to it and runs through the same procedure, even after the app has been terminated.
According to the Bluetooth 4.2 core specification ([Vol. 6] Part B, Section 5.1.9), which can be found here: https://www.bluetooth.com/specifications/specs/core-specification-amended-4-2/, the iPhone should accept the LL_UNKNOWN_RSP and terminate the Data Length Update Procedure after receiving it, proceeding with the subsequent operations using the default minimum parameters.
Phenomenon: when the app calls - (void)connectPeripheral:(CBPeripheral *)peripheral options:(nullable NSDictionary<NSString *, id> *)options, no connection result callback is received. After approximately 10 seconds, the failure callback fires with the message: central did fail to connect to peripheral: <TY : 45E4A697-31AE-9B5A-1C38-53D7CA624D8C, Error Domain=CoreBLEErrorDomain Code=400 "(null)">.
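For reference, a minimal Swift sketch of the equivalent call and the delegate callbacks involved (the connectedPeripheral property and print statements are illustrative):
import CoreBluetooth

var connectedPeripheral: CBPeripheral?

func connect(_ peripheral: CBPeripheral, using central: CBCentralManager) {
    connectedPeripheral = peripheral          // the peripheral must be strongly retained
    central.connect(peripheral, options: nil)
}

// CBCentralManagerDelegate
func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
    print("connected: \(peripheral)")
}

func centralManager(_ central: CBCentralManager,
                    didFailToConnect peripheral: CBPeripheral,
                    error: Error?) {
    // This is the callback that fires after roughly 10 seconds with CoreBLEErrorDomain Code=400.
    print("failed to connect: \(String(describing: error))")
}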
We are preparing to implement document signing using USB tokens on iOS and macOS. Several other applications already support this feature.
From my testing and development efforts, I've been unable to reliably access or utilize certificates stored on a smartcard through the iOS APIs. Here are the specifics:
Environment
iOS: 15 and later
Xcode: Versions 18 and 26
Smartcard/Token: ePass 2003 (eMudhra), Feitien token (Capricorn)
Observed issue:
The token is recognized at the system level, with certificates visible in Keychain Access.
However, programmatic access to the private keys on the smartcard from within the app is not working.
Signing attempts result in Error 6985 and CACC errors.
Approaches Tried:
Updated provisioning profiles with the following entitlements:
com.apple.developer.smartcard
com.apple.security.device.usb
TKSmartCard
Employed TKSmartCard and TKSmartCardSession for interaction.
The token is detected successfully.
A session can be established, but there's no straightforward method to leverage it for certificate-based signing.
Access to signing functions is unavailable; operations yield Error 6985 or CACC errors.
if let smartCard = TKSmartCard(slot: someSlot) {
    smartCard.openSession { session, error in
        if let session = session {
            // SELECT-by-AID APDU header (0x00 0xA4 0x04 0x00), no AID data attached yet
            let command: [UInt8] = [0x00, 0xA4, 0x04, 0x00]
            session.transmit(Data(command)) { response, error in
                print("Response: \(String(describing: response))")
                print("Error: \(String(describing: error))")
            }
        }
    }
}
TokenKit (macOS/iOS)
- Utilized TKTokenWatcher to identify available tokens on macOS (not available on iOS).
watcher.setInsertionHandler { tokenID in
    print("Token detected: \(tokenID)")
}
CryptoKit / Security Framework
- Attempted to retrieve SecCertificate using SecItemCopyMatching queries, which succeeded on macOS but failed on iOS.
let query: [CFString: Any] = [
kSecClass: kSecClassCertificate,
kSecReturnRef: true,
kSecMatchLimit: kSecMatchLimitAll
]
var items: CFTypeRef?
let status = SecItemCopyMatching(query as CFDictionary, &items)
print("Status: \(status)") // macOS succeeds, iOS fails
ExternalAccessory Framework (EAAccessory)
- Investigated using EAAccessory and EASession for external token communication, but it did not function as expected.
This functionality is critical for my project. Has anyone successfully implemented smartcard-based signing on iOS? Any guidance, sample code, or references to relevant Apple documentation would be greatly appreciated.
Topic: App & System Services, SubTopic: Hardware
Tags: iOS, Apple CryptoKit, USBDriverKit, CryptoTokenKit
Hello,
I am a developer planning to build an application using Apple's new SpeechTranscriber technology.
I am facing an issue where SpeechTranscriber is not available on my iPad Pro (11-inch, 2nd generation, model number: MXDC2J/A), even though I have updated it to iPadOS 26. I was under the impression that SpeechTranscriber would be available on any device running iPadOS 26. Could you please clarify if this is incorrect?
Furthermore, I am planning to purchase a new iPad with an A16 chip for the development and deployment of this application. Can you confirm if SpeechTranscriber will be fully functional on an iPad equipped with the A16 chip?
Thank you for your assistance.
On which iPhone and iPad models running iOS 26 is SpeechTranscriber.isAvailable true?
I know this will sound like a weird use case, but it is potentially relevant to a product I am working on.
Suppose I link two or more iPhones to the same iCloud account (I know this is uncommon and generally a bad idea, but it is possible, so I have to consider it). I then pair a BLE device to one of the phones using my app, which has the background Bluetooth permission and acts as the central, so it can automatically restore the connection when the BLE device comes back into range.
I then install the same app on the other iPhone(s) linked to this account and physically separate the phones so they are out of BLE range of each other.
The BLE device is then taken out of range of the first phone and into range of one of the others.
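For context, a minimal sketch of how the central is set up so the system restores the pending connection in the background (the restore identifier string and class name are illustrative):
import CoreBluetooth

class BLEManager: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        // The restore identifier lets the system relaunch the app and hand back pending connections.
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.ble.central"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) { }

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Peripherals the system was holding a pending connection to on our behalf.
        if let restored = dict[CBCentralManagerRestoredStatePeripheralsKey] as? [CBPeripheral] {
            peripheral = restored.first
        }
    }

    func connect(_ p: CBPeripheral) {
        peripheral = p
        central.connect(p, options: nil)   // the pending connect survives the device going out of range
    }
}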
First question is will the BLE device automatically connect to the other iPhone?
If yes, will the app have any way of determining that it's running on a different iPhone with the same device connected?
If yes, can the app prevent connection to the other phones in some way?
Hello fellow developers,
I’m developing a smart standing desk under my brand Beflo, which integrates technology with traditional furniture. As part of this, we’re working to make our Tenon smart desk compatible with the Apple HomeKit ecosystem.
While exploring HomeKit profiles, I noticed there isn’t a specific profile for standing desks. However, there are similar profiles, such as curtains, which also use motors for operation and could align with our product's functionality, like height adjustments.
I have a few questions:
What is the process for proposing or applying for a new product profile in HomeKit?
Is it feasible to adapt an existing profile (like curtains) in the interim while awaiting approval for a new profile?
Has anyone had experience navigating this process, and are there any best practices or resources you recommend?
I’d love to hear insights from anyone who has experience with HomeKit development or working on similar product integrations. Thank you for your time and guidance!