Hi everyone,
I am developing a .NET MAUI Mac Catalyst app (sandboxed) that communicates with a custom vendor-specific HID USB device.
Within the Catalyst app, I am using a native iOS library (built with Objective-C and IOKit) and calling into it via P/Invoke from C#.
The HID communication layer relies on IOHIDManager and IOUSBInterface APIs.
The device is correctly detected and opened using IOHIDManager APIs.
However, IOHIDDeviceRegisterInputReportCallback never triggers — I don’t receive any input reports.
To investigate, I also tried using low-level IOKit USB APIs via P/Invoke from my Catalyst app, calling into a native iOS library.
When attempting to open the USB interface using IOUSBInterfaceOpen() or IOUSBInterfaceOpenSeize(), both calls fail with kIOReturnNotPermitted (0xe00002e2), indicating an access-denied error, even though the device enumerates and opens successfully.
Interestingly, when I call IOHIDDeviceSetReport(), it returns status = 0, meaning I can successfully send feature reports to the device.
Only input reports (via the InputReportCallback) fail to arrive.
I’ve confirmed this is not a device issue — the same hardware and protocol work perfectly under Windows using the HIDSharp library, where both input and output reports function correctly.
What I’ve verified
• Disabling sandboxing doesn’t change the behavior.
• The device uses a vendor-specific usage page (not a standard HID like keyboard/mouse).
• Enumeration, open, and SetReport all succeed — only reading input reports fails.
• Tried polling with an input queue, but the Input_Misc elements failed to be added to the queue.
• Tried getting the report in a loop, but that didn’t help either.
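For reference, here is a minimal Swift sketch of the read path I am trying to get working through the Objective-C/P/Invoke layer. One thing I am double-checking is that the input-report callback only fires if the device (or manager) is scheduled on a run loop or dispatch queue that is actually running, which is easy to miss when the native code is driven from C#. The vendor/product IDs and report size below are placeholders, not my real values.

```swift
import Foundation
import IOKit.hid

// Placeholder IDs: replace with the real vendor/product IDs.
let matching: [String: Any] = [
    kIOHIDVendorIDKey: 0x1234,
    kIOHIDProductIDKey: 0x5678,
]

let manager = IOHIDManagerCreate(kCFAllocatorDefault, IOOptionBits(kIOHIDOptionsTypeNone))
IOHIDManagerSetDeviceMatching(manager, matching as CFDictionary)
IOHIDManagerOpen(manager, IOOptionBits(kIOHIDOptionsTypeNone))

guard let cfDevices = IOHIDManagerCopyDevices(manager),
      let anyDevice = (cfDevices as NSSet).anyObject() else {
    fatalError("matching device not found")
}
let device = anyDevice as! IOHIDDevice
IOHIDDeviceOpen(device, IOOptionBits(kIOHIDOptionsTypeNone))

// The report buffer must stay alive for as long as the callback is registered.
let reportSize = 64 // placeholder: max input report length from the HID descriptor
let reportBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: reportSize)

let inputCallback: IOHIDReportCallback = { _, _, _, _, reportID, report, length in
    let bytes = Array(UnsafeBufferPointer(start: report, count: length))
    print("input report \(reportID) (\(length) bytes): \(bytes)")
}
IOHIDDeviceRegisterInputReportCallback(device, reportBuffer, reportSize, inputCallback, nil)

// Without scheduling on a *running* run loop (or a dispatch queue via
// IOHIDDeviceSetDispatchQueue), the callback never fires even though
// open and SetReport succeed.
IOHIDDeviceScheduleWithRunLoop(device, CFRunLoopGetCurrent(), CFRunLoopMode.defaultMode.rawValue)
CFRunLoopRun()
```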
I want to track the activity of an iPhone user using the Core Motion framework. Please guide me through it.
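For anyone looking for a starting point, a minimal sketch using CMMotionActivityManager might look like the following (assuming the NSMotionUsageDescription key is set in Info.plist and the device supports motion activity):

```swift
import CoreMotion

let activityManager = CMMotionActivityManager()

// Live activity updates (stationary, walking, running, automotive, cycling).
if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.stationary { print("stationary") }
        if activity.walking    { print("walking") }
        if activity.running    { print("running") }
        if activity.automotive { print("driving") }
        if activity.cycling    { print("cycling") }
    }
}

// Historical activity for the last 24 hours.
let now = Date()
activityManager.queryActivityStarting(from: now.addingTimeInterval(-24 * 60 * 60),
                                       to: now, to: .main) { activities, error in
    print("samples: \(activities?.count ?? 0), error: \(String(describing: error))")
}
```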
Hi everyone — I’m developing an iOS passkey/password manager where the private key material must be stored on a physical device (NFC card / USB token). I’m hitting a hard limitation: CoreNFC is not available for use from app extensions, which prevents an appex (e.g. password/credential provider or other extension) from talking directly to an NFC card during an authentication flow. 
My questions:
1. Is there any plan to make CoreNFC (or some limited NFC-API) available to app extensions in a future iOS version? If not, could Apple clarify why (security/entitlements/architecture reasons)?
2. Are there any recommended/approved workarounds for a passkey manager extension that needs to access a physical NFC token during authentication? (For example: background tag reading that launches the containing app, or some entitlement for secure NFC card sessions.) I’ve read about background tag reading, but that seems to be about system/OS handling of tags rather than giving extensions direct NFC access. 
3. Is the only supported pattern for my use case to have the containing app perform NFC operations and then share secrets with the extension via App Groups / Keychain Sharing / custom URL flow? (I’m already evaluating App Groups / Keychain access groups for secure sharing, but I’d like official guidance.) 
Implementation details that may help responders:
• Target: iOS (latest SDK), building a Credential Provider / password manager extension (appex).
• Intended physical token: NFC smartcard / ISO7816 contactless (so CoreNFC APIs like NFCISO7816Tag would be ideal).
• Security goals: private key never leaves the physical token; extension should be able to trigger/sign during a browser/app AutoFill flow.
Possible alternatives I’m considering (open to feedback): designing the UX so that the extension opens the main app (only possible for Today widget in a supported way) which runs the NFC flow and stores/returns a short-lived assertion to the extension. Are any of these patterns sanctioned / recommended by Apple for credential providers? 
Thanks — any pointers to docs, entitlement names, or example apps/samples would be extremely helpful.
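For concreteness, the containing-app pattern from question 3 would look roughly like the sketch below: the main app runs the ISO7816 session and drops a short-lived result into a shared Keychain access group that the credential provider extension can read. The access-group name and the APDU are placeholders, and whether this pattern is sanctioned for credential providers is exactly what I'm asking.

```swift
import CoreNFC
import Security

/// Runs in the containing app (not the appex): reads the ISO7816 token and stashes a
/// short-lived assertion where the credential provider extension can read it.
/// "group.example.passkeys" and the APDU below are placeholders.
final class TokenReader: NSObject, NFCTagReaderSessionDelegate {
    private var session: NFCTagReaderSession?

    func start() {
        // Requires the NFC Tag Reading capability and the card's AID listed under
        // com.apple.developer.nfc.readersession.iso7816.select-identifiers in Info.plist.
        session = NFCTagReaderSession(pollingOption: .iso14443, delegate: self, queue: nil)
        session?.alertMessage = "Hold your security card near the top of the iPhone."
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {
        print("NFC session ended: \(error)")
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let tag = tags.first, case .iso7816(let card) = tag else {
            session.invalidate(errorMessage: "Unsupported tag.")
            return
        }
        session.connect(to: tag) { error in
            if let error = error {
                session.invalidate(errorMessage: error.localizedDescription)
                return
            }
            // Placeholder APDU; in practice this would be the token's sign/derive command.
            let apdu = NFCISO7816APDU(instructionClass: 0x00, instructionCode: 0xCA,
                                      p1Parameter: 0x00, p2Parameter: 0x00,
                                      data: Data(), expectedResponseLength: 256)
            card.sendCommand(apdu: apdu) { response, sw1, sw2, error in
                session.invalidate()
                guard error == nil, sw1 == 0x90, sw2 == 0x00 else { return }
                Self.storeAssertion(response)
            }
        }
    }

    /// Writes the short-lived assertion into a shared Keychain access group
    /// so the appex can pick it up during the AutoFill flow.
    private static func storeAssertion(_ assertion: Data) {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrAccount as String: "pending-assertion",
            kSecAttrAccessGroup as String: "group.example.passkeys", // placeholder
            kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
            kSecValueData as String: assertion,
        ]
        _ = SecItemDelete(query as CFDictionary)
        _ = SecItemAdd(query as CFDictionary, nil)
    }
}
```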
Hi everyone,
I submitted an app for review and was met with a rejection for unresolved issues.
This was what was asked in the rejection:
Provide detailed answers to the following questions:
-Does your app interact with any hardware?
Would that be referring to the camera/microphone of the device? My app uses haptics when you select an option. I didn't see anything in App Store Connect where I needed to specify the use of haptics.
Also, does this mean that when the reviewer answers me I have to resubmit as a version 1.1? I'm not sure what I would need to change. This is my first app so I'm not entirely sure on the procedure.
When our Bluetooth device is scanned and a connection is initiated through the app on the iPhone 17, the air log shows that the iPhone sends an LL_LENGTH_REQ to execute the Data Length Update Procedure. However, our peripheral does not support the Bluetooth LE Data Length Extension, so it responds with an LL_UNKNOWN_RSP PDU with the UnknownType field set to LL_LENGTH_REQ.
After receiving the LL_UNKNOWN_RSP, the iPhone 17 does not proceed with the subsequent Bluetooth LE service discovery process. The connection is maintained until the peripheral actively disconnects.
Once the peripheral disconnects and continues broadcasting Bluetooth signals, the iPhone 17 repeatedly tries to connect to the peripheral and executes the aforementioned process, even if the app has been terminated.
According to the Bluetooth 4.2 core specification ([Vol. 6] Part B, Section 5.1.9), which can be found here: https://www.bluetooth.com/specifications/specs/core-specification-amended-4-2/, the iPhone should accept the LL_UNKNOWN_RSP and terminate the Data Length Update Procedure after receiving it, proceeding with the subsequent operations using the default minimum parameters.
Phenomenon: When the app calls the - (void)connectPeripheral:(CBPeripheral *)peripheral options:(nullable NSDictionary<NSString *, id> *)options method, the success callback is never received. After approximately 10 seconds, the failure callback fires with the message: central did fail to connect to peripheral: <TY : 45E4A697-31AE-9B5A-1C38-53D7CA624D8C, Error Domain=CoreBLEErrorDomain Code=400 "(null)">.
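For illustration, a minimal central-side flow that exercises the same sequence of callbacks would look like the sketch below (no service filter; purely illustrative, not our production code):

```swift
import CoreBluetooth

final class ConnectionTest: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private var target: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        target = peripheral // keep a strong reference
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    // On the iPhone 17 this never fires for the peripheral described above...
    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices(nil)
    }

    // ...and after roughly 10 seconds this fires with CoreBLEErrorDomain Code=400.
    func centralManager(_ central: CBCentralManager, didFailToConnect peripheral: CBPeripheral,
                        error: Error?) {
        print("did fail to connect: \(String(describing: error))")
    }
}
```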
Hello,
We are testing Wallet passes with iBeacons in iOS 26 Beta.
In earlier iOS releases, when a device was in proximity to a registered beacon, the corresponding pass would surface automatically.
In iOS 26 Beta, this behavior no longer occurs, even if the pass is already present in Wallet. I have not found documentation of this change in the iOS 26 release notes.
Could you please confirm whether this is expected in iOS 26, or if it may be a Beta-specific issue? Any pointers to updated documentation would be appreciated.
Thank you.
Hi everyone,
I’m working on a custom camera implementation in iOS using native code. My goal is to capture unprocessed, realistic images directly from the camera — without any filters or post-processing applied by the system.
I’ve implemented RAW image capture using the native camera APIs (AVFoundation) and successfully received .dng files. However, even the RAW outputs don’t look like the real environment — the colors, tone, and exposure still seem processed or corrected in some way.
I’ve tried various configurations such as photoSettings.rawPhotoPixelFormatType, experimenting with AVCaptureDevice and AVCapturePhotoOutput settings, and reviewing ProRAW and standard RAW behavior, but I’m still not getting truly unprocessed results that reflect the actual sensor data.
Has anyone experienced similar results when capturing RAW images on iOS, or found a way to bypass Apple’s image signal processing (ISP) pipeline for more realistic captures?
Any insights or references from Apple’s camera framework behavior would be greatly appreciated.
Thank you!
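For reference, this is roughly how I'm requesting the Bayer RAW capture (a simplified sketch; session setup and error handling are trimmed, and the format is just the first non-ProRAW Bayer format the output reports):

```swift
import AVFoundation

final class RawCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
            fatalError("no back camera")
        }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addOutput(photoOutput)
        session.commitConfiguration()
        session.startRunning()
    }

    func captureBayerRAW() {
        // Pick a plain Bayer RAW format (not Apple ProRAW, which is demosaiced and processed).
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
            AVCapturePhotoOutput.isBayerRAWPixelFormat($0)
        }) else {
            print("no Bayer RAW format available in this configuration")
            return
        }
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, photo.isRawPhoto, let dng = photo.fileDataRepresentation() else { return }
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.dng")
        try? dng.write(to: url)
        print("wrote \(dng.count) bytes of DNG to \(url.path)")
    }
}
```

As I understand it, a Bayer DNG still carries the camera's calibration and as-shot neutral tags, so most viewers will render it with some correction applied even though the mosaiced sensor data is in the file.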
Prerequisite: After the MDM app issues the command, the camera on the phone is no longer visible (unusable).
After upgrading to iOS 26.1, the isSourceTypeAvailable: UIImagePickerControllerSourceTypeCamera method keeps returning true even when the camera is unavailable.
On iOS 26.0.1 the same method behaves normally, returning false when the camera is unavailable and true when it is available.
Please fix this method, or, if isSourceTypeAvailable: UIImagePickerControllerSourceTypeCamera cannot determine whether the camera is available, please provide an alternative way to make this determination.
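In the meantime, the cross-check I'm experimenting with is sketched below; whether the AVFoundation lookup actually reflects the MDM camera restriction on iOS 26.1 is part of what I'd like confirmed:

```swift
import UIKit
import AVFoundation

/// Sketch of a combined availability check. Whether the AVFoundation lookup reflects
/// an MDM camera restriction on iOS 26.1 is exactly what needs confirming.
func cameraLooksUsable() -> Bool {
    let pickerSaysAvailable = UIImagePickerController.isSourceTypeAvailable(.camera)

    // Ask AVFoundation whether any video capture device is actually visible.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera],
        mediaType: .video,
        position: .unspecified)
    let hasDevice = !discovery.devices.isEmpty

    return pickerSaysAvailable && hasDevice
}
```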
Dove in and upgraded two Macs today to beta 1. Unfortunately, it appears L2TP VPN is broken or something changed in the way it works. I can no longer get a connection to any VPN concentrator I used previously. I tested with the Cisco AnyConnect SSL VPN client and can connect to the same concentrators (as they're configured to accept L2TP or SSL clients).
I also tested from my phone running iOS 16 beta and it still works for the L2TP connections.
On the Mac where L2TP VPN is not working, ppp.log shows this:
Fri Jun 10 19:18:52 2022 : L2TP connecting to server 'IP removed' (IP removed)...
Fri Jun 10 19:18:52 2022 : IPSec connection started
Fri Jun 10 19:18:52 2022 : IPSec phase 1 client started
Fri Jun 10 19:19:02 2022 : IPSec connection failed
Connecting successfully from a Mac on 12.4, the log shows:
Fri Jun 10 19:12:33 2022 : L2TP connecting to server 'IP removed' (IP removed)...
Fri Jun 10 19:12:33 2022 : IPSec connection started
Fri Jun 10 19:12:33 2022 : IPSec phase 1 client started
Fri Jun 10 19:12:33 2022 : IPSec phase 1 server replied
Fri Jun 10 19:12:34 2022 : IPSec phase 2 started
Fri Jun 10 19:12:34 2022 : IPSec phase 2 established
Fri Jun 10 19:12:34 2022 : IPSec connection established
(and then a ton more lines of the entire process, ending with the client getting an IP, that I won't bother posting)
VPN wasn't high on my list of apps I was concerned about breaking with the beta. But now that it is broken and I need it for work, I've kind of screwed myself.
Anyway, if anyone knows a way to fix this please let me know.
Hello, we are developing hardware that needs to connect to an iPhone via Wi-Fi to send requests to a server. On Android, we have managed to create a programmatic local hotspot within the app to facilitate connection and improve the user experience.
On iOS, however, Personal Hotspot must be manually enabled from the system settings, and the user must manually enter the SSID and password, which significantly degrades the UX.
My questions are:
1. Is there a workaround, unofficial method, or private API to generate a local hotspot from an app on iOS, similar to what can be done on Android?
2. Is there an alternative within the MFi program or through specific frameworks to facilitate a quick and automatic connection between the hardware and the iPhone without relying on the manual Personal Hotspot?
3. Are there any best practices for improving the local Wi-Fi connection experience between an accessory and an iPhone in the absence of hotspot controls?
I would appreciate any guidance, experience, or resources that would help me better understand the feasible options in iOS for scenarios where fast and direct communication between hardware and mobile devices via Wi-Fi is required.
Translated with DeepL.com (free version)
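One direction I'm already evaluating, in case it's useful context for answers: instead of the iPhone hosting the network, the accessory could broadcast its own Wi-Fi network and the app could join it programmatically with NEHotspotConfiguration. A minimal sketch, with placeholder credentials and assuming the Hotspot Configuration entitlement (com.apple.developer.networking.HotspotConfiguration) is granted:

```swift
import NetworkExtension

/// Joins the accessory's own Wi-Fi network from the app, so the user never has to
/// type an SSID or password. Placeholder credentials.
func joinAccessoryNetwork() {
    let configuration = NEHotspotConfiguration(ssid: "MyAccessory-1234",
                                               passphrase: "placeholder-password",
                                               isWEP: false)
    configuration.joinOnce = true // drop the network when the app no longer needs it

    NEHotspotConfigurationManager.shared.apply(configuration) { error in
        if let error = error as NSError?,
           error.code == NEHotspotConfigurationError.alreadyAssociated.rawValue {
            print("already joined")
        } else if let error = error {
            print("failed to join: \(error)")
        } else {
            print("joined accessory network")
        }
    }
}
```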
Topic:
App & System Services
SubTopic:
Hardware
For which iPhone and iPad models under iOS 26 is SpeechTranscriber.isAvailable true?
I'm building a React Native call application using the following combination of libraries:
https://github.com/react-native-webrtc/react-native-callkeep
https://github.com/react-native-webrtc/react-native-webrtc
https://github.com/react-native-webrtc/react-native-voip-push-notification
When I press the speaker button on the call screen displayed by CallKit and change it to ON, the speaker button display on the call screen reverts back to OFF after a few seconds.
However, when the speaker button display reverts to OFF, the actual audio output route does not return to the earpiece - the audio continues to output from the speaker without any change.
Could you please advise on what cases might cause the speaker button display to revert, and if there are any potential solutions?
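For context, my understanding is that the audio session should be configured around CallKit's activation callbacks rather than at arbitrary times, and the speaker override applied from the user's toggle. A rough native-side sketch of what I mean (illustrative only, not our exact bridge code):

```swift
import CallKit
import AVFoundation

final class ProviderDelegate: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {}

    // CallKit activates the audio session for us; configure (don't create) it here.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        try? audioSession.setCategory(.playAndRecord,
                                      mode: .voiceChat,
                                      options: [.allowBluetooth])
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {}
}

// Called from the speaker toggle (e.g. bridged from the React Native side).
func setSpeaker(_ enabled: Bool) {
    let session = AVAudioSession.sharedInstance()
    try? session.overrideOutputAudioPort(enabled ? .speaker : .none)
}
```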
[sysdiagnose_2025.10.01_18-29-27+0800_iPhone-OS_iPhone_23A341]
I got a sysdiagnose from one of my app's users. He can't pair his Bluetooth peripheral.
In the sysdiagnose, I found this:
device AC:7A:94:85:47:F4 is already paired, with a different irk (old:F5 C9 4F 5A 4E BE D0 20 0A 1F F7 DC 3A 89 E0 3A new 4A 8A 00 4C FF D0 CE 7B 61 13 FA B3 84 F4 65 29 ). Unpair first and then restart pairing. (status=65535)
But there is no such device in his iPhone's system Bluetooth peripheral list.
I don't know how to delete the IRK info when the device can't be found in the system Bluetooth peripheral list.
PLEASE answer me. THANKS.
Someone is using all of these tools to hack my phone. I have several devices, and my daughter's tablets and Androids are also compromised. They use my home hub / HomeKit, CloudKit, Xcode, SwiftUI, and Siri search. If anyone could help I'd highly appreciate it. I have contacted the FBI and have an appointment in October, but if someone could help me debug my phone in the meantime I'd highly appreciate it. They have used my doctor's apps, and I have changed iClouds. They used Family Sharing with my daughter and Screen Time, and have used sharing across devices. They are starting with my Outlook account that Jeremy Walker had access to. Please help me; it affects my mental and physical health and quality time with my daughter. They are using Ethernet and hardware keyboards, VoiceOver, and AI. Please assist, as this is extremely exhausting.
help pls
I have a small furnace that heats a sample of my product, a cylinder measuring 20mm x 100mm. As it is heated, there is thermal expansion, and at the moment I read (using my eyes) a micrometer dial gauge and type into my app the value read in hundredths of a millimeter. My source code uses Quartz to draw a PDF graph that represents this thermal expansion.
I would like to enhance my setup by getting my app to collect the linear displacement from my dilatometer furnace. One possibility I have been trying to implement is to have my app collect DC millivolts from an instrument, like a voltmeter reading the output of an LVDT (Linear Variable Differential Transformer), but the model I have requires the MODBUS-RTU protocol, which is not well supported on macOS. It also requires a USB-serial converter, which complicates things and makes it more unreliable.
At the moment, I have to sit next to my furnace and type the temperature and linear displacement into my app for 3 hours. I would like to read some comments and suggestions from you or an Apple engineer on how this could be achieved.
Here is a video of my dilatometer furnace:
https://www.correiofacil.com/video/IMG_8888.MOV
Here is a PDF I generated using my APP:
https://www.correiofacil.com/PDF/117.pdf
Here is a video showing how I type data manually into my app:
https://www.correiofacil.com/video/recording2.mov
Here is a video that shows my app drawing a graph:
https://www.correiofacil.com/video/recording1.mov
These 2 images show my attempts to use the app ModBusRtuMaster to collect data from the voltmeter which at this moment fails:
https://www.correiofacil.com/PDF/8f9be971dee32a58c36b307dbab3dc2d.png
https://www.correiofacil.com/PDF/4ac20c2ea762fd42ed340a7a2b567900.png
You will notice that although the instrument is displaying 0.000 volts, the data collected changes with each message I send to the instrument, proving to me that the Modbus RTU setup is not working.
Here is a link to the manual of the DC voltmeter I am using:
https://www.correiofacil.com/PDF/manual1.pdf
Here is an image of the LVDT I am trying to use:
https://www.correiofacil.com/PDF/e0349986512f9273b26b209baa3ba5c7.png
Here is the manual for the LVDT:
https://www.correiofacil.com/PDF/GGD19.pdf
I suspect that there could be a completely different approach to collecting the data and inputting into my app, maybe Bluetooth?
Maybe using a Mitutoyo gauge such as this one?
https://www.mitutoyo.com/Images/CatalogUS-1003/resources/_pdfs_/US-1003_Catalog_251.pdf
And a cable with SPC support like this:
https://www.gw-style.com/product-p-942977.html
I am very grateful for your comments
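To make the Modbus idea concrete, below is the kind of minimal Swift sketch I have in mind for reading one holding register over the USB-serial adapter. The device path, slave address, register number, and baud rate are placeholders that would have to match the voltmeter's manual; the RTU framing and CRC-16 themselves are simple enough to build by hand.

```swift
import Foundation
import Darwin

// CRC-16 (Modbus) over the frame bytes.
func modbusCRC(_ bytes: [UInt8]) -> UInt16 {
    var crc: UInt16 = 0xFFFF
    for byte in bytes {
        crc ^= UInt16(byte)
        for _ in 0..<8 {
            if crc & 1 != 0 { crc = (crc >> 1) ^ 0xA001 } else { crc >>= 1 }
        }
    }
    return crc
}

// Read Holding Registers (function 0x03): one register at the placeholder address 0x0000
// from the placeholder slave ID 1.
func buildReadRequest(slave: UInt8 = 1, register: UInt16 = 0x0000, count: UInt16 = 1) -> [UInt8] {
    var frame: [UInt8] = [slave, 0x03,
                          UInt8(register >> 8), UInt8(register & 0xFF),
                          UInt8(count >> 8), UInt8(count & 0xFF)]
    let crc = modbusCRC(frame)
    frame.append(UInt8(crc & 0xFF)) // CRC low byte first
    frame.append(UInt8(crc >> 8))
    return frame
}

// Placeholder device path for the USB-serial adapter.
let fd = open("/dev/cu.usbserial-1410", O_RDWR | O_NOCTTY)
guard fd >= 0 else { fatalError("could not open serial port") }

// 9600 8N1; adjust to whatever the voltmeter's manual specifies.
var tty = termios()
tcgetattr(fd, &tty)
cfmakeraw(&tty)
cfsetspeed(&tty, speed_t(B9600))
tty.c_cflag |= tcflag_t(CS8 | CLOCAL | CREAD)
tcsetattr(fd, TCSANOW, &tty)

let request = buildReadRequest()
write(fd, request, request.count)
usleep(100_000) // crude: give the instrument time to answer

var response = [UInt8](repeating: 0, count: 256)
let n = read(fd, &response, response.count)
print("received \(n) bytes: \(response.prefix(max(n, 0)).map { String(format: "%02X", $0) })")
close(fd)
```

The same frame could equally be sent through a serial-port library; the point is just that the RTU request and CRC are small enough to generate directly from the app.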
This is a regression since iOS 13. Is there no one at Apple interested in fixing this?
FB9856371
Hello,
I am a developer planning to build an application using Apple's new SpeechTranscriber technology.
I am facing an issue where SpeechTranscriber is not available on my iPad Pro (11-inch, 2nd generation, model number: MXDC2J/A), even though I have updated it to iPadOS 26. I was under the impression that SpeechTranscriber would be available on any device running iPadOS 26. Could you please clarify if this is incorrect?
Furthermore, I am planning to purchase a new iPad with an A16 chip for the development and deployment of this application. Can you confirm if SpeechTranscriber will be fully functional on an iPad equipped with the A16 chip?
Thank you for your assistance.
Hello.
Is there a solution to the issue where Core Bluetooth does not run in the background on the iPhone17?
https://developer.apple.com/library/archive/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothBackgroundProcessingForIOSApps/PerformingTasksWhileYourAppIsInTheBackground.html
The bluetooth-central Background Execution Mode
When an app that implements the central role includes the UIBackgroundModes key with the bluetooth-central value in its Info.plist file, the Core Bluetooth framework allows your app to run in the background to perform certain Bluetooth-related tasks. While your app is in the background you can still discover and connect to peripherals, and explore and interact with peripheral data. In addition, the system wakes up your app when any of the CBCentralManagerDelegate or CBPeripheralDelegate delegate methods are invoked, allowing your app to handle important central role events, such as when a connection is established or torn down, when a peripheral sends updated characteristic values, and when a central manager’s state changes.
Although you can perform many Bluetooth-related tasks while your app is in the background, keep in mind that scanning for peripherals while your app is in the background operates differently than when your app is in the foreground. In particular, when your app is scanning for devices while in the background:
• The CBCentralManagerScanOptionAllowDuplicatesKey scan option key is ignored, and multiple discoveries of an advertising peripheral are coalesced into a single discovery event.
• If all apps that are scanning for peripherals are in the background, the interval at which your central device scans for advertising packets increases. As a result, it may take longer to discover an advertising peripheral.
These changes help minimize radio usage and improve the battery life on your iOS device.
Recently, I've noticed that background Bluetooth scanning stops when I move the app to the background on an iPhone 17 (which has Bluetooth 6). I'm looking for a solution. Background Bluetooth scanning does not stop on devices running versions older than iOS 26, nor on models earlier than the iPhone 17 that were updated to iOS 26.
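For reference, this is roughly the pattern I'm using (the service UUID is a placeholder); as I understand it, background scanning needs an explicit service UUID filter plus the bluetooth-central background mode, both of which I have configured:

```swift
import CoreBluetooth

final class BackgroundScanner: NSObject, CBCentralManagerDelegate {
    // Placeholder service UUID; background scanning requires an explicit service filter.
    private let serviceUUID = CBUUID(string: "180D")
    private var central: CBCentralManager!

    override init() {
        super.init()
        // Restoration identifier so the system can relaunch the app for BLE events.
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "bg-scanner"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Allow-duplicates is ignored in the background; discoveries are coalesced.
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("discovered \(peripheral.identifier) RSSI \(RSSI)")
    }

    // Required when using CBCentralManagerOptionRestoreIdentifierKey.
    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        print("restored with: \(dict.keys)")
    }
}
```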