Hello,
The application I'm working on must report new hardware connections. To retrieve information about connected displays and monitor new connections, I'm using the Core Graphics framework (see the recommendation in https://developer.apple.com/forums/thread/779945).
The monitoring logic relies on a callback function that is invoked when the local display configuration changes (kCGDisplayAddFlag / kCGDisplayRemoveFlag).
#import <Cocoa/Cocoa.h>
#import <IOKit/graphics/IOGraphicsTypes.h> // for kDisplayVendorIDUnknown

// Called whenever the local display configuration changes.
static void displayChanged(CGDirectDisplayID displayID, CGDisplayChangeSummaryFlags flags, void *userInfo)
{
    uint32_t vendor = CGDisplayVendorNumber(displayID);
    if (flags & kCGDisplayAddFlag)
    {
        if (vendor == kDisplayVendorIDUnknown)
        {
            NSLog(@"I/O Kit cannot identify the monitor. kDisplayVendorIDUnknown. displayId = %u", displayID);
            return;
        }
        NSLog(@"%u connected. vendor(%u)", displayID, vendor);
    }
    if (flags & kCGDisplayRemoveFlag)
    {
        NSLog(@"%u disconnected", displayID);
    }
}

int main(int argc, const char * argv[])
{
    @autoreleasepool
    {
        CGDisplayRegisterReconfigurationCallback(displayChanged, NULL);
        NSApplicationLoad();
        CFRunLoopRun();
    }
    return 0;
}
The test environment is a Mac mini with an external display connected via HDMI. Everything works correctly until the system enters sleep mode. Upon wakeup, the app reports two displays: the first with vendor ID kDisplayVendorIDUnknown and the second with the expected vendor ID.
Why does Core Graphics report two connections during wakeup? Is there any way to avoid this?
Thank you in advance.
I am creating a barcode reader using the AVFoundation framework for iOS and iPadOS. The read result goes into payloadStringValue, but I want to check the control characters contained in the symbol, so I am using the raw data from description, a property VNBarcodeObservation inherits via NSObjectProtocol. However, I noticed that if the raw data is longer than 26 bytes, part of it is omitted from description. So my question is: is it possible to configure things so that all of the raw data in description is written out without anything being omitted? If so, could you please tell me how to set this up? Also, if you know of any other way to extract the raw barcode data, I would appreciate it if you could let me know; one alternative I have been experimenting with is sketched below.
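Here is that sketch. It assumes the symbology is QR (other symbologies have their own descriptor types), and observation is a VNBarcodeObservation obtained from a VNDetectBarcodesRequest:
import Vision
import CoreImage

// Sketch: read the raw payload bytes from the barcode descriptor instead of
// parsing the debug `description`. Note that the error-corrected payload is the
// raw codeword stream (it includes mode/length indicators), not the decoded string.
func rawPayload(from observation: VNBarcodeObservation) -> Data? {
    guard let descriptor = observation.barcodeDescriptor as? CIQRCodeDescriptor else {
        return nil
    }
    return descriptor.errorCorrectedPayload
}

// Hex-dump every byte so nothing is truncated.
func dumpPayload(_ data: Data) {
    print(data.map { String(format: "%02X", $0) }.joined(separator: " "))
}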
Thank you.
Dear Apple Developer / MFi Program Support,
I am exploring technical possibilities for screen sharing and remote interaction between iOS devices and external hardware (e.g., embedded systems, in-vehicle systems) for a prototype we are currently developing.
I have reviewed the public iOS developer documentation, but I would appreciate your guidance and clarification on the following advanced use cases, particularly in the context of MFi or enterprise-level integrations:
Full-Screen Sharing of iOS Device
Is it possible to mirror or stream the entire iOS screen, even when the app is running in the background or not in the foreground?
Does ReplayKit or any other framework under the MFi or enterprise entitlements allow full-device screen capture outside the app context?
Remote Touch Injection and Control
Is there any officially supported mechanism, under MFi or otherwise, that allows external systems to remotely control an iOS device’s touch interface (e.g., simulate gestures, taps, swipes)?
Are any of the following permitted under special entitlements:
Access to IOHIDEventSystem or similar private APIs for input injection?
Communication over USB or network to relay control commands that simulate direct user interaction?
Hardware-Level Integration and Entitlements
Does the MFi Program allow:
Use of private frameworks or entitlements to build low-level integrations for iOS device control or mirroring?
Communication over USB/Lightning/USB-C to enable bi-directional interaction (streaming out, commands in)?
What are the specific APIs or entitlements available under MFi that enable these use cases?
Can you provide references to documentation, SDKs, or prerequisites for companies seeking such capabilities?
Eligibility and Certification Process
What are the criteria to be approved for the MFi program with access to such advanced capabilities?
Can PoC or early-stage research prototypes be eligible, or is MFi access restricted to commercial production intent?
How long does it typically take to gain access to these entitlements (assuming NDA and certification requirements are met)?
Alternative Pathways
If MFi access is not feasible in the short term, is there any Apple-supported alternative path (e.g., test device provisioning, enterprise signing, custom profiles) that permits more advanced capabilities for prototyping purposes?
We are not looking to publish this as a general App Store app at this stage, but rather to demonstrate feasibility as part of an innovation prototype that may lead to further OEM-level engagement in the future.
Thank you for your support and guidance.
Best regards,
Hello Apple team and community,
I’m reporting a critical issue affecting iPhone 13 (128 GB) on iOS 26 Public Beta 3.
Problem Summary:
• Device stays stuck at 1% battery, even while charging
• Battery Health shows 0% in Settings
• Phone reboots every 5 minutes while unplugged
• Only works when connected to power
• Cannot update, charge properly, or maintain uptime
Additional Context:
• The issue appeared immediately after installing iOS 26 beta 3
• Affected devices often have a replaced battery (even official or high-quality replacements)
• Seems to be a software validation bug related to battery firmware
• Reported by many users across Reddit, Apple Forums, and Twitter — but not listed in Known Issues
What Has Been Tried:
• Recovery Mode / Safe charging / Clean install (same version) – no effect
• Third-party repair tools (ReiBoot, 3uTools) — partial workaround
• Jailbreak with Nugget or iCleaner to disable crash daemons – temporarily helps
• Apple Support suggested full device replacement (!)
⸻
Request:
Please investigate and acknowledge this issue. This bug renders devices unusable for users with legitimate battery replacements — we need a fix in an upcoming beta.
Hello
I've noticed that this product, heavily promoted on the ASC forums for many years, is no longer available from the Apple App Store.
Can anyone tell me the reason why the product is no longer supported?
Friends have asked me if it is 'safe' to use.
Is it?
Note to moderator: If I'm asking in the wrong places, please redirect my question. Thank you.
Hello,
I would like to discuss the expiration behavior of NFCPresentmentIntentAssertion (tested on iOS 18.5).
In the documentation we have:
The intent assertion expires if any of the following occur:
The intent assertion object deinitializes
Your app goes into the background
15 seconds elapse
But in fact, only the first rule is applied.
The expiration seems to be random after using CardSession, which makes it difficult to give the user a good experience.
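For context, here is roughly how I acquire and hold the assertion (a minimal sketch; the class and property names are my own, only the CoreNFC calls matter):
import CoreNFC

// Minimal sketch of acquiring and holding the assertion (iOS 17.4+).
final class PresentmentController {
    // Strong reference so rule 1 (deinitialization) cannot fire early.
    private var assertion: NFCPresentmentIntentAssertion?

    func begin() async throws {
        assertion = try await NFCPresentmentIntentAssertion.acquire()
        // ... create and use the CardSession here ...
    }

    func end() {
        assertion = nil // release explicitly once presentment is done
    }
}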
Has someone faced the same kind of issue, or can someone give an explanation?
Regards,
François
On the iPad Pro 12.9-inch (3rd generation) cellular model, when you touch the screen with four fingers and then move your fingers, the touch is no longer detected. The same operation with one to three fingers works normally.
This phenomenon does not occur when accessibility is turned on.
Is this a beta-specific issue that will be fixed in the official release?
I am working on an app that requires the usage of CoreBluetooth – using both its CBPeripheralManager and CBCentralManager classes. Our app works with other phones and hardware peripherals to exchange data – so we wanted to explore adding AccessorySetupKit to streamline the hardware connection process.
AccessorySetupKit has been integrated (while CBPeripheralManager is turned off) and works great, but even with ASK added to our app's plist file and not in use, CBPeripheralManager fails with the error: "Cannot create a CBPeripheralManager while using AccessorySetupKit framework."
Is there any workaround or suggested path forward here? We'd still really like to use ASK while keeping our existing functionality, but are not seeing a clear way to do so.
I am developing an app that communicates with an external BLE device over GATT. The device has a secure-read characteristic exposing some of its data, and it requires pairing/bonding in order to communicate with it.
I was able to pair and connect with the device using AccessorySetupKit and the .bluetoothPairingLE option:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
In this case, when setting up the accessory, I was prompted to compare passkeys, and after confirming I can read the characteristic, etc.
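For completeness, this is roughly how I activate the session and show the picker (a sketch; the event handling is trimmed and the print statements are just illustrative):
import AccessorySetupKit
import Foundation

// pickerItem is the ASPickerDisplayItem created above.
let session = ASAccessorySession()
session.activate(on: .main) { event in
    switch event.eventType {
    case .accessoryAdded:
        print("Accessory added: \(String(describing: event.accessory))")
    case .pickerDidDismiss:
        print("Picker dismissed")
    default:
        break
    }
}

session.showPicker(for: [pickerItem]) { error in
    if let error {
        print("Picker failed: \(error)")
    }
}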
Then I tried adding .confirmAuthorization to the picker item, and the problems started:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
pickerItem.setupOptions = [.confirmAuthorization]
When setting up, I can see a passkey to confirm, but after confirming, the setup UI gets stuck in a loading state. Under the hood, in the logs, I can see that my app has connected to the peripheral and was able to read the characteristic.
I am unsure why the UI is stuck in the loading state in this case. What is the difference when using the .confirmAuthorization option, and what should be the proper flow of events to set up the accessory and then access the protected characteristic?
I am writing to seek assistance regarding an iBeacon implementation issue we are experiencing in our iOS application.
Issue Description: We have successfully implemented iBeacon functionality in our app, but we are encountering a specific problem with background region monitoring:
When app is in foreground: Our app successfully detects iBeacon signals and triggers notifications when entering beacon regions.
When app is terminated: Our app fails to respond when entering our own iBeacon regions. However, we have observed an interesting behavior:
Third-party iBeacon apps can still detect and trigger notifications for their beacon regions
After a third-party app triggers, our app suddenly starts receiving notifications for our own iBeacon hardware
Technical Details:
iOS Version: 18.0
Xcode Version: 16.4
Device Models Tested: iPhone 15 Pro
Questions:
What could be causing our app to fail detecting iBeacon regions when terminated, while third-party apps work correctly?
Why does our iBeacon detection start working only after another iBeacon app triggers?
Are there specific implementation requirements or best practices for reliable background iBeacon monitoring?
Could this be related to iOS background app refresh policies or system resource management?
Current Implementation: We have implemented the standard Core Location framework with the following (a condensed sketch appears after this list):
CLLocationManager with appropriate authorization
Region monitoring setup with CLBeaconRegion
Background modes enabled for location services
Proper delegate methods implemented
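Condensed sketch of the setup listed above (the UUID and region identifier are placeholders for our own values):
import CoreLocation

// Condensed version of our monitoring setup.
final class BeaconMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization() // "Always" authorization, needed for relaunch on region entry

        let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!
        let constraint = CLBeaconIdentityConstraint(uuid: uuid)
        let region = CLBeaconRegion(beaconIdentityConstraint: constraint, identifier: "our-beacon-region")
        region.notifyOnEntry = true
        region.notifyOnExit = true
        region.notifyEntryStateOnDisplay = true

        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // We post the local notification from here.
    }
}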
We would greatly appreciate your guidance on resolving this issue, as it significantly impacts our app's user experience.
Thank you for your time and support.
My MacBook speakers have started crackling on every sound since macOS 26 Beta 1, and the problem is still the same on Beta 9.
It happens especially when Simulator is open.
According to the Accessory Design Guidelines, iPadOS supports HID trackpads.
Is there a design example of such a supported device?
I have tried to adapt the device's software to the guidelines, without any result on iPad.
I have some logic which requires NFC support on the device. This is what I'm using to make sure that it's available:
isNFCMissing = !NFCNDEFReaderSession.readingAvailable && !NFCTagReaderSession.readingAvailable && !NFCVASReaderSession.readingAvailable
Is it possible for isNFCMissing to be true even if the device has an NFC chip?
The minimum iOS version for the application is 16, which is only supported on devices that have an NFC chip to begin with.
I am debugging ImageCaptureCore to communicate with external cameras.
When I called the PTP function below to send a command and its data, the response timed out after more than 5 seconds. After waiting for a period of time, I obtained the response; however, in the response callback both responseData.length and ptpResponseData.length were zero.
- (void)requestSendPTPCommand:(NSData *)ptpCommand
                      outData:(NSData *)ptpData
                   completion:(void (^)(NSData *responseData, NSData *ptpResponseData, NSError *error))completion;
The data is below:
Wrote 1 = 0x1 bytes PTP:send data: (hexdump of 1 bytes)
[ ] I/PTP (14564): 0000 01 - .
Hello,
The Apple Pencil Pro brought with it the UICanvasFeedbackGenerator API, which lets us trigger haptic feedback on discrete events initiated by the pencil. That works fine.
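For context, this is roughly how I trigger the feedback today (a minimal sketch; the function names and the view I pass in are placeholders from my own code):
import UIKit

// Sketch: pencil haptics on a discrete, pencil-initiated event (iOS 17.5+).
func makePencilFeedback(for canvasView: UIView) -> UICanvasFeedbackGenerator {
    UICanvasFeedbackGenerator(view: canvasView)
}

// Called when the pencil-driven drawing snaps to a guide.
func didSnapToGuide(_ feedback: UICanvasFeedbackGenerator, at location: CGPoint) {
    // Plays haptic feedback on the paired Apple Pencil Pro.
    feedback.alignmentOccurred(at: location)
}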
My question then: is it possible / are we "allowed" to trigger haptic feedback on events that weren't initiated by the pencil?
For example, say the user is using a left-hand finger to drag a slider while holding the pencil in their right hand. Would it be possible to make the pencil vibrate to indicate that the dragged slider knob has reached a certain point?
Or is the rule that vibration is only possible/allowed when the pencil itself generated a touch?
Thanks!
We are preparing to implement document signing using USB tokens on iOS and macOS. Several other applications already support this feature.
From my testing and development efforts, I've been unable to reliably access or utilize certificates stored on a smartcard through the iOS APIs. Here are the specifics:
Environment
iOS: 15 and later
Xcode: Versions 18 and 26
Smartcard/Token: ePass 2003 (eMudhra), Feitien token (Capricorn)
Observed Issue:
The token is recognized at the system level, with certificates visible in Keychain Access.
However, programmatic access to the private keys on the smartcard from within the app is not working.
Signing attempts result in Error 6985 and CACC errors.
Approaches Tried:
Updated provisioning profiles with the following entitlements:
com.apple.developer.smartcard
com.apple.security.device.usb
TKSmartCard
Employed TKSmartCard and TKSmartCardSession for interaction.
The token is detected successfully.
A session can be established, but there's no straightforward method to leverage it for certificate-based signing.
Access to signing functions is unavailable; operations yield Error 6985 or CACC errors.
if let smartCard = TKSmartCard(slot: someSlot) {
    smartCard.openSession { session, error in
        if let session = session {
            let command: [UInt8] = [0x00, 0xA4, 0x04, 0x00]
            session.transmit(Data(command)) { response, error in
                print("Response: \(String(describing: response))")
                print("Error: \(String(describing: error))")
            }
        }
    }
}
TokenKit (macOS/iOS)
- Utilized TKTokenWatcher to identify available tokens on macOS (not available on iOS).
watcher.setInsertionHandler { tokenID in
    print("Token detected: \(tokenID)")
}
CryptoKit / Security Framework
- Attempted to retrieve SecCertificate using SecItemCopyMatching queries, which succeeded on macOS but failed on iOS.
let query: [CFString: Any] = [
    kSecClass: kSecClassCertificate,
    kSecReturnRef: true,
    kSecMatchLimit: kSecMatchLimitAll
]
var items: CFTypeRef?
let status = SecItemCopyMatching(query as CFDictionary, &items)
print("Status: \(status)") // macOS succeeds, iOS fails
ExternalAccessory Framework (EAAccessory)
* Investigated using EAAccessory and EASession for external token communication, but it did not function as expected.
This functionality is critical for my project. Has anyone successfully implemented smartcard-based signing on iOS? Any guidance, sample code, or references to relevant Apple documentation would be greatly appreciated.
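For what it's worth, the next variation I plan to try is scoping the keychain query to the token access group, which is my understanding of how token-backed items are exposed on iOS; this is a sketch I have not yet verified against our tokens:
import Foundation
import Security

// Sketch: look for keys exposed through the token access group.
let tokenQuery: [CFString: Any] = [
    kSecClass: kSecClassKey,
    kSecAttrAccessGroup: kSecAttrAccessGroupToken,
    kSecReturnRef: true,
    kSecReturnAttributes: true,
    kSecMatchLimit: kSecMatchLimitAll
]

var result: CFTypeRef?
let tokenStatus = SecItemCopyMatching(tokenQuery as CFDictionary, &result)
print("Token key query status: \(tokenStatus)")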
Hi, as other threads have already discussed, I'd like to record audio from a keyboard extension.
The keyboard has been granted both full access and microphone access. Nonetheless, whenever I attempt to start a recording from my keyboard, it fails to start with the following error:
Recording failed to start: Error Domain=com.apple.coreaudio.avfaudio Code=561145187 "(null)" UserInfo={failed call=err = PerformCommand(*ioNode, kAUStartIO, NULL, 0)}
This is the code I am using:
import Foundation
import AVFoundation

protocol AudioRecordingServiceDelegate: AnyObject {
    func audioRecordingDidStart()
    func audioRecordingDidStop(withAudioData: Data?)
    func audioRecordingPermissionDenied()
}

class AudioRecordingService {
    weak var delegate: AudioRecordingServiceDelegate?
    private var audioEngine: AVAudioEngine?
    private var audioSession: AVAudioSession?
    private var isRecording = false
    private var audioData = Data()
    private let targetFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                             sampleRate: 16000,
                                             channels: 1,
                                             interleaved: false)!

    private func setupAudioSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .spokenAudio,
                                options: [.mixWithOthers, .allowBluetooth, .defaultToSpeaker])
        try session.setPreferredIOBufferDuration(0.005)
        try session.setActive(true, options: .notifyOthersOnDeactivation)
        audioSession = session
    }

    func checkMicrophonePermission(completion: @escaping (Bool) -> Void) {
        switch AVAudioApplication.shared.recordPermission {
        case .granted:
            completion(true)
        case .denied:
            delegate?.audioRecordingPermissionDenied()
            completion(false)
        case .undetermined:
            AVAudioApplication.requestRecordPermission { [weak self] granted in
                if !granted {
                    self?.delegate?.audioRecordingPermissionDenied()
                }
                completion(granted)
            }
        @unknown default:
            delegate?.audioRecordingPermissionDenied()
            completion(false)
        }
    }

    func toggleRecording() {
        if isRecording {
            stopRecording()
        } else {
            checkMicrophonePermission { [weak self] granted in
                if granted {
                    self?.startRecording()
                }
            }
        }
    }

    private func startRecording() {
        guard !isRecording else { return }
        do {
            try setupAudioSession()
            audioEngine = AVAudioEngine()
            guard let engine = audioEngine else { return }
            let inputNode = engine.inputNode
            let inputFormat = inputNode.inputFormat(forBus: 0)
            audioData.removeAll()
            guard let converter = AVAudioConverter(from: inputFormat, to: targetFormat) else {
                print("Failed to create audio converter")
                return
            }
            inputNode.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { [weak self] buffer, _ in
                guard let self = self else { return }
                let frameCount = AVAudioFrameCount(Double(buffer.frameLength) * 16000.0 / buffer.format.sampleRate)
                guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: self.targetFormat,
                                                          frameCapacity: frameCount) else { return }
                outputBuffer.frameLength = frameCount
                var error: NSError?
                converter.convert(to: outputBuffer, error: &error) { _, outStatus in
                    outStatus.pointee = .haveData
                    return buffer
                }
                if error == nil, let channelData = outputBuffer.int16ChannelData {
                    let dataLength = Int(outputBuffer.frameLength) * 2
                    let data = Data(bytes: channelData.pointee, count: dataLength)
                    self.audioData.append(data)
                }
            }
            engine.prepare()
            try engine.start()
            isRecording = true
            delegate?.audioRecordingDidStart()
        } catch {
            print("Recording failed to start: \(error)")
            stopRecording()
        }
    }

    private func stopRecording() {
        audioEngine?.inputNode.removeTap(onBus: 0)
        audioEngine?.stop()
        isRecording = false
        let finalData = audioData
        audioData.removeAll()
        delegate?.audioRecordingDidStop(withAudioData: finalData)
        try? audioSession?.setActive(false, options: .notifyOthersOnDeactivation)
    }

    deinit {
        if isRecording {
            stopRecording()
        }
    }
}
Granting the deprecated "Inter-App Audio" capability did not solve the problem either.
Is recording audio from a keyboard extension even possible in general? If so, how do I fix it?
Related threads:
https://developer.apple.com/forums/thread/108055
https://developer.apple.com/forums/thread/742601
We have recently encountered an App crash, as shown in the picture.
We call this function as:
let session = AVAudioSession.sharedInstance()
guard session.currentRoute.outputs.isEmpty == false else { return false }
TestFlight caught this issue, and the iOS device information is attached:
Do you have any suggestions to avoid this crash?
We are developing an application with the Core Bluetooth framework. We connect to a device over BLE and open two L2CAP channels, which can transfer data over their streams. But when we close the second L2CAP channel, the first L2CAP channel always gets closed as well.
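Roughly, this is how we open the channels and how we close only the second one (a sketch; the PSM values are placeholders for our own):
import CoreBluetooth

// Sketch: open two L2CAP channels on a connected peripheral.
func openChannels(on peripheral: CBPeripheral) {
    peripheral.openL2CAPChannel(CBL2CAPPSM(0x0081)) // first channel
    peripheral.openL2CAPChannel(CBL2CAPPSM(0x0082)) // second channel
}

// CBPeripheralDelegate callback: open the channel's streams and keep a strong
// reference to the channel while it is in use.
func peripheral(_ peripheral: CBPeripheral, didOpen channel: CBL2CAPChannel?, error: Error?) {
    guard let channel, error == nil else { return }
    channel.inputStream.open()
    channel.outputStream.open()
}

// Closing only the second channel: close its streams and drop our reference.
func close(_ channel: CBL2CAPChannel) {
    channel.inputStream.close()
    channel.outputStream.close()
}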
Recently, I've noticed that background Bluetooth scanning stops when I move an app to the background on an iPhone 17 device with Bluetooth 6, and I'm curious about a solution. Background Bluetooth scanning doesn't stop on devices running iOS versions earlier than 26, or on earlier devices that were updated to iOS 26.
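For reference, the scan is started with explicit service UUIDs, roughly like this (a sketch; the class name and the service UUID are placeholders for our own):
import CoreBluetooth

// Sketch of the scan setup used in the app.
final class Scanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Explicit service UUIDs, as required for scanning in the background.
        central.scanForPeripherals(withServices: [CBUUID(string: "180D")], options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("Discovered \(peripheral.identifier)")
    }
}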