Hello,
We are testing Wallet passes with iBeacons in iOS 26 Beta.
In earlier iOS releases, when a device was in proximity to a registered beacon, the corresponding pass would surface automatically.
In iOS 26 Beta, this behavior no longer occurs, even if the pass is already present in Wallet. I have not found documentation of this change in the iOS 26 release notes.
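For reference, pass relevance for beacons is driven by the beacons array in pass.json; ours looks roughly like the sketch below (the UUID and text here are placeholders), and it is unchanged from earlier iOS releases:

{
  "formatVersion": 1,
  "beacons": [
    {
      "proximityUUID": "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0",
      "relevantText": "Your pass is ready to use here."
    }
  ]
}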
Could you please confirm whether this is expected in iOS 26, or if it may be a Beta-specific issue? Any pointers to updated documentation would be appreciated.
Thank you.
I'm building a React Native call application using the following combination of libraries:
https://github.com/react-native-webrtc/react-native-callkeep
https://github.com/react-native-webrtc/react-native-webrtc
https://github.com/react-native-webrtc/react-native-voip-push-notification
When I tap the speaker button on the call screen displayed by CallKit and turn it ON, the button reverts to OFF after a few seconds.
However, when the speaker button reverts to OFF, the actual audio output route does not return to the earpiece; the audio continues to come out of the speaker without any change.
Could you please advise on what cases might cause the speaker button display to revert, and if there are any potential solutions?
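For context, on the native side the usual pattern is to configure the session in the CXProviderDelegate audio callback and force the route explicitly when the user toggles speakerphone; a minimal sketch (not specific to react-native-callkeep) is:

import AVFoundation
import CallKit

final class CallAudioController: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {}

    // CallKit activates the audio session for us; just configure it here.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        try? audioSession.setCategory(.playAndRecord, mode: .voiceChat,
                                      options: [.allowBluetooth])
    }

    // Call this from the speaker toggle; .none restores the default (earpiece) route.
    func setSpeaker(_ on: Bool) {
        try? AVAudioSession.sharedInstance().overrideOutputAudioPort(on ? .speaker : .none)
    }
}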
Someone is using all of these tools to hack my phone. I have several devices, and my daughter's tablets and Androids are also compromised. They use my Home hub / HomeKit, CloudKit, Xcode, SwiftUI, and Siri search. If anyone could help I'd highly appreciate it. I have contacted the FBI and have an appointment in October, but if anyone could help me debug my phone in the meantime I'd highly appreciate it. They have used my doctor's apps, and I have changed my iCloud accounts. They used Family Sharing with my daughter, Screen Time, and sharing across devices. It started with my Outlook account that Jeremy Walker had access to. Please help me; it affects my mental and physical health and quality time with my daughter. They are using Ethernet and hardware keyboards / VoiceOver and AI. Please assist, as this is extremely exhausting.
help pls
I have a small furnace that heats a sample of my product, a cylinder measuring 20 mm x 100 mm. As the sample is heated it expands, and at the moment I read a micrometer dial gauge by eye and type the value, in hundredths of a millimeter, into my app. My code uses Quartz to draw a PDF graph representing this thermal expansion.

I would like to enhance my setup by having my app collect the linear displacement from my dilatometer furnace directly. One possibility I have been trying to implement is to have the app collect DC millivolts from an instrument, such as a voltmeter reading the output of an LVDT (Linear Variable Differential Transformer). However, the model I have requires the Modbus RTU protocol, which is not well supported by macOS developers, and it needs a USB-serial converter, which complicates things and makes the setup less reliable.

At the moment I have to sit next to my furnace for 3 hours, typing the temperature and linear displacement into my app. I would appreciate comments and suggestions from you or an Apple engineer on how this could be achieved.
Here is a video of my dilatometer furnace:
https://www.correiofacil.com/video/IMG_8888.MOV
Here is a PDF I generated using my APP:
https://www.correiofacil.com/PDF/117.pdf
Here is a video showing how I type data manually into my app:
https://www.correiofacil.com/video/recording2.mov
Here is a video that shows my app drawing a graph:
https://www.correiofacil.com/video/recording1.mov
These 2 images show my attempts to use the app ModBusRtuMaster to collect data from the voltmeter which at this moment fails:
https://www.correiofacil.com/PDF/8f9be971dee32a58c36b307dbab3dc2d.png
https://www.correiofacil.com/PDF/4ac20c2ea762fd42ed340a7a2b567900.png
You will notice that although the instrument is displaying 0.000 volts, the data collected changes with each message I send to the instrument, which proves to me that the Modbus RTU setup is not working.
Here is a link to the manual of the DC voltmeter I am using:
https://www.correiofacil.com/PDF/manual1.pdf
Here is an image of the LVDT I am trying to use:
https://www.correiofacil.com/PDF/e0349986512f9273b26b209baa3ba5c7.png
Here is the manual for the LVDT:
https://www.correiofacil.com/PDF/GGD19.pdf
I suspect that there could be a completely different approach to collecting the data and inputting into my app, maybe Bluetooth?
Maybe using a Mitutoyo gauge such as this one:
https://www.mitutoyo.com/Images/CatalogUS-1003/resources/_pdfs_/US-1003_Catalog_251.pdf
And a cable with SPC support like this:
https://www.gw-style.com/product-p-942977.html
I am very grateful for your comments
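In case it helps, the lowest-level approach I am considering for the USB-serial path is plain POSIX serial I/O; below is a minimal sketch (the device path and baud rate are assumptions for my adapter, and Modbus RTU framing with its CRC-16 would still have to be implemented on top):

import Foundation
import Darwin

// Hypothetical device path; the actual name depends on the USB-serial driver.
let path = "/dev/cu.usbserial-1410"
let fd = open(path, O_RDWR | O_NOCTTY)
guard fd >= 0 else { fatalError("open failed: \(String(cString: strerror(errno)))") }

var tty = termios()
tcgetattr(fd, &tty)
cfmakeraw(&tty)                   // raw mode: no line buffering or echo
cfsetspeed(&tty, speed_t(B9600))  // 9600 baud, a common Modbus RTU rate
tty.c_cflag |= tcflag_t(CS8) | tcflag_t(CLOCAL) | tcflag_t(CREAD) // 8 data bits, ignore modem lines
tcsetattr(fd, TCSANOW, &tty)

// Read whatever bytes the instrument sends (a real Modbus RTU master would
// first write a request frame ending in a CRC-16, then parse the reply).
var buffer = [UInt8](repeating: 0, count: 256)
let n = read(fd, &buffer, buffer.count)
if n > 0 {
    print("Received:", buffer[0..<n].map { String(format: "%02X", $0) }.joined(separator: " "))
}
close(fd)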
This is a regression since iOS 13. Is there no one at Apple interested in fixing this?
FB9856371
Hello,
I am a developer planning to build an application using Apple's new SpeechTranscriber technology.
I am facing an issue where SpeechTranscriber is not available on my iPad Pro (11-inch, 2nd generation, model number: MXDC2J/A), even though I have updated it to iPadOS 26. I was under the impression that SpeechTranscriber would be available on any device running iPadOS 26. Could you please clarify if this is incorrect?
Furthermore, I am planning to purchase a new iPad with an A16 chip for the development and deployment of this application. Can you confirm if SpeechTranscriber will be fully functional on an iPad equipped with the A16 chip?
Thank you for your assistance.
Hello.
Is there a solution to the issue where Core Bluetooth does not run in the background on the iPhone 17?
https://developer.apple.com/library/archive/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothBackgroundProcessingForIOSApps/PerformingTasksWhileYourAppIsInTheBackground.html
The bluetooth-central Background Execution Mode
When an app that implements the central role includes the UIBackgroundModes key with the bluetooth-central value in its Info.plist file, the Core Bluetooth framework allows your app to run in the background to perform certain Bluetooth-related tasks. While your app is in the background you can still discover and connect to peripherals, and explore and interact with peripheral data. In addition, the system wakes up your app when any of the CBCentralManagerDelegate or CBPeripheralDelegate delegate methods are invoked, allowing your app to handle important central role events, such as when a connection is established or torn down, when a peripheral sends updated characteristic values, and when a central manager’s state changes.
Although you can perform many Bluetooth-related tasks while your app is in the background, keep in mind that scanning for peripherals while your app is in the background operates differently than when your app is in the foreground. In particular, when your app is scanning for devices while in the background:
The CBCentralManagerScanOptionAllowDuplicatesKey scan option key is ignored, and multiple discoveries of an advertising peripheral are coalesced into a single discovery event.
If all apps that are scanning for peripherals are in the background, the interval at which your central device scans for advertising packets increases. As a result, it may take longer to discover an advertising peripheral.
These changes help minimize radio usage and improve the battery life on your iOS device.
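Based on that documentation, a minimal background-capable central looks like the sketch below (the service UUID is a placeholder; background scanning requires explicit service UUIDs, and a restoration identifier lets the system relaunch the app for pending Bluetooth events):

import CoreBluetooth

final class BackgroundScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private let serviceUUID = CBUUID(string: "180D") // placeholder service

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "my.central"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // A nil (wildcard) service list returns no results while backgrounded.
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Reclaim peripherals the system was tracking on our behalf.
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("Discovered \(peripheral.identifier), RSSI \(RSSI)")
    }
}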
Hello, we are developing hardware that needs to connect to an iPhone via Wi-Fi to send requests to a server. On Android, we have managed to create a programmatic local hotspot within the app to facilitate connection and improve the user experience.
On iOS, however, Personal Hotspot must be manually enabled from the system settings, and the user must manually enter the SSID and password, which significantly degrades the UX.
My questions are:
Is there a workaround, unofficial method, or private API to generate a local hotspot from an app on iOS, similar to what can be done on Android?
Is there an alternative within the MFi program or through specific frameworks to facilitate a quick and automatic connection between the hardware and the iPhone without relying on the manual Personal Hotspot?
Are there any best practices for improving the local Wi-Fi connection experience between an accessory and an iPhone in the absence of hotspot controls?
I would appreciate any guidance, experience, or resources that would help me better understand the feasible options in iOS for scenarios where fast and direct communication between hardware and mobile devices via Wi-Fi is required.
Translated with DeepL.com (free version)
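One documented alternative I am aware of is joining the accessory's own Wi-Fi network from the app with NEHotspotConfiguration, which shows a one-tap system prompt instead of sending the user to Settings; a minimal sketch (the SSID and passphrase are placeholders, and the Hotspot Configuration entitlement is required):

import NetworkExtension

let config = NEHotspotConfiguration(ssid: "MyAccessory-1234",
                                    passphrase: "accessory-secret",
                                    isWEP: false)
config.joinOnce = true // do not persist the network after this session

NEHotspotConfigurationManager.shared.apply(config) { error in
    if let error = error {
        print("Join failed: \(error.localizedDescription)")
    } else {
        print("Joined accessory network; begin talking to the device.")
    }
}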
Recently, I've noticed that background Bluetooth scanning stops when I move an app to the background on an iPhone 17 (Bluetooth 6) device. I'm looking for a solution. Background Bluetooth scanning does not stop on devices running iOS versions earlier than 26, or on models earlier than the iPhone 17 that were updated to iOS 26.
We are developing an application with the Core Bluetooth framework. We connect to a device over BLE and open two L2CAP channels, which can transfer data over their streams. However, when we close the second L2CAP channel, the first L2CAP channel is always closed as well.
We have recently encountered an App crash, as shown in the picture.
We call this function as:
let session = AVAudioSession.sharedInstance()
guard session.currentRoute.outputs.isEmpty == false else { return false }
TestFlight caught this issue, and the iOS device information is attached:
Do you have any suggestions to avoid this crash?
Hi, as other threads have already discussed, I'd like to record audio from a keyboard extension.
The keyboard has been granted both Full Access and microphone access. Nonetheless, whenever I attempt to start a recording from my keyboard, it fails to start with the following error:
Recording failed to start: Error Domain=com.apple.coreaudio.avfaudio Code=561145187 "(null)" UserInfo={failed call=err = PerformCommand(*ioNode, kAUStartIO, NULL, 0)}
This is the code I am using:
import Foundation
import AVFoundation

protocol AudioRecordingServiceDelegate: AnyObject {
    func audioRecordingDidStart()
    func audioRecordingDidStop(withAudioData: Data?)
    func audioRecordingPermissionDenied()
}

class AudioRecordingService {
    weak var delegate: AudioRecordingServiceDelegate?
    private var audioEngine: AVAudioEngine?
    private var audioSession: AVAudioSession?
    private var isRecording = false
    private var audioData = Data()
    private let targetFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                             sampleRate: 16000,
                                             channels: 1,
                                             interleaved: false)!

    private func setupAudioSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .spokenAudio,
                                options: [.mixWithOthers, .allowBluetooth, .defaultToSpeaker])
        try session.setPreferredIOBufferDuration(0.005)
        try session.setActive(true, options: .notifyOthersOnDeactivation)
        audioSession = session
    }

    func checkMicrophonePermission(completion: @escaping (Bool) -> Void) {
        switch AVAudioApplication.shared.recordPermission {
        case .granted:
            completion(true)
        case .denied:
            delegate?.audioRecordingPermissionDenied()
            completion(false)
        case .undetermined:
            AVAudioApplication.requestRecordPermission { [weak self] granted in
                if !granted {
                    self?.delegate?.audioRecordingPermissionDenied()
                }
                completion(granted)
            }
        @unknown default:
            delegate?.audioRecordingPermissionDenied()
            completion(false)
        }
    }

    func toggleRecording() {
        if isRecording {
            stopRecording()
        } else {
            checkMicrophonePermission { [weak self] granted in
                if granted {
                    self?.startRecording()
                }
            }
        }
    }

    private func startRecording() {
        guard !isRecording else { return }
        do {
            try setupAudioSession()
            audioEngine = AVAudioEngine()
            guard let engine = audioEngine else { return }
            let inputNode = engine.inputNode
            let inputFormat = inputNode.inputFormat(forBus: 0)
            audioData.removeAll()
            guard let converter = AVAudioConverter(from: inputFormat, to: targetFormat) else {
                print("Failed to create audio converter")
                return
            }
            inputNode.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { [weak self] buffer, _ in
                guard let self = self else { return }
                let frameCount = AVAudioFrameCount(Double(buffer.frameLength) * 16000.0 / buffer.format.sampleRate)
                guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: self.targetFormat,
                                                          frameCapacity: frameCount) else { return }
                outputBuffer.frameLength = frameCount
                var error: NSError?
                converter.convert(to: outputBuffer, error: &error) { _, outStatus in
                    outStatus.pointee = .haveData
                    return buffer
                }
                if error == nil, let channelData = outputBuffer.int16ChannelData {
                    let dataLength = Int(outputBuffer.frameLength) * 2
                    let data = Data(bytes: channelData.pointee, count: dataLength)
                    self.audioData.append(data)
                }
            }
            engine.prepare()
            try engine.start()
            isRecording = true
            delegate?.audioRecordingDidStart()
        } catch {
            print("Recording failed to start: \(error)")
            stopRecording()
        }
    }

    private func stopRecording() {
        audioEngine?.inputNode.removeTap(onBus: 0)
        audioEngine?.stop()
        isRecording = false
        let finalData = audioData
        audioData.removeAll()
        delegate?.audioRecordingDidStop(withAudioData: finalData)
        try? audioSession?.setActive(false, options: .notifyOthersOnDeactivation)
    }

    deinit {
        if isRecording {
            stopRecording()
        }
    }
}
Granting the deprecated "Inter-App Audio" capability did not solve the problem either.
Is recording audio from a keyboard extension even possible in general? If so, how do I fix it?
Related threads:
https://developer.apple.com/forums/thread/108055
https://developer.apple.com/forums/thread/742601
We are preparing to implement document signing using USB tokens on iOS and macOS. Several other applications already support this feature.
From my testing and development efforts, I've been unable to reliably access or utilize certificates stored on a smartcard through the iOS APIs. Here are the specifics:
Environment
iOS: 15 and later
Xcode: Versions 18 and 26
Smartcard/Token: ePass 2003 (eMudhra), Feitien token (Capricorn)
Observed Issue:
The token is recognized at the system level, with certificates visible in Keychain Access.
However, programmatic access to the private keys on the smartcard from within the app is not working.
Signing attempts result in Error 6985 and CACC errors.
Approaches Tried:
Updated provisioning profiles with the following entitlements:
com.apple.developer.smartcard
com.apple.security.device.usb
TKSmartCard
Employed TKSmartCard and TKSmartCardSession for interaction.
The token is detected successfully.
A session can be established, but there's no straightforward method to leverage it for certificate-based signing.
Access to signing functions is unavailable; operations yield Error 6985 or CACC errors.
if let smartCard = someSlot.makeSmartCard() {
    smartCard.beginSession { success, error in
        guard success else {
            print("Session error: \(String(describing: error))")
            return
        }
        // SELECT command APDU header
        let command: [UInt8] = [0x00, 0xA4, 0x04, 0x00]
        smartCard.transmit(Data(command)) { response, error in
            print("Response: \(String(describing: response))")
            print("Error: \(String(describing: error))")
        }
    }
}
TokenKit (macOS/iOS)
- Utilized TKTokenWatcher to identify available tokens on macOS (not available on iOS).
watcher.setInsertionHandler { tokenID in
    print("Token detected: \(tokenID)")
}
CryptoKit / Security Framework
- Attempted to retrieve SecCertificate using SecItemCopyMatching queries, which succeeded on macOS but failed on iOS.
let query: [CFString: Any] = [
    kSecClass: kSecClassCertificate,
    kSecReturnRef: true,
    kSecMatchLimit: kSecMatchLimitAll
]
var items: CFTypeRef?
let status = SecItemCopyMatching(query as CFDictionary, &items)
print("Status: \(status)") // macOS succeeds, iOS fails
ExternalAccessory Framework (EAAccessory)
* Investigated using EAAccessory and EASession for external token communication, but it did not function as expected.
This functionality is critical for my project. Has anyone successfully implemented smartcard-based signing on iOS? Any guidance, sample code, or references to relevant Apple documentation would be greatly appreciated.
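For reference, the pattern I would expect to work on iOS is querying keys through the token access group and signing with SecKeyCreateSignature; this is only a sketch and assumes a CryptoTokenKit token extension for the card is installed (the algorithm also depends on the key type):

import Foundation
import Security

let keyQuery: [CFString: Any] = [
    kSecClass: kSecClassKey,
    kSecAttrAccessGroup: kSecAttrAccessGroupToken, // keys surfaced by token extensions
    kSecAttrKeyClass: kSecAttrKeyClassPrivate,
    kSecReturnRef: true,
    kSecMatchLimit: kSecMatchLimitAll
]
var result: CFTypeRef?
if SecItemCopyMatching(keyQuery as CFDictionary, &result) == errSecSuccess,
   let keys = result as? [SecKey], let key = keys.first {
    var error: Unmanaged<CFError>?
    let message = Data("data to sign".utf8)
    // Assumes an RSA key; use an ECDSA algorithm for EC keys.
    if let signature = SecKeyCreateSignature(key, .rsaSignatureMessagePKCS1v15SHA256,
                                             message as CFData, &error) as Data? {
        print("Signature (\(signature.count) bytes)")
    } else {
        print("Signing failed: \(String(describing: error?.takeRetainedValue()))")
    }
}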
Hello,
The Apple Pencil Pro brought with it the UICanvasFeedbackGenerator API, which lets us trigger haptic feedback on discrete events initiated by the pencil. That works fine.
My question then: is it possible / are we "allowed" to trigger haptic feedback on events that weren't initiated by the pencil?
For example, say the user is using a left hand finger to drag a slider, while holding the pencil in their right hand-- would it be possible to make the pencil vibrate to indicate the dragged slider knob reached a certain point?
Or is the rule that vibration is only possible/allowed when the pencil itself generated a touch?
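For reference, the pencil-initiated case that works looks roughly like this minimal sketch (the method and property names are mine):

import UIKit

final class CanvasViewController: UIViewController {
    private lazy var feedback = UICanvasFeedbackGenerator(view: view)

    // Called from pencil-driven touch handling when the stroke snaps to a guide.
    func snappedToGuide(at location: CGPoint) {
        // Plays haptics on Apple Pencil Pro when the event came from the pencil.
        feedback.alignmentOccurred(at: location)
    }
}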
Thanks!
I am debugging ImageCaptureCore to communicate with external cameras.
When I call the PTP function below to send a command with data, the response times out for more than 5 seconds. After waiting for a period of time, I do obtain the response; however, in the response callback both responseData.length and ptpResponseData.length are zero.
- (void)requestSendPTPCommand:(NSData *)ptpCommand
                      outData:(NSData *)ptpData
                   completion:(void (^)(NSData *responseData, NSData *ptpResponseData, NSError *error))completion;
The data sent is below:
Wrote 1 = 0x1 bytes PTP:send data: (hexdump of 1 bytes)
[ ] I/PTP (14564): 0000 01 - .
I have some logic which requires NFC support on the device. This is what I'm using to make sure that it's available:
isNFCMissing = !NFCNDEFReaderSession.readingAvailable && !NFCTagReaderSession.readingAvailable && !NFCVASReaderSession.readingAvailable
Is it possible for isNFCMissing to be true even if the device has an NFC chip?
The minimum iOS version for the application is 16 which is only supported on devices with an NFC chip to begin with.
According to the Accessory Design Guidelines, iPadOS supports HID trackpads.
Is there a design example of such a supported device?
I have tried to adapt the device's software to the guidelines, without any result on iPad.
My MacBook speakers have started crackling on every sound since macOS 26 Beta 1, and the problem is still present in Beta 9.
It happens especially when the Simulator is open.
I am writing to seek assistance regarding an iBeacon implementation issue we are experiencing in our iOS application.
Issue Description: We have successfully implemented iBeacon functionality in our app, but we are encountering a specific problem with background region monitoring:
When app is in foreground: Our app successfully detects iBeacon signals and triggers notifications when entering beacon regions.
When app is terminated: Our app fails to respond when entering our own iBeacon regions. However, we have observed an interesting behavior:
Third-party iBeacon apps can still detect and trigger notifications for their beacon regions
After a third-party app triggers, our app suddenly starts receiving notifications for our own iBeacon hardware
Technical Details:
iOS Version: 18.0
Xcode Version: 16.4
Device Models Tested: iPhone 15 Pro
Questions:
What could be causing our app to fail detecting iBeacon regions when terminated, while third-party apps work correctly?
Why does our iBeacon detection start working only after another iBeacon app triggers?
Are there specific implementation requirements or best practices for reliable background iBeacon monitoring?
Could this be related to iOS background app refresh policies or system resource management?
Current Implementation: We have implemented standard Core Location region monitoring (a minimal sketch follows this list) with:
CLLocationManager with appropriate authorization
Region monitoring setup with CLBeaconRegion
Background modes enabled for location services
Proper delegate methods implemented
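A minimal sketch of that setup (the UUID and identifier are placeholders):

import CoreLocation

final class BeaconMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization() // terminated-app delivery needs Always
        let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!
        let region = CLBeaconRegion(uuid: uuid, identifier: "com.example.beacon")
        region.notifyEntryStateOnDisplay = true
        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // The system relaunches a terminated app to deliver this callback.
        print("Entered \(region.identifier)")
    }
}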
We would greatly appreciate your guidance on resolving this issue, as it significantly impacts our app's user experience.
Thank you for your time and support.
I am developing an app that communicates with an external BLE device over GATT. The device has a secure-read characteristic exposing some of its data, and it requires pairing/bonding in order to communicate with it.
I was able to pair and connect with the device using AccessorySetupKit and .bluetoothPairingLE option:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
In this case, when setting up the accessory, I was prompted to compare passkeys, and after confirming I could read the characteristic, etc.
Then I tried adding .confirmAuthorization to the picker item, and the problems started:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
pickerItem.setupOptions = [.confirmAuthorization]
When setting up, I can see a passkey to be confirmed, but after confirming, the setup UI gets stuck in a loading state. Under the hood, in the logs, I can see that my app has connected to the peripheral and was able to read the characteristic.
I am unsure why the UI is stuck in the loading state in this case. What is the difference when using the .confirmAuthorization option, and what should be the proper flow of events to set up an accessory and then access a protected characteristic?
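For completeness, the surrounding flow I am using looks roughly like this sketch (the service UUID, names, and image are placeholders):

import UIKit
import CoreBluetooth
import AccessorySetupKit

let session = ASAccessorySession()
session.activate(on: DispatchQueue.main) { event in
    switch event.eventType {
    case .activated:
        let descriptor = ASDiscoveryDescriptor()
        descriptor.bluetoothServiceUUID = CBUUID(string: "FFF0") // placeholder
        descriptor.supportedOptions = [.bluetoothPairingLE]
        let pickerItem = ASPickerDisplayItem(name: "My Device",
                                             productImage: UIImage(named: "device")!,
                                             descriptor: descriptor)
        pickerItem.setupOptions = [.confirmAuthorization]
        session.showPicker(for: [pickerItem]) { error in
            if let error { print("Picker failed: \(error)") }
        }
    case .accessoryAdded:
        // Only after this event should the app start its CBCentralManager work.
        print("Accessory added: \(String(describing: event.accessory))")
    default:
        break
    }
}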