I have a small furnace that heats a sample of my product, a cylinder measuring 20 mm x 100 mm. As it is heated, it expands thermally; at the moment I read a micrometer dial gauge by eye and type the value, in hundredths of a millimeter, into my app. My source code uses Quartz to draw a PDF graph that represents this thermal expansion.

I would like to enhance my setup by having my app collect the linear displacement from my dilatometer furnace automatically. One possibility I have been trying to implement is to have my app collect DC millivolts from an instrument, such as a voltmeter reading the output of an LVDT (Linear Variable Differential Transformer), but the model I have requires the Modbus RTU protocol, which is not well supported on macOS. It also requires a USB-serial converter, which complicates things and makes the setup less reliable.

At the moment I have to sit next to my furnace for 3 hours and type the temperature and linear displacement into my app by hand. I would welcome comments and suggestions from you or an Apple engineer on how this could be achieved.
Here is a video of my dilatometer furnace:
https://www.correiofacil.com/video/IMG_8888.MOV
Here is a PDF I generated using my APP:
https://www.correiofacil.com/PDF/117.pdf
Here is a video showing how I type data manually into my app:
https://www.correiofacil.com/video/recording2.mov
Here is a video that shows my app drawing a graph:
https://www.correiofacil.com/video/recording1.mov
These two images show my attempts to use the ModBusRtuMaster app to collect data from the voltmeter, which currently fail:
https://www.correiofacil.com/PDF/8f9be971dee32a58c36b307dbab3dc2d.png
https://www.correiofacil.com/PDF/4ac20c2ea762fd42ed340a7a2b567900.png
You will notice that although the instrument is displaying 0.000 volts, the data collected changes with each message I send to the instrument, which shows that my Modbus RTU setup is not working.
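As a reference point for anyone trying to reproduce this, here is a minimal sketch of how a Modbus RTU "Read Holding Registers" request is framed, including the CRC-16 the instrument expects. It is not tied to any serial library, and the slave address and register address below are assumptions; the real values come from the meter manual.

func crc16Modbus(_ bytes: [UInt8]) -> UInt16 {
    // Standard Modbus CRC-16: init 0xFFFF, reflected polynomial 0xA001
    var crc: UInt16 = 0xFFFF
    for byte in bytes {
        crc ^= UInt16(byte)
        for _ in 0..<8 {
            crc = (crc & 1) != 0 ? (crc >> 1) ^ 0xA001 : crc >> 1
        }
    }
    return crc
}

func readHoldingRegistersFrame(slave: UInt8, startRegister: UInt16, count: UInt16) -> [UInt8] {
    var frame: [UInt8] = [slave, 0x03,
                          UInt8(startRegister >> 8), UInt8(startRegister & 0xFF),
                          UInt8(count >> 8), UInt8(count & 0xFF)]
    let crc = crc16Modbus(frame)
    frame.append(UInt8(crc & 0xFF))   // CRC is transmitted low byte first
    frame.append(UInt8(crc >> 8))
    return frame
}

// Example: slave 1, first holding register, one register
// readHoldingRegistersFrame(slave: 1, startRegister: 0x0000, count: 1)
// -> 01 03 00 00 00 01 84 0A

If the request does not match what the meter expects (wrong register address, data type, or byte order), the values read back will not correspond to the display, which may explain what the screenshots show.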
Here is a link to the manual of the DC voltmeter I am using:
https://www.correiofacil.com/PDF/manual1.pdf
Here is an image of the LVDT I am trying to use:
https://www.correiofacil.com/PDF/e0349986512f9273b26b209baa3ba5c7.png
Here is the manual for the LVDT:
https://www.correiofacil.com/PDF/GGD19.pdf
I suspect there could be a completely different approach to collecting the data and feeding it into my app, maybe Bluetooth?
Maybe using a Mitutoyo gauge such as this one:
https://www.mitutoyo.com/Images/CatalogUS-1003/resources/_pdfs_/US-1003_Catalog_251.pdf
And a cable with SPC support like this:
https://www.gw-style.com/product-p-942977.html
I am very grateful for your comments.
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
Recently, I've noticed that background Bluetooth scanning stops when I move an app to the background on an iPhone 17 (a Bluetooth 6 device), and I'm looking for a solution. Background Bluetooth scanning doesn't stop on devices running versions older than iOS 26, or on devices that were updated from an iPhone 17 or earlier to iOS 26.
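For context, a condensed sketch of the kind of scanning setup involved (the class name and service UUID are placeholders); background delivery generally requires scanning for explicit service UUIDs and the bluetooth-central background mode in Info.plist:

import CoreBluetooth

final class BackgroundScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Background scanning needs explicit service UUIDs; "180D" (Heart Rate) is a placeholder.
        central.scanForPeripherals(withServices: [CBUUID(string: "180D")],
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: false])
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("Discovered \(peripheral.identifier), RSSI \(RSSI)")
    }
}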
We are preparing to implement document signing using USB tokens on iOS and macOS. Several other applications already support this feature.
From my testing and development efforts, I've been unable to reliably access or utilize certificates stored on a smartcard through the iOS APIs. Here are the specifics:
Environment
iOS: 15 and later
Xcode: Versions 18 and 26
Smartcard/Token: ePass 2003 (eMudhra), Feitian token (Capricorn)
Observed Issue:
The token is recognized at the system level, with certificates visible in Keychain Access.
However, programmatic access to the private keys on the smartcard from within the app is not working.
Signing attempts result in Error 6985 and CACC errors.
Approaches Tried:
Updated provisioning profiles with the following entitlements:
com.apple.developer.smartcard
com.apple.security.device.usb
TKSmartCard
Employed TKSmartCard and TKSmartCardSession for interaction.
The token is detected successfully.
A session can be established, but there's no straightforward method to leverage it for certificate-based signing.
Access to signing functions is unavailable; operations yield Error 6985 or CACC errors.
import CryptoTokenKit

// someSlot is a TKSmartCardSlot obtained from TKSmartCardSlotManager.default
if let smartCard = someSlot.makeSmartCard() {
    smartCard.beginSession { success, error in
        guard success, error == nil else {
            print("Session error: \(String(describing: error))")
            return
        }
        // SELECT command header (APDU)
        let command: [UInt8] = [0x00, 0xA4, 0x04, 0x00]
        smartCard.transmit(Data(command)) { response, error in
            print("Response: \(String(describing: response))")
            print("Error: \(String(describing: error))")
        }
    }
}
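For what it's worth, status word 0x6985 in ISO 7816 means "conditions of use not satisfied", which on many cards points to a prerequisite step (for example, PIN verification or selecting a security environment) that was not performed before the signing command. Continuing inside the session from the snippet above, a rough sketch of a VERIFY command; the PIN reference (0x81) and the PIN value are purely hypothetical and card-specific:

// Hypothetical VERIFY: CLA 0x00, INS 0x20, P1 0x00, P2 = PIN reference (card-specific)
let pin: [UInt8] = Array("123456".utf8)                 // placeholder PIN
var verify: [UInt8] = [0x00, 0x20, 0x00, 0x81, UInt8(pin.count)]
verify.append(contentsOf: pin)
smartCard.transmit(Data(verify)) { response, error in
    // The final two bytes of the response are the status word; 0x90 0x00 means success.
    print("VERIFY response: \(String(describing: response)), error: \(String(describing: error))")
}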
CryptoTokenKit (macOS/iOS)
- Utilized TKTokenWatcher to identify available tokens on macOS (not available on iOS).
let watcher = TKTokenWatcher()
watcher.setInsertionHandler { tokenID in
    print("Token detected: \(tokenID)")
}
CryptoKit / Security Framework
- Attempted to retrieve SecCertificate using SecItemCopyMatching queries, which succeeded on macOS but failed on iOS.
import Security

let query: [CFString: Any] = [
    kSecClass: kSecClassCertificate,
    kSecReturnRef: true,
    kSecMatchLimit: kSecMatchLimitAll
]
var items: CFTypeRef?
let status = SecItemCopyMatching(query as CFDictionary, &items)
print("Status: \(status)") // macOS succeeds, iOS fails
ExternalAccessory Framework (EAAccessory)
* Investigated using EAAccessory and EASession for external token communication, but it did not function as expected.
This functionality is critical for my project. Has anyone successfully implemented smartcard-based signing on iOS? Any guidance, sample code, or references to relevant Apple documentation would be greatly appreciated.
Topic: App & System Services / SubTopic: Hardware
Tags: iOS, Apple CryptoKit, USBDriverKit, CryptoTokenKit
We have recently encountered an app crash, as shown in the picture.
We call it as follows:
let session = AVAudioSession.sharedInstance()
guard session.currentRoute.outputs.isEmpty == false else { return false }
TestFlight caught this issue, and the iOS device information is attached:
Do you have any suggestions to avoid this crash?
Topic: App & System Services / SubTopic: Hardware
Hello,
I am a developer planning to build an application using Apple's new SpeechTranscriber technology.
I am facing an issue where SpeechTranscriber is not available on my iPad Pro (11-inch, 2nd generation, model number: MXDC2J/A), even though I have updated it to iPadOS 26. I was under the impression that SpeechTranscriber would be available on any device running iPadOS 26. Could you please clarify if this is incorrect?
Furthermore, I am planning to purchase a new iPad with an A16 chip for the development and deployment of this application. Can you confirm if SpeechTranscriber will be fully functional on an iPad equipped with the A16 chip?
Thank you for your assistance.
I have some logic which requires NFC support on the device. This is what I'm using to make sure that it's available:
isNFCMissing = !NFCNDEFReaderSession.readingAvailable && !NFCTagReaderSession.readingAvailable && !NFCVASReaderSession.readingAvailable
Is it possible for isNFCMissing to be true even if the device has an NFC chip?
The minimum iOS version for the application is 16, which is only supported on devices that have an NFC chip to begin with.
Hello,
The Apple Pencil Pro brought with it the UICanvasFeedbackGenerator API, which lets us trigger haptic feedback on discrete events initiated by the pencil. That works fine.
My question then: is it possible / are we "allowed" to trigger haptic feedback on events that weren't initiated by the pencil?
For example, say the user is dragging a slider with a left-hand finger while holding the Pencil in their right hand: would it be possible to make the Pencil vibrate to indicate that the dragged slider knob has reached a certain point?
Or is the rule that vibration is only possible/allowed when the Pencil itself generated a touch?
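To make the question concrete, here is a hedged sketch of what I have in mind (the class, view, and snap threshold are placeholders): requesting feedback from a slider's value-changed handler rather than from a Pencil touch, and asking whether the system would route it to the Pencil in that case.

import UIKit

final class SliderViewController: UIViewController {
    // UICanvasFeedbackGenerator is attached to a view, as introduced alongside Apple Pencil Pro.
    private lazy var feedback = UICanvasFeedbackGenerator(view: view)

    @objc func sliderChanged(_ slider: UISlider) {
        // Hypothetical snap point: request feedback when the knob crosses the midpoint,
        // even though the drag comes from a finger, not the Pencil.
        if abs(slider.value - 0.5) < 0.01 {
            let point = CGPoint(x: slider.frame.midX, y: slider.frame.midY)
            feedback.alignmentOccurred(at: point)
        }
    }
}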
Thanks!
According to the Accessory Design Guidelines, iPadOS supports HID trackpads.
Is there a design example of such a supported device?
I have tried to adapt our device software to the guidelines, without any result on the iPad.
Topic: App & System Services / SubTopic: Hardware
Hello,
We are testing Wallet passes with iBeacons in iOS 26 Beta.
In earlier iOS releases, when a device was in proximity to a registered beacon, the corresponding pass would surface automatically.
In iOS 26 Beta, this behavior no longer occurs, even if the pass is already present in Wallet. I have not found documentation of this change in the iOS 26 release notes.
Could you please confirm whether this is expected in iOS 26, or if it may be a Beta-specific issue? Any pointers to updated documentation would be appreciated.
Thank you.
My MacBook speakers have started crackling on every sound since macOS 26 Beta 1, and the problem is still present in Beta 9.
It happens especially when the Simulator is open.
We are developing an application with the Core Bluetooth framework. We connect to a device over BLE and open two L2CAP channels, and data transfers over their streams correctly. However, when we close the second L2CAP channel, the first L2CAP channel is always closed as well.
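For reference, a minimal sketch of what we would expect to work, assuming each CBL2CAPChannel is tracked separately and "closing" a channel means closing only that channel's streams (the PSMs, names, and run-loop scheduling are placeholders for however the app actually manages them):

import CoreBluetooth

final class L2CAPChannels: NSObject, CBPeripheralDelegate, StreamDelegate {
    private var firstChannel: CBL2CAPChannel?
    private var secondChannel: CBL2CAPChannel?

    func open(on peripheral: CBPeripheral, firstPSM: CBL2CAPPSM, secondPSM: CBL2CAPPSM) {
        peripheral.openL2CAPChannel(firstPSM)
        peripheral.openL2CAPChannel(secondPSM)
    }

    func peripheral(_ peripheral: CBPeripheral, didOpen channel: CBL2CAPChannel?, error: Error?) {
        guard let channel, error == nil else { return }
        // Keep the two channels (and their stream pairs) separate.
        if firstChannel == nil { firstChannel = channel } else { secondChannel = channel }
        channel.inputStream.delegate = self
        channel.outputStream.delegate = self
        channel.inputStream.schedule(in: .main, forMode: .default)
        channel.outputStream.schedule(in: .main, forMode: .default)
        channel.inputStream.open()
        channel.outputStream.open()
    }

    // Close only the second channel's streams; the first channel's streams stay open.
    func closeSecondChannel() {
        guard let channel = secondChannel else { return }
        channel.inputStream.close()
        channel.outputStream.close()
        channel.inputStream.remove(from: .main, forMode: .default)
        channel.outputStream.remove(from: .main, forMode: .default)
        secondChannel = nil
    }
}

If the first channel still closes with this kind of separation, it may be worth checking whether the peripheral connection itself is being torn down when the second channel is closed.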
I am debugging ImageCaptureCore to communicate with external cameras.
When I call the PTP function below to send a command with data, the response takes more than 5 seconds to arrive. After waiting for a while I do obtain the response, but in the callback both responseData.length and ptpResponseData.length are zero.
- (void)requestSendPTPCommand:(NSData *)ptpCommand
                      outData:(NSData *)ptpData
                   completion:(void (^)(NSData *responseData, NSData *ptpResponseData, NSError *error))completion;
data is below:
Wrote 1 = 0x1 bytes PTP:send data: (hexdump of 1 bytes)
[ ] I/PTP (14564): 0000 01 - .
Topic: App & System Services / SubTopic: Hardware
I am writing to seek assistance regarding an iBeacon implementation issue we are experiencing in our iOS application.
Issue Description: We have successfully implemented iBeacon functionality in our app, but we are encountering a specific problem with background region monitoring:
When app is in foreground: Our app successfully detects iBeacon signals and triggers notifications when entering beacon regions.
When app is terminated: Our app fails to respond when entering our own iBeacon regions. However, we have observed an interesting behavior:
Third-party iBeacon apps can still detect and trigger notifications for their beacon regions
After a third-party app triggers, our app suddenly starts receiving notifications for our own iBeacon hardware
Technical Details:
iOS Version: 18.0
Xcode Version: 16.4
Device Models Tested: iPhone 15 Pro
Questions:
What could be causing our app to fail detecting iBeacon regions when terminated, while third-party apps work correctly?
Why does our iBeacon detection start working only after another iBeacon app triggers?
Are there specific implementation requirements or best practices for reliable background iBeacon monitoring?
Could this be related to iOS background app refresh policies or system resource management?
Current Implementation: We have implemented the standard Core Location framework with the following (a condensed sketch follows this list):
CLLocationManager with appropriate authorization
Region monitoring setup with CLBeaconRegion
Background modes enabled for location services
Proper delegate methods implemented
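Here is that condensed sketch; the UUID and identifier are placeholders:

import CoreLocation

final class BeaconMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()   // "Always" authorization is what allows relaunch on region entry

        let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!   // placeholder
        let region = CLBeaconRegion(uuid: uuid, identifier: "com.example.beacon")
        region.notifyOnEntry = true
        region.notifyOnExit = true
        region.notifyEntryStateOnDisplay = true
        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // The local notification is posted here; this delegate is also expected to fire
        // after the system relaunches the terminated app for a region event,
        // provided the manager and delegate are recreated at launch.
    }
}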
We would greatly appreciate your guidance on resolving this issue, as it significantly impacts our app's user experience.
Thank you for your time and support.
Topic: App & System Services / SubTopic: Hardware
On the iPad Pro 12.9-inch (3rd generation) cellular model, when you touch the screen with four fingers and then move your fingers, the touch is no longer detected. The same operation with one to three fingers works normally.
This phenomenon does not occur when accessibility is turned on.
Is this a beta-specific issue that will be fixed in the official release?
I am working on an app that requires the usage of CoreBluetooth – using both its CBPeripheralManager and CBCentralManager classes. Our app works with other phones and hardware peripherals to exchange data – so we wanted to explore adding AccessorySetupKit to streamline the hardware connection process.
AccessorySetupKit has been integrated (while CBPeripheralManager is turned off) and works great, but even with ASK added to our app's plist file and not in use, CBPeripheralManager fails with error: Cannot create a CBPeripheralManager while using AccessorySetupKit framework.
Is there any workaround or suggested path forward here? We'd still really like to use ASK while keeping our existing functionality, but are not seeing a clear way to do so.
Hello,
I would like to discuss the expiration behavior of NFCPresentmentIntentAssertion (tested on iOS 18.5).
In the documentation we have :
The intent assertion expires if any of the following occur:
The intent assertion object deinitializes
Your app goes into the background
15 seconds elapse
But in fact, only the first rule is applied.
The expiration seems random after using CardSession, which makes it difficult to give the user a good experience.
Has anyone faced the same kind of issue, or can anyone offer an explanation?
Regards,
François
Hello,
The application I'm working on must report new hardware connections. To retrieve information about connected displays and monitor new connections, I'm using the Core Graphics framework (see the recommendation at https://developer.apple.com/forums/thread/779945).
The monitoring logic relies on a callback function that is invoked when the local display configuration changes (kCGDisplayAddFlag/kCGDisplayRemoveFlag).
#import <Cocoa/Cocoa.h>

static void displayChanged(CGDirectDisplayID displayID, CGDisplayChangeSummaryFlags flags, void *userInfo)
{
    uint32_t vendor = CGDisplayVendorNumber(displayID);
    if (flags & kCGDisplayAddFlag)
    {
        if (vendor == kDisplayVendorIDUnknown)
        {
            NSLog(@"I/O Kit cannot identify the monitor. kDisplayVendorIDUnknown. displayId = %u", displayID);
            return;
        }
        NSLog(@"%u connected. vendor(%u)", displayID, vendor);
    }
    if (flags & kCGDisplayRemoveFlag)
    {
        NSLog(@"%u disconnected", displayID);
    }
}

int main(int argc, const char * argv[])
{
    @autoreleasepool
    {
        CGDisplayRegisterReconfigurationCallback(displayChanged, NULL);
        NSApplicationLoad();
        CFRunLoopRun();
    }
    return 0;
}
The test environment is a Mac mini with an external display connected via HDMI. Everything works correctly until the system enters sleep mode. Upon wakeup, the app reports two displays: the first with vendor ID kDisplayVendorIDUnknown and the second with the expected vendor ID.
Why does Core Graphics report two connections during wakeup? Is there any way to avoid this?
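In case it helps frame the question, one workaround I have been considering (a hedged sketch, not a confirmed fix) is to treat an add notification right after wake as provisional, and re-query the display a moment later before reporting it:

import CoreGraphics
import Foundation

// Hedged idea: defer the report briefly and re-read the vendor, since the first
// read during wake may still reflect a placeholder configuration.
func handleDisplayAdded(_ displayID: CGDirectDisplayID) {
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        guard CGDisplayIsOnline(displayID) != 0 else { return }   // ignore IDs that vanished again
        let vendor = CGDisplayVendorNumber(displayID)
        print("\(displayID) connected, vendor \(vendor)")
    }
}

I would still prefer to understand why the duplicate, unknown-vendor notification is delivered in the first place.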
Thank you in advance.
Dear Apple Developer / MFi Program Support,
I am exploring technical possibilities for screen sharing and remote interaction between iOS devices and external hardware (e.g., embedded systems, in-vehicle systems) for a prototype we are currently developing.
I have reviewed the public iOS developer documentation, but I would appreciate your guidance and clarification on the following advanced use cases, particularly in the context of MFi or enterprise-level integrations:
Full-Screen Sharing of iOS Device
Is it possible to mirror or stream the entire iOS screen, even when the app is running in the background or not in the foreground?
Does ReplayKit or any other framework under the MFi or enterprise entitlements allow full-device screen capture outside the app context?
Remote Touch Injection and Control
Is there any officially supported mechanism, under MFi or otherwise, that allows external systems to remotely control an iOS device’s touch interface (e.g., simulate gestures, taps, swipes)?
Are any of the following permitted under special entitlements:
Access to IOHIDEventSystem or similar private APIs for input injection?
Communication over USB or network to relay control commands that simulate direct user interaction?
Hardware-Level Integration and Entitlements
Does the MFi Program allow:
Use of private frameworks or entitlements to build low-level integrations for iOS device control or mirroring?
Communication over USB/Lightning/USB-C to enable bi-directional interaction (streaming out, commands in)?
What are the specific APIs or entitlements available under MFi that enable these use cases?
Can you provide references to documentation, SDKs, or prerequisites for companies seeking such capabilities?
Eligibility and Certification Process
What are the criteria to be approved for the MFi program with access to such advanced capabilities?
Can PoC or early-stage research prototypes be eligible, or is MFi access restricted to commercial production intent?
How long does it typically take to gain access to these entitlements (assuming NDA and certification requirements are met)?
Alternative Pathways
If MFi access is not feasible in the short term, is there any Apple-supported alternative path (e.g., test device provisioning, enterprise signing, custom profiles) that permits more advanced capabilities for prototyping purposes?
We are not looking to publish this as a general App Store app at this stage, but rather to demonstrate feasibility as part of an innovation prototype that may lead to further OEM-level engagement in the future.
Thank you for your support and guidance.
Best regards,
Topic: App & System Services / SubTopic: Hardware
I’m currently experiencing a concerning battery issue on my MacBook Pro and wanted to share it here in case others are facing something similar or if it’s possibly tied to a software-level behavior.
Device & Software Info:
• MacBook Pro 14” (2023)
• macOS 26.0 (25A5316i)
• Battery Cycle Count: 307
• Battery Health: 91% (as of today)
• Condition: Normal
Issue:
• Battery health has dropped from 96% to 91% in less than a week.
• Each day for the last week, the maximum capacity has declined by about 1%, even though usage is light and the Mac is plugged in most of the time with Battery Health Management enabled.
• When the battery drops below 10%, it drains extremely fast: the percentage decreases nearly second by second, which doesn't feel normal.
I bought my iPhone 16 Pro Max on October 28 (9 months ago) and have rarely let the battery drop below 20%. I also set the charge limit to 80% to preserve battery health. I am not a heavy multitasking user, and I used a fan to keep the phone cool during charging. But today I updated to the iOS 26 public beta and battery health dropped to 99% at 88 cycles, which is quite a low cycle count. Many other users see their battery health drop to 99% only after 120-130+ cycles of daily usage. Why did mine drop after updating? I am quite unhappy with it, and iOS 26 is very jittery on my phone.
I have bought iphone 16 pro max on October 28, ( 9 months ago) and rarely dropped by battery health below 20%. I also set limit to 80% so I can preserve my Battery Health. I am not a multitasking user. I used fan to keep the phone cool during charging. But today I update to iOS26 public beta, It dropped to 99% at 88 Cycle which is quite low cycle. Many other user are getting their battery health dropped to 99% after 120, 130+ cycle with daily usage. Why mine got dropped after updating. I am quite unhappy with it, and iOS 26 is so jittery in my phone