I am attempting to create a product and application that uses the Face ID sensor for something other than Face ID itself. Is it possible to re-use this sensor for purposes outside of Face ID?
Hardware
Hello,
We are currently using a hub built around the CYPD3125 PD controller, which is used to connect with both Android and iOS devices. While our device works seamlessly with Android devices, we are encountering an issue when connecting to iOS devices, specifically the iPad Pro.
Issue Description:
The Powerpack/Hub is intended to handle Power Delivery (PD) communications. When connected to an Android device, the PD packets are exchanged correctly and the device functions as expected. However, when connected to an iPad Pro, we observe abnormal PD packet exchanges that cause the Powerpack/Hub to malfunction.
Observations:
Attached is a snapshot of the PD packets we captured while troubleshooting. In this scenario the AC power adapter was initially connected; after a few seconds we removed the plug, waited a few seconds, and then plugged the AC power in again. The packets differ from those captured with an Android device under the same scenario.
Below is the screenshot of the PD packet capture with Apple device:
Below is the screenshot of the PD packet capture with Android device:
Technical Observations:
Initial Connection: The connection initiates but does not follow the expected PD communication sequence.
Packet Structure: In the capture, the iPad Pro shows a series of PD message types including Src Cap, Req, and Accept, but there are also unexpected messages such as Hard Reset and Soft Reset that disrupt the communication.
Timing Issues: The timestamps show irregular intervals between packets when connected to the iPad Pro, suggesting possible timing synchronization issues.
Unexpected Resets: The capture shows a Hard Reset event at packet 9, which is not observed in the Android device captures. This suggests the iPad Pro might be detecting an error and attempting to reset the connection.
Steps Taken:
Verified the firmware and hardware implementation of the Powerpack/Hub.
Ensured compliance with USB PD standards.
Tested with multiple iPad Pro units to rule out device-specific issues.
Additional Details: We also tested with an iPad Air and observed the same issue. The tests were conducted on both iOS 16 and iOS 17. We are attaching a USB PD capture with an Android device, taken under the same plug/unplug scenario described above, where everything works as expected.
Despite these steps, the issue persists. We seek guidance on any issues or peculiarities with iOS devices and USB PD communication.
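For comparison while triaging captures, the happy-path negotiation we expect is Src Cap, Req, Accept, then PS_RDY. A small sketch (plain Python; the message names are assumed to match the analyzer's labels) that flags resets and incomplete negotiation in a captured message list:

```python
# Expected happy-path USB PD negotiation order (simplified).
EXPECTED_FLOW = ["Src Cap", "Req", "Accept", "PS_RDY"]

def check_pd_sequence(messages):
    """Return a list of anomalies found in a captured list of PD message types."""
    anomalies = []
    flow_pos = 0
    for i, msg in enumerate(messages):
        if msg in ("Hard Reset", "Soft Reset"):
            anomalies.append(f"packet {i}: unexpected {msg}")
        elif flow_pos < len(EXPECTED_FLOW) and msg == EXPECTED_FLOW[flow_pos]:
            flow_pos += 1
    if flow_pos < len(EXPECTED_FLOW):
        anomalies.append(f"negotiation incomplete, stopped before {EXPECTED_FLOW[flow_pos]}")
    return anomalies
```

Running this over both captures makes the difference concrete: the Android trace produces no anomalies, while the iPad trace reports the Hard Reset at packet 9.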
Thanks
Topic:
App & System Services
SubTopic:
Hardware
I am having difficulty figuring out two battery indicators for a custom piece of battery hardware.
First, on the home screen, scrolling all the way to the left to the widget view, you can see the battery levels of connected wireless devices, Apple Pencil, etc.
Additionally, when you use an Apple battery pack, you can see its battery level at the top right of your phone.
I am wondering where I should look to see how I could integrate both of these. I have searched the documentation for a while and am having a hard time knowing where to start. If anyone can point me to something, it would be very much appreciated. Thank you!
I have a mobile phone and a number of peripherals, and I want to know the maximum number of peripherals that can be connected to the phone. Is there any clear documentation on this, or on what determines the number of simultaneous connections?
My phone keeps restarting randomly, anywhere from every few seconds to every few minutes. It also feels like it's overheating at the same time.
Here's the panic log:
{"bug_type":"210","timestamp":"2024-10-22 12:23:26.00 -0400","os_version":"iPhone OS 18.0.1 (22A3370)","roots_installed":0,"incident_id":"E2DC5427-D751-4339-827B-181625987FCF"}
{
"build" : "iPhone OS 18.0.1 (22A3370)",
"product" : "iPhone15,3",
"socId" : "8120",
"socRevision" : "11",
"incident" : "E2DC5427-D751-4339-827B-181625987FCF",
"crashReporterKey" : "df026d01d48e533d7258182dafc82a95473d59d4",
"kernel" : "Darwin Kernel Version 24.0.0: Thu Aug 8 01:15:37 PDT 2024; root:xnu-11215.2.562/RELEASE_ARM64_T8120",
"date" : "2024-10-22 12:23:26.77 -0400",
"panicString" : "panic(cpu 0 caller 0xfffffff040697444): AOP PANIC - !pulse pearl@0x1173490 - power(2) OUTBOX3 not ready - \nuser handlers:\nEiger::probe=0 [e4 30 1] conn=0\n\nPrAS Comp = stat [0, 0], dbg [205452, 41680, 416832, 0, 12665, 404163, 1, 9973, 4] \n\n\n!pulse pearl@0x1173490\nRTKit: RTKit-2758.2.1.debug - Client: iphone15aop:DEBUG:AppleSPUFirmwareBuilder-632.0.717199\n!UUID: 26e83ac2-7536-30ff-9f4c-57d91477a498\nASLR slide: 0x0000000000000000\nTime: 0x000000025b5db6d5\n\nFaulting task 2 Call Stack: 0x00000000011078c4 0x0000000001107250 0x0000000001107064 0x000000000110acf4 0x000000000110adb4 0x00000000010ef884 0x00000000010f3c38 0x00000000010e9438 0x00000000010148cc 0x00000000010e93a8 0x00000000011069b4 0x0000000001106730
I added some AVCaptureControls, then removed all of them. The result is that the AVCaptureSession's controls array reports 0 controls, but the keys can still display the previously added AVCaptureControls.
// guard is Swift-only; in Objective-C use an early return instead.
if (!_session.supportsControls) {
    return;
}
// Remove any controls left over from a previous configuration.
for (AVCaptureControl *control in _session.controls) {
    [_session removeControl:control];
}
@weakify(self);
if (self.captureControl.zoom) {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    AVCaptureSlider *zoomSlider = [self.captureControl.zoom fetchCaptureSlider];
    [zoomSlider setActionQueue:dispatch_get_main_queue() action:^(float zoomFactor) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeZoomScale:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeZoomScale:zoomFactor];
        }
    }];
    self.zoomScaleControl = zoomSlider;
} else {
    self.zoomScaleControl = nil;
}
if (self.captureControl.exposure) {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    AVCaptureSlider *exposureSlider = [self.captureControl.exposure fetchCaptureSlider];
    [exposureSlider setActionQueue:dispatch_get_main_queue() action:^(float bias) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeExposureBias:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeExposureBias:bias];
        }
    }];
    self.exposureBiasControl = exposureSlider;
} else {
    self.exposureBiasControl = nil;
}
if (self.captureControl.len) {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    ORLenCaptureControlCustomModel *len = self.captureControl.len;
    AVCaptureIndexPicker *picker = [len fetchCaptureSlider];
    [picker setActionQueue:dispatch_get_main_queue() action:^(NSInteger selectedIndex) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:didChangeLenIndex:datas:)]) {
            [self.dataOutputDelegate videoCaptureSession:self didChangeLenIndex:selectedIndex datas:self.captureControl.len.indexDatas];
        }
    }];
    self.lenControl = picker;
} else {
    self.lenControl = nil;
}
if ([_session canAddControl:self.zoomScaleControl]) {
    [_session addControl:self.zoomScaleControl];
} else {
    self.zoomScaleControl = nil;
}
if ([_session canAddControl:self.lenControl]) {
    [_session addControl:self.lenControl];
} else {
    self.lenControl = nil;
}
if ([_session canAddControl:self.exposureBiasControl]) {
    [_session addControl:self.exposureBiasControl];
} else {
    self.exposureBiasControl = nil;
}
[_session setControlsDelegate:self queue:GetCaptureControlQueue()];
On October 9th I had 97% battery health, and after not even a month (on October 28th) it had dropped to 94%. Does anyone know why? I haven't been using my phone or charging it more than usual.
I'm using an iPhone 15 Pro Max.
This is really concerning! 3% in less than a month?
I don't know what else to say other than that the 18.0.1 update did not resolve the camera issue on my iPhone 16 Pro. It still crashes a few times a week, and I have to restart my phone every time. I heavily use my camera as I have 2 toddlers. C'mon, Apple, get it together; I've owned this phone for a month now.
We have a USB dongle that we would like to connect to the iPhone for power (over USB-C). Since it is not MFi, we will then use Bluetooth for communication between the dongle and an app.
When doing bluetooth pairing between the dongle and iPhone, it would be ideal to only see the dongle that is plugged into the iPhone listed in the app. This is to avoid connecting to other dongles that may be in the area.
We think this could be possible using USB descriptors. We assume the iPhone can read the USB descriptors for non-MFi dongles.
Our question is, can our app see the USB-descriptors of the dongle? Is iOS able to pass that info to the app?
Then, we could have a unique USB descriptor for each dongle and the app could only list bluetooth devices with that descriptor (effectively filtering out any other dongles in the area).
Any help and/or feedback is greatly appreciated :)
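Whether iOS exposes USB descriptors of non-MFi accessories to apps is exactly the open question here; one way to sidestep it is to do the filtering entirely on the BLE side, e.g. by putting a vendor-specific identifier in each dongle's advertisement manufacturer data. A sketch of that filtering logic (the company ID and tag bytes below are placeholders, not a real registration):

```python
# Hypothetical filter: keep only advertisements whose manufacturer data
# starts with our (assumed) company ID and dongle tag.
COMPANY_ID = b"\xff\xff"   # placeholder Bluetooth SIG company identifier
DONGLE_TAG = b"DGL1"       # hypothetical per-product tag

def is_our_dongle(manufacturer_data: bytes) -> bool:
    """True when the advertisement carries our vendor prefix."""
    return manufacturer_data.startswith(COMPANY_ID + DONGLE_TAG)

def filter_scan_results(results):
    """results: list of (name, manufacturer_data) tuples from a BLE scan."""
    return [name for name, data in results if is_our_dongle(data)]
```

On iOS the equivalent would be checking the advertisement data delivered to the CoreBluetooth scan callback; the sketch only shows the matching logic, not the platform API.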
Hello Apple Support Team,
I'm experiencing an issue with my iPhone 15 Pro. Although the battery health shows 100%, the phone shuts down unexpectedly at various charge levels, sometimes as high as 70% or even 40%. My iPhone is currently on iOS 18.2 beta 2, but this issue began with iOS 18.2 beta 1.
I’ve tried multiple troubleshooting steps:
Formatted the iPhone
Performed a force reboot
Upgraded and then downgraded the software
Unfortunately, none of these solutions resolved the problem. The panic report doesn’t appear to show anything conclusive. I'm attaching the panic report for further analysis, as I’m unsure if this is a software bug related to the beta or a hardware issue.
Thank you for your assistance.
Hi, I made an earlier post about this and unfortunately can't locate it, so I'm doing a new one with
screenshots. After I click on the Done button, I can briefly see it already residing on my phone.
Please see the attached photos.
Is this normal?
I have an iPhone 15 Pro Max.
Thank you,
D
When targeting Mac, it does not seem to be possible.
I tried on iOS with "import HomeKit", but it didn't find my device, a HomePod mini (a new unit on the upgraded version).
The user has already granted Local Network permission, yet the system still reports "unsatisfied (Local network prohibited), interface: en0[802.11], ipv4, uses wifi". What causes this? Restarting the phone resolves it.
Hello, I am making an app that requires the use of builtInTrueDepthCamera. I am trying to set a custom exposure mode but the completion for:
open func setExposureModeCustom(duration: CMTime, iso ISO: Float) async -> CMTime
never gets called.
It works perfectly fine for builtInLiDARDepthCamera. In my function I am confirming that a custom exposure mode is supported and it is within the range of acceptable durations. Is it known that this just does not work?
Here is my AVCaptureDevice extension function:
extension AVCaptureDevice {
    func setExposureTime(to milliseconds: Double) async {
        print("setting exposure time 1")
        await withCheckedContinuation { continuation in
            // Check if custom exposure mode is supported
            guard self.isExposureModeSupported(.custom) else {
                print("Custom exposure mode is not supported on this device.")
                continuation.resume() // Resume immediately if not supported
                return
            }
            // Convert milliseconds to CMTime (microseconds over a 1_000_000 timescale)
            let exposureTime = CMTimeMake(value: Int64(milliseconds * 1_000), timescale: 1_000_000)
            print("Exposure time var : \(exposureTime.seconds * 1000)")
            print("Exposure time min : \(self.activeFormat.minExposureDuration.seconds * 1000)")
            print("Exposure time max : \(self.activeFormat.maxExposureDuration.seconds * 1000)")
            // Ensure the exposure time is within the supported range
            guard exposureTime >= self.activeFormat.minExposureDuration,
                  exposureTime <= self.activeFormat.maxExposureDuration else {
                print("Exposure time is out of the supported range.")
                continuation.resume() // Resume immediately if out of range
                return
            }
            print("setting exposure time 2")
            // Attempt to set the exposure time
            do {
                try self.lockForConfiguration()
                print("setting exposure time 3")
                self.setExposureModeCustom(duration: exposureTime, iso: AVCaptureDevice.currentISO) { time in
                    print("Exposure time set to: \(time.seconds * 1000) ms")
                    continuation.resume() // Resume after the completion handler is executed
                }
                self.unlockForConfiguration()
            } catch {
                print("Failed to configure exposure: \(error)")
                continuation.resume() // Resume on failure
            }
        }
    }
}
I am running the same Python script using the TensorFlow Metal module on computers with M3 and M4 GPUs. While 1 epoch takes 5 minutes on the M3 device, it takes 15 minutes on the M4 device. What could be the reason for this? Could it be that TensorFlow Metal is not yet optimized for the M4 architecture?
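One way to narrow this down is to time the phases of an epoch separately (data loading vs. the training step itself), so an input-pipeline bottleneck on one machine isn't mistaken for slower GPU kernels. A framework-agnostic timing harness (the workload below is a dummy stand-in for your own pipeline or step function):

```python
import time

def time_phase(fn, repeats=3):
    """Run fn several times and return the best wall-clock time in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

# Dummy workload standing in for one phase (e.g. one training step):
def dummy_step():
    sum(i * i for i in range(100_000))

step_time = time_phase(dummy_step)  # compare this per phase on M3 vs M4
```

If the per-step GPU time is similar on both machines but the epoch time differs, the gap is likely outside the Metal kernels (data loading, memory pressure, or thermal throttling).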
Tags:
ML Compute
Metal Performance Shaders
tensorflow-metal
I am developing a Bluetooth pointer device to control an iPad using HID. Most functionality works well, including mouse movement and button presses. However, I am encountering a strange issue with button releases. For the iPad, I defined the following HID descriptor:
0501   // Usage Page (Generic Desktop)
0902   // Usage (Mouse)
A101   // Collection (Application)
8503   // Report ID (3)
0509   // Usage Page (Button)
1901   // Usage Minimum (Button 1)
2902   // Usage Maximum (Button 2), 2 buttons
1500   // Logical Minimum (0)
2501   // Logical Maximum (1)
9502   // Report Count (2), one bit per button
7501   // Report Size (1)
8102   // Input (Data, Variable, Absolute)
9501   // Report Count (1)
7506   // Report Size (6), padding to a full byte
8103   // Input (Constant)
0501   // Usage Page (Generic Desktop)
0901   // Usage (Pointer)
A100   // Collection (Physical)
1500   // Logical Minimum (0)
26FF7F // Logical Maximum (32767)
0930   // Usage (X)
0931   // Usage (Y)
7510   // Report Size (16)
9502   // Report Count (2)
8102   // Input (Data, Variable, Absolute): absolute coordinate pointer
C0     // End Collection
Do you see an issue with the descriptor?
Example packages sent over bluetooth:
0xA103 01 4F3A FB50 // 01 is a left button press; works well, an icon is clicked on the iPad.
0xA103 00 4F3A FB50 // 00 should be a left button release; the packet is sent and received, but the button is NOT released and stays held on the iPad. The mouse coordinates update fine, however; I expect the button to be released when sending 0xA103 00 4F3A FB50, but it is held down instead.
Perhaps there is a special requirement for iOS to make this work? It is close to fully functioning.
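For reference, here is how a report matching a descriptor like the one above should decode (plain-Python sketch; 0xA1 is the BLE HID input-report header, 0x03 the report ID, and X/Y are little-endian 16-bit values). Decoding both the press and release packets confirms the only bit that changes on the wire is the button bit:

```python
def decode_report(packet: bytes):
    """Decode a pointer input report: A1 (input), report ID 3, buttons byte, X, Y."""
    assert packet[0] == 0xA1 and packet[1] == 0x03, "not an input report for ID 3"
    buttons = packet[2]
    x = int.from_bytes(packet[3:5], "little")  # 16-bit absolute X
    y = int.from_bytes(packet[5:7], "little")  # 16-bit absolute Y
    return {"left": bool(buttons & 0x01), "right": bool(buttons & 0x02), "x": x, "y": y}
```

If the decode of the release packet shows `left: False` as expected, the payload itself is well-formed and the problem lies in how iOS interprets the reports, not in the bytes being sent.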
I am working on a Bluetooth Low Energy (BLE) project using the nRF52840 Development Kit (DK), which has been reconfigured to simulate an nRF52805 chip. The firmware is based on Nordic Semiconductor's ble_app_hids_keyboard example, with modifications to implement a BLE HID Gamepad. I am using the S113 SoftDevice and have successfully tested the functionality with Android devices. The gamepad is recognized as a HID device, and it works as expected on Android, verified using the hardwareTester website.
However, when I connect the gamepad to an iPhone via BLE, the same hardwareTester website does not respond as it does on Android, indicating that the iPhone does not recognize the device as a gamepad. The BLE connection is established successfully, but it seems iOS does not interpret the HID report descriptor or the BLE HID service correctly. I suspect there might be compatibility issues with the HID descriptor or the GATT attributes for iOS-specific BLE HID requirements.
I would appreciate some help with this.
I never imagined that an Apple product could do such a thing. I've updated to the latest version, 15.3. What should I do next time? I've had to restart it three times; the last one finally helped.
Here is the link: https://youtu.be/-aqjzVKMZGA
I have a C++/Objective-C command line application, running on macOS 15.1.1 (24B91), that communicates with a Bluetooth LE peripheral. The application is built with Apple clang 16.0.0 and CMake as the build system, using Boost.Asio.
I'm able to establish an L2CAP channel, and after the channel is established, the peripheral sends a first (quite small) SDU on that channel to the application. The PSM is 0x80 and was chosen by the peripheral's BLE stack. The application receives the PSM via a GATT notification.
I can see the SDU being sent in a single LL PDU with Wireshark. I can also see the SDU being received in Apple's PacketLogger. But I miss the corresponding call to a stream event handler. For all other GATT-related events, the corresponding delegates/callbacks are called.
The code that creates a dispatch queue and passes it to the CBCentralManager looks like this:
dispatch_queue = dispatch_queue_create("de.torrox.ble_event_queue", NULL);
manager = [[CBCentralManager alloc] initWithDelegate:self queue:dispatch_queue options:nil];
When the L2CAP channel is established, the didOpenL2CAPChannel callback gets called from a thread within the dispatch_queue (has been verified with lldb):
- (void)peripheral:(CBPeripheral *)peripheral
    didOpenL2CAPChannel:(CBL2CAPChannel *)channel
                  error:(NSError *)error
{
    [channel inputStream].delegate = self;
    [channel outputStream].delegate = self;
    [[channel inputStream] scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [[channel outputStream] scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [[channel inputStream] open];
    [[channel outputStream] open];
    ...
    // a reference to the channel is stored in the outside channel object
    [channel retain];
    ...
}
Yet, not a single stream event is generated:
- (void)stream:(NSStream *)stream
    handleEvent:(NSStreamEvent)event_code
{
    Log( @"stream:handleEvent %@, %lu", stream, event_code );
    ...
}
When I add functionality to poll the input stream, the stream reports the expected L2CAP input, but no event is generated.
The main thread of execution is usually blocking on a boost::asio::io_context::run() call. The design is to have the stream callback stream:handleEvent post callback invocations on that io_context, thus waking up the main thread and getting those callbacks invoked on the main thread.
All asynchronous GATT delegate calls work as expected; the only missing events are those from the L2CAP streams. The same code worked in an older project on an older version of macOS with an older version of Boost.
How can I find out why the stream delegates are not called?
At present, I am using the AVFoundation external device API to connect my iPad to a DSLR camera for data collection. On my end, I am using AVCaptureVideoDataOutput to obtain the raw data for processing and rendering. However, the pixel buffer returned from the system layer is incomplete: only a portion cropped from the middle is returned. Using the Mac API, everything is normal. I would like to ask how to obtain the complete pixel buffer of the image on iPad.