On the iPad Pro 12.9-inch (3rd generation) cellular model, when you touch the screen with four fingers and then move them, the touches are no longer detected. The same gesture with one to three fingers works normally.
This issue does not occur when accessibility features are turned on.
Is this a beta-specific issue that will be fixed in the official release?
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
Someone is using all of these tools to hack my phone. I have several devices, and my daughter's tablets and Android devices are also compromised. They use my home hub / HomeKit, CloudKit, Xcode, SwiftUI, and Siri search. If anyone could help, I'd highly appreciate it. I have contacted the FBI and have an appointment in October, but if anyone could help me debug my phone in the meantime, I'd highly appreciate it. I have used my doctor's apps and changed iCloud accounts. I have used Family Sharing with my daughter, Screen Time, and sharing across devices. I am starting with my Outlook account that Jeremy Walker had access to. Please help me; this affects my mental and physical health and my quality time with my daughter. They are using Ethernet, hardware keyboards, VoiceOver, and AI. Please assist, as this is extremely exhausting.
Help, please.
Precondition: after the MDM app issues the restriction command, the camera on the phone is no longer visible (unusable).
After upgrading to iOS 26.1, isSourceTypeAvailable: with UIImagePickerControllerSourceTypeCamera keeps returning true even when the camera is unavailable.
On iOS 26.0.1 the same call behaves correctly, returning false when the camera is unavailable and true when it is available.
Please fix this method so it reports availability correctly. If isSourceTypeAvailable: with UIImagePickerControllerSourceTypeCamera cannot determine whether the camera is available, please provide a method that can.
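For reference, this is the check in question together with a possible fallback through AVFoundation device discovery; whether discovery also reflects the MDM camera restriction on iOS 26.1 is an assumption we have not verified:

```swift
import UIKit
import AVFoundation

// The API in question: reports whether the camera source type is available.
func cameraAvailableViaPicker() -> Bool {
    UIImagePickerController.isSourceTypeAvailable(.camera)
}

// Possible fallback (assumption: an MDM camera restriction also removes the
// device from AVFoundation discovery): ask for an actual capture device.
func cameraAvailableViaAVFoundation() -> Bool {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera],
        mediaType: .video,
        position: .unspecified)
    return !discovery.devices.isEmpty
}
```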
Hi everyone,
I’m working on a custom camera implementation in iOS using native code. My goal is to capture unprocessed, realistic images directly from the camera — without any filters or post-image processing applied by the system.
I’ve implemented RAW image capture using the native camera APIs (AVFoundation) and successfully received .dng files. However, even the RAW outputs don’t look like the real environment — the colors, tone, and exposure still seem processed or corrected in some way.
I’ve tried various configurations such as photoSettings.rawPhotoPixelFormatType, experimenting with AVCaptureDevice and AVCapturePhotoOutput settings, and reviewing ProRAW and standard RAW behavior, but I’m still not getting truly unprocessed results that reflect the actual sensor data.
Has anyone experienced similar results when capturing RAW images on iOS, or found a way to bypass Apple’s image signal processing (ISP) pipeline for more realistic captures?
Any insights or references from Apple’s camera framework behavior would be greatly appreciated.
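For reference, this is roughly how we select a Bayer sensor RAW format rather than Apple ProRAW (ProRAW output has already been demosaiced and tone-mapped by the ISP). `photoOutput` is assumed to be an already-configured AVCapturePhotoOutput; note also that most viewers apply their own default rendering when displaying a Bayer DNG, which can make even true sensor data look processed:

```swift
import AVFoundation

// Prefer a Bayer (mosaiced sensor) RAW format over Apple ProRAW, which has
// already been demosaiced and tone-mapped by the ISP.
func makeBayerRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard let bayerFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isBayerRAWPixelFormat($0)
    }) else {
        return nil // no Bayer RAW format on this device/configuration
    }
    return AVCapturePhotoSettings(rawPixelFormatType: bayerFormat)
}
```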
Thank you!
Hi everyone,
I submitted an app for review and was met with a rejection for unresolved issues.
This was what was asked in the rejection:
Provide detailed answers to the following questions:
- Does your app interact with any hardware?
Would that be referring to the camera/microphone of the device? My app uses haptics when you select an option. I didn't see anything in App Store Connect where I needed to specify the use of haptics.
Also, does this mean that when the reviewer answers me I have to resubmit as version 1.1? I'm not sure what I would need to change. This is my first app, so I'm not entirely sure of the procedure.
I'm running macOS Tahoe and I have the proper nvram boot-args; however, when I try to watch the log stream, I'm not getting any verb information related to the card I'm using. The audio system I'm using is AppleHDA.kext from the Beta 1 KDK.
I've tried asking AI, and it doesn't make a difference what it suggests. In the meantime, while I'm asking for assistance, I'll have it template a kernel extension for me that just traffics the verbs to the log, and hopefully that isn't filtered out, because what I suspect is happening is that something actually masks some of the information.
Why am I doing this? Not for the Linux driver; it's so I can see from the log where each verb came from, since that is what the developer of github.com/davidjo/snd_hda_macbookpro said he did for the Kaby Lake iMac.
We have an app that connects to an external device we developed in-house that measures electroencephalography (EEG), as well as PPG and IMU data. This is not a medical device, and we have stated that many times, but App Review keeps rejecting the app for the same reason, 1.4.1 Safety (Physical Harm), because they say it is connecting to a medical device. We have submitted documentation of FCC certification for safety, but we do not have FDA clearance because the device is not used for medical purposes; it is purely for wellness. Despite several messages explaining that it is not a medical device, the response is always the same, without actually addressing any of the supporting documents we have sent. Any help finding a way to explain to the Apple review team that not all EEG devices are medical devices, and that in fact most are not FDA approved, would be appreciated, as it seems like whoever is reviewing the app doesn't understand that.
We have developed an accessory that supports Find My. When using the Find My app to set it up, it occasionally gets stuck at the final "Setting Up" screen; the app just stays there. We would like to know what could cause this and how to resolve it.
Thanks a lot.
We are currently planning to develop a third‑party hardware accessory that supports Wi‑Fi Aware using AccessorySetupKit on iOS, based on the official documentation:
https://developer.apple.com/documentation/accessorysetupkit/
Before finalizing our hardware and firmware design, we would like to better understand the real‑world behavior and user experience of Wi‑Fi Aware in actual third‑party accessories.
Specifically, we would like to ask:
Existing Third‑Party Hardware
Are there any commercially available third‑party accessories (not Apple products) that already support Wi‑Fi Aware via AccessorySetupKit?
If so, are there any public examples, reference designs, or recommended products we can purchase to observe the real onboarding, discovery, and pairing experience?
Reference or Evaluation Hardware
Does Apple provide any reference hardware, evaluation kits, or recommended vendor solutions (for example, based on common Wi‑Fi chipsets) that are known to work well with Wi‑Fi Aware on iOS?
Are there specific Wi‑Fi chipset vendors that have validated interoperability with AccessorySetupKit?
Practical Behavior and Limitations
In real usage, what are the typical discovery latency, reliability, and background/foreground behavior developers should expect?
Are there known limitations or best practices when designing hardware that relies on Wi‑Fi Aware for initial accessory discovery and setup?
Our goal is to evaluate the feasibility and user experience of Wi‑Fi Aware for third‑party accessories by testing against existing implementations or recommended hardware, before investing heavily in custom hardware development.
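For reference, the iOS-side flow we are prototyping against is roughly the AccessorySetupKit sketch below; the accessory name and SSID prefix are placeholders, and the SSID-based descriptor properties used here predate Wi-Fi Aware support, so they should be checked against the current SDK headers:

```swift
import AccessorySetupKit
import UIKit

// Minimal AccessorySetupKit sketch using the Wi-Fi (SSID) discovery path.
final class AccessoryPairing {
    private let session = ASAccessorySession()

    func start() {
        // Activate the session and observe accessory lifecycle events.
        session.activate(on: DispatchQueue.main) { event in
            print("Accessory event: \(event.eventType)")
        }
    }

    func showPicker(productImage: UIImage) {
        let descriptor = ASDiscoveryDescriptor()
        descriptor.ssidPrefix = "EXAMPLE-"   // placeholder SSID prefix

        let item = ASPickerDisplayItem(
            name: "Example Accessory",       // placeholder accessory name
            productImage: productImage,
            descriptor: descriptor)

        // Presents the system accessory picker for discovery and pairing.
        session.showPicker(for: [item]) { error in
            if let error { print("Picker failed: \(error)") }
        }
    }
}
```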
Any guidance, examples, or pointers to existing accessories or partners would be greatly appreciated.
I want to add a Matter device to my own fabric, not to HomeKit via the Home app.
I implemented a demo with a Matter support extension, and pairing through it succeeds, but when I use MTRDeviceController to commission the device it fails. Below is the log:
Couldn't read values in CFPrefsPlistSource<0x1062ec100> (Domain: group.wxx.MatterTest, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd
<<5 [E:46634i S:0 M:188511265] (U) Msg Retransmission to 0:0000000000000000 failure (max retries:4)
PASESession timed out while waiting for a response from the peer. Expected message type was 33
controller(_:commissioningSessionEstablishmentDone:) error = nil
Error on commissioning step 'AttestationVerification': 'src/controller/CHIPDeviceController.cpp:1288: CHIP Error 0x000000AC: Internal error'
Failed verifying attestation information. Now checking DAC chain revoked status.
Failed in verifying 'Attestation Information' command received from the device: err 101. Look at AttestationVerificationResult enum to understand the errors
Error on commissioning step 'AttestationRevocationCheck': 'src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error'
Failed to send Solitary ack for MessageCounter:265529558 on exchange 46643i:src/messaging/ExchangeContext.cpp:99: CHIP Error 0x00000002: Connection aborted
Creating NSError from src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error (context: (null))
controller(_:commissioningComplete:nodeID:metrics:) error = Optional(Error Domain=MTRErrorDomain Code=1 "General error: 172" UserInfo={NSLocalizedDescription=General error: 172, errorCode=172})
Do you have any suggestions for resolving this issue?
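From reading the log, CHIP Error 0x000000AC (172) at the 'AttestationVerification' step appears to mean the device's DAC chain could not be validated, which I understand is common with test devices whose PAA certificate is not in the controller's trust store. One development-only workaround I am considering is an attestation delegate that continues commissioning anyway; the Swift signatures below are my reading of the Matter.framework headers and should be verified against the SDK:

```swift
import Matter

// Development-only sketch: continue commissioning when device attestation
// fails. Never ship this for a production fabric.
final class PermissiveAttestationDelegate: NSObject, MTRDeviceAttestationDelegate {
    // ObjC selector: deviceAttestationFailedForController:opaqueDeviceHandle:error:
    func deviceAttestationFailed(forController controller: MTRDeviceController,
                                 opaqueDeviceHandle: UnsafeMutableRawPointer,
                                 error: Error) {
        // Proceed past the failed DAC chain check during development.
        try? controller.continueCommissioningDevice(opaqueDeviceHandle,
                                                    ignoreAttestationFailure: true)
    }
}

// Usage (assumed): attach the delegate via the commissioning parameters.
// let params = MTRCommissioningParameters()
// params.deviceAttestationDelegate = PermissiveAttestationDelegate()
```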
Hello Apple Forums,
We are developing an iOS application that connects to a custom BLE accessory and sends control commands to it.
Our system architecture is as follows:
A separate hardware device collects data and sends it to our backend server via Wi-Fi.
The backend evaluates state changes and determines when the BLE accessory should update its display.
The iOS app acts purely as a BLE command executor for this accessory.
Our goal is to:
Maintain a BLE connection with the accessory while the app is in the background.
Receive state-change events from our backend server.
Upon receiving such events, send a BLE command to the accessory to update its state.
We understand that iOS does not allow arbitrary background execution. We would like to confirm whether there is any supported mechanism, entitlement, or program that allows:
Long-running background execution for BLE control, or
Server-originated events (other than APNs) to trigger background BLE actions.
If this is not supported, we would appreciate confirmation that APNs (silent push) is the only supported way to trigger such background BLE actions, or guidance on any recommended alternative architectures.
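Our current understanding is that the supported pieces are the bluetooth-central background mode, which keeps delegate callbacks for an already-established connection working in the background, plus CoreBluetooth state restoration, with APNs as the only server-originated trigger. A minimal sketch of the restoration side, assuming the Info.plist UIBackgroundModes array contains bluetooth-central; the restore identifier is a placeholder:

```swift
import CoreBluetooth

final class AccessoryBLEController: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        // The restore identifier lets iOS relaunch the app to deliver BLE
        // events after it has been terminated in the background.
        central = CBCentralManager(
            delegate: self,
            queue: nil,
            options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.accessory-central"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Reconnect to the accessory when Bluetooth becomes powered on.
    }

    // Called when iOS relaunches the app to deliver pending BLE events.
    func centralManager(_ central: CBCentralManager,
                        willRestoreState dict: [String: Any]) {
        // Re-adopt any restored peripherals and reissue pending commands here.
    }
}
```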
Thank you for your guidance.
I am developing a standard UAC 2.0 device and encountered an issue where the channel names do not update according to the iChannelNames field in the Class Specific AS Interface Descriptor when switching between different channel counts.
For example:
AS1 (6 channels) is configured with the following channel names:
ADAT 1, ADAT 2, ADAT 3, ADAT 4, HP L, HP R
AS2 (4 channels) is configured with:
ADAT 1, ADAT 2, HP L, HP R
However, when switching from AS1 (6 channels) to AS2 (4 channels), the channel names displayed in Audio MIDI Setup do not reflect the change as expected. The actual result is:
ADAT 1, ADAT 2, ADAT 3, ADAT 4
The system simply hides the last two channels; the names of the remaining channels are not updated.
Initial Topology
My original topology was as follows:
Later, I discovered that macOS uses the iChannelNames field from the Input Terminal to display channel names. Therefore, I modified the USB device descriptors and updated the topology to the following:
To distinguish the channel names for different channel counts, each Input Terminal is assigned a unique iChannelNames value.
This method worked perfectly on macOS 15. However, after updating to macOS 26, this topology no longer displays the correct channel names.
Question
On macOS 26, what is the correct method to ensure that the channel names update dynamically when switching between different audio channel configurations?
Hello,
I am an individual developer working on a macOS application using SwiftUI and RealityKit.
I would like to understand the feasibility of face-related tracking on macOS when using an external USB camera, compared to iOS/iPadOS.
Specifically:
• Does macOS provide an ARKit Face Tracking–equivalent API (e.g., real-time facial expressions, gaze direction, depth)?
• If not, is it common to rely on Vision / AVFoundation as alternatives for:
• Facial expression coefficients
• Gaze estimation
• Depth approximation
• In an environment without dedicated sensors such as TrueDepth, is it correct to assume that accurate depth data and high-fidelity blend shape extraction are realistically difficult?
Any clarification on official limitations, recommended alternatives, or relevant documentation would be greatly appreciated.
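For context, my current understanding is that macOS has no ARKit face-tracking equivalent, and that Vision can at best supply 2D face landmarks from camera frames, without depth, gaze, or blend shapes. A minimal sketch of that path:

```swift
import CoreGraphics
import Vision

// Vision-based 2D face landmarks from a single camera frame.
// No depth, gaze direction, or blend shapes are provided by this path.
func detectFaceLandmarks(in image: CGImage) throws -> [VNFaceObservation] {
    let request = VNDetectFaceLandmarksRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results ?? []
}
```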
Thank you.
Hi all,
I’m facing a device-specific issue in a live production iOS app distributed privately via the App Store. The app crashes immediately after login on one client’s iPhone, while the same account works fine on other devices. No crash log is generated in Analytics, and the app just pops back to the Home Screen.
Environment:
App: Production app on App Store
iOS version: 26.3
Devices: Only one device exhibits the crash; other iPhones work fine
Login flow: App calls an API and writes the response to a local SQLite database immediately after login
Distribution: App Store (private). Users install the app via redemption codes.
Observations:
All users on the problematic device crash immediately after login.
The crash does not occur on any other devices, including the same iOS version.
The client had already uninstalled and reinstalled the app via App Store cloud download, but the crash persisted.
No crash log appears in Analytics or Xcode (process just terminates).
Device restart had not been attempted before reinstall.
App does not use Keychain tokens; local DB is only SQLite in the app sandbox.
Hypotheses so far:
Corrupted binary or cached app installation on that device
SQLite database corruption or write failure (see the sketch after the questions below)
Device-specific OS/environment issue (temp files, file locks, provisioning)
iOS watchdog silently terminating the app during post-login DB write
Language / region differences unlikely
Questions:
Is it possible for a device to retain a corrupted app binary or cached installation even after uninstall + cloud download reinstall from the App Store?
Can uninstalling, restarting the device, and reinstalling guarantee a fresh binary and sandbox?
Are there any known iOS behaviors where a local SQLite write could trigger an instant crash on one device only, without generating crash logs?
Any other suggestions for diagnosing this device-specific post-login crash in a live production environment?
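Regarding the SQLite hypothesis, one low-risk diagnostic we are considering is logging every sqlite3 result code around the post-login write, so any failure is recorded before the process terminates. A minimal sketch; the database path and SQL are placeholders for the app's actual values:

```swift
import Foundation
import SQLite3

// Diagnostic sketch: surface sqlite3 result codes around the post-login write.
func debugWrite(dbPath: String, sql: String) {
    var db: OpaquePointer?
    guard sqlite3_open(dbPath, &db) == SQLITE_OK else {
        NSLog("sqlite open failed: %@", String(cString: sqlite3_errmsg(db)))
        return
    }
    defer { sqlite3_close(db) }
    if sqlite3_exec(db, sql, nil, nil, nil) != SQLITE_OK {
        NSLog("sqlite exec failed: %@", String(cString: sqlite3_errmsg(db)))
    }
}
```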
Thanks in advance for any guidance — this issue is affecting a client’s live usage, and we’d like to understand the root cause and best way to resolve it safely.
Hello Apple Developer Technical Support Team,
I’m working on an iOS banking/security SDK and we’re trying to match an Android feature that reads payment cards via NFC (EMV). On Android, this is implemented using an NFC scanning screen (e.g., “NfcScanActivity”) that can read EMV data from contactless credit/debit cards.
Could you please clarify the current iOS capabilities and App Store policy around this?
On iOS, is it currently possible for a third-party App Store app to read contactless credit/debit cards using Core NFC (i.e., accessing EMV application data/AIDs from payment cards)?
If this is possible, what are the supported APIs/frameworks and any entitlement requirements (if applicable)?
If this is not possible for App Store apps, could you recommend the closest acceptable alternatives for achieving a similar user outcome? For example:
Using Apple Pay / PassKit flows for payment-related experiences
Card scanning alternatives (camera-based OCR) for capturing card details (if allowed)
Using an external certified card reader accessory (MFi) and required approach/entitlements
Any other Apple-recommended approach for “card verification / identification” without reading EMV NFC data
Our goal is not to bypass security restrictions, but to provide a compliant solution on iOS comparable to Android’s NFC-based card reading, or to adopt an Apple-approved alternative if direct EMV reading is not supported.
If helpful, I can share a brief technical summary of the Android behavior and the exact data we need to obtain (e.g., whether it’s card presence verification vs. reading specific EMV tags).
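For background, our understanding so far is that Core NFC can read ISO 7816 tags, but only for application identifiers declared in the app's Info.plist allow-list (com.apple.developer.nfc.readersession.iso7816.select-identifiers), and that EMV payment AIDs are not available to third-party apps, so this path cannot read payment applications. A minimal sketch of the ISO 7816 flow, assuming a placeholder AID allow-list in Info.plist:

```swift
import CoreNFC

final class TagReader: NSObject, NFCTagReaderSessionDelegate {
    private var session: NFCTagReaderSession?

    func beginScanning() {
        session = NFCTagReaderSession(pollingOption: .iso14443, delegate: self, queue: nil)
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}
    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let first = tags.first, case let .iso7816(iso7816Tag) = first else {
            session.invalidate()
            return
        }
        session.connect(to: first) { _ in
            // The selected AID comes from the app's Info.plist allow-list;
            // EMV payment AIDs cannot be declared there by third-party apps.
            print("Selected AID: \(iso7816Tag.initialSelectedAID)")
            session.invalidate()
        }
    }
}
```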
Thank you for your guidance.
Best regards,
Imran