Prerequisite: after the MDM app issues the restriction command, the camera on the phone is no longer visible (unusable).
After upgrading to iOS 26.1, the isSourceTypeAvailable: UIImagePickerControllerSourceTypeCamera method keeps returning true even when the camera is unavailable.
On iOS 26.0.1 the same method behaves correctly, returning false when the camera is unavailable and true when it is available.
Please fix this method, or, if isSourceTypeAvailable: UIImagePickerControllerSourceTypeCamera cannot determine whether the camera is available, please provide an alternative way to make this determination.
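In the meantime, a possible workaround is to query AVFoundation directly. This is a minimal sketch, assuming that an MDM camera restriction also removes the camera from AVFoundation's discovery results (worth verifying on a managed device):

import AVFoundation

// Returns true if a camera device is currently exposed to the app.
// Assumption: an MDM restriction that disables the camera also hides it
// from AVCaptureDevice discovery; verify this on a managed device.
func isCameraUsable() -> Bool {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera],
        mediaType: .video,
        position: .unspecified)
    return !discovery.devices.isEmpty
}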
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
Hi everyone,
I’m working on a custom camera implementation in iOS using native code. My goal is to capture unprocessed, realistic images directly from the camera, without any filters or post-processing applied by the system.
I’ve implemented RAW image capture using the native camera APIs (AVFoundation) and successfully received .dng files. However, even the RAW outputs don’t look like the real environment — the colors, tone, and exposure still seem processed or corrected in some way.
I’ve tried various configurations such as photoSettings.rawPhotoPixelFormatType, experimenting with AVCaptureDevice and AVCapturePhotoOutput settings, and reviewing ProRAW and standard RAW behavior, but I’m still not getting truly unprocessed results that reflect the actual sensor data.
Has anyone experienced similar results when capturing RAW images on iOS, or found a way to bypass Apple’s image signal processing (ISP) pipeline for more realistic captures?
Any insights or references from Apple’s camera framework behavior would be greatly appreciated.
Thank you!
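One avenue worth testing is forcing a Bayer RAW format rather than ProRAW. A minimal sketch, assuming photoOutput is an AVCapturePhotoOutput on a running session; note that even Bayer DNGs carry calibration metadata that renderers apply, so fully untouched sensor data is not what most DNG viewers display:

import AVFoundation

// Sketch: prefer a Bayer RAW pixel format over ProRAW.
// Assumes `photoOutput` is already configured on a running AVCaptureSession.
func makeBayerRawSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isBayerRAWPixelFormat($0)
    }) else {
        return nil // No Bayer RAW format on this device/configuration.
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}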
Hi everyone,
I submitted an app for review and was met with a rejection for unresolved issues.
This was what was asked in the rejection:
Provide detailed answers to the following questions:
-Does your app interact with any hardware?
Would that be referring to the camera/microphone of the device? My app uses haptics when you select an option, and I didn't see anywhere in App Store Connect where I needed to declare the use of haptics.
Also, does this mean that when the reviewer answers me I have to resubmit as version 1.1? I'm not sure what I would need to change. This is my first app, so I'm not entirely sure of the procedure.
Topic:
App & System Services
SubTopic:
Hardware
I'm running macOS Tahoe and I have the proper nvram boot-args set; however, when I watch the log stream I'm not getting any verb information related to the card I'm using. The audio driver I'm using is AppleHDA.kext from the Beta 1 KDK.
I've tried asking AI, but nothing it suggests makes a difference. In the meantime, while I'm asking for assistance here, I'll have it template a kernel extension that just traffics the verbs to the log for me; hopefully this isn't filtered out, because what I suspect is happening is that the system actually masks some of this information.
Why am I doing this? Not for the Linux driver; it's so I can see from the log where the verbs come from, since that is what the developer of GitHub/davidjo/snd_hda_macbookpro did for the Kaby Lake iMac.
We have an app that connects to an external device that we developed in-house, which measures electroencephalography (EEG) as well as PPG and IMU data. This is not a medical device, and we have stated that many times, but App Review keeps rejecting the app for the same reason, guideline 1.4.1 (Safety: Physical Harm), because they say it is connecting to a medical device. We have submitted FCC certification documentation for safety, but we do not have FDA certification because the device is not used for medical purposes; it is purely for wellness. Despite several messages explaining that it is not a medical device, the response is always the same, without actually addressing any of the supporting documents we have sent. Any help finding a way to explain to the Apple team that not all EEG devices are medical, and that in fact most are NOT FDA approved, would be appreciated; it seems like whoever is reviewing the app doesn't understand that.
Topic:
App & System Services
SubTopic:
Hardware
We have developed an accessory that supports Find My. When using the Find My app to set it up, it occasionally gets stuck at the final "Setting Up" screen, and the app just stays there. We would like to know what could cause this situation and how to resolve it.
Thanks a lot.
We are currently planning to develop a third‑party hardware accessory that supports Wi‑Fi Aware using AccessorySetupKit on iOS, based on the official documentation:
https://developer.apple.com/documentation/accessorysetupkit/
Before finalizing our hardware and firmware design, we would like to better understand the real‑world behavior and user experience of Wi‑Fi Aware in actual third‑party accessories.
Specifically, we would like to ask:
Existing Third‑Party Hardware
Are there any commercially available third‑party accessories (not Apple products) that already support Wi‑Fi Aware via AccessorySetupKit?
If so, are there any public examples, reference designs, or recommended products we can purchase to observe the real onboarding, discovery, and pairing experience?
Reference or Evaluation Hardware
Does Apple provide any reference hardware, evaluation kits, or recommended vendor solutions (for example, based on common Wi‑Fi chipsets) that are known to work well with Wi‑Fi Aware on iOS?
Are there specific Wi‑Fi chipset vendors that have validated interoperability with AccessorySetupKit?
Practical Behavior and Limitations
In real usage, what are the typical discovery latency, reliability, and background/foreground behavior developers should expect?
Are there known limitations or best practices when designing hardware that relies on Wi‑Fi Aware for initial accessory discovery and setup?
Our goal is to evaluate the feasibility and user experience of Wi‑Fi Aware for third‑party accessories by testing against existing implementations or recommended hardware, before investing heavily in custom hardware development.
Any guidance, examples, or pointers to existing accessories or partners would be greatly appreciated.
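For experimenting before custom hardware exists, the picker flow itself can be exercised with a stub descriptor. A minimal sketch, with placeholder display name, image, and SSID prefix; the Wi‑Fi Aware-specific descriptor fields may differ from this SSID-based example, so treat it as a starting point only:

import AccessorySetupKit
import UIKit

// Sketch of presenting the AccessorySetupKit picker for a Wi-Fi accessory.
// The display name, image, and SSID prefix below are placeholders.
final class AccessoryPairingController {
    private let session = ASAccessorySession()

    func start() {
        session.activate(on: DispatchQueue.main) { event in
            print("Accessory session event: \(event.eventType)")
        }

        let descriptor = ASDiscoveryDescriptor()
        descriptor.ssidPrefix = "MyAccessory" // hypothetical SSID prefix

        let item = ASPickerDisplayItem(
            name: "My Accessory", // placeholder display name
            productImage: UIImage(systemName: "wifi")!,
            descriptor: descriptor)

        session.showPicker(for: [item]) { error in
            if let error { print("Picker failed: \(error)") }
        }
    }
}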
We would like to be able to distinguish between iPhones and Apple Watches when scanning for devices using a Laird BLE module.
We know that we can identify an Apple device from the manufacturer data returned in the scan report. 0x004C is the registered identifier for Apple.
In the remaining data returned, is it possible to identify the device type?
We note that, empirically, 4C001005 seems to correlate with an Apple Watch. How reliable is this?
It is useful for us, because it means we do not need to connect to this device to see if it is advertising a service that we own.
Connecting over BLE is of course an expensive operation.
Here is a simple snippet of a Swift App doing a similar thing, to illustrate the question:
func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral, advertisementData: [String: Any], rssi RSSI: NSNumber) {
    guard let manufData: Data = advertisementData[CBAdvertisementDataManufacturerDataKey] as? Data else {
        return
    }
    let hexEncodedManufData: String = manufData.map { String(format: "%02hhx", $0) }.joined()
    print("Manufacturer Data: \(hexEncodedManufData): ")
    // Manufacturer Data: 4c001007351ff9f9036238: Apple device
    // Manufacturer Data: 4c001006331ec0640f88: Apple device
    // Manufacturer Data: 4c0010052b18804eb1: Apple watch?
    // Manufacturer Data: 4c0010052b18804eb1: Apple watch?
}
I have a device. After pairing, it is shown as connected in the Bluetooth list on the phone. Before iOS 18 I could retrieve the device through the retrieveConnectedPeripherals method, but on iOS 18 and later I cannot. What is the reason? The device no longer sends advertising packets once it is connected, so how can I retrieve it?
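For context, the call takes the service UUIDs to match and only returns peripherals whose matching services the system already knows about. A minimal sketch, with a placeholder service UUID:

import CoreBluetooth

// Sketch: retrieveConnectedPeripherals(withServices:) returns only
// peripherals whose matching services are known to the system.
// The UUID below is a placeholder for the accessory's actual service.
func findConnectedPeripheral(using central: CBCentralManager) -> CBPeripheral? {
    let serviceUUID = CBUUID(string: "180D") // hypothetical service UUID
    return central.retrieveConnectedPeripherals(withServices: [serviceUUID]).first
}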
I am working on an iOS application that relies on CoreMotion Attitude, and I need clarification regarding the behavior of reference frames.
According to the documentation, when setting the attitude reference frame to CMAttitudeReferenceFrameXTrueNorthZVertical:
The Z-axis of the reference frame is vertical (aligned with gravity).
The X-axis of the reference frame points to the geographic North Pole (True North).
When a device’s orientation matches this reference frame, the roll, pitch, and yaw values reported by CMAttitude should be (0,0,0).
However, in my testing:
When I align the device’s position with the CMAttitudeReferenceFrameXTrueNorthZVertical reference frame by orienting the screen (device Z-axis) upward and the right side (device X-axis) toward north, the yaw value reported by CMAttitude is 90 degrees instead of the expected 0 degrees.
To have CMAttitude report yaw as 0, I must instead orient the top side (device Y-axis) toward north.
This seems to contradict my understanding that the device's X-axis should be aligned with True North for the device to match the attitude reference frame and for CMAttitude to report roll, pitch, and yaw as (0, 0, 0).
What am I missing?
Thank you for your time and assistance.
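For anyone reproducing this, a minimal sketch of the setup in question (reference frame availability checked first; yaw printed in degrees):

import CoreMotion

let motionManager = CMMotionManager()

// Sketch: stream attitude against the XTrueNorthZVertical frame and
// print yaw in degrees to compare against the behavior described above.
func startAttitudeUpdates() {
    guard CMMotionManager.availableAttitudeReferenceFrames().contains(.xTrueNorthZVertical) else {
        return // True North frames require heading/location availability.
    }
    motionManager.deviceMotionUpdateInterval = 0.1
    motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical, to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        print("yaw (deg): \(attitude.yaw * 180 / .pi)")
    }
}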
Topic:
App & System Services
SubTopic:
Hardware
What do I do? Is it going down too fast?
Battery health dropped from 98 to 89 within 2 months on my iPhone 15 Pro, and the cycle count is only 314.
Is the software update doing this?
Hello,
I am currently working on a USB HID-class device and I wanted to test communications between various OSes and the device.
I was able to communicate through standard USB with the device on other OSes such as Windows and Linux, through their integrated kernel modules and generic HID drivers. As a last test, I wanted to test macOS as well.
This is my code, running in a Swift-based command line utility:
import Foundation
import CoreHID

// This is the VID/PID combination that the device is actually listed under.
let matchingCriteria = HIDDeviceManager.DeviceMatchingCriteria(vendorID: 0x1234, productID: 0x0006)
let manager = HIDDeviceManager()

for try await notification in await manager.monitorNotifications(matchingCriteria: [matchingCriteria]) {
    switch notification {
    case .deviceMatched(let deviceReference):
        print("Device Matched!")
        guard let client = HIDDeviceClient(deviceReference: deviceReference) else {
            fatalError("Unable to create client. Exiting.") // crash on purpose
        }
        let report = try await client.dispatchGetReportRequest(type: .input)
        print("Get report data: [\(report.map { String(format: "%02x", $0) }.joined(separator: " "))]")
    case .deviceRemoved(_):
        print("A device was removed.")
    default:
        continue
    }
}
The client.dispatchGetReportRequest(...) line always fails, and if I turn the try expression into a force-unwrapped one (try!) then the code, unsurprisingly, crashes.
The line throws a CoreHID.HIDDeviceError.unknown() error with a seemingly meaningless IOReturn code (the last time I tried, I got an IOReturn code with the value -536870211).
The first instinct is to blame my own custom USB device for not working properly, but the code doesn't cooperate with ANY USB device currently connected: not a keyboard (with permissions granted), not a controller, nothing.
I did make sure to enable USB device access in the entitlements (when I tried to run this code in a simple Cocoa app) as well.
...What am I doing wrong here? What does the IOReturn code mean?
Thanks in advance for anybody willing to help out!
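One small aid when decoding these: IOReturn values are defined in IOKit's IOReturn.h in hex, so converting the decimal value first makes lookup easier. A tiny self-contained sketch:

import Foundation

// Convert the reported IOReturn to hex for lookup in IOKit's IOReturn.h.
let ioReturn: Int32 = -536870211
let hex = String(format: "0x%08X", UInt32(bitPattern: ioReturn))
print(hex) // 0xE00002BD, which IOReturn.h lists as kIOReturnNoMemory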
Topic:
App & System Services
SubTopic:
Hardware
Hello everyone,
I want to send haptics to a PS4 controller.
CHHapticPatternPlayer and CHHapticAdvancedPatternPlayer both work well on iPhone.
On the PS4 controller, everything works if I use CHHapticPatternPlayer, but if I use CHHapticAdvancedPatternPlayer I get an error. I want to use CHHapticAdvancedPatternPlayer for its additional settings. I haven't found any information on how to fix this:
CHHapticEngine.mm:624 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation couldn't be completed. (com.apple.CoreHaptics error -4811.)'
The engine stopped because a system error occurred.
AVHapticClient.mm:1228 -[AVHapticClient getSyncDelegateForMethod:errorHandler:]_block_invoke: ERROR: Sync XPC call for 'loadAndPrepareHapticSequenceFromEvents:reply:' (client ID 0x21) failed: Couldn't communicate with a helper application.
Failed to create or play the pattern: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics" UserInfo={NSDebugDescription=connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics}
My haptics class:

import Foundation
import CoreHaptics
import GameController

protocol HapticsControllerDelegate: AnyObject {
    func didConnectController()
    func didDisconnectController()
    func enginePlayerStart(value: Bool)
}

final class HapticsControllerManager {
    static let shared = HapticsControllerManager()

    private var isSetup = false
    private var hapticEngine: CHHapticEngine?
    private var hapticPlayer: CHHapticAdvancedPatternPlayer?

    weak var delegate: HapticsControllerDelegate? {
        didSet {
            if delegate != nil {
                startObserving()
            }
        }
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }

    private func startObserving() {
        guard !isSetup else { return }
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidConnect),
            name: .GCControllerDidConnect,
            object: nil
        )
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidDisconnect),
            name: .GCControllerDidDisconnect,
            object: nil
        )
        isSetup = true
    }

    @objc private func controllerDidConnect(notification: Notification) {
        delegate?.didConnectController()
        self.createAndStartHapticEngine()
    }

    @objc private func controllerDidDisconnect(notification: Notification) {
        delegate?.didDisconnectController()
        hapticEngine = nil
        hapticPlayer = nil
    }

    private func createAndStartHapticEngine() {
        guard let controller = GCController.controllers().first else {
            print("No controller connected")
            return
        }
        guard controller.haptics != nil else {
            print("Haptics not supported on this controller")
            return
        }
        hapticEngine = createEngine(for: controller, locality: .default)
        hapticEngine?.playsHapticsOnly = true
        do {
            try hapticEngine?.start()
        } catch {
            print("Failed to start the haptic engine: \(error)")
        }
    }

    private func createEngine(for controller: GCController, locality: GCHapticsLocality) -> CHHapticEngine? {
        guard let engine = controller.haptics?.createEngine(withLocality: locality) else {
            print("Failed to create engine.")
            return nil
        }
        print("Successfully created engine.")
        engine.stoppedHandler = { reason in
            print("The engine stopped because \(reason.message)")
        }
        engine.resetHandler = {
            print("The engine reset --> Restarting now!")
            do {
                try engine.start()
            } catch {
                print("Failed to restart the engine: \(error)")
            }
        }
        return engine
    }

    func startHapticFeedback(haptics: [CHHapticEvent]) {
        do {
            let pattern = try CHHapticPattern(events: haptics, parameters: [])
            hapticPlayer = try hapticEngine?.makeAdvancedPlayer(with: pattern)
            hapticPlayer?.loopEnabled = true
            try hapticPlayer?.start(atTime: 0)
            self.delegate?.enginePlayerStart(value: true)
        } catch {
            self.delegate?.enginePlayerStart(value: false)
            print("Failed to create or play the pattern: \(error)")
        }
    }

    func stopHapticFeedback() {
        do {
            try hapticPlayer?.stop(atTime: 0)
            self.delegate?.enginePlayerStart(value: false)
        } catch {
            self.delegate?.enginePlayerStart(value: true)
            print("Failed to stop haptic playback: \(error)")
        }
    }
}
extension CHHapticEngine.StoppedReason {
    var message: String {
        switch self {
        case .audioSessionInterrupt:
            return "the audio session was interrupted."
        case .applicationSuspended:
            return "the application was suspended."
        case .idleTimeout:
            return "an idle timeout occurred."
        case .systemError:
            return "a system error occurred."
        case .notifyWhenFinished:
            return "playback finished."
        case .engineDestroyed:
            return "the engine was destroyed."
        case .gameControllerDisconnect:
            return "the game controller disconnected."
        @unknown default:
            return "an unknown error occurred."
        }
    }
}
Custom haptic events:

static func changeVibrationPower(power: HapticPower) -> [CHHapticEvent] {
    let continuousEvent = CHHapticEvent(eventType: .hapticContinuous, parameters: [
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0),
        CHHapticEventParameter(parameterID: .hapticIntensity, value: power.value)
    ], relativeTime: 0, duration: 0.5)
    return [continuousEvent]
}
I have an iPhone XS Max and the battery health has degraded to 69.
I noticed that whenever I put it on charge it just restarts, and it keeps doing that until I start using it or keep the screen on while it charges.
Please, is it my charger, or is it because the battery health has degraded to 69?
Topic:
App & System Services
SubTopic:
Hardware
The battery level value is inaccurate: the API returns the battery percentage in multiples of 5, e.g. the actual battery percentage is 42 but the API returns 40. Please fix this issue if possible, because I checked that devices running iOS versions below 17 appear to work fine.
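For reference, a minimal sketch of the reading being described (UIDevice.batteryLevel is a Float in 0.0...1.0, and on recent iOS versions it appears quantized to 0.05 steps):

import UIKit

// Sketch: read the battery percentage via UIKit. Monitoring must be
// enabled first; a level of -1.0 means the value is unknown.
func currentBatteryPercent() -> Int? {
    UIDevice.current.isBatteryMonitoringEnabled = true
    let level = UIDevice.current.batteryLevel
    guard level >= 0 else { return nil }
    return Int((level * 100).rounded())
}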
I'm having an issue with OneDrive that is affecting our company iPads. Users were able to drag and drop any folders or files over, and now they can't. They are on the latest updates for both OneDrive and iOS. Can someone look at this? I also reached out to Microsoft, and they said that nothing has changed on their end.
When are you guys going to fix the CarPlay issues with this new update? I use this for work and it’s really an issue. Nothing is working and it takes up entirely too much space.
After updating to iOS 18.4, the 3D scanning function, including the AR features in Apple Clips, cannot be used. Does anyone else have the same problem?
Topic:
App & System Services
SubTopic:
Hardware
Every time after I download an app, this window opens and never closes. How do I close it?
Topic:
App & System Services
SubTopic:
Hardware