Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.

I updated my iPhone to iOS 26 and went to check whether it was up to date, but I came across this: when I go to Software Update, it says “Can’t check for updates.”
Hey, I need to know how to use texture mapping for rendering a spectrogram in Metal, since I need to smooth the spectrogram. In my current project I am using a vertex-based approach, which results in blocky behaviour between quads. I need to smooth across each quad so that the values form a smooth gradient.
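In case it helps anyone, a minimal Swift sketch of the texture-based approach (the function names and the row-major layout are my assumptions): write the magnitudes into a single-channel float texture and sample it with a linear sampler, so the GPU bilinearly interpolates between bins instead of rendering one flat-coloured quad per bin.

import Metal

// Upload spectrogram magnitudes into a single-channel float texture.
func makeSpectrogramTexture(device: MTLDevice,
                            magnitudes: [Float],   // row-major, width x height values
                            width: Int,
                            height: Int) -> MTLTexture? {
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .r32Float,
                                                        width: width,
                                                        height: height,
                                                        mipmapped: false)
    desc.usage = .shaderRead
    guard let texture = device.makeTexture(descriptor: desc) else { return nil }
    magnitudes.withUnsafeBytes { bytes in
        texture.replace(region: MTLRegionMake2D(0, 0, width, height),
                        mipmapLevel: 0,
                        withBytes: bytes.baseAddress!,
                        bytesPerRow: width * MemoryLayout<Float>.stride)
    }
    return texture
}

// A linear sampler: sampling the texture with this in the fragment shader
// interpolates between neighbouring bins, which removes the blockiness.
func makeLinearSampler(device: MTLDevice) -> MTLSamplerState? {
    let desc = MTLSamplerDescriptor()
    desc.minFilter = .linear
    desc.magFilter = .linear
    desc.sAddressMode = .clampToEdge
    desc.tAddressMode = .clampToEdge
    return device.makeSamplerState(descriptor: desc)
}

The fragment shader then just samples the texture at the interpolated UV coordinates, so a single full-view quad replaces the per-bin quads.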
I want to get the Cover Art image over Bluetooth. From SDP discovery, the iPhone SE reports that it supports the Cover Art feature.
I connect and configure the OBEX channel with L2CAP ERTM, and that succeeds.
When I send the OBEX CONNECT request, the iPhone never gives any response, and the L2CAP S-frame responses from the iPhone always indicate failure.
According to Apple Developer documentation, this needs the Apple authentication chip.
Is it still necessary now? Are there any other ways to obtain album pictures?
I need to implement an app that exchanges data with peripherals through the USB-C port on iPhone 15, but I have two questions that I need help with:
Which library is used to communicate with peripherals through the USB-C port of the iPhone?
Do peripherals need to pass MFi authentication before they can communicate with the app?
Hello. I am building a BLE device for activity/fitness and would like a "system-level" BLE connection on watchOS using an ESP32 (I have built a test of this on the firmware side). Meaning, I do not want my iOS app to pass the BLE connection to the watchOS app. App-level connections do not seem to get as many background updates as a system-level connection, and they also require the watchOS app to be launched to connect to the BLE device.
The system-level BLE connection (watchOS Settings > Bluetooth > Health Devices) allows auto-connection in the background and gives more reliable background communication between the BLE device and the Apple Watch.
The Apple MFi page only mentions iOS. From the Apple MFi page:
"Who does NOT need to join: Developers and manufacturers of accessories that connect to an Apple device using only Bluetooth Low Energy, Core Bluetooth, or standard Bluetooth profiles supported by iOS."
Does this apply to watchOS as well?
So, if I am making a BLE device that is activity-based and advertises one of the allowable health-device UUIDs, is the system-level BLE connection allowed using any BLE chip, including, say, an ESP32?
I have built test BLE firmware that advertises a health-device UUID, and watchOS sees it as a health device.
Is this fine then? No need for an MFi application, and no need to worry about which BLE chip is used?
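For reference, a Core Bluetooth sketch of an equivalent iOS-side test (my assumption, using the standard Heart Rate service 0x180D) that advertises the same kind of health-device UUID the firmware exposes:

import CoreBluetooth

// Advertise the standard Heart Rate service so watchOS can classify the
// peripheral as a health device; the ESP32 firmware does the equivalent.
final class HeartRatePeripheral: NSObject, CBPeripheralManagerDelegate {
    private var manager: CBPeripheralManager!
    private let heartRateService = CBUUID(string: "180D")   // Heart Rate service
    private let measurementUUID = CBUUID(string: "2A37")    // Heart Rate Measurement

    override init() {
        super.init()
        manager = CBPeripheralManager(delegate: self, queue: nil)
    }

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        guard peripheral.state == .poweredOn else { return }
        let measurement = CBMutableCharacteristic(type: measurementUUID,
                                                  properties: [.notify],
                                                  value: nil,
                                                  permissions: [.readable])
        let service = CBMutableService(type: heartRateService, primary: true)
        service.characteristics = [measurement]
        peripheral.add(service)
        peripheral.startAdvertising([CBAdvertisementDataServiceUUIDsKey: [heartRateService]])
    }
}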
thanks
How to remove Matter accessory connection artefacts? This appears after connecting and then removing a Matter test accessory. Please see attached screenshot:
We are developing a Matter device. How can we display our own app under the device information in the Home app?
I have an accessory that has passed MFi authentication (got 0xAA05) and identification (got 0x1D02). But when I try to open the target stream using the iAP2 External Accessory session framework, I always encounter the same error, which looks like:
XPC connection error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.accessories.externalaccessory-server was invalidated from this process." UserInfo={NSDebugDescription=The connection to service named com.apple.accessories.externalaccessory-server was invalidated from this process.}
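For context, the stream is opened in the standard way; a minimal sketch, where the protocol string "com.example.accessory-protocol" is hypothetical and must match an entry under UISupportedExternalAccessoryProtocols in Info.plist:

import ExternalAccessory

// Find the accessory that speaks our protocol and open an EA session on it.
func openSession(forProtocol protocolString: String) -> EASession? {
    let accessories = EAAccessoryManager.shared().connectedAccessories
    guard let accessory = accessories.first(where: { $0.protocolStrings.contains(protocolString) }),
          let session = EASession(accessory: accessory, forProtocol: protocolString) else {
        return nil
    }
    // Streams must be scheduled on a run loop and opened before use.
    session.inputStream?.schedule(in: .current, forMode: .default)
    session.outputStream?.schedule(in: .current, forMode: .default)
    session.inputStream?.open()
    session.outputStream?.open()
    return session
}

let session = openSession(forProtocol: "com.example.accessory-protocol")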
Can anybody tell me what this is related to? And what can I do to get past it quickly? Thanks much in advance.
Hi,
I am implementing my app with Core NFC. I found that the framework presents its own UI; can I implement scanning without the UI? We want NFC scanning to happen in the background, without blocking the UI.
Hey everyone, how’s it going?
I’d like to know if, by enrolling in Apple’s MFi program, I’ll gain access to develop my own tags and my own app to track them using Apple’s Find My network. I also read that there’s an estimated cost of $4 per device. Does that apply to each device produced, or only at the time of registering the device, with no fee for additional units?
On iOS 18.x, when we try to create an EASession we get nil; on iOS 17.x everything works.
We have an app which uses a USB cable to connect external accessories.
The scenario: on a fresh install, connecting to the accessory works fine; the EASession is created and the streams are opened.
When we unplug the USB cable, we close the streams, remove any references to the session and accessory, and remove the accessory delegate.
When we plug it in again, creating the EASession returns nil. Only after restarting the iPhone can we create a new EASession with the appropriate protocol and accessory; every further attempt without restarting the iPhone fails.
The logs from the accessory are as follows:
00:05:51.811000 : onUSBDeviceFound(pDevice=0xffc818)) iPhone USB device already in the device list w/id=1 -> update status now
00:05:51.830000 : setConnectionStatus(status=connected) [devId=1] state updated -> forward
Capabilities indicate HostMode possibility => role switch is triggered:
00:05:52.848000 : updateDIPODeviceConnections() iPhoneUSB w/caps=5 (=CarPlay or HostMode), deviceTag=2 in Device mode -> request role switch
The role switch seems to be successful:
00:05:54.914000 : setSwitching('stable') changed
00:05:54.915000 : updateDIPODeviceConnections() iPhoneUSB w/caps=2, id=1, deviceTag=2 and native transport -> request app launch and call connectUSB
00:05:54.967000 : ConnectiAP2(05ac:12a8, s/n='00008101000160921E90801E', writeFD='/dev/ffs/ep3', readFD='/dev/ffs/ep4', hostMode){3}
Native transport should become available but does not (the following line is not present in the failed case; taken from a successful run):
00:05:24.983000 : OnDBusPropChanged_NativeTransport(): deviceId=2, started=1, iAP2iOSAppIdentifier=1, sinkEndpoint=3, sourceEndpoint=4, TransactionID=1
The EAP Start event is not received either (trace line from a successful run):
00:05:25.057000 : EAPSessionStart(ctx=0x74e0b800){2} called
Is there any breaking change in iOS 18 regarding EASession?
Also, what is strange is that it works on a fresh install or after restarting the iPhone, but not on the second attempt.
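For reference, a minimal sketch of the teardown we do on unplug, assuming the streams were scheduled on the main run loop:

import ExternalAccessory

// Close, unschedule, and detach both streams before dropping the session.
func closeSession(_ session: EASession) {
    let streams: [Stream?] = [session.inputStream, session.outputStream]
    for case let stream? in streams {
        stream.close()
        stream.remove(from: .main, forMode: .default)
        stream.delegate = nil
    }
}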
I have a 4-input, 4-output hardware device and an 8-input, 8-output virtual device, which I combine into an aggregate device. I am using the SimplyCoreAudio library to get the channel count. The code is as follows:
aggregationDevice!.channels(scope: .input)  // => 12
aggregationDevice!.channels(scope: .output) // => 12
When the program's MicrophoneMode is set to standard, the channel count is correct. However, when I set the MicrophoneMode to voiceIsolation, the channel count is incorrect:
aggregationDevice!.channels(scope: .input)  // => 4
aggregationDevice!.channels(scope: .output) // => 12
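As a cross-check, here is a sketch (names illustrative, not from my project) that asks the HAL directly for the input stream configuration, which is roughly what such channel-count helpers do under the hood:

import CoreAudio

// Total input channel count the HAL reports for a device.
func inputChannelCount(of deviceID: AudioDeviceID) -> Int {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyStreamConfiguration,
        mScope: kAudioObjectPropertyScopeInput,
        mElement: kAudioObjectPropertyElementMain)
    var size: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(deviceID, &address, 0, nil, &size) == noErr else { return 0 }
    let raw = UnsafeMutableRawPointer.allocate(byteCount: Int(size),
                                               alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { raw.deallocate() }
    guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, raw) == noErr else { return 0 }
    let buffers = UnsafeMutableAudioBufferListPointer(raw.assumingMemoryBound(to: AudioBufferList.self))
    return buffers.reduce(0) { $0 + Int($1.mNumberChannels) }
}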
Below is the code for creating the aggregate device:
import CoreAudio
import SimplyCoreAudio

func createAggregateDevice(mainDevice: AudioDevice,
                           secondDevice: AudioDevice?,
                           named name: String,
                           uid: String) -> AudioDevice? {
    guard let mainDeviceUID = mainDevice.uid else { return nil }

    // Sub-device list; drift compensation enabled for every sub-device.
    var deviceList: [[String: Any]] = [
        [
            kAudioSubDeviceUIDKey: mainDeviceUID,
            kAudioSubDeviceDriftCompensationKey: 1
        ]
    ]

    // Make sure the same device isn't added twice.
    if let secondDeviceUID = secondDevice?.uid, secondDeviceUID != mainDeviceUID {
        deviceList.append([
            kAudioSubDeviceUIDKey: secondDeviceUID,
            kAudioSubDeviceDriftCompensationKey: 1,
            kAudioSubDeviceInputChannelsKey: 8
        ])
    }

    let desc: [String: Any] = [
        kAudioAggregateDeviceNameKey: name,
        kAudioAggregateDeviceUIDKey: uid,
        kAudioAggregateDeviceSubDeviceListKey: deviceList,
        kAudioAggregateDeviceMainSubDeviceKey: mainDeviceUID,
        kAudioAggregateDeviceIsPrivateKey: false
    ]

    var deviceID: AudioDeviceID = 0
    let error = AudioHardwareCreateAggregateDevice(desc as CFDictionary, &deviceID)
    guard error == noErr else { return nil }

    return AudioDevice.lookup(by: deviceID)
}
I hope someone can tell me the reason. Thank you!
Dear Sir,
I have some questions about IC firmware development for Find My.
Is there any rule requiring that the item must include a dual-bank feature in the IC?
I am using the Nordic nRF Connect SDK. Does Apple have its own SDK? If yes, can I download it to use?
For Find My, does Apple have a service UUID in the Bluetooth IC?
Thank you.
Best regards,
Sam Ng
I am developing the electronic part of a product that includes the Find My feature. I saw on some forums that testing the Find My feature needs a CSR and a testing token. Can anyone show me step by step how to apply for the CSR and testing token?
Thank you very much.
Best regards
Sam Ng
We are integrating the Microsoft Web Chat Control inside a WKWebView in our iOS application. The microphone button (used for speech input) works as expected when the app is active. However, we are facing an issue when the app is sent to the background and then brought back to the foreground.
Issue Details:
When the app returns from background to foreground:
🔹 The mic button becomes unresponsive (taps are not recognized).
🔹 No permission prompt or speech functionality is triggered.
🔹 Other elements of the Web Chat control continue to work fine.
This issue seems isolated to iOS and WKWebView usage.
We have verified that microphone permissions are granted and there are no system-level blocks.
Environment:
Platform: iOS
Web Container: WKWebView
Microsoft Web Chat Version: Devanshu Kinariwala add version here
iOS Version: iOS 18.3.1
Devices: iPhone 13, iPhone 14 Pro
Errors in the browser console:
[Error] A MediaStreamTrack ended due to a capture failure (x3)
[Error] WebSocket connection to 'wss://directline.botframework.com/v3/directline/conversations/JQ1k0phVogeJ30ZQddBvAQ-in/stream?watermark=-&t=eyJhbGciOiJSUzI1NiIsImtpZCI6ImlmOEs0aFg4R1hXVnZkS3pwdFRFWFJveURTUSIsIng1dCI6ImlmOEs0aFg4R1hXVnZkS3pwdFRFWFJveURTUSIsInR5cCI6IkpXVCJ9.eyJib3QiOiJhaWFhcy1xYS1jb252YWktYm90Iiwic2l0ZSI6InRWcW14cDBQZU9vIiwiY29udiI6IkpRMWswcGhWb2dlSjMwWlFkZEJ2QVEtaW4iLCJuYmYiOjE3NDI5NzE1MTgsImV4cCI6MTc0Mjk3MTU3OCwiaXNzIjoiaHR0cHM6Ly9kaXJlY3RsaW5lLmJvdGZyYW1ld29yay5jb20vIiwiYXVkIjoiaHR0cHM6Ly9kaXJlY3RsaW5lLmJvdGZyYW1ld29yay5jb20vIn0.Mx3MMVP3t9Ex36UW-YARskZLny0iORxc6-B0ewvNp0S-ivUjvOS43kZc0J5HoOgYRkoGaKemo00_JSkzryAbKKoSwqMjahf0VotqTZsJjoIgtyNJFfAYyGVriBHMV_6FfH_YEezDMD5puY6R89eM-atQOw-CfoClwrxn8jgVL5Kn19WdDZvmQwFIArklA7as8bboKcWv4PveEKptM9xCokttaGzv-S5pdbNETMoJzIhLcJDHmEVJ6oJ0TFs5XS7RGMSQlM_gs95TySzVjVL7XV6qEOt_A10lRzmx0PxPIUw_nqllEIbWFy5H7AfsxbKRtM1nLe4lRm1KS7_xw9dSlw' failed: The operation couldn’t be completed. Software caused connection abort
[Error] WebSocket connection to 'wss://eastus2.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=en-US&format=detailed&Ocp-Apim-Subscription-Key=4JuKqIMwLMhgAxfORVDEYfiuTL6Hrbnj3isAeGfs7aks4AOltun6JQQJ99AKACHYHv6XJ3w3AAAYACOGJFYj&X-ConnectionId=A93B097C62F14C55B30A851798609F73' failed: The operation couldn’t be completed. Software caused connection abort
[Error] WebSocket connection to 'wss://eastus2.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=en-US&format=detailed&Ocp-Apim-Subscription-Key=4JuKqIMwLMhgAxfORVDEYfiuTL6Hrbnj3isAeGfs7aks4AOltun6JQQJ99AKACHYHv6XJ3w3AAAYACOGJFYj&X-ConnectionId=E99DF3A6CE734E0294A5FB5296D725CC' failed: The operation couldn’t be completed. Software caused connection abort
[Error] WebSocket connection to 'wss://eastus2.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=en-US&format=detailed&Ocp-Apim-Subscription-Key=4JuKqIMwLMhgAxfORVDEYfiuTL6Hrbnj3isAeGfs7aks4AOltun6JQQJ99AKACHYHv6XJ3w3AAAYACOGJFYj&X-ConnectionId=8B3370005E7A4946BEA174E804F64FF7' failed: The operation couldn’t be completed. Software caused connection abort
Swift:
import AVFoundation

// Method to check microphone permission
func checkMicrophonePermission(completion: @escaping (Bool) -> Void) {
    let authorizationStatus = AVCaptureDevice.authorizationStatus(for: .audio)
    switch authorizationStatus {
    case .authorized:
        // Permission already granted
        completion(true)
    case .notDetermined:
        // Ask the user for microphone access
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            DispatchQueue.main.async {
                completion(granted)
            }
        }
    case .denied, .restricted:
        // Permission denied or restricted
        completion(false)
    @unknown default:
        // Handle future cases conservatively
        completion(false)
    }
}
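For completeness, a sketch of the kind of WKWebView configuration in play (details assumed; the factory function is illustrative): inline media playback is enabled and no user gesture is required, so getUserMedia-based speech capture can start programmatically after returning to the foreground.

import WebKit

// Web view configuration for the chat control (sketch).
func makeChatWebView() -> WKWebView {
    let config = WKWebViewConfiguration()
    config.allowsInlineMediaPlayback = true
    config.mediaTypesRequiringUserActionForPlayback = []
    return WKWebView(frame: .zero, configuration: config)
}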
I'm a device manufacturer, and in the future I'm planning to get my device Matter certified.
If I want my device's data in my cloud for analytics purposes, what is the best way possible?
My research says that the latest approach suggested by Apple is developing a custom mobile app using the HomeKit SDK, subscribing to the device state, and sending it to my cloud.
If I go that route, will it work even though the device was onboarded via the Home app and a HomeKit hub device is also present?
I want to make sure that both paths will be active, device to hub to Home app, and device to custom app to my cloud, both in the Matter ecosystem.
The HomeKit SDK and the Matter support mentioned at https://developer.apple.com/apple-home/matter, are these two the same thing?
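For the subscribe-and-forward part, a minimal sketch of what I have in mind, assuming the accessory is already paired into the home (the forwarding is just a print here):

import HomeKit

// Enable notifications on every characteristic that supports them, and receive
// state changes through HMAccessoryDelegate; updates arrive via the home hub.
final class AccessoryObserver: NSObject, HMAccessoryDelegate {
    func observe(_ accessory: HMAccessory) {
        accessory.delegate = self
        for service in accessory.services {
            for characteristic in service.characteristics
            where characteristic.properties.contains(HMCharacteristicPropertySupportsEventNotification) {
                characteristic.enableNotification(true) { error in
                    if let error { print("enableNotification failed: \(error)") }
                }
            }
        }
    }

    func accessory(_ accessory: HMAccessory,
                   service: HMService,
                   didUpdateValueFor characteristic: HMCharacteristic) {
        // Forward the new value to the analytics backend here.
        print("\(service.name): \(String(describing: characteristic.value))")
    }
}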
I am using NFC. When the phone is near the NFC reader, it at times throws the error below:
2024-07-15 15:43:03.608427+0800 TestNFC[16022:1038141] [xpc.exceptions] <NSXPCConnection: 0x282ba90e0> connection to service with pid 58 named com.apple.nfcd.service.corenfc: Exception caught during decoding of received selector didDetectExternalReaderWithNotification:, dropping incoming message.
Exception: Exception while decoding argument 0 (#2 of invocation):
Exception: decodeObjectForKey: class "NFFieldNotification" not loaded or does not exist
My code:
#import <CoreNFC/CoreNFC.h>

@interface ViewController () <NFCTagReaderSessionDelegate>
@property (strong, nonatomic) NFCTagReaderSession *session;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    if (@available(iOS 13.0, *)) {
        // Create the NFC session and set the NFCTagReaderSessionDelegate
        if (NFCNDEFReaderSession.readingAvailable) {
            self.session = [[NFCTagReaderSession alloc] initWithPollingOption:NFCPollingISO14443 delegate:self queue:nil];
            // Prompt shown in the system NFC sheet
            self.session.alertMessage = @"Ready to scan. Hold your card near the phone.";
            // Start NFC scanning
            [self.session beginSession];
        }
    } else {
    }
}

#pragma mark - NFCTagReaderSessionDelegate

- (void)tagReaderSessionDidBecomeActive:(NFCTagReaderSession *)session API_AVAILABLE(ios(13.0)) {
    NSLog(@"tagReaderSessionDidBecomeActive");
}

// Failure callback; this is also invoked after a successful read ends the session
- (void)tagReaderSession:(NFCTagReaderSession *)session didInvalidateWithError:(NSError *)error API_AVAILABLE(ios(13.0)) {
    NSLog(@"readerSession:didInvalidateWithError: (%@)", [error localizedDescription]);
}

- (void)tagReaderSession:(NFCTagReaderSession *)session didDetectTags:(NSArray<__kindof id<NFCTag>> *)tags API_AVAILABLE(ios(13.0)) {
}

@end
We are currently managing our iPhones and iPads using Microsoft Intune, and I wanted to get your feedback on other enterprise management software that gives the admin the ability to manage more granular functions.
Hi there,
I am using Core NFC and I established the connection with the card, but after sending the command with 'tag.sendCommand()' I receive this message:
-[NFCTagReaderSession _connectTag:error:]:748 Error Domain=NFCError Code=2 "Missing required entitlement" UserInfo={NSLocalizedDescription=Missing required entitlement}.
The version of Xcode I am using is 16.3, and the iPhone is on iOS 18.4.
Here is my entitlements file:
<key>com.apple.developer.nfc.readersession.formats</key>
<array>
    <string>NDEF</string>
    <string>TAG</string>
</array>
And my Info.plist:
<key>NFCReaderUsageDescription</key>
<string>NFC</string>
<key>com.apple.developer.nfc.readersession.iso7816.select-identifiers</key>
<array>
    <string>A000112233445566</string>
</array>
Near Field Communication Tag Reading has been added under Signing & Capabilities.
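For completeness, this is roughly the call path in question (a simplified sketch; the APDU is a SELECT built from the AID in the select-identifiers entry above):

import CoreNFC

// Connect to the detected tag and send a SELECT-by-AID APDU.
func selectApplet(on tag: NFCTag, in session: NFCTagReaderSession) {
    session.connect(to: tag) { error in
        guard error == nil, case let .iso7816(iso7816Tag) = tag else {
            session.invalidate(errorMessage: "Connection failed")
            return
        }
        let aid = Data([0xA0, 0x00, 0x11, 0x22, 0x33, 0x44, 0x55, 0x66])
        // SELECT by AID: CLA=0x00, INS=0xA4, P1=0x04, P2=0x00
        let apdu = NFCISO7816APDU(instructionClass: 0x00, instructionCode: 0xA4,
                                  p1Parameter: 0x04, p2Parameter: 0x00,
                                  data: aid, expectedResponseLength: -1)
        iso7816Tag.sendCommand(apdu: apdu) { data, sw1, sw2, error in
            if let error {
                print("sendCommand failed: \(error)")
            } else {
                print(String(format: "SW=%02X%02X, %d bytes", sw1, sw2, data.count))
            }
        }
    }
}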