Hi everyone,
I’m having trouble getting my iPhone 11 to detect a DWM3001CDK as an accessory using the Apple Nearby Interaction app. Here’s the background:
Two years ago, I successfully tested UWB ranging between the same devices (iPhone 11 and DWM3001CDK, which is based on the Qorvo DW3110 IC and an nRF52833 SoC with Bluetooth 5.2). At that time, the Nearby Interaction app was in beta and worked well for my tests. Now, with the stable version of the app, I’m encountering an issue.
Here’s what I’ve done so far:
I erased the DWM3001C and flashed it with the Qorvo Nearby Interaction firmware (v3.2.0, "DWM3001CDK-QANI-FreeRTOS_full_QNI_3_0_0.hex") using J-Flash Lite V7.86g on Windows.
With this configuration, I can connect the iPhone 11 to the accessory using the Qorvo NI apps, both in the foreground and background.
However, when I compile and run the project "ImplementingSpatialInteractionsWithThirdPartyAccessories" (available on the Apple Developer website) on my iPhone 11 (running iOS 17.7), the app remains stuck on the "Scanning for accessory" screen and doesn’t find the device, even though I’ve given the app permission to use Bluetooth.
Could this be due to an issue with the firmware I flashed on the DWM3001CDK, or might there be something else causing the problem?
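In case it helps narrow things down, here is a minimal CoreBluetooth sketch I put together (the class and names are my own scaffolding, not from the Apple sample) that scans with no service filter and logs every advertisement, so you can check whether the accessory is actually advertising the service UUID the sample's scanner expects:
import CoreBluetooth

// Debugging sketch: log every BLE advertisement so you can verify whether the
// DWM3001CDK advertises the service UUID the Apple sample scans for.
final class ScanDebugger: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // No service filter, so the accessory shows up even if its advertised
        // UUID differs from the one the sample expects.
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        let services = advertisementData[CBAdvertisementDataServiceUUIDsKey] as? [CBUUID] ?? []
        print("Found \(peripheral.name ?? "unnamed"), RSSI \(RSSI), services \(services)")
    }
}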
Any help or insights would be appreciated!
Thanks in advance.
I am writing an app that records accelerometer data in the background. When we call sensor.accelerometerData(from:to:), the for loop iterating the CMSensorDataList suspends as soon as the watch screen turns off and only resumes when the app returns to the foreground. Is there a way to keep the processing running until it completes? As a hack I enabled the Location Updates capability in Background Modes, but acquiring that capability solely for background execution will get my app rejected from the App Store.
Please note that our data comprises large sets, spanning hours. We retrieve it in batches of 30 minutes each, but even these batches do not complete.
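For reference, here is a minimal sketch of the batched retrieval I mean (the Sequence bridge is the usual workaround for iterating CMSensorDataList in Swift); note the iteration itself still runs in-process, so it suspends together with the app:
import CoreMotion

// CMSensorDataList only exposes NSFastEnumeration, so bridge it to Sequence
// before iterating in Swift.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        NSFastEnumerationIterator(self)
    }
}

// Walk an interval in 30-minute windows, reading recorded accelerometer data.
func readRecordedAccelerometer(from start: Date, to end: Date) {
    let recorder = CMSensorRecorder()
    var windowStart = start
    while windowStart < end {
        let windowEnd = min(windowStart.addingTimeInterval(30 * 60), end)
        if let list = recorder.accelerometerData(from: windowStart, to: windowEnd) {
            for sample in list {
                guard let data = sample as? CMRecordedAccelerometerData else { continue }
                // Process data.acceleration / data.startDate here.
                _ = data.acceleration
            }
        }
        windowStart = windowEnd
    }
}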
Hi, as I understand it, there is no API call on iOS to enumerate all possible sources of files (by sources I mean connected SMB shares, iCloud, Google Drive, etc.); the only paths I can get are the app sandbox, the app group container, and their iCloud equivalents. On macOS I can get the list of mount points using getmntinfo(), plus the app/group sandbox and whatever standard locations my sandboxed app has been granted access to. Are there other paths I can get?
I also want to know how to determine the volume of a file a user picks with the file picker. Say they picked 10 files from Google Drive and another 5 from local storage. If I encounter errors on the files from Google Drive, I want to stop working on all 10 of them, but to do that I need to determine that they are on that particular volume. Is there a way to do this programmatically?
Example: Google Drive on iOS: "/private/var/mobile/Containers/Shared/AppGroup/6208BBEE-24BF-4CC9-A9ED-846F987C0442/File Provider Storage/39822865/1P8WD1tWEaq81ZB_DodTTZhXm0p00QaF7/test.txt"
on macOS:
"/Users/username/Library/CloudStorage/GoogleDrive-useremailid/My Drive"
I've implemented GADMobileAds and my Info.plist file includes the following entry:
<key>SKAdNetworkItems</key>
<array>
  ...
  <dict>
    <key>SKAdNetworkIdentifier</key>
    <string>tl55sbb4fm.skadnetwork</string> <!-- Verve Group -->
  </dict>
  ...
</array>
Unfortunately, I still get the following error:
...[Default] <Google> <Google:HTML> 1 required SKAdNetwork identifier(s) missing from Info.plist. Missing network(s): Verve. See [Enable SKAdNetwork to track conversions] (https://googlemobileadssdk.page.link/enable-skadnetwork).
I've tried everything from cleaning the build folder to adding all of Verve Group's other SKAdNetworkIdentifier strings.
What am I missing?
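To rule out a target/Info.plist mismatch or a build-time overwrite, I also dumped the SKAdNetworkItems that actually ship in the built app with a snippet like this (only standard Bundle APIs are used):
import Foundation

// Print every SKAdNetwork identifier present in the built product's Info.plist.
let items = Bundle.main.object(forInfoDictionaryKey: "SKAdNetworkItems") as? [[String: String]] ?? []
for item in items {
    print(item["SKAdNetworkIdentifier"] ?? "?")
}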
If an app with a Message Filter Extension runs on an iPhone with iOS 18 installed, there is no logging output to the console (using print or NSLog); on all previous OS versions the logging appears.
Being able to view logging at run time for this component is essential as a debugging aid: to see, for example, whether the extension launches, whether a text is handled locally or deferred to the network, whether there is a network error, to examine the server response, etc.
Is there a specific reason this was disabled, or is it accidental?
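For reference, this is the shape of the logging involved, using the unified logging Logger from the os framework (the subsystem string below is made up); on earlier iOS versions these lines appear when filtering on the subsystem in Console.app:
import os

// Unified logging with a custom subsystem, filterable in Console.app.
let logger = Logger(subsystem: "com.example.MessageFilterExt", category: "filter")
logger.info("Extension launched; handling query locally")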
Thank you
Our app has these settings configured in its Info.plist:
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
<key>UIFileSharingEnabled</key>
<true/>
We have an Enterprise app, distributed via MDM, that was created approximately 6 years ago in Xcode 9 or 10.
In iOS 17 the app's Document directory is correctly listed as a folder in the Files app under "On My iPhone".
After updating a device to iOS 18, the app's Document directory is no longer listed under "On My iPhone". However, if you search for the app's name, the search results do show the app's folder.
If I then run the app from Xcode directly on an iOS 18 device, the app's Document directory folder is not listed under "On My iPhone" in the Files app, and searching for it no longer finds it. If I run the app directly on an iOS 17 device, the app's Document directory is, correctly, listed as before.
I created a new test project in Xcode 15 and ran it on an iOS 18 device, and it works as expected, so the problem seems to affect only older projects.
Other developers are reporting exactly the same problem:
https://stackoverflow.com/questions/79025597/ios-apps-document-folder-no-longer-accessible-from-files-app-after-ios-18-updat
On iPhone devices with dynamic islands (e.g. iPhone 15 Pro Max), the proximity sensor takes about 3 seconds to activate, is this normal?
The iPhone 13 responds almost instantly, but the iPhone 15 Pro Max seems to take a while.
import UIKit

// NSObject inheritance is required for @objc / #selector to compile.
class ProximitySensorManager: NSObject {

    // Shared instance for global access (optional)
    static let shared = ProximitySensorManager()

    // Proximity sensor activation flag
    private(set) var isSensorEnabled: Bool = false

    // Start observing proximity sensor changes.
    // Note: the selector below lives on this class, so the manager itself must
    // be registered as the observer (passing an arbitrary external observer,
    // as the original did, would crash unless that object also implemented
    // proximityStateChanged).
    func enableProximitySensor() {
        guard !isSensorEnabled else { return }
        isSensorEnabled = true
        UIDevice.current.isProximityMonitoringEnabled = true
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(proximityStateChanged),
            name: UIDevice.proximityStateDidChangeNotification,
            object: nil
        )
    }

    // Stop observing proximity sensor changes
    func disableProximitySensor() {
        guard isSensorEnabled else { return }
        isSensorEnabled = false
        UIDevice.current.isProximityMonitoringEnabled = false
        NotificationCenter.default.removeObserver(
            self,
            name: UIDevice.proximityStateDidChangeNotification,
            object: nil
        )
    }

    // Proximity sensor state change handler
    @objc private func proximityStateChanged() {
        if UIDevice.current.proximityState {
            print("Proximity sensor detected close object")
            // Additional functionality can be added here (e.g. lowering the screen brightness)
        } else {
            print("Proximity sensor detected no object")
        }
    }

    deinit {
        //disableProximitySensor() // Clean up observer on deinitialization
    }
}
On and off I've been trying to figure out how to do hang detection in-application (at least from the user's point of view). Qualitatively what I'd like to do is have a process which runs sample(1) on the application after it's been unresponsive for more than a second or so. Basically, an in-app replacement for Spin Control. The problem I've been stuck on is: how do I tell?
There used to be Core Graphics SPI (CGSRegisterNotifyProc with a value of kCGSEventNotificationAppIsUnresponsive) for doing this, but it doesn't work anymore (either due to sandboxing or system-wide security changes, I can't tell which but it doesn't matter).
One thought I had was to have an XPC service which would expect to receive a checkin once per second from the host (via a timer set up by the host). If it didn't, it would start sample(1). This seems pretty heavyweight to me, since it means that once per second, I'm going to be consuming cycles to check in with the service. But I haven't been able to come up with a scheme that doesn't include some kind of check-in by the target process.
Are there any APIs or strategies that I could use to accomplish this? Or is there some entitlement which would allow the application to request "application became unresponsive"/"application became responsive" notifications from the window server?
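For concreteness, here is the in-process shape of the check-in scheme I described (my own sketch; an out-of-process helper is still needed to actually run sample(1), since a hung process can't reliably sample itself):
import Foundation

// A watchdog queue pings the main queue once per second; if a ping isn't
// answered within the threshold, the main thread is presumably hung and an
// external helper could be told (e.g. over XPC) to sample this PID.
final class MainThreadWatchdog {
    private let watchdogQueue = DispatchQueue(label: "watchdog", qos: .utility)
    private let threshold: TimeInterval

    init(threshold: TimeInterval = 1.0) { self.threshold = threshold }

    func start() {
        watchdogQueue.async { [weak self] in self?.loop() }
    }

    private func loop() {
        while true {
            let semaphore = DispatchSemaphore(value: 0)
            DispatchQueue.main.async { semaphore.signal() }
            if semaphore.wait(timeout: .now() + threshold) == .timedOut {
                // Main thread unresponsive: notify the helper process here.
                NSLog("Main thread hang detected (> %.1fs)", threshold)
                // Wait for the main thread to come back before pinging again.
                semaphore.wait()
                NSLog("Main thread recovered")
            }
            Thread.sleep(forTimeInterval: 1.0)
        }
    }
}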
I'm trying to migrate from a complication built with CLKComplication to WidgetKit.
I have implemented the required methods from https://developer.apple.com/documentation/widgetkit/converting-a-clockkit-app, but the migration is not working, and there is no evidence that the migration method is ever called.
The behavior is the same with Xcode 14.0.1 and Xcode 14.1 RC.
class ComplicationController: NSObject, CLKComplicationDataSource, CLKComplicationWidgetMigrator {
    ...

    @available(watchOS 9.0, *)
    var widgetMigrator: CLKComplicationWidgetMigrator {
        return self
    }

    @available(watchOS 9.0, *)
    func widgetConfiguration(from complicationDescriptor: CLKComplicationDescriptor) async -> CLKComplicationWidgetMigrationConfiguration? {
        return CLKComplicationStaticWidgetMigrationConfiguration(kind: "MyWidget", extensionBundleIdentifier: "com.example.myapp.mywatchkitapp.mywidget")
    }
}
What's wrong? Has anyone been able to migrate?
I am also having the same issue after updating to iOS 18 on an iPhone 16 Pro Max. Whether wired or wireless, the sound quality is poor and very mono. At some point it corrects itself, then you touch the phone screen and it's back to mono again. It's a big issue when making phone calls. Nothing I try resolves it, though my Bluetooth works fine. Please fix this… so tired of this.
Dear Apple R&D Team, I want to report a bug in CallKit caused by Voicemail in iOS 18.0.
The following are the header files referenced by our code:
#import <CallKit/CXCallObserver.h>
#import <CallKit/CXCall.h>
Our VoIP app uses the callObserver callback function provided by CXCallObserverDelegate to monitor changes in the system phone status (see the attached picture). The problems we encountered are:
Before upgrading to iOS18.0, when we rejected a call, we would receive a callObserver callback, and the status value of hasConnected of CXCall was NO;
After upgrading to iOS18.0, when we also rejected a call, the status value of hasConnected of CXCall was YES.
So we checked the new features of iOS 18.0 and found that the cause is the new Voicemail feature.
On iOS 18.0 devices, rejecting a call enters the Voicemail state by default. That means I rejected the call, yet callObserver told me the call was connected, which is inconsistent with the user's intuitive experience and also causes our app to respond incorrectly (it cannot resume audio automatically).
In fact, Voicemail is enabled by default on devices upgraded to iOS 18.0. After I reject a call, I have to wait for the caller to hang up before callObserver tells me the call has ended. This experience is very bad: as long as the caller has not hung up, our app cannot sense that I hung up and assumes I am still on the call, when in fact I did hang up.
Please refer to the code in the attached picture. According to our understanding, the status should flow like this:
When I receive an incoming call, the callObserver callback will trigger, call.hasConnected is NO and call.hasEnded is NO;
When I reject the call and switch to Voicemail, the callObserver callback will trigger, call.hasConnected should be NO and call.hasEnded should be YES;
When I manually restore the call through Voicemail, the callObserver callback will trigger, call.hasConnected should be YES and call.hasEnded should be NO;
When the caller or I finally hang up the call, the callObserver callback will trigger, call.hasConnected is YES, call.hasEnded is YES;
However, the current status is as follows:
When I receive an incoming call, the callObserver callback will trigger, call.hasConnected is NO and call.hasEnded is NO;
When I reject the call and switch to Voicemail, the callObserver callback will trigger, call.hasConnected is YES and call.hasEnded is NO;
When I manually restore the call through Voicemail, the callObserver callback will not trigger;
Only when the caller finally hangs up, the callObserver callback will trigger, call.hasConnected is YES and call.hasEnded is YES;
Our request is very simple. We need to pause audio when a call comes in and resume audio when the call is hung up or rejected. This is our sample code:
- (void)callObserver:(CXCallObserver *)callObserver callChanged:(CXCall *)call API_AVAILABLE(ios(10.0))
{
    bool pause_audio = false;
    BOOL calling      = !call.onHold && !call.hasConnected && !call.hasEnded;
    BOOL disconnected = !call.onHold && !call.hasConnected &&  call.hasEnded;
    BOOL connected    = !call.onHold &&  call.hasConnected && !call.hasEnded;
    BOOL hang_up      = !call.onHold &&  call.hasConnected &&  call.hasEnded;
    if (calling) {
        xc_log(XC_LOG_INFO, "A phone call is %s", call.outgoing ? "outgoing" : "incoming");
        pause_audio = true;
    } else if (disconnected) {
        xc_log(XC_LOG_INFO, "An %s phone call has been disconnected", call.outgoing ? "outgoing" : "incoming");
        // action of phone call end, resume audio
    } else if (connected) {
        xc_log(XC_LOG_INFO, "An %s phone call has just been connected", call.outgoing ? "outgoing" : "incoming");
        pause_audio = true;
    } else if (hang_up) {
        xc_log(XC_LOG_INFO, "An %s phone call has been hung up", call.outgoing ? "outgoing" : "incoming");
        // action of phone call end, resume audio
    } else {
        xc_log(XC_LOG_INFO, "Unknown telephony state occurred");
    }
    if (pause_audio) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // action of phone call begin, pause audio
        });
    }
}
I would appreciate any help!
Hi,
I am trying to determine whether the Mac that is running my app has an active screen sharing session. Is there a way to detect this, potentially using system APIs or a system command?
Any help would be greatly appreciated, thank you!
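For context, the best I've come up with so far is a heuristic, not a supported API: checking whether Apple's Screen Sharing server process (screensharingd) is currently running. A sketch (this assumes a non-sandboxed app, and it would miss third-party VNC servers):
import Foundation

// Heuristic: pgrep exits 0 if a process with the given name exists.
func isScreenSharingSessionActive() -> Bool {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/bin/pgrep")
    task.arguments = ["-x", "screensharingd"]
    task.standardOutput = Pipe()
    do {
        try task.run()
        task.waitUntilExit()
        return task.terminationStatus == 0
    } catch {
        return false
    }
}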
Hello,
I'm trying to add a working shortcut to my app that opens the Privacy & Security page in System Settings, at the Security section where the prompt to allow a system extension appears.
Opening x-apple.systempreferences:com.apple.settings.PrivacySecurity.extension from the Terminal only opens the top-level Privacy & Security page.
I want to emulate the button from that system prompt.
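For reference, this is how I invoke the same URL from code (NSWorkspace is the standard way to open a settings URL; it still lands on the top-level pane):
import AppKit

// Open the Privacy & Security pane via its settings URL scheme.
if let url = URL(string: "x-apple.systempreferences:com.apple.settings.PrivacySecurity.extension") {
    NSWorkspace.shared.open(url)
}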
Is it possible to write an iOS app or app extension to block an incoming call based on the area code (or some other portion of the incoming call number)?
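To make the question concrete: my understanding (an assumption, please correct me) is that a Call Directory extension can only block explicit numbers added in ascending order, so covering an area code would mean enumerating every number in it. A sketch of that API shape:
import CallKit

// Call Directory extension sketch. Blocking an entire area code means adding
// every number in it (~10 million entries); the loop below is truncated and
// the area code is hypothetical, purely to illustrate the API.
final class CallDirectoryHandler: CXCallDirectoryProvider {
    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        let base: CXCallDirectoryPhoneNumber = 1_900_000_0000 // hypothetical +1 (900)
        // Entries must be added in ascending numeric order.
        for suffix in 0..<10_000 {
            context.addBlockingEntry(withNextSequentialPhoneNumber: base + CXCallDirectoryPhoneNumber(suffix))
        }
        context.completeRequest()
    }
}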
I'm trying to send an API call to generate the payload for an App Clip Rich Link in Apple Messages. My request has the header and body shown in the description in the image, and the authorization header carries a valid secret and a fresh iat stamp, but the request returns a 403 Forbidden status. Has this happened to anyone? Can someone point me to the problem?
https://register.apple.com/resources/messages/msp-rest-api/construct-payload
I'm trying to display my images in a tableView. I'm using NSFileManager and NSDirectoryEnumerator to get all files in the current folder:
NSString *path = @"/Users/eagle/Documents/avatars";
NSFileManager *fileManager = NSFileManager.defaultManager;
NSDirectoryEnumerator *directoryEnum = [fileManager enumeratorAtPath:path];
NSString *file;
while ((file = [directoryEnum nextObject]))
{
    // ...
}
The problem is that this line
file = [directoryEnum nextObject]
always returns nil. What gives?
I already made sure that this folder has no subfolders and contains only images, so what's the problem here?
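Side note: since the enumerator API swallows errors, a throwing directory listing (sketched in Swift here for brevity) may reveal the underlying reason, such as a sandbox denial for that absolute path:
import Foundation

// Surface the actual error instead of silently getting no entries.
let path = "/Users/eagle/Documents/avatars"
do {
    let contents = try FileManager.default.contentsOfDirectory(atPath: path)
    print("Found \(contents.count) files")
} catch {
    print("Enumeration failed: \(error)")
}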
Hello there!
My problem concerns Universal Links:
My website is https://www.xn--voil-3na.app/ and has a proper apple-app-site-association set, which is validated by all validators and is also well cached on Apple's side (requested with curl https://app-site-association.cdn-apple.com/a/v1/xn--voil-3na.app).
My app is voilà, and:
My provisioning profile allows Associated Domains
I can see the associated domain is well defined as xn--voil-3na.app in Xcode
My AppID and bundle (in the entitlements) are matching the ones on my apple-app-site-association
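For reference, a sketch of the relevant entitlement as I have it (punycode form of the domain):
<key>com.apple.developer.associated-domains</key>
<array>
  <string>applinks:xn--voil-3na.app</string>
</array>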
After working through all the debugging steps in the docs, I'm wondering whether the problem comes from my internationalized domain name.
Really looking forward to some help, as my application is growing (really) fast and I'd like to serve this feature to my users...
Sincerely,
AB from voilà
Hello Apple Developer Community,
We're developing a parental control app using Apple's ScreenTime API and Family Sharing capabilities. We've encountered several persistent issues that are affecting our users' experience. We've found similar reports from other developers, suggesting these might be widespread problems. We're hoping to get some insight or solutions from the community or Apple experts.
Issues we're facing:
Parent apps visible on child's device:
After granting Family Sharing permission on the family picker, sometimes the parent's apps are visible instead of the child's apps.
Related issue: https://forums.developer.apple.com/forums/thread/749672
Inconsistent app visibility in the family picker:
The behaviour of the family picker is unpredictable:
Sometimes, no apps are visible at all, only categories.
Other times, categories are displayed and upon selection, enforcement works correctly.
In some instances, the stream delivering updates to the selection from the app extension doesn't send anything.
Related issue: https://forums.developer.apple.com/forums/thread/729198
Rules not enforced with different OS versions:
When the parent and child devices are running different iOS versions (both above iOS 16), ScreenTime rules don't seem to work correctly.
General inconsistencies:
We've observed various other inconsistencies in the behavior of the ScreenTime API. These issues are less predictable but contribute to an overall unreliable user experience.
Steps to Reproduce:
Add a child device using a parental control app (in our case, Adora for Kids).
Grant the Family Sharing permission.
Open the family picker to set up ScreenTime rules (a minimal sketch of this flow appears after these steps). Observe the inconsistent behaviour:
a. Sometimes no apps are visible.
b. Sometimes only categories are visible.
c. Sometimes both categories and apps are visible.
When categories are visible, select a category and attempt to enforce a rule.
Enforce a ScreenTime rule. Ensure the parent and child devices are running different iOS versions (both iOS 16+).
Test various ScreenTime rules and observe their enforcement across different device configurations.
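For reference, here is a minimal sketch of the picker flow from steps 3–5 (our production code differs; the view name is hypothetical, and this assumes FamilyControls authorization has already been granted via AuthorizationCenter):
import SwiftUI
import FamilyControls
import ManagedSettings

// Present the family picker and apply the selection as a shield.
struct RulePickerView: View {
    @State private var selection = FamilyActivitySelection()
    @State private var isPickerPresented = false
    private let store = ManagedSettingsStore()

    var body: some View {
        Button("Choose apps to limit") { isPickerPresented = true }
            .familyActivityPicker(isPresented: $isPickerPresented, selection: $selection)
            .onChange(of: selection) { newSelection in
                // Enforce the rule: shield the selected apps and categories.
                store.shield.applications =
                    newSelection.applicationTokens.isEmpty ? nil : newSelection.applicationTokens
                store.shield.applicationCategories =
                    newSelection.categoryTokens.isEmpty ? nil : .specific(newSelection.categoryTokens)
            }
    }
}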
Questions:
Are these known issues with the ScreenTime API?
What could be causing the inconsistent behavior in the family picker and the stream of updates from the app extension?
Are there any workarounds or best practices to mitigate these problems?
Is there any additional information we can provide to help investigate these issues?
Are there any plans to improve the stability and consistency of the ScreenTime API in future iOS releases?
We've tried researching these issues through various channels, including Apple's documentation and community forums, but haven't found definitive solutions. Any insights or assistance would be greatly appreciated.
Thank you for your time and help!
Environment:
Development: Xcode 15.4, macOS 14.2.1
Runtime: iOS 16+
App: Adora (App Store ID: 1671825554)
We removed the App Clip feature in the latest version, and at the same time our server was updated to remove the "appclips" content from the "apple-app-site-association" file.
Now, when I use the Camera app to scan the QR code (regardless of whether the app is installed), the following error occurs: "This app clip is not currently available in your country or region".
I checked some information, and it might be due to the Apple CDN cache; or is there something else I need to do?
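For reference, what Apple's CDN is currently serving for a domain can be checked directly (example.com below is a placeholder for our actual domain):
curl https://app-site-association.cdn-apple.com/a/v1/example.com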