Team-scoped keys introduce the ability to restrict your token authentication keys to either the development or production environment. Topic-specific keys, in addition to environment isolation, allow you to associate each key with a specific Bundle ID, streamlining key management.
For detailed instructions on accessing these features, read our updated documentation on establishing a token-based connection to APNs.
Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.
Hello, Apple developers. I have joined the MFi certification program. My accessory uses the USB HID iAP2 communication protocol to connect to iOS devices, but in the first step, when the accessory sends the following data (0xFF, 0x55, 0x02, 0x00, 0xEE, 0x10) to the iOS device, the device does not respond. What could be the reason for this?
I have connected my Apple TV 4K to a Sony Bravia TV using an HDMI cable. Sometimes, all of a sudden, the screen starts blinking and the screen resolution (1080p) overlay is displayed repeatedly. What is happening? I have tried different HDMI cables, with the same result.
Topic: App & System Services
SubTopic: Hardware
I have added an auto-updatable pass to Wallet. As soon as it is added, I receive the registration request. When there is a change available for the pass, I am able to trigger a push notification to the registered device via APNs. I also receive the subsequent call for serial numbers. In the final call made from the device to the /v1/passes endpoint, the server sends the pass with these headers: Content-Type, Content-Disposition, Last-Modified. However, the pass is not getting updated, and there are no logs received on the /v1/log endpoint. From a MacBook, when I hit the same endpoint, I am able to download and open the pass. Is anything else needed in the headers so it works on the device?
Is it necessary to have a change description in the updated pass so that the pass can update on the device?
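For comparison, here are the header values I would expect a /v1/passes response to carry, shown as a Swift dictionary purely for illustration (the Last-Modified date is a placeholder, not a required value):
import Foundation

// Illustrative only: typical response header values for a .pkpass download.
let passResponseHeaders: [String: String] = [
    "Content-Type": "application/vnd.apple.pkpass",
    "Content-Disposition": "attachment; filename=pass.pkpass",
    "Last-Modified": "Tue, 03 Dec 2024 10:00:00 GMT" // placeholder HTTP-date
]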
Topic: App & System Services
SubTopic: Apple Pay
I would like to replace the deprecated function FSMountServerVolumeSync with the newer NetFSMountURLSync.
I can mount my FTP volume properly using "Connect to Server" in the Finder and with FSMountServerVolumeSync (the volume comes up as read-only, but that's OK).
When I try to mount the FTP volume with NetFSMountURLSync, I get this error message:
"The share does not exist on the server. Please check the share name and then try again."
But as far as I know, I can't define a share on an FTP server. How can I fix this issue?
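For reference, here is a minimal, untested sketch of how I am invoking the call from Swift (the server URL is a placeholder, and I'm passing nil for the mount path, credentials, and option dictionaries):
import Foundation
import NetFS

// Minimal sketch, assuming an anonymous FTP server; URL is a placeholder.
let url = URL(string: "ftp://ftp.example.com/")! as CFURL
var mountpoints: Unmanaged<CFArray>?
let status = NetFSMountURLSync(url, nil, nil, nil, nil, nil, &mountpoints)
if status != 0 {
    print("NetFSMountURLSync failed with status \(status)")
} else if let paths = mountpoints?.takeRetainedValue() as? [String] {
    print("Mounted at: \(paths)")
}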
Hello, I've been working to implement PTT in the way recommended by the documentation. The main issue is that the Bluetooth methods are opaque, so I cannot solve for what I need. The result is that I will have to resort to the hacky approaches that the PTT framework seems intended to make unnecessary (playing silent clips, playing custom notification sounds, keeping long-running background audio sessions).
I am testing with an Anker Soundcore Mini as well as AirPods Pro.
Here's the issue: there are two very different behaviours depending on whether I'm using a call/fullDuplex session or a halfDuplex session.
halfDuplex
Anker mini
Current behaviour:
- long press activates Siri
- pressing again after Siri is active starts transmission
- long press activates Siri again
- pressing again after Siri is active stops transmission
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit
- IF I have to use long press, it should at least not trigger Siri
AirPods Pro
Current behaviour:
- long press changes noise cancellation
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit
fullDuplex/call
Anker mini
Current behaviour:
- long press activates Siri
- pressing again after Siri is active starts transmission
- long press activates Siri again
- pressing again after Siri is active stops transmission
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit
- IF I have to use long press, it should at least not trigger Siri
AirPods Pro
Current behaviour:
- long press changes noise cancellation
- pause/play maps to mute/unmute (even if media is playing)
Expected behaviour:
- this makes sense for call behaviour; I wish it worked this well for PTT
The intention here is to be able to fully interact with a channel hands-free. The current API seems to make that impossible. Is that by design? Reading the docs suggests that transmit/stopTransmit is meant to be doable with just the play/pause buttons, but even Apple hardware doesn't seem to support that.
The frequentPushEnablementUpdates asynchronous sequence never fires, even when 'More Frequent Updates' is toggled ON or OFF.
for await frequentPushEnabled in ActivityAuthorizationInfo().frequentPushEnablementUpdates {
    // never called
}
We are, however, able to read the 'More Frequent Updates' value once with the following:
var isEnabled = ActivityAuthorizationInfo().frequentPushesEnabled // true if ON, false if OFF
This only gives the result once, as it is not an asynchronous observation sequence.
But the frequentPushEnablementUpdates async sequence never fires. Per the documentation, frequentPushEnablementUpdates is an asynchronous sequence you use to observe whether a person permitted you to update Live Activities with frequent ActivityKit push notifications.
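For completeness, here is a minimal sketch of how we keep the observation running: the loop lives inside a long-lived Task stored on an object that outlives the current scope (the task name here is illustrative):
import ActivityKit

// Illustrative sketch: store the task somewhere that lives for the app's
// lifetime so the async iteration keeps running.
let frequentPushObservation = Task {
    for await enabled in ActivityAuthorizationInfo().frequentPushEnablementUpdates {
        print("frequentPushesEnabled is now \(enabled)")
    }
}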
I've created a font family, but Font Book refuses to include it in the English language set, despite my best efforts.
The font has every glyph in the OpenType "Std" set, plus several others.
I've checked various boxes for Latin 1 and Macintosh Character Codepages; plus Unicode ranges for Basic Latin, additional Latin, etc, etc.
I've compared it to several other fonts that are in the English set, and I can't see anything that they have that my fonts don't. (In fact, many of them seem to have much less!)
I've created other fonts that are in the English set, but I've no idea what the difference is.
Given that macOS relies on these language sets in order to hide the thousands of unnecessary fonts that are permanently installed in the OS, there ought to be some guidance on how to do this.
I use the write-review query parameter in my App Store URL to bring up the review prompt in the App Store app:
https://apps.apple.com/app/id0123456789?action=write-review
(0123456789 is just an example ID, obviously replace that with your app ID)
This is exactly what is supposed to be done as per the documentation: https://developer.apple.com/documentation/storekit/requesting_app_store_reviews#4312600
However, on macOS it just opens the product page as if I never put the query parameter in the URL. It works fine on iOS 18.2.
I am using macOS 15.2 beta 3 (24C5079e)
Feedback ID: FB15866683
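For context, a minimal sketch of how I open the URL on macOS (the app ID is the same placeholder as above):
import AppKit

// Placeholder app ID, same as the example above.
let reviewURL = URL(string: "https://apps.apple.com/app/id0123456789?action=write-review")!
NSWorkspace.shared.open(reviewURL)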
I need to do coverage testing for the NetworkExtension function code implemented in my system extension, but I don't know how to approach this.
For example, how do you use gtest or XCTest to achieve this?
If you know, please let me know. Thanks.
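One common approach (a sketch only, with hypothetical names): factor the filtering logic out of the system extension target into a regular, testable module, then run XCTest with coverage against that module rather than against the running extension. The FilterRules type below is a made-up stand-in for such logic:
import XCTest

// Hypothetical example type standing in for logic factored out of the
// system extension into a regular, testable module.
struct FilterRules {
    let blockedHosts: Set<String>
    func allows(host: String) -> Bool { !blockedHosts.contains(host) }
}

final class FilterRulesTests: XCTestCase {
    func testBlockedHostIsRejected() {
        let rules = FilterRules(blockedHosts: ["blocked.example"])
        XCTAssertFalse(rules.allows(host: "blocked.example"))
        XCTAssertTrue(rules.allows(host: "apple.com"))
    }
}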
Is it possible to reset SwiftData to a state identical to that of a newly installed app?
I have experienced some migration issues where, when I add a new model, I need to reinstall the entire application for the ModelContainer creation to work.
Deleting all existing models does not seem to make any difference.
A potential solution I currently have, which appears to work but feels quite hacky, is as follows:
let _ = try! ModelContainer()
modelContainer = try! ModelContainer(for: Student.self, ...)
This seems to force out this error: CoreData: error: Error: Persistent History (66) has to be truncated due to the following entities being removed: (...), which seems to reset SwiftData.
Any other suggestions?
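Another hedged option I've considered (the file names are assumptions and worth verifying for your store configuration): delete the default SwiftData store files from Application Support before creating the container, which should put SwiftData back to a fresh-install state:
import Foundation

// Hedged sketch: remove the default store files (names assumed) before
// building the ModelContainer again.
func wipeDefaultStore() throws {
    let support = URL.applicationSupportDirectory
    for name in ["default.store", "default.store-shm", "default.store-wal"] {
        let fileURL = support.appending(path: name)
        if FileManager.default.fileExists(atPath: fileURL.path) {
            try FileManager.default.removeItem(at: fileURL)
        }
    }
}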
On iPhone with iOS 18.1 and Xcode 16.1, we use Matter and MatterSupport to pair Matter devices. We found that Matter-over-Wi-Fi devices could not trigger the PASE process.
try deviceController?.setupCommissioningSession(with: payload, newNodeID: commissioningDeviceID)
After some time:
func controller(_: MTRDeviceController, statusUpdate status: MTRCommissioningStatus)
returns a failure.
https://developer.apple.com/documentation/applemapsserverapi/creating-and-using-tokens-with-maps-server-api
This doesn't really say what to do with the .p8 private key. I know I am a noob, but I really don't understand the Apple docs at all. There is no example or anything like that.
const appleMapKit = await fetch("https://maps-api.apple.com/v1/token", {
  headers: {
    Authorization: `Bearer ${server_api_token}`,
  },
});
This is what I did, but because I created the token on the website, server_api_token only lasts 7 days.
So I tried to use a key from Profiles > Keys instead. I have the .p8 file; how can I use it to create the token for the server API?
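For what it's worth, here is a rough sketch of what I understand the flow to be (the key ID, team ID, and claim set are my assumptions, not something the doc spells out): sign a short-lived ES256 JWT with the .p8 contents using CryptoKit, then send that JWT as the Bearer token in the fetch above to get the short-lived maps access token.
import Foundation
import CryptoKit

// Hedged sketch: build and sign an ES256 JWT from the downloaded .p8 key.
// keyID and teamID come from the developer account; claims are assumed.
func makeMapsAuthToken(p8PEM: String, keyID: String, teamID: String) throws -> String {
    func b64url(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }
    let header = ["alg": "ES256", "kid": keyID, "typ": "JWT"]
    let now = Int(Date().timeIntervalSince1970)
    let claims: [String: Any] = ["iss": teamID, "iat": now, "exp": now + 1800]
    let headerPart = b64url(try JSONSerialization.data(withJSONObject: header))
    let claimsPart = b64url(try JSONSerialization.data(withJSONObject: claims))
    let signingInput = Data("\(headerPart).\(claimsPart)".utf8)
    let key = try P256.Signing.PrivateKey(pemRepresentation: p8PEM)
    let signature = try key.signature(for: signingInput)
    return "\(headerPart).\(claimsPart).\(b64url(signature.rawRepresentation))"
}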
Hi folks,
I have trouble reading an NFC tag in the background while my application is running.
On my iPhone 13, I still get the system notification offering to open "donator.cz" in Safari when the tag is scanned.
But I would like to open the app Donator, which is here: https://apps.apple.com/dk/app/don%C3%A1tor/id6473955033
My NFC chip, scanned with "NFC Tools", is ISO 14443-3A (NXP NTAG213).
The tag contains https://donator.cz/<8digits-number>**text
In my application's Info.plist I have:
<key>LSApplicationQueriesSchemes</key>
<array>
    <string>https://donator.cz</string>
</array>
<key>com.apple.developer.nfc.readersession.felica.systemcodes</key>
<array>
    <string>12FC</string>
</array>
<key>com.apple.developer.nfc.readersession.iso7816.select-identifiers</key>
<array>
    <string>D2760000850101</string>
</array>
Apple CDN looks like: https://app-site-association.cdn-apple.com/a/v1/donator.cz
I have read in the Apple documentation https://developer.apple.com/documentation/corenfc/adding-support-for-background-tag-reading
that the application should have a Core NFC reader session in progress.
How do I do that in the AppDelegate or SceneDelegate?
One note: everything works properly when the NFC read is started from a button press.
My NFCHandler looks like:
//
// NFCReader.swift
// Donator
//
// Created by Petr Hracek on 22.11.2024.
//
import Foundation
import UIKit
import CoreNFC
class NFCHandler: NSObject, NFCNDEFReaderSessionDelegate {
    var readerSession: NFCNDEFReaderSession?

    func startBackgroundScanning() {
        guard NFCNDEFReaderSession.readingAvailable else {
            return
        }
        readerSession = NFCNDEFReaderSession(delegate: self, queue: DispatchQueue.main, invalidateAfterFirstRead: true)
        readerSession?.alertMessage = "Prilozte NFC tag" // Czech: "Hold the NFC tag near the phone"
        readerSession?.begin()
    }

    func readTag(session: NFCNDEFReaderSession, tags: [NFCNDEFTag]) {
        if tags.count > 1 {
            // Restart polling in 500 ms.
            let retryInterval = DispatchTimeInterval.milliseconds(500)
            session.alertMessage = "More than 1 tag is detected, please remove all tags and try again."
            DispatchQueue.global().asyncAfter(deadline: .now() + retryInterval, execute: {
                session.restartPolling()
            })
            return
        }

        // Connect to the found tag and perform NDEF message reading.
        let tag = tags.first!
        session.connect(to: tag, completionHandler: { (error: Error?) in
            if nil != error {
                session.alertMessage = "Unable to connect to tag."
                session.invalidate()
                return
            }
            tag.queryNDEFStatus(completionHandler: { (ndefStatus: NFCNDEFStatus, capacity: Int, error: Error?) in
                if .notSupported == ndefStatus {
                    session.alertMessage = "Tag is not NDEF compliant"
                    session.invalidate()
                    return
                } else if nil != error {
                    session.alertMessage = "Unable to query NDEF status of tag"
                    session.invalidate()
                    return
                }
                tag.readNDEF(completionHandler: { (message: NFCNDEFMessage?, error: Error?) in
                    var statusMessage: String
                    if nil != error || nil == message {
                        statusMessage = "Fail to read NDEF from tag"
                    } else {
                        statusMessage = "Found 1 NDEF message"
                        DispatchQueue.main.async {
                            // Process detected NFCNDEFMessage objects.
                            debugPrint(message!)
                            //self.tableView.reloadData()
                        }
                    }
                    session.alertMessage = statusMessage
                    session.invalidate()
                })
            })
        })
    }

    /// - Tag: sessionBecomeActive
    /// Called when the NFC reader session becomes active.
    func readerSessionDidBecomeActive(_ session: NFCNDEFReaderSession) {
        debugPrint("NFC session become active======")
    }

    func readerSessionDidInvalidate(_ session: NFCNDEFReaderSession) {
        debugPrint("NFC session invalidate======")
    }

    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {
        print(error)
    }

    /// Tells the delegate that the session detected NFC tags with NDEF messages.
    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                print("Type name format: \(record.typeNameFormat)")
                print("Payload: \(record.payload)")
                print("Type: \(record.type)")
                print("Identifier: \(record.identifier)")
            }
        }
    }

    /// - Tag: processingNDEFTag
    /// If this method is not implemented, only readerSession(_:didDetectNDEFs:) is called.
    /// Tells the delegate that the session detected NFC tags with NDEF messages and enables read-write capability for the session.
    func readerSession(_ session: NFCNDEFReaderSession, didDetect tags: [NFCNDEFTag]) {
        self.readTag(session: session, tags: tags)
    }
}
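If it helps, here is a minimal sketch of the scene-delegate side, under the assumption that https://donator.cz links are configured as universal links for the app (Associated Domains entitlement plus the AASA file on the domain). Background NFC tag reading then launches the app with a web-browsing user activity rather than through an NFCNDEFReaderSession:
import UIKit

// Hedged sketch: handle the URL delivered by background NFC tag reading,
// assuming the tag's https URL is a universal link claimed by this app.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
        guard userActivity.activityType == NSUserActivityTypeBrowsingWeb,
              let url = userActivity.webpageURL else { return }
        print("Launched from background NFC tag: \(url)")
        // Route to the appropriate screen for the scanned tag here.
    }
}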
Dear Apple engineers,
Is it possible to make a File Provider cloud volume mount path independent of the user's home folder, in order to have a constant path across desktop clients when files are referenced or placed by applications like Adobe Creative Cloud?
Ideally, the File Provider cloud volume would be mounted or linked under /Volumes.
Thanks,
We are experiencing an infrequent issue with the handoff between our Siri intent, our iOS app, and our CarPlay extension. Siri correctly understands the request, and the
handler(for intent: INIntent)
method is called. In the final step, we respond using:
INStartCallIntentResponse(code: .continueInApp, userActivity: userActivity)
with an instance of NSUserActivity initialized as:
NSUserActivity(activityType: "our.unique.StartCallIntent")
This "our.unique.StartCallIntent" type is included in the app’s NSUserActivityTypes attribute within the Info.plist.
The callback is handled in the main view of the app through:
view.onContinueUserActivity("our.unique.StartCallIntent", perform: handleSiriIntent)
Additionally, we handle the callback in the CarPlay extension using:
func scene(_: UIScene, continue userActivity: NSUserActivity)
This is necessary because when Siri is invoked while CarPlay is active, the CarPlay extension should receive the callback.
Most of the time, both callbacks are triggered as expected. However, on rare occasions, the handoff fails, and neither onContinueUserActivity nor scene(_: UIScene, continue userActivity:) receives a callback from the Siri intent.
Is this a known issue? If so, are there any guidelines or best practices for ensuring that our Siri intent handoff consistently triggers the callbacks?
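For reference, a minimal sketch of the handler flow described above (the activity type string is our own, and the userInfo payload key is hypothetical):
import Intents

// Hedged sketch of the intent handler; payload contents are illustrative.
final class StartCallIntentHandler: NSObject, INStartCallIntentHandling {
    func handle(intent: INStartCallIntent,
                completion: @escaping (INStartCallIntentResponse) -> Void) {
        let activity = NSUserActivity(activityType: "our.unique.StartCallIntent")
        activity.userInfo = ["contact": "example-identifier"] // hypothetical payload
        completion(INStartCallIntentResponse(code: .continueInApp, userActivity: activity))
    }
}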
[Submitted as FB14860454, but posting here since I rarely get responses in Feedback Assistant]
In a simple SwiftData app that adds items to a list, memory usage drastically increases as items are added. After a few hundred items, the UI lags and becomes unusable.
In comparison, a similar app built with CoreData shows only a slight memory increase in the same scenario and does NOT lag, even past 1,000 items.
In the SwiftData version, as each batch is added, memory spikes the same amount…or even increases! In the CoreData version, the increase with each batch gets smaller and smaller, so the memory curve levels off.
My Question
Are there any ways to improve the performance of adding items in SwiftData, or is it just not ready for prime time?
Example Projects
Here are the test projects on GitHub if you want to check it out yourself:
PerfSwiftData
PerfCoreData
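In case it's useful, one hedged mitigation I've been considering (the Item model below stands in for the test project's @Model type): insert on a separate ModelContext with autosave disabled and save in batches, which may keep the memory curve flatter than one large insert on the main context.
import SwiftData

// Hedged sketch; `Item` is a stand-in for whatever @Model type is inserted.
@Model
final class Item {
    var name: String
    init(name: String) { self.name = name }
}

func insertItems(count: Int, into container: ModelContainer) throws {
    let context = ModelContext(container)
    context.autosaveEnabled = false
    for i in 0..<count {
        context.insert(Item(name: "Item \(i)"))
        if i % 500 == 499 {
            try context.save() // flush periodically instead of one giant save
        }
    }
    try context.save()
}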
Does visionOS support MapKit?
Hi,
I'm working on implementing Apple Pay on the Web.
I noticed, both on my own site and on the official Apple Pay on the Web demo page (https://applepaydemo.apple.com/apple-pay-js-api), that when you send a request for a recurring payment, it takes much longer to get a response from the Apple server (even in the onpaymentauthorized method) than when using a regular payment.
You can test this on the page mentioned above. When you authorise a test card with a basic payment it's pretty fast, but when you authorise a test card for a recurring payment (or Deferred or Automatic Reload), "processing payment" takes much longer.
Is there a reason for this, and is there a way to speed it up?
Thank you.
Kind regards,
Zoran
I'm not a dev, but I'm trying to create something.
I'm trying to create an app that includes an iMessage extension AND a sticker pack. On my first attempt, I tried to create an iMessage app, but apparently I can't include a sticker pack. I've tried to create a shell app with a sticker pack and an iMessage extension, but it's just not working.
Can someone please let me know how I can do this? How can I get an iMessage extension and a sticker pack installed at the same time from the same app?
I've tried everything, including creating a separate iOS app with a sticker pack and an iMessage extension, and nothing. A lot of the time I get an error like "CompileAssetCatalogVariant failed with a nonzero exit code". If I remove the sticker pack, it builds successfully.
Thank you in advance.
We want to support Sign in with Apple in our application. Our domain is protected from public access, so we followed the document "Configuring your environment for Sign in with Apple" (Xcode > Window > Developer Documentation) to allow connections to Apple by IP.
When we ping the IPs in the document's list, such as:
17.32.139.128/27
17.32.139.160/27
17.140.126.0/27
17.140.126.32/27
17.179.144.128/27
17.179.144.160/27
17.179.144.192/27
17.179.144.224/27
17.253.0.0/16
we get no response.
What's wrong with it? Hoping for help!
Topic: App & System Services
SubTopic: General