Touchscreen gestures in CarPlay aren't recognized in my app (the CPMapTemplateDelegate functions aren't called). I also tried adding test functions to the Coastal Roads demo app to check whether the pan callbacks fire, with the same result:
func mapTemplateDidBeginPanGesture(_ mapTemplate: CPMapTemplate) {
    MemoryLogger.shared.appendEvent("Did begin pan gesture.")
}
func mapTemplate(_ mapTemplate: CPMapTemplate, panBeganWith direction: CPMapTemplate.PanDirection) {
    MemoryLogger.shared.appendEvent("Did begin pan gesture with direction \(direction.rawValue).")
}
Note: buttons in the CarPlay app do respond when pressed, and the CarPlay home screen can be panned with a swipe gesture.
Using Xcode 14.3 on a MacBook Pro (M1).
So, I'm trying to create my own text-to-speech setup. The problem I'm having is that whenever I do a test run, the speech is a bit choppy at the start, skipping over a word or a few characters.
A few details:
I've essentially built a separate class for handling the speech events.
The AVSpeechSynthesizer is set up as a private variable of the class, so I don't expect deallocation to be the issue, especially since the problem occurs at the start.
I've also got a queue set up, for what it's worth, so that shouldn't be a problem.
I'd appreciate any advice.
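For reference, a minimal sketch of the kind of setup described (the class name, the audio-session configuration, and the pre-utterance delay are assumptions worth experimenting with, not a confirmed fix):

import AVFoundation

// Hypothetical speech manager similar to the setup described above.
final class SpeechManager: NSObject, AVSpeechSynthesizerDelegate {
    // Keeping the synthesizer as a stored property avoids it being deallocated mid-utterance.
    private let synthesizer = AVSpeechSynthesizer()

    override init() {
        super.init()
        synthesizer.delegate = self
        // Configuring and activating the audio session up front is one thing worth checking
        // when the first words of an utterance get clipped (assumption, not a confirmed cause).
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio)
        try? AVAudioSession.sharedInstance().setActive(true)
    }

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.preUtteranceDelay = 0.1 // a small delay can give the engine time to spin up
        synthesizer.speak(utterance)      // AVSpeechSynthesizer queues utterances internally
    }
}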
Hi,
I'm implementing a BADownloaderExtension in my app for essential assets. I would like to treat the install case differently from the update case; however, whether I "install" or "update" the app (via TestFlight), I always end up getting a BAContentRequest of type .install. I can simulate an update via xcrun, but I can't seem to hit that case in the wild. Is this expected?
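A simplified sketch of the kind of branching in question (the helper functions are placeholders, not part of the framework):

import BackgroundAssets

struct DownloaderExtension: BADownloaderExtension {
    func downloads(for request: BAContentRequest,
                   manifestURL: URL,
                   extensionInfo: BAAppExtensionInfo) -> Set<BADownload> {
        switch request {
        case .install:
            // First install: fetch the full essential asset set.
            return essentialDownloads()
        case .update:
            // App update: fetch only what changed.
            return incrementalDownloads()
        case .periodic:
            return []
        @unknown default:
            return []
        }
    }

    private func essentialDownloads() -> Set<BADownload> { [] }   // placeholder
    private func incrementalDownloads() -> Set<BADownload> { [] } // placeholder
}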
My requirement is to open a specific screen of my app when the user says "Start Sleep meditation for 10 minutes", where "Sleep" and "10 minutes" are dynamic values in the phrase. Is it possible to get these values from the phrase alone, or do I need Siri to ask "which meditation" and then "for how long"? I am planning to use AppIntent and AppShortcut, along with entities, but I'm unable to open the shortcut when Siri is invoked with the phrase above.
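For what it's worth, here is a rough sketch (names and structure are assumptions, not a verified solution). In my understanding, a parameter that is an AppEnum or AppEntity can appear directly in the AppShortcut phrase, while a plain Int parameter such as the minutes is typically resolved by a follow-up prompt from Siri:

import AppIntents

enum MeditationType: String, AppEnum {
    case sleep, focus, calm
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Meditation")
    static var caseDisplayRepresentations: [MeditationType: DisplayRepresentation] = [
        .sleep: "Sleep", .focus: "Focus", .calm: "Calm"
    ]
}

struct StartMeditationIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Meditation"
    static var openAppWhenRun: Bool = true

    @Parameter(title: "Meditation") var meditation: MeditationType
    @Parameter(title: "Minutes") var minutes: Int

    func perform() async throws -> some IntentResult {
        // Navigate to the matching screen here.
        return .result()
    }
}

struct MeditationShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartMeditationIntent(),
            phrases: ["Start \(\.$meditation) meditation in \(.applicationName)"],
            shortTitle: "Start Meditation",
            systemImageName: "moon.zzz"
        )
    }
}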
My iOS app gets access to Calendars on iPhone and iPad (iOS 17), but when running on a Mac (Designed for iPad), the app gets the .notDetermined authorizationStatus after a call to EKEventStore.authorizationStatus(for: .event).
What should I do so that my app gains access to Calendars?
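For reference, a minimal sketch of the iOS 17 request flow (it assumes NSCalendarsFullAccessUsageDescription is present in the Info.plist; whether that is the missing piece in the "Designed for iPad" case is only a guess):

import EventKit

func ensureCalendarAccess(store: EKEventStore) async throws -> Bool {
    switch EKEventStore.authorizationStatus(for: .event) {
    case .fullAccess:
        return true
    case .notDetermined:
        // Prompts the user; returns true when full access was granted.
        return try await store.requestFullAccessToEvents()
    default:
        return false
    }
}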
Hello,
I am implementing an App Intent which shows a confirmation dialog before proceeding with the operation execution.
It works fine when the intent is started from a shortcut, but it always fails when started from Siri: I get the error message shown in the attached screenshot ("An error occurred, try again").
That message appears as soon as the requestConfirmation method is called in the perform method of my App Intent:
try await requestConfirmation(actionName: .do, dialog: "app_intent_sim_confirmation_message") {
    SIMRechargeIntentSummaryView(...)
}
...
How can I solve the problem?
Thanks
I'd like to block the apps selected in FamilyActivityPicker individually when a certain threshold is met.
For example, let's say the threshold is 15 minutes, and I want to block both Photos and Freeform. If I spend 15 minutes on Photos, Photos should be blocked. Then, if I spend 15 minutes on Freeform, Freeform should also be blocked.
Currently, only Photos gets blocked after 15 minutes, but Freeform does not. How can I fix this problem so that each app is blocked individually when its respective 15-minute threshold is met?
Thank you in advance
File 1:
import Foundation
import FamilyControls
import DeviceActivity
import ManagedSettings

class GlobalSelection {
    static let shared = GlobalSelection()
    var selection = FamilyActivitySelection()
    private init() {}
}

extension DeviceActivityName {
    static let daily = Self("daily")
}

@objc(DeviceActivityMonitorModule)
class DeviceActivityMonitorModule: NSObject {
    private let store = ManagedSettingsStore()

    @objc
    func startMonitoring(_ limitInMinutes: Int) {
        let schedule = DeviceActivitySchedule(
            intervalStart: DateComponents(hour: 0, minute: 0),
            intervalEnd: DateComponents(hour: 23, minute: 59),
            repeats: true
        )
        let threshold = DateComponents(minute: limitInMinutes)
        var events: [DeviceActivityEvent.Name: DeviceActivityEvent] = [:]

        // Iterate over each selected application's token
        for token in GlobalSelection.shared.selection.applicationTokens {
            // Create a unique event name for each application
            let eventName = DeviceActivityEvent.Name("dailyLimitEvent_\(token)")
            // Create an event for this specific application
            let event = DeviceActivityEvent(
                applications: [token], // Single app token
                threshold: threshold
            )
            // Add the event to the dictionary
            events[eventName] = event
        }

        // Register the monitor with the activity name and schedule
        do {
            try DeviceActivityCenter().startMonitoring(.daily, during: schedule, events: events)
            print("24/7 monitoring started with a time limit of \(limitInMinutes) min")
        } catch {
            print("Failed to start monitoring: \(error)")
        }
    }

    @objc
    static func requiresMainQueueSetup() -> Bool {
        return true
    }
}
File 2:
import Foundation
import DeviceActivity
import FamilyControls
import ManagedSettings
import UserNotifications

class DeviceActivityMonitorExtension: DeviceActivityMonitor {
    let store = ManagedSettingsStore()
    var blockedApps: Set<ApplicationToken> = []

    func scheduleNotification(with title: String) {
        let center = UNUserNotificationCenter.current()
        center.requestAuthorization(options: [.alert, .sound, .badge]) { granted, error in
            if granted {
                let content = UNMutableNotificationContent()
                content.title = "Notification"
                content.body = title // the custom text passed in is used as the body
                content.sound = UNNotificationSound.default
                let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false) // 5 seconds from now
                let request = UNNotificationRequest(identifier: "MyNotification", content: content, trigger: trigger)
                center.add(request) { error in
                    if let error = error {
                        print("Error scheduling notification: \(error)")
                    }
                }
            } else {
                print("Permission denied. \(error?.localizedDescription ?? "")")
            }
        }
    }

    // Function to retrieve the selected apps from the shared app group
    func retrieveSelectedApps() -> FamilyActivitySelection? {
        if let sharedDefaults = UserDefaults(suiteName: "group.timelimit.com.zerodistract") {
            // Retrieve the encoded data
            if let data = sharedDefaults.data(forKey: "selectedAppsTimeLimit") {
                // Decode the data back into FamilyActivitySelection
                let decoder = JSONDecoder()
                if let selection = try? decoder.decode(FamilyActivitySelection.self, from: data) {
                    return selection
                }
            }
        }
        return nil // Return nil if there was an error
    }

    override func intervalDidStart(for activity: DeviceActivityName) {
        super.intervalDidStart(for: activity)
        scheduleNotification(with: "Interval did start")
        scheduleNotification(with: "\(String(describing: retrieveSelectedApps()))")
    }

    override func intervalDidEnd(for activity: DeviceActivityName) {
        super.intervalDidEnd(for: activity)
    }

    override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name, activity: DeviceActivityName) {
        super.eventDidReachThreshold(event, activity: activity)
        // Notify that the threshold is met
        scheduleNotification(with: "Threshold met")
        // Retrieve the selected apps
        if let selectedApps = retrieveSelectedApps() {
            // Extract the app token identifier from the event name
            let appTokenIdentifier = event.rawValue.replacingOccurrences(of: "dailyLimitEvent_", with: "")
            // Iterate over the selected application tokens
            for appToken in selectedApps.applicationTokens {
                // Convert the app token to a string representation (or use its debugDescription)
                let tokenString = "\(appToken)"
                // Check if the app token matches the token identifier in the event name
                if tokenString == appTokenIdentifier {
                    blockedApps.insert(appToken)
                    // Block only the app associated with this event
                    store.shield.applications = blockedApps
                    scheduleNotification(with: "store.shield.applications = blockedApps is reached")
                    break
                }
            }
        } else {
            scheduleNotification(with: "No stored data for selectedAppsTimeLimit")
        }
    }

    override func intervalWillStartWarning(for activity: DeviceActivityName) {
        super.intervalWillStartWarning(for: activity)
        // Handle the warning before the interval starts.
    }

    override func intervalWillEndWarning(for activity: DeviceActivityName) {
        super.intervalWillEndWarning(for: activity)
        // Handle the warning before the interval ends.
    }

    override func eventWillReachThresholdWarning(_ event: DeviceActivityEvent.Name, activity: DeviceActivityName) {
        super.eventWillReachThresholdWarning(event, activity: activity)
        // Handle the warning before the event reaches its threshold.
    }
}
On iOS Bundle.main.preferredLocalizations returns the list of languages the application bundle supports in user-preferred order with the first element being the language the application is running in.
Additionally Locale.preferredLanguages returns the list of languages in the order they are presented in Preferences.app > General > Language & Region > Preferred Languages with the first element being the user's "primary language" (i.e. the language the system is running in).
However, this only seems to hold as long as the user has not chosen a per-app language that differs from the primary language; in that case Locale.preferredLanguages.first is equal to Bundle.main.preferredLocalizations.first, regardless of the latter's position in the Preferred Languages list.
Furthermore, this seems to change depending on the value of the "AppleLanguages" key in the user defaults' global domain (cf. https://stackoverflow.com/a/42648166).
Is this behaviour documented anywhere?
Addendum: I know that according to https://forums.developer.apple.com/forums/thread/718512?answerId=733680022#733680022
AppleLanguages is an implementation detail, not something that’s considered API.
Locale.preferredLanguages is API, though.
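For anyone trying to reproduce this, a small snippet that logs the values discussed above so they can be compared with and without a per-app language override in Settings:

import Foundation

func dumpLanguagePreferences() {
    print("Bundle.main.preferredLocalizations:", Bundle.main.preferredLocalizations)
    print("Locale.preferredLanguages:", Locale.preferredLanguages)
    // AppleLanguages is read here purely for debugging; as noted above, it is not API.
    print("AppleLanguages (global domain):",
          UserDefaults.standard.object(forKey: "AppleLanguages") ?? "nil")
}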
To resolve this issue, please revise the app preview to only use video screen captures of the app. These may include narration and video or textual overlays for added clarity
I need some assistance with the Screen Time API's DeviceActivityReport extension. I know the extension is sandboxed, but I need the data inside my app. Jomo is currently doing this, so it's not impossible. I see they say it's an estimate that is about 5-10 off of the actual screen time, but how are they doing this?
Any attempt to store the screen time data inside some sort of database or UserDefaults always fails of course due to the sandbox.
Any advice would be greatly appreciated!
I want to add an image to a PKCanvasView, be able to change its position and size by dragging, and draw on top of it with a pencil.
The image must remain movable at any time, and the drawing must stay on top of it.
If I add the image as a subview of the PKCanvasView, I can't achieve this. How can I do this?
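One possible arrangement (an assumption, not necessarily the intended solution) is to keep the image in its own view behind a transparent PKCanvasView, so the drawing always renders on top, and toggle a "move mode" so touches reach the image view while it is being dragged:

import PencilKit
import UIKit

final class DrawingOverImageViewController: UIViewController {
    private let imageView = UIImageView()
    private let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()

        imageView.frame = CGRect(x: 40, y: 120, width: 240, height: 180)
        imageView.isUserInteractionEnabled = true
        imageView.addGestureRecognizer(
            UIPanGestureRecognizer(target: self, action: #selector(dragImage(_:))))
        view.addSubview(imageView)

        canvasView.frame = view.bounds
        canvasView.backgroundColor = .clear
        canvasView.isOpaque = false
        view.addSubview(canvasView) // canvas sits above the image, so strokes stay on top
    }

    /// While move mode is on, the canvas stops intercepting touches and the
    /// pan gesture on the image view can reposition it.
    func setMoveMode(_ enabled: Bool) {
        canvasView.isUserInteractionEnabled = !enabled
    }

    @objc private func dragImage(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: view)
        imageView.center.x += translation.x
        imageView.center.y += translation.y
        gesture.setTranslation(.zero, in: view)
    }
}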
Hello.
Here is my AASA file (my appID changed):
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appIDs": [ "A123B4567C.app.myapp.tool" ],
        "components": [
          { "/": "/en", "exclude": true },
          { "/": "/en/*", "exclude": true },
          { "/": "/workspace/*", "exclude": true },
          { "/": "*" }
        ],
        "paths": [ "NOT /en", "NOT /en/*", "NOT /workspace/*", "*" ]
      }
    ]
  }
}
I need my app to open all links except those with the exclude flag.
When I open 'right' links, my app opens them (that's great).
When I open an excluded link (e.g. https://myapp.app/workspace/personal), Safari opens it (that's great), but then the app is launched as well (which is not expected).
What I already checked:
added the old "paths" property (it wasn't there originally)
rebooted my device
reinstalled my app from the TestFlight then from the AppStore
asked ChatGPT
used AASA validator (branch.io one)
cleared Safari cache
checked my links for redirects
What else can I check? Thanks.
So I have a button on a widget, styled as seen below. I want this button to take up the entire width. The problem is that when it does, either via frame(maxWidth: .infinity) or by increasing the horizontal padding, the button only registers a tap if the user touches near the button's center; otherwise, it opens the app.
Relevant code:
Button(intent: Intent_StartRest()) {
    Text("stop")
}
.buttonStyle(PlainButtonStyle())
.tint(.clear)
.padding(.vertical, 6)
.padding(.horizontal, 100)
.background(RoundedRectangle(cornerRadius: 30).fill(.button))
.foregroundStyle(.buttonText) // Just sets text color
.useAppFont(size: 18, relativeTo: .caption, weight: .bold) // Just sets font
Any pointers?
I am trying to parse a CMSensorDataList in watchOS. The maximum batch of data covers 30 minutes. The sampling rate is 50 Hz, which comes to 90,000 records for 30 minutes. We iterate over each item and finally write the data to a CSV file. Because this processing is slow, and given the execution limits of watchOS, the process is suspended when the app goes to the background, so it takes far too long to parse a significant amount of data. My question is: is there a way to serialize this CMSensorDataList as a whole and transfer it to the phone using WCSession? Or is there another effective way to achieve this?
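One option (a sketch under assumptions; as far as I know CMSensorDataList itself isn't directly serializable) is to write the samples to a CSV file and hand that file to WCSession's file transfer, which the system continues even while the watch app is suspended:

import CoreMotion
import WatchConnectivity

// Community pattern: make CMSensorDataList iterable from Swift.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        NSFastEnumerationIterator(self)
    }
}

// Assumes WCSession is already activated. Building one big string is the simplest
// sketch; a FileHandle written in chunks would be friendlier to memory.
func exportAndTransfer(_ list: CMSensorDataList) throws {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("accelerometer-\(Date().timeIntervalSince1970).csv")

    var csv = "timestamp,x,y,z\n"
    for case let sample as CMRecordedAccelerometerData in list {
        csv += "\(sample.startDate.timeIntervalSince1970),"
        csv += "\(sample.acceleration.x),\(sample.acceleration.y),\(sample.acceleration.z)\n"
    }
    try csv.write(to: url, atomically: true, encoding: .utf8)

    // The transfer is queued and managed by the system in the background.
    let transfer = WCSession.default.transferFile(url, metadata: ["type": "accelerometerCSV"])
    print("Queued transfer: \(transfer.file.fileURL.lastPathComponent)")
}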
A large number of crashes were detected in the background when users were using Object Capture
Crash TXT
Hi,
We have an IPv6 only server setup, where we have put AASA file as required:
https://qa-jen.noknoktest.com/.well-known/apple-app-site-association
But the Apple CDN does not find it:
https://app-site-association.cdn-apple.com/a/v1/qa-jen.noknoktest.com
Is there any restriction on IPv6 only servers?
Everything works with our other IPv4 servers.
Note: With the alternate mode configuration in the application, the AASA file is accessible to devices.
There is no geo restriction or IP filtering on the server.
What is missing to get the CDN to cache the file from the mentioned server?
I'm working with the FamilyControls API and am running into an issue with sharing ActivityTokens between devices in the same family sharing network.
Based on this documentation, ActivityTokens are only accessible and readable by other members in the family sharing network. My app is based on the idea that if one user selects the Games category in the FamilyActivityPicker, then this token can be shared with another device in the same family-sharing network and this other device can read and display the category.
So my question is:
If a user in the network selects an activity category in the FamilyActivityPicker, can this category token be shared, read, and used by another user in the family-sharing network?
Hi, how can I run my shortcut by tapping a button in a widget?
After updating to watchOS 11.1, updates using WidgetCenter.shared.reloadAllTimelines() in WKRefreshBackgroundTask stopped working. When the background task is triggered, it gets data from the phone and updates the WidgetKit complications. But now the refresh call WidgetCenter.shared.reloadAllTimelines() does not update the complications.
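For comparison, a minimal sketch of the background refresh path described above (assuming a WKExtensionDelegate-based setup; names are illustrative):

import WatchKit
import WidgetKit

class ExtensionDelegate: NSObject, WKExtensionDelegate {
    func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
        for task in backgroundTasks {
            switch task {
            case let refreshTask as WKApplicationRefreshBackgroundTask:
                // ... fetch the latest data from the phone here ...
                WidgetCenter.shared.reloadAllTimelines()
                refreshTask.setTaskCompletedWithSnapshot(false)
            default:
                task.setTaskCompletedWithSnapshot(false)
            }
        }
    }
}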
When opening our App Clip from a Live Activity, the iOS system Handoff alert blocks our app on open. It is reproducible 100% of the time. The description in the system alert is: Waiting for Handoff to {My App}. We never had this issue before and believe it is related to iOS 18. I don't have Handoff enabled anywhere in my app.
All uses of NSUserActivity explicitly block handoff
userActivity.isEligibleForHandoff = false
We have been able to locate this same issue in other iOS apps that use Live Activities and App Clips. Is this an iOS 18 system-level bug?