We're currently evaluating offering an App Clip experience alongside our full app and I'm particularly interested in understanding how updates to App Clips are rolled out to users.
The documentation, specifically the section on Updating and Deleting App Clip Experiences, provides some insight into how to update an existing App Clip:
To update an already released App Clip, create a new App Clip experience and attach it to a new version of your app. When the new version passes App Review and you publish it on the App Store, it uses the new App Clip experience.
From this, my understanding is that all users, including those who have previously accessed the older version of the App Clip, will automatically receive the updated version upon their next use. Is my interpretation correct? I'm particularly curious about how seamless this transition is for users who might have the previous version cached. Do they encounter any disruptions, or is the switch to the updated version smooth (assuming they have a decent network connection)?
I would greatly appreciate any insights or shared experiences regarding the App Clip distribution process.
I've encountered an issue while reviewing logs from my device and hope someone here can shed some light on it. While diagnosing an application behavior, I noticed that some entries in my logs are marked as <private>, specifically next to bundle IDs, which makes it challenging to understand which app or process is involved.
Here are the relevant log entries:
Feb 21 17:40:53 vCw-2 suggestd(CoreSuggestionsInternals)[30399] <Notice>: SGDSuggestManager: realtimeSuggestionsForMailOrMessageWithHash: com.apple.MobileSMS : <private>
Feb 21 17:40:53 vCw-2 suggestd(CoreSuggestionsInternals)[30399] <Notice>: SGDSuggestManager: realtimeSuggestionsForMailOrMessageWithHash: <private>: results: (null)
Feb 21 17:40:53 vCw-2 suggestd(CoreSuggestionsInternals)[30399] <Notice>: SGDSuggestManager: realtimeSuggestionsForMailOrMessageWithHash: com.apple.MobileSMS : <private>
Feb 21 17:40:53 vCw-2 suggestd(CoreSuggestionsInternals)[30399] <Notice>: SGDSuggestManager: starting dissection.
The identification of this hidden bundle ID is essential for allowing the specific iMessage Business Chat feature to function as intended in our MDM-managed devices.
Does anyone have insights into why the bundle ID might be hidden or how to uncover it? Are there tools or methods available that could help me identify this bundle ID for MDM whitelist configuration purposes?
I appreciate any guidance or recommendations you can provide. Thank you for your time and assistance.
I have a watch companion app that sends several messages to the iPhone, and the phone replies with the data the watch needs. This works perfectly on the simulator and when I deploy directly from Xcode to my phone/watch. However, when deployed to TestFlight, the watch never receives replies from the phone. I've added logging, and WCSession is active, WCSession.isSupported is true, and WCSession.isReachable is true. The watch just doesn't receive a reply from the phone. Is there something I have to put in the plist to make this work in TestFlight? Has anyone had a similar problem?
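For reference, the watch side looks roughly like this (a simplified sketch; the message keys and reply handling are placeholders, not the exact code from my project):

import WatchConnectivity

// Sketch of the watch-side request. "requestKind" and the reply keys are
// illustrative names, not the actual ones used in the app.
func requestDataFromPhone() {
    let session = WCSession.default
    guard session.activationState == .activated, session.isReachable else { return }

    session.sendMessage(["requestKind": "refresh"], replyHandler: { reply in
        // In TestFlight builds this reply handler is never invoked.
        print("Received reply: \(reply)")
    }, errorHandler: { error in
        print("sendMessage failed: \(error)")
    })
}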
Hi,
I am new to Mac development and not sure if this is possible at all, but I would like to test how my app reacts to the NSWorkspaceWillSleepNotification and NSWorkspaceDidWakeNotification events. So I want to set up a googletest that will:
initialize my own stuff
from another thread, send an event to my tested code that will make it "think" the system will go to sleep
perform the "OnSleep" logic
after some time, from another thread, mimic the wake event
detect in the tested code that the wake happened and check that my code works correctly
Additionally, if there are multiple handlers registered for the sleep/wake events, do they all get executed (I guess yes), and in which order (in order of registration)?
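To make the steps above concrete, this is roughly the kind of harness I have in mind (a sketch in Swift rather than the actual googletest/Objective-C++ code; whether posting the workspace notifications manually is a valid way to simulate sleep/wake is exactly what I'm unsure about):

import AppKit

// Sketch: the handlers under test observe the workspace notification center,
// and the test posts the same notification names to fake a sleep/wake cycle.
func simulateSleepWakeCycle() {
    let center = NSWorkspace.shared.notificationCenter

    // Pretend the system is about to sleep, so the "OnSleep" logic runs.
    center.post(name: NSWorkspace.willSleepNotification, object: NSWorkspace.shared)

    // After some time, pretend the system woke up again.
    DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
        center.post(name: NSWorkspace.didWakeNotification, object: NSWorkspace.shared)
    }
}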
Many thanks in advance
Hello Everyone,
We've encountered an issue with our iOS mobile application, particularly affecting users on iOS 17. It appears that our universal deep links coming from redirects have ceased to function properly.
We've got quite a few marketing campaigns that send out promotional emails through an emailing system, which wraps our site's links inside its own. Obviously, tapping these links does not open our app; instead, it directs the user to the mobile website.
Older phones like iPhone X with iOS 16 still work properly.
We have checked the domain association, and when tapping a direct deep link everything works properly. However, the issue arises specifically when users interact with marketing tracking links that redirect to a deep link. In this scenario, both Safari and Chrome fail to redirect outside of the browser environment. Instead, they follow the HTTP 301 redirect within the browser itself.
Is this behaviour expected in iOS 17? Is there a way to avoid it and still continue using marketing links with redirection?
Hi Team,
Please confirm the concern below.
For a mobile application that does not have a website or web page available, can account deletion only be provided in the mobile app?
Is there a mandatory policy to provide user account deletion in the mobile app, or do we need to create a web page just for the account deletion functionality?
I have created a visionOS framework in Xcode to bridge between Unity and visionOS, which is used to call APIs from Unity into visionOS.
The visionOS framework we attached in the Unity assets is not being built along with the Unity visionOS build.
This method works fine for an iOS build, i.e., we created an iOS framework in Xcode to bridge between Unity and iOS, which is used to call APIs from Unity into iOS.
The iOS framework we attached in the Unity assets is built along with the Unity iOS build.
Will the visionOS framework work in Unity the same way it does for iOS? Are there any additional settings needed to set up a framework for visionOS?
Reference link for the setup method used: https://betterprogramming.pub/setting-up-ios-framework-for-unity-9ef4e577db89
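For context, the native side of the bridge is set up roughly like this (a simplified sketch; the function name is just an example, and @_cdecl is the underscored attribute the usual Unity bridging write-ups rely on):

import Foundation

// Sketch of a bridge function the framework exposes so Unity can call it
// via DllImport("__Internal"). The name and return value are illustrative.
@_cdecl("GetPlatformVersion")
public func getPlatformVersion() -> UnsafePointer<CChar>? {
    let version = ProcessInfo.processInfo.operatingSystemVersionString
    // strdup keeps the string alive after this call; Unity copies it on the C# side.
    return UnsafePointer(strdup(version))
}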
If you have associated domains enabled and the autofill feature is used, you get a UIAlertController that asks you to save or update the password.
The problem is that if the user puts the application in the background before tapping either the 'Update Password' or the 'Not Now' button, the keyboard for text fields is no longer shown after switching back to the foreground.
I reproduced this in several apps, so it seems like an iOS bug. This appears to be an old issue as well (see https://stackoverflow.com/a/61165107/9435282).
So I woke up today and saw that a Screen Time limit had been placed on my phone. No one in my family placed the Screen Time limit, and I never set one up when I got my phone. I don't know the Screen Time passcode because I never made one, because I never set up a Screen Time limit. I tried restarting and force restarting, which didn't work; I have no new updates, so that doesn't work either. I don't know what to do. I've watched like 6-hour video essays on this phone, and it's never given me a Screen Time limit or warning, so I know it's a bug.
The iOS and macOS native Calendar applications detect URLs from video conferencing apps (e.g. Webex, Zoom, Microsoft Teams, Google Meet, etc.). Even if these apps are not installed on the iPhone, Calendar detects from the URL that it belongs to a video conferencing/VoIP app and shows a 'Join' button next to the URL along with a video icon (and the application icon if the app is already installed on the phone); tapping 'Join' redirects to the app and joins the meeting. For example, if a Zoom meeting URL is pasted into an iOS or macOS native Calendar event, it is shown as a Zoom meeting with a 'Join' button and a video icon to indicate a video conference.
Does Apple provide an API for this integration? I have an app that also does video conferencing via a URL. What is the way to make my app's URLs recognized as video conferences and show a 'Join' button like the above-mentioned applications?
Does anyone know how to achieve this or has made it work for their application?
Appreciate any help, thanks
Hello,
Our team is leveraging WeatherKit for our product, with requirements to obtain historical, current, and forecasted weather data. Our research indicates that WeatherKit supports all these needs, albeit with a caveat regarding historical data. According to this discussion on the developer forums, historical data should be available post-August 2021.
However, we've encountered issues accessing historical data for certain locations beyond this date. Are these gaps in data availability a known issue within WeatherKit's service? If so, could you provide any insights into when we might expect a resolution?
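For reference, we request the historical range roughly like this (a simplified sketch; the coordinates and dates are placeholders, not the locations where we actually see gaps):

import WeatherKit
import CoreLocation

// Sketch of the historical daily-weather request; location and range are placeholders.
func fetchHistoricalDaily() async throws {
    let location = CLLocation(latitude: 37.3349, longitude: -122.0090)
    let calendar = Calendar.current
    let start = calendar.date(from: DateComponents(year: 2022, month: 1, day: 1))!
    let end = calendar.date(from: DateComponents(year: 2022, month: 1, day: 31))!

    let dailyHistory = try await WeatherService.shared.weather(
        for: location,
        including: .daily(startDate: start, endDate: end)
    )

    // For some locations, days in this range come back missing or empty for us.
    for day in dailyHistory.forecast {
        print(day.date, day.highTemperature, day.lowTemperature)
    }
}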
Thank you for your assistance.
Hello.
My project includes a widget target that provides interactive widget functionalities. The document "Adding Interactivity to Widgets and Live Activities" says the following:
Additionally, note that the perform() function is marked as throws. Be sure to handle errors instead of rethrowing them, and update your app, widget, and Live Activity as needed. For example, update a widget’s interface to indicate that it displays outdated information if it cannot load new data.
https://developer.apple.com/documentation/widgetkit/adding-interactivity-to-widgets-and-live-activities#Implement-the-perform-function, column 3
However, I couldn't find a way to handle an error in an interactive widget. The Button(intent:) and Toggle(intent:) initializers don't have mechanisms for error handling.
Does anyone know a solution for handling errors in interactive widgets?
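The closest I've come is catching the error inside perform() itself, persisting an "outdated" flag that the widget's views can read, and then reloading the timelines, roughly like this (a sketch; the app group suite name and key are placeholders):

import AppIntents
import WidgetKit
import Foundation

// Sketch: handle the error inside perform(), record a flag the widget can
// read to show an "outdated data" state, and ask WidgetKit to re-render.
struct RefreshDataIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh Data"

    func perform() async throws -> some IntentResult {
        let defaults = UserDefaults(suiteName: "group.com.example.widget")
        do {
            try await loadNewData()
            defaults?.set(false, forKey: "isShowingOutdatedData")
        } catch {
            // Don't rethrow; mark the data as outdated so the view can indicate it.
            defaults?.set(true, forKey: "isShowingOutdatedData")
        }
        WidgetCenter.shared.reloadAllTimelines()
        return .result()
    }

    private func loadNewData() async throws {
        // Placeholder for the real refresh work.
    }
}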
I have copied an SQLite file from the Documents directory to the app group container. I found that if I delete the app and reinstall it, the file is automatically created again.
How does that happen? Isn't the app group container just a folder? Can it be synchronized by iCloud?
I cannot find any information about this behaviour. If there is some, please share the link, thanks.
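For reference, the copy is done roughly like this (a sketch; the group identifier and file name are placeholders, not the real ones):

import Foundation

// Sketch of copying the database from Documents into the shared app group container.
func copyDatabaseToAppGroup() throws {
    let fileManager = FileManager.default
    guard let groupURL = fileManager.containerURL(
        forSecurityApplicationGroupIdentifier: "group.com.example.shared") else { return }

    let documentsURL = try fileManager.url(for: .documentDirectory,
                                           in: .userDomainMask,
                                           appropriateFor: nil,
                                           create: false)
    let source = documentsURL.appendingPathComponent("data.sqlite")
    let destination = groupURL.appendingPathComponent("data.sqlite")

    if !fileManager.fileExists(atPath: destination.path) {
        try fileManager.copyItem(at: source, to: destination)
    }
}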
Have an app in the store - 10K users. Using the same algorithm for years to download objects, convert them to ManagedObjects, then save them in a context.
Been using the exact same Objective-C code for over 5 years - no changes.
We built the app with Xcode 15.1 and released it a few weeks ago, then slowly started getting reports that the app won't boot.
Run the app under 15.1, look at memory usage, and it's a straight line up. But the code is littered with autorelease statements. For this download, max memory was 2.3 GB! No wonder so many users are crashing!
[Worked two weekends straight to get this fixed, but why did it happen???]
The last developer told me he added those to reduce memory pressure, and that they worked for him. (Unfortunately no old memory usage graphs).
But look at the attached image - memory usage climbs in a straight line - no sawtooth where memory would get released.
Oh, and this is in one runloop on the main thread (don't blame me, I didn't write the original code!):
I'm honestly a bit lost and looking for general pointers. Here is the general flow of my project. I have an Xcode project where I want to read and convert the temperature values from the Apple SMC, and I found a GitHub repo with all the SMC key sensors for the M3 Pro/Max chips: https://github.com/exelban/stats/issues/1703. Basically, I have all these keys stored in an array in Objective-C like so:
NSArray *smcKeys = @[ @"Tp01", @"Tp05", @"Tp09", @"Tp0D", @"Tp0b", @"Tp0f", @"Tp0j", @"Tp0n",@"Tp0h", @"Tp0L", @"Tp0S", @"Tp0V", @"Tp0z", @"Tp0v", @"Tp17", @"Tp1F", @"Tp1J", @"Tp1p", @"Tp1h", @"Tp1R", ];
I am passing 'smcKeys' into a regular C file that is meant to open, close, and read the data, shown here:
#include "smc.h"
#include <mach/mach.h>
#include <IOKit/IOKitLib.h>
#include "smckeys.h"
io_connect_t conn;
kern_return_t openSMC(void) {
kern_return_t result;
kern_return_t service;
io_iterator_t iterator;
service = IOServiceGetMatchingServices(kIOMainPortDefault, IOServiceMatching("AppleSMC"), &iterator);
if(service == 0) {
printf("error: could not match dictionary");
return 0;
}
result = IOServiceOpen(service, mach_task_self(), 0, &conn);
IOObjectRelease(service);
return 0;
}
kern_return_t closeSMC(void) {
return IOServiceClose(conn);
}
kern_return_t readSMC(char *smcKeys, SMCVal_t *val) {
kern_return_t result;
uint32_t keyCode = *(uint32_t *)smcKeys;
SMCVal_t inputStruct;
SMCVal_t outputStruct;
inputStruct.datasize = sizeof(SMCVal_t);
inputStruct.datatype = 'I' << 24; //a left shift operation. turning the I into an int by shifting the ASCII value 24 bits to the left
inputStruct.data[0] = keyCode;
result = IOConnectCallStructMethod(conn, 5, &inputStruct, sizeof(SMCVal_t), &outputStruct, (size_t*)&inputStruct.datasize);
if (result == kIOReturnSuccess) {
if (val -> datasize > 0) {
if (val -> datatype == ('f' << 24 | 'l' << 16 | 't' << 8 )) { //bit shifting to from 32bit operation associated with the ASCII charecters'f', 'l', and 't', sets datatype field.
double temp = *(double *)val -> data;
return temp;
}
}
}
return 0.0;
}
I am then calling the functions from this file in a Swift file and converting the values to Fahrenheit, but no data is being printed to my console:
import IOKit

public class getTemperature {

    public struct SMCVal_t {
        var datasize: UInt32
        var datatype: UInt32
        var data: (UInt8, UInt8, UInt8, UInt8, UInt8, UInt8, UInt8, UInt8)
    }

    @_silgen_name("openSMC")
    func openSMC() -> kern_return_t

    @_silgen_name("closeSMC")
    func closeSMC() -> kern_return_t

    @_silgen_name("readSMC")
    func readSMC(key: UnsafePointer<CChar>?, val: UnsafeMutablePointer<SMCVal_t>) -> kern_return_t

    func convertAndPrintTempValue(key: UnsafePointer<CChar>?, scale: Character, showTemp: Bool) -> kern_return_t {
        let openSM = openSMC()
        guard openSM == 0 else {
            print("Failed to open SMC: \(openSM)")
            return kern_return_t()
        }

        let closeSM = closeSMC()
        guard closeSM == 0 else {
            print("could not close SMC: \(closeSM)")
            return IOServiceClose(conn)
        }

        func convertAndPrint(val: SMCVal_t) -> Double {
            if val.datatype == (UInt32("f".utf8.first!) << 24 | UInt32("l".utf8.first!) << 16 | UInt32("t".utf8.first!) << 8) {
                let extractedTemp = Double(val.data.0)
                return extractedTemp * 9.0 / 5.0 + 32.0
            }
            return 0.0
        }

        let smcValue = SMCVal_t(datasize: 0, datatype: 0, data: (0, 0, 0, 0, 0, 0, 0, 0))
        let convertedVal = convertAndPrint(val: smcValue)
        print("Temperature: \(convertedVal)°F")
        return kern_return_t()
    }
}
I know this is a lot, but I am honestly looking for any tips to fill in the gaps in my knowledge from anyone who's built a similar application meant to extract any sort of data from Mac hardware.
I want to get spatial videos in HEVC format, but after sharing to the share extension, I found that the video was automatically transcoded to AVC format.
Hello, I wanted to create a shortcut in the Shortcuts app which includes QuickTime's screen recording; however, I only see the Screenshot option (screen photo) there.
Is there any way (maybe using AppleScript) to include screen recording (only a defined area, not the entire screen) in the Shortcuts app, so I can do further edits to the resulting video?
Thank you for your help.
guard let fileURL = intent.attachments?.first?.audioMessageFile?.fileURL else {
    print("Couldn't get fileNameWithExtension from intent.attachments?.first?.audioMessageFile?.fileURL?.lastPathComponent")
    return failureResponse
}

defer {
    fileURL.stopAccessingSecurityScopedResource()
}

let fileURLAccess = fileURL.startAccessingSecurityScopedResource()
print("FileURL: \(fileURLAccess)")

let tempDirectory = FileManager.default.temporaryDirectory
let tempFileURL = tempDirectory.appendingPathComponent(UUID().uuidString + "_" + fileURL.lastPathComponent)

do {
    // Check if the file exists at the provided URL
    guard FileManager.default.fileExists(atPath: fileURL.path) else {
        print("Audio file does not exist at \(fileURL)")
        return failureResponse
    }

    fileURL.stopAccessingSecurityScopedResource()

    // Check if the temporary file already exists and remove it if necessary
    if FileManager.default.fileExists(atPath: tempFileURL.path) {
        try FileManager.default.removeItem(at: tempFileURL)
        print("Removed existing temporary file at \(tempFileURL)")
    }

    // Copy the audio file to the temporary directory
    try FileManager.default.copyItem(at: fileURL, to: tempFileURL)
    print("Successfully copied audio file from \(fileURL) to \(tempFileURL)")

    // Update your response based on the successful upload
    // ...
} catch {
    // Handle any errors that occur during file operations
    print("Error handling audio file: \(error.localizedDescription)")
    return failureResponse
}

guard let audioData = try? Data(contentsOf: tempFileURL), !audioData.isEmpty else {
    print("Couldn't get audioData from intent.attachments?.first?.audioMessageFile?.data")
    return failureResponse
}
Error:
FileURL: false
Audio file does not exist at file:///var/mobile/tmp/SiriMessages/BD57CB69-1E75-4429-8991-095CB90959A9.caf
Is there something I'm missing?
I'm looking to integrate the Google OAuth process into my custom Safari extension, but I haven't been able to find specific documentation on how to do this, similar to what's available for Chrome extensions. After some research and testing, I've tried using both browser.identity.getAuthToken and safari.identity.getAuthToken, but neither seems to be working. I'm wondering if anyone can provide a solution for this issue. The extension works fine in Google Chrome, but not in Safari.
Below is the code I'm currently using for Safari:
browser.identity.getAuthToken({ interactive: true }, function (token) {
    localStorage.setItem("accessToken", token);
});
Is there any documentation available for using the Google OAuth process in a custom Safari extension? I am not able to find anything related to it.
Could someone please inform me of any mistakes I might be making here?
We have an iOS app with a web view.
To develop and debug, we inspect the web view using Safari Dev Tools (Developer menu).
This Works: We are able to inspect the web view when the application is run on physical devices as well as on simulated devices (iPhone and iPad).
This Does NOT Work: We are not able to inspect the web view of the same app running, unmodified, on the Mac. Safari's Developer menu says "No Inspectable Applications" against the Mac.
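For reference, the web view is created roughly like this; we've been wondering whether the isInspectable flag (iOS 16.4 / macOS 13.3 and later) behaves differently when the app runs on the Mac, but that is an assumption on our part:

import WebKit

// Sketch of the web view setup; the URL is a placeholder.
func makeWebView() -> WKWebView {
    let webView = WKWebView(frame: .zero)
    #if DEBUG
    if #available(iOS 16.4, macOS 13.3, *) {
        // Needed for Safari's Develop menu to list the web view on recent OS versions.
        webView.isInspectable = true
    }
    #endif
    webView.load(URLRequest(url: URL(string: "https://example.com")!))
    return webView
}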
Requesting guidance or leads on ways to inspect and debug the app under these conditions. Many thanks!