I am integrating per-app VPN functionality into an iOS app using WireGuard, with Chrome designated as the per-app VPN app. However, when I open Chrome, the VPN icon appears in the status bar but there is no internet connection within the browser.
I have verified the same setup with OpenVPN, and it works correctly. I am familiar with the MDM payload and how to implement per-app VPN; my primary concern is understanding why per-app VPN is not functioning as expected with WireGuard.
An observation we made in the server-side logs is the message: "wireguard: wg0: Packet has incorrect size from peer 1"
Hi,
I have a few questions regarding widgets.
Are widgets and app extensions the same thing? This link (https://developer.apple.com/app-extensions/) says a widget is a type of app extension, but a few other pages on the web say they are different, so I'd like to confirm here. :)
Can a widget share the same bundle ID as the main app? In other words, can we use the same provisioning profile as the main app?
If we use the same bundle ID and provisioning profile, will there be any issues during the App Store submission process?
My application targets iOS 15+. Attempting to build and run it on my iPhone SE (orphaned at iOS 15.6) results in "Failed to prepare the device for development."
I'm building with Xcode 15.2.
What is the expected procedure here?
After updating to iOS 17, tapping a message notification shown on the CarPlay Dashboard navigates to the CarPlay app instead of announcing the message notification.
Announce Notifications turned ON
Announce Messages turned ON
Announce New Messages option is selected
Other apps' message notifications are announced as expected when their notifications are tapped, implying that the settings are configured as required.
Enabled com.apple.developer.carplay-communication
import UIKit
import CarPlay

class CustomCarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) { }

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didDisconnectInterfaceController interfaceController: CPInterfaceController) { }

    func scene(_ scene: UIScene, willContinueUserActivityWithType userActivityType: String) { }
}
In my application, I use CallKit and have supportsHolding = true set. During my phone call, another call comes in (e.g., GSM). I accept the incoming call and put the current call on hold.
If I end the active call myself, everything is fine, and CallKit calls the
method provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession).
However, if the other party ends the call, the second call remains on hold. In the application, the user taps unhold, and I notify CallKit that the hold has ended.
But in this case, the didActivate method is not called at all. If I try to activate the audio session myself after the unhold, I receive the error:
Domain=NSOSStatusErrorDomain Code=561017449 "Session activation failed" UserInfo={NSLocalizedDescription=Session activation failed}
AVAudioSessionErrorInsufficientPriority == NSOSStatusErrorDomain Code: 561017449
What needs to be done for CallKit to activate my audio?
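For reference, this is roughly how the unhold is requested today (a simplified sketch; the stored call UUID and helper name are placeholders):

import CallKit

// Simplified sketch: ask CallKit to take the held call off hold via a transaction,
// expecting provider(_:didActivate:) to fire once the action completes.
let callController = CXCallController()

func resumeHeldCall(uuid: UUID) {
    let unholdAction = CXSetHeldCallAction(call: uuid, onHold: false)
    callController.request(CXTransaction(action: unholdAction)) { error in
        if let error = error {
            print("Unhold request failed: \(error)")
        }
    }
}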
New Apple Developer here,
I've built my first iOS app in Xcode and want to get it onto some friends' iPhones for external testing and feedback. I've read the Apple Developer documentation on using TestFlight, but I find it hard to follow, since step 1 is entering information about the app in App Store Connect and step 2 is uploading the app to App Store Connect?
Can someone please write some easy step-by-step directions for getting an iOS app from Xcode onto external testers' iPhones for a first-timer in 2024?
Thank you so much!
A simple view has misaligned localized content after being converted to an image using ImageRenderer.
This still happens on a real phone and in TestFlight builds.
I'm not sure what the problem is; I'm assuming it's an ImageRenderer bug.
I tried UIGraphicsImageRenderer instead, but it captures the view at an inaccurate position, so the result is offset and ends up with a white border. I also don't know why, in some cases, it runs into circular references that produce blank images.
"(1) days" is also not converted to "1 day" properly.
I recently submitted a new app version to the App Store built with Xcode 15.0. Unfortunately, I have started to see the crash below in the Xcode Organizer > Crashes section, occurring a significant number of times.
UIKitCore: +[UIAlertController _alertControllerContainedInViewController:] + 160
The exception trace does not point to any line of my code. I used UIAlertController in past versions to show alerts, but there is no UIAlertController-related code in the current version. This kind of crash only started to surface with the latest release.
In the latest release we added a third-party SDK, and while integrating it we added the Location and Bluetooth permission keys to the Info.plist file. However, since we don't want to use or track Location and Bluetooth details from the app, the SDK team disabled the Location and Bluetooth settings so they are not reflected in the tracked data.
Could this behaviour be conflicting with UIAlertController and causing the crash? By default the OS tries to show the permission alert when the usage keys exist in the plist, but the alert never appears because the service is disabled in the SDK's server settings. Could this conflict be what is logging the crash?
Any help would be appreciated.
Hey there,
I implemented Live Activities in my app and I'm trying to start a Live Activity from a push notification. Updates work fine even when the app is in the background, but starting the activity is problematic, mostly when the app is in the background or killed. When I check the delivery of the Live Activity push in the CloudKit console, it says it was stored for device power considerations.
Is anyone having the same issue?
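For reference, this is roughly how I obtain the push-to-start token on the device (a sketch; MyActivityAttributes is a placeholder for the real attributes type):

import ActivityKit

struct MyActivityAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var progress: Double
    }
}

// Observe the push-to-start token (iOS 17.2+) that the server uses to start
// the Live Activity remotely via APNs.
func observePushToStartToken() {
    Task {
        for await tokenData in Activity<MyActivityAttributes>.pushToStartTokenUpdates {
            let token = tokenData.map { String(format: "%02x", $0) }.joined()
            print("Push-to-start token: \(token)")
        }
    }
}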
I'm creating an app that can accept PDFs from a share context.
I am using Swift and UIKit, targeting iOS 17.1+.
The logic is:
get the context
see who is sending in (this is always unknown)
see if I can open in place (in case I want to save later)
send the URL off to open the (PDF) document and
load it into PDFKit's pdfView.document
I have no trouble loading PDF docs with the file picker.
And everything works as expected for shares from apps like Messages, email, etc... (in which case URLContexts.first.options.openInPlace == False)
The problem is with opening (sharing) a PDF that is sent from the Files App. (openInPlace == True)
If the PDF is in the app's Documents folder, I need security-scoped resource access to use the URL from the Files app so that I can copy the PDF's data into pdfView.document. I get security-scoped resource access granted each time I receive the Files app's context URL.
But when I call fileCoordinator.coordinate and try to access a file outside of the app's Documents folder using the newUrl, I get an error.
FYI - The newUrl (byAccessor) and context url (readingItemAt) paths are always the same for the Files app URL share context.
I can, however, copy the file to a new location in my apps directory and then open it from there and load in the data. But I really do not want to do that.
. . . . .
Questions:
Am I missing something in my pList, or are there other parameters specific to sharing a file from the Files app?
I'd appreciate it if someone could shed some light on this.
. . . . .
Here are the parts of my code related to this with some print statements...
. . . . .
SceneDelegate
func scene(_ scene: UIScene, openURLContexts URLContexts: Set<UIOpenURLContext>) {
    // nothing to see here, move along
    guard let urlContext = URLContexts.first else {
        print("No URLContext found")
        return
    }
    // let's get the URL (it will be a PDF)
    let url = urlContext.url
    let openInPlace = urlContext.options.openInPlace
    let bundleID = urlContext.options.sourceApplication
    print("Triggered with URL: \(url)")
    print("Can Open In Place?: \(openInPlace)")
    print("For Bundle ID: \(bundleID ?? "None")")
    // get my Root ViewController from window
    if let rootViewController = self.window?.rootViewController {
        // currently using just the view
        if let targetViewController = rootViewController as? ViewController {
            targetViewController.prepareToLoadSharedPDFDocument(at: url)
        }
        // I might use a UINavigationController in the future
        else if let navigationController = rootViewController as? UINavigationController,
                let targetViewController = navigationController.viewControllers.first as? ViewController {
            targetViewController.prepareToLoadSharedPDFDocument(at: url)
        }
    }
}
. . . .
ViewController function
I broke out the if statement for accessingScope just to make it easier for me to debug and play around with the code in the accessingScope == true branch.
func loadPDF(fromUrl url: URL) {
    // If using the File Picker / don't use this
    // If going through a Share.... we pass the URL and have three outcomes (1, 2a, 2b)
    // 1. Security scoped resource access NOT needed if from a Share like Messages or EMail
    // 2. Security scoped resource access granted/needed from 'Files' App
    //    a. success if in the App's doc directory
    //    b. fail if NOT in the App's doc directory
    // Set the security scope variable
    var accessingScope = false
    // Log the URLs for debugging
    print("URL String: \(url.absoluteString)")
    print("URL Path: \(url.path())")
    // Check if the URL requires security scoped resource access
    if url.startAccessingSecurityScopedResource() {
        accessingScope = true
        print("Security scoped resource access granted.")
    } else {
        print("Security scoped resource access denied or not needed.")
    }
    // Stop accessing the scope once everything is completed
    defer {
        if accessingScope {
            url.stopAccessingSecurityScopedResource()
            print("Security scoped resource access stopped.")
        }
    }
    // Make sure the file is still there (it should be in this case)
    guard FileManager.default.fileExists(atPath: url.path) else {
        print("File does not exist at URL: \(url)")
        return
    }
    // Let's see if we can open it in place
    if accessingScope {
        let fileCoordinator = NSFileCoordinator()
        var error: NSError?
        fileCoordinator.coordinate(readingItemAt: url, options: [], error: &error) { (newUrl) in
            DispatchQueue.main.async {
                print(url.path())
                print(newUrl.path())
                if let document = PDFDocument(url: newUrl) {
                    self.pdfView.document = document
                    self.documentFileName = newUrl.deletingPathExtension().lastPathComponent
                    self.fileLoadLocation = newUrl.path()
                    self.updateGUI(pdfLoaded: true)
                    self.setPDFScale(to: self.VM.pdfPageScale, asNewPDF: true)
                } else {
                    print("Could not load PDF directly from url: \(newUrl)")
                }
            }
        }
        if let error = error {
            print("File coordination error: \(error)")
        }
    } else {
        DispatchQueue.main.async {
            if let document = PDFDocument(url: url) {
                self.pdfView.document = document
                self.documentFileName = url.deletingPathExtension().lastPathComponent
                self.fileLoadLocation = url.path()
                self.updateGUI(pdfLoaded: true)
                self.setPDFScale(to: self.VM.pdfPageScale, asNewPDF: true)
            } else {
                print("Could not load PDF from url: \(url)")
            }
        }
    }
}
. . . .
Other relevant pList settings I've added are:
Supports opening documents in place - YES
Document types - PDFs (com.adobe.pdf)
UIDocumentBrowserRecentDocumentContentTypes - com.adobe.pdf
Application supports iTunes file sharing - YES
And iCloud is on for Entitlements, with:
iCloud Container Identifiers
Ubiquity Container Identifiers
. . . .
Thank you in advance!
B
Feedback Ticket: FB13812251
Problem Statement: We are currently facing an internet connectivity issue with our VPN application when we disconnect the VPN from the Packet Tunnel network extension using - (void)cancelTunnelWithError:(nullable NSError *)error. Which API should we use to disconnect the VPN from the packet tunnel extension (while the VPN app is not running) so that the device retains its internet connectivity as soon as the VPN disconnects?
Configuration: We have configured the PacketTunnelProvider with the following settings on the NETunnelProviderManager's protocolConfiguration:
tunnelProvider.protocolConfiguration.includeAllNetworks = YES;
tunnelProvider.protocolConfiguration.excludeLocalNetworks = NO;
tunnelProvider.protocolConfiguration.enforceRoutes = NO;
These settings are applied from the VPN app and allow us to successfully establish a VPN connection, with all traffic being routed through the tunnel as expected. We set these properties to address the local network attack.
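For illustration, this is roughly how those settings are applied from the app side (a Swift sketch with assumed names; the actual code is Objective-C):

import NetworkExtension

// Sketch: load the saved tunnel configuration, set the flags described above,
// and save the preferences before starting the tunnel.
func applyTunnelSettings() {
    NETunnelProviderManager.loadAllFromPreferences { managers, error in
        guard error == nil, let manager = managers?.first,
              let proto = manager.protocolConfiguration as? NETunnelProviderProtocol else {
            return
        }
        proto.includeAllNetworks = true
        proto.excludeLocalNetworks = false
        proto.enforceRoutes = false
        manager.saveToPreferences { saveError in
            if let saveError = saveError {
                print("Failed to save VPN preferences: \(saveError)")
            }
        }
    }
}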
Issue we are facing:
However, we encounter a problem when we attempt to disconnect the VPN from the network extension. When we call the following method from the Packet Tunnel network extension:
(void)cancelTunnelWithError:(nullable NSError *)error
Upon calling this method, the VPN disconnects as expected, but the device loses all internet connectivity and is unable to access any resources. This is not the desired behavior.
Observation: Interestingly, when we call the following method from the app side, the VPN disconnects and the device retains its internet connectivity.
[enabledConfig.connection stopVPNTunnel];
We would like to achieve the same behavior when disconnecting the VPN from the network extension, so we are looking for an API that can be called from the NE without causing any internet connectivity issues.
Any guidance on how to resolve this issue would be greatly appreciated.
Hello,
I’ve run into an issue with a configuration profile on my supervised iPhone. I’m wondering if anyone here might be able to help?
The profile contains the allowListedAppBundleIDs key within the restrictions payload. My Apple Watch is paired with the iPhone. The iPhone was supervised manually with Apple Configurator, hence the Apple Watch has not been directly supervised itself.
The profile works completely as expected when installed on the phone. As soon as the profile is installed on the iPhone, I can witness the apps on the Apple Watch rearrange themselves as some apps are hidden. So clearly the profile is applying its restrictions to the Apple Watch to some degree.
My issue however is that apps listed in the whitelist are hidden from the Watch. The apps that are missing from my Watch are Walkie Talkie, Find My Items, Find My Friends, Messages, Alarm, Remote, Now Playing, Sleep, Meditation and Heart Rate. This is despite the following bundle IDs being listed in the whitelist array: com.apple.findmy.findpeople, com.apple.findmy.finddevices, com.apple.HeartRate, com.apple.SessionTrackerApp, com.apple.NanoWorldClock, com.apple.findmy.finditems, com.apple.Mind, com.apple.NanoOxygenSaturation, com.apple.watchmemojieditor, com.apple.NanoSleep, com.apple.NanoNowPlaying, com.apple.noise, com.apple.tincan, com.apple.NanoRemote, com.apple.NanoAlarm, com.apple.private.NanoTimer, com.apple.NanoStopwatch.
I’ve done some testing, but not sure what I’ve found really. I’ve so far identified 3 scenarios.
Scenario 1: I have the whitelist profile installed on the iPhone. I download an app that appears in the whitelist from my watch (or at least its iPhone version does). The apps show up on the iPhone automatically and can be launched there. These apps cannot be launched on the watch.
Scenario 2: I downloaded a few apps to my watch, that didn’t automatically install on my iPhone at the same time. They were on the whitelist. These ones couldn’t be launched from my Watch. I then downloaded them to the iPhone and they could be launched there (since they were on the whitelist).
Scenario 3: A couple of 3rd party apps on the whitelist could be downloaded and launched from the watch with the whitelist installed.
It seems as though there are different kinds of Apple Watch app and this is what I’ve read elsewhere. First of all there are Watch-only apps, which do not automatically install a companion iPhone app. Secondly there are companion apps, which when installed from the Watch App Store download their companion app to the iPhone in the background. Someone please correct me - I’m bound to be overlooking something here.
So maybe the apps that, when installed from the Watch, automatically install on the iPhone and can only be launched from the iPhone have a separate bundle ID for their Watch app which I haven't included?
Apps that are on the whitelist AND do not automatically install an iPhone app AND can be launched from the Watch, include:
solstice
What3words
So maybe these do not need a companion app, but have the same Bundle ID as their iPhone app?
However, I'm still not sure why many stock Apple Watch apps are missing from the Watch. The most obvious answer is that I've got their bundle IDs wrong, but I don't think I have, given that I extracted the bundle IDs from the App Store pages of the Apple watchOS apps.
I noticed on this Apple Support page (https://support.apple.com/en-gb/guide/deployment/dep34c5cd30f/1/web/1.0) that there is no mention of whitelisting or blacklisting apps on watchOS using MDM, yet something definitely happens on the watch when the configuration profile is installed on the iPhone. Furthermore, if I tap on a configuration profile comprising a blacklist on my iPhone, it asks me whether I want to install it on the iPhone or the Watch. The same pop-up question doesn't appear when the profile contains a whitelist.
All this to say, I'm massively confused as to why I can't get this working. I'd really appreciate any expert advice.
Thank you
Recently, my application had trouble sending UDP messages after it was reinstalled. The initial cause was that I did not grant local network permission when I reinstalled; I was aware of that, and UDP worked fine once I granted the permission. However, the next time I repeated the same steps (again denying local network permission and then turning it back on in Settings), UDP did not work properly: no messages could be sent, even though the system version and code had not changed.
Fortunately, UDP worked again after rebooting the phone, and more importantly, I was able to reproduce the problem many times.
So I want to know: when I reinstall the app, deny local network permission, and then turn it back on in Settings, is the permission actually granted normally, or is it somehow not in effect, requiring a reboot to reset something before UDP works again?
I'm not sure if it's the system, or if it's a situation similar to the one described here; hopefully that will help me find out.
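To illustrate the kind of send involved, here is a minimal sketch using the Network framework (host, port, and payload are placeholders; the actual app code may use a different API):

import Network

// Simplified sketch of the UDP send that fails until the phone is rebooted.
let connection = NWConnection(host: "192.168.1.50", port: 9000, using: .udp)
connection.stateUpdateHandler = { state in
    print("UDP connection state: \(state)")
}
connection.start(queue: .main)
connection.send(content: "hello".data(using: .utf8),
                completion: .contentProcessed { error in
    if let error = error {
        print("Send failed: \(error)")
    }
})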
Hi Guys,
I want to help my clients enable Developer Mode, but they will not accept connecting their devices to anything else (a Mac running Xcode) to enable it.
There are nearly 10 people who need Developer Mode enabled, but I think that on some devices we can't enable Developer Mode without a Mac. So I need clarification regarding iOS versions: all we are expecting is a list of which iOS versions do not show the Developer Mode option by default. Please list those iOS versions,
like below:
Developer Mode option available by default: iOS 17.4.1
Developer Mode option not available by default: iOS 17.5.1
I'm an iOS developer, and I've been testing our app on the iOS 18.0 beta. I noticed a problem with font rendering, and after troubleshooting I found that it's caused by the removal of the PingFang.ttc font in 18.0.
I would like to ask why this font file was removed, and which font should be used to display Chinese in the future?
My test device is an iPhone 11 Pro and the system version is iOS 18.0 (22A5297). I have also tested Beta 1 and it has the same issue.
In previous system versions, the PingFang font is located at /System/Library/Fonts/LanguageSupport/PingFang.ttc. But in iOS 18.0, the font file in this directory has become Kohinoor.ttc, and I've verified that this font can't display Chinese either.
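As a quick sanity check, something like this can be run on-device (a sketch; the font name is the one shipped in earlier iOS releases):

import UIKit
import CoreText

// Sketch: check whether the PingFang face is still registered, and see which
// font the system falls back to for a Chinese string.
let pingFang = UIFont(name: "PingFangSC-Regular", size: 17)
print("PingFangSC-Regular available: \(pingFang != nil)")

let base = CTFontCreateWithName("Helvetica" as CFString, 17, nil)
let fallback = CTFontCreateForString(base, "你好" as CFString, CFRangeMake(0, 2))
print("Fallback font for Chinese: \(CTFontCopyPostScriptName(fallback))")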
I traversed the following system font directories and could not find the PingFang.ttc font file.
/System/Library/Fonts/AppFonts
/System/Library/Fonts/Core
/System/Library/Fonts/CoreAddition
/System/Library/Fonts/CoreUI
/System/Library/Fonts/LanguageSupport
/System/Library/Fonts/UnicodeSupport
/System/Library/Fonts/Watch
Looking for answers, thanks for the help!
A lot of apps use undocumented App-prefs URLs to help users get to the iOS Settings screen needed to set up the app. In iOS 18, it seems like these all stopped working.
Here are the ones I currently use:
App-prefs:MESSAGES - broken in iOS 18
Used for SMS Protection.
App-prefs:Phone - broken in iOS 18
Used for Live Voicemail, Silence Unknown Callers, and SMS Reporting.
Some, but not most, paths have specific documented replacements. E.g., for Call Blocking & Identification you can use CXCallDirectoryManager.sharedInstance.openSettings(), and this still works in iOS 18. But I don't see any other direct replacements.
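For illustration, here are the two approaches side by side (a sketch using the URL string listed above):

import UIKit
import CallKit

// The undocumented deep link that no longer works on iOS 18:
if let url = URL(string: "App-prefs:MESSAGES") {
    UIApplication.shared.open(url)
}

// The documented replacement, which still works but only covers
// Call Blocking & Identification:
CXCallDirectoryManager.sharedInstance.openSettings { error in
    if let error = error {
        print("Could not open settings: \(error)")
    }
}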
Apple probably doesn't consider this a bug but I filed FB14378568 anyway.
I consider this an accessibility issue because many older or inexperienced users, and users with disabilities, have trouble finding the right Settings screen from a textual description alone.
I'm building a SwiftUI app with a UITextView subclass, and it seems that the software keyboard doesn't trigger the pressesBegan or pressesEnded functions of UITextView. With a hardware keyboard, pressesBegan works as expected, allowing us to intercept key presses in our subclass.
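For reference, the subclass looks roughly like this (a simplified sketch; the class name is a placeholder):

import UIKit

// With a hardware keyboard these overrides fire; with the software keyboard they don't.
class KeyInterceptingTextView: UITextView {
    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        if let key = presses.first?.key {
            print("pressesBegan: \(key.charactersIgnoringModifiers)")
        }
        super.pressesBegan(presses, with: event)
    }

    override func pressesEnded(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        print("pressesEnded")
        super.pressesEnded(presses, with: event)
    }
}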
I can't find any documentation about this, or any other forum posts (here or on Stack Overflow) that talk about a discrepancy between software and hardware keyboard behaviors, and I can't believe this is an intended behavior. Our app is a SwiftUI app, in case that's relevant.
Does anyone have any guidance? Is this a bug, or am I not understanding this API? Any information or workarounds would be greatly appreciated.
I've made a sample project that demonstrates this issue, which you can grab from GitHub at https://github.com/nyousefi/KeyPressSample. To see this in action, run the sample project and start pressing keys. The hardware keyboard will print the key press at the top of the screen (above the text view), while the software keyboard won't.
I exported an Apple SF Symbol as a custom symbol for testing.
The code is simple:
var body: some ControlWidgetConfiguration {
    StaticControlConfiguration(
        kind: "ControlWidgetConfiguration"
    ) {
        ControlWidgetButton(action: DynamicWidgetIntent()) {
            Text("test")
            Image("custom_like")
        }
    }
    .displayName("test")
}
As we can see, it can't show the image in the preview, but it can show the image in the Control widget center.
Am I doing something wrong?
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5?
I'm in the process of upgrading my app to make use of the much appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:
content.camera = .worldTracking
However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappeared when I remove the line
GKLocalPlayer.local.authenticateHandler = { viewController, error in
// ... some more code ...
}
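Here's a condensed repro sketch combining the two pieces (the view and entity are placeholders):

import SwiftUI
import RealityKit
import GameKit

// Sketch: enabling world tracking in RealityView while also setting the
// Game Center authenticate handler, which appears to break the tracking.
struct TrackingView: View {
    var body: some View {
        RealityView { content in
            content.camera = .worldTracking
            content.add(ModelEntity(mesh: .generateSphere(radius: 0.05)))
        }
        .onAppear {
            GKLocalPlayer.local.authenticateHandler = { viewController, error in
                // ... some more code ...
            }
        }
    }
}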
So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
With the Live Caller ID example server, the caller lookup dataset is defined in an input.txtpd file and processed by running a ConstructDatabase command, which creates a block.binpb and an identity.binpb file.
In other words, a static input file is being processed into static block and identity files.
However, in the real world, the data for identified and blocked numbers is in a constant state of flux and evolution: new numbers become available, old ones become stale, numbers initially considered safe turn out to be malicious, etc.
Is the example server just that, merely an example using fixed datasets, and is an actual production server able to use live, ever-changing data to formulate its response to the iPhone OS query?
Here's a concrete use case: suppose it's a requirement to permit US NANP numbers but to block anything else. The total set of non-US NANP numbers is so large and ever-changing that it would be unfeasible to attempt to capture them in an input.txtpd file, process that, and then re-capture and re-process it endlessly. Instead, what would be required is the ability for the Live Caller ID server to evaluate at query time, using a regular expression for example, whether a number is NANP or not.
Is this possible?