Swift is a powerful and intuitive programming language for Apple platforms and beyond.

Posts under Swift tag

200 Posts

Post · Replies · Boosts · Views · Activity

Can't find server for API endpoint that works
Hi, I am making an AI-powered app that makes API requests to the OpenAI API. For security, however, I set up a Vercel backend that handles the API calls, while my frontend calls my Vercel-hosted HTTPS endpoint. Whenever I try to make that call on my device, an iPhone, I get this error:

Task <91AE4DE0-2845-4348-89B4-D3DD1CF51B65>.<10> finished with error [-1003] Error Domain=NSURLErrorDomain Code=-1003 "A server with the specified hostname could not be found." UserInfo={_kCFStreamErrorCodeKey=-72000, NSUnderlyingError=0x1435783f0 {Error Domain=kCFErrorDomainCFNetwork Code=-1003 "(null)" UserInfo={_kCFStreamErrorDomainKey=10, _kCFStreamErrorCodeKey=-72000, _NSURLErrorNWResolutionReportKey=Resolved 0 endpoints in 3ms using unknown from query, _NSURLErrorNWPathKey=satisfied (Path is satisfied), interface: pdp_ip0[lte], ipv4, ipv6, dns, expensive, uses cell}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask <91AE4DE0-2845-4348-89B4-D3DD1CF51B65>.<10>, _NSURLErrorRelatedURLSessionTaskErrorKey=( "LocalDataTask <91AE4DE0-2845-4348-89B4-D3DD1CF51B65>.<10>" ), NSLocalizedDescription=A server with the specified hostname could not be found., NSErrorFailingURLStringKey=https://[my endpoint], NSErrorFailingURLKey=https://[my endpoint], _kCFStreamErrorDomainKey=10}

I'm completely stuck, because when I make HTTPS requests directly to other APIs, such as OpenAI's endpoint, without the proxy, the server is found just fine. Hitting my endpoint from the terminal with curl also works as intended, as I can see the API key usage. But for some reason it does not work from my project. I've looked through almost every post I could find online, but most of the solutions are outdated and unhelpful. I'm willing to schedule a call or a meeting to resolve this issue and get more in-depth help as well.
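For context, a minimal sketch of the kind of call that fails; the endpoint URL below is a placeholder, since the real one is redacted above:

import Foundation

// Hypothetical endpoint standing in for the redacted Vercel URL.
let url = URL(string: "https://my-proxy.vercel.app/api/chat")!
var request = URLRequest(url: url)
request.httpMethod = "POST"

let task = URLSession.shared.dataTask(with: request) { data, response, error in
    if let error = error as NSError? {
        // -1003 is NSURLErrorCannotFindHost; the underlying CFNetwork error
        // carries the DNS resolution report seen in the log above.
        print("Code:", error.code)
        print("Underlying:", error.userInfo[NSUnderlyingErrorKey] ?? "none")
        return
    }
    print("Status:", (response as? HTTPURLResponse)?.statusCode ?? -1)
}
task.resume()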
1 reply · 0 boosts · 138 views · Jun ’25
Constraining Beacon with CLBeaconIdentityCondition
In reference to this webpage, I'm turning my iPad into an iBeacon device.

class BeaconViewModel: NSObject, ObservableObject, CBPeripheralManagerDelegate {
    private var peripheralManager: CBPeripheralManager?
    private var beaconRegion: CLBeaconRegion?
    private var beaconIdentityConstraint: CLBeaconIdentityConstraint?
    //private var beaconCondition: CLBeaconIdentityCondition?

    override init() {
        super.init()
        if let uuid = UUID(uuidString: "abc") {
            beaconIdentityConstraint = CLBeaconIdentityConstraint(uuid: uuid, major: 123, minor: 456)
            beaconRegion = CLBeaconRegion(beaconIdentityConstraint: beaconIdentityConstraint!, identifier: "com.example.myDeviceRegion")
            peripheralManager = CBPeripheralManager(delegate: self, queue: nil, options: nil)
        }
    }

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        switch peripheral.state {
        case .poweredOn:
            startAdvertise()
        case .poweredOff:
            peripheralManager?.stopAdvertising()
        default:
            break
        }
    }

    func startAdvertise() {
        guard let beaconRegion = beaconRegion else { return }
        let peripheralData = beaconRegion.peripheralData(withMeasuredPower: nil)
        peripheralManager?.startAdvertising(((peripheralData as NSDictionary) as! [String: Any]))
    }

    func stopAdvertise() {
        peripheralManager?.stopAdvertising()
    }
}

On line 10, I'm using CLBeaconIdentityConstraint to constrain the beacon. Xcode says that this class is deprecated and suggests using CLBeaconIdentityCondition. But if I try to use it, Xcode says: Cannot find type 'CLBeaconIdentityCondition' in scope. I've just updated Xcode to 16.4 and I still get the same error. So how do we use CLBeaconIdentityCondition to constrain the beacon? My macOS version is Sequoia 15.5. Thanks.
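A sketch of the replacement API as I understand it (not verified against Xcode 16.4): in Swift the new type is nested as CLMonitor.BeaconIdentityCondition, which would explain why the top-level spelling isn't found. Note that it belongs to the monitoring (observer) side of the API, not to peripheral advertising. The monitor name and UUID below are placeholders.

import CoreLocation

func monitorBeacon() async throws {
    guard let uuid = UUID(uuidString: "abc") else { return } // placeholder UUID
    let monitor = await CLMonitor("BeaconMonitor")           // name is arbitrary
    let condition = CLMonitor.BeaconIdentityCondition(uuid: uuid, major: 123, minor: 456)
    await monitor.add(condition, identifier: "com.example.myDeviceRegion", assuming: .unsatisfied)
    for try await event in await monitor.events {
        print(event.identifier, event.state)
    }
}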
2 replies · 0 boosts · 176 views · Jun ’25
Specific ARKit node not showing in AR
After two types of objects were correctly inserted as nodes in an augmented reality setting, I replicated exactly the same procedure with a third kind of object, which unfortunately refuses to show up. I checked the flow and it is the same as for the other objects, as is the content of the LocationAnnotation, but there is surely something that escapes me. Could someone help with some ideas? This is the common code, apart from the class:

func appendInAR(ghostElement: Ghost) {
    let ghostElementAnnotationLocation = GhostLocationAnnotationNode(ghost: ghostElement)
    ghostElementAnnotationLocation.scaleRelativeToDistance = true
    sceneLocationView.addLocationNodeWithConfirmedLocation(locationNode: ghostElementAnnotationLocation)
    shownGhostsAnnotations.append(ghostElementAnnotationLocation)
}
9 replies · 0 boosts · 134 views · Jun ’25
SCNTechnique clearColor Always Shows sceneBackground When Passes Share Depth Buffer
Problem Description

I'm encountering an issue with SCNTechnique where the clearColor setting is ignored when multiple passes share the same depth buffer. The clear color always appears as the scene background, regardless of what value I set.

The minimal project for reproducing the issue: https://www.dropbox.com/scl/fi/30mx06xunh75wgl3t4sbd/SCNTechniqueCustomSymbols.zip?rlkey=yuehjtk7xh2pmdbetv2r8t2lx&st=b9uobpkp&dl=0

Problem Details

In my SCNTechnique configuration, I have two passes that need to share the same depth buffer for proper occlusion handling:

"passes": [
    "box1_pass": [
        "draw": "DRAW_SCENE",
        "includeCategoryMask": 1,
        "colorStates": [
            "clear": true,
            "clearColor": "0 0 0 0" // Expecting transparent black
        ],
        "depthStates": [
            "clear": true,
            "enableWrite": true
        ],
        "outputs": [
            "depth": "box1_depth",
            "color": "box1_color"
        ],
    ],
    "box2_pass": [
        "draw": "DRAW_SCENE",
        "includeCategoryMask": 2,
        "colorStates": [
            "clear": true,
            "clearColor": "0 0 0 0" // Also expecting transparent black
        ],
        "depthStates": [
            "clear": false,
            "enableWrite": false
        ],
        "outputs": [
            "depth": "box1_depth", // Sharing the same depth buffer
            "color": "box2_color",
        ],
    ],
    "final_quad": [
        "draw": "DRAW_QUAD",
        "metalVertexShader": "myVertexShader",
        "metalFragmentShader": "myFragmentShader",
        "inputs": [
            "box1_color": "box1_color",
            "box2_color": "box2_color",
        ],
        "outputs": [
            "color": "COLOR"
        ]
    ]
]

And the Metal shader used to display box1_color and box2_color in a split view:

fragment half4 myFragmentShader(VertexOut in [[stage_in]],
                                texture2d<half, access::sample> box1_color [[texture(0)]],
                                texture2d<half, access::sample> box2_color [[texture(1)]]) {
    half4 color1 = box1_color.sample(s, in.texcoord);
    half4 color2 = box2_color.sample(s, in.texcoord);
    if (in.texcoord.x < 0.5) {
        return color1;
    }
    return color2;
};

Expected Behavior

- Both passes should clear their color targets to transparent black (0, 0, 0, 0).
- The depth buffer should be shared between passes for proper occlusion.

Actual Behavior

- Both box1_color and box2_color targets contain the scene background instead of being cleared to transparent (see attached image).
- This happens even when I explicitly set clearColor: "0 0 0 0" for both passes.
- Setting scene.background.contents = UIColor.clear makes the clearColor work as expected, but I need to keep the scene background for other purposes.

What I've Tried

- Setting different clearColor values; all are ignored when sharing the depth buffer.
- Using DRAW_NODE instead of DRAW_SCENE; didn't solve the issue.
- Creating a separate pass to capture the background; the background still appears in the other passes.
- Various combinations of clear flags and render orders.

Environment

- iOS/macOS, running with "My Mac (Designed for iPad)"
- Xcode 16.2

Question

Is this a known limitation of SceneKit when passes share a depth buffer? Is there a workaround to achieve truly transparent clear colors while maintaining a shared depth buffer for occlusion testing? The core issue seems to be that SceneKit automatically renders the scene background in every DRAW_SCENE pass when a shared depth buffer is detected, overriding any clearColor settings. Any insights or workarounds would be greatly appreciated. Thank you!
0 replies · 0 boosts · 158 views · Jun ’25
Bug implementing the StoreKit External Link Account API in a Vue + Capacitor app
I would like to clarify that my app is a reader app and a hybrid application built with Vue.js and Capacitor. To comply with Apple's guidelines, I am not using any third-party SDKs for account management or payments. Instead, I am attempting to use the official StoreKit External Link Account API as required. To achieve this, I created a custom native Capacitor plugin in Swift, which calls the StoreKit classes (SKStoreExternalLinkAccountRequest and SKStoreExternalLinkAccountViewController) to present the required modal before redirecting users to manage their accounts externally.

However, I am encountering a technical issue: when building the app in Xcode 16 (with the iOS deployment target set to 16+), the Swift compiler cannot find the classes SKStoreExternalLinkAccountRequest and SKStoreExternalLinkAccountViewController. I have attached a screenshot showing the error in Xcode.

Could you please clarify whether there are any additional requirements or steps needed to access these StoreKit APIs in a hybrid (Capacitor/Vue) app? Is there any limitation for hybrid apps, or is there a specific configuration needed in Xcode or the project to make these APIs available? I am committed to fully complying with Apple's guidelines and want to ensure the best and safest experience for my users. Any guidance or documentation you can provide would be greatly appreciated.

(Attachments: my plugin; my app in Xcode, build failed.)

I would really appreciate it if someone could help me.
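One thing that may explain the unresolved names, offered as an assumption to verify: in Swift, StoreKit exposes this feature as the ExternalLinkAccount type rather than as SK-prefixed classes. A minimal sketch, assuming the reader-app External Link Account entitlement is in place:

import StoreKit

// ExternalLinkAccount presents the system confirmation sheet before
// sending the user out of the app to manage their account.
func openExternalAccountLink() async {
    guard await ExternalLinkAccount.canOpen else { return }
    do {
        try await ExternalLinkAccount.open()
    } catch {
        print("External link account failed:", error)
    }
}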
1 reply · 0 boosts · 125 views · Jun ’25
Launching a Unity fully immersive game from SwiftUI
I am trying to launch a fully immersive game from Unity on a SwiftUI view. The game uses Metal rendering with Compositor Services. I added the Unity Xcode project into the workspace and added the necessary bridge code. When I click the button that calls ufw?.showUnityWindow(), the game does not start, and I get the following in the console: AR session failed to start after 5 seconds. Is the app configured to use an immersive space?
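A minimal sketch of what the console message appears to be asking for: a SwiftUI scene that declares an ImmersiveSpace and opens it before handing off rendering. The identifiers are placeholders, and whether Unity's bridge expects exactly this arrangement is an assumption.

import SwiftUI

@main
struct LauncherApp: App { // hypothetical app type
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // The immersive space the runtime checks for; Metal / Compositor
        // Services (Unity) content is hosted inside it.
        ImmersiveSpace(id: "UnityImmersiveSpace") {
            // Unity / compositor layer content goes here.
        }
        .immersionStyle(selection: .constant(.full), in: .full)
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Game") {
            // Open the space first, then show the Unity window.
            Task { await openImmersiveSpace(id: "UnityImmersiveSpace") }
        }
    }
}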
2 replies · 0 boosts · 103 views · Jun ’25
Locale.Script seems to be returning a value even though the language has no script
We just dropped support for iOS 16 in our app and migrated to the new properties on Locale to extract the language code, region, and script. However, after doing this we are seeing an issue where the script property returns a value even when the language has no script. Here is the initializer that we are using to populate the values. The identifier comes from the preferredLanguages property on Locale.

init?(identifier: String) {
    let locale = Locale(identifier: identifier)
    guard let languageCode = locale.language.languageCode?.identifier else {
        return nil
    }
    language = languageCode
    region = locale.region?.identifier
    script = locale.language.script?.identifier
}

Whenever I inspect locale.language I see all of the correct values. However, when I inspect locale.language.script directly, it always returns Latn as the value. If I inspect the deprecated locale.scriptCode property, it returns nil as expected. Here is an example from the debugger for en-AU. I also see the same for other languages such as en-AE and pt-BR. Since the language components show the script as nil, I would expect locale.language.script?.identifier to also return nil.
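A small sketch that may help frame the question; my reading (an assumption worth verifying against the docs) is that Locale.Language.script falls back to the most likely script for the language, while Locale.Language.Components reports only what the identifier explicitly encodes:

import Foundation

let locale = Locale(identifier: "en-AU")

// Appears to fall back to the likely script, hence "Latn".
print(locale.language.script?.identifier ?? "nil")

// Components carry only what the identifier spells out, hence nil here.
let components = Locale.Language.Components(identifier: "en-AU")
print(components.script?.identifier ?? "nil")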
1 reply · 0 boosts · 122 views · Jun ’25
How can I localize an array?
Hello, I am trying to localize my app into other languages. Most of the text is automatically extracted by Xcode when creating a string catalog and running my app. However, I realized a few texts aren't extracted, and I found a solution for most of them. For example, I have some variables declared as

var title: String = "Continue"

For these to be picked up, I changed String to LocalizedStringResource, which gives

var title: LocalizedStringResource = "Continue"

But I still have an issue with some variables declared as an array, for example

@State private var genderOptions = ["Male", "Female", "Not Disclosed"]

I tried many things, but I haven't succeeded in getting the words "Male", "Female", and "Not Disclosed" translated. I have more arrays like that in my code and I am trying to find a solution for them to be extracted and localized. A few things I tried that didn't work:

@State private var genderOptions : LocalizedStringResource = ["Male", "Female", "Not Disclosed"]

@State private var genderOptions = [LocalizedStringResource("Male"), LocalizedStringResource("Female"), LocalizedStringResource("Not Disclosed")]

Any ideas are more than welcome. Thanks guys.
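One pattern that may work here, sketched under the assumption that the three strings exist as keys in the string catalog: build the array from String(localized:), which Xcode extracts like any other localized string and which resolves at runtime:

import SwiftUI

struct ProfileForm: View { // hypothetical view for illustration
    // String(localized:) resolves through the string catalog, so each
    // option is extracted and translated like any other key.
    @State private var genderOptions = [
        String(localized: "Male"),
        String(localized: "Female"),
        String(localized: "Not Disclosed")
    ]

    var body: some View {
        List(genderOptions, id: \.self) { Text($0) }
    }
}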
2 replies · 0 boosts · 135 views · Jun ’25
Enterprise Vendor Id changing when it shouldn't
Hi All, Really weird one here... I have two bundle IDs with the same reverse-DNS name:

com.company.app1
com.company.app2

app1 was installed on the device a year ago. app2 was also installed on the device a year ago, but I released a new updated version and pushed it to the device via Microsoft Intune. A year ago the vendor IDs matched, as the bundle IDs were on the same domain of com.company. Now, for some reason, the new build of app2 (or any new app I build) isn't being recognised as on the same domain as app1, even though the bundle ID should make it so, and so the vendor IDs do not match. This is causing me major problems, as I rely on the vendor ID to exchange data between the apps on a given device. In an enterprise environment, does anyone know of any other reason or things that could affect the vendor ID? According to the Apple docs, it seems that only the bundle name affects the vendor ID, but it isn't following those rules in this instance.
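For reference, the value in question, with the matching rule as I read the docs (an assumption worth re-checking for enterprise distribution):

import UIKit

// For apps installed outside the App Store, the docs describe the vendor ID
// as derived from the leading components of the bundle ID, so com.company.app1
// and com.company.app2 would be expected to match on the same device.
func logVendorID() {
    let vendorID = UIDevice.current.identifierForVendor?.uuidString ?? "unavailable"
    print("identifierForVendor:", vendorID)
}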
10 replies · 0 boosts · 262 views · Jun ’25
Issues with Swift 6 migration
We are migrating from Swift 5 to Swift 6 using Xcode 16.2. We are getting the errors below in almost every one of our source files:

- Call to main actor-isolated initializer 'init(storyboard:bundle:)' in a synchronous nonisolated context
- Main actor-isolated property 'delegate' can not be mutated from a nonisolated context
- Call to main actor-isolated instance method 'register(cell:)' in a synchronous nonisolated context
- Call to main actor-isolated instance method 'setup()' in a synchronous nonisolated context

A few notes related to these compile errors:

- Some of our function arguments have default values set, but Swift 6 does not allow us to keep those default values. This would require a lot of code changes throughout the project, amounting to a large source-code rewrite.
- We have tried annotations like @unchecked Sendable and @Sendable on classes (main actor) and on many functions within those classes, but code coming from other classes still shows main-thread issues even when we use @unchecked Sendable.
- There are many more compile errors beyond the ones listed here. Fixing all of them throughout our project would be like a rewrite of our whole application, which would take a lot of time.

In order to migrate efficiently, we have a few questions where we need your help:

- Are there any ways we can bypass these errors using any keywords or any other way possible?
- Can Swift 5 and Swift 6 coexist, so we can migrate slowly over a period of time?
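On the coexistence question, my understanding is yes: the Swift 6 compiler supports per-target language modes, so targets can migrate one at a time. A sketch, assuming an SPM layout; in an Xcode project the equivalent knob is the per-target "Swift Language Version" (SWIFT_VERSION) build setting:

// swift-tools-version: 6.0
import PackageDescription

let package = Package(
    name: "MyApp", // hypothetical package
    targets: [
        // Already migrated: compiled in Swift 6 mode with full data-race checking.
        .target(name: "CoreModels"),
        // Not yet migrated: stays in Swift 5 mode under the same Swift 6 toolchain.
        .target(
            name: "LegacyUI",
            swiftSettings: [.swiftLanguageMode(.v5)]
        )
    ]
)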
1 reply · 0 boosts · 176 views · Jun ’25
NEFilterManager saveToPreferences fails with "permission denied" on TestFlight build
I'm working on enabling a content filter in my iOS app using NEFilterManager and NEFilterProviderConfiguration. The setup works perfectly in debug builds when running via Xcode, but fails on TestFlight builds with the following error:

Failed to save filter settings: permission denied

Here is my current implementation:

- (void)startContentFilter {
    NSUserDefaults *userDefaults = [NSUserDefaults standardUserDefaults];
    [userDefaults synchronize];
    [[NEFilterManager sharedManager] loadFromPreferencesWithCompletionHandler:^(NSError * _Nullable error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (error) {
                NSLog(@"Failed to load filter: %@", error.localizedDescription);
                [self showAlertWithTitle:@"Error" message:[NSString stringWithFormat:@"Failed to load content filter: %@", error.localizedDescription]];
                return;
            }
            NEFilterProviderConfiguration *filterConfig = [[NEFilterProviderConfiguration alloc] init];
            filterConfig.filterSockets = YES;
            filterConfig.filterBrowsers = YES;
            NEFilterManager *manager = [NEFilterManager sharedManager];
            manager.providerConfiguration = filterConfig;
            manager.enabled = YES;
            [manager saveToPreferencesWithCompletionHandler:^(NSError * _Nullable error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (error) {
                        NSLog(@"Failed to save filter settings: %@", error.localizedDescription);
                        [self showAlertWithTitle:@"Error" message:[NSString stringWithFormat:@"Failed to save filter settings: %@", error.localizedDescription]];
                    } else {
                        NSLog(@"Content filter enabled successfully!");
                        [self showAlertWithTitle:@"Success" message:@"Content filter enabled successfully!"];
                    }
                });
            }];
        });
    }];
}

What I've tried:

- Ensured the com.apple.developer.networking.networkextension entitlement is set in both the app and the system extension.
- The network extension target includes content-filter-provider.
- Tested only on physical devices.
- The app works in a development build, but not from TestFlight.

My questions:

- Why does saveToPreferencesWithCompletionHandler fail with "permission denied" on TestFlight?
- Are there special entitlements required for using NEFilterManager in production/TestFlight builds?
- Is MDM (Mobile Device Management) required to deploy apps using content filters?
- Has anyone successfully implemented NEFilterProviderConfiguration in production, and if so, how?
1 reply · 0 boosts · 220 views · Jun ’25
Clearing an app’s memory, data etc.
My app works perfectly the first time it is used, but when returning to the start after playing a game and starting another, it does some random things. I figure it is because the app still retains the previous game in its memory? My question is: what is the best way, programmatically, to have the app start fresh without having to quit it and open it again?
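One generic pattern that may help, sketched under the assumption that the game state lives in value types the views own (all names hypothetical): keep every piece of per-game state in one container and recreate it for each new game, rather than trying to clear memory.

import SwiftUI

// All per-game state in one value; replacing it resets everything at once.
struct GameState {
    var score = 0
    var movesPlayed = 0
}

struct RootView: View {
    @State private var game = GameState()

    var body: some View {
        VStack {
            Text("Score: \(game.score), moves: \(game.movesPlayed)")
            Button("New Game") {
                game = GameState() // discard all prior-game values
            }
        }
    }
}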
6 replies · 0 boosts · 164 views · May ’25
Is Phase Detection Autofocus degrading video stabilization, and can I disable it?
I'm developing a video capture app using AVFoundation, designed specifically for use on a boat pylon to record slalom water skiing. This setup involves considerable vibration.

As you may know, the OIS that Apple began adding to lenses with the iPhone 7 is actually very problematic in high-vibration circumstances, ironically creating very shaky video, whereas lenses without OIS produce perfectly stable video. Because of this, up until the iPhone 14, the solution for my app was simply to use the selfie lens, which did not have OIS.

From the iPhone 14 through the iPhone 16 (non-Pro models), technical specs suggest the selfie lens still does not include OIS. However, I'm still seeing the same kind of shaky video behavior I see on OIS-equipped lenses. The one hardware change I see in this camera module is the addition of PDAF (phase detection autofocus), so that is my best guess as to what is causing the unstable video.

1. Does that make any sense: that in high-vibration settings, PDAF could create unstable video in the same way that OIS does? Or could it be something else that changed between the iPhone 13 and 14 selfie lens?

Thinking that the issue was PDAF, I figured that if I enabled my app to set a manual focus level, that ought to circumvent PDAF (expecting that if a lens is manually focusing, it can't also be autofocusing via PDAF). However, even with manual focus locked via AVCaptureDevice in my app, on the selfie lens of an iPhone 16, the video still comes out very shaky, basically unusable. I also tested with the built-in Apple Camera app (using press-and-hold to lock focus and exposure) and another third-party camera app to lock focus, all with the same results, so it's not that my app just isn't doing manual focus correctly.

So I'm stuck with these questions:

2. Does the selfie camera on iPhones 14–16 use PDAF even when focus is set to locked/manual mode?
3. Is there any way in AVFoundation to disable or suppress PDAF during video recording (e.g., a flag, device format setting, or private API)?
4. Is PDAF behavior or suppression documented or controllable via AVCaptureDevice or any related class?
5. If no control of PDAF is available, are there any best practices for stabilizing or smoothing this effect programmatically?

Note that I have also set my app to use the most aggressive form of stabilization available, so it defaults to .cinematicExtendedEnhanced; if that's not available, then .cinematicExtended, and so on. On the 16 selfie lens, it is using .cinematicExtended.

As an additional question:

6. Would those be the most appropriate stabilization settings for a high-vibration environment, and if not, what would be best?
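For concreteness, a sketch of the two controls discussed above; the lens position value is arbitrary, and whether a locked lens position actually bypasses PDAF is exactly the open question:

import AVFoundation

func configure(device: AVCaptureDevice, connection: AVCaptureConnection) throws {
    // Lock focus at a fixed lens position (0.0...1.0).
    try device.lockForConfiguration()
    if device.isLockingFocusWithCustomLensPositionSupported {
        device.setFocusModeLocked(lensPosition: 0.8, completionHandler: nil)
    }
    device.unlockForConfiguration()

    // Pick the most aggressive stabilization the active format supports,
    // mirroring the fallback chain described in the post.
    let preferred: [AVCaptureVideoStabilizationMode] = [
        .cinematicExtendedEnhanced, .cinematicExtended, .cinematic, .standard
    ]
    if let mode = preferred.first(where: { device.activeFormat.isVideoStabilizationModeSupported($0) }) {
        connection.preferredVideoStabilizationMode = mode
    }
}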
0 replies · 0 boosts · 198 views · May ’25
App Randomly Crashes During Continuous Sound Playback Using AVAudioPlayer
Environment:
- Device: iPad (10th generation)
- OS: iOS 18.3.2

We're using AVAudioPlayer to play a sound when a button is tapped. In our use case, this button can be tapped very frequently, roughly every 0.1 to 0.2 seconds. Each tap triggers the following function:

var audioPlayer: AVAudioPlayer?

func soundPlay(resource: String, type: String) {
    guard let path = Bundle.main.path(forResource: resource, ofType: type) else {
        return
    }
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: path))
        audioPlayer!.delegate = self
        try audioSession.setCategory(.playback)
    } catch {
        return
    }
    self.audioPlayer!.play()
}

The issue is that under high-frequency tapping (especially around 0.1–0.15 s intervals), the app occasionally crashes. The crash does not occur every time, but it happens randomly: sometimes within 30 seconds, within 1 minute, or even 3 minutes of continuous tapping. Interestingly, adding a delay of 0.2 seconds between button taps seems to prevent the crash entirely. Delays shorter than 0.2 seconds (e.g., 0.15 s, 0.18 s) still result in occasional crashes.

My questions are:
- Is this expected behavior from AVAudioPlayer or AVAudioSession?
- Could this be a known issue or a limitation in AVFoundation?
- Is there any documentation or guidance on handling frequent sound playback safely?

Any insights or recommendations on how to handle rapid, repeated audio playback more reliably would be appreciated.
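One mitigation sketch, assuming the crash is tied to re-allocating the player on every tap: create and prepare a single AVAudioPlayer once, then rewind and replay it per tap. The resource name is a placeholder.

import AVFoundation

final class TapSoundPlayer {
    private var player: AVAudioPlayer?

    func prepare() {
        guard let url = Bundle.main.url(forResource: "tap", withExtension: "wav") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        try? AVAudioSession.sharedInstance().setCategory(.playback)
        player?.prepareToPlay()
    }

    func play() {
        guard let player else { return }
        player.currentTime = 0   // rewind instead of reallocating
        player.play()
    }
}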
0 replies · 0 boosts · 173 views · May ’25
Securely transmit UIImage to app running on desktop website
Hello everyone, I'm trying to figure out how to transmit a UIImage (PNG or TIFF) securely to an application running in my desktop browser (Mac or PC). The desktop application and the iOS app would potentially be on the same local network (iOS hotspot or something) or have no internet connection at all. I'm trying to securely send over an image that the running desktop app can ingest. I was thinking of something like a local server securely accepting image data from the iPhone. Any suggestions, ideas, or pointers to more info would be greatly appreciated! Thank you for your help.
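As one possible direction, a sketch of the iOS side posting the PNG bytes to a local listener; the host, port, and path are placeholders, and TLS with a pinned or self-signed certificate (not shown) would carry the "securely" part:

import UIKit

// On a hotspot network, the desktop's LAN IP would stand in for `host`.
func send(_ image: UIImage, to host: String) async throws {
    guard let png = image.pngData(),
          let url = URL(string: "https://\(host):8443/upload") else { return }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("image/png", forHTTPHeaderField: "Content-Type")
    let (_, response) = try await URLSession.shared.upload(for: request, from: png)
    print("Server replied:", (response as? HTTPURLResponse)?.statusCode ?? -1)
}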
1 reply · 0 boosts · 107 views · May ’25
How to reduce CMSampleBuffer volume
Hello, basically, I am reading and writing an asset. To simplify, I am just reading the asset and rewriting it into an output video without any modifications. However, I want to add a fade-out effect to the last three seconds of the output video, and I don't know how to do this. So far, before appending the CMSampleBuffer to the output video, I tried reducing its volume using an extension on CMSampleBuffer. In the extension, I passed 0.4 for testing, aiming to reduce the video's overall volume by 60%. My question is: how can I directly adjust the volume of a CMSampleBuffer? Here is the extension:

extension CMSampleBuffer {
    func adjustVolume(by factor: Float) -> CMSampleBuffer? {
        guard let blockBuffer = CMSampleBufferGetDataBuffer(self) else { return nil }

        var length = 0
        var dataPointer: UnsafeMutablePointer<Int8>?
        guard CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0, lengthAtOffsetOut: nil, totalLengthOut: &length, dataPointerOut: &dataPointer) == kCMBlockBufferNoErr else { return nil }
        guard let dataPointer = dataPointer else { return nil }

        // Assumes 16-bit integer PCM samples.
        let sampleCount = length / MemoryLayout<Int16>.size
        dataPointer.withMemoryRebound(to: Int16.self, capacity: sampleCount) { pointer in
            for i in 0..<sampleCount {
                let sample = Float(pointer[i])
                // Int16(clamping:) avoids a trap if the scaled value overflows.
                pointer[i] = Int16(clamping: Int(sample * factor))
            }
        }
        return self
    }
}
4 replies · 0 boosts · 405 views · May ’25
MPNowPlayingInfoCenter nowPlayingInfo throttled
Hello, I have been running into issues with setting nowPlayingInfo information, specifically updating information for CarPlay and the CPNowPlayingTemplate. When I start playback for an item, I see the lock screen information update as expected, along with the CarPlay now-playing information. However, the playing items are books with collections of tracks. When I select a new track (chapter) within the book, I set MPMediaItemPropertyTitle to the new chapter name. This change is reflected correctly on the lock screen, but almost never appears correctly on the CarPlay CPNowPlayingTemplate. The previous chapter title remains set and never updates.

I see "Application exceeded audio metadata throttle limit." in the debug console fairly frequently. From that, I figured that I need to minimize updates to the nowPlayingInfo dictionary. What I did:

- I store the metadata dictionary in a local dictionary and only set values in the main nowPlayingInfo dictionary when they differ from the current value.
- I kick off the nowPlayingInfo update via a task that initially sleeps for around 2 seconds (not a final value, just for my current testing). If a previous task is active, it gets cancelled, so that only one update can happen within that time window.

Neither of these things has been sufficient. I can switch between different titles entirely and the information updates (including cover art). But when I switch chapters within a title, the MPMediaItemPropertyTitle update continues to get dropped. I know the value is getting set, because it updates on the lock screen correctly. In total, I update 12 keys, though with the above changes, usually only 2–4 of them actually get updated with high frequency. I am running out of ideas for satisfying the throttling thresholds while accurately displaying metadata. I could use some advice. Thanks.
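For comparison, a rough sketch of the coalescing approach described above; the 2-second window mirrors the post's test value, and nothing here is guaranteed to satisfy the CarPlay throttle:

import MediaPlayer

@MainActor final class NowPlayingUpdater {
    private var pending: [String: Any] = [:]
    private var task: Task<Void, Never>?

    // Collect changed keys and push at most one update per window.
    func set(_ value: Any, for key: String) {
        pending[key] = value
        task?.cancel()
        task = Task {
            try? await Task.sleep(for: .seconds(2))
            guard !Task.isCancelled else { return }
            var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
            for (k, v) in pending { info[k] = v }
            MPNowPlayingInfoCenter.default().nowPlayingInfo = info
            pending.removeAll()
        }
    }
}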
4 replies · 1 boost · 159 views · May ’25