Posts under Developer Tools & Services topic

Post

Replies

Boosts

Views

Activity

Xcode does not see code changes in local Swift packages (autocomplete wrong, errors shown, but still compiles)
Hey, I've been having a lot of problems with Xcode 16 not seeing changes made to code in local Swift packages (the packages are inside the root directory of the project). Whenever I make a change, such as renaming a variable or type or adding new methods, autocomplete doesn't see those changes, and when I type the new type/variable manually it shows an error. However, building the project still works fine, even though the errors never go away. The only way to get it to notice the code changes is to clean the project and build it again, which takes a long time. At first I thought this was connected to the new "Explicitly built modules" feature in Xcode 16, but I turned it off and it still happens. Any ideas what I can try? I'm on the latest Xcode version, but this problem has been happening since Xcode 16 originally came out. Thanks! Dennis
0
0
159
May ’25
Unit test coverage does not include branch coverage for Swift
We are using the command below to run unit tests and collect coverage: xcodebuild -workspace Demo.xcworkspace -scheme VideoTests -configuration Debug -derivedDataPath ../build/derivedData -destination 'platform=iOS Simulator,id=E6630007-570B-4DEB-A023-2BCE91116A8D' -resultBundlePath './fastlane/test_output/VideoTests.xcresult' -enableCodeCoverage YES -testPlan 'Video' test-without-building | tee '/Users/rcadmin/Library/Logs/scan/VideoTests.log' | xcbeautify -q --is-ci and the xcrun llvm-cov show command to generate the coverage report: xcrun llvm-cov show /build/unit-test/coverage/libraries/merged/video.o -instr-profile=/app/ios/build/derivedData/Build//ProfileData/E6630007-570B-4DEB-A023-2BCE91116A8D/video.profdata -show-branches count -show-expansions -show-line-counts -use-color -format=html -output-dir coverage but the HTML report does not include branch coverage. How can we generate branch coverage?
0
0
158
May ’25
Xcode build system has crashed after adding OpenCV to macOS app
Hi, I am currently working on a macOS app where I need OpenCV's undistortion function. After I tried to add OpenCV to my project, I get the following error: unexpected service error: The Xcode build system has crashed. Build again to continue. Cleaning the build folder doesn't help either. Does anyone have an idea what the issue could be? Ryan
0
0
129
May ’25
SFSpeechRecognizer is not working inside visionOS 2.4 simulator
I know there have been issues with SFSpeechRecognizer in iOS 17+ inside the simulator. I'm running into issues with speech not being recognised inside the visionOS 2.4 simulator as well (likely because it borrows from iOS frameworks). Just wondering if anyone has any workarounds or advice for this simulator issue. I can't test on device because I don't have an Apple Vision Pro. Using Swift 6 on Xcode 16.3. Below are the console logs & the code that I am using. Console Logs BACKGROUND SPATIAL TAP (hit BackgroundTapPlane) SpeechToTextManager.startRecording() called [0x15388a900|InputElement #0|Initialize] Number of channels = 0 in AudioChannelLayout does not match number of channels = 2 in stream format. iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending SpeechToTextManager.startRecording() completed successfully and recording is active. GameManager.onTapToggle received. speechToTextManager.isAvailable: true, speechToTextManager.isRecording: true GameManager received tap toggle callback. Tapped Object: None BACKGROUND SPATIAL TAP (hit BackgroundTapPlane) GESTURE MANAGER - User is already recording, stopping recording SpeechToTextManager.stopRecording() called GameManager.onTapToggle received. speechToTextManager.isAvailable: true, speechToTextManager.isRecording: false Audio data size: 134400 bytes Recognition task error: No speech detected <--- Code private(set) var isRecording: Bool = false private var recognitionRequest: SFSpeechAudioBufferRecognitionRequest? private var recognitionTask: SFSpeechRecognitionTask? @MainActor func startRecording() async throws { logger.debug("SpeechToTextManager.startRecording() called") guard !isRecording else { logger.warning("Cannot start recording: Already recording.") throw AppError.alreadyRecording } currentTranscript = "" processingError = nil audioBuffer = Data() isRecording = true do { try await configureAudioSession() try await Task.detached { [weak self] in guard let self = self else { throw AppError.internalError(description: "SpeechToTextManager instance deallocated during recording setup.") } try await self.audioProcessor.configureAudioEngine() let (recognizer, request) = try await MainActor.run { () -> (SFSpeechRecognizer, SFSpeechAudioBufferRecognitionRequest) in guard let result = self.createRecognitionRequest() else { throw AppError.configurationError(description: "Speech recognition not available or SFSpeechRecognizer initialization failed.") } return result } await MainActor.run { self.recognitionRequest = request } await MainActor.run { self.recognitionTask = recognizer.recognitionTask(with: request) { [weak self] result, error in guard let self = self else { return } if let error = error { // WE ENTER INTO THIS BLOCK, ALWAYS self.logger.error("Recognition task error: \(error.localizedDescription)") self.processingError = .speechRecognitionError(description: error.localizedDescription) return } . . . } } . . . }.value } catch { . . . 
} } @MainActor func stopRecording() { logger.debug("SpeechToTextManager.stopRecording() called") guard isRecording else { logger.debug("Not recording, nothing to do") return } isRecording = false Task.detached { [weak self] in guard let self = self else { return } await self.audioProcessor.stopEngine() let finalBuffer = await self.audioProcessor.getAudioBuffer() await MainActor.run { self.recognitionRequest?.endAudio() self.recognitionTask?.cancel() } . . . } }
0
0
167
May ’25
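For this kind of simulator speech failure, a first check that often helps is confirming authorization and recognizer availability before starting a task, and explicitly allowing server-based recognition. The sketch below is only a minimal illustration against the post above; it assumes nothing about the poster's SpeechToTextManager beyond what is shown, and the thrown errors are placeholders.
import Speech

// Minimal pre-flight check before starting recognition. Illustrative only.
func prepareRecognizer() async throws -> SFSpeechRecognizer {
    let status = await withCheckedContinuation { continuation in
        SFSpeechRecognizer.requestAuthorization { continuation.resume(returning: $0) }
    }
    guard status == .authorized else {
        throw NSError(domain: "Speech", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "Speech recognition not authorized"])
    }
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else {
        throw NSError(domain: "Speech", code: 2,
                      userInfo: [NSLocalizedDescriptionKey: "Recognizer unavailable (common in simulators)"])
    }
    return recognizer
}

// Forcing server-side recognition can help in simulators that lack on-device models
// (an assumption, not a guaranteed fix for the visionOS simulator).
func makeRequest() -> SFSpeechAudioBufferRecognitionRequest {
    let request = SFSpeechAudioBufferRecognitionRequest()
    request.shouldReportPartialResults = true
    request.requiresOnDeviceRecognition = false
    return request
}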
[Unreal Engine] File missing if packaged with command line
Hello! I am trying to automate iOS builds for my Unreal Engine game using Unreal Automation Tool, but I cannot produce a functional build with it, while packaging from Xcode works perfectly. I have tracked down the issue to a missing file. I'm using the Firebase SDK, which requires a GoogleService-Info.plist file. I have copied this file to the root of my project, as the Firebase documentation suggests. I have not taken any manual action to specify that this file needs to be included in the packaged app. The Firebase code checks the existence of this file using NSString* Path = [[NSBundle mainBundle] pathForResource: @"GoogleService-Info" ofType: @"plist"]; return Path != nil; If I package my app from Xcode using Product -> Archive, this test returns true and the SDK is properly initialized. If I package my app using Unreal Engine's RunUAT.sh BuildCookRun, this test returns false and the SDK fails to initialize (and actually crashes upon trying). I have tried several Unreal Engine tricks to include my file, like setting it as a RuntimeDependencies entry in my project's Build.cs file, which enables Unreal Engine code to find it, but not this direct call to NSBundle. I would like to know either how to tell Unreal Engine to include files at the root of the app bundle, or what Xcode does to automatically include this file and whether there is a way to script it. I can provide both versions' .xcarchive if needed. Thanks!
0
0
151
Sep ’25
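For reference when debugging the packaging difference described above, the Swift equivalent of that Objective-C bundle lookup is Bundle.main.path(forResource:ofType:). A small diagnostic sketch (illustrative only; it simply logs whether the file ended up in the built app bundle):
import Foundation

// Logs whether GoogleService-Info.plist actually landed in the built app bundle.
// Same lookup Firebase performs, written in Swift for quick runtime diagnostics.
func googleServicePlistIsBundled() -> Bool {
    if let path = Bundle.main.path(forResource: "GoogleService-Info", ofType: "plist") {
        print("GoogleService-Info.plist found at \(path)")
        return true
    }
    print("GoogleService-Info.plist is missing from the main bundle")
    return false
}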
ActivityKit linker error
I have a ContentView in my app which includes the line of code FileUploadProgressAttributes. This struct is defined in a file included in the target FileUploadProgressExtension, and it is an ActivityAttributes. In ContentView I imported FileUploadProgressExtension, and Xcode is able to find FileUploadProgressAttributes during prebuild. But during the build, it gives me Undefined symbols for architecture arm64: "FileUploadProgressExtension.FileUploadProgressAttributes.init(filename: Swift.String) -> FileUploadProgressExtension.FileUploadProgressAttributes The workaround I found is to add the file with FileUploadProgressAttributes to my app's target, but I'm not sure if this is the right thing to do. When Xcode created the extension for me, it added the extension target as a target dependency of my app, so obviously if I added this file to my app target it makes the extension target pointless. First time working with widgets, so I'm not sure if I'm missing something.
0
0
89
May ’25
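The usual way to avoid the undefined-symbol error described above is to give the ActivityAttributes type membership in both the app target and the extension target (or move it into a shared framework or package), because both binaries need the concrete type at link time. A hedged sketch of such a shared definition; the progress field is a guess, and only the filename attribute comes from the post:
import ActivityKit

// Shared between the app target and the Live Activity extension target; the shared
// membership (or a common framework) is what resolves the linker error, while the
// struct itself is ordinary ActivityKit boilerplate.
struct FileUploadProgressAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var progress: Double    // hypothetical dynamic state, not from the post
    }
    var filename: String        // fixed attribute matching the post's init(filename:)
}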
Built in ssh-add doesn't read ~/.ssh/config
I'm trying to authenticate to a git host using SSH keys stored in 1Password. I have ~/.ssh/config with mode 600 set with a symlink: Host * IdentityAgent "~/.1password/agent.sock" But ssh-add -l shows no identities. If I set $SSH_AUTH_SOCK, ssh-add -l works just fine. I'd love to not have to do this, though. Why doesn't ssh-add seem to read ~/.ssh/config? The built-in version is OpenSSH_10.0p2, LibreSSL 3.3.6. I've searched fruitlessly for an answer anywhere else.
0
0
188
Feb ’26
Modifier .classEffect()
Hi everyone, does anyone else have the problem that the modifier .classEffect() does not work? It is not recognized, even though I have all the required versions and languages up to date. I can't get any further; even ChatGPT was at the end of its wisdom 😬. Maybe someone has the same problem and a solution for it. Many thanks in advance.
0
0
43
Jun ’25
Mac (Designed for iPad) app not able to install via TestFlight on Mac
Hi, I developed an iPhone app, but now I want to release the same app for iPad and Mac (Designed for iPad). I made the changes below, but I am still not able to install the app on my Mac. It says the app is incompatible, but my Mac is a MacBook Pro M2, which meets the requirements to install it. Things I did so my iPhone app could support iPad and Mac: Added iPad and Mac (Designed for iPad) to the supported destinations Changed the Status Bar Style to Require full screen Changed the minimum iOS deployment target to 17.6 Updated the launch screen logo so it stays in the middle Xcode version 16.2 See the screenshots of the Xcode configuration and of TestFlight on my MacBook Pro M2. Is there anything else I need to do? Thanks.
0
0
93
May ’25
Trouble setting up AWFK-configured watches to use TestFlight
I am developing a simple watch app and I use my personal watch for development with Xcode. The personal watch is a Series 10, GPS only. I have two other watches that I want to use for testing the app without needing them to be connected to Xcode. The test watches have the cellular option, and I need a cell plan per watch because the watches need to be standalone, not counting initial setup. To get the standalone cell plan, the watches need to be configured using AWFK. Here is what I have tried and my current issues: I switch between all three watches on my phone using the Watch app. I originally tried to put the test watches in developer mode, thinking I would connect them to Xcode, but developer mode is not available when a watch is set up using AWFK. I pushed the watch app to App Store Connect, set up a TestFlight group, added the test users and my phone user, and accepted the invites. TestFlight is installed on my phone, and I see the TestFlight setup for the watch app. I set up a test watch using the Watch app on the phone and run the install for the test app from TestFlight on the phone; the spinner moves for a while, then goes back to Install. I am not able to get the watch app installed on the test watches from the phone. Is what I am attempting to do supported? I haven't found much specific documentation on this. If I pair the test watches as regular watches and set them to developer mode, can I pair them again as AWFK, and will developer mode survive the switch? Or is there something really simple that I'm overlooking? I appreciate any help that can be extended.
0
0
280
Dec ’25
NO_CRASH_STACK with different number of crashes from the App Store.
Hello. For the last few months I have been facing a situation for which I understand neither the reason nor the solution. Here is an example from the last few weeks: In App Store Connect it shows 214 crashes for the last week. In Xcode it shows 56 crashes for the last 2 weeks, without stack traces (NO_CRASH_STACK). In Firebase Crashlytics there is not a single crash. When I click "Show in Finder" it opens a folder with .xccrashpoint files. As far as I understood from the instructions on posting crash reports, this is not what is needed. Interesting point: it looks like the errors occur on iOS 15-16 devices. I have several questions: Why is the number of crashes in Xcode different from the number in App Store Connect? Where are the other crashes? How can I understand what the problem is if there are no stack traces in Xcode?
0
0
124
May ’25
AR location errors on cellular + Wi-Fi model iPad with device connected to Wi-Fi
I am developing an Augmented Reality (AR) navigation application for the iPad, utilizing the ARCL library to place Points of Interest (POIs) in the real world. The application's behavior varies significantly based on the device's networking configuration: Cellular Network (Expected Behavior): On an iPad with a cellular modem, when using the cellular network, all POIs are placed accurately with correct orientation. Wi-Fi Only (Expected Behavior): On a Wi-Fi-only model (no GPS chip), POI placement is inaccurate, confirming the need for an external GPS receiver for that hardware configuration. Cellular + Wi-Fi (Anomalous Behavior): The iPad is a cellular model (equipped with GNSS/GPS). The device is connected to a Wi-Fi network (enforced via an MDM profile, preventing the user from disabling Wi-Fi). When actively connected to this specific Wi-Fi network, the AR POIs consistently display with an incorrect orientation and placement, even though the device hardware has a dedicated GPS chip. The placement error strongly suggests that the device's determined location or heading is erroneous. It appears that the active Wi-Fi connection is somehow interfering with or overriding the high-accuracy GNSS/GPS data, leading to a flawed Core Location determination that negatively impacts the ARCL world tracking and anchor placement. Has anyone experienced a scenario where an active Wi-Fi connection on a cellular iPad model causes Core Location to prioritize less accurate location data (potentially Wi-Fi-based location services) over the device's built-in GNSS/GPS, resulting in severe orientation errors? We observed that Apple Maps (the native application) also shows the wrong location and orientation when the device is connected to Wi-Fi.
0
0
301
Dec ’25
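One way to confirm whether Core Location is actually delivering coarse, Wi-Fi-derived fixes in the anomalous configuration above is to log the accuracy of every update while the AR session runs. A minimal diagnostic sketch, independent of ARCL and making no claim about the underlying cause:
import CoreLocation

// Logs the accuracy of each incoming fix so a coarse Wi-Fi-derived location can be
// told apart from a proper GNSS fix while the AR session runs.
final class LocationDiagnostics: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // GNSS fixes are usually a few metres; Wi-Fi positioning is often tens of metres or worse.
        print("horizontalAccuracy: \(fix.horizontalAccuracy) m at \(fix.timestamp)")
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        print("headingAccuracy: \(newHeading.headingAccuracy) degrees")
    }
}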
Upload Symbols Failed when Archiving with Xcode 16
Hello everyone, We are seeing some warnings in Xcode 16 when we archive our iOS app for distribution. During the upload phase (via Xcode Organizer), we get a warning like: WARNING: Upload Symbols Failed The archive did not include a dSYM for MyFramework.framework with the UUIDs: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX Ensure that the archive’s dSYM folder includes a DWARF file for MyFramework.framework with the expected UUIDs. Questions: How can these warnings be fixed? Is this warning safe to ignore if I know the vendor does not supply a dSYM for their framework? If I do need a proper dSYM for full symbolication, how should I request it from the vendor or generate it myself from an XCFramework?
0
3
438
May ’25
Battery Consumption Analysis Using Xcode Instruments
I have been working on battery consumption analysis for my application, and as part of this effort, I wanted to understand how competitor apps behave under similar usage conditions. To do this, I downloaded competitor apps from the App Store and attached them to Instruments via Xcode. I then executed a defined set of manual test scenarios to simulate real user behavior. During these tests, the iPhone was connected to a Mac and charging continuously, which meant that System Power Usage logs were not generated in Instruments. However, I was able to capture detailed metrics such as: Network usage CPU load GPU activity Display and brightness impact Other runtime performance characteristics Since direct battery drain data was unavailable, I used derived analysis (with AI assistance) to estimate approximate power consumption based on the above metrics, assuming real-device (battery-powered) conditions. According to Apple documentation, System Power Usage in Instruments is not directly tied to the device’s battery percentage. Instead, it appears to be computed using contributing factors such as CPU, network, display, and other subsystem activity. This raises a few important questions about data reliability and methodology. Key questions: How reliable are Instruments-based metrics (CPU, network, display, GPU) for estimating real-world battery consumption when the device is charging? Can these metrics be safely used as a comparative baseline between competitor applications, even if absolute battery drain values are unavailable? Is the System Power Usage instrument essentially a derived model based on subsystem activity, and if so, does it remain accurate when the device is not discharging? From Apple’s perspective, is this a valid approach for relative power comparison, provided that: The same device is used OS version is identical Test scenarios are consistent and repeatable Based on these findings, would it be reasonable to proceed with instrumenting our own application, run the same scenarios, and draw conclusions using relative comparisons rather than absolute battery percentages? The intent is not to claim exact battery drain numbers, but to establish a directionally correct and repeatable comparison that can guide performance optimizations in our own application. I would like to understand whether this methodology aligns with Apple’s recommended practices, or if there are limitations or inaccuracies I should be aware of before relying on these results for decision-making.
0
0
223
Jan ’26
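One complement to the Instruments approach above, applicable only to one's own app rather than to competitors', is MetricKit: it reports aggregated metrics collected while the app runs on battery in the field and therefore avoids the charging-while-profiling caveat. A minimal subscription sketch (what to log from the payloads is left open):
import MetricKit

// Subscribes to MetricKit payloads, which are aggregated while the app runs on battery
// in the field, unlike an Instruments trace captured on a charging device.
final class PowerMetricsSubscriber: NSObject, MXMetricManagerSubscriber {
    func start() {
        MXMetricManager.shared.add(self)
    }

    func didReceive(_ payloads: [MXMetricPayload]) {
        for payload in payloads {
            if let cpu = payload.cpuMetrics {
                print("Cumulative CPU time: \(cpu.cumulativeCPUTime)")
            }
            if let display = payload.displayMetrics {
                print("Average pixel luminance: \(String(describing: display.averagePixelLuminance))")
            }
        }
    }
}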
CPListTemplate in CarPlay Simulator: Multiple Items Selected on Tap
I am encountering an issue while running my CarPlay app in the simulator using CPListTemplate. On app launch, when I tap on the first item in the list and then tap on the second item, both items remain selected instead of only the latest tapped item. This behavior does not align with the expected single-item selection functionality. Has anyone else faced this issue? Is there a workaround or a known resolution for this behavior?
0
0
49
May ’25
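For the selection behavior described above, one thing worth ruling out is a CPListItem handler that never calls its completion closure, since that is a common reason rows appear to stay selected. A minimal sketch of the expected pattern, with placeholder item contents:
import CarPlay

// Each CPListItem handler receives a completion closure; calling it tells CarPlay the
// selection has been handled so the row can return to its normal state.
func makeListTemplate() -> CPListTemplate {
    let first = CPListItem(text: "First item", detailText: nil)
    first.handler = { _, completion in
        // start playback, push a template, etc.
        completion()    // omitting this is a common reason rows appear to stay selected
    }

    let second = CPListItem(text: "Second item", detailText: nil)
    second.handler = { _, completion in
        completion()
    }

    let section = CPListSection(items: [first, second])
    return CPListTemplate(title: "Items", sections: [section])
}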
.NET (NativeAOT)
What are the limitations and caveats of using .NET (NativeAOT) when porting a game from Nintendo Switch to Mac (even though both are ARM)?
0
0
128
Jun ’25
Resetting settings through the Apple Vision Pro Simulator is bugged
When you try to reset settings through the Apple Vision Pro simulator (VisionOS 2.4) you get an error "Preferences quit unexpectedly". Bug report: FB17666053
0
0
114
May ’25
Question about UX/UI in a Connect mobile app
Is it possible to change the order of cards on the Trends/Units screen so that it becomes the following: Free In-App Paid Free iOS/watchOS/tvOS Free macOS In-App iOS/tvOS In-App macOS Paid iOS/watchOS/tvOS Paid macOS
0
0
13
2w
Xcode 16.3 Pods Compilation Issue
I am on macOS 15.4.1, which is deployed with an old version of Ruby (2.6.10) and bundler version 1.17.2. I ran pod install after generating the Pods with these versions, and during application compilation several issues were reported against the Pods code.
0
0
111
May ’25
Xcode local documentation location
Where are the local documentation files located on macOS? None of the existing information on the internet is applicable anymore. I am trying to bundle documentation (*.docset) for use in an LLM, and the developer site is unusable for crawling. Thanks!
0
0
64
May ’25