Build, test, and submit your app using Xcode, Apple's integrated development environment.

Xcode Documentation

Posts under Xcode subtopic

Post

Replies

Boosts

Views

Activity

A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers, and a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?
When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this is governed by your ChatGPT account settings, which allow you to opt out (it defaults to on). When using Xcode with accounts for other model providers, you should check the policies of your provider. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? It seems the build environment is exactly the same as a debug build, but maybe there's a trick.
Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build, and the new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those phases add significant time to your build, consider moving some of them into asynchronous steps, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files, as that is a huge way to improve your build performance.

Are we able to provide additional context for the models, like coding standards, documentation for third-party dependencies, or documentation on your own codebase that explains things like architecture?
In general, Xcode automatically searches for the right context based on the question and the evolving answer, since the model can interact with your project multiple times as it develops an answer. This automatically picks up the coding style of the code it sees, and can include files that contain architecture comments, etc. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are very aware of other kinds of automatic context, like rule files, though Xcode does not support these at this time.

Once ChatGPT is enabled for Coding Intelligence in Xcode 26 and I sign in to my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about chat conversations on Xcode development done previously in the ChatGPT Mac app?
Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or the ChatGPT app.

Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?
SwiftUI uses a declarative paradigm to define your user interface. That allows you to specify what you want, with the system translating that into an efficient representation at runtime. Unlike with traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why a view isn't doing what you want. This year, we introduced a SwiftUI instrument that shows why things are happening, like view re-rendering.

Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up that prevents data from being leaked.
Yes, Xcode 26 supports logging in to any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular chat completions REST API to talk to your enterprise account however you need.

Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?
Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, and compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can use it to influence the translucency, blur, and specular highlights for your icon.

What's one feature or improvement in the new Xcode that you personally think developers will love, but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?
One feature we're particularly excited about is the new power profiler for iOS, which gives you further insight into the energy consumption of your app beyond what was possible with the Energy instrument previously. You can learn more about how to use this instrument and how it can help you greatly reduce your app's battery usage in the documentation, as well as in the session "Profile and optimize power usage in your app". There were also improvements to accessibility this year with Voice Control: you can naturally speak your Swift code to Xcode, and it understands Swift syntax as you speak. To see it in action, take a look at the demonstration in "What's new in Xcode 26".

We have a software advisory council that is very sensitive to having our private information going to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?
One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can point Xcode at your proxy endpoint, inspect the network traffic for anything that you do not want sent outside of your enterprise, and then forward the traffic through the proxy to your chosen model provider.

Is there a list of recommended LLMs to use with Xcode via local intelligence? I've tried Gemma3-12B, but I hope there are better options.
Apple doesn't have a published list of recommended local models. This is a fast-moving space, and a recommendation would become out of date very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that you find meet your needs, and let us and the community know! (continued below)
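Editor's note: two of the answers above lean on the "chat completions" REST API, both for a local model server and for an enterprise proxy. As a hedged reference sketch (not something stated in the lab itself), here is the request/response shape such an endpoint is generally expected to speak, modeled on the widely used OpenAI-style /v1/chat/completions JSON; the model name and prompt below are placeholders.

import Foundation

// Request/response shapes in the OpenAI-style chat completions format.
struct ChatMessage: Codable {
    let role: String        // "system", "user", or "assistant"
    let content: String
}

struct ChatCompletionRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatCompletionResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]
}

// What a local server or enterprise proxy receives from the client and forwards
// upstream (after inspecting or redacting it, in the proxy scenario).
let request = ChatCompletionRequest(
    model: "my-enterprise-model",   // placeholder model identifier
    messages: [ChatMessage(role: "user", content: "Explain this Swift build error")]
)
let requestBody = try! JSONEncoder().encode(request)
print(String(data: requestBody, encoding: .utf8)!)

// Decoding an upstream reply before returning it to the client.
let sampleReply = #"{"choices":[{"message":{"role":"assistant","content":"Try a clean build."}}]}"#
let reply = try! JSONDecoder().decode(ChatCompletionResponse.self, from: Data(sampleReply.utf8))
print(reply.choices.first?.message.content ?? "")

In this setup a proxy simply decodes the incoming request, inspects or redacts the messages, and forwards the re-encoded body to the upstream provider.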
1
0
518
Jul ’25
error: unknown type name 'CFAttributedStringRef'
[OHOS ERROR] /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/CoreText.framework/Headers/CTTypesetter.h:104:27: error: cannot combine with previous 'type-name' declaration specifier
[OHOS ERROR]     CFAttributedStringRef string ) CT_AVAILABLE(macos(10.5), ios(3.2), watchos(2.0), tvos(9.0));
[OHOS ERROR]     ^
[OHOS ERROR] /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/CoreText.framework/Headers/CTTypesetter.h:132:27: error: unknown type name 'string'; did you mean 'std::string'?
[OHOS ERROR]     CFAttributedStringRef string,
[OHOS ERROR]     ^
[OHOS ERROR] ../../prebuilts/clang/ohos/darwin-arm64/llvm/bin/../include/c++/v1/iosfwd:249:65: note: 'std::string' declared here
[OHOS ERROR] typedef basic_string<char, char_traits<char>, allocator<char> > string;
[OHOS ERROR]     ^
[OHOS ERROR] In file included from ../../ide/tools/previewer/util/unix/ClipboardObjc.mm:18:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/Cocoa.framework/Headers/Cocoa.h:13:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/AppKit.framework/Headers/AppKit.h:15:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/AppKit.framework/Headers/NSAccessibilityColor.h:9:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/AppKit.framework/Headers/NSColor.h:46:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/AppKit.framework/Headers/NSApplication.h:10:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/AppKit.framework/Headers/NSResponder.h:10:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/AppKit.framework/Headers/NSEvent.h:10:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/ApplicationServices.framework/Headers/ApplicationServices.h:39:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/CoreText.framework/Headers/CoreText.h:26:
[OHOS ERROR] In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/CoreText.framework/Headers/CTFramesetter.h:21:
[OHOS ERROR] /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/CoreText.framework/Headers/CTTypesetter.h:132:5: error: unknown type name 'CFAttributedStringRef'; did you mean 'NSAttributedStringKey'?
[OHOS ERROR]     CFAttributedStringRef string,
[OHOS ERROR]     ^
[OHOS ERROR] /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSAttributedString.h:22:20: note: 'NSAttributedStringKey' declared here
[OHOS ERROR] typedef NSString * NSAttributedStringKey NS_TYPED_EXTENSIBLE_ENUM;
[OHOS ERROR]     ^
[OHOS ERROR] In file included from ../../ide/tools/previewer/util/unix/ClipboardObjc.mm:18:
0
1
366
Sep ’24
Xcode 16.0 "Cannot find (your_variable) in scope (your_scope)"
Hello! I'm trying to debug with ol' reliable print object (po), but I get an error that the variable does not exist in the current scope, even though I have a breakpoint literally on it, right before it returns from the current context (return myObject). Any ideas on why this might occur? Prior to Xcode 16, I only faced this error when I made a typo in the variable name or it actually was a different scope (I had jumped in the stack trace to another call).
0
0
230
Oct ’24
KeyedUnarchive a previously object archived with NSArchiver archivedDataWithRootObject
Hi, in my previous macOS app I used to archive a dictionary to a preference file using

[NSArchiver archivedDataWithRootObject:dictionary];

and to unarchive it using

[NSUnarchiver unarchiveObjectWithData:dataFromDisk];

Now I would like to replace the two deprecated methods with

NSError *error;
NSSet *classSet = [NSSet setWithObjects:[NSDictionary class], [NSArray class], [NSString class], [NSNumber class], [NSData class], nil];
NSDictionary *dictionary = [NSKeyedUnarchiver unarchivedObjectOfClasses:classSet fromData:dataFromDisk error:&error];

But I get a nil dictionary and the error 4864: non-keyed archive cannot be decoded by NSKeyedUnarchiver. So I guess I should first keep unarchiving the preferences dataFromDisk using the old deprecated method [NSUnarchiver unarchiveObjectWithData:dataFromDisk]; then I could use the new NSKeyedArchiver and NSKeyedUnarchiver methods for the upcoming release. But if the deprecated [NSUnarchiver unarchiveObjectWithData:dataFromDisk] fails to unarchive the old data (and on some machines it now fails), how could I use the new methods? Should I consider my old preference file gone? Is there a way to force the new NSKeyedUnarchiver method to unarchive data previously archived with NSArchiver?
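Editor's note: a hedged migration sketch in Swift (the same approach works from Objective-C). NSKeyedUnarchiver cannot read non-keyed NSArchiver data, so the old blob has to be decoded one last time with the deprecated NSUnarchiver and immediately re-saved as a keyed archive; if even NSUnarchiver fails, the old preference file is effectively unrecoverable. This assumes NSUnarchiver is still exposed to Swift on macOS (it is deprecated; if your SDK hides it, the same fallback can live in a small Objective-C helper), and preferencesURL is just a stand-in for wherever the preference file lives.

import Foundation

// Migration sketch: try the keyed unarchiver first; fall back to the deprecated
// non-keyed NSUnarchiver exactly once, then rewrite the data as a keyed archive.
func migratePreferences(from dataFromDisk: Data, rewritingTo preferencesURL: URL) throws -> NSDictionary? {
    let allowedClasses: [AnyClass] = [NSDictionary.self, NSArray.self, NSString.self,
                                      NSNumber.self, NSData.self]
    do {
        // Data written by NSKeyedArchiver decodes directly.
        return try NSKeyedUnarchiver.unarchivedObject(ofClasses: allowedClasses,
                                                      from: dataFromDisk) as? NSDictionary
    } catch {
        // Old, non-keyed NSArchiver data: decode it with the deprecated unarchiver ...
        guard let legacy = NSUnarchiver.unarchiveObject(with: dataFromDisk) as? NSDictionary else {
            return nil   // the old blob is unreadable; treat the preference file as gone
        }
        // ... and immediately re-save it in the keyed format for future launches.
        let keyedData = try NSKeyedArchiver.archivedData(withRootObject: legacy,
                                                         requiringSecureCoding: false)
        try keyedData.write(to: preferencesURL)
        return legacy
    }
}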
0
0
152
Sep ’24
AVAudioFile.processingFormat, only Float32 is allowed?
Here is some code I have to create an AVAudioFile instance based on Int16 samples.

let format = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44100.0, channels: 2, interleaved: false)!
let audioFile = try AVAudioFile(forWriting: outputURL, settings: format.settings)

When writing to the file I get the following runtime error, presumably from CoreAudio.

CABufferList.h:184 ASSERTION FAILURE [(nBytes <= buf->mDataByteSize) != 0 is false]:

I read this as a size mismatch between what is specified in the format used to create the file and the file's own internal processingFormat property, which is read-only. Here is my debugger console output showing the input format I created, along with the resulting AVAudioFile fileFormat and processingFormat properties.

(lldb) po format
<AVAudioFormat 0x300e553b0: 2 ch, 44100 Hz, Int16, deinterleaved>
(lldb) po format.settings
▿ 7 elements
  ▿ 0 : 2 elements
    - key : "AVNumberOfChannelsKey"
    - value : 2
  ▿ 1 : 2 elements
    - key : "AVLinearPCMBitDepthKey"
    - value : 16
  ▿ 2 : 2 elements
    - key : "AVFormatIDKey"
    - value : 1819304813
  ▿ 3 : 2 elements
    - key : "AVLinearPCMIsNonInterleaved"
    - value : 1
  ▿ 4 : 2 elements
    - key : "AVLinearPCMIsBigEndianKey"
    - value : 0
  ▿ 5 : 2 elements
    - key : "AVLinearPCMIsFloatKey"
    - value : 0
  ▿ 6 : 2 elements
    - key : "AVSampleRateKey"
    - value : 44100
(lldb) po audioFile.fileFormat
<AVAudioFormat 0x300ea5400: 2 ch, 44100 Hz, Int16, interleaved>
(lldb) po audioFile.processingFormat
<AVAudioFormat 0x300ea5450: 2 ch, 44100 Hz, Float32, deinterleaved>

Please note that the input format I'm using does not match either the audio file's fileFormat or processingFormat properties. The file format is interleaved even though I specified de-interleaved. This makes sense to me, as working with audio files that are growing is much easier and more efficient with interleaved data. The head-scratcher is the processingFormat. I specified Int16 samples and it is expecting Float32? According to the format settings dictionary, we are specifying the correct key/value pairs. Is this expected behavior? Does Apple always insist on Float32 internally, or is this a bug?
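Editor's note (a sketch, not an answer confirmed in this thread): AVAudioFile(forWriting:settings:) adopts the deinterleaved Float32 "standard" format as its processingFormat, and the settings dictionary only describes the on-disk file format, which matches the debugger output above. To hand the file Int16 buffers directly, the longer initializer lets you choose the processing format yourself; outputURL is a placeholder here.

import AVFoundation

// Sketch: choose the processing format explicitly so Int16 buffers can be written
// without conversion.
func makeInt16File(at outputURL: URL) throws -> AVAudioFile {
    let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                               sampleRate: 44100.0,
                               channels: 2,
                               interleaved: false)!
    // The extra parameters set processingFormat to Int16/deinterleaved; the settings
    // dictionary still controls the on-disk format (16-bit linear PCM).
    return try AVAudioFile(forWriting: outputURL,
                           settings: format.settings,
                           commonFormat: .pcmFormatInt16,
                           interleaved: false)
}
// The returned file's processingFormat now matches the Int16 buffer format, so
// audioFile.write(from:) accepts those buffers directly.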
0
0
610
Oct ’24
xxd script broken with ENABLE_USER_SCRIPT_SANDBOXING
Hey! I have a build script that uses "xxd" to embed some resources in my binary. With the ENABLE_USER_SCRIPT_SANDBOXING now basically mandated in Xcode 16 (without it, it now shows a warning), I'm trying to enable it in my build. I'm getting no errors, but the output from xxd specifically is totally empty, like it's being blocked. I tried adding /usr/bin/xxd to the list of input files, but that didn't work. Anyone know if there's a way to get this working – or, alternatively, to suppress the warning?
0
1
473
Sep ’24
iOS Build Availability in Xcode
When attempting to fix an issue that only happens on a specific iOS build, I noticed that I am not able to install that specific build on my simulator. Is there any way to get specific iOS builds that are not available through Xcode? Example: The bug I am trying to fix only happens on 17.6.1 and 17.7.1. These specific builds are not available through Xcode. Attempted Resolution: I found an iPhone14,5_17.6.1_21G101_Restore.ipsw file but was not able to import it into Xcode for use, nor was I able to use it on a physical device.
0
0
203
Nov ’24
xcodebuild fails to build target that links AppIntents.framework once SWIFT_EXEC is specified
Reasoning
I am working on a tool that does Swift code preprocessing, which is done by a custom script that gets passed as SWIFT_EXEC. This script does some magic and then calls the original Swift compiler from /usr/bin/swiftc. I am facing a rather uncommon issue.

Problem
With the release of Xcode 16, for some reason xcodebuild now forcibly does not supply the --compile-time-extraction flag to appintentsmetadataprocessor if xcodebuild is also given a SWIFT_EXEC= argument. (appintentsmetadataprocessor is a tool that is executed automatically by the Xcode build system if the app uses the App Intents feature.) Xcode 15 behaves fine in this regard and always passes the --compile-time-extraction flag to appintentsmetadataprocessor (both with and without SWIFT_EXEC). But when the --compile-time-extraction flag is not passed, appintentsmetadataprocessor fails with an error, making xcodebuild fail as well, essentially making App Intents unavailable if the SWIFT_EXEC build setting is used.

Here's how to reproduce the issue: create a new iOS Xcode project, add AppIntents.framework to the list of linked frameworks, and run xcodebuild SWIFT_EXEC=/usr/bin/swiftc from the project directory. If you're using Xcode 16 you will get a build error:

starting appintentsmetadataprocessor export
error: At least one halting error produced during export. No AppIntents metadata have been exported and this target is not usable with AppIntents until errors are resolved.
error: The operation couldn't be completed. (GeneratorBuildProductExtractor.BinaryScanningError 6.)

Alternatively, instead of running xcodebuild, one can add the user-defined build setting SWIFT_EXEC=/usr/bin/swiftc to the Xcode project target's build settings and build it from Xcode, getting the same build failure error.

Question
The question I am hoping to get an answer or a hint to is: is there any kind of workaround that would force appintentsmetadataprocessor to still get the --compile-time-extraction argument when it is launched by xcodebuild, so the build process completes successfully? I also tried passing LM_COMPILE_TIME_EXTRACTION=YES to xcodebuild, which, according to /Applications/Xcode.app/Contents/SharedFrameworks/XCBuild.framework/Versions/A/PlugIns/XCBBuildService.bundle/Contents/PlugIns/XCBSpecifications.ideplugin/Contents/Resources/AppIntentsMetadata.xcspec, should enable --compile-time-extraction; however, this setting seems to be either ignored or overridden once SWIFT_EXEC is passed, making it useless. Can this option be injected/enforced?

The corresponding issue tracking number is FB15274300. Thank you
0
3
764
Oct ’24
Shell script phases not running in parallel from xcodebuild
I tried to enable FUSE_BUILD_SCRIPT_PHASES in my project (based on https://projects.blender.org/blender/blender), and this improves the build time when I use Xcode, but not on my CI machine, which builds the same project from the command line via xcodebuild. And indeed, xcodebuild runs the shell script phases sequentially, even if I set -jobs 20. What am I doing wrong?
0
0
466
Sep ’24
Simulator not working
Since its update to iOS 18.0, I have not been able to use the Simulator in Xcode at all. It gets stuck loading for a very long time and then crashes. I managed to get it to start once, but it immediately used up all of my system's memory. I did not have any problems with the Simulator before the iOS 18.0 update. Additionally, after the Simulator crashes, the storyboard editor usually stops working as well until I restart Xcode. (I use an M1 MacBook Air with 8 GB of RAM.)
0
0
538
Sep ’24
dyld[53510]: Symbol not found - React Native app issue with running on real device
I develop a React Native app with dynamically linked pods. The app runs fine on the simulator, but running it on a connected device returns this error:

dyld[53510]: Symbol not found: __ZN5swift39swift51override_conformsToSwiftProtocolEPKNS_14TargetMetadataINS_9InProcessEEEPKNS_24TargetProtocolDescriptorIS1_EEN7__swift9__runtime4llvm9StringRefEPFPKNS_35TargetProtocolConformanceDescriptorIS1_EES4_S8_SC_E
Referenced from: <4A3492BF-0479-3124-BE58-05BAED71BB20> /private/var/containers/Bundle/Application/0D9FDF5C-BBC9-4060-972B-B2D6FD91E321/BFF.app/Frameworks/Framework1
Expected in: <0549B906-CB15-3735-AA15-FAEB5F687C8B> /private/var/containers/Bundle/Application/0D9FDF5C-BBC9-4060-972B-B2D6FD91E321/BFF.app/Frameworks/Framework2

I already tried different things:
- Different versions of IPHONEOS_DEPLOYMENT_TARGET
- Ensured that all dependencies use the same Swift version
- Different linking
- Tested on different devices and iOS versions
- Standard cleaning of derived data and reinstalling podfiles
- Also included BUILD_LIBRARY_FOR_DISTRIBUTION="YES" and ENABLE_BITCODE="NO"
0
0
460
Oct ’24
Code=4099 "Connection invalidated to streaming unzip service."
Hello, I am new to using on-demand resources in my project. It's a wonderful idea and concept, I have to say; kudos to whoever invented this! I am facing one problem that I couldn't solve so far: whenever I switch between the TestFlight, App Store, and local Xcode builds, I receive this error message:

Code=4099 "Connection invalidated to streaming unzip service."

Does anyone know what this means and how I can resolve it? I saw this other thread where it was recommended to delete and reinstall the app, but that is not always feasible, because then all user data from the app is lost as well: https://forums.developer.apple.com/forums/thread/707070

Thanks a lot for any hints!
0
1
393
Dec ’24
Error 3002 running share sheet extension on local device
I'm debugging a change to my share sheet extension, which I published to the App Store several months ago. But I can't run it on my iPhone, attached to my MacBook. I know that I did this when I originally published it but some time has passed. The error that I get is:

Failed to install the app on the device.
Domain: com.apple.dt.CoreDeviceError
Code: 3002
Failure Reason: The provided item to be installed is not of a type that CoreDevice recognizes.

Does anyone know what would cause that? tia
0
1
521
Nov ’24
CoreML not found anymore once I add C++ compile source
I have a small Swift command line tool for macOS. I have added a small demo CoreML model to the project and can correctly load the model in Swift with let model = try! Matcher_512x256(). However, once I add a single C++ or Objective-C file to the Compile Sources of my build target, the compilation fails with an error stating that the model cannot be found: "...ModelAndCpp/ModelAndCpp/main.swift:11:18 Cannot find 'Matcher_512x256' in scope". Please help me resolve this. Does Xcode switch to a different "compilation mode" once a C++ file is present, which might cause this error? The Xcode version is 16A242d. I can reproduce the issue with a tiny repository (5 small files) which I have uploaded here on GitHub. To reproduce the issue, simply remove the .m and .cpp files from Compile Sources to compile correctly, or add one of them to get the error. Thank you and best regards, Manuel
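Editor's note: not a confirmed diagnosis, but while the generated Matcher_512x256 class is out of scope, one workaround sketch is to load the compiled model through the generic MLModel API, which does not depend on Xcode's per-model code generation. The resource lookup below assumes the compiled .mlmodelc ends up alongside the tool's bundle resources.

import CoreML

// Sketch: load the compiled model directly instead of via the auto-generated class.
guard let modelURL = Bundle.main.url(forResource: "Matcher_512x256",
                                     withExtension: "mlmodelc") else {
    fatalError("Compiled model not found in the main bundle")
}
let model = try! MLModel(contentsOf: modelURL)
print(model.modelDescription.inputDescriptionsByName)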
0
0
161
Oct ’24
Undefined symbol: _OBJC_CLASS_$_ADClient
- (void)asaAttribution {
    if (@available(iOS 14.5, *)) {
        [self asaAttributionToken];
        NSLog(@"adServicesToken1:%@", [ConfigModelTool singleton].asaToken);
        [self attributionWithToken:[ConfigModelTool singleton].asaToken];
    } else {
        if ([[ADClient sharedClient] respondsToSelector:@selector(requestAttributionDetailsWithBlock:)]) {
            [[ADClient sharedClient] requestAttributionDetailsWithBlock:^(NSDictionary<NSString *,NSObject *> * _Nullable attributionDetails, NSError * _Nullable error) {
                if (!error) {
                    [self handleAsaData:attributionDetails];
                } else {
                    [StatisticsTool buriedPoint:4649 withOther:@{@"s0":[NSString checkNullString:[NSString stringWithFormat:@"%d",(int)error.code]]}];
                }
            }];
        }
    }
}

This code compiled correctly before the official release of Xcode 16, but after upgrading I now get this error. I deleted the relevant code and the build still fails with the same error. My code follows the official integration recommendations. Can someone help me see what the reason is? Save me, save me!
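Editor's note, one hedged direction sketched in Swift: the undefined _OBJC_CLASS_$_ADClient symbol is consistent with the iAd framework no longer being present in the SDK used by Xcode 16, so the legacy branch has nothing left to link against. AdServices' AAAttribution is the supported replacement for requesting an attribution token, and the ADClient fallback can simply be dropped.

import AdServices

// Sketch only: AAAttribution (AdServices, iOS 14.3+) replaces the removed iAd ADClient API.
func fetchAttributionToken() -> String? {
    if #available(iOS 14.3, *) {
        return try? AAAttribution.attributionToken()
    }
    // With the iAd framework gone from the SDK, there is no ADClient fallback
    // left to compile against for older systems.
    return nil
}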
0
0
962
Oct ’24
App Not Launching on Device After Xcode 16.1 Update
Hi everyone,

Since updating to Xcode 16.1 on macOS Sequoia 15.0.1, I’m having issues with my app not launching on my iOS device. The app finishes compiling in Xcode without any errors, but it never appears to launch on the device; it either gets stuck indefinitely or doesn’t show any progress on the device screen.

Details of the Issue:
- Xcode shows that the app is launching, but there’s no progress on the device.
- Tried on multiple devices with the same result.

Troubleshooting Steps I’ve Tried:
- Cleaned the build folder and deleted derived data.
- Verified the deployment target matches the device’s iOS version.
- Checked provisioning profiles and code signing settings.
- Restarted both Xcode and my device.
- Tried connecting over both USB and Wi-Fi.

Workaround Found: Unpairing the device from Xcode, pairing it again, then turning off Wi-Fi on the device before building allows the app to launch successfully.

Has anyone else experienced this with Xcode 16.1? Any tips on a more permanent solution or other troubleshooting steps would be greatly appreciated. Thank you!
0
1
212
Nov ’24
When building for physical iPhone throws Command PhaseScriptExecution failed with a nonzero exit code
I developed my app with React Native CLI, version 0.67.2. I use Xcode 16 on macOS Sequoia 15.0.1. It builds and runs fine on any simulator, iOS 17 or iOS 18. But as soon as I build it for my iPhone 12 (iOS 17.6.1) or archive, it throws the error "Command PhaseScriptExecution failed with a nonzero exit code" at the very end; it does actually start Metro. I put my iPhone into Developer Mode and paired it with macOS. What I don't get is why it doesn't work on my iPhone when it works fine on simulators. It doesn't even archive. Has anyone encountered something like this in the past?
0
0
696
Oct ’24