I have added a Network Extension to my iOS project to use the WireGuard library. Everything worked fine up through Xcode 16, but after updating Xcode the build fails with the following error:

No such file or directory: '@rpath/WireGuardNetworkExtensioniOS.debug.dylib'

I haven't explicitly added any .dylib to my project, and the Network Extension target still builds and runs fine in Xcode 16.
Search results for "xcode github" (91,890 results found)
Quoting dutt: "As I understood it I should be able to use these from iOS 18.2."

I don't know much about this framework, but the docs for TelephonyConversationManager make it clear that it's new in iOS 26 beta. That means:
- You need to build with Xcode 26 beta.
- Your app will either need to require iOS 26…
- Or you'll have to conditionalise this code so that it only runs on iOS 26 (Claude31's snippet shows one approach for that).

Share and Enjoy — Quinn "The Eskimo!" @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
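For reference, a minimal sketch of that availability-gating approach. The function name is a placeholder, and the comments only mark where the iOS 26 telephony calls would go, since the thread does not show the exact API shape:

```swift
import LiveCommunicationKit

/// Placeholder entry point: route to the new telephony APIs only on iOS 26.
func startCellularCall() {
    if #available(iOS 26.0, *) {
        // iOS 26 beta and later: TelephonyConversationManager and
        // StartCellularConversationAction are present at runtime; call them here.
    } else {
        // Earlier systems (e.g. the iOS 18.2 deployment target in the thread):
        // fall back to the existing calling flow.
    }
}
```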
Topic: App & System Services
SubTopic: General
I have no idea why some files can be created and others are just PROPOSALs. I asked ChatGPT (via Xcode) to create a slew of code, and after 5 files were created, the rest were just proposals. They were all in the root of the project, and they all looked the same to me.
Topic: Developer Tools & Services
SubTopic: Xcode
I didn't check this out specifically on earlier betas, but AI is allowed to create new .swift files, yes? I'm only seeing proposals to create files (as opposed to code, which it can change automatically), and the CREATE FILE button to the right of any proposed file creation does nothing. Next I'll mention running local models: file creation does not seem to happen for either ChatGPT or any local model. I'm also experimenting with LM Studio, and it is reporting that my client is timing out. So I guess Xcode is not waiting long enough for a response? The local models are slow, yes, but is there a setting for the AI timeout value? I told the LLM "Every 1 minute send me an update so I know you are still working so my client does not time out", which seems to have no visible effect; it hasn't timed out yet, but I don't actually see that message.
It works the same as in Xcode 16.4, and the settings are the same by default (compared in Xcode 16.4 and Xcode 26ß7).
Topic: Developer Tools & Services
SubTopic: Xcode
No, it doesn't. Xcode 26 just opens a new tab, whereas Xcode 16 opens a new split pane.
Topic: Developer Tools & Services
SubTopic: Xcode
I don't know if I'm talking about the same thing. Cmd-Option-click on a class name such as NSWindow or UIWindow, or on a protocol such as UIApplicationDelegate: this should open the class or protocol in a second pane. Is that what you mean? If so, I tested it in both Xcode 16.4 and 26ß7, and it works as described in both.
Topic: Developer Tools & Services
SubTopic: Xcode
Thanks for the clarification. This code works:

let app = XCUIApplication()
app.launchArguments = ["-com.apple.TipKit.HideAllTips", "1"]

My mistake was writing it as one string ("-com.apple.TipKit.HideAllTips 1"), and Xcode did not split it into key and value...
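For context, a minimal sketch of where that argument pair fits in a UI test; the test class name and the assertion are placeholders:

```swift
import XCTest

final class TipsHiddenUITests: XCTestCase {
    func testLaunchWithAllTipsHidden() {
        let app = XCUIApplication()
        // Key and value are passed as two separate launch arguments,
        // not as one combined "-com.apple.TipKit.HideAllTips 1" string.
        app.launchArguments = ["-com.apple.TipKit.HideAllTips", "1"]
        app.launch()

        // Placeholder check: the app launched into the foreground.
        XCTAssertEqual(app.state, .runningForeground)
    }
}
```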
Topic: UI Frameworks
SubTopic: SwiftUI
I'm trying to create a dialer app for iOS that will make verified cellular (not VoIP) calls by registering the calls on my server, with an option for offline passphrase verification. This means I want to build a dialer with a nice UX, so I'm trying to use the new default dialer capability. I've read https://developer.apple.com/documentation/livecommunicationkit/preparing-your-app-to-be-the-default-dialer-app which links to https://developer.apple.com/documentation/livecommunicationkit/startcellularconversationaction for starting a call, but when I try to actually use it in my app I get "Cannot find type 'TelephonyConversationManager' in scope" and similar errors, despite importing LiveCommunicationKit. Is there a default dialer example app and Xcode project I can look at for how this should be set up? As I understood it, I should be able to use these from iOS 18.2, and I'm targeting that version in my project. The page for StartCellularConversationAction says Beta 26.0 though, have I misunderstood some
I have the same issue with Xcode 26 beta 7 and iOS 26 beta 9. Apple's sample app, which has Live Activities implemented, can show alarms when the device isn't locked. Is it required to implement a Live Activity to show alarms when the device isn't locked?
Topic: UI Frameworks
SubTopic: SwiftUI
Hi @farhang_omi, would you mind double-checking that you are using the same beta of Xcode as the OS you're building and running on? If you use Xcode 26 beta 6 to build and run on macOS 26 beta 6, this issue shouldn't happen.
Topic: Machine Learning & AI
SubTopic: Foundation Models
Having the exact same problem here, but it never works for me. Tried re-running, changing Xcode versions, etc. Even tried to install Metal using a pre_build script. It stopped working this week.
Topic: Developer Tools & Services
SubTopic: Xcode Cloud
The Foundation Models framework worked perfectly on macOS 26 Beta 2, but starting from Beta 3 and continuing through Beta 6 (latest), I get dyld symbol errors even with the exact code from Apple's documentation.

Environment:
- macOS 26.0 Beta 6 (25A5351b)
- Xcode 26 Beta 6
- M4 Max MacBook Pro
- Apple Intelligence enabled and downloaded

Error Details:

dyld[Process]: Symbol not found: _$s16FoundationModels20LanguageModelSessionC5model10guardrails5tools12instructionsAcA06SystemcD0C_AC10GuardrailsVSayAA4Tool_pGAA12InstructionsVSgtcfC
Referenced from: /path/to/app.debug.dylib
Expected in: /System/Library/Frameworks/FoundationModels.framework/Versions/A/FoundationModels

Code Used (Exact from Documentation):

import FoundationModels

// This worked on Beta 2, crashes on Beta 3+
let model = SystemLanguageModel.default
let session = LanguageModelSession(model: model)
let response = try await session.respond(to: "Hello")

What I've Verified:
- FoundationModels.framework exists in /System/Library/Frameworks/
- Framework is properl
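As a side note, here is a self-contained sketch of the same call pattern with a runtime availability check added. This is an assumption about defensive structure only; it does not fix the dyld symbol mismatch itself, which comes from building against a newer SDK than the installed OS beta provides:

```swift
import FoundationModels

// Hedged sketch: only open a session when the system model reports itself available.
func askModel(_ prompt: String) async throws -> String? {
    let model = SystemLanguageModel.default
    guard case .available = model.availability else {
        // Apple Intelligence disabled, model not downloaded, or device unsupported.
        return nil
    }
    let session = LanguageModelSession(model: model)
    let response = try await session.respond(to: prompt)
    return response.content
}
```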
Topic: Machine Learning & AI
SubTopic: Foundation Models
Hello, I am trying to access the Apple Music Feed API, but I am receiving a 401 Unauthorized error whenever I try to access it. I have tried using my own code to generate a JWT and directly call the API (that code can call the standard Apple Music API successfully):

> GET /v1/feed/song/latest HTTP/2
> Host: api.media.apple.com
> user-agent: insomnia/2023.5.8
> authorization: Bearer [REDACTED]
> accept: */*

< HTTP/2 401
< content-type: application/json; charset=utf-8
< content-length: 0
< x-apple-jingle-correlation-key: AV5IOHBNM2UUJVOFQ4HZ2TGF6Q
< x-daiquiri-instance: daiquiri:10001:daiquiri-all-shared-ext-7bb7c9b9bb-r459v:7987:25RELEASE91:daiquiri-amp-kubernetes-shared-ext-ak8s-prod-pv4-amp-daiquiri-ingress-prod

I also tried the Apple-provided Python example code, which gives me authentication errors too:

$ python3 ./apple_music_feed_example.py --key-id NMBH[...] --team-id 3TNZ[...] --secret-key-file-path /Users/foxt/Documents/am-feed/NMBH[...
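For comparison, a minimal Swift sketch of the same request as the transcript above; the endpoint and header come from that transcript, and the token value is a placeholder:

```swift
import Foundation

// Hypothetical helper reproducing the transcript's GET with a developer-token header.
// A 401 here means the token isn't authorized for the Feed API, even if the same
// token works against the standard Apple Music API.
func fetchLatestSongFeed(developerToken: String) async throws -> (status: Int, body: Data) {
    var request = URLRequest(url: URL(string: "https://api.media.apple.com/v1/feed/song/latest")!)
    request.httpMethod = "GET"
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")

    let (data, response) = try await URLSession.shared.data(for: request)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    return (status, data)
}
```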
I have now signed out of my account in Xcode, quit the application, signed back in, and also cleared the Xcode derived data and cache. However, I am still receiving the 401 Unauthorized error.

Any operation related to Xcode Cloud (e.g., viewing the dashboard, creating a workflow) fails immediately across all Xcode projects, including brand-new empty projects. The error is consistent and always appears as:

API Invalid status code: 401. Domain: XcodeCloudCombineAPI.XCCResponseError Code: 1

System Information:
macOS Version 15.6.1 (Build 24G90)

Details of the error alert:

API Invalid status code: 401.: XCCResponseError(responseErrorType: XcodeCloudCombineAPI.XCCResponseError.XCCResponseErrorType.invalidStatusCode(XcodeCloudCombineAPI.LegacyHttpStatus.unauthorized), requestUrl: Optional(https://appstoreconnect.apple.com/ci/api/teams//apps/find?bundle_id=), traceId: Optional(2ab09bea8da9ef39), retryAfterSecsStr: nil, response: Optional( { URL: https://appsto