Problem Summary

After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.

Environment Details

Operating System: iOS 26.1 & iOS 26.2
Framework: RealityKit
Xcode Version: 16.2 (16C5032a)

Expected vs. Actual Behavior

Expected: Particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
Actual: Particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.

Steps to Reproduce

1. Create or open an AR application with RealityKit that uses particle components
2. Attach a ParticleEmitterComponent to an entity via a custom system
3. Run the application on iOS 26.1 or iOS 26.2
4. Observe that particles render at
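Below is a minimal sketch of the kind of setup described above: a custom System that attaches a ParticleEmitterComponent to entities carrying a marker component. The marker type, system name, and emitter settings are illustrative assumptions, not taken from the original project.

    import RealityKit

    // Hypothetical marker used to tag entities that should emit particles.
    struct SparkMarkerComponent: Component {}

    // Illustrative custom System that attaches a ParticleEmitterComponent
    // to every entity carrying SparkMarkerComponent.
    struct SparkSystem: System {
        static let query = EntityQuery(where: .has(SparkMarkerComponent.self))

        init(scene: Scene) {}

        func update(context: SceneUpdateContext) {
            for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
                guard !entity.components.has(ParticleEmitterComponent.self) else { continue }

                var emitter = ParticleEmitterComponent()
                emitter.emitterShape = .sphere
                emitter.birthLocation = .surface
                // Particles are expected to spawn at this entity's own transform;
                // on iOS 26.1/26.2 they reportedly appear offset from it instead.
                entity.components.set(emitter)
            }
        }
    }

In a real app, SparkMarkerComponent.registerComponent() and SparkSystem.registerSystem() would still need to be called once at startup.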
Search results for "xcode github"
93,987 results found
Just checked new Xcode 26.2 with iOS 26.2 and the debug builds' startup time indeed improved significantly. We still have ~7 seconds of initial sluggishness (difference between debug executable on/off), but it's way better than it was before! 👍 While I appreciate the movement in the right direction, this still represents a substantial regression from the prior baseline, ~1s in Xcode 16. A reduction in severity isn't a resolution. If Xcode is to remain viable for professional workflows, startup and initial responsiveness need to return to the Xcode 16 baseline.
Topic: Developer Tools & Services
SubTopic: Xcode
Tags:
Just checked new Xcode 26.2 with iOS 26.2 and the debug builds' startup time indeed improved significantly. We still have ~7 seconds of initial sluggishness (difference between debug executable on/off), but it's way better than it was before! 👍 That's not an improvement, that's a partial rollback of a huge regression. On Xcode 16 it was ~1s. That was the baseline. Seven seconds of launch plus long initial sluggishness in debug is still absurd. Stop celebrating "less bad." This shouldn't be acceptable, and Xcode 26 is still unusable for any serious work.
Topic: Developer Tools & Services
SubTopic: Xcode
Tags:
I have found that the following code runs without issue from Xcode, in either Debug or Release mode, yet crashes when run from the binary produced by archiving, i.e. what will be sent to the App Store.

    import SwiftUI
    import AVKit

    @main
    struct tcApp: App {
        var body: some Scene {
            WindowGroup {
                VideoPlayer(player: nil)
            }
        }
    }

This is the most stripped-down code that shows the issue. One can also point the VideoPlayer at a file and the same issue will occur. I've attached the crash log: Crash log. Please note that this was seen with Xcode 26.2 and macOS 26.2.
@axl411 allowsUltraConstrainedNetworkAccess — this property seems to have been added in the Xcode 26.1+ SDKs. Hopefully this means we can use URLSessionConfiguration to control whether our URLSession requests go through when an ultra constrained network is being used. Fingers crossed...
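For context, a minimal sketch of how that would presumably be used; the commented-out line is an assumption, since only allowsConstrainedNetworkAccess and allowsExpensiveNetworkAccess are long-established URLSessionConfiguration properties, and the newer name is taken from the post above rather than verified against the SDK.

    import Foundation

    let configuration = URLSessionConfiguration.default

    // Long-standing switches for Low Data Mode and expensive (e.g. cellular) paths.
    configuration.allowsConstrainedNetworkAccess = false
    configuration.allowsExpensiveNetworkAccess = false

    // Assumption: if the property mentioned above ships on URLSessionConfiguration
    // in the iOS 26.1+ SDK, it would presumably be set the same way.
    // configuration.allowsUltraConstrainedNetworkAccess = false

    let session = URLSession(configuration: configuration)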
Topic: App & System Services
SubTopic: Networking
Tags:
I'm adapting for iOS 26, and I found that when pressing and slowly swiping the screen, the section header near the bottom of the navigation bar in the table view flickers frequently. How can I fix this issue? Please see the video in the GitHub project: issue.mp4
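A minimal, assumed reproduction sketch (not the original poster's code; the class name is made up): a plain UITableViewController with titled section headers, embedded in a UINavigationController, where one can press and slowly swipe near the navigation bar.

    import UIKit

    // Assumed minimal repro: a plain table view with section headers.
    // Embed in a UINavigationController, then press and slowly swipe near
    // the navigation bar to look for the header flicker described above.
    final class HeaderFlickerViewController: UITableViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            title = "Header Flicker"
            tableView.register(UITableViewCell.self, forCellReuseIdentifier: "cell")
        }

        override func numberOfSections(in tableView: UITableView) -> Int { 20 }

        override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int { 5 }

        override func tableView(_ tableView: UITableView, titleForHeaderInSection section: Int) -> String? {
            "Section \(section)"
        }

        override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
            let cell = tableView.dequeueReusableCell(withIdentifier: "cell", for: indexPath)
            var content = cell.defaultContentConfiguration()
            content.text = "Row \(indexPath.row)"
            cell.contentConfiguration = content
            return cell
        }
    }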
It seems Xcode's predictive code completion model is censored. Specifically, when typing the word torrent, the model stops working completely. It doesn't matter whether the word is written directly in the code or in a comment. It could also be part of another word, such as qBittorrent. In either case, the model stops working. Reproducing this issue is fairly simple: create a Swift file and type the word torrent. The model will stop generating code.

Xcode Version 26.2 (17C52)
Predictive Code Completion Model:
[com.apple.fm.code.generate_small_v2.base: 700.0.81600.13.202379,0]
[com.apple.fm.code.generate_safety_guardrail.base: 1.6.81619.13.202072,0]
[com.apple.gm.safety_deny.input.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
[com.apple.gm.safety_deny.output.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
(Installed)
After updating iPad/iPhone devices from iOS 18 to iOS 26, PhotogrammetrySession intermittently crashes during photogrammetry processing. The same workflow was stable on iOS 18 with no code changes to the app.

Environment:
OS versions: Works on OS 18, crashes on OS 26
Device: iPad/iPhone (reproducible across devices)
Source images: ~170-200 JPG files at 2160 x 3840 resolution

Reproduction: The crash occurs consistently on the second or third sequential run of the photogrammetry session with the same image set. First run typically succeeds.

Crash details: Xcode shows an uncaught exception during image processing:

terminating due to uncaught exception of type std::bad_alloc: std::bad_alloc
VTPixelTransferSession 420f sid 269 (2160.00 x 3840.00) [0.00 0.00 2160 3840] rowbytes( 2160, 2160 ) Color( (null), 0x0, (null), (null), ITU_R_601_4 ) => 24 sid 19 (2160.00 x 3840.00) [0.00 0.00 2160 3840] rowbytes( 6528 ) Color( 0x0, (null), (null), (null) )

This appears to be a
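A hedged sketch of driving PhotogrammetrySession back-to-back over the same image folder, roughly matching the reproduction described above; the URLs, detail level, and run count are placeholder assumptions, not the poster's actual code.

    import Foundation
    import RealityKit

    // Runs photogrammetry over the same image folder several times in a row,
    // mirroring the "second or third sequential run" reproduction above.
    // The reported std::bad_alloc is an uncaught C++ exception that terminates
    // the process during processing, so it never surfaces as a Swift error here.
    func runRepeatedReconstructions(imagesURL: URL, outputDirectory: URL, runs: Int = 3) async throws {
        for run in 1...runs {
            let session = try PhotogrammetrySession(input: imagesURL,
                                                    configuration: PhotogrammetrySession.Configuration())
            let modelURL = outputDirectory.appendingPathComponent("model-\(run).usdz")
            try session.process(requests: [.modelFile(url: modelURL, detail: .medium)])

            outputLoop: for try await output in session.outputs {
                switch output {
                case .processingComplete:
                    print("Run \(run) finished")
                    break outputLoop
                case .requestError(_, let error):
                    print("Run \(run) failed: \(error)")
                    break outputLoop
                default:
                    break
                }
            }
        }
    }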
Topic: Graphics & Games
SubTopic: RealityKit
Hi, we designed a Live Activity for our app. We find that on iOS 26 the widget does not follow the correct system display mode (light mode or dark mode) and always displays in dark mode. When our app runs on other systems, such as iOS 17 or iOS 18, it works fine. I found that another developer posted a topic about this three months ago, but there doesn't seem to be any new response to that feedback: https://developer.apple.com/forums/thread/799684?answerId=857377022#857377022 Does anyone have an idea? Thanks.

Below is my code template:

    struct BroadcastLiveActivityBackgroundView: View {
        @Environment(\.colorScheme) var colorScheme: ColorScheme

        var body: some View {
            LinearGradient(
                stops: [
                    Gradient.Stop(color: LiveActivityColor.backgroundColors(self.colorScheme).last!, location: 0.00),
                    Gradient.Stop(color: LiveActivityColor.backgroundColors(self.colorScheme).first!, location: 1.00),
                ],
                startPoint: UnitPoint(x: 1, y: 0),
                endPoint: UnitPoint(x: 0.82, y: 1.11)
            )
        }
    }
Looks like it was fixed in Xcode 26.2
Topic: UI Frameworks
SubTopic: UIKit
Tags:
I am developing a simple watch app, and I use my personal watch for development with Xcode. The personal watch is a Series 10, GPS only. I have two other watches that I want to use for testing the app without needing them to be connected to Xcode. The test watches have the cellular option, and I need a cell plan per watch because the watches need to be standalone, not counting the initial setup. To get the standalone cell plan, the watches need to be configured using AWFK. Here is what I have tried and my current issues. I switch between all three watches on my phone using the Watch app. I originally tried to put the test watches in developer mode, thinking I would connect them to Xcode, but developer mode is not available when a watch is set up using AWFK. I pushed the watch app to App Store Connect, set up a TestFlight group, added the test users and my phone user, and accepted the invites. TestFlight is installed on my phone, and I see the TestFlight setup for the watch app. I set a test watch using the Watch app on the phone, run install for the
Thanks for sharing @Alexander Vasenin! @guillaume-amo asked: This is good news, but why is an update of iOS required? You can think of LLDB as a client-server architecture. In Xcode, your use of the debugger is the client side of the infrastructure involved in enabling the debugger. On the other side of the infrastructure is a server process (called debugserver) that the client is talking to. It's this component that is receiving the debugging commands from the client, managing the Mach task port that allows for a debugger to attach to a running process to inspect and manipulate its state, and sending all of the information and commands back and forth to the LLDB client in Xcode for you to view. While on iOS, the debug server happens to be on a different device than where the debugger client is running, the same architecture is at play if you are working on a macOS app. If you are debugging on the same Mac as where Xcode is running, the client and server components just so happen to
Topic: Developer Tools & Services
SubTopic: Xcode
Tags:
Does Apple suggest that developers transition to arm64-only and drop support for arm64_32 devices? No — Apple Watch Series 9 and later, and Apple Watch Ultra 2 and later all support the arm64 architecture. There are many Apple Watch devices that run watchOS versions your app likely supports beyond that list, so you need to keep the arm64_32 architecture around for those devices. As noted in the announcement, you'll want to use the Standard Architectures build setting for your watchOS app, and that will automatically build the right set of supported architectures for your app, which will include arm64_32 as well as arm64. If we add support for both arm64_32 and arm64, the binary will almost certainly exceed the 75 MB app size limit, and potentially violate the size constraints for each architecture slice as well. What size constraints per architecture are you thinking of? The Maximum build file sizes documentation lists such a requirement for iOS apps in the iOS 7 and 8 time frame, but that isn't relevant for
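For illustration only (not something the reply above recommends), here is a small sketch showing the practical difference between the two slices at runtime: arm64_32 is an ILP32 environment, so pointers are 4 bytes there, while the arm64 slice uses 8-byte pointers.

    import Foundation

    // Logs which kind of slice the binary is currently running as; useful when a
    // watchOS build produced with the Standard Architectures setting contains
    // both arm64_32 and arm64. Purely illustrative.
    func logActiveArchitectureSlice() {
        let pointerBytes = MemoryLayout<UnsafeRawPointer>.size
        print("Pointer width: \(pointerBytes * 8)-bit",
              pointerBytes == 4 ? "(arm64_32-style slice)" : "(arm64 slice)")
    }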
Topic: App Store Distribution & Marketing
SubTopic: General
The issue can be reproduced using the simplest code. In Xcode 26 + iOS 26, when a UIBarButtonItem is created using a UIImage, it consistently prints numerous constraint conflict warnings to the console. Below is my test code and the console warnings:

    let btn = UIBarButtonItem(systemItem: .trash)
    self.toolbarItems = [btn]

Unable to simultaneously satisfy constraints.
Probably at least one of the constraints in the following list is one you don't want.
Try this:
(1) look at each constraint and try to figure out which you don't expect;
(2) find the code that added the unwanted constraint or constraints and fix it.
( , , , )
Will attempt to recover by breaking constraint
Make a symbolic breakpoint at UIViewAlertForUnsatisfiableConstraints to catch this in the debugger.
The methods in the UIConstraintBasedLayoutDebugging category on UIView listed in may also be helpful.
Reproducible with Xcode 26.2 (17C52) FB21322904
Topic: Developer Tools & Services
SubTopic: Xcode Cloud
Tags: