Hi,
On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor).
In a VStack, you just have to add .accessibilityAddTraits(.isHeader) to your header view (see the sketch below). However, if your view is in a LazyVStack, that won't work while the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.
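For reference, this is the pattern that works in a plain VStack (a minimal sketch; the section title and rows are placeholders):

    ScrollView {
        VStack(alignment: .leading) {
            Text("Section title")
                .font(.headline)
                .accessibilityAddTraits(.isHeader) // shows up in the headings rotor
            // ... rows for this section ...
        }
    }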
There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following.
    .accessibilityRotor(.headings) {
        ForEach(entries) { entry in
            // entry.id must be the same as the id of the SwiftUI view it is about
            AccessibilityRotorEntry(entry.name, id: entry.id)
        }
    }
It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you navigate to the screen. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the moment the screen appears. You have to move the accessibility focus inside the screen before your headers show up.
I'm a beginner when it comes to VoiceOver, so I don't know how a blind user accustomed to VoiceOver would perceive this, but it seems to me that having to move the focus before the headers appear in the headings rotor means some users would miss them.
So my question is: is there a way for headers inside a LazyVStack (which are not necessarily visible at first) to be in the headings rotor as soon as the screen appears? (be it using .accessibilityRotor(.headings) or anything else)
The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 covers custom rotors rather than system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied to an accessibility container, so just in case I tried moving my .accessibilityRotor(.headings) to several places, with and without the .accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior. (I also could not understand why accessibilityRotor does not automatically turn the view it is applied to into an accessibility container when needed.) One of the placements I tried is sketched below.
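A sketch of one such placement (EntryRow and the isHeader flag on my entry model are placeholder names):

    ScrollView {
        LazyVStack {
            ForEach(entries) { entry in
                EntryRow(entry: entry)
                    .id(entry.id) // matches the id passed to AccessibilityRotorEntry
            }
        }
    }
    .accessibilityElement(children: .contain) // tried with and without this
    .accessibilityRotor(.headings) {
        ForEach(entries.filter(\.isHeader)) { entry in
            AccessibilityRotorEntry(entry.name, id: entry.id)
        }
    }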
Also, a related question: on a given screen, is it fine to mix .accessibilityAddTraits(.isHeader) with .accessibilityRotor(.headings), or to use several .accessibilityRotor(.headings) modifiers in different places? On a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reuse harder; see the sketch below.
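For concreteness, the kind of mixed structure I mean (MyHeader, FirstSectionRows, SecondSectionRows, and the entry arrays are placeholders):

    ScrollView {
        VStack {
            MyHeader()
                .accessibilityAddTraits(.isHeader) // always visible, the trait alone works

            LazyVStack {
                FirstSectionRows()
            }
            .accessibilityRotor(.headings) {
                ForEach(firstEntries) { entry in
                    AccessibilityRotorEntry(entry.name, id: entry.id)
                }
            }

            LazyVStack {
                SecondSectionRows()
            }
            .accessibilityRotor(.headings) {
                // same pattern for the second section's headers
                ForEach(secondEntries) { entry in
                    AccessibilityRotorEntry(entry.name, id: entry.id)
                }
            }
        }
    }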
Thanks
                    
                  
Posts under iOS tag
200 Posts
                    
Hi, I've recently installed the iOS 26 developer beta, and I'm experiencing heating issues on my iPhone. Is there a fix for this? Also, is it a bug or a feature that certain sections in the Photos app can only be rearranged but not removed? Please let me know. Thanks.
                    
                  
                
                    
It's very annoying, but on my iPhone 12 Pro the accessibility app keeps opening by itself with the microphone on, shows a blank screen, and reopens every time I close it. I don't know why it keeps doing this, but it drives me crazy. Does anyone know what to do? I'm on the iOS 26 beta, but it was doing this even with the previous update.
                    
                  
                
                    
                      As part of the WWDC25 Keynote, a technology was announced that can present 2D images as 3D spatial scenes. This announcement is supported by a Press Release.
...developers can use the Spatial Scene API to make their app experience even more immersive. Zillow is taking advantage of the API for their Zillow Immersive app, allowing users to see images of homes and apartments with the rich depth and dimension that spatial scenes offer.
The feature also appears in the Photos App on iOS 26 Developer Beta 1. Tapping "Spatial Scene" on any photo opens a view of that photo with a parallax effect. I've searched the WWDC sessions and new documentation and have come up short. Reaching out here for help.
Is there any documentation for Spatial Scene API? Or any guidance on how to implement the spatial scene in iOS?
                    
                  
                
                    
                      iOS 26 added smoothness to CIRoundedRectangleGenerator, for use with CIFilter.roundedRectangleGenerator. What should the smoothness value be to achieve the same corner curve as CALayerCornerCurve.continuous? Does it need to be calculated based on the extent size, if so, how?
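For context, the generator is set up roughly like this (a sketch; smoothness is the new iOS 26 property referred to above, and the values are placeholders):

import CoreImage
import CoreImage.CIFilterBuiltins

let generator = CIFilter.roundedRectangleGenerator()
generator.extent = CGRect(x: 0, y: 0, width: 300, height: 200)
generator.radius = 40
generator.color = CIColor.white
// New in iOS 26: which value here matches CALayerCornerCurve.continuous,
// and does it depend on the extent size?
generator.smoothness = 0.5 // placeholder value
let output = generator.outputImage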
                    
                  
                
                    
I've been testing iOS 26 since it came out, and now all of a sudden it only boots into recovery and won't do anything else. Recovery says no known issues were found.
                    
                  
                
                    
                      Hi everyone,
I’m currently testing iOS 26 on my iPhone as part of the developer program. According to Apple’s documentation and demo materials, a new screenshot animation was introduced in this version. However, when I take a screenshot on my device, the animation remains the same as in previous iOS versions.
I’ve double-checked that I’m running the correct build of iOS 26, and I haven’t found any settings that might enable or disable this feature.
Is anyone else experiencing the same issue? Could this new animation be device-specific, region-limited, or require additional configuration?
Any insight would be appreciated!
Thanks in advance,
Alonso Rivera
                    
                  
                
                    
                      There’s a critical, actively exploited vulnerability in Apple’s iOS activation servers allowing unauthenticated XML payload injection:
https://cyberpress.org/apple-ios-activation-vulnerability/
This flaw targets the core activation process, bypassing normal security checks. Despite the severity, it’s barely discussed in public security channels.
Why is this not being addressed or publicly acknowledged? Apple developers and security researchers should urgently review and audit activation flows—this is a direct attack vector on device trust integrity.
Any insights or official response appreciated.
                    
                  
                
                    
Compiling the Swift code reported a syntax error, as follows.
How can I make my code compatible?
                    
                  
                
                    
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. However, this no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
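For reference, the call I'm making looks roughly like this (simplified; error handling omitted):

import Photos

func revertToOriginal(_ asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        // Per the docs, this should fail when the original content
        // is not available locally on this device.
        let request = PHAssetChangeRequest(for: asset)
        request.revertAssetContentToOriginal()
    }) { success, error in
        print("revert finished: \(success), error: \(String(describing: error))")
    }
}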
                    
                  
                
                    
                      Some users have reported an error editing portrait photo assets in my app:
The operation couldn’t be completed. (CINonLocalizedDescriptionKey error 3.)
What is that error? Will affected photos always encounter this error (due to data corruption for example) or can it be resolved in a future iOS update?
FB16241301
                    
                  
                
                    
Please fix the iOS 26 beta right away. I can't open the Software Update page in Settings, and my iPhone gets overheated while charging and is slow to charge.
                    
                  
                
                    
                      Hello everyone,
I’ve been trying to pass a URL from Safari (or any other app) into my own app via iOS extensions (similar to how if you go to a website, open the share sheet, and hit the ChatGPT app icon, it opens ChatGPT and pastes the website URL into the chat textbox), and I’m hitting a wall. I’ve attempted both a Share Extension (using SLComposeServiceViewController) and a UI-less Action Extension (using extensionContext?.open(...)), but in both scenarios, my main app never opens.
Here’s a summary of my setup:
Main App Target plist
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>CFBundleURLTypes</key>
	<array>
		<dict>
			<key>CFBundleTypeRole</key>
			<string>Editor</string>
			<key>CFBundleURLName</key>
			<string>com.elislothower.URLDisplayApp</string>
			<key>CFBundleURLSchemes</key>
			<array>
				<string>myapp</string>
			</array>
		</dict>
	</array>
	<key>LSApplicationQueriesSchemes</key>
	<array/>
</dict>
</plist>
This means my custom URL scheme is myapp://.
My app delegate (or SwiftUI’s .onOpenURL) correctly handles myapp://share?url=... if I open it directly from Safari.
Share Extension Attempt
Subclassed SLComposeServiceViewController.
Plist had com.apple.share-services as the NSExtensionPointIdentifier.
I called extensionContext?.open(deepLink) with myapp://share?url=..., but it always returned false.
Also, the UI (with Cancel/Post buttons) was overkill for my needs.
UI-less Action Extension Attempt
Created a no-UI action extension with com.apple.ui-services as NSExtensionPointIdentifier.
In my custom ActionViewController, I formed the same myapp://share?url=... deep link and called extensionContext?.open(deepLink).
The extension does appear in the share sheet, but again, open(deepLink) returns false—my main app never opens.
Confirmed Setup
I’ve tested the URL scheme in Safari: typing myapp://share?url=... directly prompts to open my app, and the URL is handled fine there.
I’ve ensured both extension Info.plists have <key>LSApplicationQueriesSchemes</key><array><string>myapp</string></array> so they can attempt to open that scheme.
Tried on both simulator and physical device. On the physical device, the main app is definitely installed and has been launched at least once.
Current Behavior
The extension logs that it forms the deep link (myapp://share?url=...) correctly.
extensionContext?.open(deepLink) fails (success == false), so the main app never opens.
I've also tried forcing the call on the main thread, simplifying the URL (like myapp://test), and checking for typos or case-sensitivity issues, but still no luck (see the sketch below).
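For reference, the call itself is roughly this (simplified; the real code builds the same myapp://share?url=... link from the shared item):

// Inside the extension (ActionViewController / request handler)
if let deepLink = URL(string: "myapp://share?url=https%3A%2F%2Fexample.com") {
    extensionContext?.open(deepLink) { success in
        // success is always false, in both extension types
        print("open returned \(success)")
        self.extensionContext?.completeRequest(returningItems: nil)
    }
}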
Is there a known iOS restriction or trick for allowing an extension (share or action) to open its containing app directly? Have I missed a configuration step or entitlement that’s necessary? Is it possible that iOS is just rejecting the call in these contexts?
I’d love any insight or suggestions from those who have successfully launched their main app from an extension. Thank you in advance!
ContentView.swift
Info.plist
URLDisplayAppApp.swift
URLDisplayApp.entitlements
ActionRequestHandler.swift
ActionViewController.swift
Info.plist
MyAppActionExtension.entitlements
                    
                  
                
                    
                      I set iOS 26 to install overnight, put my iPhone 16 Pro on the MagSafe charger, watched it charge just fine, and went to sleep. When I woke up the iPhone showed the “plug into power” dead battery screen. I took it off MagSafe and put it back on. A half hour later the phone was warm but still wouldn’t power on, just showed the battery screen with a little red in it. I took it off MagSafe and plugged it into my iPad charging brick with USB cable to give it more power, still it did not turn on. I tried holding all the buttons to try to force a restart but didn’t work.
For anyone else encountering this, do this to enter DFU mode and restore it. I had to do it a few times before I got the timing right.
Plug into your Mac and open Finder (or apparently a PC with Apple Devices or iTunes)
Press and quickly release volume up
Press and quickly release volume down
Press and hold right side button
When the battery disappears and screen goes black, hold volume down and continue holding side button
After a couple seconds release the side button and continue holding volume down
A prompt to allow connecting to the iPhone should appear after a couple seconds, click Allow, and it’ll say the iPhone entered DFU mode - proceed to restore the firmware
                    
                  
                
                    
                      Hi everyone,
I’m currently trying to create a pure backdrop blur effect in my iOS app (SwiftUI / UIKit), similar to the backdrop-filter: blur(20px) effect in CSS. My goal is simple:
•	Apply a Gaussian blur (radius ~20px) to the background content
•	Overlay a semi-transparent black layer (opacity 0.3)
•	Avoid any predefined color tint from UIBlurEffect or .ultraThinMaterial, etc.
However, every method I’ve tried so far (e.g., .ultraThinMaterial, UIBlurEffect(style:)) always introduces a built-in tint, which makes the result look gray or washed out. Even when layering a black color with opacity 0.3 over .ultraThinMaterial, it doesn’t give the clean, transparent-black + blur look I want.
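Concretely, the kind of layering I've tried looks like this (a sketch; Content is a placeholder for whatever sits behind the overlay):

ZStack {
    Content() // background content that should appear blurred

    Rectangle()
        .fill(.ultraThinMaterial)          // the built-in tint is what washes things out
        .overlay(Color.black.opacity(0.3)) // the 30% black layer
        .ignoresSafeArea()
}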
What I’m looking for:
•	A clean 20px blur effect (like CIGaussianBlur)
•	No color shift/tint added by default
•	A layer of black at 30% opacity on top of the blur
•	Ideally works live (not a static snapshot blur)
Has anyone achieved something like this in UIKit or SwiftUI? Would really appreciate any insights, workarounds, or libraries that can help.
Thanks in advance!
Ben
                    
                  
                
                    
                      I have an iPhone 14 Pro. I downloaded the iOS 26 beta and had a SERIOUS error, rendering the phone unusable.
I charged it to 60% and kept it plugged in while updating.
While updating, it restarted several times at the Apple logo and then at the Welcome screen, and there were quite a few bugs with low battery warnings.
When I turned it on, I noticed it showed 1% (which I thought was strange).
When it was plugged in, it wouldn't charge; it stayed at 1% and restarted every 2 minutes. Unplugged, it did exactly the same thing.
In the end, I had to go back to iOS 18.5; I had no problems with this version.
                    
                  
                
                    
So I downloaded and installed iOS 26 on my iPhone 15 Pro, and after that it does not want to boot anymore. It shows a battery with one red line, which is weird because before it restarted it had 80% battery and the power cable was connected the whole time. Now I can't boot it up anymore, not even with the volume up, volume down, power button trick. Please help!
                    
                  
                
                    
I have an older MacBook that only supports Xcode 15.2, and I want to be able to work with my iPhone SE 3rd gen, which currently has the iOS 26 beta on it. Is there a place I can find the download for that, and can I even run it on that version of Xcode? If not, can I download a newer version of Xcode on macOS Ventura?
                    
                  
                
                    
                      Hi All,
So I have been having trouble publishing my app on the App Store, as it keeps getting rejected by App Review, specifically under guidelines 2.3.10 and 3.1.1. I don't have any metadata for third-party services in my app or a "tip" button anywhere in my app's binary. I do, however, have external links to my project's help page and GitHub, which contain that information, and that is what I am getting rejected for.
However, I want those external links because I need the Google Play listing on the project's GitHub page, so users visiting GitHub know they can also download the app officially from those sources. It is also useful to tell users that those are the only official platforms I support, and that downloading from anywhere else is not advised. Is there an acceptable solution where the Google Play and donation links can be kept on the GitHub page? They are not really built into the binary itself anyway, so I thought they would be allowed.
Here is an link to my projects repo in case that helps clarify: https://github.com/SrS2225a/custom_uploader
Really hoping to resolve this. I’d love to get the app on the App Store as soon as possible.
                    
                  
                
                    
                      (iOS 17.3)
I'm using the Apple-supplied iOS sample project "ConfiguringAWiFiAccessoryToJoinTheUsersNetwork" as a base to write an app that configures an existing Wi-Fi device using the NEHotspotConfiguration APIs. I have almost everything working: I can join the network and send a packet to the device to configure it, and I know it is working because the device responds properly to what I send it. But I am not able to receive the device's response to the packet I sent. (I only need one packet sent and one packet received.)
However. If I run a packet sniffer on the phone before running my test App, then I do get a response. No packet sniffer running, no response.
When I do a debugDescription on the NWConnection after it reaches ".ready", I notice that when the sniffer is running I'm using loopback lo0:
[C1 connected 192.168.4.1:80 tcp, url: http://192.168.4.1:80, attribution: developer, path satisfied (Path is satisfied), viable, interface: lo0]
and I get a packet response in the NWConnection receiveMessage callback.
But with no sniffer running, I get interface en0:
[C1 connected 192.168.4.1:80 tcp, url: http://192.168.4.1:80, attribution: developer, path satisfied (Path is satisfied), viable, interface: en0[802.11], ipv4, dns, uses wifi]
and there is no callback to the receiveMessage handler and the NWconnection eventually times out.
The interface used seems to be the only difference that I can see when I have a sniffer running. Any ideas as to why I can't see a response in "normal" operation?
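For reference, the connection setup looks roughly like this (simplified; the payload and state handling in my app differ):

import Network

let requestData = Data("configure".utf8) // placeholder for the real configuration payload

let connection = NWConnection(
    host: NWEndpoint.Host("192.168.4.1"),
    port: NWEndpoint.Port(rawValue: 80)!,
    using: .tcp
)

connection.stateUpdateHandler = { state in
    if case .ready = state {
        print(connection.debugDescription) // shows lo0 vs en0 as described above
        connection.send(content: requestData, completion: .contentProcessed({ _ in }))
        connection.receiveMessage { data, _, _, error in
            // With a packet sniffer running (interface lo0) this fires;
            // without one (interface en0) it never does and the connection times out.
            print("received \(data?.count ?? 0) bytes, error: \(String(describing: error))")
        }
    }
}
connection.start(queue: .main)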