On iPads updated to iPadOS 16.4, Safari often "loses" the session cookie provided by PlayFramework: when the browser requests assets (JavaScript files) or when additional data is fetched by JavaScript, the session cookie is not included in the request.
These secondary requests are redirected through our IAM because no session cookie is present. The IAM redirects back to the original domain with a payload so that the login session can be resumed, and a new Set-Cookie header in the response delivers a new session cookie.
This causes the framework to issue a new CSRF token (which is part of the session cookie) that differs from the old one already rendered into a hidden form input. The browser stores this new token and includes it when it POSTs the form. The token in the request body now differs from the one in the cookie, so the CSRF check fails.
We have tried different devices (Android, Windows, MacBook, and iPad) on different OS versions. The problem only occurs with Safari on iPad/MacBook running 16.4, 16.4.1, or the 16.5 beta. It cannot be reproduced with Chrome on iPad, and it does not occur in Safari private browsing.
Some things we ruled out:
Same behaviour on devices managed by MDM and on open devices.
PlayFramework version is now updated to the latest 2.8 version.
Using a separate cookie for the CSRF token (instead of the play session cookie) does not make a difference either.
Modifying the Cache-Control header to cache responses more aggressively or not at all does not help.
Has anyone else experienced this or similar problems?
Opening this question after discussing the issue in the AVCapture lab, in the hope that we can track it down.
We've been noticing some crashes in App Store Connect caused by layoutSublayers being called on a background thread.
After debugging the issue a bit we found that all calls which modified the AVCaptureSession or preview layer were indeed done on the main thread. It would be useful to see what results in AVCaptureVideoPreviewLayer.updateFormatDescription being called.
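For context, here is a simplified sketch of how we mutate the session; the names are placeholders, but every mutation in the real app goes through a path like this on the main thread:

import AVFoundation

let session = AVCaptureSession()
let previewLayer = AVCaptureVideoPreviewLayer(session: session)

// Simplified reconfiguration path; in the real app this is the only
// place inputs change, and we assert we are on the main thread.
func reconfigure(with device: AVCaptureDevice) throws {
    dispatchPrecondition(condition: .onQueue(.main))
    session.beginConfiguration()
    session.inputs.forEach { session.removeInput($0) }
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) {
        session.addInput(input)
    }
    session.commitConfiguration()
}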
I've attached the crashlog below.
Crash log.ips - https://developer.apple.com/forums/content/attachment/800b0dba-3477-4c5a-b56c-f4cc393b384f
I have noticed that in iOS 14, UIPickerView has by default a light grey background on the selected row, as shown here.
https://developer.apple.com/design/human-interface-guidelines/ios/controls/pickers/
I also noticed that pickerView.showsSelectionIndicator is deprecated in iOS 14.
Is there a way to change the background color to white and add separators to achieve the pre-iOS 14 UIPickerView style?
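For reference, this is the kind of workaround I'm experimenting with; it relies on the undocumented subview layout of UIPickerView on iOS 14, so treat it as a sketch that may break:

import UIKit

// UIPickerViewDelegate method; `titles` is a placeholder data source.
func pickerView(_ pickerView: UIPickerView, viewForRow row: Int,
                forComponent component: Int, reusing view: UIView?) -> UIView {
    // Clear the grey selection background (observed at subviews[1]).
    if pickerView.subviews.count > 1 {
        pickerView.subviews[1].backgroundColor = .clear
    }
    let label = (view as? UILabel) ?? UILabel()
    label.text = titles[row]
    label.textAlignment = .center
    return label
}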
Thank you
I pushed a configuration to my iPhone through MDM to run the content filter. However, when I modify the configuration by adding some vendor configuration, I lose the connection to the debugger and can no longer see logs or the updated configuration in Xcode; I have to build the app again. Could this be an issue with Xcode, or is it related to MDM or the configuration itself?
Our company is developing an MFi headset with a button that we would like to use for initiating PTT.
We can detect the button press and initiate PTT successfully, even when the app is not in the foreground, using the ExternalAccessory framework.
But I wonder: is this a coincidence, or a scenario that should reliably work with Push to Talk?
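For context, a simplified sketch of what we do today; the manager and channel come from a standard PushToTalk setup:

import PushToTalk

// `channelManager` was created with PTChannelManager.channelManager(
// delegate:restorationDelegate:) and the channel joined with
// requestJoinChannel(channelUUID:descriptor:).
func accessoryButtonPressed(channelManager: PTChannelManager, channelUUID: UUID) {
    // Invoked from our ExternalAccessory stream handler when the MFi
    // headset button press arrives; this works even in the background.
    channelManager.requestBeginTransmitting(channelUUID: channelUUID)
}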
I have an app that displays artwork via MPMediaItem.artwork, requesting an image at a specific size. How do I get a media item's MPMediaItemAnimatedArtwork, and how do I get its preview image and video to display to the user?
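For context, this is roughly how I request the static artwork today; it's the animated counterpart I can't find an API for:

import MediaPlayer
import UIKit

// Fetch the static artwork at a given square size.
func artworkImage(for item: MPMediaItem, side: CGFloat) -> UIImage? {
    guard let artwork = item.artwork else { return nil }
    return artwork.image(at: CGSize(width: side, height: side))
}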
Hi, I'm looking for the best way to use MLX models, particularly those I've fine-tuned, within a React Native application on iOS devices. Is there a recommended integration path or specific API for bridging MLX's capabilities to React Native for deployment on iPhones and iPads?
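To frame the question, this is the kind of native-module bridge I'd expect to write on the iOS side; the MLX call itself is a placeholder, because that's the part I can't find a documented path for (all names here are assumptions):

import Foundation
import React

// Hypothetical React Native module exposing a fine-tuned MLX model.
// Registration still needs the usual RCT_EXTERN_MODULE macro in an .m file.
@objc(MLXModelBridge)
final class MLXModelBridge: NSObject {
    @objc static func requiresMainQueueSetup() -> Bool { false }

    @objc(generate:resolver:rejecter:)
    func generate(_ prompt: String,
                  resolver resolve: @escaping RCTPromiseResolveBlock,
                  rejecter reject: @escaping RCTPromiseRejectBlock) {
        // Placeholder: load the fine-tuned MLX model and run inference here.
        resolve("model output for: \(prompt)")
    }
}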
After doing a software update, I can only view iMessages older than 12 hours (from a single contact) if the other person replies to an older message. Every time I leave the Messages app and go back in, the issue occurs again.
I'm pretty excited for the all-new iOS 26 redesign, but what I'm most excited about is the call screening feature to avoid unwanted calls.
I downloaded the developer beta on my secondary device to report bugs, but call screening doesn't work on it even though it's turned on. I already sent a feedback report, but I'm not sure whether it's just a beta bug or whether my iPhone simply doesn't support it. I haven't found much info about it.
It's a 2nd gen SE.
Thanks.
Hi,
On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor).
In a VStack, you just have to add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work while the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.
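For reference, the non-lazy case needs nothing more than this:

VStack {
    Text("Section name")
        .accessibilityAddTraits(.isHeader) // picked up by the headings rotor
    // ... rows ...
}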
There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following.
.accessibilityRotor(.headings) {
    ForEach(entries) { entry in
        // entry.id must be the same as the id of the SwiftUI view it is about
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}
It kind of works, but only partially. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you navigate to the screen. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears: you have to move the accessibility focus inside the screen before your headers show up.
I'm a beginner with VoiceOver, so I don't know how a blind user accustomed to VoiceOver would perceive this, but it seems to me that having to move the focus before the headers appear in the headings rotor means some users would miss them.
So my question is: is there a way for headers inside a LazyVStack (which are not necessarily visible at first) to be in the headings rotor as soon as the screen appears, whether using .accessibilityRotor(.headings) or anything else?
The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).
Also, a related question: when using .accessibilityRotor(.headings) on a screen, is it fine to mix uses of .accessibilityAddTraits(.isHeader) and .accessibilityRotor(.headings)? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reuse harder.
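To make that concrete, here is a sketch of the kind of mixed layout I mean (MyHeader, EntryView and entries are placeholders):

ScrollView {
    VStack {
        MyHeader()
            .accessibilityAddTraits(.isHeader) // not lazy, so the trait works
        LazyVStack {
            ForEach(entries) { entry in
                EntryView(entry: entry)
            }
        }
    }
}
.accessibilityRotor(.headings) {
    // Entries for the headers hidden inside the LazyVStack.
    ForEach(entries) { entry in
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}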
Thanks
Hi, I’ve recently installed the iOS 26 developer beta, and I’m experiencing heating issues on my iPhone. Is there a fix for this? Also, is it a bug or a feature that certain sections in the Photos app can only be rearranged, not removed? Please let me know. Thanks.
It’s very annoying, but on my iPhone 12 Pro the Accessibility app keeps opening by itself with the microphone on, showing a blank screen, and every time I close it, it just reopens. I don’t know why it keeps doing this, but it drives me crazy. Does anyone know what else to do? I’m on the iOS 26 beta, but it was doing this even on the previous update.
As part of the WWDC25 Keynote, a technology was announced that can present 2D images as 3D spatial scenes. This announcement is supported by a Press Release.
...developers can use the Spatial Scene API to make their app experience even more immersive. Zillow is taking advantage of the API for their Zillow Immersive app, allowing users to see images of homes and apartments with the rich depth and dimension that spatial scenes offer.
The feature also appears in the Photos App on iOS 26 Developer Beta 1. Tapping "Spatial Scene" on any photo opens a view of that photo with a parallax effect. I've searched the WWDC sessions and new documentation and have come up short. Reaching out here for help.
Is there any documentation for Spatial Scene API? Or any guidance on how to implement the spatial scene in iOS?
iOS 26 added a smoothness parameter to CIRoundedRectangleGenerator, for use with CIFilter.roundedRectangleGenerator. What should the smoothness value be to achieve the same corner curve as CALayerCornerCurve.continuous? Does it need to be calculated based on the extent size, and if so, how?
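For reference, a minimal sketch of the filter setup in question; the smoothness value here is a placeholder, since the mapping to the continuous corner curve is exactly what I'm asking about:

import CoreImage
import CoreImage.CIFilterBuiltins

let filter = CIFilter.roundedRectangleGenerator()
filter.extent = CGRect(x: 0, y: 0, width: 300, height: 200)
filter.radius = 40
filter.color = .white
// New in iOS 26. What value matches CALayerCornerCurve.continuous,
// and does it depend on the extent size?
filter.smoothness = 0.5 // placeholder, not a known equivalent
let output = filter.outputImage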
I've been testing iOS 26 since it came out, and now all of a sudden the phone boots only into recovery and won't do anything else. Recovery says no known issues found.
Hi everyone,
I’m currently testing iOS 26 on my iPhone as part of the developer program. According to Apple’s documentation and demo materials, a new screenshot animation was introduced in this version. However, when I take a screenshot on my device, the animation remains the same as in previous iOS versions.
I’ve double-checked that I’m running the correct build of iOS 26, and I haven’t found any settings that might enable or disable this feature.
Is anyone else experiencing the same issue? Could this new animation be device-specific, region-limited, or require additional configuration?
Any insight would be appreciated!
Thanks in advance,
Alonso Rivera
There’s a critical, actively exploited vulnerability in Apple’s iOS activation servers allowing unauthenticated XML payload injection:
https://cyberpress.org/apple-ios-activation-vulnerability/
This flaw targets the core activation process, bypassing normal security checks. Despite the severity, it’s barely discussed in public security channels.
Why is this not being addressed or publicly acknowledged? Apple developers and security researchers should urgently review and audit activation flows; this is a direct attack vector on device trust integrity.
Any insights or official response appreciated.
The Swift syntax compilation reported an error, as follows. How should I make my code compatible?
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. However, this no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
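For reference, a minimal sketch of the call in question (asset is a PHAsset that was edited on another device):

import Photos

PHPhotoLibrary.shared().performChanges({
    let request = PHAssetChangeRequest(for: asset)
    request.revertAssetContentToOriginal()
}) { success, error in
    // On recent iOS versions this succeeds even when the original content
    // was never explicitly downloaded first, which prompted the question.
    print("reverted:", success, error as Any)
}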
Some users have reported an error editing portrait photo assets in my app:
The operation couldn’t be completed. (CINonLocalizedDescriptionKey error 3.)
What is that error? Will affected photos always encounter it (due to data corruption, for example), or could it be resolved in a future iOS update?
FB16241301
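For context, a sketch of the editing flow where the error surfaces, assuming a standard PHContentEditingInput request; what triggers it inside Core Image is the part I'm asking about:

import Photos

let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true
asset.requestContentEditingInput(with: options) { input, info in
    if let error = info[PHContentEditingInputErrorKey] as? NSError {
        // Reported by users on portrait photos:
        // "The operation couldn't be completed. (CINonLocalizedDescriptionKey error 3.)"
        print(error.domain, error.code, error.localizedDescription)
    }
}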