When working with a SwiftUI TabView and a ScrollView at the root level with scrollPosition(id:anchor:), tapping the tab item scrolls the ScrollView to the top, but the scroll position ID does not get updated.
import SwiftUI

struct ContentView: View {
    @State var positionID: Int?

    var body: some View {
        TabView {
            Tab("Test", systemImage: "house") {
                ScrollView(.vertical) {
                    LazyVStack(pinnedViews: [.sectionHeaders]) {
                        ForEach(0 ... 100, id: \.self) { index in
                            Text("\(index)")
                        }
                    }
                    .scrollTargetLayout()
                }
                .scrollPosition(id: $positionID, anchor: .top)
                .onChange(of: positionID) { _, newValue in
                    // After tapping the tab item, the ScrollView scrolls to the top,
                    // but this is never called with the new position.
                    print(newValue)
                }
            }
        }
    }
}
FB15964820
Hi community.
I am trying to adapt my first-person shooter iOS game to run in the macOS environment.
I need to lock the pointer when I enter battle mode, and unlock it in the lobby.
On iOS everything works fine (with mouse and keyboard): the pointer locks and unlocks on my commands.
However, on macOS I ran into the following behavior:
after switching the pointer lock state and invoking setNeedsUpdateOfPrefersPointerLocked, the pointer is not locked immediately. To enable pointer lock, the user must click in the window.
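For reference, a minimal sketch of the toggle described above (GameViewController and the isInBattleMode flag are hypothetical stand-ins, not the exact production code):

import UIKit

class GameViewController: UIViewController {
    // Hypothetical flag; the real app drives this from battle/lobby transitions.
    var isInBattleMode = false {
        didSet { setNeedsUpdateOfPrefersPointerLocked() }
    }

    // UIKit re-queries this after setNeedsUpdateOfPrefersPointerLocked() is called.
    override var prefersPointerLocked: Bool {
        isInBattleMode
    }
}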
I checked the criteria listed in the documentation: the app is in fullscreen mode, I monitor UISceneActivationState and can confirm it is UISceneActivationStateForegroundActive, and I do not use Mac Catalyst (it is disabled in the app's capabilities). However, the pointer locks only after a click in the window, which is odd.
Can someone confirm whether this is the behavior as designed by Apple, or am I doing something wrong?
I have read the note:
"Bringing an app built with Mac Catalyst to the foreground doesn’t immediately enable pointer lock. To enable pointer lock, the user must click in the window. To exit pointer lock, users can use Command-tab to switch to another app, or using Command-tilde.", but again, I don't use Mac Catalyst.
Any hints are highly appreciated!
Best regards.
refs:
https://developer.apple.com/documentation/apple-silicon/running-your-ios-apps-in-macos
https://developer.apple.com/documentation/uikit/uiviewcontroller/3601235-preferspointerlocked?language=objc
When tapping on "password" in the accessory view above the keyboard, no password manager is opened. The keyboard just closes and re-opens.
I have made sure a password exists and is available to be used.
I have replicated this within my own app, but also from within the Settings app on the simulator (see attached gif) so I am confident it is not a coding issue on my side.
I have replicated it on both iOS 17 and iOS 18
I am using Xcode Version 16.0 (16A242d)
I am running macOS Sonoma Version 14.7 (23H124)
I am fairly confident I did not experience this when working on another client's app. I was using Xcode 15 for that, so I'm not sure whether this is something introduced with Xcode 16.
The customer's iPhone 13 Pro Max crashes when opening the app after upgrading to iOS 18.1.1. The crash information collected via TestFlight is as follows:
[[PHPhotoLibrary sharedPhotoLibrary] presentLimitedLibraryPickerFromViewController:self];
When the limited-library photo picker is presented, the navigation bar background color and the navigation bar text color are both white, so the bar is unreadable.
Using
[[UINavigationBar appearanceWhenContainedInInstancesOfClasses:@[UIImagePickerController.class]] setTintColor:[UIColor blackColor]];
has no effect.
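One hedged idea (a sketch only, shown in Swift, and unverified against the limited-library picker, which is system-hosted UI): configure the iOS 13+ UINavigationBarAppearance objects on the appearance proxy rather than only the tint color.

import UIKit

func applyPickerNavigationBarAppearance() {
    let appearance = UINavigationBarAppearance()
    appearance.configureWithOpaqueBackground()
    appearance.backgroundColor = .systemBackground
    appearance.titleTextAttributes = [.foregroundColor: UIColor.label]

    // Whether the limited-library picker honors this containment is unverified.
    let proxy = UINavigationBar.appearance(whenContainedInInstancesOf: [UIImagePickerController.self])
    proxy.standardAppearance = appearance
    proxy.scrollEdgeAppearance = appearance
}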
Hello,
I am working on an iOS app that has interactive components that react to the device accelerometer. The app works in landscape left and right only, and I need a way to identify whether the screen is in landscape-left or landscape-right to get the acceleration direction correct. Basically, acceleration toward the device's "left" should move on-screen elements either left or right depending on the screen orientation.
One solution I tried is using UIDeviceOrientation. This works in most cases, but when orientation lock is on, it reports a device-orientation change even though the screen itself didn't rotate.
Is there some other way to get the screen orientation that accurately reflects the screen orientation and not just the device orientation?
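One direction I have been considering (a hedged sketch; I have not confirmed it covers the orientation-lock case): read the interface orientation from the window scene instead of the device orientation.

import UIKit

// UIWindowScene.interfaceOrientation reflects how the UI is actually rendered,
// independent of the physical device orientation reported by UIDevice.
func interfaceOrientation(of view: UIView) -> UIInterfaceOrientation? {
    view.window?.windowScene?.interfaceOrientation
}

// Usage, e.g. when mapping the accelerometer x-axis to on-screen motion:
// let isLandscapeRight = interfaceOrientation(of: someView) == .landscapeRight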
Thank you.
I don't know what triggered this in a previously-running application I'm developing: When I have the build target set to "My Mac (designed for iPad)," I now must delete all the app's build materials under DerivedData to get the app to build and run exactly once. Cleaning isn't enough; I have to delete everything.
On second launch, it will crash without even getting to the instantiation of the application class. None of my code executes.
Also: If I then set my iPhone as the build target, the app will build and run repeatedly. If I then return to "My Mac (designed for iPad)," the app will again launch once and then crash on every subsequent launch.
The crash is the same every time:
dyld[3875]: Symbol not found: _OBJC_CLASS_$_AVPlayerView
Referenced from: <D566512D-CAB4-3EA6-9B87-DBD15C6E71B3> /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/Library/Debugger/libViewDebuggerSupport.dylib
Expected in: <4C34313C-03AD-32EB-8722-8A77C64AB959> /System/iOSSupport/System/Library/Frameworks/AVKit.framework/Versions/A/AVKit
Interestingly, I haven't found any similar online reports that mention this symbol.
Has anyone seen this behavior before, where the crash only happens after the first run... and gets reset when you toggle the target type?
I want to get the barometric pressure reading from the built-in barometer and display it in my iOS app. When I use CMAltimeter.startRelativeAltitudeUpdates(), my app receives no relative altitude update events, which means I can't get the barometric pressure from the sensor (because that's only contained in CMAltitudeData, not CMAbsoluteAltitudeData). If I use CMAltimeter.startAbsoluteAltitudeUpdates(), I get absolute altitude update events every second or so. NSMotionUsageDescription is set.
I have tried the following things, none of which has worked:
Only calling startRelativeAltitudeUpdates() and not startAbsoluteAltitudeUpdates()
Calling CMSensorRecorder.recordAccelerometer(forDuration: 0.1), as suggested in this thread
Calling CMMotionActivityManager.queryActivityStarting(from: .now, to: .now, to: .main), as suggested here
Physically moving my iPhone up and down about 200 feet using an elevator; I see absolute altitude updates which are in line with what's expected, but still receive no relative altitude update events
Calling the same APIs in a watchOS app on an Apple Watch Series 10; I see much less frequent absolute altitude updates, and still no relative altitude updates
I know the barometer sensor is working, because when I move my iPhone up and down even a foot or two indoors, I see an immediate change in the absolute altitude reading that I know wouldn't come from GPS.
This example code, when run on my iPhone 16 Pro running iOS 18.1.1, prints "updates started" and then "updating absolute" every second, but doesn't print anything else. The absolute altitude, accuracy, and authorization status fields update (and the auth status shows 3, indicating .authorized), but the relative altitude and pressure fields remain as "--".
import SwiftUI
import CoreMotion

struct ContentView: View {
    @State private var relAlt: String = "--"
    @State private var relPressure: String = "--"
    @State private var absAlt: String = "--"
    @State private var precision: String = "--"
    @State private var accuracy: String = "--"
    @State private var status: String = "--"

    var body: some View {
        VStack {
            Text("Altitude: \(relAlt) m")
                .font(.title3)
            Text("Pressure: \(relPressure) kPa")
                .font(.title3)
            Text("Altitude (absolute): \(absAlt) m")
                .font(.title3)
            Text("Precision: \(precision) m")
                .font(.title3)
            Text("Accuracy: \(accuracy) m")
                .font(.title3)
            Text("Auth status: \(status)")
                .font(.title3)
        }
        .padding()
        .onAppear {
            let altimeter = CMAltimeter()
            startRelativeBarometerUpdates(with: altimeter)
            startAbsoluteBarometerUpdates(with: altimeter)
            status = CMAltimeter.authorizationStatus().rawValue.formatted()
            print("updates started")
        }
    }

    private func startRelativeBarometerUpdates(with altimeter: CMAltimeter) {
        guard CMAltimeter.isRelativeAltitudeAvailable() else {
            relAlt = "nope"
            relPressure = "nope"
            return
        }
        altimeter.startRelativeAltitudeUpdates(to: .main) { data, error in
            if let error = error {
                print("Error: \(error.localizedDescription)")
                return
            }
            if let data = data {
                print("updating relative")
                relAlt = String(format: "%.2f", data.relativeAltitude.doubleValue)
                relPressure = String(format: "%.2f", data.pressure.doubleValue)
            } else {
                print("no data relative")
            }
        }
    }

    private func startAbsoluteBarometerUpdates(with altimeter: CMAltimeter) {
        guard CMAltimeter.isAbsoluteAltitudeAvailable() else {
            absAlt = "nope"
            print("no absolute available")
            return
        }
        // Note: this creates a second altimeter instance, shadowing the `altimeter` parameter.
        let altimeter = CMAltimeter()
        altimeter.startAbsoluteAltitudeUpdates(to: .main) { data, error in
            if let error = error {
                print("Error: \(error.localizedDescription)")
                return
            }
            if let data = data {
                print("updating absolute")
                absAlt = String(format: "%.2f", data.altitude)
                precision = String(format: "%.2f", data.precision)
                accuracy = String(format: "%.2f", data.accuracy)
            }
        }
    }
}
Is this behavior expected? How can I trigger delivery of relative altitude updates to my app?
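One thing I have not ruled out (a hedged sketch, not a confirmed fix): the CMAltimeter above is a local constant inside onAppear, so keeping a longer-lived reference might behave differently. A minimal version of that variant:

import SwiftUI
import CoreMotion

// Hedged sketch: keep the CMAltimeter alive for the lifetime of the view instead
// of creating it as a local inside onAppear. Whether this changes the missing
// relative-update behavior described above is unverified.
struct AltimeterHolderView: View {
    @State private var altimeter = CMAltimeter()
    @State private var relAlt = "--"

    var body: some View {
        Text("Relative altitude: \(relAlt) m")
            .onAppear {
                altimeter.startRelativeAltitudeUpdates(to: .main) { data, _ in
                    if let data = data {
                        relAlt = String(format: "%.2f", data.relativeAltitude.doubleValue)
                    }
                }
            }
    }
}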
Hello Apple Engineers,
Specific Issue:
I am working on a video recording feature in my SwiftUI app, and I am trying to record 4K60 video in ProRes Log format using the iPhone's internal storage. Here's what I have tried so far:
I am using AVCaptureSession with AVCaptureMovieFileOutput and configuring the session to support 4K resolution and ProRes codec.
The sessionPreset is set to .inputPriority, and the video device is configured with settings such as disabling HDR to prepare for Log.
However, when attempting to record 4K60 ProRes video, I get the error: "Capturing 4k60 with ProRes codec on this device is supported only on external storage device."
This error seems to imply that 4K60 ProRes recording is restricted to external storage devices. But I am trying to achieve this internally on devices such as the iPhone 15 Pro Max, which has native support for ProRes encoding.
Here are my questions:
Is it technically possible to record 4K60 ProRes Log video internally on supported iPhones (for example: iPhone 15 Pro Max)?
There are some third-party apps (e.g., Blackmagic 👍🏻) that can save 4K60 ProRes Log video internally on iPhone. If internal saving is supported, what additional configuration of the AVCaptureSession, or other technique, is needed to get past this limitation?
If anyone has successfully saved 4K60 ProRes Log video on iPhone internal storage, your guidance would be highly appreciated.
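For context, here is a hedged capability probe along those lines (a sketch only; it merely inspects device formats and does not lift the external-storage restriction named in the error; the chosen camera is just an example):

import AVFoundation

// List back-camera formats that support Apple Log and their maximum frame rates.
// .appleLog requires iOS 17+ and a device that supports Apple Log capture.
func logAppleLogFormats() {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
        print("No back wide-angle camera found")
        return
    }
    for format in device.formats where format.supportedColorSpaces.contains(.appleLog) {
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        let maxFPS = format.videoSupportedFrameRateRanges.map(\.maxFrameRate).max() ?? 0
        print("Apple Log format: \(dims.width)x\(dims.height) @ up to \(maxFPS) fps")
    }
}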
Thank you for your help!
When I try to log in to App Store Connect, I am redirected straight to the Users and Access page. I can't get to the Apps page (https://appstoreconnect.apple.com/apps).
I tried making the browser window very narrow, and then I found a hamburger menu in the upper-left corner. Even in that menu, only Users and Access is shown.
Hi everyone,
I'm encountering an issue while working with my iPhone running iOS 18.1.1 and Xcode 16.1.
I need to run tests on my physical device using idb (Facebook's iOS Device Bridge). However, I'm unable to proceed because the required Developer Disk Image (DDI) for iOS 18.1 is not present in Xcode's DeviceSupport directory. Here's what I've tried so far:
Verified that I have the latest version of Xcode (16.1) installed:
xcodebuild -version outputs:
Xcode 16.1
Build version 16B40
Checked the contents of /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport, and the highest iOS version available is 16.4. There is no folder for 18.1.
Tried using idb to interact with my device, but it fails with the error:
The best match /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport/16.1/DeveloperDiskImage.dmg: 16.1 is not suitable for 18.1
I understand that the DDI for each iOS version is critical for debugging and testing on physical devices. I've explored potential workarounds, including downloading the iOS 18.1 restore image, but I haven't been able to resolve the issue.
Questions:
Does Xcode 16.1 officially support the iOS 18.1 Developer Disk Image?
If not, is there an official way to download and add the missing DDI to DeviceSupport?
Could creating a sample app in Xcode targeting iOS 18.1 trigger the download of the required DDI?
I'd appreciate any advice or guidance from the community or Apple team. If there's a known resolution or an official source for missing DDIs, please let me know.
Thank you in advance for your help!
I am currently working on implementing the Live Caller ID Extension for my iOS app, and I understand that a backend server is required for this functionality. While I’ve gone through Apple’s documentation, the details on the backend setup are limited and not very clear for my backend team to implement it effectively.
Could someone provide a more detailed explanation or sample implementation of the backend server required for this extension? Specifically, we are looking for:
A clear understanding of the APIs and endpoints the backend needs to expose.
Any authentication mechanisms required for communication with the extension.
Data format (e.g., JSON structure) for requests and responses.
Example code or additional resources, if available.
Any help or guidance in understanding the exact backend requirements would be greatly appreciated.
What API do you call on iOS to ring an 0800 number (free calling)?
I want to put it at the bottom of every screen.
It would be good if it were smart and worked differently on macOS and iPadOS.
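For the dialing part itself, a hedged sketch using the standard tel: URL scheme (the number below is a placeholder):

import UIKit

// Hand the number off to the system dialer via a tel: URL.
func dial(_ number: String = "0800123456") {
    guard let url = URL(string: "tel://\(number)") else { return }
    UIApplication.shared.open(url)
}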
I'd love to meet if you are in New Zealand.
Question:
A very small number of users experience this crash:
NSInternalInconsistencyException:Use of the class INVocabulary requires the entitlement com.apple.developer.siri.
Make sure you have enabled the Siri capability in your Xcode project.
But our project definitely has the Siri capability configured and enabled.
During app startup, calling the following code causes a crash:
INVocabulary *vocabulary = [INVocabulary sharedVocabulary];
We can't figure this out. Is it a bug in the system?
Dialog: "Would you like to save this password in your Keychain to use with apps and websites"
Xcode 15.4
Simulator iOS 16.4
Rosetta
MacBook M2 Pro
=> the save-password dialog shows and works normally.
After updating to Xcode 16, the dialog does not show, which leaves the app unable to do anything else.
Xcode 16.0
Simulator iOS 18.0
Rosetta
MacBook M2 Pro
=> the save-password dialog does not show.
However, when running without Rosetta, the dialog still does not show, but you can at least still operate the app.
My app is an alarm app; if the user denies the alarm permission, it sends them to the Settings page. Will this be a problem?
Apple's review process has, in various cases, been rejecting things that previously had no problems, which is a bit confusing and makes things difficult for me.
If a server is sending a push to an app, then how can it know whether it should be sending the push using the Apple sandbox push server, or the production server?
If the app came from the App Store or TestFlight, then the push needs to go to the production server; but if the app is being run interactively via Xcode while devs are developing/testing, then the push needs to be sent via the sandbox server.
But the server itself has no idea whether the app was installed via TestFlight, the App Store, Xcode, or a development .ipa, so the server can't know how to send the push.
The app has to send the push token to the server anyway, so the app could inform the server which environment should be used. But then how can the app detect that itself?
A naive answer is to use #ifdef DEBUG to detect this, but that is incorrect: which environment a push should be sent over is not correlated with the build configuration. For example, an app could be run with a debug scheme or a release scheme, but in both cases, if the app is installed and run via Xcode, the push environment has to be the sandbox.
So my question is: is there a way the app can detect which push environment a push should be sent over, so that it can instruct the server accordingly?
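One commonly used heuristic, shown below as a hedged sketch (it is not an official API): read the aps-environment value from the embedded provisioning profile; App Store and TestFlight builds ship without an embedded profile, which itself implies the production environment.

import Foundation

// Infer the APNs environment from embedded.mobileprovision. This is a heuristic.
func detectedAPNSEnvironment() -> String {
    guard let url = Bundle.main.url(forResource: "embedded", withExtension: "mobileprovision"),
          let raw = try? String(contentsOf: url, encoding: .isoLatin1),
          let keyRange = raw.range(of: "<key>aps-environment</key>") else {
        // No embedded profile (App Store / TestFlight) or no key: assume production.
        return "production"
    }
    // Inspect the string value that immediately follows the key in the profile's plist.
    let tail = raw[keyRange.upperBound...].prefix(64)
    return tail.contains("development") ? "development" : "production"
}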
I belong to the development team of an e-commerce shop application, and we received a complaint from a small number of our customers about our application:
"Search Bar does not work on iOS 18."
This bug doesn't appear on most of our devices updated to iOS 18.0.
In some cases it disappeared after turning [Settings > Accessibility > Touch > Reachability] off.
But that is not the case for all of the customers who found the bug.
I'm trying to find out how to fix this bug and why it happens.
I'm not sure, but I suspect this may be a bug in iOS 18, UIKit, RxCocoa, RxSwift, or something else.
Any information would be welcome.
import UIKit
import RxSwift
import RxCocoa

@IBDesignable
public final class SearchBar: UISearchBar {
    var textField: UITextField {
        if #available(iOS 13.0, *) {
            return searchTextField
        } else {
            return value(forKey: "_searchField") as! UITextField
        }
    }

    private let disposeBag = DisposeBag()

    private func bind() {
        textField.rx.isFirstResponder
            .bind(to: Binder(self) { me, isFirstResponder in
                // This doesn't work in some iOS 18 devices.
                me.textField.attributedPlaceholder = placeholderAttributedString(isFirstResponder: isFirstResponder)
                me.textField.backgroundColor = isFirstResponder ? Asset.Colors.whiteTwo.color : .white
                if me.useCancelButton {
                    me.showsCancelButton = isFirstResponder
                }
                if me.useBookmarkButton {
                    me.showsBookmarkButton = !isFirstResponder
                }
            })
            .disposed(by: disposeBag)
    }

    public override init(frame: CGRect) {
        super.init(frame: frame)
        commonInit()
    }

    public required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    public override func awakeFromNib() {
        super.awakeFromNib()
        commonInit()
    }

    public override func prepareForInterfaceBuilder() {
        super.prepareForInterfaceBuilder()
        commonInit()
    }

    private func commonInit() {
        bind()
    }
}

extension Reactive where Base: SearchBar {}
import UIKit
import RxSwift
import RxCocoa

@IBDesignable
public final class SearchHeaderView: UIView {
    @IBOutlet private weak var searchBar: SearchBar!
    @IBOutlet private weak var cartContainerView: UIView!

    private let disposeBag = DisposeBag()

    public override init(frame: CGRect) {
        super.init(frame: frame)
        loadFromNib()
        commonInit()
    }

    public required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    public override func awakeFromNib() {
        super.awakeFromNib()
        loadFromNib()
        commonInit()
    }

    public override func prepareForInterfaceBuilder() {
        super.prepareForInterfaceBuilder()
        loadFromNib()
        commonInit()
    }

    private func commonInit() {
        bind()
    }

    private func bind() {
        // ↓ This doesn't work in some iOS 18 devices.
        searchBar.textField.rx.isFirstResponder
            .bind(to: cartContainerView.rx.isHidden)
            .disposed(by: disposeBag)
    }
}

extension SearchAndCartHeaderView: NibOwnerLoadable {}
Hello, all,
I'm new to iOS development and working on a project with the following setup:
Architecture:
Windows PC running Ubuntu (WSL) hosting a WebSocket Server with self-signed SSL
Python GUI application as a client to control iOS app
iOS app as another client on physical iPhone
Server running on wss://xxx.xxx.xxx.1:8001 (this is the Windows PC's mobile-hotspot IP, which the iPhone also needs to connect to)
Current Status:
✓ Server successfully created and running
✓ Python GUI connects and functions properly
✓ iOS app initially connects and communicates for 30 seconds
✗ iOS connection times out after 30 seconds
✗ Map updates from GUI don't sync to iOS app
Error Message in Xcode terminal:
WebSocket: Received text message
2024-11-25 15:49:03.678384-0800 iVEERS[1465:454666] Task <CD21B8AD-86D9-4984-8C48-8665CD069CC6>.<1> finished with error [-1001] Error Domain=NSURLErrorDomain Code=-1001 "The request timed out." UserInfo={_kCFStreamErrorCodeKey=-2103, _NSURLErrorFailingURLSessionTaskErrorKey=LocalWebSocketTask <CD21B8AD-86D9-4984-8C48-8665CD069CC6>.<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalWebSocketTask <CD21B8AD-86D9-4984-8C48-8665CD069CC6>.<1>"
), NSLocalizedDescription=The request timed out., NSErrorFailingURLStringKey=wss://xxx.xxx.xxx.1:8001/, NSErrorFailingURLKey=wss://xxx.xxx.xxx.1:8001/, _kCFStreamErrorDomainKey=4}
Technical Details:
Using iOS built-in URLSessionWebSocketTask for WebSocket connection
Self-signed SSL certificate
Transport security settings configured in Info.plist
Map updates use base64 encoded PNG data
Questions:
What's causing the timeout after 30 seconds?
How can I maintain a persistent WebSocket connection?
Why aren't map updates propagating to the iOS client?
Any guidance or suggestions would be greatly appreciated. Please let me know if additional code snippets of what I currently have would help.
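On the persistent-connection question, one thing worth trying (a hedged sketch; whether it addresses the specific -1001 timeout above is unverified) is sending periodic keep-alive pings with URLSessionWebSocketTask's sendPing:

import Foundation

// Keep-alive pings on an open URLSessionWebSocketTask.
// The 15-second interval is an arbitrary example value.
func schedulePing(on task: URLSessionWebSocketTask) {
    task.sendPing { error in
        if let error = error {
            print("Ping failed: \(error)")
            return
        }
        DispatchQueue.global().asyncAfter(deadline: .now() + 15) {
            schedulePing(on: task)
        }
    }
}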
This is my first day with iOS 18.1.1 and so far it's smooth. My only problem is how chaotic the Photos app has become with the update. For one, I don't like that the organization sections are all the way at the bottom, and even after customizing and reorganizing there's no way to move that section to the top. I also don't like that all my photos are just laid out front and center when the app is launched; it makes everything hard to look at and hard to find. Please fix this and make browsing photos enjoyable again.