I have an app that depends on a custom SDK. The custom SDK has its own dependencies; all of them are declared in the podspec, along with a custom framework/static library via ios.vendored_frameworks.
Here is a sample of my podspec:
Pod::Spec.new do |s|
  s.name = "SDK"
  s.version = "1.0.1"
  s.summary = "SDK"
  s.source = { :git => "https://github.com/ABC/SDK.git", :tag => "v" + s.version.to_s }
  s.platform = :ios
  s.ios.deployment_target = '16.0'
  s.swift_version = '4.2'
  s.static_framework = true
  # Note: assigning s.frameworks twice overwrites the first value,
  # so list both frameworks in a single assignment.
  s.frameworks = 'Security', 'CoreLocation'
  s.requires_arc = true
  s.module_name = 'SDK'
  s.library = 'z'
  s.default_subspec = 'Shared'

  s.subspec 'Shared' do |shared|
    shared.ios.vendored_frameworks = 'SDKLib.xcframework'
    shared.dependency 'CocoaAsyncSocket', '~> 7.4'
    shared.dependency 'CocoaHTTPServer'
    shared.dependency 'SocketRocket', '~> 0.6'
    shared.dependency 'QNNetDiag'
    shared.dependency 'SAMKeychain'
    shared.dependency 'AFNetworking/Reachability', '~> 4.0'
    shared.dependency 'AFNetworking/Serialization', '~> 4.0'
    shared.dependency 'AFNetworking/Security', '~> 4.0'
    shared.dependency 'AFNetworking/NSURLSession', '~> 4.0'
    shared.dependency 'CocoaMQTT'
    shared.dependency 'Starscream', '~> 4.0.8'
    shared.dependency 'TrustKit'
    shared.dependency 'Firebase/Analytics'
  end
end
Below is the error I am getting while linking the lib.
I have been trying to reach the Apple Developer team for the last month regarding my Apple Developer account enrollment process.
I have emailed the Apple Developer address but am not getting any response. After that I tried to reach them via the support form, but I still have not received any message or email from Apple regarding my issue.
We use URLSessionWebSocketTask for our web socket connection. When we get an error, we reconnect by recreating a new URLSessionWebSocketTask.
Test case: turn off Wi-Fi on the iOS device; we get URLError.notConnectedToInternet error(s). When Wi-Fi is turned back on, a new task is correctly created and connects.
This works on iOS 12, 14, 15, 16, and 17. But on iOS 18 we keep getting URLError.notConnectedToInternet and never get a correct connection.
class WebSocketManager {
    ...
    func openConnection() {
        webSocketTask?.cancel(with: .goingAway, reason: nil)
        webSocketTask = urlSession?.webSocketTask(with: urlRequest)
        webSocketTask?.resume()
        listen()
    }

    func closeConnection() {
        webSocketTask?.cancel(with: .goingAway, reason: nil)
        webSocketTask = nil
    }

    private func listen() {
        webSocketTask?.receive { [weak self] result in
            guard let self else { return }
            switch result {
            case .failure(let error):
                delegate?.webSocketManager(self, error: error)
            case .success(let message):
                switch message {
                case .string(let text):
                    delegate?.webSocketManager(self, message: .text(text))
                case .data(let data):
                    delegate?.webSocketManager(self, message: .data(data))
                @unknown default:
                    fatalError()
                }
                listen()
            }
        }
    }
}
Delegate:
func webSocketManager(_ webSocketManager: WebSocketManagerType, error: Error) {
    webSocketManager.openConnection()
}
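One workaround we are considering (a hedged sketch, not a confirmed fix for the iOS 18 behavior): instead of recreating the task immediately on every failure, wait until the network path is actually satisfied before reconnecting, using NWPathMonitor. The reconnect closure would call openConnection() from the manager above.
import Network

// Sketch: defer reconnection until connectivity is restored.
final class ReconnectGate {
    private let monitor = NWPathMonitor()

    func waitForConnectivity(then reconnect: @escaping () -> Void) {
        monitor.pathUpdateHandler = { [weak self] path in
            guard path.status == .satisfied else { return }
            self?.monitor.cancel()
            reconnect() // e.g. webSocketManager.openConnection()
        }
        monitor.start(queue: .main)
    }
}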
We're currently facing an issue where Intune is not automatically updating/downloading the updated build/app to end-user iOS devices. It's worth noting that we recently migrated the Xamarin project to a .NET-style SDK in this version. Previously, the app used to update automatically without any problems. We'd appreciate it if you could help us understand what might be causing this issue.
Hello,
My app often crashes when I use simulators. I would like some help with reading the crash report that is generated, especially the part below Thread 0 Crashed. Based on other posts I understand that the 0x8BADF00D in the crash report indicates a watchdog crash, which basically means the watchdog terminated the app because the main thread was blocked for a significant time. Many things can block the main thread, so it's hard to find out what it is in our specific case. Can someone help me read through the crash report?
Short_crash_report.txt
Background information
The application is Xamarin Native and I use Rider as an IDE. When I use Visual Studio, the simulators run just fine. No crash occurs while using my app on a device.
The crash happens on multiple simulators with different OS versions. I have already deleted the Xcode cache, erased the content and settings of several simulators, and deleted the iOS DeviceSupport files.
I would like to create an iOS app that has MTP-USB functionality.
So I have the following questions:
1. Is it possible to create an application with MTP-USB functionality using .NET MAUI?
2. If 1 is possible, how should I implement it?
The devices I plan to use are the iPhone 15 and iPhone 15 Pro.
Thanks.
We get a cookie from the server side when the user logs in successfully. The cookie is stored in the app's browser, and it needs to be cleared when the user logs out of the app.
We are using the Cordova framework to create the iOS application. In Cordova I have used a plugin to clear the cookie, but on iOS devices it is not able to clear the app browser cookie, while on Android devices the same Cordova plugin works fine.
Why is the iOS device not able to clear the cookie using the Cordova plugin?
Plugin name - https://github.com/Cartegraph/cordova-cookie-master
Kindly help me out with the solutions.
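One likely cause, offered as an assumption: on iOS, WKWebView keeps its cookies in WKWebsiteDataStore rather than NSHTTPCookieStorage, so a plugin that only clears NSHTTPCookieStorage will not touch them. A minimal native sketch of what the plugin would need to call on iOS:
import WebKit

// Sketch: remove all cookies held by WKWebView's data store.
func clearWebViewCookies(completion: @escaping () -> Void) {
    let store = WKWebsiteDataStore.default()
    store.removeData(ofTypes: [WKWebsiteDataTypeCookies],
                     modifiedSince: Date(timeIntervalSince1970: 0),
                     completionHandler: completion)
}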
I am working on a React Native application where I want to modify the native text selection menu (the menu that appears when you long-press on text). Specifically, I want to add a custom option alongside the default ones like Copy, Look Up, Translate, Search Web, and Share.
Is there a way to modify the native text selection menu inside a WebView on iOS?
How can I add a custom menu option to the default text selection menu while keeping all the default options intact?
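On the native side, one approach worth sketching (iOS 16+, assuming you can subclass the WKWebView that backs the React Native WebView; the class and action names below are hypothetical) is to override UIResponder's buildMenu(with:) on the web view, which keeps the default items and appends a custom one:
import UIKit
import WebKit

// Sketch: append a custom action to the text-selection edit menu.
class CustomMenuWebView: WKWebView {
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder) // keep Copy, Look Up, Translate, Share, etc.
        let custom = UIAction(title: "My Action") { [weak self] _ in
            // Read the current selection from the page.
            self?.evaluateJavaScript("window.getSelection().toString()") { result, _ in
                print("Selected text:", result as? String ?? "")
            }
        }
        builder.insertChild(UIMenu(options: .displayInline, children: [custom]),
                            atEndOfMenu: .root)
    }
}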
Does anyone know how to reduce the padding between the list section header (plain style) and the search bar? I have tried every method I could find on Google, but none of them work. The default list style does not have this big padding/space between the section header and the search bar.
struct Demo: View {
    @State private var searchText: String = ""

    var body: some View {
        NavigationStack {
            List {
                Section {
                    ForEach(0..<100) { index in
                        Text("Sample value for \(index)")
                    }
                } header: {
                    Text("Header")
                        .font(.headline)
                }
            }
            .listStyle(.plain)
            .navigationTitle("Demo")
            .navigationBarTitleDisplayMode(.inline)
            .searchable(text: $searchText)
        }
    }
}
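For what it's worth, one unverified workaround (a sketch only, not a confirmed fix; the -20 constant is a guess to tune, and the behavior may vary by OS version) is to pull the header up with negative top padding:
import SwiftUI

// Unverified workaround sketch: compensate for the gap manually.
struct DemoCompact: View {
    @State private var searchText: String = ""

    var body: some View {
        NavigationStack {
            List {
                Section {
                    ForEach(0..<100) { index in
                        Text("Sample value for \(index)")
                    }
                } header: {
                    Text("Header")
                        .font(.headline)
                        .padding(.top, -20) // tune; reduces the gap below the search bar
                }
            }
            .listStyle(.plain)
            .searchable(text: $searchText)
        }
    }
}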
I have a home widget with buttons (new in iOS 17).
In order to prevent taking action if the user taps on the widget buttons accidentally, I want to ask the user for confirmation.
It appeared that requestConfirmation would be exactly what I needed, but no confirmation view shows up when I invoke this method in the perform function.
I have tried the following:
try await requestConfirmation(result: .result(dialog: "Are you sure you want to do this?") {
    Image(.mdlsWhite)
})
and this alternative:
let confirmed: Bool = try await $name.requestConfirmation(for: self.name,
                                                          dialog: IntentDialog(stringLiteral: msg))
Neither option works.
I am starting to think that requestConfirmation is not meant to be used with home screen widgets.
Is there a better way to handle confirmations for buttons included in a home screen widget?
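One pattern that sidesteps requestConfirmation entirely (a hedged sketch under assumptions; all names, including the app group suite, are hypothetical) is a two-tap confirmation: the first tap flips a "pending" flag and reloads the widget, which then renders a "Tap again to confirm" button whose intent carries confirmed = true.
import AppIntents
import WidgetKit

// Sketch: two-tap confirmation for a widget button.
struct ConfirmableAction: AppIntent {
    static let title: LocalizedStringResource = "Confirmable Action"

    @Parameter(title: "Confirmed")
    var confirmed: Bool

    func perform() async throws -> some IntentResult {
        if confirmed {
            // Perform the real work here.
        } else {
            // Persist a "pending confirmation" flag and reload the timeline so
            // the widget view can switch to a confirm button.
            UserDefaults(suiteName: "group.example")?.set(true, forKey: "pendingConfirm")
            WidgetCenter.shared.reloadAllTimelines()
        }
        return .result()
    }
}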
Hello, dear forum members! I have a serious (for me personally) question for you.
I have a personal iPhone 15 Pro Max, updated to the latest 18.3.
There are many bugs and glitches in my work applications.
I ask for your help: I need to find (or generate) a signature for an iOS 17.5/17.6/17.7 IPSW to flash my phone.
I don't want to get rid of it, sell it, etc.
I understand two key concepts from desktop platforms:
Screen Mirroring – The same content is displayed on both the primary and external screens.
Screen Extension – The external display shows different content that complements what's on the main screen.
My question pertains to the second point: Is it possible to extend the display on iOS and iPadOS devices?
I'm referring to this Apple documentation, which explains how to extend content from an iOS/iPadOS device to an external display.
I tested this in a sample iOS Xcode project. In the iOS Simulator, I was able to detect an "external display" and present a separate UIWindow on it. However, when I tried the same on a real device (iPhone 15 connected to a MacBook Pro via cable), the external display connection was not detected.
I’d like to confirm whether screen extension is possible on a real iOS device. From my research, it appears that extension is only supported on iPadOS via Stage Manager, but I want to verify if there’s any way to achieve this on an iPhone. If so, are there any known apps that currently utilize extended display functionality on iOS?
If extension is not possible on iOS, why does the documentation mention iOS?
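For reference, the classic (pre-scene-API) approach from that documentation looks like the sketch below; note that UIScreen.didConnectNotification and setting UIWindow.screen are deprecated as of recent iOS versions in favor of the scene-based API, so this is offered only to illustrate the mechanism being tested.
import UIKit

// Sketch: present separate content on an external display when one connects.
final class ExternalDisplayObserver {
    private var externalWindow: UIWindow?

    func start() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen // deprecated; scene API preferred on iOS 13+
            window.rootViewController = UIViewController() // external content here
            window.isHidden = false
            self?.externalWindow = window
        }
    }
}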
Hi all!
I have been experiencing some issues when using the AVAudioEngine to play audio and record input while doing a voice chat (through the PTT Interface).
I noticed that if I connect any players to the audio graph or call start, the audio session becomes active (this is on iOS).
I don't see anything in the docs or the header files in the AVFoundation, but is it possible that calling the stop method on an engine deactivates the audio session too?
In a normal app this behavior seems logical, but when using PTT all activation and deactivation of the audio session must go through the framework and its delegate methods.
The issue I am debugging: when the engine with the input node tapped gets stopped, and there is a gap between stopping the input and the server replying with inbound audio to be played, something seems to get the hardware/audio session into a jammed state.
Thanks for any feedback and/or confirmation on this behavior!
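One thing worth trying (a hedged sketch, under the unconfirmed assumption that pause() leaves session activation untouched): keep one long-lived engine and pause it instead of stopping it during the gap, so that all activation/deactivation stays with the PTT framework's delegate methods.
import AVFoundation

// Sketch: pause instead of stop during gaps; session activation stays with PTT.
final class VoiceChatEngine {
    private let engine = AVAudioEngine()

    func begin() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            // Forward mic buffers to the server here.
        }
        try engine.start()
    }

    func suspend() {
        // Assumption under test: pause() should not deactivate the audio session.
        engine.pause()
    }

    func resume() throws {
        try engine.start()
    }
}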
I want to use SwiftUI and RealityView to get AR scene understanding data (ARMeshAnchor) on iOS devices with LiDAR. The only way we can do that is by using ARSession (unless there is another way).
However in previous iOS 18 builds there was this function:
https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:session:arconfiguration:)
, which worked with SpatialTrackingSession and a custom ARSession together. This function has since been removed from the RealityKit framework in the latest iOS and Xcode, but it is still in the documentation.
I also wanted to get ARFaceAnchor data which I still cannot get without ARSession, the closest I can get is by using:
let target = AnchoringComponent.Target.face
let anchoringComponent = AnchoringComponent(target, trackingMode: .predicted)
entity = Entity()
entity!.components.set(anchoringComponent)
But I still can't find a way to get the current frame (ARFrame) or the anchors ([ARAnchor]) in the view.
Alternatively, if I use this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:) and start the ARSession separately, the session (didUpdate and didAdd) only runs for a few frames before getting interrupted.
And if I completely remove SpatialTrackingConfiguration and just run the ARSession, there is still a valid tracked entity for the AnchoringComponent.Target.face component if, in the configuration for the ARSession, I use ARWorldTrackingConfiguration with face tracking. I still get updated facial data each frame, but the ARSession didUpdate and didAdd functions stop being called after the first few frames.
Interestingly, if I switch the RealityViewCameraContent.RealityViewCamera to .virtual, I get ARMeshAnchor and ARFaceAnchor data, but no camera feed (as expected). This happens with or without SpatialTrackingConfiguration.
My overarching question: what is the proper way to access ARMeshAnchors and other ARAnchors created by the system, and to track them live, while also using SwiftUI?
GitHub Repo with sample project can be found here: https://github.com/bpate75/RealityViewTesting
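For context, the plain-ARSession path being described amounts to the sketch below; whether this can coexist cleanly with RealityView/SpatialTrackingSession on current builds is exactly the open question above.
import ARKit

// Sketch: collect ARMeshAnchors with a standalone ARSession (LiDAR devices).
final class MeshAnchorCollector: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        let meshes = anchors.compactMap { $0 as? ARMeshAnchor }
        print("Added \(meshes.count) mesh anchors")
    }
}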
Hi Apple Developers,
I am currently working on a message filtering application and facing issues specifically with filtering RCS (Rich Communication Services) messages. To debug this, I created a sample app that consistently categorizes all incoming messages as "junk." However, the filtering behavior is inconsistent and not functioning as expected.
Here are the key issues observed during testing on iOS versions 18.2.1 and 18.3:
Inconsistent Filtering Behavior:
When a message is received from an unknown number, it sometimes gets moved to the Junk folder momentarily but is then immediately moved back to the main Messages inbox.
In some cases, the message does not get moved to the Junk folder at all, despite the app returning the verdict as "junk."
Duplicate Contact Tiles:
The Messages app displays two separate conversation tiles for the same mobile number, which is unexpected behavior.
For reference, my carrier partner is T-Mobile. Please let me know if you need any additional information to investigate this issue further.
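For reference, the sample extension is essentially the minimal "everything is junk" handler sketched below (a simplified reconstruction, not the exact code):
import IdentityLookup

// Sketch: a debug message filter that reports every queried message as junk.
final class MessageFilterExtension: ILMessageFilterExtension, ILMessageFilterQueryHandling {
    func handle(_ queryRequest: ILMessageFilterQueryRequest,
                context: ILMessageFilterExtensionContext,
                completion: @escaping (ILMessageFilterQueryResponse) -> Void) {
        let response = ILMessageFilterQueryResponse()
        response.action = .junk
        completion(response)
    }
}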
Looking forward to your insights and guidance.
Best regards,
Rijul Singhal
I've got an existing app which is using some 3rd party xcframeworks within its app extensions (for example within a Notification Service Extension).
Within the target for the app extensions there is a Frameworks and Libraries section where the xcframework was dragged and dropped into.
However now I want to create a new project and do a similar thing, within the app's target there is a Frameworks and Libraries section, but when an app extension target is created, Xcode is not adding a Frameworks and Libraries section.
There is a Link Library with Binary section; however, this doesn't have an embed option (where you can select embed, don't embed, embed without signing, etc.), and I get a build error when trying to drag and drop the xcframework in here.
Where did the Frameworks and Libraries section go for app extensions in projects created with Xcode 16? How can this section be added?
Use case: When users open any URL in iOS Safari and tap the share button, the share extension can automatically receive the PDF version of the webpage from Safari.
We've observed that the system Markup app and the Apple Books app can automatically receive a PDF file from Safari. However, our custom share extension only receives the webpage URL.
While iOS Safari provides an "Options" button to manually share a webpage as a PDF, this feature is not intuitive and requires user education.
Could someone help to identify the correct NSExtensionActivationRule (or any other solution) that would allow our share extension to directly receive the PDF snapshot from Safari without requiring additional user actions?
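For completeness, the receiving side we have in place looks roughly like the sketch below (function name hypothetical); it works when a PDF is actually provided, so the open question is purely about getting Safari to offer the PDF representation without the manual Options step.
import UIKit
import UniformTypeIdentifiers

// Sketch: pull a PDF attachment out of the share extension's context, if present.
func loadSharedPDF(from context: NSExtensionContext) {
    guard let item = context.inputItems.first as? NSExtensionItem,
          let providers = item.attachments else { return }
    for provider in providers where provider.hasItemConformingToTypeIdentifier(UTType.pdf.identifier) {
        provider.loadItem(forTypeIdentifier: UTType.pdf.identifier, options: nil) { data, error in
            // data may be a file URL or raw Data, depending on the host app.
            print("Received PDF item:", data ?? error ?? "nil")
        }
    }
}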
Hi,
I've noticed a warning during export archive phase (xcrun xcodebuild -exportArchive) in our iOS CI/CD pipelines.
Updating the export options info plist file to use
<key>method</key>
<string>app-store-connect</string>
helps, but I am not sure whether we should update it, since I cannot find any references to this change in the docs.
It'd be very helpful to get some guidelines.
Thank you
Title: Content Overlapping Address Bar After Clicking Links in Safari, tested on iPhone 11 (iOS 18.1.1)
Description:
When browsing in Safari on iPhone (iOS 18.1.1), the one-tab bar (address bar) collapses as expected when scrolling down a page. However, after clicking on a link and loading the next page, the content appears to overlap the collapsed address bar. This results in parts of the content being hidden or obscured by the address bar, which affects the user experience, especially on mobile devices with limited screen space. This issue is reproducible on Next.js applications and can be observed on websites such as rotterdam.nl and halderberge.nl.
Steps to Reproduce:
Enable the One-Tab Bar: Go to Settings > Safari and enable the one-tab bar feature.
Open the website rotterdam.nl or halderberge.nl in Safari on an iPhone 11 (iOS 18.1.1).
Scroll down the page so that the top address bar collapses.
Click on any link on the page to load a new one.
Once the new page loads, observe that the content appears on top of the collapsed address bar, causing parts of the content to be hidden or obscured.
Expected Result:
The content should not overlap or be hidden behind the collapsed address bar after the page reloads. The layout should adjust properly without interference from the address bar, providing a smooth user experience.
Actual Result:
When the new page loads, the content overlaps or appears on top of the collapsed address bar, causing parts of the content to be hidden or obscured.
Device(s) Affected:
iPhone 11 running iOS 18.1.1.
OS Version:
iOS 18.1.1
Technical Notes:
To address this issue, the following solutions have been attempted with no success:
Viewport Meta Tag:
<meta name="viewport" content="width=device-width, initial-scale=1, viewport-fit=cover" />
This was added to help ensure proper layout on mobile devices, but did not resolve the issue.
CSS Safe Area Insets:
body {
  padding-top: env(safe-area-inset-top);
}
This CSS rule was applied to account for the safe area and prevent content from being hidden under the address bar, but it did not solve the overlapping issue.
Scroll Position Adjustment (for scroll-to-top button):
Adjusting the scroll behavior by changing the scroll position to {top: 1} instead of {top: 0} was a successful workaround to keep the address bar collapsed when clicking the "scroll to top" button. However, this did not resolve the issue when navigating between pages or changing routes, where the content still overlaps the collapsed address bar.
Is there a way to observe the currentEDRHeadroom property of UIScreen for changes? KVO is not working for this property...
I understand that I can query the current headroom in the draw(...) method to adapt the rendering. However, our apps only render on-demand when the user changes parameters. But we would also like to re-render when the current EDR headroom changes to adapt the tone mapping to the new environment.
The only solution we've found so far is to continuously query the screen for changes, which doesn't seem ideal. It would be better if the property would be observable via KVO or if there would be a system notification to listen for.
Thanks!
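In the meantime, the polling workaround can at least be kept cheap; a sketch of the approach we've settled on for now (not ideal, as noted): a CADisplayLink checks currentEDRHeadroom each frame and only triggers a re-render when the value actually changes.
import UIKit

// Sketch: poll currentEDRHeadroom and fire a callback only on change.
final class EDRHeadroomObserver {
    private var displayLink: CADisplayLink?
    private var screen: UIScreen = .main
    private var lastHeadroom: CGFloat = 1.0
    var onChange: ((CGFloat) -> Void)?

    func start(on screen: UIScreen = .main) {
        self.screen = screen
        lastHeadroom = screen.currentEDRHeadroom
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick() {
        let headroom = screen.currentEDRHeadroom
        if abs(headroom - lastHeadroom) > 0.01 {
            lastHeadroom = headroom
            onChange?(headroom) // re-render / re-tone-map here
        }
    }

    func stop() { displayLink?.invalidate() }
}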