I have a content blocker that generally works correctly, but I need to block an element that has certain text in it.
For example, <span id="theId">Some text</span> is easy enough to block because I can locate the id and block that, but what if there is no id, or the id is completely random? What if it's just <span>Some text</span>? How do I block that?
Let's say this is my only content blocker rule:
[
  {
    "action": {
      "type": "css-display-none",
      "selector": ":has-text(/Some text/i)"
    },
    "trigger": {
      "url-filter": ".*"
    }
  }
]
No errors are seen when the rule is loaded, so it's syntactically correct; it just doesn't block the HTML. I gather this is because :has-text() works on attributes rather than contents, so it matches alt, href, aria-label, etc., but not the text of the element itself.
How do I block Some text in my example above? Thanks!
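For reference, rule-list compilation happens when the containing app reloads the blocker, and that's where syntax errors would surface. A minimal sketch of that step, assuming a hypothetical extension identifier:

import SafariServices

// "com.example.app.ContentBlocker" is a placeholder for your extension's bundle ID.
SFContentBlockerManager.reloadContentBlocker(withIdentifier: "com.example.app.ContentBlocker") { error in
    // A nil error only means the JSON compiled; it doesn't prove that a
    // selector such as :has-text() will actually match anything at runtime.
    if let error {
        print("Rule list failed to compile: \(error)")
    }
}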
Hello!
I'm trying to set a UIRefreshControl.tintColor:
.onAppear {
    UIRefreshControl.appearance().tintColor = UIColor.systemBlue
}
But instead of the expected color I get a different one (the two screenshots are omitted here): the color in the second picture is a high-contrast version of the first one. I can't understand why it works this way.
I also tried the following.
UIRefreshControl.appearance().tintColor = UIColor(red: 0, green: 0.478, blue: 1, alpha: 1) // doesn't work
UIRefreshControl.appearance().tintColor = UIColor(named: "RefreshControlColor") // doesn't work; the asset has "High Contrast" switched on with the Universal slot set to systemBlueColor
Perhaps I missed something?
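One experiment that might be worth trying (a sketch, not a confirmed fix): resolve systemBlue against a normal-contrast trait collection before handing it to the appearance proxy, so a fixed color is stored instead of a dynamic one that could be re-resolved under increased contrast:

import UIKit

// Assumption: the proxy is re-resolving the dynamic systemBlue under an
// increased-contrast trait collection; resolving up front pins the normal variant.
let normalContrast = UITraitCollection(accessibilityContrast: .normal)
UIRefreshControl.appearance().tintColor = UIColor.systemBlue.resolvedColor(with: normalContrast)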
The target was chosen as iPhone and iPad. I can also run this build on my iPad. The Info.plist content of all our other applications is the same, and there were no problems until today. I don't understand why it is being rejected now.
Hello Apple Team,
I'm trying to import the Autodesk FBX SDK into my Objective-C iOS project.
The SDK is written in C++, but has support for iOS and the iOS simulator architectures.
I've added the path to the include folder in the Header Search Paths.
I've also added the paths to libfbxsdk.a in the Library Search Paths.
Finally, I've added the libfbxsdk.a file to the Link Binary With Libraries build phase.
However, when I build the project, I get the following error:
building for 'iOS', but linking in object file (/Users/Lond/Documents/v2/Autodesk/iOS/2020.3.7/lib/ios/debug/libfbxsdk.a[28](fbxalloc.cxx.o)) built for 'macOS'
In the terminal, if I type the command:
lipo -info libfbxsdk.a
I get the message
Non-fat file: libfbxsdk.a is architecture: arm64
confirming that I'm using the library for the correct architecture.
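One caveat: lipo -info reports only the CPU architecture, not the target platform, and arm64 is shared by iOS devices and Apple silicon Macs, so an arm64 slice can still be a macOS build. The load commands show the platform the objects were actually built for:

otool -l libfbxsdk.a | grep -A4 LC_BUILD_VERSION

If the platform field in that output says macOS rather than iOS, the archive matches the linker error, and the iOS-device build of the library is needed instead.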
Do I need to add any other configuration option? (Like an Other Linker Flags entry or something else?)
I'm quite new to C++, and integrating a C++ SDK into iOS is not easy.
I'm using macOS Sonoma 14.6.1.
Tested on Xcode 15.4 and 16.2
Target Device: iPhone 13 Pro (iOS 17.6.1)
iOS FBX SDK version: 2020.3.7
Link to the SDK if needed:
https://aps.autodesk.com/developer/overview/fbx-sdk
Any help would be greatly appreciated
Thank you
Hello, I have been using the App-prefs:General&path=SOFTWARE_UPDATE_LINK URL in my application to navigate to system settings, and it worked as expected. However, after updating to iOS 18, it no longer works, and I haven't been able to find a replacement.
Is there any alternative solution or a different URL that works?
I also tried prefs:root=General&path=SOFTWARE_UPDATE_LINK, but it didn’t work either.
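For what it's worth, App-prefs: is a private URL scheme, so it can stop working between iOS releases. The only documented Settings deep link I'm aware of opens the app's own settings page rather than Software Update; a minimal sketch:

import UIKit

// Opens this app's page in the Settings app (not the Software Update screen).
if let url = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(url)
}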
Getting this error on iPhones with a notch, in portrait mode.
Currently using AVQueuePlayer to play more than 30 mp3 files one by one.
All constraint properties are correct, but the error occurs only on notched iPhones in portrait mode; the same code works on the same iPhone in landscape mode.
But I get this error:
LoudnessManager.mm:709 unable to open stream for LoudnessManager plist
Type: Error | Timestamp: 2025-02-07 | Process: | Library: AudioToolbox | Subsystem: com.apple.coreaudio | Category: aqme | TID: 0x42754
LoudnessManager.mm:709 unable to open stream for LoudnessManager plist
LoudnessManager.mm:709 unable to open stream for LoudnessManager plist
Timestamp: 2025-02-07 | Library: AudioToolbox | Subsystem: com.apple.coreaudio | Category: aqme
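For reference, the playback setup described above is roughly the following; a minimal sketch, assuming audioURLs holds the local mp3 file URLs:

import AVFoundation

let audioURLs: [URL] = [] // assumption: populated with the 30+ local mp3 URLs
// One AVPlayerItem per file; AVQueuePlayer advances through them in order.
let items = audioURLs.map { AVPlayerItem(url: $0) }
let player = AVQueuePlayer(items: items)
player.play()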
I have an app that depends on a custom SDK. The custom SDK has its own dependencies; all of them are included in the podspec, along with the custom framework/static lib via ios.vendored_frameworks.
Here is a sample of my podspec:
Pod::Spec.new do |s|
  s.name = "SDK"
  s.version = "1.0.1"
  s.summary = "SDK"
  s.source = { :git => "https://github.com/ABC/SDK.git", :tag => "v" + s.version.to_s }
  s.platform = :ios
  s.ios.deployment_target = '16.0'
  s.swift_version = '4.2'
  s.static_framework = true
  # Listed together: a second `s.frameworks =` assignment would overwrite the first.
  s.frameworks = 'Security', 'CoreLocation'
  s.requires_arc = true
  s.module_name = 'SDK'
  s.library = 'z'
  s.default_subspec = 'Shared'

  s.subspec 'Shared' do |shared|
    shared.ios.vendored_frameworks = 'SDKLib.xcframework'
    shared.dependency 'CocoaAsyncSocket', '~> 7.4'
    shared.dependency 'CocoaHTTPServer'
    shared.dependency 'SocketRocket', '~> 0.6'
    shared.dependency 'QNNetDiag'
    shared.dependency 'SAMKeychain'
    shared.dependency 'AFNetworking/Reachability', '~> 4.0'
    shared.dependency 'AFNetworking/Serialization', '~> 4.0'
    shared.dependency 'AFNetworking/Security', '~> 4.0'
    shared.dependency 'AFNetworking/NSURLSession', '~> 4.0'
    shared.dependency 'CocoaMQTT'
    shared.dependency 'Starscream', '~> 4.0.8'
    shared.dependency 'TrustKit'
    shared.dependency 'Firebase/Analytics'
  end
end
I am getting an error while linking the library.
I have been trying to reach the Apple Developer team for the last month regarding my Apple Developer account enrollment process.
I have emailed the Apple Developer address but am not getting any response. After that I tried to reach them via the support form, but I still did not get any message or email from Apple regarding my issue.
We use URLSessionWebSocketTask for our web socket connection. When we get an error, we reconnect by recreating a new URLSessionWebSocketTask.
Test case: turn off Wi-Fi on the iOS device and receive URLError.notConnectedToInternet error(s); when Wi-Fi is turned back on, a new task is created and connects correctly.
This works on iOS 12, 14, 15, 16, and 17. But on iOS 18 we keep getting URLError.notConnectedToInternet without ever connecting successfully.
class WebSocketManager {
    ...
    func openConnection() {
        webSocketTask?.cancel(with: .goingAway, reason: nil)
        webSocketTask = urlSession?.webSocketTask(with: urlRequest)
        webSocketTask?.resume()
        listen()
    }

    func closeConnection() {
        webSocketTask?.cancel(with: .goingAway, reason: nil)
        webSocketTask = nil
    }

    private func listen() {
        webSocketTask?.receive { [weak self] result in
            guard let self else { return }
            switch result {
            case .failure(let error):
                delegate?.webSocketManager(self, error: error)
            case .success(let message):
                switch message {
                case .string(let text):
                    delegate?.webSocketManager(self, message: .text(text))
                case .data(let data):
                    delegate?.webSocketManager(self, message: .data(data))
                @unknown default:
                    fatalError()
                }
                listen()
            }
        }
    }
}
Delegate:
func webSocketManager(_ webSocketManager: WebSocketManagerType, error: Error) {
    webSocketManager.openConnection()
}
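One mitigation worth trying (a sketch, not a confirmed iOS 18 fix): reconnecting straight from the error delegate can loop while the device is still offline, so let the session wait for connectivity before starting the task instead of failing immediately:

// Configure the session once, when the manager is created.
let configuration = URLSessionConfiguration.default
// With this flag, tasks wait for a usable network instead of failing
// immediately with URLError.notConnectedToInternet.
configuration.waitsForConnectivity = true
let urlSession = URLSession(configuration: configuration)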
We're currently facing an issue with Intune not automatically updating/downloading the updated build/app to end-user iOS devices. It's worth noting that we recently migrated the Xamarin project to a .NET-style SDK in this version. Previously, the app used to update automatically without any problems. We'd appreciate it if you could help us understand what might be causing this issue.
Hello,
My app often crashes when I use simulators. I would like some help with reading the crash report that is generated, especially the part below "Thread 0 Crashed". Based on other posts, I understand that 0x8BADF00D in the crash report is a watchdog termination code: watchdog killed the app because the main thread was blocked for a significant time. Many things can block the main thread, so it's hard to find out what it is in our specific case. Can someone help me read through the crash report?
Short_crash_report.txt
Background information
The application is Xamarin Native and I use Rider as an IDE. When I use Visual Studio, the simulators run just fine. No crash occurs while using my app on a device.
The crash happens on multiple simulators with different OS versions. I have already deleted the Xcode cache, erased the content and settings of several simulators, and deleted the iOS DeviceSupport files.
I would like to create an iOS app that has MTP-USB functionality.
So I have the following questions:
1. Is it possible to create an application with MTP-USB function using .NET MAUI?
2. If 1. is possible, how should I implement it?
The devices I plan to use are iPhone 15 and iPhone 15 Pro.
Thanks.
We receive a cookie from the server side when the user logs in successfully. The cookie is stored in the app browser, and it needs to be cleared when the user logs out of the app.
We are using the Cordova framework to create the iOS application, and in Cordova I have used a plugin to clear the cookie. But on the iOS device it is not able to clear the app browser cookie, while on an Android device the same Cordova plugin works fine.
Why is the iOS device not able to clear the cookie using the Cordova plugin?
Plugin name - https://github.com/Cartegraph/cordova-cookie-master
Kindly help me out with the solutions.
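On the native side, WKWebView stores its cookies in WKWebsiteDataStore rather than in NSHTTPCookieStorage, which is one common reason cookie plugins behave differently on iOS than on Android. A minimal native sketch, assuming the Cordova web view uses the default data store:

import WebKit

// Remove all cookie records from the default WKWebView data store on logout.
let dataStore = WKWebsiteDataStore.default()
dataStore.fetchDataRecords(ofTypes: [WKWebsiteDataTypeCookies]) { records in
    dataStore.removeData(ofTypes: [WKWebsiteDataTypeCookies], for: records) {
        print("WKWebView cookies cleared")
    }
}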
I am working on a React Native application where I want to modify the native text selection menu (the menu that appears when you long-press on text). Specifically, I want to add a custom option alongside the default ones like Copy, Look Up, Translate, Search Web, and Share.
Is there a way to modify the native text selection menu inside a WebView on iOS?
How can I add a custom menu option to the default text selection menu while keeping all the default options intact?
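On the UIKit side, one approach (a sketch, assuming the React Native WebView can be made to use a custom WKWebView subclass) is to override UIResponder's buildMenu(with:) and insert an extra action next to the standard edit items:

import UIKit
import WebKit

// Adds a custom action to the text selection menu while keeping Copy, Look Up, etc.
class CustomMenuWebView: WKWebView {
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder)
        let custom = UIAction(title: "My Action") { _ in
            // Handle the custom menu item here (hypothetical action).
        }
        builder.insertSibling(UIMenu(options: .displayInline, children: [custom]),
                              afterMenu: .standardEdit)
    }
}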
Does anyone know how to reduce the padding between a list section header (plain style) and the search bar? I have tried every method I could find on Google, but none work. The default list style does not have this big padding/space between the section header and the search bar.
struct Demo: View {
    @State private var searchText: String = ""

    var body: some View {
        NavigationStack {
            List {
                Section {
                    ForEach(0..<100) { index in
                        Text("Sample value for \(index)")
                    }
                } header: {
                    Text("Header")
                        .font(.headline)
                }
            }
            .listStyle(.plain)
            .navigationTitle("Demo")
            .navigationBarTitleDisplayMode(.inline)
            .searchable(text: $searchText)
        }
    }
}
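One experiment that may be worth trying (not a documented fix): give the search field an explicit pinned placement, which changes how the bar is laid out above the list and can affect that gap. A variant of the demo above:

import SwiftUI

struct DemoPinnedSearch: View {
    @State private var searchText: String = ""

    var body: some View {
        NavigationStack {
            List(0..<100, id: \.self) { index in
                Text("Sample value for \(index)")
            }
            .listStyle(.plain)
            .navigationTitle("Demo")
            .navigationBarTitleDisplayMode(.inline)
            // Explicit placement pins the search bar under the navigation bar.
            .searchable(text: $searchText,
                        placement: .navigationBarDrawer(displayMode: .always))
        }
    }
}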
I have a home widget with buttons (new in iOS 17).
In order to prevent taking action if the user taps on the widget buttons accidentally, I want to ask the user for confirmation.
requestConfirmation appeared to be exactly what I needed, but no confirmation view shows up when I invoke this method in the perform function.
I have tried the following:
try await requestConfirmation(result: .result(dialog: "Are you sure you want to do this?") {
    Image(.mdlsWhite)
})
and this alternative:
let confirmed: Bool = try await $name.requestConfirmation(for: self.name,
                                                          dialog: IntentDialog(stringLiteral: msg))
Neither option works.
I am starting to think that requestConfirmation is not meant to be used with home widgets.
Is there a better way to handle confirmations for buttons included in a Home Widget?
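As a fallback, one workaround pattern (a sketch, not a documented API; PendingActionStore is a hypothetical app-group-backed store) is to stage the confirmation in the widget UI itself: the first tap only sets a pending flag and reloads the timeline, and the widget then renders Confirm/Cancel buttons wired to separate intents:

import AppIntents
import WidgetKit

struct RequestActionIntent: AppIntent {
    static var title: LocalizedStringResource = "Request Action"

    func perform() async throws -> some IntentResult {
        PendingActionStore.shared.isPending = true   // hypothetical shared store
        WidgetCenter.shared.reloadAllTimelines()     // widget redraws with Confirm/Cancel
        return .result()
    }
}

struct ConfirmActionIntent: AppIntent {
    static var title: LocalizedStringResource = "Confirm Action"

    func perform() async throws -> some IntentResult {
        PendingActionStore.shared.isPending = false  // hypothetical shared store
        // ... perform the real action here ...
        WidgetCenter.shared.reloadAllTimelines()
        return .result()
    }
}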
Hello, dear forum members! I have a serious (for me personally) question for you.
I have a personal iPhone 15 Pro Max, updated to the latest iOS 18.3.
There are many bugs and glitches in my work applications.
I ask you for help: I need to find (or generate) a signature for an iOS 17.5/6/7 IPSW to flash my phone.
I don't want to get rid of it, sell it, etc.
I understand two key concepts from desktop platforms:
Screen Mirroring – The same content is displayed on both the primary and external screens.
Screen Extension – The external display shows different content that complements what's on the main screen.
My question pertains to the second point: Is it possible to extend the display on iOS and iPadOS devices?
I'm referring to this Apple documentation, which explains how to extend content from an iOS/iPadOS device to an external display.
I tested this in a sample iOS Xcode project. In the iOS Simulator, I was able to detect an "external display" and present a separate UIWindow on it. However, when I tried the same on a real device (iPhone 15 connected to a MacBook Pro via cable), the external display connection was not detected.
I’d like to confirm whether screen extension is possible on a real iOS device. From my research, it appears that extension is only supported on iPadOS via Stage Manager, but I want to verify if there’s any way to achieve this on an iPhone. If so, are there any known apps that currently utilize extended display functionality on iOS?
If extension is not possible on iOS, why does the documentation mention iOS?
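For reference, the scene-based path from that documentation detects the external screen through a scene session role; a minimal sketch of the app delegate hook (iOS 16+ role name, other details assumed):

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     configurationForConnecting connectingSceneSession: UISceneSession,
                     options: UIScene.ConnectionOptions) -> UISceneConfiguration {
        // The system uses this role when a non-interactive external display is attached.
        if connectingSceneSession.role == .windowExternalDisplayNonInteractive {
            return UISceneConfiguration(name: "External Display", sessionRole: connectingSceneSession.role)
        }
        return UISceneConfiguration(name: "Default", sessionRole: connectingSceneSession.role)
    }
}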
Hi all!
I have been experiencing some issues when using the AVAudioEngine to play audio and record input while doing a voice chat (through the PTT Interface).
I noticed that if I connect any players to the audio graph OR call start, the audio session becomes active (this is on iOS).
I don't see anything in the docs or the header files in AVFoundation, but is it possible that calling the stop method on an engine deactivates the audio session too?
In a normal app this behavior seems logical, but when using PTT, all activation and deactivation of the audio session must go through the framework and its delegate methods.
The issue I am debugging: when the engine with the input node tapped gets stopped and there is a gap before the server replies with inbound audio to be played, something seems to get the hardware/audio session into a jammed state.
Thanks for any feedback and/or confirmation on this behavior!
I want to use SwiftUI and RealityView to get AR scene understanding data (ARMeshAnchor) on iOS devices with LiDAR. The only way we can do that is by using ARSession (unless there is another way).
However in previous iOS 18 builds there was this function:
https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:session:arconfiguration:)
, which worked with SpatialTrackingSession and a custom ARSession together. This function has been removed from the RealityKit framework in the latest iOS and Xcode, but it is still in the documentation.
I also wanted to get ARFaceAnchor data, which I still cannot get without ARSession; the closest I can get is by using:
let target = AnchoringComponent.Target.face
let anchoringComponent = AnchoringComponent(target, trackingMode: .predicted)
entity = Entity()
entity!.components.set(anchoringComponent)
But I still can't find a way to get the current frame (ARFrame) or the anchors ([ARAnchor]) in the view.
Alternatively, if I use this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:) and start the ARSession separately, the session (didUpdate and didAdd) only runs for a few frames before getting interrupted.
And if I completely remove SpatialTrackingConfiguration and just run the ARSession, there is still a valid tracked entity for the AnchoringComponent.Target.face component if, in the configuration for the ARSession, I use ARWorldTrackingConfiguration with face tracking. I still get updated facial data each frame, but the ARSession didUpdate and didAdd functions stop being called past the first few frames.
Interestingly, if I switch the RealityViewCameraContent.RealityViewCamera to .virtual, I get ARMeshAnchor and ARFaceAnchor data, but no camera feed (as expected). This happens with or without SpatialTrackingConfiguration.
My overarching question is: what is the proper way to access ARMeshAnchors and other ARAnchors created by the system, and to track them live, while also using SwiftUI?
GitHub Repo with sample project can be found here: https://github.com/bpate75/RealityViewTesting
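For comparison, here is the plain-ARKit baseline discussed above as a minimal sketch (a world-tracking session with mesh reconstruction and user face tracking, anchors observed through ARSessionDelegate); how this interacts with RealityView is exactly what's in question:

import ARKit

final class SessionHolder: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh      // ARMeshAnchor on LiDAR devices
        }
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true   // ARFaceAnchor from the front camera
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        let meshAnchors = anchors.compactMap { $0 as? ARMeshAnchor }
        let faceAnchors = anchors.compactMap { $0 as? ARFaceAnchor }
        print("Added \(meshAnchors.count) mesh and \(faceAnchors.count) face anchors")
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        _ = frame.anchors // live per-frame access to all current anchors
    }
}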