Post not yet marked as solved
Hi,
I am developing an app in React Native, and for this I need Xcode. However, when I start the Expo Developer Tools (Metro Bundler) and click "Run iOS Simulator", the VS Code terminal always shows the same message: "Xcode needs to be installed (don't worry, you won't have to use it), would you like to continue to the App Store?" I already have Xcode installed; when I type "y", it takes me to the App Store, where I see an "Open" button for Xcode (because it is already installed). Please guide me through this, because I am not able to see my app on iOS devices.
Still not working. Anyone found a solution?
Since the end of 2021, I have been getting reports from my users that my app Timing (https://timingapp.com) no longer records window titles and file paths for the apps they use, despite Accessibility permissions having been granted.
The problem manifests as follows: sometimes (see below for the conditions I was able to identify), the "Timing Tracker" app is shown and appears checked in the "Security & Privacy" System Preferences, and calling AXIsProcessTrustedWithOptions() with the options @{ (__bridge id) kAXTrustedCheckOptionPrompt : (id) kCFBooleanFalse } returns true. However, while the problem is occurring, all of my actual Accessibility code (e.g. obtaining a process's windows) returns only nil (or empty arrays).
Any pointers as to what could be the reason or what I could investigate would be very appreciated, as I really am at a loss here.
Here are a few additional things to note that may or may not be related to the issue at hand:
My use of the Accessibility API usually works fine and has been working fine for quite a while; only recently has it started to sporadically stop working for some users.
The app consists of a "main" app, with a helper contained therein that actually performs the Accessibility requests. In the "Security & Privacy" System Preferences, the helper (called "Timing Tracker") is shown and appears checked (i.e. Accessibility permissions seem to be granted).
This only seems to affect the Accessibility API; Automation (i.e. Apple Events) continues to function if the user has granted permission for it.
It appears that these issues occur more frequently after the app gets updated and the helper restarts itself because it has detected changes to its application bundle, but it appears that's not the only cause for this issue (i.e. it also happens without a recent app update having taken place). The helper uses the following code to relaunch itself:
- (void)relaunchWithDelay:(NSTimeInterval)delay {
    // $N = argv[N]
    // Kill our own process, wait until it has actually exited, then sleep for
    // the given delay, then relaunch the app.
    NSString *killArg1AndOpenArg2Script = [NSString stringWithFormat:
        @"/bin/kill $1; (while /bin/kill -0 $2 >&/dev/null; do /bin/sleep 1; done; /bin/sleep %lf; /usr/bin/open \"$3\") &",
        delay];
    // NSTask needs its arguments to be strings
    NSString *ourPID = [NSString stringWithFormat:@"%d",
                        [NSProcessInfo processInfo].processIdentifier];
    // This will be the path to the .app bundle,
    // not the executable inside it; exactly what `open` wants.
    NSString *pathToUs = [NSBundle mainBundle].bundlePath;
    NSArray *shArgs = @[ @"-c", // -c tells sh to execute the next argument, passing it the remaining arguments
                         killArg1AndOpenArg2Script,
                         @"",        // $0: path to script (ignored)
                         ourPID,     // $1 in the script
                         ourPID,     // $2 in the script
                         pathToUs ]; // $3 in the script
    NSTask *restartTask = [NSTask launchedTaskWithLaunchPath:@"/bin/sh" arguments:shArgs];
    [restartTask waitUntilExit]; // wait for the kill-and-open script to finish
    NSLog(@"*** ERROR: %@ should have been terminated, but we are still running", pathToUs);
    assert(false && "We should not be running!");
}
I am unsure whether this invocation somehow relaunches the helper in a state that temporarily strips it of its TCC/Accessibility permissions.
According to user reports, this can usually be fixed by relaunching the helper, rebooting the Mac, or (in some cases) unchecking the helper in the "Security & Privacy" System Preferences. (Which of these escalating steps is required for the fix seems to vary from user to user.)
I have heard reports of this from both macOS 11 and macOS 12 (not sure whether it also occurs on macOS 10.15).
Given reports of some TCC vulnerabilities having been fixed recently, I wonder whether some of those fixes could cause Accessibility permission to be denied to my app.
I still haven't been able to reproduce the issue myself, but have received plenty of credible reports that this is actually happening. As mentioned above, rebooting the Mac usually seems to fix the problem, which makes this particularly hard to investigate and debug.
The app itself has been around since at least 2017, yet these problems seem to have only started occurring (or at least became much more frequent) towards the end of 2021. I do not recall any substantial changes to the relevant code paths recently.
The app is not sandboxed; it is correctly signed and notarized with Developer ID. Hardened Runtime is enabled; the only Hardened Runtime entitlement requested is "Apple Events".
I did change the Organization name of my Apple Developer account in 2021, but I don't think that's related, because the designated requirement (csreq) stored in tcc.db for Accessibility for my app is:
    anchor apple generic and identifier "info.eurocomp.TimingHelper" and (certificate leaf[field.1.2.840.113635.100.6.1.9] /* exists */ or certificate 1[field.1.2.840.113635.100.6.2.6] /* exists */ and certificate leaf[field.1.2.840.113635.100.6.1.13] /* exists */ and certificate leaf[subject.OU] = NDB5JK3DZG)
This seems appropriate, and it does not include the organization name itself. My organization's Team ID (NDB5JK3DZG) has not changed.
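Since the mismatch is between what TCC reports and what the AX calls actually return, one way to capture it in the field is to log both at the same moment. This is only a diagnostic sketch (the function name and log format are illustrative, not part of the app described above):

```swift
import ApplicationServices
import Foundation

// Log both the TCC trust flag and the result of a real Accessibility query
// for a given process, so the "trusted but AX calls fail" state can be
// captured in user logs when it occurs.
func logAccessibilityState(forProcess pid: pid_t) {
    let trusted = AXIsProcessTrusted() // what TCC claims about our process
    let app = AXUIElementCreateApplication(pid)
    var windows: CFTypeRef?
    let err = AXUIElementCopyAttributeValue(app, kAXWindowsAttribute as CFString, &windows)
    // When the bug occurs, `trusted` is reportedly true while the query fails.
    NSLog("AXIsProcessTrusted=%d, AXUIElementCopyAttributeValue=%d",
          trusted ? 1 : 0, err.rawValue)
}
```

Correlating these two values around helper relaunches might narrow down whether the relaunch path is what strips the permission.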
I'm working on getting VoiceOver to read out text updates in an unfocused widget - as part of this, I'd like to be able to see all of the events that the macOS VoiceOver client is receiving on its end. How can I do that?
I see macOS provides an Accessibility Inspector and an Accessibility Verifier, but neither of those tools seems to target events.
My app has an NSPopover, and its contentView's hierarchy is as follows; but when VoiceOver is on, it only announces "You are on a cell".
I want VoiceOver to read keyboard navigation hints (like in this picture):
Hi,
In my organization there are many Apple products, and I am creating an app for the organization. The app has a barcode scanning facility, and I am now looking for an Apple API that can provide details after scanning a barcode, such as product type, model number, etc. Is such an API available from Apple, and what are the charges for it?
Thanks,
Yusuf Shaikh
I am trying to access a UITabBarItem using accessibilityIdentifier in UI test cases, but it's not working on iOS 11 and iOS 12; on iOS 13 it works. I have checked this with Appium and the Accessibility Inspector; neither shows the accessibilityIdentifier applied to the UITabBarItem.
tabBarItem1.accessibilityIdentifier = "tabBarItem1"
I have tried calling this in viewDidLoad, viewWillAppear, and viewDidAppear, but I get the same result.
Has anyone faced this issue?
System Version: macOS Monterey 12.3.1
My app has a button which shows a popover when clicked, and it makes a button in the popover the popover window's firstResponder. When VoiceOver is not on, I can close the popover with the Escape key (handled by the NSPopover's contentView). But when I turn VoiceOver on, pressing Escape closes both the popover and its parent window, and debugging showed that the key event was received by the popover's sender rather than the popover (in the sender's window, pressing Escape closes the window).
The problem disappears when I change the popover's behavior to .applicationDefined (it still occurs with .transient and .semitransient), but I would still like to know why VoiceOver affects the behavior of NSPopover.
It appears that UICollectionViewCompositionalLayout has issues with accessibility when using orthogonal scrolling sections. I recently updated a CollectionView in our app to use compositional layout / diffable data source and noticed that the accessibility inspector only reads the cells that are queued initially and doesn't scroll horizontally to trigger reuse and reach the end of the list. Instead, it reaches the last cell that's already been queued for the section and moves to the next section. Only the first 2 sections are queued initially on this screen until the user scrolls vertically, so the accessibility focus only reaches the second section and then stops.
I see the same behavior using the Modern CollectionView demo app from Apple and turning the Accessibility Inspector on for the Orthogonal Section screens.
Does anyone know if this is the expected behavior and if there's a way to allow the sections to scroll horizontally to the end when using VoiceOver / Accessibility Inspector?
When using UIKit views and controls, are the accessibility attributes (e.g. traits, value, identifier) automatically added to the elements on screen, or do I need to add the information for each attribute manually?
If it is done automatically, will it appear in the Accessibility Inspector when inspecting each element on screen in the simulator?
I've noticed a bug in my app recently: it appears that in watchOS 8.5 (or earlier), a page-based layout no longer initializes (or calls awake(withContext:) on) the pages beyond the first index.
According to the documentation:
In a page-based interface, all interface controllers are initialized up front but only the first one is displayed initially.
https://developer.apple.com/documentation/watchkit/wkinterfacecontroller/1619521-init
I am simply not seeing this happen anymore. I have logging in all of the lifecycle methods of all three of my pages, and the second and third controllers don't fire anything (including init) until I swipe to the right. That is when I would expect only the willActivate and didActivate methods to be invoked; instead I get init, awake, willActivate, and then didActivate. :/
This is unfortunate, and a bug from the user's perspective, because the second controller asks to becomeCurrent under certain conditions that the first controller detects and signals via NotificationCenter. The automatic programmatic switching between pages is totally broken.
FB9972047
Hello,
It's impossible to manage the hit area for a text field, and it's already too small by default (detected with the Accessibility Inspector).
I tried to force it with the "frame(minHeight: ***)" modifier; the component's height changes, but the hit area stays the same size.
Any tips to fix this issue?
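One common SwiftUI approach is to make the enlarged frame itself hit-testable and forward taps into the field. This is only a sketch under that assumption (the view name, the 44-point height, and the tap-forwarding are illustrative, not a confirmed fix for the problem above):

```swift
import SwiftUI

// Enlarge a TextField's tappable area by making its whole (larger) frame a
// hit-test shape and moving keyboard focus into the field on tap.
struct BigHitTextField: View {
    @State private var text = ""
    @FocusState private var focused: Bool

    var body: some View {
        TextField("Label", text: $text)
            .focused($focused)
            .frame(minHeight: 44)            // enlarge the visual frame
            .contentShape(Rectangle())       // extend the hit area to the frame
            .onTapGesture { focused = true } // forward taps to the field
    }
}
```

Whether the Accessibility Inspector then reports the larger hit area would need to be verified on a device.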
iOS accessibility bug in a table view? I'm trying to add an accessibilityLabel to a tableViewCell's UITableViewCellAccessoryDetailDisclosureButton. VoiceOver just doesn't "see" it. It's in the hierarchy and tappable, but I need to let the user know it's there. Thoughts? I have set accessibilityLabels on the textLabel and detailTextLabel, but I have no idea how to access the "info" button. If I debug the hierarchy, it does speak the "More Info" value, but nothing happens when I step through the elements or tap it on a device. Any help would be appreciated.
"More Info" is the default label for that built-in control. I used isAccessibilityElement = false on the cell so that I could access the textLabel and detailTextLabel. If I use the Accessibility Inspector to look at the hierarchy, I can see the button; for some reason I can't get a handle on it. The info button opens an editor, so I want to tell the user 1. that it is there, and 2. what it leads to. (I'm not looking to override that label; that's not possible per the docs.)
How do I use the SW API on my iPhone to get an instant image from an external USB web camera?
I recently came across the new Apple Watch AssistiveTouch features, which can recognize hand clenching, finger pinching, and more.
Here's the official demo video.
Is there an API that gives developers access to these new gestures, for example to detect finger pinching or hand clenching?
Hi!
I'm asking for ideas about why VoiceOver rarely, but still sometimes, stops working with my app. It focuses on the status bar, and no gesture can switch it back into the app. I've noticed it always works perfectly with the login/registration views but fails with the views inside the navigation controller. At first it works partly: it recognizes my top bar with a title, a search bar and so on, but it sees neither the table view nor the tab controller below. Maybe it's connected to the fact that every view inside the navigation controller has this code in viewWillAppear:
if UIAccessibility.isVoiceOverRunning {
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.8) {
        UIAccessibility.post(notification: .layoutChanged, argument: self.navigationController)
    }
}
Please suggest what the reason could be.
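One thing that might be worth ruling out: posting .layoutChanged with a view controller (rather than an on-screen element) as the argument gives VoiceOver nothing concrete to land on. A hedged alternative sketch, not a confirmed fix (the function name and the element parameter are illustrative):

```swift
import UIKit

// Post .screenChanged with a concrete on-screen view, so VoiceOver has a
// valid element to move its focus to after the transition settles.
func moveVoiceOverFocus(to element: UIView?) {
    guard UIAccessibility.isVoiceOverRunning else { return }
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.8) {
        UIAccessibility.post(notification: .screenChanged, argument: element)
    }
}
```

Passing the first meaningful view of the incoming screen (e.g. the table view) instead of self.navigationController may keep focus from escaping to the status bar.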
I have a custom picker where the contents of the picker are created from a downloaded JSON file. In an effort to provide better accessibility for challenged users, I am trying to use the .accessibilityFocused modifier to focus VoiceOver on a user's previous selection. This means that when the user first views the picker it should start at the top, but if they go back in to change their selection, it should auto-focus on their previous selection instead of having them start over from the top of the list.
The issue is that to use .accessibilityFocused you really need to do so via an enum, so you can call something like .accessibilityFocused($pickerAccessFocus, equals: .enumValue). As I am using a ForEach loop to create the picker based off of the array that is created when the JSON is parsed and then stored in the struct, I haven't figured out how to create the various enum cases based off of that array.
So, to recap:
Inside the ForEach loop, I need to have a .accessibilityFocused modifier identifying each of the picker options
The onAppear needs to say something along the lines of:
if salutation == salutation.salutation {
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.2) {
        pickerAccessFocus = .ENUMCASE
    }
} else {
    pickerAccessFocus = Optional.none
}
Though apparently onAppear does not like to have a ForEach statement, so I'm not sure how to solve that issue either.
I know there's a lot of code in the example I provided, but I tried to combine it all into as simple an example as possible. Hoping it makes sense.
If I'm thinking about it wrong, and there's a better solution, I'm all ears.
import SwiftUI

struct Picker: View {
    @AccessibilityFocusState var pickerAccessFocus: PickerAccessFocus?
    @State private var salutation = ""
    var salutationList: [SalutationOptions] = [] // SalutationOptions is the struct from the parsed JSON

    enum PickerAccessFocus: Hashable {
        case ? // These cases need to be dynamically created as the same values the ForEach loop uses
    }

    static func nameSalutationPicker(name: String) -> LocalizedStringKey {
        LocalizedStringKey(String("NAME_SALUTATION_PICKER_\(name)"))
    }

    var body: some View {
        List {
            Section {
                ForEach(salutationList, id: \.id) { salutation in
                    HStack {
                        Text(nameSalutationPicker(name: salutation.salutation))
                    } // End HStack
                    .contentShape(Rectangle())
                    .accessibilityFocused(salutation == salutation.salutation ? ($pickerAccessFocus, equals: .ENUMCASE) : ($pickerAccessFocus, equals: Optional.none))
                } // End ForEach
            } // End Section
        } // End List
        .onAppear {
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.2) {
                pickerAccessFocus = .ENUMCASE
            }
        }
    }
}
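Since @AccessibilityFocusState only requires the focus value to be Hashable, one way around the dynamic-enum problem is to key focus on the item's own string value rather than an enum. This is a sketch under the assumption that SalutationOptions has String id/salutation properties like those referenced above (the view name, tap handling, and validity rule are illustrative):

```swift
import SwiftUI

struct SalutationOptions: Identifiable {  // assumed shape of the parsed JSON
    let id: String
    let salutation: String
}

struct SalutationPicker: View {
    // Focus state keyed by the salutation string itself; no enum cases needed.
    @AccessibilityFocusState private var focusedSalutation: String?
    @State private var selectedSalutation = ""
    var salutationList: [SalutationOptions] = []

    var body: some View {
        List {
            ForEach(salutationList) { option in
                Text(option.salutation)
                    .contentShape(Rectangle())
                    .accessibilityFocused($focusedSalutation, equals: option.salutation)
                    .onTapGesture { selectedSalutation = option.salutation }
            }
        }
        .onAppear {
            // Re-focus the previous selection, if any; otherwise start at the top.
            guard !selectedSalutation.isEmpty else { return }
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.2) {
                focusedSalutation = selectedSalutation
            }
        }
    }
}
```

This sidesteps both the enum creation and the onAppear-inside-ForEach issue, because the single onAppear on the List can set the focus value directly from the stored selection.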
Hey guys,
I have an NSAttributedString within my app (created from HTML). I assign this string to a UITextView.
I would like certain parts of that text (all the headlines in it) to be marked with a 'header' accessibility trait so that VoiceOver can identify them properly.
I was under the impression that I could just use accessibilityTextHeadingLevel to do so, but the text in the given range is still set up with the 'text' accessibility trait:
var myString = NSMutableAttributedString(...)
let range = NSRange(location: 0, length: 44)
myString.addAttribute(NSAttributedString.Key.accessibilityTextHeadingLevel, value: 1, range: range)
How is accessibilityTextHeadingLevel supposed to work?
I am using AccessibilityFocusState with an enum to change focus to an error message if a user fills out a field incorrectly.
@AccessibilityFocusState private var accessFocus: Field?
Once the error message has been read, I would like to have the focus move back to the form field. I've seen some old UIKit ways of attempting this, but haven't found anything that would do it with SwiftUI.
The other issue is that I can't use an onChange event because it's just reading back text (the error msg). I need to allow voiceover to read the entire error message, and when it has completed reading, set my AccessibilityFocusState to a new value via:
accessFocus = .formField
Is there a way to do this with SwiftUI?
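SwiftUI itself has no "announcement finished" callback, but the underlying UIKit notification can be observed from SwiftUI. This is a hedged sketch assuming the error text is read out as a VoiceOver announcement (the view, the Field enum, and the field names are illustrative):

```swift
import SwiftUI
import UIKit

struct ErrorFocusView: View {
    enum Field: Hashable { case formField }
    @AccessibilityFocusState private var accessFocus: Field?
    @State private var text = ""

    var body: some View {
        TextField("Value", text: $text)
            .accessibilityFocused($accessFocus, equals: .formField)
            // Fires when VoiceOver finishes (or is interrupted while) reading
            // an announcement; move focus back to the form field then.
            .onReceive(NotificationCenter.default.publisher(
                for: UIAccessibility.announcementDidFinishNotification)) { _ in
                accessFocus = .formField
            }
    }
}
```

The notification's userInfo also carries a "success" flag if interrupted announcements should be handled differently.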
I've got a page where there is a text field for the user to enter a value. If they enter an incorrect value it shows an error message (text and a systemImage, not an alert) and when they enter a correct value it shows a continue button.
Once the user has entered anything and dismissed the keyboard I want voiceover to automatically focus on either the error message or the continue button.
The only way I've sort of been able to make this work is by using accessibilitySortPriority with a ternary conditional that says if the field is not empty and the keyboard is not focused then make this the highest sort priority.
Sure, that works (kind of) because it will in fact focus on the correct element once the keyboard is dismissed, but if the user changes their entry in the text field, then dismisses the keyboard, it stays on the text field instead of focusing once again on the error message or continue button.
Also, changing the sort priority of the entire page is a bit problematic because everything is out of order if the user needs to swipe to move around the page elements properly.
I've also looked into AccessibilityFocusState because the docs say that the element will gain VO focus when the bool is true and when you move away from that element the bool becomes false. However, nothing I've tried with this seems to do anything at all.
What I want to have happen is as follows:
User loads the page
VO reads the elements from top to bottom
User taps to enter a value in the text field
User dismisses the keyboard
Either the error msg or continue button appears on screen and VO immediately focuses on it
User taps to change their entry in the text field
User dismisses the keyboard
Either the error msg or continue button appears on screen and VO immediately focuses on it
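The flow above can be sketched with @AccessibilityFocusState driven by the keyboard's @FocusState, instead of juggling accessibilitySortPriority. This is only a sketch under stated assumptions (SwiftUI on iOS 15+; the view name, the Focus enum, and the validity rule are illustrative):

```swift
import SwiftUI

struct EntryView: View {
    enum Focus: Hashable { case error, continueButton }
    @AccessibilityFocusState private var voFocus: Focus?
    @FocusState private var keyboardFocused: Bool
    @State private var entry = ""

    private var entryIsValid: Bool { Int(entry) != nil }  // placeholder rule

    var body: some View {
        VStack {
            TextField("Enter a value", text: $entry)
                .focused($keyboardFocused)

            if !entry.isEmpty, !keyboardFocused {
                if entryIsValid {
                    Button("Continue") { }
                        .accessibilityFocused($voFocus, equals: .continueButton)
                } else {
                    Label("Invalid value", systemImage: "exclamationmark.triangle")
                        .accessibilityFocused($voFocus, equals: .error)
                }
            }
        }
        // Each time the keyboard is dismissed, push VoiceOver focus to the
        // element that just appeared (error message or continue button).
        .onChange(of: keyboardFocused) { focused in
            guard !focused, !entry.isEmpty else { return }
            voFocus = entryIsValid ? .continueButton : .error
        }
    }
}
```

Because the focus assignment happens on every keyboard dismissal, repeated edits to the text field should re-trigger it, which is the case the sort-priority approach misses.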