I'm using Xcode 15.2, and I migrated my (macOS) project to an xcstrings file a while back. Now when I check the xcstrings file, all items are marked as "stale". When I add new localized strings in code, they don't show up in the xcstrings file. The xcstrings file is built correctly (into .lproj/Localizable.strings) when building.
Where can I check which source files are checked to update xcstrings status? "xcstringstool" appears to have a "sync" feature which reads "stringsdata" files, but there is no information in the xcstringstool help on where the stringsdata files come from.
If I create a new project I can see a "stringsdata" file being generated for each source file in the intermediate build products folder.
Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.
I have configured the permission usage description (NSLocalNetworkUsageDescription) in Info.plist, and the app's Local Network permission is already enabled, but when the app calls an endpoint on a LAN device, it still returns an error:
Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={_kCFStreamErrorCodeKey=50, NSUnderlyingError=0x302d79d40 {Error Domain=kCFErrorDomainCFNetwork Code=-1009 "(null)" UserInfo={_NSURLErrorNWPathKey=unsatisfied (Local network prohibited), interface: en0[802.11], ipv4, uses wifi, _kCFStreamErrorCodeKey=50, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask .<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask .<1>"
), NSLocalizedDescription=The Internet connection appears to be offline., NSErrorFailingURLStringKey=http://192.168.1.1:80/***, NSErrorFailingURLKey=http://192.168.1.1:80/***, _kCFStreamErrorDomainKey=1}
Hey there! Hope you are starting the year with great joy.
My situation
I'm building a new product that is based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift.
My problem
I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.
My discoveries so far
I've tried OCR, but it's too slow for what I'm building, so the only viable way I can think of is the Accessibility API.
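In case it helps frame the question, this is the direction I've been exploring: asking the element for the bounds of a character range via the parameterized kAXBoundsForRangeParameterizedAttribute attribute. This is only a sketch under my own assumptions (the helper function and the range handling are mine), and I'm not sure it's the intended approach.

import ApplicationServices

// Given a text AXUIElement and a character range, ask for the on-screen
// bounds of just that range instead of the whole element's frame.
func boundsOfText(in element: AXUIElement, range: CFRange) -> CGRect? {
    var cfRange = range
    guard let rangeValue = AXValueCreate(.cfRange, &cfRange) else { return nil }

    var result: CFTypeRef?
    let error = AXUIElementCopyParameterizedAttributeValue(
        element,
        kAXBoundsForRangeParameterizedAttribute as CFString,
        rangeValue,
        &result
    )
    guard error == .success, let value = result, CFGetTypeID(value) == AXValueGetTypeID() else {
        return nil
    }

    // The returned AXValue wraps a CGRect in screen coordinates.
    var rect = CGRect.zero
    guard AXValueGetValue(value as! AXValue, .cgRect, &rect) else { return nil }
    return rect
}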
Thank you in advance.
macOS > Settings > Keyboard > Text Replacement is a macOS/iOS feature I use extensively, but it is unavailable in quite a few macOS applications, including some Apple apps like Xcode. Other apps where I want Text Replacement to work but it doesn't include the Adobe CC suite, all my code editors (VC Studio, Sublime Text, BBEdit, emacs, etc.), and Firefox.
Does anybody know what the file path is for the file that stores the Text Replacement key/string pair data?
If I can access the .plist file (or similar) where the Text Replacement key/string pairs are stored, I will be able to convert it to JSON using regex and import it into the Firefox plugin that replicates Text Replacement functionality in Firefox for me. Ditto the other code editors with their own particular text-substitution functionality.
Background
Some of these apps have plugins or functional equivalents of Text Replacement, but I need a way to do the import/export dance to keep them in sync with my Text Replacement text pairs.
Sadly, even though we can select the Text Replacement table (in macOS but not in iOS versions), we can't copy that information. This seems to me a violation of good GUI design principles. Why allow selection of the entire table if we cannot copy it to the clipboard?
Nor can we import or export the table of text tuples. The Text Replacement GUI has no buttons to do this (consider this post also an FR for that).
Screen-capturing and running text recognition software over the PNG is not an option given:
a) all the unusual UTF-16 glyphs and combination glyphs I use;
b) that I'd like to script this as a multi-directional syncing application I can run periodically.
Typically macOS and app preferences are stored in plist files. I want to find such a file and convert it to JSON for importing into a Firefox plugin that replicates Text Replacement within Firefox.
I tried modifying Text Replacements by adding a new item to the list and clicking "Done", then filtering ~/Library for .plist files and sorting by "Date Modified", but nothing shows up with these values in it.
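For what it's worth, the conversion step itself is straightforward once the file is located; here is a minimal Swift sketch, where the path is purely a placeholder (finding the real location is the actual question):

import Foundation

// Placeholder path; the actual storage location is what I'm trying to find.
let plistURL = URL(fileURLWithPath: "/path/to/TextReplacements.plist")

do {
    let data = try Data(contentsOf: plistURL)
    // Parse whatever plist structure is found (dictionary or array).
    let plist = try PropertyListSerialization.propertyList(from: data, options: [], format: nil)
    // Re-serialize as JSON for import into the Firefox plugin.
    let json = try JSONSerialization.data(withJSONObject: plist, options: [.prettyPrinted])
    print(String(data: json, encoding: .utf8) ?? "")
} catch {
    print("Conversion failed: \(error)")
}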
Hello Apple Community,
I'm experiencing an issue with "Vocal Shortcuts" on iOS. I created a trigger in Vocal Shortcuts to run a specific shortcut, and it works perfectly on the first day. However, by the next day, the voice command stops functioning entirely. To make it work again, I have to disable and then re-enable Vocal Shortcuts in the settings.
I've tested this on multiple devices (iPhone 11, iPhone 13, and iPhone X), all running the latest iOS version, and the same problem occurs on each one. Is there any additional configuration needed, or could this be a bug?
Any advice or insights would be greatly appreciated!
Thank you in advance,
A common UI idiom in Apple's first party iOS apps is a circle icon with three dots in the upper right of the screen. This serves as a pop-up menu of more options. Some examples include:
Apple Music, Library tab
Photos, Album view
Reminders
In all these cases, VoiceOver reads this element as "More, Button".
In my SwiftUI app, I've implemented a visually identical button.
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise in line with the apps mentioned above?
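If the trailing "Double Tap To Activate The Picker" comes from a default hint, one unverified experiment (an assumption on my part, not a confirmed fix) would be to override it with an empty hint:

Menu {
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
.accessibilityHint("") // assumption: an empty hint may suppress the spoken default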
The issue is that I cannot automatically acquire Bluetooth keyboard focus in PHPickerViewController after enabling 'Full Keyboard Access' on my iPhone 14 running iOS 18.3.1. The keyboard focus in PHPickerViewController does show up, however, after I tap on the blank space of the PHPickerViewController. How can I make the focus appear there in the first place?
I'm using UINavigationController and calling setNavigationBarHidden(true, animated: false). Then I use this controller to present a PHPickerViewController with the configuration setup below.
self.configuration = PHPickerConfiguration()
configuration.filter = .any(of: filters)
configuration.selectionLimit = selectionLimit
if #available(iOS 15.0, *), allowOrdering {
    configuration.selection = .ordered
}
configuration.preferredAssetRepresentationMode = .current
Finally I set the delegate to PHPickerViewController and call UINavigationController.present(PHPickerViewController, animated: true) to render it.
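Roughly, the presentation step looks like this (a trimmed sketch of the flow described above; the configuration is the one built earlier, and the delegate type is assumed to be my coordinator):

import PhotosUI
import UIKit

// Present the picker from the (navigation-bar-hidden) navigation controller.
func presentPicker(from navigationController: UINavigationController,
                   configuration: PHPickerConfiguration,
                   delegate: any PHPickerViewControllerDelegate) {
    navigationController.setNavigationBarHidden(true, animated: false)
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = delegate
    navigationController.present(picker, animated: true)
}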
Also, I notice (in the first video) the animation showing at first and then disappearing.
I checked the release notes for the latest beta, and there doesn't seem to be a fix for this. But basically, the vibrations that you receive when you long-press a message to react, or hold down on an app on the Home Screen, seem to stop working after a while.
This issue recurs randomly.
Steps to repro:
Not fully sure on this, but I'm on an iPhone 16 Pro Max running the iOS 18.3 dev beta described in the title. I have the default haptics enabled, where you receive a vibration when you long-press a message in iMessage or Messenger, and also when you long-press an app on the Home Screen.
These seem to stop working, along with any other vibrations (apart from calls and notifications), after a while. The only workaround is to restart the iPhone entirely.
Has anyone else faced the same issue?
I did the 18.3 update over the weekend and every contact and their information (family names, addresses, photos etc) that was added to my phone over the last year is completely gone. I’ve spent hours on the phone with Apple and their “top” senior account employees with no resolution. I am told my case has been escalated to engineering and they will get back to me in one week. I have zero confidence my issue will be resolved. I’ve gone over and over every action done over the weekend and the only thing I did was erase some emails and do the update. There has to be a way they can see every action made on my phone to find the issue.
Hi guys, I'm facing an issue with the native interface for adding a card to Wallet. Does anyone have ideas on how to fix or work around this?
STEPS TO REPRODUCE:
Disable VoiceOver (Settings → Accessibility → VoiceOver → Off).
Connect an external keyboard and confirm that you can navigate other iOS interfaces with it.
In any app, present a PKAddPassesViewController with a valid .pkpass file.
When the Wallet “Add Pass” sheet appears, attempt to navigate using only the external keyboard (Tab/Arrow/Enter).
Observe that focus does not move to the Cancel or Add buttons, and no elements receive keyboard focus.
EXPECTED RESULT:
All interactive elements in PKAddPassesViewController (e.g., Cancel and Add) should be fully keyboard accessible without requiring VoiceOver. Users should be able to navigate, select, and complete actions using only a hardware keyboard.
ACTUAL RESULT:
Keyboard navigation is not possible.
No elements receive focus.
Users cannot activate Cancel or Add buttons using keyboard input.
The only way to interact is by touch or enabling VoiceOver, which does not satisfy keyboard accessibility requirements.
IMPACT:
Violates WCAG 2.1 Success Criterion 2.1.1 (Keyboard Accessible).
Prevents keyboard-only users (including users with motor disabilities) from adding passes to Wallet.
Affects users of external keyboards who rely on tab/arrow navigation.
Creates an inconsistent accessibility experience compared to other iOS system modals.
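For reference, a minimal sketch of the presentation in step 3, where "sample.pkpass" is just a placeholder file name:

import PassKit
import UIKit

// Present the Wallet "Add Pass" sheet for a bundled .pkpass file.
func presentAddPassSheet(from presenter: UIViewController) {
    guard
        let url = Bundle.main.url(forResource: "sample", withExtension: "pkpass"),
        let data = try? Data(contentsOf: url),
        let pass = try? PKPass(data: data),
        let controller = PKAddPassesViewController(pass: pass)
    else {
        return
    }
    presenter.present(controller, animated: true)
}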
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused.
According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup.
In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact.
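For context, here is a stripped-down sketch of the setup (tab titles and the selection binding are placeholders):

import SwiftUI

struct CustomTabBar: View {
    @Binding var selection: Int

    var body: some View {
        HStack {
            Button("First") { selection = 0 }
            Button("Second") { selection = 1 }
        }
        // Tried so far, with no audible difference in VoiceOver:
        .accessibilityElement(children: .contain)
        .accessibilityAddTraits(.isTabBar)
    }
}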
So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
Hello,
AVSpeechSynthesisVoice has an audioFileSettings attribute:
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]
This is declared in:
AVSpeechSynthesisVoice {
    ...
    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
}
How can we specify the audioFileSettings attributes on an AVSpeechSynthesisProviderVoice? Because AVSpeechSynthesisProviderVoice has no such field:
AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
Regards
If you are on the TikTok website in Safari, you are able to view the video that originally brought you to the website from the search results. But if you want to click on another video listed on the website, it claims you need to use the app to go further, and upon proceeding it just brings you to the App Store regardless of whether you already have the app or not, so you are unable to view the video displayed on the website without searching for it separately in the app.
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need?
For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient.
Is there a better way to handle this using VoiceOver?
I downloaded iOS 18.2 and Siri returned to the old iOS 17 Siri. It won't respond; I have to tap the button many times before it responds, and when it does, it goes away. Also, Genmoji isn't downloading.
I have an iPhone 15 Pro Max.
There is an issue with Help Books that started with the release of macOS 14.4. The issue is that when an app attempts to go directly to a Help Book page, the help viewer opens to the Help Book's main index page, rather than the specific page requested. As I investigated the issue I found that the requested page was actually part of help viewer's navigation history, and all I had to do was to click the Back navigation arrow and the requested page would be displayed. So it seems like the requested page is momentarily visited but is then (for whatever reason) quickly replaced by the main index page.
Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code that our app has used for more than a decade and has never caused us any issues. Everything works fine on macOS 14.3.0 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g. reworking the code, creating new indexes for the app's Help, etc.), but none of it seems to make a difference. As far as I can tell, the issue seems to stem from some change made to the OS.
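For reference, the call is essentially the following (the book and page names are placeholders, and whether AHGotoPage is importable into Swift via Carbon is an assumption on my part; our real code calls it from the same C/Objective-C path it always has):

import Carbon
import Foundation

// Placeholder book and page names for illustration.
let bookName = "MyApp Help" as CFString
let pagePath = "pgs/feature.html" as CFString

// Ask Help Viewer to open the specific page of the registered Help Book.
let status = AHGotoPage(bookName, pagePath, nil)
if status != 0 {
    NSLog("AHGotoPage failed with status %d", status)
}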
So my questions are:
Is this a known bug? And if so, is there any ETA on a fix?
Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, there are definitely accessibility defects that get caught and addressed in order of impact and business priority like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.
I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the 10 most central tasks are solid, but some supporting (yet also common) tasks have a contrast failure or a missing accessibility label, does that mean the whole app does not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations of that.
In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels accessibility features across tasks and devices, but realistically never be 100% free of defects for a given Apple Accessibility feature, even among core tasks.
As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a sort of baseline or measurement for inclusion. We plan to test all steps to complete a task, and log defects accordingly with an assigned timeline for fixing them (as would be true for functional defects).
The Home button cannot be pressed.
Attempting to enable Developer Mode says "press home to continue".
The usual accessibility option (AssistiveTouch) that is shown on iOS is not available on this screen,
so Developer Mode cannot be enabled.
This is an accessibility issue.
}
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                // Fall back to the output sample rate if 16 kHz is unavailable
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + " - Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}
void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        slider.value = volume;

        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        // Only report silence once nothing has been heard for silenceDuration
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
    }
}
private float GetAverageVolume()
{
    // Read the most recent samples from the looping microphone clip
    float[] samples = new float[128];
    int position = Microphone.GetPosition(null) - samples.Length;
    if (position < 0) position = 0;
    microphoneInput.GetData(samples, position);

    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
Problem:
When I build and run the app from Xcode, the microphone works fine, and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone. The debug logs indicate no microphone is detected.
Question:
Is there any additional configuration I need to do for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing the microphone access to fail in this scenario?
Thanks in advance for any insights!
Best,
Siddharth
Hello,
I have the following problem. I’m developing a NoCode app using the FlutterFlow platform and have been working on it for over a year.
This time, after publishing a new version of the app through FlutterFlow, I tried logging into App Store Connect, but I got an error saying that I had made too many login attempts and needed to try again later. However, I hadn't attempted to log in before that at all.
No matter how long I wait (24 hours, 48 hours), the same error keeps appearing, meaning I still can't access my account. Apple Support hasn't responded for 4 days, and in total, I've been locked out of my account for over 9 days.
Please help me understand what might be causing this issue. App Store Connect refuses to send me an SMS with the login code.