                Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.
  
    
                      [macOS 15.4] Game Controller Background Input Capture Broken - Accessibility App No Longer Functions
                    
                  
                  
                
                
                    
                      Our application,
https://apps.apple.com/us/app/gamecontroller-mapper/id6737088417
which maps game controller inputs to keyboard/mouse events system-wide, has stopped functioning properly after the macOS 15.4 update. Specifically, the app can no longer capture game controller inputs when running in the background, severely impacting its core functionality.
Environment
macOS version: 15.4
Previous working versions: All versions prior to 15.4
App type: Background utility with accessibility permissions
Hardware: All game controller brands compatible with macOS
Detailed Description
Before macOS 15.4
Our application correctly captured game controller inputs from any controller brand connected to the Mac and successfully translated them to keyboard/mouse events system-wide. Users could control any application (e.g., scrolling through documents in Preview using controller buttons) while our app ran in the background with accessibility permissions granted.
After macOS 15.4
The application only works when it has active focus (is in the foreground). When any other application gains focus, our app completely stops receiving or detecting any input events from the game controller while running in the background. For instance, pressing the 'down' button on the controller while another app is active results in no event being registered within our application.
We've tried updating the app to run as a menu bar accessory, but the issue persists.
Steps to Reproduce
Install our application on macOS 15.3 or earlier
Grant accessibility permissions when prompted
Connect a compatible game controller (e.g., Xbox or other controller)
Open another application (e.g., Preview with a PDF document)
Press buttons on the controller to navigate the document without touching the keyboard
Expected result on 15.3: Controller inputs are translated to keyboard events, even when our app is in the background
Upgrade to macOS 15.4
Repeat steps 2-5
Actual result on 15.4: Controller inputs are only translated to keyboard events when our application has focus
Technical Implementation
Our app uses the following (a minimal sketch of this setup follows the list):
CGEvent.tapCreate() to create a global event tap
CGEvent for simulating keyboard and mouse events
GCController.extendedGamepad?.valueChangedHandler for detecting controller inputs
Proper NSAccessibilityUsageDescription and appropriate entitlements
GCController.shouldMonitorBackgroundEvents = true to ensure controller events continue when the app is inactive
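For illustration only, here is a minimal sketch of that setup; the type and method names (ControllerBridge, start, postKey) are made up for the example and are not the shipping code:

import GameController
import CoreGraphics

// Sketch: observe controller connections, then translate a button press into a
// synthetic key event posted system-wide (virtual key 125 is the down arrow).
// Posting CGEvents this way requires the Accessibility permission.
final class ControllerBridge {
    private var connectObserver: NSObjectProtocol?

    func start() {
        // Ask the system to keep delivering controller events while the app is inactive.
        GCController.shouldMonitorBackgroundEvents = true

        connectObserver = NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect, object: nil, queue: .main
        ) { [weak self] note in
            guard let controller = note.object as? GCController else { return }
            controller.extendedGamepad?.valueChangedHandler = { gamepad, _ in
                if gamepad.dpad.down.isPressed {
                    self?.postKey(125) // down arrow
                }
            }
        }
    }

    private func postKey(_ keyCode: CGKeyCode) {
        let down = CGEvent(keyboardEventSource: nil, virtualKey: keyCode, keyDown: true)
        let up = CGEvent(keyboardEventSource: nil, virtualKey: keyCode, keyDown: false)
        down?.post(tap: .cghidEventTap)
        up?.post(tap: .cghidEventTap)
    }
}

This mirrors the behavior described above: the valueChangedHandler callbacks keep firing in the background on 15.3 and earlier, but on 15.4 they stop as soon as another application gains focus.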
Possible Relation to Recent Changes
We noticed in the macOS 15.4 Release Notes:
Game Controller - Resolved Issues:
Fixed: Game controllers might stop responding when accessibility features, such as Voice Over, are enabled. (141497799)
We suspect this fix might have introduced a regression or intentional limitation affecting applications like ours that rely on background event simulation with game controller input.
Impact
This change severely impacts:
Applications designed to use game controllers as assistive input devices for users who may have difficulty using traditional keyboard and mouse inputs
Applications for media control, presentation navigation, and other similar use cases
Users who rely on our application for accessibility purposes
Questions
Is this an intentional security change or an unintended side effect of the controller fix mentioned in the release notes?
Are there any new APIs or alternative approaches we should implement to restore functionality?
If this is a system bug, when can we expect a fix?
We would greatly appreciate any guidance on how to restore our application's functionality. Thank you for your assistance.
                    
                  
                
                    
If you are on the TikTok website in Safari, you can view the video that originally brought you to the site from the search results, but if you tap another video listed on the page, it claims you need to use the app to go further. Proceeding just takes you to the App Store, regardless of whether you already have the app installed, and you cannot view the video shown on the website without searching for it separately in the app.
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
Many Bangladeshi iPhone users were upset when Apple changed the Bangla font in the most recent iOS version (18.4.1). We prefer the old Bangla typeface and want it to return. Please consider this.
                    
                  
                
                    
                      Hello community,
We're designing an app that can optionally be controlled by a stylus with a mesh tip. In this case, the mesh tip we're using is 5 mm in diameter. Mesh-tip contact detection seems unstable at this size, although it works better with a larger diameter.
Is it possible to access a setting in iOS that lets you define the minimum contact area needed to detect a contact on the screen? This would enable us to use this 5 mm stylus.
Best regards,
Edwin
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
I updated to the 18.3 beta but lost the video and audio options with that update. I tried to restore with iTunes on Windows 11 and it got stuck partway through. I force-powered-off the iPad; after two tries it turned off, and the screen went blank. I have tried everything to turn it on, but it won't power on. Please tell me a solution; I have followed all the advice I could find on the internet. It was 90% charged and working superbly before this. Now there is no charging icon and no sign of life. How can I turn it on?
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
I can’t take a screenshot using AssistiveTouch after installing iOS 26 beta 2.
                    
                  
                
                    
I downloaded iOS 18.2 and Siri returned to the old iOS 17 Siri. It won't respond; I have to click the button many times before it responds, and when it does, it goes away. Also, Genmoji isn't downloading.
I have an iPhone 15 Pro Max.
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
I am building a language learning app for an unlisted primary language. My plan is to go with English. Any other suggestions or heads-up? Check the screenshot.
It's unfortunate that I have to tag a language learning app incorrectly, and a tag for that language probably does not exist anywhere in the Apple system.
                    
                  
                
                    
                      We are unable to programmatically enable AppleScript automation for VoiceOver on macOS 15 (Sequoia)
In macOS 15, Apple moved the VoiceOver configuration from:
~/Library/Preferences/com.apple.VoiceOver4/default.plist
to a sandboxed path:
~/Library/Group Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
Steps to Reproduce:
Use a macOS 15 (ARM64) machine (or GitHub Actions runner image with macOS 15 ARM).
Open VoiceOver:
open /System/Library/CoreServices/VoiceOver.app
Set the SCREnableAppleScript flag to true in the new sandboxed .plist:
plutil -replace SCREnableAppleScript -bool true ~/Library/Group\ Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
Confirm csrutil status is either disabled or not enforced.
Attempt to control VoiceOver via AppleScript (e.g., using osascript voiceOverPerform.applescript).
Observe that the AppleScript command fails with no useful output (exit code 1), and VoiceOver does not respond to automation.
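For reference, one way to surface the underlying error (rather than osascript's bare exit code 1) is to run a single VoiceOver command through NSAppleScript and inspect the error dictionary. This is only a diagnostic sketch, not a fix; the "output" command is assumed to be present in VoiceOver's scripting dictionary:

import Foundation

// Diagnostic sketch: try one VoiceOver AppleScript command and print the error
// dictionary, which is usually more informative than osascript's exit code.
let source = """
tell application "VoiceOver"
    output "automation test"
end tell
"""
var errorInfo: NSDictionary?
if let script = NSAppleScript(source: source) {
    script.executeAndReturnError(&errorInfo)
}
if let errorInfo {
    print("VoiceOver automation failed: \(errorInfo)")
} else {
    print("VoiceOver automation succeeded")
}

The SCREnableAppleScript key has historically corresponded to the "Allow VoiceOver to be controlled with AppleScript" setting in VoiceOver Utility; whether VoiceOver on macOS 15 actually reads the new sandboxed plist is the open question here.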
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
Tags: macOS, Accessibility, App Sandbox, AppleScript
                    
                      Voice Control Disabling System Services After Reboot
I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case—what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.
To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services—including FaceTime, Apple Music, and News—stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.
I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction—perhaps between the neural engine and the new Wi-Fi chip—rather than a purely software problem.
I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution. I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion. Thank you for your attention to this matter.
Sincerely,
Donald Spencer Kirby
Dayton, Ohio
                    
                  
                
                    
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + "- Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
            if (Time.time - lastVoiceTime > silenceDuration)
            {
                Debug.Log("User is silent.");
                debugText.text = volume.ToString() + " - User is silent.";
            }
            slider.value = volume;
        }
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
Problem:
When I build and run the app from Xcode, the microphone works fine, and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone. The debug logs indicate no microphone is detected.
Question:
Is there any additional configuration I need to do for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing the microphone access to fail in this scenario?
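One thing worth ruling out (purely an assumption on my side, not a confirmed cause): outside of an Xcode run the app still needs a microphone usage description in its Info.plist, and the user must have granted the record-permission prompt. A native-side check such as the following (plain Swift, shown only as a diagnostic sketch) reports the current permission state:

import AVFoundation

// Diagnostic sketch: log the current record permission and request it if needed.
// If this reports "denied" in the non-Xcode build, the issue is permission/Info.plist
// rather than the Unity Microphone code above.
func checkMicrophonePermission() {
    let session = AVAudioSession.sharedInstance()
    switch session.recordPermission {
    case .granted:
        print("Microphone permission granted")
    case .denied:
        print("Microphone permission denied")
    case .undetermined:
        session.requestRecordPermission { granted in
            print("Microphone permission granted: \(granted)")
        }
    @unknown default:
        print("Unknown microphone permission state")
    }
}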
Thanks in advance for any insights!
Best,
Siddharth
                    
                  
                
                    
We are using the inline PhotosPicker introduced with iOS 17.0. Accessibility navigation using touch gestures works fine, but keyboard navigation doesn't work properly.
The tab/arrow-based keyboard navigation can't move from the native app process to the inline PhotosUI process or vice versa. This is logged as a high-severity bug by our accessibility team.
Please look into this.
Sample code for repro:
https://github.com/saalisumer/AccessibilityIssueInlinePhotoPickerIOS
Repro video:
https://github.com/saalisumer/AccessibilityIssueInlinePhotoPickerIOS/blob/main/Simulator%20Screen%20Recording%20-%20iPhone%2015%20Pro%20-%202024-12-16%20at%2016.27.48.mp4
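For readers who can't open the linked project, a minimal inline picker along the lines described above looks roughly like this (the view and property names are illustrative, not taken from the repro project):

import SwiftUI
import PhotosUI

// Minimal inline PhotosPicker (iOS 17+). Keyboard focus reportedly cannot cross
// from the host app into this embedded picker view and back.
struct InlinePickerView: View {
    @State private var selection: [PhotosPickerItem] = []

    var body: some View {
        PhotosPicker(selection: $selection, matching: .images) {
            Text("Pick photos")
        }
        .photosPickerStyle(.inline)
        .frame(height: 300)
    }
}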
                    
                  
                
                    
I was able to add shortcuts with parameters and use them from the Shortcuts app in iOS 17; nevertheless, the Siri intent never worked.
I upgraded both my app and my phone to iOS 18.
Now the shortcut only appears in the Shortcuts app if it has no parameter. When I try to set a parameter, the shortcut no longer appears in the Shortcuts app.
struct ShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenAppIntent(),
            phrases: [
                "Show \(\.$screen) in \(.applicationName)"
            ],
            shortTitle: "Open",
            systemImageName: "iphone.badge.play"
        )
    }
}

struct OpenAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Show"
    static let description = IntentDescription("Shows a screen.")
    static var openAppWhenRun: Bool = true
    static var authenticationPolicy = IntentAuthenticationPolicy.alwaysAllowed

    @Parameter(title: "screen")
    var screen: String

    @MainActor
    func perform() async throws -> some IntentResult {
        return .result()
    }
}

extension ScreenOption: AppEntity {
    struct OpenAppQuery: EntityQuery {
        @IntentParameterDependency<OpenAppIntent>( \.$screen )
        var openAppIntent

        func entities(for: [ScreenOption.ID]) async throws -> [ScreenOption] {
            return []
        }

        func suggestedEntities() async throws -> [ScreenOption] {
            return []
        }
    }

    var displayRepresentation: DisplayRepresentation {
        .init(stringLiteral: "\(title)")
    }

    static var defaultQuery: OpenAppQuery = OpenAppQuery()
    static var typeDisplayRepresentation: TypeDisplayRepresentation = .init(name: "Screen")
}

extension ScreenOption: EntityIdentifierConvertible {
    static func entityIdentifier(for entityIdentifierString: String) -> ScreenOption? {
        allCases.filter { $0.rawValue == entityIdentifierString }.first
    }

    public var entityIdentifierString: String {
        rawValue
    }

    public init?(entityIdentifierString: String) {
        guard let screenOption = ScreenOption.entityIdentifier(for: entityIdentifierString)
        else { return nil }

        self = screenOption
    }
}
                    
                  
                
                    
When my app is in the background, I create a Live Activity through a push notification, using a token obtained from pushToStartTokenUpdates, and this process works fine. However, without opening the app, how can I retrieve the new push token for this Live Activity and use it for subsequent updates to the Live Activity content?
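For reference, this is how the per-activity update token is normally read when the app does get execution time; whether it can be obtained with no app runtime at all is exactly the open question above. The attributes type name below is hypothetical:

import ActivityKit

// Hypothetical attributes type; the real app's type would be used instead.
struct MyWidgetAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var message: String
    }
}

// Sketch: observe newly started Live Activities (including push-to-start ones)
// and read each one's update push token. Both loops only run while the process
// is alive and scheduled.
func observeLiveActivityTokens() {
    Task {
        for await activity in Activity<MyWidgetAttributes>.activityUpdates {
            Task {
                for await tokenData in activity.pushTokenUpdates {
                    let token = tokenData.map { String(format: "%02x", $0) }.joined()
                    // Forward this update token to the push server for subsequent
                    // content-state updates of this specific activity.
                    print("Live Activity update token: \(token)")
                }
            }
        }
    }
}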
                    
                  
                
                    
I'm working on a BLE-connected device that uses ANCS and the system Clock alarm to deliver alarm notification events for hearing-impaired people. It used to work up to the iPhone 13 on the latest iOS 18.x. Starting with the iPhone 14 onward (iOS 18.x), the system Clock alarm notification is no longer delivered.
Is there any reason for this to happen?
Is anyone aware of this behaviour?
Any suggestion would be really appreciated.
Cheers
                    
                  
                
                    
                      Hi everyone,
After installing the macOS beta (Tahoe 26.0) on my MacBook Pro M3, I’ve noticed two issues:
Significant increase in system temperature
The laptop feels hot even with light usage like Safari and Figma
Rapid battery drain
Battery is dropping unusually fast compared to macOS Sonoma.
I’ve tried restarting the device.
I’m aware this is a beta, but I'm just wondering:
Is anyone else experiencing this?
Is this a known issue?
Would love to hear if others are facing similar problems or if it might be something specific to my setup.
Thanks in advance!
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
                      Hello everyone,
I’d like to report an issue I’ve encountered when using a Bluetooth mouse together with AssistiveTouch on iPhone running iOS 16.5.
This has also been reported via Feedback Assistant with
Feedback ID: FB17806167
Description:
When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation.
Specifically:
The pointer cannot move past the center of the screen
Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape.
This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.
Steps to Reproduce:
Enable AssistiveTouch
Connect a Bluetooth mouse to the iPhone
Rotate the device to landscape orientation
Try moving the mouse pointer across the screen
→ Notice that:
Pointer cannot move past the center
Horizontal/vertical input is interpreted incorrectly (as if still in portrait)
Expected Behavior:
The mouse pointer should move across the entire screen correctly, regardless of device orientation.
Actual Behavior:
In landscape orientation, the pointer is either restricted to part of the screen or misaligned.
It behaves as if the device is still in portrait.
Horizontal mouse movement causes vertical pointer movement, and vice versa
User experience feels broken and unintuitive
Feature Suggestion:
Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS.
I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.
Thanks in advance for any insights or suggestions.
Best regards,
Jannis
                    
                  
                
                    
curl -o actions-runner-osx-x64-2.321.0.tar.gz -L https://github.com/actions/runner/releases/download/v2.321.0/actions-runner-osx-x64-2.321.0.tar.gz
echo "b2c91416b3e4d579ae69fc2c381fc50dbda13f1b3fcc283187e2c75d1b173072 actions-runner-osx-x64-2.321.0.tar.gz" | shasum -a 256 -c
tar xzf ./actions-runner-osx-x64-2.321.0.tar.gz
./config.sh --url https://github.com/funds123/tpsdk --token BMOPQPZNQKB57MXE4BDDKKTHMTHJO
./run.sh
Save this configuration as com.github.actions.runner.plist in
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General
                    
                      Hello,
the AVSpeechSynthesisVoice class has an audioFileSettings attribute:
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]
This is declared in
AVSpeechSynthesisVoice {
    ...
    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
}
How can we specify these audioFileSettings attributes in an AVSpeechSynthesisProviderVoice?
There is no such field in AVSpeechSynthesisProviderVoice:
AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
Regards
                    
                  
                
              
                
              
              
                
Topic: Accessibility & Inclusion
SubTopic: General