Hey all — hoping someone here has dealt with this before.
I’m testing an iOS app via TestFlight, and when I originally got access, I didn’t have an iPhone. So I signed in with my Apple ID on my girlfriend’s iPhone and used TestFlight there. Everything worked fine.
Now I finally have my own iPhone (iPhone 16), downloaded TestFlight, signed in with the same Apple ID, and had the developer resend the invite. But when I tap "Open in TestFlight" from the invite email, I get this error:
“Couldn’t load app because your Apple account has already been associated to this app.”
The dev tried removing me as a tester and re-adding me, I’ve deleted TestFlight from both phones, rebooted, reinstalled, waited in between — still no luck. Even tried opening the invite link in Safari instead of Mail.
Is there any way to get Apple to fully reset the association with the old device so I can use TestFlight on my new iPhone? Or do I really need to make a new Apple ID just to get around this?
Any help would be huge — thanks!
I have a screen in my app that can be represented by the following layout. I would like this screen to be navigable with Full Keyboard Access, but there is unexpected behavior.
Path:
Press Tab on the keyboard -> the whole scroll view is targeted and button1, the first button inside, is selected.
Arrow down -> selection changes to button3
Arrow up -> selection changes back to button1
So button2 is always skipped; there is no way to navigate to it with the left/right arrows.
Using Tab+F and searching for "button2", button2 is correctly selected, so it is selectable but for some reason not reachable when stepping through the elements.
Putting empty text in the Text views causes the buttons to be vertically aligned, and then everything works correctly, but that is not an option.
import SwiftUI

public struct BugReportView: View {
    public var body: some View {
        ScrollView {
            VStack(spacing: .zero) {
                Button("button1", action: { })
                HStack {
                    Text("some text")
                    Text("some text2")
                    Button("button2", action: { })
                }
                Button("button3", action: { })
            }
        }
    }
}
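For what it's worth, a speculative workaround sketch: explicitly grouping the HStack as an accessibility container and raising button2's sort priority can change how the element tree is walked. Both modifiers are standard SwiftUI API, but whether they affect Full Keyboard Access arrow-key order here is an assumption to test, not a confirmed fix.

import SwiftUI

public struct BugReportWorkaroundView: View {
    public var body: some View {
        ScrollView {
            VStack(spacing: .zero) {
                Button("button1", action: { })
                HStack {
                    Text("some text")
                    Text("some text2")
                    Button("button2", action: { })
                        // Hint to the accessibility tree that button2 should
                        // be ordered ahead of its siblings' default order.
                        .accessibilitySortPriority(1)
                }
                // Treat the HStack as an explicit container of its children.
                .accessibilityElement(children: .contain)
                Button("button3", action: { })
            }
        }
    }
}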
iOS 18.1 adds a new NFC capability. Does it support emulating a MIFARE Classic 1K tag on iPhone? I want to use an iPhone to emulate a MIFARE Classic 1K tag as a home key.
https://developer.apple.com/support/nfc-se-platform/
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi, I'm a new Mac user, having been a long-time PC user and software developer. I also have a mobility impairment that has led me to try Voice Control as a replacement for Dragon NaturallySpeaking on my PC.
I have been trying to use Parallels with a Windows 11 VM and Dragon for my remote work, but that seems to have broken when I downloaded the latest macOS beta.
Ideally I'd like to use Voice Control over a VPN/Remote Desktop Connection or, in a pinch, Chrome Remote Desktop. The problem I'm running into is that macOS does not seem to recognize that I am in a text field or other control when I am in the remote application.
I have a utility in Windows that will allow me to voice type into an application window even if the cursor is not over a control, but I can't seem to figure out a way to do that in macOS.
Is there a way to do what I want to do? Is there a more capable voice recognition software package for macOS?
I am running Sequoia 15.2 beta 3 at the moment.
Topic:
Accessibility & Inclusion
SubTopic:
General
It appears iOS only comes with low-quality voices installed.
iOS requires the user to go into Settings to download the higher-quality voices used with AVSpeechUtterance.
There doesn't seem to be any API that can be used to make this process easier for the app user.
Is there a way / API that would allow an app to download and use a higher-quality voice?
Will Apple ever install higher-quality voices by default?
We really want to use the text-to-speech API in iOS; however, the very high amount of user friction involved in getting high-quality voices is stopping us. I would appreciate a response.
Thanks
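For reference, there is no public API I know of that downloads a voice programmatically; the user has to add enhanced or premium voices in Settings > Accessibility > Spoken Content > Voices. What an app can do is inspect what is installed and pick the best available voice. A minimal sketch (the en-US filter is just an example):

import AVFoundation

// Keep a strong reference; a locally scoped synthesizer can be
// deallocated before it finishes speaking.
let synthesizer = AVSpeechSynthesizer()

// List the installed voices for a language and report their quality.
let voices = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == "en-US" }
for voice in voices {
    print("\(voice.name) [\(voice.identifier)] quality: \(voice.quality.rawValue)")
}

// Prefer premium, then enhanced, then fall back to the default voice.
let best = voices.first { $0.quality == .premium }
    ?? voices.first { $0.quality == .enhanced }
    ?? AVSpeechSynthesisVoice(language: "en-US")

let utterance = AVSpeechUtterance(string: "Hello")
utterance.voice = best
synthesizer.speak(utterance)

(The .premium case was added in iOS 16, so that check needs a recent SDK.)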
Hi,
I’ve been reviewing the Apple Wallet provisioning documentation (Getting Started with Apple Pay In-App Provisioning: Verification, Security, Wallet Extensions) and had a few questions regarding the color path recommendation (Green, Yellow, Orange, Red) returned during the in-app provisioning flow:
Who determines the color path—is it Apple directly, the Payment Network Operator (PNO), or both?
What criteria are used to determine the color path (e.g., device info, Apple ID reputation, past provisioning attempts)?
At what point in the provisioning flow is the color path recommendation received?
Is it included in the response after the PKAddPaymentPassRequest is submitted?
Is it accessible through any specific property or callback in the delegate method?
Additionally, for Orange Path with Reason Code 0G, I understand that in-app verification is not allowed and must be handled via tenured channels (e.g., SMS/email). Can you confirm if this logic still applies for requests initiated from within the issuer's iOS app?
Would appreciate any clarification or pointers to related documentation.
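For reference, here is the public client-side surface in PassKit, which may help locate where responses arrive; as far as I can tell, none of these callbacks expose a color path value directly, so the recommendation appears to stay between Apple, the issuer host, and the PNO. A minimal sketch with placeholder card data:

import PassKit
import UIKit

final class ProvisioningCoordinator: NSObject, PKAddPaymentPassViewControllerDelegate {
    func start(from presenter: UIViewController) {
        guard let config = PKAddPaymentPassRequestConfiguration(encryptionScheme: .ECC_V2) else { return }
        config.cardholderName = "Jane Appleseed"   // placeholder
        config.primaryAccountSuffix = "1234"       // placeholder
        guard let controller = PKAddPaymentPassViewController(requestConfiguration: config, delegate: self) else { return }
        presenter.present(controller, animated: true)
    }

    func addPaymentPassViewController(_ controller: PKAddPaymentPassViewController,
                                      generateRequestWithCertificateChain certificates: [Data],
                                      nonce: Data,
                                      nonceSignature: Data,
                                      completionHandler handler: @escaping (PKAddPaymentPassRequest) -> Void) {
        // Send certificates/nonce/nonceSignature to the issuer server, which
        // calls the PNO; the encrypted pass data comes back in this request.
        // Any risk/path decision happens server-side, not in this callback.
        handler(PKAddPaymentPassRequest())
    }

    func addPaymentPassViewController(_ controller: PKAddPaymentPassViewController,
                                      didFinishAdding pass: PKPaymentPass?,
                                      error: Error?) {
        // Provisioning success or failure surfaces here; there is no
        // public property carrying the Green/Yellow/Orange/Red path.
        controller.dismiss(animated: true)
    }
}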
I have a Swift app. In the latest version I changed some class models. The old version used archivedData(withRootObject:) to save the class model, and the latest version now crashes with -[NSKeyedUnarchiver decodeObjectForKey:]: cannot decode object of class.
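If the crash comes from a class rename (keyed archives record the class name, often with the module prefix), a common fix is to map the old archived name to the current class before unarchiving. A sketch, where NewModel, OldModel, and MyApp.OldModel are placeholders for the actual names:

import Foundation

// Current version of the model class (placeholder).
final class NewModel: NSObject, NSSecureCoding {
    static var supportsSecureCoding: Bool { true }
    let name: String
    init(name: String) { self.name = name }
    required init?(coder: NSCoder) {
        name = coder.decodeObject(of: NSString.self, forKey: "name") as String? ?? ""
    }
    func encode(with coder: NSCoder) {
        coder.encode(name as NSString, forKey: "name")
    }
}

// Map the names the old archives recorded to the renamed class.
NSKeyedUnarchiver.setClass(NewModel.self, forClassName: "OldModel")
NSKeyedUnarchiver.setClass(NewModel.self, forClassName: "MyApp.OldModel")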
Topic:
Accessibility & Inclusion
SubTopic:
General
I have a TabView with a sample tabItem as follows:
.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .accessibilityLabel("Import Text")
}

But turning on the accessibility setting for larger display sizes does not seem to have any effect, nor do Dynamic Type font sizes:

.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .font(.largeTitle)
        .accessibilityLabel("Import Text")
}

The tabItems appear at a fixed size. The tab contents scale well, so this does not look pleasant at all.
Is this a known bug in SwiftUI?
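As far as I can tell, tab bar items are fixed-size by design rather than a bug; one option for larger text users is the system Large Content Viewer, which shows an enlarged HUD on long press. SwiftUI exposes it for custom controls (iOS 15+); this would mean replacing the stock tab bar, so treat it as a sketch:

import SwiftUI

struct ImportTabButton: View {
    var body: some View {
        Button(action: { /* switch tab */ }) {
            Label("Import", systemImage: "doc.on.doc")
        }
        // Long-pressing with large accessibility text sizes enabled shows
        // an enlarged version of this label in the Large Content Viewer.
        .accessibilityShowsLargeContentViewer {
            Label("Import Text", systemImage: "doc.on.doc")
        }
    }
}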
Topic:
Accessibility & Inclusion
SubTopic:
General
I have just purchased an Apple Developer account from Bangladesh, and it sent codes perfectly the first 2-3 times I logged in. But after that, no matter what I do, it doesn't send the verification code and I am stuck. I now cannot log in, and this is extremely frustrating.
Updated to iOS 26 beta and now the TV remote app in the control center won’t open. I’ve tried the following:
Restart phone
Remove shortcut and re-add
Can't find any other troubleshooting methods for this issue online, so I'm guessing it's a new problem.
Topic:
Accessibility & Inclusion
SubTopic:
General
Amongst the two languages my app would have, let's say 10% and 90%.
I am launching an app with an unlisted primary language. I consider whatever is inside the app as the primary language, and that won't be English. The secondary language
Hello, we completed an Apple store setup, but the invoice for the payment we made hasn't arrived yet. Could you give me some information on where I can access the invoice?
Topic:
Accessibility & Inclusion
SubTopic:
General
I need to direct text-to-speech generated audio from my app simultaneously to a bluetooth speaker device AND to the internal iPad speaker. The app uses AVSpeechSynthesizer and several third party speech engines. How best to do this?
I noticed the outputChannels property on AVSpeechSynthesizer... are there any examples of how to use this?
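A sketch of one approach, assuming the multi-route audio session category exposes both the built-in speaker and the second device as simultaneous output ports; note that multi-route is documented mainly for wired/USB/HDMI routes, so whether a given Bluetooth profile participates is something to verify on hardware:

import AVFoundation

func speakOnAllRoutes(_ text: String, using synthesizer: AVSpeechSynthesizer) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.multiRoute, mode: .default, options: [])
    try session.setActive(true)

    // Gather the channels of every current output port.
    let channels = session.currentRoute.outputs.flatMap { $0.channels ?? [] }

    // Have the synthesizer honor our session and target all channels.
    synthesizer.usesApplicationAudioSession = true
    synthesizer.outputChannels = channels

    synthesizer.speak(AVSpeechUtterance(string: text))
}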
Topic:
Accessibility & Inclusion
SubTopic:
General
After updating to the iOS 26 Beta version, the screenshot option within the AssistiveTouch menu has stopped working. Tapping on the "Screenshot" icon does not perform any action.
Topic:
Accessibility & Inclusion
SubTopic:
General
Issue:
When using the shortcut Command + Delete to clear a line of text, the next character I type in Thai unexpectedly appears as an English character, even though the input source is still set to Thai. After that, subsequent characters return to Thai as expected.
Details:
Affected apps: Notes, Messages, and some other native apps
Not affected: Browser text fields (Safari, Chrome, etc.)
Does not occur when using Option + Delete or just Delete
macOS [insert beta version + build number]
Mac model: [insert model]
Input sources: Thai – Kedmanee, English – U.S.
Steps to reproduce:
Open Notes (or Messages).
Switch to Thai input.
Type a few Thai words.
Press Command + Delete.
Type again — the first character shows up in English.
Expected:
First character should remain in Thai, consistent with the active input source.
Actual:
First character shows as English, then input switches back to Thai.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hello,
I had submitted a question to clarify which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182.
The question stems from perhaps a more direct question about specific components: do tablists and disclosures natively include haptics, a screen reader hint, or other state or properties that indicate to screen reader users where the component begins or ends?
In some web experiences there is screen reader hint text stating "end of..." or "entering" as a way to define the boundaries of these inline dialogs.
I had asked about haptics in the prior thread because I do not recall a natively implemented version of this, aside from some haptic cues I have not experienced consistently, so I am not sure whether those are an intended native Swift implementation or something custom.
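For context on the web-style pattern, a custom approximation is possible by attaching hints to the boundary elements; this is a sketch of an app-side convention, not a claim that the system provides such announcements or haptics natively:

import SwiftUI

struct SettingsTabList: View {
    var body: some View {
        HStack {
            Button("General") { }
                .accessibilityHint("Start of tab list")
            Button("Privacy") { }
            Button("Advanced") { }
                .accessibilityHint("End of tab list")
        }
        // Group the buttons so assistive tech treats them as one container.
        .accessibilityElement(children: .contain)
    }
}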
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
iOS
Accessibility
Sound and Haptics
Core Haptics
I was trying to achieve accurate positioning with UWB on an iPhone 16 in India, but couldn't find any option to enable it in Settings. I am using the Qorvo Nearby Interaction app to communicate with my custom UWB tag (a DWM3001 by Qorvo).
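For reference, the accessory flow the Qorvo app wraps looks roughly like the sketch below. There is no user-facing UWB toggle; availability depends on hardware support and regional regulatory approval, so the capability check is the main thing an app can do. Here accessoryConfigData stands for the configuration blob the tag shares, typically over BLE:

import NearbyInteraction

func startRanging(accessoryConfigData: Data, delegate: NISessionDelegate) throws -> NISession? {
    // On unsupported hardware, or where UWB is disabled, this is false.
    guard NISession.deviceCapabilities.supportsPreciseDistanceMeasurement else { return nil }

    let config = try NINearbyAccessoryConfiguration(data: accessoryConfigData)
    let session = NISession()
    session.delegate = delegate   // NISession holds its delegate weakly
    session.run(config)
    return session                // keep a strong reference while ranging
}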
Topic:
Accessibility & Inclusion
SubTopic:
General
My app uses Core Data (iOS 13.0+) combined with iCloud to store data; this automatically manages synchronization between Core Data and iCloud. Some users have reported that after going abroad, their original data disappeared, and when they returned to China the data displayed normally again. I'm located in Mainland China, and I've learned that iCloud data in China is stored in Guizhou-Cloud Big Data (the data center in Guizhou). Could this problem be caused by display abnormalities resulting from switching between the iCloud data storage centers accessed in different regions?
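For reference, the stack in question is presumably NSPersistentCloudKitContainer, the automatic Core Data + iCloud sync introduced in iOS 13. A minimal sketch, with "Model" standing in for the app's actual model name, showing where sync errors would surface:

import CoreData

let container = NSPersistentCloudKitContainer(name: "Model")
container.loadPersistentStores { _, error in
    if let error {
        // Account or region issues tend to surface here or through remote
        // change notifications rather than silently, so logging helps.
        print("Store load failed: \(error)")
    }
}
// Pull CloudKit-synced changes into the UI context automatically.
container.viewContext.automaticallyMergesChangesFromParent = true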
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi everyone,
My team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with Locked-In Syndrome using Brain-Computer Interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.
Problem 1: Simulating Eye Tracking in Simulator
We are testing onHover with Send pointer to the device under I/O > Input in the simulator, and while it mostly works (a bit laggy), we found that onHover won't function on the actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate Focus-based gaze detection without a real Vision Pro?
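For what it's worth, a minimal sketch of the FocusState-based pattern the post mentions; whether system focus actually follows gaze for this control on device is exactly the open question, so this only shows the wiring (the two-parameter onChange needs iOS 17 / visionOS 1):

import SwiftUI

struct GazeTargetView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        Button("Tap target") {
            // The BCI "1" signal would trigger this action in-app.
        }
        .focusable()
        .focused($isFocused)
        .onChange(of: isFocused) { _, focused in
            // React when focus lands on or leaves the control.
            print("Focus changed: \(focused)")
        }
    }
}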
Problem 2: WebSocket-triggered "Click" doesn't work outside the app
We successfully use WebSocket to send a custom signal (a "1" from the brain signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works.
We suspect this is due to sandbox restrictions or lack of system-level permissions.
Is it possible in any way to:
Trigger interaction events outside the app using custom input (like BCI via WebSocket)?
Access system-wide click/tap simulation APIs from within visionOS apps?
Integrate this with accessibility services (like Voice Control or AssistiveTouch)?
We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods in visionOS.
Thanks in advance for any insight you can provide!
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Xcode
Accessibility
iPad and iOS apps on visionOS
I have a UITextField in my application for entering a state. If I tap on it, a UIPickerView pops up and lets the user select a state (but they can still type too).
The issue relates to Full Keyboard Access. If we select the UITextField using an external keyboard, the UIPickerView appears, but in order to reach it, the user has to tab through the whole view controller, since the UIPickerView sits at the end of the focus order.
What would be nice is to a) move focus directly to the UIPickerView (have it highlighted in blue and scrollable right away with keyboard) or b) make the UIPickerView the next view that's accessible when tabbing over or using the arrow keys.
I've tried using:
UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This ended up only announcing the view, but didn't help with Full Keyboard Access.
Making the UIPickerView a first responder when it appears.
Attempting to change the accessibilityElements order (but with so many views and views within views, this isn't really a viable option either).
Pressing Tab + the right arrow will quickly take the user to the end of the chain of accessibility elements, in other words, to the UIPickerView. But there has to be a cleaner way of automatically setting focus to the UIPickerView, or making it the next element reached with the arrow keys.
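One avenue that might be cleaner, assuming Full Keyboard Access drives the UIKit focus system (iOS 15+): steer focus at the picker when it appears, either by overriding preferredFocusEnvironments or by requesting a focus update directly. A sketch, not a confirmed fix:

import UIKit

final class StatePickerViewController: UIViewController {
    let stateField = UITextField()
    let statePicker = UIPickerView()

    // Prefer the picker whenever it is visible.
    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        statePicker.isHidden ? [] : [statePicker]
    }

    func pickerDidAppear() {
        // Ask the focus system to re-evaluate, or target the picker directly.
        setNeedsFocusUpdate()
        updateFocusIfNeeded()
        UIFocusSystem.focusSystem(for: self)?.requestFocusUpdate(to: statePicker)
    }
}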