I created a desktop app for Mac using Xojo. The app has a controller in the main window and displays advertisements and notices on a connected external display.
I'm currently connecting my iMac (24-inch) to a REGZA 55M550M via AirPlay and displaying video from the iMac on the REGZA, but the connection occasionally drops. Yesterday, the connection dropped about 3.5 hours after connecting. I do have other apps running on the iMac, but nothing that would strain the network or memory.
Does AirPlay connection to non-Apple products become unstable over long periods of time?
I am unable to log in to the App Store; it shows "Verification Failed" and a "Verification codes can't be sent at this time" message.
Please help me.
Logging in to my Apple account in a browser works fine, but if I try to log in to the App Store on my iPad, I get "Verification Failed".
I would like to ask what the new trackpad shortcuts are in the latest iPadOS when using a trackpad with an iPad. The three-finger swipes to the left and right seem to have been removed. Thank you.
I developed a watchOS app to capture gyro data, save it in real time as a CSV file, and send it to an iOS app. However, when I start recording with the watch on, the screen dims and the app stops working; it only resumes when I tap the screen again.
Is there a way to let it run in the background and transmit files in real time, even when the screen is off?
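A sketch of one commonly suggested approach, assuming a workout session is acceptable for this use case: starting an HKWorkoutSession keeps a watchOS app running when the wrist drops, so CoreMotion updates keep arriving with the screen off. This requires the HealthKit capability, the workout-processing background mode, prior HealthKit authorization, and an activated WCSession; the names below are illustrative.

```swift
import HealthKit
import CoreMotion
import WatchConnectivity

// Sketch only: assumes HealthKit authorization has been granted and
// WCSession is activated elsewhere. Names are illustrative.
final class GyroRecorder: NSObject {
    private let healthStore = HKHealthStore()
    private let motion = CMMotionManager()
    private var session: HKWorkoutSession?

    func start() throws {
        let config = HKWorkoutConfiguration()
        config.activityType = .other
        config.locationType = .indoor

        // The workout session keeps the app frontmost even when the
        // screen turns off, so motion updates continue.
        session = try HKWorkoutSession(healthStore: healthStore, configuration: config)
        session?.startActivity(with: Date())

        motion.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz sampling
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let d = data else { return }
            // Append rotation rates to a CSV buffer here; flush to disk periodically.
            _ = (d.rotationRate.x, d.rotationRate.y, d.rotationRate.z)
        }
    }

    func stop(sendingCSVAt fileURL: URL) {
        motion.stopDeviceMotionUpdates()
        session?.end()
        // transferFile queues the CSV for background delivery to the iPhone,
        // so the transfer completes even if the watch app is later suspended.
        WCSession.default.transferFile(fileURL, metadata: nil)
    }
}
```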
I use AttributedString to create a string containing a link, and I set the AttributedString on a UILabel. How should I set up accessibility so that:
I can move keyboard focus to the linked substring and open the link with keyboard operation
VoiceOver can read the whole string, and VoiceOver can activate the linked substring to open the link
Thanks a lot.
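A sketch of one common workaround, assuming the link URL is known up front: UILabel never opens links on its own, so the link is exposed to VoiceOver through the actions rotor and to Full Keyboard Access via a custom accessibility action. The URL and strings below are illustrative; using a non-editable UITextView instead would give native link handling.

```swift
import UIKit

// Sketch only: the URL, label text, and action name are illustrative.
final class LinkLabelViewController: UIViewController {
    private let label = UILabel()
    private let linkURL = URL(string: "https://developer.apple.com")!

    override func viewDidLoad() {
        super.viewDidLoad()

        var text = AttributedString("See the Apple Developer site for details")
        if let range = text.range(of: "Apple Developer site") {
            text[range].link = linkURL                       // renders as a link
            text[range].foregroundColor = UIColor.systemBlue // visual affordance
        }
        label.attributedText = NSAttributedString(text)

        // VoiceOver reads the whole string; the custom action shows up in the
        // actions rotor and is also reachable with Full Keyboard Access.
        label.isAccessibilityElement = true
        label.accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Open Apple Developer site") { [weak self] _ in
                guard let url = self?.linkURL else { return false }
                UIApplication.shared.open(url)
                return true
            }
        ]

        label.frame = view.bounds.insetBy(dx: 20, dy: 100)
        view.addSubview(label)
    }
}
```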
“Double-tap three fingers and drag to change zoom” should suppress “Three Finger to Drag”. Currently these gestures are triggered simultaneously, for no real reason. I saw different behaviors in different environments, but none is the desired one.
This seems like a bug, so I filed feedback.
I need to understand the different layers that are there in the iPhone X and later OLED screens as I am designing a hardware attachment. They seem to be projecting letters and images from a different layer than the subpixel layer. Is this proprietary information, or is there a resource that explores them?
We have an Electron app developed for Mac. We would like to restore user data previously saved in Downloads once the user installs the app from the store and launches it for the first time. But the Mac App Store has restrictions on "com.apple.security.files.downloads.read-write". We have enabled user access in the entitlements file and request the user's permission before access. What options can be used to automatically restore the data from Downloads?
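For comparison, the pattern the Mac App Store does allow is user-granted access persisted with a security-scoped bookmark, sketched below in Swift for clarity; an Electron app would drive the same flow through its own dialog and file-system APIs, and the key and messages here are illustrative.

```swift
import AppKit

// Sketch only: assumes the sandboxed app has the user-selected read-write
// entitlement and asks the user once, then keeps access via a bookmark.
func requestDownloadsAccessAndRestore() {
    let panel = NSOpenPanel()
    panel.canChooseDirectories = true
    panel.canChooseFiles = false
    panel.directoryURL = FileManager.default.urls(for: .downloadsDirectory,
                                                  in: .userDomainMask).first
    panel.message = "Select your Downloads folder to restore your previous data"

    guard panel.runModal() == .OK, let url = panel.url else { return }

    // Persist a security-scoped bookmark so access survives relaunches.
    if let bookmark = try? url.bookmarkData(options: .withSecurityScope,
                                            includingResourceValuesForKeys: nil,
                                            relativeTo: nil) {
        UserDefaults.standard.set(bookmark, forKey: "downloadsBookmark")
    }

    guard url.startAccessingSecurityScopedResource() else { return }
    defer { url.stopAccessingSecurityScopedResource() }
    // ...copy the saved user data out of `url` here...
}
```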
Since UITextView does not support zooming on its own, I add the UITextView as a subview of a UIScrollView and use the scroll view's zoom.
However, when the text contains a link, the text after the link goes missing.
Ex) https://appstoreconnect.apple.com/login\nApple Developer Login
-> The text “Apple Developer Login” does not appear.
If anyone has experienced the same problem or knows a solution, please leave a comment.
Note)
It works normally on iOS 16, but the text after the link disappears on iOS 18.
The text is not visible, but you can still select, copy, and paste it to retrieve the missing text.
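For anyone trying to reproduce this, here is a minimal sketch of the setup described above: a non-scrolling UITextView added as a subview of a UIScrollView, with zooming provided through viewForZooming. The link range matches the example string.

```swift
import UIKit

// Sketch only: reproduces the zoomable-text setup described in the post.
final class ZoomableTextViewController: UIViewController, UIScrollViewDelegate {
    private let scrollView = UIScrollView()
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()

        scrollView.frame = view.bounds
        scrollView.minimumZoomScale = 1.0
        scrollView.maximumZoomScale = 4.0
        scrollView.delegate = self
        view.addSubview(scrollView)

        textView.frame = scrollView.bounds
        textView.isScrollEnabled = false  // let the scroll view handle scrolling

        let text = NSMutableAttributedString(
            string: "https://appstoreconnect.apple.com/login\nApple Developer Login")
        // Mark only the URL (the first 39 characters) as the link.
        text.addAttribute(.link,
                          value: URL(string: "https://appstoreconnect.apple.com/login")!,
                          range: NSRange(location: 0, length: 39))
        textView.attributedText = text

        scrollView.addSubview(textView)
        scrollView.contentSize = textView.frame.size
    }

    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        textView  // zoom the embedded text view
    }
}
```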
Hi Apple Developer Community,
I'm developing an eye-tracking application using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection using Xcode. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.
Current Implementation
Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
Implementing custom sensitivity curves and smoothing algorithms
Applying baseline correction and coordinate mapping
Using quadratic regression for calibration point mapping
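For reference, a simplified sketch of this pipeline; the smoothing factor and screen mapping below are illustrative placeholders rather than my actual calibration values.

```swift
import ARKit
import UIKit

// Sketch only: attach as the ARSession delegate of a running
// ARFaceTrackingConfiguration session. Constants are illustrative.
final class GazeEstimator: NSObject, ARSessionDelegate {
    private var smoothed = CGPoint.zero
    private let alpha: CGFloat = 0.15  // low-pass smoothing factor (assumed)

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let shapes = face.blendShapes

        // Horizontal: out minus in; vertical: up minus down (left eye only here).
        let x = (shapes[.eyeLookOutLeft]?.doubleValue ?? 0)
              - (shapes[.eyeLookInLeft]?.doubleValue ?? 0)
        let y = (shapes[.eyeLookUpLeft]?.doubleValue ?? 0)
              - (shapes[.eyeLookDownLeft]?.doubleValue ?? 0)

        // Map the roughly [-1, 1] blend-shape range onto screen coordinates.
        let bounds = UIScreen.main.bounds
        let raw = CGPoint(x: bounds.midX + CGFloat(x) * bounds.midX,
                          y: bounds.midY - CGFloat(y) * bounds.midY)

        // Exponential smoothing to damp jitter from micro-movements.
        smoothed.x += alpha * (raw.x - smoothed.x)
        smoothed.y += alpha * (raw.y - smoothed.y)
        // `smoothed` drives the on-screen cursor (the red dot).
    }
}
```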
Issues I'm Facing
1. Calibration Mismatch
Red dot position doesn't align with where I'm actually looking
Significant offset between intended gaze point and actual cursor position
Calibration seems to drift or become inaccurate over time
2. Extreme Eye Movement Requirements
Need to make exaggerated eye movements to reach screen edges/corners
Natural eye movements don't translate to proportional cursor movement
Difficulty reaching certain screen regions even with calibration
3. Sensitivity and Stability Issues
Cursor jitters or jumps around when looking at center
Too much sensitivity to micro-movements
Inconsistent behavior between calibration and normal operation
4. I also noticed that tracking on both the calibration screen and the reading screen works noticeably better when head movement is involved, but I do not want to require much head movement. I want tracking to work with normal eye movement while reading an ebook.
Primary Question: Word-Level Eye Tracking Feasibility
Is word-level eye tracking (tracking gaze as users read through individual words in an ebook) technically feasible with current iPhone/iPad hardware?
I understand that Apple's built-in eye tracking is primarily an accessibility feature for UI navigation. However, I'm wondering if the TrueDepth camera and ARKit's eye tracking capabilities are sufficient for:
Tracking natural reading patterns (left-to-right, line-by-line progression)
Detecting which specific words a user is looking at
Maintaining accuracy for sustained reading sessions (15-30 minutes)
Working reliably across different users and lighting conditions
Questions for the Community
Hardware Limitations: Are iPhone/iPad TrueDepth cameras capable of the precision needed for word-level tracking, or is this beyond current hardware capabilities?
Calibration Best Practices: What calibration strategies have worked best for accurate gaze mapping? How many calibration points are typically needed?
Reading-Specific Challenges: Are there particular challenges when tracking reading behavior vs. general gaze tracking?
Alternative Approaches: Are there better approaches than ARKit blend shapes for this use case?
Current Setup
Devices: iPhone 14 Pro
iOS Version: iOS 18.3
ARKit Version: Latest available
Any insights, experiences, or technical guidance would be greatly appreciated. I'm particularly interested in hearing from developers who have worked on similar eye tracking applications or have experience with the limitations and capabilities of ARKit's eye tracking features.
Thank you for your time and expertise!
Hello,
I added a new in-app purchase to my app. It was approved on October 2nd, but now, on October 7th, I still cannot see it in the list of products coming from the store.
I already have 2 subscriptions and 1 in-app purchase in the app, but the new in-app purchase still does not appear among the available products returned by the store. What could cause this?
It's very annoying, but on my iPhone 12 Pro the Accessibility app keeps opening by itself with the microphone on, showing a blank screen, and every time I close it, it just reopens. I don't know why it keeps doing this, and it drives me crazy. Does anyone know what else to do? I'm on the iOS 26 beta, but it was doing this even with the previous update.
We have an iOS app built in .NET MAUI (Multi-platform App UI).
It is a web-view app.
We wish to integrate App Clips into this app, but we are unable to do so because there are few resources available online about such an implementation.
We do not wish to share code between the .NET MAUI app and the App Clip.
We understand it is not possible to add an App Clip without a parent Swift/Xcode app.
As an alternative solution, we were thinking of creating a new app in App Store Connect using Xcode/Swift and integrating the App Clip into it.
When downloaded by users, this parent app would only redirect them to our main .NET MAUI app on the App Store.
We need to know whether such an app would be approved during App Store review.
Please guide us on this.
Also, please let us know if you have any other solution for integrating App Clips into a .NET MAUI app.
I am encountering the following issue while working with app group preferences in my Safari web extension:
Couldn't read values in CFPrefsPlistSource<0x3034e7f80> (Domain: [MyAppGroup], User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd.
I am trying to read/write shared preferences using UserDefaults with an App Group but keep running into this error. Any guidance on how to resolve this would be greatly appreciated!
Has anyone encountered this before? How can I properly configure my app group preferences to avoid this issue?
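For reference, the straightforward per-user pattern looks like this; the error message mentions kCFPreferencesAnyUser, which is only allowed for system containers, whereas UserDefaults(suiteName:) reads and writes per-user values in the group container. The group identifier and key below are illustrative.

```swift
import Foundation

// Sketch only: "group.com.example.myapp" is an illustrative App Group ID;
// it must be enabled for both the app and the extension targets.
let appGroupID = "group.com.example.myapp"

if let defaults = UserDefaults(suiteName: appGroupID) {
    defaults.set(true, forKey: "sharedFlag")          // write from one target
    let value = defaults.bool(forKey: "sharedFlag")   // read from the other
    print("sharedFlag =", value)
}
```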
Japanese “Hattori” TTS voice missing from Settings > General > Read & Speak > Voices > Japanese on iOS 26
Steps: Open the path above → “Hattori” is not listed and cannot be downloaded
Expected: Hattori is available to download and select
Actual: Hattori is absent from the catalog
Regression: Was available on iOS 18.x on the same device
I am building a language-learning app for an unlisted primary language. Any suggestions or heads-ups? My plan is to select English and go with it.
It's unfortunate that I have to list a language-learning app incorrectly, and a tag for that language probably does not exist anywhere in the Apple system.
As part of our Apple Pay implementation, we are trying to create a merchant session by connecting to the Apple endpoint https://apple-pay-gateway-cert.apple.com/paymentservices/startSession.
While doing so, we get the error "An error occurred while sending the request. The request was aborted: Could not create SSL/TLS secure channel."
I call the validation URL from a C# .NET Framework 4.8 Web API. The API sets up an HttpClient with the Merchant Identity Validation Certificate found in my Apple account and calls the validation URL, passing in the required JSON validation object. When I call PostAsync(), I get an exception with the above error message.
The code works successfully on my local machine, but we face this issue when it is deployed to the dev/model environment for testing.
We use Azure App Service for deployment, and TLS 1.2 is already enabled there.
We have used the Merchant Identity certificate that was issued and have also checked with the networking and infrastructure teams to make sure it is not an issue on our side.
Does anyone have any idea what could be causing this error?
Thank you,
Supriya
I’m developing an iOS app, and I’ve noticed that when the user enables Accessibility → Display & Text Size → Color Filters → Grayscale, my app icon loses a lot of visual contrast. The original colored version looks fine, but in grayscale it appears “flat” and harder to distinguish, unlike a pure black-and-white design.
What I want to achieve:
Ensure the app icon remains visually clear and high-contrast even when iOS renders it in grayscale.
Ideally, provide an alternate “high-contrast” app icon version when grayscale mode is enabled.
What I’ve tried:
Increased color contrast in the original icon design.
Added outlines and stronger shapes.
Tested with grayscale filters in design tools.
Researched Asset Catalog and alternate icons, but found no documented API to detect or respond to grayscale mode.
Questions:
Is there any API in iOS that allows detecting when the system is in grayscale mode so that I can programmatically switch to an alternate app icon?
If not, are there Apple-recommended best practices for designing app icons that still look clear in grayscale?
Are there any accessibility guidelines specifically addressing icon design for grayscale or color-blind modes?
Additional info:
iOS version tested: iOS 17.5
Development in Swift + SwiftUI, using Asset Catalog for icons.
I am aware that iOS supports alternate icons via setAlternateIconName, but I haven’t found a trigger for grayscale mode.
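One possibly relevant API: UIKit exposes UIAccessibility.isGrayscaleEnabled and a matching change notification, although I am not certain it tracks the Color Filters grayscale setting on every iOS version, so it needs on-device verification. A sketch combining it with setAlternateIconName, where "HighContrastIcon" is an assumed alternate icon registered in the Info.plist:

```swift
import UIKit

// Sketch only: "HighContrastIcon" is an assumed alternate icon name.
// Note that setAlternateIconName shows a system alert to the user.
final class GrayscaleIconSwitcher {
    private var token: NSObjectProtocol?

    init() {
        // Re-check whenever the grayscale status changes.
        token = NotificationCenter.default.addObserver(
            forName: UIAccessibility.grayscaleStatusDidChangeNotification,
            object: nil, queue: .main) { _ in
                GrayscaleIconSwitcher.updateIcon()
            }
        GrayscaleIconSwitcher.updateIcon()
    }

    static func updateIcon() {
        let target: String? = UIAccessibility.isGrayscaleEnabled ? "HighContrastIcon" : nil
        guard UIApplication.shared.alternateIconName != target else { return }
        UIApplication.shared.setAlternateIconName(target) { error in
            if let error { print("Icon switch failed:", error) }
        }
    }
}
```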
How can I get my NFC-based app approved quickly on the iOS App Store?
I have subscribed to the developer program, but it’s already been a day and it still shows “is not enrolled in the Apple Developer Program.”