It's very annoying: on my iPhone 12 Pro, the accessibility app keeps opening by itself with the microphone on. It shows only a blank screen, and every time I close it, it just reopens. I don't know why it keeps doing this, but it drives me crazy. Does anyone know what else to try? I'm on the iOS 26 beta, but it has been doing this since the previous update as well.
When I try to get the frame of an AXUIElementRef using AXUIElementCopyAttributeValue(element, (CFStringRef)attribute, &result), the frames come back shifted and rotated for the iOS simulator.
I get the same (incorrect) frames in the Accessibility Inspector when the Mac is selected as the host.
When I switch the host to the iOS simulator, the frames are correct.
How is the Accessibility Inspector getting the correct frames, and how can I do the same in my app?
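For reference, a minimal sketch (assumed, not the poster's code) of reading an element's frame on the Mac side by combining the AXPosition and AXSize attributes into a CGRect:

import ApplicationServices

// Hedged sketch: copy an accessibility element's position and size and
// unpack the AXValue wrappers into a CGRect. The function name is illustrative.
func copyFrame(of element: AXUIElement) -> CGRect? {
    var posRef: CFTypeRef?
    var sizeRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &posRef) == .success,
          AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeRef) == .success,
          let posValue = posRef, CFGetTypeID(posValue) == AXValueGetTypeID(),
          let sizeValue = sizeRef, CFGetTypeID(sizeValue) == AXValueGetTypeID()
    else { return nil }
    var origin = CGPoint.zero
    var size = CGSize.zero
    AXValueGetValue(posValue as! AXValue, .cgPoint, &origin)
    AXValueGetValue(sizeValue as! AXValue, .cgSize, &size)
    return CGRect(origin: origin, size: size)
}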
Topic: Accessibility & Inclusion, SubTopic: General
Settings
Accessibility > VoiceOver > Typing > Typing Feedback
Set Characters and Words
General > Keyboard > Keyboards
Add Japanese - Kana
Using iOS 18.0 – iOS 18.2
Built with Xcode 16.1 – 16.2
What happens?
When we input "あ" in a UITextField with the Japanese - Kana keyboard, it used to be read back as "a". However, there is no typing feedback now.
On UITextView, Typing Feedback still works.
On iOS 17, it also works.
Topic: Accessibility & Inclusion, SubTopic: General
A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.
Accessibility Nutrition Labels are a really big step forward for the experience people have on the App Store to find apps that will work for them. How should developers get started with Accessibility Nutrition Labels?
A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.
Which accessibility features can I indicate support for in Accessibility Nutrition Labels?
The accessibility features covered include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. These display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.
With the new Accessibility Nutrition Labels, will app store reviewers validate what we select?
The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.
Are there any updates to tools for analyzing the accessibility of our apps?
Although there aren't new updates this year, continued support for Accessibility Audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build. These audits analyze aspects like contrast, dynamic type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.
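For example, a minimal sketch of an XCTest accessibility audit (available since Xcode 15; the test name is illustrative):

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Audits the on-screen view for issues such as contrast, Dynamic Type,
        // text clipping, and missing element labels.
        try app.performAccessibilityAudit()
    }
}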
What are accessibility features you wish more people integrated?
Features such as user input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support are still underused and could benefit many more users if adopted more widely.
What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?
Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.
How does Liquid Glass respond to contrast, especially for text and in low-contrast environments?
Content legibility is a crucial aspect of the Liquid Glass design. It inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.
What are some Apple apps that stand out for their accessibility?
Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
I was trying to achieve accurate positioning with UWB on an iPhone 16 in India but couldn't find any option to enable it in Settings. I am using the Qorvo Nearby Interaction app to communicate with my custom UWB tag (DWM3001 by Qorvo).
Topic: Accessibility & Inclusion, SubTopic: General
Hello Albert!
I am experiencing some strange bugs around DeviceActivityEvents (part of the DeviceActivity framework) on iOS 26 / iOS 26.1 / iOS 26.2 beta:
When creating a DeviceActivityEvent we can assign a threshold and applicationTokens.
The idea is that after the user has spent the threshold amount of time in those apps, eventDidReachThreshold() is called.
The property includesPastActivity is set to false.
On iOS 26, however, it quite often happens (reliably after updating to a new beta seed) that eventDidReachThreshold() is called almost immediately (after a couple of seconds) instead of waiting for the threshold to be met.
Is anyone else seeing similar issues on iOS 26 / iOS 26.1 / iOS 26.2 beta?
The only workaround I have found is to ask users to revoke and re-grant Screen Time permission. This only holds for about two weeks, or at most until the next iOS 26 beta update is installed, so unfortunately it is not a permanent solution.
Feedback (incl. sysdiagnoses and sample project) is filed under:
FB18061981
FB18927456
One of our users has filed their own feedback request as well:
FB20817853
Thanks a lot for any help on this!
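For context, a minimal sketch of the kind of setup described above (monitoring name, event name, and threshold values are illustrative, not the poster's code):

import DeviceActivity
import ManagedSettings

func startMonitoring(apps: Set<ApplicationToken>) throws {
    // Event that should fire only after 30 minutes have been spent in the apps.
    let event = DeviceActivityEvent(
        applications: apps,
        threshold: DateComponents(minute: 30),
        includesPastActivity: false   // parameter available on iOS 17.4 and later
    )
    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 0, minute: 0),
        intervalEnd: DateComponents(hour: 23, minute: 59),
        repeats: true
    )
    try DeviceActivityCenter().startMonitoring(
        DeviceActivityName("daily"),
        during: schedule,
        events: [DeviceActivityEvent.Name("limit"): event]
    )
    // eventDidReachThreshold(_:activity:) in the DeviceActivityMonitor
    // extension should then be called once the threshold is actually reached.
}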
Hello,
I’m reaching out regarding an issue with our organization’s Apple Developer Program enrollment. We’ve successfully created a developer account and our organization is verified through the D-U-N-S system. The D-U-N-S ID is correctly displayed in our Apple Developer account.
However, the enrollment status still shows: “Your enrollment is being processed.” It’s been 3 months, and we haven’t received any further communication or updates.
Has anyone experienced a similar delay? Is there anything else we should do to expedite the process?
Any guidance or insight would be greatly appreciated.
Thanks,
Topic: Accessibility & Inclusion, SubTopic: General
Hi Apple Developer Community,
I'm developing an eye-tracking application using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection using Xcode. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.
Current Implementation
Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.), as sketched after this list
Implementing custom sensitivity curves and smoothing algorithms
Applying baseline correction and coordinate mapping
Using quadratic regression for calibration point mapping
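A minimal sketch of reading these blend-shape values from an ARFaceAnchor (assumptions: an ARSessionDelegate callback, illustrative names; not the actual implementation):

import ARKit

final class GazeReader: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let shapes = face.blendShapes
        // Each coefficient is 0.0...1.0; "left"/"right" are from the face's perspective.
        let lookUp   = shapes[.eyeLookUpLeft]?.floatValue ?? 0
        let lookDown = shapes[.eyeLookDownLeft]?.floatValue ?? 0
        let lookIn   = shapes[.eyeLookInLeft]?.floatValue ?? 0
        let lookOut  = shapes[.eyeLookOutLeft]?.floatValue ?? 0
        // Crude pre-calibration gaze estimate for the left eye only.
        let horizontal = lookOut - lookIn
        let vertical   = lookUp - lookDown
        print("gaze estimate:", horizontal, vertical)
    }
}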
Issues I'm Facing
1. Calibration Mismatch
Red dot position doesn't align with where I'm actually looking
Significant offset between intended gaze point and actual cursor position
Calibration seems to drift or become inaccurate over time
2. Extreme Eye Movement Requirements
Need to make exaggerated eye movements to reach screen edges/corners
Natural eye movements don't translate to proportional cursor movement
Difficulty reaching certain screen regions even with calibration
3. Sensitivity and Stability Issues
Cursor jitters or jumps around when looking at center
Too much sensitivity to micro-movements
Inconsistent behavior between calibration and normal operation
4. Head Movement Dependence
I also noticed that tracking on both the calibration screen and the reading screen works noticeably better when head movement is involved, but I do not want to require much head movement. I want tracking to work with normal eye movement while reading an ebook.
Primary Question: Word-Level Eye Tracking Feasibility
Is word-level eye tracking (tracking gaze as users read through individual words in an ebook) technically feasible with current iPhone/iPad hardware?
I understand that Apple's built-in eye tracking is primarily an accessibility feature for UI navigation. However, I'm wondering if the TrueDepth camera and ARKit's eye tracking capabilities are sufficient for:
Tracking natural reading patterns (left-to-right, line-by-line progression)
Detecting which specific words a user is looking at
Maintaining accuracy for sustained reading sessions (15-30 minutes)
Working reliably across different users and lighting conditions
Questions for the Community
Hardware Limitations: Are iPhone/iPad TrueDepth cameras capable of the precision needed for word-level tracking, or is this beyond current hardware capabilities?
Calibration Best Practices: What calibration strategies have worked best for accurate gaze mapping? How many calibration points are typically needed?
Reading-Specific Challenges: Are there particular challenges when tracking reading behavior vs. general gaze tracking?
Alternative Approaches: Are there better approaches than ARKit blend shapes for this use case?
Current Setup
Devices: iPhone 14 Pro
iOS Version: iOS 18.3
ARKit Version: Latest available
Any insights, experiences, or technical guidance would be greatly appreciated. I'm particularly interested in hearing from developers who have worked on similar eye tracking applications or have experience with the limitations and capabilities of ARKit's eye tracking features.
Thank you for your time and expertise!
After updating macOS Big Sur from 11.0 to the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window
AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid of the front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
if (err != kAXErrorSuccess) {
    NSLog(@"failed with error: %i", err);
}
NSLog(@"app: %@ frontWindow: %@", app, frontWindow);
The 'frontWindow' reference is never created, and I get error number -25204. It seems like the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
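One thing worth ruling out (not confirmed to be the cause of -25204 here) is the Accessibility permission itself; a standard trust check with a prompt looks roughly like this:

import ApplicationServices

// Hedged sketch: ask whether this process is trusted for Accessibility and
// prompt the user to grant access in System Preferences if it is not.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("Accessibility access granted:", trusted)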
I'm using a 9th-generation iPad and updated to iPadOS 26 a few weeks ago, but I can't use the 3D background and can't find any way to enable it. Is this a system issue or an issue with my iPad?
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need?
For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient.
Is there a better way to handle this using VoiceOver?
My app uses Core Data (targeting iOS 13.0) together with iCloud to store data; this automatically manages the synchronization between Core Data and iCloud. Some users have reported that after going abroad their original data disappeared, and when they returned to China the data displayed normally again. I'm located in Mainland China, and I understand that iCloud data in China is all stored in Guizhou-Cloud Big Data (the data center in Guizhou). Could this problem be caused by display abnormalities resulting from switching between the iCloud data centers accessed in different regions?
Topic: Accessibility & Inclusion, SubTopic: General
Hello,
When I listen to a title in my app with VoiceOver, it makes a strange sound.
The title is made up of Korean characters, numbers, and alphabet characters.
Does this combination make VoiceOver produce a strange sound?
I would like to ask if Apple can fix this issue.
Thank you.
Topic: Accessibility & Inclusion, SubTopic: General
I have just purchased an Apple Developer account from Bangladesh. It sent verification codes perfectly the first 2-3 times I logged in, but after that, no matter what I do, it doesn't send the verification code and I am stuck. I now cannot log in, and this is extremely frustrating.
I use AttributedString to create a string containing a link and set it on a UILabel (a minimal sketch of this setup is shown below). How should I set up accessibility to make sure that
I can move keyboard focus to the link substring and open the link with the keyboard
VoiceOver can read the whole string, and can also focus the link substring to open the link
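Minimal sketch of the setup above (string and URL are illustrative):

import UIKit

var text = AttributedString("Read the full guide here.")
if let range = text.range(of: "here") {
    text[range].link = URL(string: "https://www.example.com")
}
let label = UILabel()
label.attributedText = NSAttributedString(text)
label.isAccessibilityElement = true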
Thanks a lot.
Topic: Accessibility & Inclusion, SubTopic: General
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac
However, it's not immediately clear (a) how to get one's app into this list, and (b) whether the usual methods from iOS for reacting to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day?
For context, my app is a macOS-native SwiftUI app.
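For concreteness, the kind of iOS-style code I mean is something like this (a hedged sketch only; whether macOS honors the per-app text size setting through it is exactly my open question):

import SwiftUI

struct SettingsAwareText: View {
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize

    var body: some View {
        // Text styles scale with the user's text size; fixed point sizes do not.
        Text("Current dynamic type size: \(String(describing: dynamicTypeSize))")
            .font(.body)
    }
}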
When the iOS screen reader reads the month "July" in its abbreviated form "Jul", it does not read it correctly as a month, whereas all other month names are read correctly in their abbreviated forms. As a result, when we display dates in that month in the front end and a screen reader reads them, it reads them out as a number, not as a date.
I have tried the longer version with the screen reader, and it reads "July" correctly.
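For reference, a hedged workaround sketch (assuming UIKit; not from the original post): show the abbreviated month visually but give VoiceOver the full month name via accessibilityLabel:

import UIKit

let date = Date()
let shortFormatter = DateFormatter()
shortFormatter.dateFormat = "MMM d, yyyy"   // e.g. "Jul 4, 2025"
let longFormatter = DateFormatter()
longFormatter.dateFormat = "MMMM d, yyyy"   // e.g. "July 4, 2025"

let dateLabel = UILabel()
dateLabel.text = shortFormatter.string(from: date)
dateLabel.accessibilityLabel = longFormatter.string(from: date)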
Topic: Accessibility & Inclusion, SubTopic: General
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
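For reference, a minimal sketch of the iOS approach described above (assumed, not the poster's actual code): keep the element's accessibilityFrame in sync as the view moves, then post the layout-changed notification:

import UIKit

func viewDidMove(_ movingView: UIView) {
    // accessibilityFrame is in screen coordinates, so convert from view bounds.
    movingView.accessibilityFrame = UIAccessibility.convertToScreenCoordinates(
        movingView.bounds, in: movingView)
    UIAccessibility.post(notification: .layoutChanged, argument: movingView)
}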
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar.
I’ve noticed that this indicator is not consistent:
For some windows, the badge reliably appears when they are being shared.
For other windows, the badge never appears, even though the window is actively shared.
In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not.
My questions are:
What are the exact conditions under which macOS decides to draw the “shared window” indicator in a window’s title bar?
Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs borderless)?
Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
Topic: Accessibility & Inclusion, SubTopic: General
Environment: Xcode 16.2
WidgetKit:
Image(uiImage: UIImage(named: "jp_jump")!)
    .resizable()
    .scaledToFit()
    .frame(width: 58, height: 16)
    .padding(EdgeInsets(top: 0, leading: 16, bottom: 0, trailing: 0))
"jp_jump" is a local color image; loading it causes the widget to crash.
Crash info:
Thread 4: EXC_RESOURCE (RESOURCE_TYPE_MEMORY: high watermark memory limit exceeded) (limit=30 MB)
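For reference, a common mitigation sketch (assumed, not from the original post): downsample large assets with ImageIO before handing them to Image(uiImage:), since widget extensions run under a tight memory limit (the crash above reports about 30 MB):

import ImageIO
import UIKit

func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    // Avoid caching the full-size decode; only the thumbnail is materialized.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}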