I'm developing a macOS app using NSView and trying to make my content navigable via VoiceOver. I'm expecting the built-in rotor category "Content Chooser" (accessed via VO + U) to list my accessible elements — just like how it shows message items in the Mail app. However, in my app, this rotor appears empty, even though:
My views return proper accessibilityChildren() or accessibilityContents() with valid NSAccessibilityElements
Each child has correct AXRole, AXLabel, etc.
The window is key and visible
VoiceOver navigation works for the elements
I've also tried:
Using both accessibilityChildren() and accessibilityContents() in container views
Setting roles like .group, .staticText, .button, etc.
Avoiding hidden elements
Ensuring all elements are visible and labeled
Still, the "Content Chooser" rotor is empty.
What exact conditions must be met for an element to appear in the "Content Chooser" rotor in a macOS app?
Any Apple-specific guidance, hidden requirements, or sample code would be appreciated.
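For reference, a trimmed sketch of the kind of container setup I'm describing (a hedged example with placeholder names, not a known-working recipe for the rotor):

import Cocoa

// Hypothetical container view exposing one NSAccessibilityElement child.
final class RotorContentView: NSView {
    private let item = NSAccessibilityElement()

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        item.setAccessibilityRole(.staticText)
        item.setAccessibilityLabel("First message")
        item.setAccessibilityParent(self)
        // Non-empty frame, expressed in the parent's coordinate space.
        item.setAccessibilityFrameInParentSpace(NSRect(x: 0, y: 0, width: 200, height: 20))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func isAccessibilityElement() -> Bool { false } // container, not a leaf
    override func accessibilityRole() -> NSAccessibility.Role? { .group }
    override func accessibilityChildren() -> [Any]? { [item] }
}

VoiceOver navigates this fine; it is only the Content Chooser listing that stays empty.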
iOS 18.3.1, iPhone 16 Pro.
I pick photos from the user's photo library, navigating with a connected physical keyboard, using:
.photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images)
When the picker appears, accessibility focus moves to the Dynamic Island instead of the Cancel button. There is no way to navigate the photos picker by keyboard without tapping the view to move focus into it manually. I noticed the same behavior in the Notes app.
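For context, a minimal repro of my setup, reduced from the real view model (the names here are placeholders):

import SwiftUI
import PhotosUI

struct PickerRepro: View {
    @State private var isImagePickerPresented = false
    @State private var selectedImageItem: PhotosPickerItem?

    var body: some View {
        Button("Choose photo") { isImagePickerPresented = true }
            .photosPicker(isPresented: $isImagePickerPresented,
                          selection: $selectedImageItem,
                          matching: .images)
        // With a hardware keyboard, focus lands on the Dynamic Island
        // instead of the picker's Cancel button when the sheet appears.
    }
}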
Hello all.
Currently I am trying to get WKWebView to scroll with a physical keyboard, and it just will not work. I tried allowsKeyboardScrolling() and it did not help. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access to work to make our app more accessible, but WKWebView does not want to play nice.
Has anyone else had issues using WKWebView with an external keyboard, and if so, did you find any solutions? Greatly appreciated!
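For what it's worth, here is roughly what I tried. allowsKeyboardScrolling is the UIScrollView property (iOS 17+), so I set it on the web view's scroll view; treat this as a sketch of my failed attempt, not a fix:

import UIKit
import WebKit

final class WebViewController: UIViewController {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(webView)
        if #available(iOS 17.0, *) {
            // Arrow/space/page keys should scroll once the scroll view has focus.
            webView.scrollView.allowsKeyboardScrolling = true
        }
        webView.load(URLRequest(url: URL(string: "https://example.com")!))
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        webView.becomeFirstResponder() // give the web view keyboard focus
    }
}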
In our application we use OTP login. With Full Keyboard Access enabled, when we enter a digit into an OTP field, focus moves to the next text field as expected on iOS 17, but on iOS 18 focus stays on the first OTP field and never advances.
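As a stopgap we are experimenting with advancing focus manually from the text field delegate; a sketch under the assumption that each OTP digit has its own UITextField (the field wiring is ours, and Full Keyboard Access may still override it):

import UIKit

// Hypothetical workaround: move first responder forward after each digit.
final class OTPController: NSObject, UITextFieldDelegate {
    var digitFields: [UITextField] = [] // ordered OTP boxes

    func textField(_ textField: UITextField,
                   shouldChangeCharactersIn range: NSRange,
                   replacementString string: String) -> Bool {
        textField.text = string
        if let index = digitFields.firstIndex(of: textField),
           !string.isEmpty, index + 1 < digitFields.count {
            digitFields[index + 1].becomeFirstResponder()
        }
        return false // text was set manually above
    }
}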
This has been an ongoing issue and continues in Tahoe. When dictating into Gmail in Safari, whole portions of sentences are copied and pasted again, making the text a mess. I have reported this via Feedback for a couple of years, and it has never been resolved.
When my app is in the background, I create a Live Activity through a push notification, using a token obtained from pushToStartTokenUpdates, and this works fine. However, without opening the app, how can I retrieve the push token for this new Live Activity and use it for subsequent updates to its content?
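For reference, the token observation I have looks roughly like this (MyAttributes is a placeholder; my open question is whether these sequences are serviced while the app stays backgrounded):

import ActivityKit

struct MyAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable { var progress: Double }
}

func observeLiveActivityTokens() {
    // Push-to-start tokens: this part already works for me.
    Task {
        for await token in Activity<MyAttributes>.pushToStartTokenUpdates {
            upload(token, kind: "start")
        }
    }
    // Per-activity update tokens for activities started via push.
    Task {
        for await activity in Activity<MyAttributes>.activityUpdates {
            Task {
                for await token in activity.pushTokenUpdates {
                    upload(token, kind: "update")
                }
            }
        }
    }
}

func upload(_ token: Data, kind: String) {
    // Send the hex token to the server (details omitted).
    print(kind, token.map { String(format: "%02x", $0) }.joined())
}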
Hi, I'm planning to make a macOS app and distribute it through the Mac App Store.
The question is: should I force an update when an update is needed?
The reason I want this feature is that I don't want users to keep using a previous version of the app.
My plan is this: when the app needs an update, send the user to a special page that describes why the update is needed, with a button to download the new version. The download would happen automatically in the background, with no need to visit the App Store.
I've searched several forums and asked GPT, but there was no positive reply on this, so I'm finally posting to ask whether there is really no way to do it. A sketch of the version check I have in mind follows below.
Thank you!
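As a sketch, the check would query the iTunes Search API for the live version and gate the UI on a mismatch (the automatic background download is the part I cannot find any support for):

import Foundation

// Sketch: compare the running version with the Mac App Store version.
func updateIsRequired() async throws -> Bool {
    let bundleID = Bundle.main.bundleIdentifier ?? ""
    let url = URL(string: "https://itunes.apple.com/lookup?bundleId=\(bundleID)")!
    let (data, _) = try await URLSession.shared.data(from: url)

    struct Lookup: Decodable {
        struct Result: Decodable { let version: String }
        let results: [Result]
    }
    let storeVersion = try JSONDecoder().decode(Lookup.self, from: data)
        .results.first?.version
    let localVersion = Bundle.main
        .object(forInfoDictionaryKey: "CFBundleShortVersionString") as? String

    // Naive string comparison; a real check would compare version components.
    return storeVersion != nil && storeVersion != localVersion
}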
Hello,
I am a student studying accessibility.
I aim to analyze the smartphone usage patterns of visually impaired individuals.
Therefore, I would like to log the VoiceOver usage records of visually impaired iPhone users.
Is there a way to output VoiceOver logs, similar to the AccessibilityService API on Android?
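So far the closest I have found is logging VoiceOver state changes inside my own app, which is far narrower than Android's AccessibilityService; a minimal sketch of that partial approach:

import UIKit

// Logs whether VoiceOver is running; visible only to this app, not system-wide.
final class VoiceOverLogger {
    private var observer: NSObjectProtocol?

    func start() {
        log(running: UIAccessibility.isVoiceOverRunning)
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.voiceOverStatusDidChangeNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            self?.log(running: UIAccessibility.isVoiceOverRunning)
        }
    }

    private func log(running: Bool) {
        print("\(Date()): VoiceOver \(running ? "on" : "off")")
    }
}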
Thank you in advance for your responses.
Hello everyone,
I'd like to report an issue I've encountered when using a Bluetooth mouse together with AssistiveTouch on an iPhone running iOS 16.5.
This has also been reported via Feedback Assistant with
Feedback ID: FB17806167
Description:
When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation.
Specifically:
The pointer cannot move past the center of the screen
Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape.
This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.
Steps to Reproduce:
Enable AssistiveTouch
Connect a Bluetooth mouse to the iPhone
Rotate the device to landscape orientation
Try moving the mouse pointer across the screen
→ Notice that:
Pointer cannot move past the center
Horizontal/vertical input is interpreted incorrectly (as if still in portrait)
Expected Behavior:
The mouse pointer should move across the entire screen correctly, regardless of device orientation.
Actual Behavior:
In landscape orientation, the pointer is either restricted to part of the screen or misaligned.
It behaves as if the device is still in portrait.
Horizontal mouse movement causes vertical pointer movement, and vice versa
User experience feels broken and unintuitive
Feature Suggestion:
Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS.
I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.
Thanks in advance for any insights or suggestions.
Best regards,
Jannis
I'm working on a BLE-connected device that uses ANCS and the system Clock app to deliver alarm notification events to hearing-impaired people. It used to work up to the iPhone 13 with the latest iOS 18.x. From the iPhone 14 onward (iOS 18.x), the system Clock alarm notification is no longer sent.
Is there any reason for this to happen?
Is anyone aware of this behaviour?
Any suggestion would be really appreciated.
Cheers
I wrote an app for my own use and have been testing it on my own phone; after about a week it can no longer be opened and is shown as no longer available.
P.S. I haven't paid for a developer account, and I don't plan to publish the app.
We have an iOS app built in .NET MAUI (Multi-platform App UI).
It is a web-view app.
We wish to integrate App Clips into this app, but we have been unable to, given how few resources on such an implementation are available online.
We do not wish to share code between the .NET MAUI app and the App Clip.
We understand it is not possible to add an App Clip without a parent Swift/Xcode app.
As an alternative solution, we were thinking of creating a new app in App Store Connect using Xcode/Swift and integrating the App Clip with it.
This parent app, when downloaded by users, would only redirect them to our main .NET MAUI app on the App Store.
We need to know whether such an app would be approved for the App Store.
Please guide us on this.
Also, please let us know if you have any other solution for integrating App Clips with a .NET MAUI app.
I have updated to iPadOS 26 and my pointer is still the circle one, not the arrow cursor.
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked... I tried going through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message.
Does anyone have any ideas what the problem could be?
My game app is text-based interactive fiction, containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users.
Despite this, in the Accessibility Nutrition Label, I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would leave deaf players with the wrong impression that they can't enjoy our game. Leaving it checked would imply that we do have A/V content with captions included.
In the WWDC video on this (https://developer.apple.com/videos/play/wwdc2025/224/), Apple says:
After we completed common tasks, we realized our app doesn’t have any video or audio only content. In this case, we aren’t going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app.
Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
AVSpeechSynthesizer on some iOS 18 devices has a bug: it will always read Chinese content such as:
AVSpeechUtterance(string: "中文") // Any Chinese Content
in the dialect specified by:
Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language
instead of the dialect that I specified in AVSpeechUtterance.voice:
AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin
However, setting the Chinese dialect of AVSpeechSynthesisVoice via "zh-HK" or "zh-TW" had been working on iOS 17 and below.
My app has a feature that reads sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect under Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.
Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, and with no backup restored after the fresh installation), the bug does not occur.
However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug occurs.
This bug puzzled me because I need both dialects of Chinese to be read aloud one after the other, but as many users report, on most iOS 18 devices (a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app reads Cantonese twice or Mandarin twice (depending on Spoken Language in Settings). This iOS 18 bug prevents my app from performing its expected behavior.
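A trimmed example of the feature in question: it queues Mandarin and then Cantonese for the same text, and on affected iOS 18 devices both utterances come out in the dialect chosen in Settings:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Speak the same Chinese text in Mandarin, then in Cantonese.
func speakBothDialects(_ text: String) {
    let mandarin = AVSpeechUtterance(string: text)
    mandarin.voice = AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

    let cantonese = AVSpeechUtterance(string: text)
    cantonese.voice = AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese

    synthesizer.speak(mandarin)  // utterances are queued in order
    synthesizer.speak(cantonese) // on affected devices, the same dialect repeats
}

speakBothDialects("中文")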
Would Apple developers look into this and advise whether there is any possible workaround in app code to overcome this bug, or please fix it in an iOS 18 update. Thank you.
I've noticed that VoiceOver reads currency amounts correctly when they are below a thousand.
For higher amounts, for example 12.225,34 €, VoiceOver reads "twelve point two two five thirty four euros".
If the amount is formatted without the thousands separator (12225,34 €), the problem doesn't exist: VoiceOver reads "twelve thousand two hundred and twenty-five euros and thirty-four cents".
Why is the thousands separator a problem for VoiceOver when this formatting comes from the currency and locale?
This issue exists in English. I changed my device language to Italian and German, and in both cases the number was read correctly even with the separator.
Is there a way to make it work in English?
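One workaround I'm considering is giving the view an explicit accessibility label formatted without the grouping separator; a SwiftUI sketch (the formatter settings are my assumption, not a confirmed fix):

import SwiftUI

struct PriceView: View {
    let amount: Decimal = 12225.34

    var body: some View {
        // Visually keep the locale's grouping separator.
        Text(amount, format: .currency(code: "EUR"))
            // But let VoiceOver read a version without it.
            .accessibilityLabel(
                amount.formatted(.currency(code: "EUR").grouping(.never))
            )
    }
}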
Hey everyone,
I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?
The Core Idea
Imagine this:
Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements – walking, picking up the phone, putting it in your pocket.
Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
On-Demand, Ultra-Low Power Clock: Only when the accelerometer detects one of these specific gestures would the stored kinetic energy be used to illuminate just the necessary pixels on the main OLED/AMOLED screen to display the time. The rest of the screen stays completely black (consuming no power on OLED).
Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.
Why This Matters
This isn't just a cool gimmick; it offers significant benefits:
True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
Ultimate Convenience: A "magical" interaction – just pick up your phone, and the time instantly appears. No taps, no button presses.
Sustainable & Innovative: Showcases practical "energy harvesting" in a consumer device, pushing boundaries for self-sufficient tech.
Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.
This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, there are definitely accessibility defects that get caught and addressed in order of impact and business priority like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.
I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the 10 most central tasks are solid, but some supporting (yet also common) tasks have a contrast failure or a missing accessibility label, does that make the whole app not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations of that.
In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels accessibility features across tasks and devices, but realistically never be 100% free of defects for a given Apple Accessibility feature, even among core tasks.
As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a sort of baseline or measurement for inclusion. We plan to test all steps to complete a task, and log defects accordingly with an assigned timeline for fixing them (as would be true for functional defects).
After the iPadOS 26 update, the colors on my new iPad Pro M4 have become extremely dull, almost like those on a very old device. The screen brightness is significantly reduced, and it's now difficult to see UI elements clearly. This is very disappointing considering the device's high display quality before the update. Please advise if this is a known issue or if there is a fix.