
How accessible is enough for Accessibility Nutrition Labels?
My team has a robust digital accessibility program and established processes for WCAG conformance in our apps. As part of that, accessibility defects do get caught and are addressed in order of impact and business priority, like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.

I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the most central 10 tasks are solid, but some supporting (yet still common) tasks have a contrast failure or a missing accessibilityLabel, does that mean the whole app does not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations of what that means. In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels features across tasks and devices, but will realistically never be 100% free of defects for a given Apple accessibility feature, even among core tasks.

As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a baseline or measurement for inclusion. We plan to test all steps needed to complete a task and log defects accordingly, with an assigned timeline for fixing them (as would be true for functional defects).
Replies: 0 · Boosts: 0 · Views: 30 · Activity: 5d
Defining boundaries of inline dialogs for VO users
Hello, I previously submitted a question asking which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182. It stems from perhaps a more direct question about specific components: are tab lists and disclosures natively intended to include haptics, a screen reader hint, or other state or properties that indicate to screen reader users where the component begins or ends? In some web experiences there is screen reader hint text such as "end of..." or "entering..." as a way to define the boundaries of these inline dialogs. I asked about haptics in the prior thread because I do not recall a natively implemented version of this, apart from some haptic cues that I have not experienced consistently, so I am not sure whether that is an intended native Swift implementation or something custom.
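Lacking a native cue, the workaround I have been weighing is marking the boundaries myself. A minimal SwiftUI sketch of that idea, assuming a DisclosureGroup stands in for the disclosure component; the "End of Filters" marker and all names are my own illustration, not an Apple-documented pattern:

```swift
import SwiftUI

struct FilterSection: View {
    @State private var isExpanded = false

    var body: some View {
        DisclosureGroup("Filters", isExpanded: $isExpanded) {
            Toggle("In stock only", isOn: .constant(true))
            Toggle("On sale", isOn: .constant(false))

            // Hypothetical boundary marker: an invisible element whose label
            // tells VoiceOver users they have reached the end of the group.
            Color.clear
                .frame(width: 1, height: 1)
                .accessibilityElement()
                .accessibilityLabel("End of Filters")
        }
        // Group the disclosure and its children so VoiceOver treats them as
        // one container while swiping through the screen.
        .accessibilityElement(children: .contain)
    }
}
```

Whether something like this duplicates a cue the system already provides is exactly what I am trying to confirm.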
Replies: 0 · Boosts: 0 · Views: 63 · Activity: May ’25
Add VoiceOver touch gesture guidance for frames/iframes in webView and Safari
Please update the Accessibility OS Settings for VoiceOver in iOS and iPadOS to include frames on the Rotor, and to make web navigation and component gestures easier to find and assign. Please also add content to the iPhone and iPad User Guides on using VoiceOver for web navigation with touch gestures, specifically for iframes.

There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPad on accessing iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and in internet searches, is that iframes in Safari or in an in-app webView are only reachable with explore by touch. If explore by touch is the only option for some interactions, that needs to be stated in the Apple User Guides. If not, details on the touch gestures equivalent to the keyboard interactions VoiceOver has on the Mac need to be made clear to users.

VoiceOver for Mac includes a default keyboard interaction, VO-Command-F, in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac), and a user can add a rotor option for web navigation by frames. VoiceOver for iPhone and iPad does not include a default swipe gesture assigned to frames, and the option is not available for the Rotor. While the iPhone User Guide explains that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that the command to assign, "Move to the next frame", is tucked into the advanced navigation commands in the VoiceOver Accessibility Settings. At least on my phone, the word "frame" was not searchable, despite the All Commands screen having a search bar.
Replies: 1 · Boosts: 0 · Views: 85 · Activity: Apr ’25
Components with earcon/haptic feedback for VoiceOver users
I want to understand which component types are intended to have an associated hint text, haptic feedback, or earcon for VoiceOver screen reader users. Is there a list somewhere, or a HIG guideline, for which transition types should have a sound?

Some transitions in Apple apps generally include different sounds, such as:
- opening a new screen
- screen dimming
- when a VoiceOver user swipes from the header/navbar to the body
- a scraping sound when swiping up or down a page
- the beginning or end of the body section in Calculator when swiping from one row to the next
- opening a pop-up menu

I would also appreciate any direction on which code strings are associated with these sounds, and how custom components can produce these sounds, haptics, or hints where they are expected. On the other hand, I don't want to take that information and then dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
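For what it is worth, the closest thing I have found in code so far is the set of UIAccessibility notifications, which seem to play the matching system earcons when posted. A small sketch under that assumption; the view controller and element names are just examples:

```swift
import UIKit

final class FilterPanelViewController: UIViewController {
    private let panel = UIView()
    private let firstControl = UIButton(type: .system)

    func showPanel() {
        panel.isHidden = false
        // Posting .screenChanged plays the screen-change earcon and moves
        // VoiceOver focus to the element passed as the argument.
        UIAccessibility.post(notification: .screenChanged, argument: firstControl)
    }

    func updatePanelLayout() {
        // .layoutChanged is the lighter-weight variant for in-place updates;
        // it also produces an audible cue and can re-anchor focus.
        UIAccessibility.post(notification: .layoutChanged, argument: panel)
    }

    func announceResultCount(_ count: Int) {
        // .announcement speaks arbitrary text without moving focus.
        UIAccessibility.post(notification: .announcement,
                             argument: "\(count) results shown")
    }
}
```

What I cannot tell from the documentation is which of these cues (or others) specific component types are expected to produce on their own.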
Replies: 3 · Boosts: 1 · Views: 731 · Activity: Jan ’25
VoiceOver: Detect Languages
My app does not automatically switch languages (voices) in VoiceOver when VoiceOver is on and the screen includes both English and Spanish content. Instead of switching to the correctly accented voice, the content is announced in whatever my manual Voices rotor setting is. I can manually switch the voice in the rotor to make words sound intelligible, but my main concern is that language changes are not auto-detected even though that feature is turned on in my Settings. VoiceOver does detect language changes in other apps, so I think there must be misplaced or missing accessibilityLanguage strings somewhere in my app. Or is it more than that, involving localization considerations? I reached out to the Apple Accessibility team and was directed to open a ticket here, as my question is about the underlying code. I am a novice developer and primarily an accessibility SME; I expect that when "Detect Languages" is on in the user's VoiceOver settings, the screen reader's speech output will automatically switch to the correct language/accent. I recognize there is a problem but am not sure where the breakdown is, and I would like guidance on how to fix it to relay to my teams. https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage
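For context on what I have tried to understand so far: my reading is that the language can be declared per element when automatic detection does not kick in. A small UIKit sketch of the two approaches I have found documented; the labels and strings are only examples:

```swift
import UIKit

func configureGreetingLabels(in view: UIView) {
    // Approach 1: tag a whole element with a language, so VoiceOver
    // reads everything in it with the Spanish voice.
    let spanishLabel = UILabel()
    spanishLabel.text = "Bienvenido a la aplicación"
    spanishLabel.accessibilityLanguage = "es-ES"

    // Approach 2: mark just a span of mixed-language text using the
    // accessibilitySpeechLanguage attribute on an attributed label.
    let mixedLabel = UILabel()
    let text = NSMutableAttributedString(string: "Tap here to read ")
    text.append(NSAttributedString(
        string: "Bienvenido",
        attributes: [.accessibilitySpeechLanguage: "es-ES"]))
    mixedLabel.text = text.string
    mixedLabel.accessibilityAttributedLabel = text

    view.addSubview(spanishLabel)
    view.addSubview(mixedLabel)
}
```

What I cannot tell is whether missing attributes like these are the actual breakdown in my app, or whether localization configuration also plays a role.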
Replies: 1 · Boosts: 0 · Views: 806 · Activity: Nov ’24
Full keyboard access UI elements
Hello, since the Full Keyboard Access Help menu is a little vague about the nuances between Tab, the arrow keys, and Ctrl+Tab for navigation, could you point me to where I can find the intended mapping of FKA keys to UI elements? For example, in several Apple iOS apps the UITabBar at the bottom is navigable with any of the three options mentioned above. In other contexts, the Tab key only moves the user to the tab bar section, and the icons are then focusable with the arrow keys or Ctrl+Tab. When a modal pops up stating I will be leaving an app, should the choices be navigable with Tab, with Ctrl+Tab, or with the arrow keys as well? In other places, like news articles in Apple News, it seems that I cannot scroll with the arrow keys to read the various paragraphs, nor interact with links in the article at all. If there is a separate keyboard shortcut for links or the scrollbar, please update the Help menu. It seems pretty straightforward that the arrow keys navigate between HStacks and VStacks; is that an accurate guess of arrow-key behavior? I feel like I'm guessing in several places within the content groups.
Replies: 4 · Boosts: 0 · Views: 2.1k · Activity: Feb ’24
Scrolling with Full Keyboard Access
I'm testing Full Keyboard Access in my app, and in the built-in apps on my iPhone 12 mini running iOS 17. My work will directly impact how much accessibility review is done on our iOS app, which has millions of unique views a month.

In several Apple apps I cannot seem to scroll down through the screen when the main view has focus. For example, the Home app does not scroll with the arrow keys or Ctrl+Tab through any of the six main content groups on the Discover screen; it almost appears to be a single static image, and the "Getting Started" button cannot be activated. I can activate sections further down when I enable gestures, but I cannot pinpoint a specific location. The Stocks app includes Top Stories from Apple News; in either app I can select a story, which brings up the article full screen, but then I cannot use the arrow keys or Ctrl+Tab to read the article or interact with inline links. Ctrl+Tab selects button features such as watching an embedded video or live coverage, then jumps down to the end of the article to focus on Related Stories, ignoring all the links in between. I am able to somewhat move through the article text with keyboard gestures, but many of these articles have embedded links or content after the article (before "Related Stories").

I work in digital accessibility and need to be able to tell my teams what the expected behavior is and where to see examples of it. If Apple can't demonstrate Full Keyboard Access in its own apps, this is a problem. Our own app has some of these issues, but I am unsure how to recommend a solution when the scroll view seems not to work even in native iOS apps by Apple.
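For our own custom containers, the only programmatic hook I have found is the UIAccessibility scroll action; I do not know whether Full Keyboard Access drives the same path as VoiceOver's three-finger scroll, so this is only a sketch under that assumption, with made-up view names:

```swift
import UIKit

// A custom paging container that is not backed by UIScrollView.
// Sketch: implementing the UIAccessibility scroll hook so assistive
// technologies have a programmatic way to page through the content.
// I am not certain Full Keyboard Access uses this path (VoiceOver's
// three-finger scroll does), so it is an assumption to verify.
final class StoryPagerView: UIView {
    private var pageIndex = 0
    private let pageCount = 5

    override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
        switch direction {
        case .down, .next:
            guard pageIndex < pageCount - 1 else { return false }
            pageIndex += 1
        case .up, .previous:
            guard pageIndex > 0 else { return false }
            pageIndex -= 1
        default:
            return false
        }

        // Announce the new position and let the system play its page-turn cue.
        UIAccessibility.post(notification: .pageScrolled,
                             argument: "Page \(pageIndex + 1) of \(pageCount)")
        return true
    }
}
```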
Replies: 1 · Boosts: 2 · Views: 1.8k · Activity: Feb ’24