I am building a language learning app for an unlisted primary language. My plan is to go with English. Any other suggestions or heads-ups? Check screenshot.
It's unfortunate that I have to tag a language learning app incorrectly, and a tag for that language probably does not exist across the Apple system.
Between the two languages, my app would be split at, let's say, 10% and 90%.
I am launching an app for an unlisted primary language. I consider whatever is inside the app as the primary language, and that won't be English. The secondary language would be English.
I would like to ask what the new trackpad shortcuts are in the latest iPadOS when using a trackpad. The three-finger swipe to the right and left seems to have been removed. Thank you.
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it is not automatically focused by VoiceOver, and in some cases, VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.
Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left — VoiceOver continues to announce items in the presenting view controller.
Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.
Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.
Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
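In case it helps others hitting this, one possible interim mitigation is sketched below. It is untested against this exact bug, the 0.5-second delay is an arbitrary assumption, and the function name is made up; the idea is simply to post a screen-change notification shortly after presenting the sheet so VoiceOver re-evaluates what is on screen.
import StoreKit
import UIKit

func presentManageSubscriptions(in scene: UIWindowScene) {
    // The StoreKit call suspends until the user dismisses the sheet,
    // so run it in its own task.
    Task {
        try? await AppStore.showManageSubscriptions(in: scene)
    }
    // Hypothetical workaround: once the sheet has had time to appear, post a
    // screen-change notification to nudge VoiceOver focus onto the new modal.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
        UIAccessibility.post(notification: .screenChanged, argument: nil)
    }
}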
Topic:
Accessibility & Inclusion
SubTopic:
General
A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.
Accessibility Nutrition Labels are a really big step forward for the experience people have on the App Store to find apps that will work for them. How should developers get started with Accessibility Nutrition Labels?
A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.
Which accessibility features can I indicate support for in Accessibility Nutrition Labels?
The accessibility features covered include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. These display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.
With the new Accessibility Nutrition Labels, will app store reviewers validate what we select?
The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.
Are there any updates to tools for analyzing the accessibility of our apps?
Although there aren't new updates this year, continued support for Accessibility Audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build. These audits analyze aspects like contrast, dynamic type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.
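As a rough illustration of the XCTest support mentioned here (Xcode 15 or later assumed; the test class and method names are placeholders), an audit can run as part of an ordinary UI test:
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testMainScreenPassesAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Audits the on-screen view for contrast, Dynamic Type support,
        // text clipping, element labels, and more; throws if issues are found.
        try app.performAccessibilityAudit()
    }
}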
What are accessibility features you wish more people integrated?
User input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support are all features that could be adopted more widely, to the benefit of users.
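For instance, a minimal SwiftUI sketch (the button, its action, and the spoken phrases are invented for illustration) showing how little code Voice Control input labels and a keyboard shortcut require:
import SwiftUI

struct FavoriteButton: View {
    var body: some View {
        Button("Add to Favorites") {
            // Hypothetical action.
            print("favorited")
        }
        // Alternate phrases a Voice Control user can speak to activate the button.
        .accessibilityInputLabels(["Add to Favorites", "Favorite", "Add"])
        // Keyboard access for users who navigate without a pointer.
        .keyboardShortcut("f", modifiers: .command)
    }
}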
What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?
Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.
How does Liquid Glass respond to contrast? Especially for text and low contrast environments.
Content legibility is a crucial aspect of the Liquid Glass design. It inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.
What are some Apple apps that stand out for their accessibility?
Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
I have a TabView with a sample tabItem as follows:
.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .accessibilityLabel("Import Text")
}
But the accessibility setting for larger display sizes does not seem to have any effect when turned on, nor do dynamic font sizes:
.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .font(.largeTitle)
        .accessibilityLabel("Import Text")
}
The tabItems appear at a fixed size. The tab contents scale well, so this does not look pleasant at all.
Is this a known bug in SwiftUI?
Topic:
Accessibility & Inclusion
SubTopic:
General
I have the following method to insert @mentions to a text field:
func insertMention(user: Token, at range: NSRange) -> Void {
    // Render the mention as an image-based text attachment.
    let tokenImage: UIImage = renderMentionToken(text: "@\(user.username)")
    let attachment: NSTextAttachment = NSTextAttachment()
    attachment.image = tokenImage
    attachment.bounds = CGRect(x: 0, y: -3, width: tokenImage.size.width, height: tokenImage.size.height)
    attachment.accessibilityLabel = user.username
    attachment.accessibilityHint = "Mention of \(user.username)"
    // Tag the single attachment character with custom attributes for later lookup.
    let attachmentString: NSMutableAttributedString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
    attachmentString.addAttribute(.TokenID, value: user.id, range: NSRange(location: 0, length: 1))
    attachmentString.addAttribute(.Tokenname, value: user.username, range: NSRange(location: 0, length: 1))
    // Splice the token into the text view's content, followed by a space.
    let mutableText: NSMutableAttributedString = NSMutableAttributedString(attributedString: textView.attributedText)
    mutableText.replaceCharacters(in: range, with: attachmentString)
    mutableText.append(NSAttributedString(string: " "))
    textView.attributedText = mutableText
    textView.selectedRange = NSRange(location: range.location + 2, length: 0)
    mentionRange = nil
    tableView.isHidden = true
}
When I use Xcode's Accessibility Inspector to inspect the text input, the inserted token is not read by the inspector; instead, whitespace is shown for the token. I want to set the accessibility label to the string content of the NSTextAttachment. How?
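One direction worth trying, sketched below, is to override the text view's accessibilityValue so each attachment character (U+FFFC) is replaced by the username stored under the custom .Tokenname attribute from the code above. This is a sketch, not a confirmed fix, and it assumes the key is defined as shown.
import UIKit

extension NSAttributedString.Key {
    // Same custom key the insertMention code uses.
    static let Tokenname = NSAttributedString.Key("Tokenname")
}

class MentionTextView: UITextView {
    override var accessibilityValue: String? {
        get {
            guard let attributed = attributedText else { return super.accessibilityValue }
            let spoken = NSMutableString(string: attributed.string)
            // Walk backwards so earlier ranges stay valid as replacements
            // lengthen the string.
            attributed.enumerateAttribute(.Tokenname,
                                          in: NSRange(location: 0, length: attributed.length),
                                          options: [.reverse]) { value, range, _ in
                if let name = value as? String {
                    spoken.replaceCharacters(in: range, with: "@\(name)")
                }
            }
            return spoken as String
        }
        set { super.accessibilityValue = newValue }
    }
}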
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
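One blunt approach, sketched here, is to supply an explicit spoken label while leaving the displayed text untouched. It is only a sketch: it trades away automatic localization, must be kept in sync with the displayed string, and the label also drives braille output, so it may not fully satisfy the braille constraint.
import SwiftUI

struct ExpressionText: View {
    var body: some View {
        Text("(2+3)×4")
            // Spoken override with the parentheses called out explicitly.
            .accessibilityLabel("left paren 2 plus 3 right paren times 4")
    }
}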
When VoiceOver reads decimal numbers with six or more digits after the decimal, it stops announcing the decimal separator and also adds pauses between each digit.
Text("0.12345") // VoiceOver: "zero **point** one two three four five"
Text("0.123456") // VoiceOver: "zero one, two, three, four, five, six"
How can I force VoiceOver to announce the decimal separator ("point") and not insert pauses regardless of the number of decimal digits?
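Absent a documented switch, one workaround sketch is to generate the spoken form yourself and provide it as the label. English digit names are assumed, and the helper name is made up:
import SwiftUI

// Builds "zero point one two three four five six" from "0.123456".
func spokenDecimal(_ string: String) -> String {
    let names: [Character: String] = [
        "0": "zero", "1": "one", "2": "two", "3": "three", "4": "four",
        "5": "five", "6": "six", "7": "seven", "8": "eight", "9": "nine",
        ".": "point"
    ]
    return string.map { names[$0] ?? String($0) }.joined(separator: " ")
}

struct DecimalText: View {
    var body: some View {
        Text("0.123456")
            .accessibilityLabel(spokenDecimal("0.123456"))
    }
}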
There are several ways we are supposed to be able to control a11y (accessibility) focus in FKA (Full Keyboard Access) mode.
We should be able to set up an @AccessibilityFocusState variable holding an enum for the different views that we want to receive a11y focus. That works from VO (VoiceOver) but not from FKA mode. See the sample project on GitHub, linked from this Stack Overflow question:
https://stackoverflow.com/questions/79067665/how-to-manage-accessibilityfocusstate-for-swiftui-accessibility-keyboard
Similarly, we are supposed to be able to use accessibilitySortPriority to control the order in which views are selected when a user tabs between views in FKA. That also works from VO but not from FKA mode. In the sample code below, the `.accessibilitySortPriority()` view modifiers cause VO to follow a non-standard order when you swipe between views, but they have no effect in FKA mode.
Is there a way to either set the a11y focus or change the order in which the views are selected that actually works in SwiftUI when the user is in FKA mode?
Code that should cause FKA to tab between text fields in a custom order:
struct ContentView: View {
    @State private var val1: String = "val 1"
    @State private var val2: String = "val 2"
    @State private var val3: String = "val 3"
    @State private var val4: String = "val 4"

    var body: some View {
        VStack {
            TextField("Value 1", text: $val1)
                .accessibilitySortPriority(3)
            VStack {
                TextField("Value 2", text: $val2)
                    .accessibilitySortPriority(1)
            }
            HStack {
                TextField("Value 3", text: $val3)
                    .accessibilitySortPriority(2)
                TextField("Value 4", text: $val4)
                    .accessibilitySortPriority(4)
            }
        }
        .padding()
    }
}
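For reference, the @AccessibilityFocusState pattern described above looks roughly like this minimal sketch (the field names are invented); per the post, it moves VoiceOver focus but has no effect in FKA mode:
import SwiftUI

struct FocusExample: View {
    enum Field { case name, email }
    @AccessibilityFocusState private var focus: Field?
    @State private var name = ""
    @State private var email = ""

    var body: some View {
        VStack {
            TextField("Name", text: $name)
                .accessibilityFocused($focus, equals: .name)
            TextField("Email", text: $email)
                .accessibilityFocused($focus, equals: .email)
            // Programmatically moves VoiceOver focus; FKA reportedly ignores it.
            Button("Focus email") { focus = .email }
        }
        .padding()
    }
}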
Hey all — hoping someone here has dealt with this before.
I’m testing an iOS app via TestFlight, and when I originally got access, I didn’t have an iPhone. So I signed in with my Apple ID on my girlfriend’s iPhone and used TestFlight there. Everything worked fine.
Now I finally have my own iPhone (iPhone 16), downloaded TestFlight, signed in with the same Apple ID, and had the developer resend the invite. But when I tap "Open in TestFlight" from the invite email, I get this error:
“Couldn’t load app because your Apple account has already been associated to this app.”
The dev tried removing me as a tester and re-adding me, I’ve deleted TestFlight from both phones, rebooted, reinstalled, waited in between — still no luck. Even tried opening the invite link in Safari instead of Mail.
Is there any way to get Apple to fully reset the association with the old device so I can use TestFlight on my new iPhone? Or do I really need to make a new Apple ID just to get around this?
Any help would be huge — thanks!
On iOS, there is accessibilityLanguage.
While it is possible to scroll content using VoiceOver on macOS, I was not able to find any NSAccessibility APIs related to it (such as accessibilityScroll: on iOS).
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
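For the frame-update part of the question, the usual recipe (a sketch; movingView is hypothetical) is to recompute accessibilityFrame in screen coordinates each time the view moves, without posting a layout-changed notification on every update:
import UIKit

func syncAccessibilityFrame(of movingView: UIView) {
    // accessibilityFrame is expressed in screen coordinates, so convert
    // from the view's own bounds rather than assigning its frame directly.
    movingView.accessibilityFrame = UIAccessibility.convertToScreenCoordinates(
        movingView.bounds, in: movingView)
}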
I have just purchased an Apple Developer account from Bangladesh. It sent login verification codes perfectly the first 2-3 times, but after that, no matter what I do, it doesn't send a verification code. I am now stuck and cannot log in, which is extremely frustrating.
I'm developing a calculator app and working to ensure a great experience for both VoiceOver and Braille display users.
For expressions like (2+3)×5, I need two different accessibility outputs:
VoiceOver (spoken): A descriptive string like “left paren two plus three right paren times five,” provided via .accessibilityValue. I'm using a custom spellOut function since VoiceOver doesn't announce parentheses—which are kind of important when doing math!
Braille (symbolic): The literal math string (2+3)×5, provided using .accessibilityCustomContent("", ...), with an empty label so it’s not spoken aloud.
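Concretely, that dual-output setup might look like the following sketch (SwiftUI, iOS 15 or later assumed for accessibilityCustomContent; the view name is invented):
import SwiftUI

struct ExpressionView: View {
    let expression = "(2+3)×5"
    var body: some View {
        Text(expression)
            // Spoken: descriptive string with the parentheses called out.
            .accessibilityValue("left paren two plus three right paren times five")
            // Braille: literal symbols; the empty label keeps it from being spoken.
            .accessibilityCustomContent("", expression)
    }
}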
The issue: I don’t have access to a Braille display device and Xcode’s Accessibility Inspector doesn’t seem to show the custom content.
Is there any way to confirm that custom Braille content is being set correctly in Simulator or with other tools?
Or…is there a "math mode" in VoiceOver that forces it to announce parentheses?
Any advice or workarounds would be much appreciated!
Thanks,
Uhl
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
External Accessory
iOS
Accessibility
SwiftUI
After updating to the iOS 26 Beta version, the screenshot option within the AssistiveTouch menu has stopped working. Tapping on the "Screenshot" icon does not perform any action.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi everyone,
I've encountered a rare and strange crash in my app that I can't consistently reproduce. The crash seems to occur deep within Apple's internal frameworks, and I can't pinpoint which line of my own code is causing it. Here's the crash stack trace:
#44 AXSpeech
SIGSEGV
SEGV_ACCERR
0 CoreFoundation ___CFCheckCFInfoPACSignature + 4
1 CoreFoundation _CFRunLoopSourceSignal + 28
2 Foundation _performQueueDequeue + 492
3 Foundation ___NSThreadPerformPerform + 88
4 CoreFoundation ___CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
5 CoreFoundation ___CFRunLoopDoSource0 + 176
6 CoreFoundation ___CFRunLoopDoSources0 + 340
7 CoreFoundation ___CFRunLoopRun + 828
8 CoreFoundation _CFRunLoopRunSpecific + 608
9 Foundation -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
10 TextToSpeech _TTSCFAttributedStringCreateStringByBracketingAttributeWithString + 776
11 Foundation ___NSThread__start__ + 732
12 libsystem_pthread.dylib __pthread_start + 136
Sometimes, instead of line 10 referencing _TTSCFAttributedStringCreateStringByBracketingAttributeWithString, it shows:
10 TextToSpeech LogWarning(char const*, ...) + 7288
Has anyone experienced a similar issue or know what might be triggering this crash? Any guidance on how to investigate or resolve this would be greatly appreciated. Thank you!
Hello,
I'm currently unable to access App Store Connect. When I try to open https://appstoreconnect.apple.com, I receive the following error message:
“appstoreconnect.apple.com is currently unable to handle this request.”
I’ve tried the following steps, but the issue persists:
Cleared browser cache and cookies
Tried different browsers (Safari, Chrome)
Attempted from multiple devices and networks
Is this a known issue or is there any workaround available?
Would appreciate any help or update on the current status.
Thank you,
Topic:
Accessibility & Inclusion
SubTopic:
General