Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under the Accessibility & Inclusion topic. Each post below is followed by its reply, boost, and view counts and the month of its latest activity.

iPhone screen layers
I need to understand the different layers in the OLED displays of iPhone X and later, as I am designing a hardware attachment. The screens seem to project letters and images from a different layer than the subpixel layer. Is this proprietary information, or is there a resource that explores these layers?
Replies: 0 · Boosts: 0 · Views: 112 · Activity: Apr ’25
Camera Crashes
Hi everybody, I'm trying to build a QR code scanner and generator app for iOS. Whenever I try to implement the camera, the app crashes with this message: "This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data." I tried reducing the app to nothing but the camera, with the same result. Any ideas? Thank you and best regards, Horst Schippers
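For reference, the crash above is what the system reports when the NSCameraUsageDescription key is missing from Info.plist; a minimal sketch of adding the key and checking camera authorization before starting capture (function and callback names are illustrative):

import AVFoundation

// Info.plist must contain, for example:
// <key>NSCameraUsageDescription</key>
// <string>The camera is used to scan QR codes.</string>

func startScanningIfAuthorized(startSession: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        startSession()
    case .notDetermined:
        // Prompts the user; the system shows the usage description string.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted { DispatchQueue.main.async { startSession() } }
        }
    default:
        break // Denied or restricted; direct the user to Settings.
    }
}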
Replies: 0 · Boosts: 0 · Views: 72 · Activity: Apr ’25
Clarification on Color Path Determination in Wallet Provisioning (Green, Yellow, Orange) Path Recommendation
Hi, I’ve been reviewing the Apple Wallet provisioning documentation (Getting Started with Apple Pay In-App Provisioning: Verification, Security, Wallet Extensions) and had a few questions regarding the color path recommendation (Green, Yellow, Orange, Red) returned during the in-app provisioning flow:
Who determines the color path: Apple directly, the Payment Network Operator (PNO), or both?
What criteria are used to determine the color path (e.g., device info, Apple ID reputation, past provisioning attempts)?
At what point in the provisioning flow is the color path recommendation received? Is it included in the response after the PKAddPaymentPassRequest is submitted? Is it accessible through any specific property or callback in the delegate method?
Additionally, for the Orange Path with Reason Code 0G, I understand that in-app verification is not allowed and must be handled via tenured channels (e.g., SMS/email). Can you confirm whether this logic still applies for requests initiated from within the issuer's iOS app? I would appreciate any clarification or pointers to related documentation.
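For context, a minimal sketch of the client-side PassKit delegate involved in in-app provisioning; as far as the public API shows, PKAddPaymentPassRequest exposes no color path property, so any path recommendation would be handled between the issuer server and the PNO (this is an assumption, not confirmed by the post):

import PassKit

final class ProvisioningDelegate: NSObject, PKAddPaymentPassViewControllerDelegate {
    func addPaymentPassViewController(
        _ controller: PKAddPaymentPassViewController,
        generateRequestWithCertificateChain certificates: [Data],
        nonce: Data,
        nonceSignature: Data,
        completionHandler handler: @escaping (PKAddPaymentPassRequest) -> Void
    ) {
        // Send certificates/nonce to the issuer server, which talks to the PNO.
        // Server-side responses are where provisioning decisions are applied.
        let request = PKAddPaymentPassRequest()
        // request.encryptedPassData = ... (returned by the issuer server)
        handler(request)
    }

    func addPaymentPassViewController(
        _ controller: PKAddPaymentPassViewController,
        didFinishAdding pass: PKPaymentPass?,
        error: Error?
    ) {
        // Provisioning outcome; no color path value is surfaced here.
    }
}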
Replies: 0 · Boosts: 0 · Views: 132 · Activity: May ’25
CallKit integration: frequent Error 4099, NEAppPushDelegate notifications not received, and the incoming-call screen not appearing
In an iOS app I use NEAppPushSession to receive NEAppPushDelegate notifications and display the CallKit incoming-call screen, but I am facing the following problem. On 8/13, the error below appeared repeatedly in the logs. I ran roughly 120 notification delivery tests, and the error occurred throughout:
2025-08-14 11:27:06.793073 +0900 nesessionmanager NESMAppPushSession[SimplePushDefaultConfiguration:7B7218F3-94B5-4AE5-9B9E-94E176694D02] failed to report incoming call to CallKit, error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process." UserInfo={NSDebugDescription=The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process.}
After this error log appeared repeatedly, the CallKit incoming-call screen stopped being displayed, although notification monitoring still appeared to start. Fifteen hours later, on 8/14, notifications began arriving again, perhaps because of the elapsed time. But when I reran the delivery test, the same error log appeared; after about 120 more tests the error again recurred repeatedly and the CallKit screen stopped appearing, while monitoring once again appeared to start. On 8/15, notifications were not delivered even after 15 hours, although monitoring still appeared to start. Restarting the iPhone, cleaning the Xcode build, and uninstalling and reinstalling the app did not bring notifications back. Strangely, devices other than the one where the issue first occurred can no longer receive these notifications either. Other notifications are received fine; only our own NEAppPushManager notifications fail.
My questions:
What should I do to get notifications delivered again?
Is this a failure caused by emitting too many 4099 errors?
What is causing the 4099 error?
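For context, a minimal sketch of the pattern the post describes, an NEAppPushDelegate receiving a call message and reporting it to CallKit; names and the userInfo key are illustrative, and the 4099 error in the log is raised by the system's nesessionmanager rather than by app code:

import NetworkExtension
import CallKit

final class PushHandler: NSObject, NEAppPushDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())
    // Note: provider.setDelegate(_:queue:) must be called with a
    // CXProviderDelegate before reporting calls.

    func appPushManager(
        _ manager: NEAppPushManager,
        didReceiveIncomingCallWithUserInfo userInfo: [AnyHashable: Any]
    ) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(
            type: .generic,
            value: userInfo["caller"] as? String ?? "Unknown" // hypothetical key
        )
        // An invalidated XPC connection to CallKit would surface as an
        // NSCocoaErrorDomain Code=4099 error in this completion handler.
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error { print("Failed to report call: \(error)") }
        }
    }
}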
Replies: 0 · Boosts: 0 · Views: 470 · Activity: Aug ’25
Accessibility: VoiceOver is not treating the navigation bar's left button as the first focused element
VoiceOver is not treating the navigation bar's left (back) button as the first focused element. If we navigate from A -> B, focus goes to the first element inside view B, not to the back button or B's navigation title. If we post an accessibility notification in B's onAppear, focus does not shift; VoiceOver reads the back button first and then B's content items, but it never actually focuses the back button in SwiftUI. How can I move focus to the navigation bar's back button or navigation title? My understanding is that the system prioritizes the first focusable element in the view hierarchy, while the navigation bar (including the back button and title) is managed separately by the system; it is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If that is right, it seems a little strange. Why was it designed this way? Can you share the thinking behind it? Thanks
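For reference, a minimal sketch of the two mechanisms the post alludes to: @AccessibilityFocusState for elements inside the view, and a UIKit screen-changed notification that asks VoiceOver to re-evaluate focus. The system navigation bar cannot be targeted directly from SwiftUI, so this only steers focus within the view's own hierarchy (view names are illustrative):

import SwiftUI
import UIKit

struct DetailView: View {
    @AccessibilityFocusState private var contentFocused: Bool

    var body: some View {
        VStack {
            Text("Detail content")
                .accessibilityFocused($contentFocused)
        }
        .navigationTitle("Detail")
        .onAppear {
            // Option 1: focus an element inside the view hierarchy.
            contentFocused = true
            // Option 2: announce a new screen; passing nil lets the
            // system choose the first focusable element.
            UIAccessibility.post(notification: .screenChanged, argument: nil)
        }
    }
}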
Replies: 0 · Boosts: 0 · Views: 399 · Activity: Sep ’25
Using WebSocket for BCI Click Input in visionOS - FocusState vs. System-Level Limitations
Hi everyone, my team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with Locked-In Syndrome who use Brain-Computer Interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.
Problem 1: Simulating eye tracking in the Simulator. We are testing onHover with "Send pointer to the device" under I/O > Input in the Simulator, and while it mostly works (a bit laggy), we found that onHover won't function on actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate focus-based gaze detection without a real Vision Pro?
Problem 2: A WebSocket-triggered "click" doesn't work outside the app. We successfully use WebSocket to send a custom signal (a "1" from the brain-signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works. We suspect this is due to sandbox restrictions or a lack of system-level permissions. Is it possible in any way to:
- trigger interaction events outside the app using custom input (like BCI via WebSocket)?
- access system-wide click/tap simulation APIs from within visionOS apps?
- integrate this with accessibility services (like Voice Control or AssistiveTouch)?
We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods in visionOS. Thanks in advance for any insight you can provide!
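For Problem 1 only, a minimal sketch of how hover and focus are typically expressed in SwiftUI on visionOS; this is an assumption-based illustration, not a confirmed workaround, and gaze data itself is never delivered to the app:

import SwiftUI

struct GazeTargetView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        Button("Tap target") {
            // Triggered by the system tap gesture (eyes + pinch on device).
        }
        // Rendered out-of-process by the system when the user looks at the
        // view; the app is not told where the user is looking.
        .hoverEffect(.highlight)
        // FocusState here reflects focus-based input (e.g., connected
        // keyboard or accessibility focus), not raw eye gaze.
        .focusable(true)
        .focused($isFocused)
    }
}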
Replies: 0 · Boosts: 0 · Views: 151 · Activity: Apr ’25
Attaching procedural audio to an ARKit SCNNode
I’m developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode does not produce any sound, nor does it generate any error messages (see the linked Stack Overflow post).
Working implementation with a static audio file:
let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)
Attempted implementation with procedural audio:
let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    // Audio generation code
    return noErr
}
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)
In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when used with SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What doesn’t work is creating procedural audio with an AVAudioNode, as documented in the Apple docs. Additionally, I explored the WWDC18 AR game project, SwiftShot, which utilizes SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics function correctly, but the audio does not play. I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, this method does not appear to be available in Swift. Any insights or guidance on this matter would be greatly appreciated.
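For reference, a minimal self-contained sketch of the setup in question: a procedural AVAudioSourceNode (a sine generator) attached to a SceneKit node via SCNAudioPlayer(avAudioNode:). Whether SceneKit actually drives the node's render block is exactly what the post reports as failing:

import SceneKit
import AVFoundation

func makeToneNode(frequency: Double = 440, sampleRate: Double = 44_100) -> AVAudioSourceNode {
    var phase: Double = 0
    let increment = 2 * Double.pi * frequency / sampleRate
    return AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
        let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
        for frame in 0..<Int(frameCount) {
            let sample = Float(sin(phase))
            phase += increment
            // Write the same sample to every channel buffer.
            for buffer in buffers {
                let ptr = buffer.mData!.assumingMemoryBound(to: Float.self)
                ptr[frame] = sample
            }
        }
        return noErr
    }
}

// Attach to a SceneKit node, as in the post:
// let player = SCNAudioPlayer(avAudioNode: makeToneNode())
// scnNode.addAudioPlayer(player)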
Replies: 0 · Boosts: 0 · Views: 209 · Activity: Apr ’25
Avoid trackpad gesture conflict between dragging and accessibility zooming when using three fingers
“Double-tap three fingers and drag to change zoom” should suppress “Three Finger Drag”. Currently these gestures are triggered simultaneously, for no real reason. I saw different behaviors in different environments, but none is the desired one. This seems like a bug, so I filed a feedback report describing the current and desired behavior.
Replies: 0 · Boosts: 0 · Views: 761 · Activity: Aug ’25
Focus issues with ScrollView in iOS 18
When using an app via an external keyboard, FocusState and .focused worked just fine up through iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the focused modifier removes elements from the external keyboard's focus order. One such example: when a button using the focused modifier and @FocusState is inside a ScrollView, and that view is opened via NavigationLink, the button is not reachable via a Bluetooth (external) keyboard. TextEditor / vertical-axis TextFields also seem to be affected in the external-keyboard focus order when placed inside a ScrollView. Is this a known iOS 18 issue with ScrollView, or is there any tip to get this fixed? Sample code that can reproduce this issue:

struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Goto another view") {
                    goToNextView = true
                }
                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) {
                    EmptyView()
                }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet, onDismiss: {
                focused = true
                voFocused = true
            }, content: {
                VStack() {
                    Text("Hello World ! I'm in a bottomsheet")
                    Button("Close me") {
                        showBottomSheet = false
                    }
                }
            })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")
                // In iOS 18, this button doesn't get focused with a Bluetooth /
                // external keyboard. The issue occurs when these 3 combine in
                // iOS 18: a button using FocusState, inside a view that has a
                // ScrollView, opened via NavigationLink
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet, onDismiss: {
                focused = true
                voFocused = true
            }, content: {
                VStack() {
                    Text("Hello World ! I'm in a bottomsheet")
                    Button("Close me") {
                        showBottomSheet = false
                    }
                }
            })
            .padding()
        }
    }
}
Replies: 0 · Boosts: 1 · Views: 627 · Activity: Feb ’25
Programmatically Modifying Per-Application or System-Wide Color Filters Using Cocoa/Swift on macOS?
I'm looking into how to programmatically control the color filters in the Accessibility settings under "System Settings" -> "Accessibility" -> "Color Filters", in particular the "Intensity" and "Filter type" settings. As far as I have gathered (from GitHub, Stack Overflow, and querying some LLMs), changing this setting can only be accomplished using the CoreGraphics APIs or Accessibility APIs, but there doesn't seem to be a clear-cut example of doing this with public-facing APIs, short of lifting source code from another project wholesale or using private APIs. My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to overlay this capability on an application-by-application basis as a third-party developer, that would be the ideal scenario: for example, modifying the look and feel/UX of Launchpad, Photos, etc., as a third-party developer without access to the source code of the application being modified (with appropriate user consent, of course).
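For what it's worth, the closest clearly public CoreGraphics hook I'm aware of is the display transfer (gamma) API, which can apply a crude system-wide tint. This is an assumption-laden sketch: it is not the Accessibility Color Filters pipeline, it applies per-display rather than per-application, and the settings revert when the process exits:

import CoreGraphics

// Apply a warm tint to the main display via the public gamma-formula API.
// NOTE: this only approximates a color overlay; it does not touch the
// "Color Filters" accessibility setting.
let display = CGMainDisplayID()
let err = CGSetDisplayTransferByFormula(
    display,
    0.0, 1.0, 1.0,   // red:   min, max, gamma
    0.0, 0.8, 1.0,   // green: min, max, gamma (reduced max -> warm tint)
    0.0, 0.8, 1.0    // blue:  min, max, gamma
)
if err != .success {
    print("Failed to set transfer formula: \(err)")
}
// CGDisplayRestoreColorSyncSettings() restores the original settings.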
Replies: 0 · Boosts: 0 · Views: 381 · Activity: Jul ’25
VoiceOver is not respecting lang in HTML option
I have an HTML select whose options contain Spanish text. When VoiceOver reads the selected option (unopened), it switches to Spanish as expected. However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text. I have tried adding lang to the select tag and to the option tags, but neither helps: https://codepen.io/grahamfowles/pen/VYYRxMK
Replies: 0 · Boosts: 0 · Views: 139 · Activity: May ’25
A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.

Accessibility Nutrition Labels are a really big step forward for the experience people have on the App Store to find apps that will work for them. How should developers get started with Accessibility Nutrition Labels?
A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.

Which accessibility features can I indicate support for in Accessibility Nutrition Labels?
The accessibility features covered include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. These display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.

With the new Accessibility Nutrition Labels, will App Store reviewers validate what we select?
The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.

Are there any updates to tools for analyzing the accessibility of our apps?
Although there aren't new updates this year, continued support for accessibility audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build. These audits analyze aspects like contrast, dynamic type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.

What are accessibility features you wish more people integrated?
User input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support could all be adopted more widely to benefit users.

What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?
Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As the design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.

How does Liquid Glass respond to contrast, especially for text and low-contrast environments?
Content legibility is a crucial aspect of the Liquid Glass design. It inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.

What are some Apple apps that stand out for their accessibility?
Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
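As a reference for the XCTest accessibility audits mentioned above, a minimal sketch (the API is available from Xcode 15 on):

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Runs the built-in audit (contrast, dynamic type, text clipping,
        // element labels, and more) on the current view; the test fails
        // if any issues are found.
        try app.performAccessibilityAudit()
    }
}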
Replies: 0 · Boosts: 0 · Views: 896 · Activity: Jul ’25
TabItems in SwiftUI do not scale
I have a TabView with a sample tabItem as follows:

.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .accessibilityLabel("Import Text")
}

But the accessibility setting for larger display sizes does not seem to have any effect, nor do dynamic font sizes:

.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .font(.largeTitle)
        .accessibilityLabel("Import Text")
}

The tabItems appear at a fixed size. The tab contents scale well, so this does not look pleasant at all. Is this a known bug in SwiftUI?
Replies: 0 · Boosts: 0 · Views: 739 · Activity: Jul ’25
Japanese “Hattori” TTS voice missing from Settings > General > Read & Speak > Voices > Japanese on iOS 26
Steps: open Settings > General > Read & Speak > Voices > Japanese on iOS 26; “Hattori” is not listed and cannot be downloaded.
Expected: Hattori is available to download and select.
Actual: Hattori is absent from the catalog.
Regression: it was available on iOS 18.x on the same device.
Replies: 0 · Boosts: 0 · Views: 427 · Activity: Sep ’25
Handling VoiceOver Focus When Screen Changes (Push, Present, and SplitViewController)
I have some doubts about how VoiceOver handles focus when the screen updates. When a new UIViewController is pushed onto a UINavigationController or presented modally, how does VoiceOver decide which element to focus on? Is there a way to control or customize this behavior? In a UISplitViewController, when an item is selected in the primary view controller, the focus should shift to the relevant content in the secondary view controller. How can we ensure that VoiceOver correctly moves focus to the right element in the secondary panel?
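For reference, a minimal sketch of steering VoiceOver focus after a push, present, or split-view selection using UIAccessibility notifications, the standard UIKit mechanism for this (view and outlet names are illustrative):

import UIKit

final class DetailViewController: UIViewController {
    @IBOutlet private var headerLabel: UILabel!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // .screenChanged tells VoiceOver a new screen appeared; the argument
        // receives focus. Use .layoutChanged for smaller in-place updates,
        // such as revealing content in a split view's secondary pane.
        UIAccessibility.post(notification: .screenChanged, argument: headerLabel)
    }
}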
Replies: 0 · Boosts: 0 · Views: 148 · Activity: Apr ’25
Autonomous Single App Mode (ASAM) in macOS
Hello, I tried implementing ASAM for macOS per the Apple guidelines, with the configuration profile mentioned here, but didn't have any success. Apple then suggested using requestGuidedAccessSession on macOS, but that API is only available through Mac Catalyst, and it also didn't work even with valid configuration profiles. Has anyone had success with ASAM mode without the assessment entitlement?
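For reference, a minimal sketch of the UIAccessibility call mentioned above (Mac Catalyst / iOS); as the post notes, the device must be supervised with a profile that allows this bundle ID to enter ASAM, or the request reports failure:

import UIKit

func enterSingleAppMode() {
    // Requires MDM supervision and a configuration profile permitting this
    // app to enter Autonomous Single App Mode; otherwise success is false.
    UIAccessibility.requestGuidedAccessSession(enabled: true) { success in
        print("ASAM session started: \(success)")
    }
}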
Replies: 0 · Boosts: 0 · Views: 261 · Activity: Jul ’25