Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post · Replies · Boosts · Views · Activity

Fullscreen Detection
Hi, I want to detect whether there is a fullscreen window on each screen. The _AXUIElementGetWindow and kAXFullscreenAttribute approaches work, but only in a non-sandboxed environment. Is there any other way that also works in the sandbox? I don't think comparing the window size to the screen size is a reliable test, since that fails on MacBooks with a notch and when the menu bar is set to auto-hide. Thanks.
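For reference, a minimal sketch of the Accessibility-API approach the post describes (the function name is illustrative); it requires a non-sandboxed process with the Accessibility permission granted:

import ApplicationServices

// Checks every window of the given app for the AXFullScreen attribute.
// Requires AXIsProcessTrusted() == true and a non-sandboxed process,
// as the post notes.
func appHasFullscreenWindow(pid: pid_t) -> Bool {
    let appElement = AXUIElementCreateApplication(pid)
    var windowsRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(appElement, kAXWindowsAttribute as CFString, &windowsRef) == .success,
          let windows = windowsRef as? [AXUIElement] else { return false }
    for window in windows {
        var valueRef: CFTypeRef?
        // kAXFullscreenAttribute is the string "AXFullScreen".
        if AXUIElementCopyAttributeValue(window, "AXFullScreen" as CFString, &valueRef) == .success,
           let isFullscreen = valueRef as? Bool, isFullscreen {
            return true
        }
    }
    return false
}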
14
1
1.5k
Jul ’25
Problem with the Notes app
I have more than 1,000 notes organized in parent/child folders up to 5 levels deep. From the 5th level of folders I can no longer share a note: the note itself is not shared; instead, the one from the parent folder is shared. Thank you very much. Have a good day. Christophe
1
1
235
May ’25
Accessibility of Show Password Buttons
We have a password entry field with a "show password" button. The button effectively turns the secure text entry field into a non-secure text field, allowing the user to view what they typed. When VoiceOver is enabled, I am not including that button in the UI; including it doesn't seem to make sense to me, for the following reasons:

1. If you test properly with the Screen Curtain, the functionality is useless: you don't see anything. I've tried to explain this to my accessibility team. It also seems almost insulting to offer to show a blind user their password; I'm sure they'd love to see it, but they just can't.

2. Toggling that button turns a secure text entry into a non-secure one, so the app then literally speaks the password aloud. That seems like a security vulnerability to me: what if someone else overhears the password spoken aloud?

The accessibility team is insisting that I include the "show password" button when VoiceOver is enabled. This is the response I received: "functionality should be the same for VI users as for sighted users. It may happen that a VI user wants to check what is typed into password field in order to correct mistakes".

I don't agree, because functionality should not always be the same; it should be altered as necessary to make the user experience as accessible as possible, and in this scenario the functionality doesn't make sense at all in a VoiceOver setting.

Any thoughts on this? Am I incorrect here? Are there benefits to including a "show password" button for a user utilizing VoiceOver? And what should the functionality then be: speak the password aloud? Thanks.
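For reference, a minimal UIKit sketch of the approach described above, hiding the toggle from VoiceOver only; the class and property names are illustrative, not from the original app:

import UIKit

final class PasswordEntryView: UIView {
    let passwordField = UITextField()
    let showPasswordButton = UIButton(type: .system)

    func configureAccessibility() {
        passwordField.isSecureTextEntry = true
        // Removing the toggle from the accessibility tree means VoiceOver
        // never lands on it; sighted users still see and use it normally.
        showPasswordButton.isAccessibilityElement = false
    }
}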
4
0
456
18h
iOS 26 VoiceOver is reporting an extra tab
Feedback number: FB20451665

When building with Xcode 26, VoiceOver reports an extra tab when swiping through tabs. Please see the sample project below:

import SwiftUI

/* This is a sample project to show that I believe there is a VoiceOver bug
   in iOS 26. When swiping through tabs with VoiceOver active, there always
   appears to be an extra tab. Here I have 5 tabs; when on tab one, VO reads
   out "tab 1 of 6", then "tab 2 of 6", all the way to the last tab, when
   VoiceOver reads out "tab 5 of 6", never "tab 6 of 6". Is there a
   possibility that VoiceOver is picking up the underlying `more` tab and
   reading that out? This has also been reportedly found in the Files app
   here: https://www.applevis.com/comment/195441#comment-195441 */

struct ContentView: View {
    var body: some View {
        TabView {
            /// Activating this has VoiceOver telling us there are 6 tabs.
            Tab(RootTab.home.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.home.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.home.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.home.title.capitalized) tab")

            Tab(RootTab.diary.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.diary.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.diary.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.diary.title.capitalized) tab")

            Tab(RootTab.meals.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.meals.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.meals.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.meals.title.capitalized) tab")

            Tab(RootTab.knowledge.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.knowledge.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.knowledge.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.knowledge.title.capitalized) tab")

            Tab(RootTab.profile.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.profile.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.profile.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.profile.title.capitalized) tab")

            /// Activating this also has VoiceOver telling us there are 6 tabs.
            // ForEach(RootTab.allCases, id: \.self) { tab in
            //     Text("This is the \(tab.title.capitalized) screen")
            //         .tabItem {
            //             Label(tab.title.capitalized, systemImage: "circle.fill")
            //         }
            //         .accessibilityLabel("\(tab.title.capitalized) tab")
            //         .accessibilityHint("Double tap to open the \(tab.title.capitalized) tab")
            // }
        }
    }

    enum RootTab: CaseIterable {
        case home
        case diary
        case meals
        case knowledge
        case profile

        var title: String {
            switch self {
            case .home: "home"
            case .diary: "diary"
            case .meals: "meals"
            case .knowledge: "knowledge"
            case .profile: "profile"
            }
        }
    }
}

I'm curious whether anyone else can see this issue, or whether anyone knows of a workaround for it.
3
0
2k
Oct ’25
AirPods Pro 3 HRV Data Access Through HealthKit?
Hey everyone, I'm working on a health app that's heavily focused on HRV tracking and analysis, and I'm trying to figure out what's actually possible with AirPods Pro 3 from a developer standpoint. The hardware clearly has a much better heart rate sensor than the previous generation, but I'm hitting some walls when it comes to actually accessing the data I need.

Here's the situation I'm dealing with: when I query HealthKit for HRV samples, I'm not seeing anything coming from AirPods Pro 3. The device is obviously capable of tracking heart rate continuously during workouts and listening sessions, and from what I've read about the hardware, it should theoretically be able to capture the inter-beat intervals needed for HRV calculation. But either that data isn't being processed on-device, or it's just not being made available through the standard HealthKit data types that third-party apps can access.

What I'm really after is either direct HRV metrics (like SDNN, which Apple Watch already provides through HKQuantityTypeIdentifierHeartRateVariabilitySDNN) or, even better, access to the raw R-R interval data. With R-R intervals, I could calculate RMSSD, pNN50, and other time-domain and frequency-domain HRV metrics that are extremely valuable for tracking recovery, autonomic nervous system balance, and stress levels. This would be especially useful since a lot of users wear AirPods during activities when they're not wearing their Apple Watch.

Has anyone managed to find a way to pull this data from AirPods Pro 3? Are there any private frameworks or entitlements I should be looking into, or is this just fundamentally not exposed to developers at the OS level right now? I've gone through the HealthKit documentation pretty thoroughly and haven't found anything that specifically addresses this, but I'm wondering if I'm missing something or if there are any known workarounds.

I'm also curious whether anyone has heard anything from Apple about future plans to expose this data. It seems like a missed opportunity given how capable the hardware is and how much value developers could provide with access to this physiological data. I'd love to hear if anyone else is working on similar features or has insights into the technical limitations here.
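For anyone who wants to reproduce the check, a minimal sketch of the HealthKit query in question (assuming read authorization for HRV was already requested); it prints recent SDNN samples along with the source that produced each one, which is where AirPods-sourced samples would surface if they existed. Note that HealthKit does expose raw beat-to-beat data from Apple Watch via HKHeartbeatSeriesSample and HKHeartbeatSeriesQuery, which may partly cover the R-R interval need:

import HealthKit

let healthStore = HKHealthStore()

func listRecentSDNNSamples() {
    // SDNN is the HRV quantity type the post mentions.
    guard let sdnnType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN) else { return }
    let sort = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
    let query = HKSampleQuery(sampleType: sdnnType, predicate: nil,
                              limit: 50, sortDescriptors: [sort]) { _, samples, _ in
        for case let sample as HKQuantitySample in samples ?? [] {
            let ms = sample.quantity.doubleValue(for: .secondUnit(with: .milli))
            // sourceRevision identifies the hardware/app that wrote the sample.
            print("\(ms) ms SDNN from \(sample.sourceRevision.source.name)")
        }
    }
    healthStore.execute(query)
}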
1
0
615
Oct ’25
How to set an accessibility label on NSTextAttachment?
I have the following method to insert @mentions into a text field:

func insertMention(user: Token, at range: NSRange) {
    let tokenImage: UIImage = renderMentionToken(text: "@\(user.username)")
    let attachment = NSTextAttachment()
    attachment.image = tokenImage
    attachment.bounds = CGRect(x: 0, y: -3, width: tokenImage.size.width, height: tokenImage.size.height)
    attachment.accessibilityLabel = user.username
    attachment.accessibilityHint = "Mention of \(user.username)"

    let attachmentString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
    attachmentString.addAttribute(.TokenID, value: user.id, range: NSRange(location: 0, length: 1))
    attachmentString.addAttribute(.Tokenname, value: user.username, range: NSRange(location: 0, length: 1))

    let mutableText = NSMutableAttributedString(attributedString: textView.attributedText)
    mutableText.replaceCharacters(in: range, with: attachmentString)
    mutableText.append(NSAttributedString(string: " "))
    textView.attributedText = mutableText
    textView.selectedRange = NSRange(location: range.location + 2, length: 0)
    mentionRange = nil
    tableView.isHidden = true
}

When I use Xcode's Accessibility Inspector to inspect the text input, the inserted token is not read by the inspector; instead, a whitespace is shown for the token. I want to set the accessibility label to the string content of the NSTextAttachment. How?
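One possible workaround, offered as a hedged sketch rather than a confirmed fix: render the mention as styled text instead of an image attachment, so VoiceOver reads it like ordinary text; the custom .TokenID and .Tokenname keys come from the post's own project:

import UIKit

func mentionAttributedString(for user: Token) -> NSAttributedString {
    let attributes: [NSAttributedString.Key: Any] = [
        .foregroundColor: UIColor.systemBlue,
        .backgroundColor: UIColor.systemBlue.withAlphaComponent(0.15),
        .TokenID: user.id,          // custom keys defined in the post's project
        .Tokenname: user.username,
    ]
    // VoiceOver reads "@username" as plain text; no attachment involved.
    return NSAttributedString(string: "@\(user.username)", attributes: attributes)
}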
1
1
849
Jul ’25
App Store policy for apps that allow users to unlock contact details of the person who posted content
We have an app under development that allows musicians to unlock the contact details of people who posted about an upcoming event. The musician pays a fee to unlock these contact details. Both the musician and the post owner are registered users. We will reveal the same contact info that the post owner used for account signup verification.

Questions:

1. Is this allowed (given that we obtain consent to share contact info with other people and clearly mention this in our privacy policy)?
2. If yes, will we have to use App Store in-app purchase to facilitate this transaction, or are we free to use a payment processor such as Stripe?
0
1
612
Feb ’25
Siri misreads local currency in notifications (Bug reported, still unresolved)
I'm experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.

Issue details: My iPhone is set to Region: Chile and Language: Spanish (Chile). In Chile, the currency symbol $ represents Chilean pesos (CLP), not US dollars. A notification with the text

let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"

is read aloud by Siri as "¡Has recibido un pago por 5.000 dólares!" (English: "You have received a payment of five thousand dollars!") instead of the correct "¡Has recibido un pago por 5.000 pesos!" (English: "You have received a payment of five thousand pesos!").

Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177

This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:

• watchOS, iPadOS, and macOS: Siri misreads currency values in various system interactions.
• Siri's currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
• Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
• Apple Intelligence interactions are also affected; for example, asking Siri to "read my latest emails" when one of them contains a monetary value results in Siri misreading the currency.

I have submitted a bug report via Feedback Assistant; the Feedback ID is FB16561348. This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars. Has anyone found a workaround, or is there any update from Apple on this?
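Until this is fixed, one hedged workaround sketch, assuming the app controls its own notification text: avoid the bare $ symbol, which Siri appears to read as USD, and spell out the currency instead:

import UserNotifications

let content = UNMutableNotificationContent()
let amount = 5_000
let formatted = amount.formatted(.number.locale(Locale(identifier: "es_CL")))  // "5.000"
// Spelling out the unit sidesteps Siri's $-means-dollars assumption.
content.body = "¡Has recibido un pago por \(formatted) pesos!"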
1
1
664
Feb ’25
Apple is lying about its commitment to accessibility on macOS
I've just received an email from Apple regarding Global Accessibility Awareness Day and some forthcoming sessions promoting their accessibility features. What a joke. For many years, Apple has refused to provide the most basic accessibility requirement on macOS: LET USERS DISABLE ALL NON-CONSENSUAL, UNSOLICITED ANIMATIONS AND OTHER UI CONVULSIONS.

The scourge of animations started with macOS Lion. Fortunately, many of them can be disabled through obscure Terminal commands (that is, if the user is lucky enough to discover them on some obscure internet resource). The "Reduce motion" control in System Settings is a fake option that doesn't do anything. And there are two especially glaring accessibility violations that cannot be disabled:

1. The scroll bar rollover highlight effect introduced in macOS 10.7.3. Every time you move the cursor over a scroll bar, the bar gets highlighted. This draws the user's attention to random scroll bars for no reason whatsoever, just because the cursor happens to pass over the bar at some point. HUNDREDS of unnecessary, annoying distractions daily!

2. The expand/collapse animation of NSOutlineView (such as when opening/closing a folder in the Finder's list view, as well as in any other app that uses outline views). It's extremely annoying, distracting, and time-wasting.

All feedback submitted about this through the years remains mostly ignored (except for a few cases where I received ridiculous replies from employees who, apparently, are barely familiar with Macs in general). Apple does NOT care about accessibility. Not only that, but it's obvious that Apple is, in fact, intentionally abusing those users who can't tolerate distracting, time-wasting animations and UI convulsions.
0
1
224
Apr ’25
Wi-Fi connectivity issue: captive.apple.com returns "application/octet-stream" instead of "text/html"
In our system, when a user enables a mobile hotspot and the system connects to it, the system verifies Wi-Fi availability by sending an HTTP GET request to http://captive.apple.com. Normally, the server returns:

HTTP status: 200 (OK)
Content-Type: text/html

This has always been used as a sign of normal connectivity.

Issue: Since last Friday, the server sometimes responds with:

Content-Type: application/octet-stream

When this occurs, our system determines that the network is unavailable and displays a connection warning (a "!" icon).

Questions: Has Apple recently made any backend or CDN configuration changes to captive.apple.com that could affect the response type? Any advice on how we can solve this problem? Thanks!
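One hedged mitigation sketch (an assumption on my part, not Apple guidance): key the probe off the 200 status and the well-known "Success" body of Apple's captive-portal endpoint rather than the Content-Type header:

import Foundation

// Note: plain-HTTP loads need an App Transport Security exception.
func probeConnectivity(completion: @escaping (Bool) -> Void) {
    let url = URL(string: "http://captive.apple.com/hotspot-detect.html")!
    URLSession.shared.dataTask(with: url) { data, response, _ in
        let statusOK = (response as? HTTPURLResponse)?.statusCode == 200
        let body = data.flatMap { String(data: $0, encoding: .utf8) } ?? ""
        // The endpoint historically returns a tiny HTML page whose body
        // contains "Success" when no captive portal is interfering.
        completion(statusOK && body.contains("Success"))
    }.resume()
}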
2
1
944
Oct ’25
Unable to set the Chinese dialect of AVSpeechSynthesisVoice in iOS 18
AVSpeechSynthesizer on some iOS 18 devices has a bug: it will always read Chinese content, such as

AVSpeechUtterance(string: "中文") // any Chinese content

in the dialect specified under Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect I specify in AVSpeechUtterance.voice:

AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

Setting the Chinese dialect of AVSpeechSynthesisVoice with "zh-HK" or "zh-TW" worked on iOS 17 and below. My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.

Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 is tested) is freshly installed (not upgraded from iOS 17 or below, nor a backup restored after a fresh installation of iOS 18), the bug above does not happen. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug above happens.

This bug puzzles me because I need both dialects of Chinese to be read aloud one after the other, but as reported by many users, on most iOS 18 devices (since a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app will read Cantonese twice or Mandarin twice (depending on Spoken Language in Settings). It is this iOS 18 bug that makes my app unable to perform the expected behavior.

Would Apple developers look into this and advise whether there is any possible workaround within app code to overcome this bug, or please fix it in an iOS 18 update. Thank you.
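For context, a minimal sketch of the intended usage with the standard AVFoundation API; the sample text is illustrative:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Speak the same text in Cantonese, then Mandarin, by setting the voice
// explicitly. On affected iOS 18 devices both utterances are reportedly
// spoken in the Settings-selected dialect instead.
func speakInBothDialects(_ text: String) {
    for language in ["zh-HK", "zh-TW"] {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: language)
        synthesizer.speak(utterance)
    }
}

speakInBothDialects("中文")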
1
1
95
Jun ’25
Feature Idea: Autonomous, Motion-Powered Clock Display on iPhone
Hey everyone, I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion and activates only when you look at it, without touching your main battery?

The Core Idea. Imagine this:

• Kinetic energy harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements: walking, picking up the phone, putting it in your pocket.
• Independent power source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
• Accelerometer-activated display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
• On-demand, ultra-low-power clock: Only when the accelerometer detects one of these specific gestures would the stored kinetic energy be used to illuminate just the pixels needed to display the time on the main OLED/AMOLED screen. The rest of the screen stays completely black (consuming no power on OLED).
• Automatic shut-off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.

Why This Matters. This isn't just a cool gimmick; it offers significant benefits:

• True battery independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
• Ultimate convenience: A "magical" interaction: just pick up your phone, and the time instantly appears. No taps, no button presses.
• Sustainable and innovative: Showcases practical energy harvesting in a consumer device, pushing boundaries for self-sufficient tech.
• Extreme energy efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.

This concept combines existing low-power sensing (the accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
1
1
112
Jun ’25
Custom prediction panel not working in Google Docs
I'm working on a macOS accessibility setup for a French-speaking user and I've hit a wall. (I'm not a developer; I'm trying to help my kid with dyslexia.)

I successfully built a custom word prediction panel using the Panel Editor (Keyboard) in macOS Accessibility > Keyboard > Accessibility Keyboard. Here's what I have so far:

• The prediction panel works system-wide: I can use it to type in Finder, Safari, Notes, TextEdit, and even browser search bars.
• The panel appears above all applications and suggestions show up correctly.
• However, it does not work inside Google Docs (tested in Chrome, Safari, and Firefox). Selecting a word from the panel does nothing in the Docs editor.

I suspect this is because:

• Google Docs does not use a standard macOS text input field.
• Docs is a web app that relies on custom JavaScript editors, contentEditable elements, and canvas rendering, so macOS Accessibility APIs (AXTextField, AXInsertText, etc.) don't register or inject text events.
• Accessibility tools like the Accessibility Keyboard rely on native macOS text input methods, which don't hook into Google Docs' custom editor.

Important: I'm not a programmer. I'd like to know if there is an easy fix or option in macOS, Google Chrome, or Google Docs that would make my custom prediction panel work, before going into custom development.

Technical setup:

• MacBook Air (M2, 2022)
• RAM: 8 GB
• macOS: Sequoia 15.3.1
• Language: French (system and keyboard)
• Accessibility Keyboard: enabled via Settings > Accessibility > Keyboard
• Custom panel: built using Panel Editor (Keyboard), named "Philemon Prédiction"
• Browsers tested: Chrome, Safari, Firefox (same issue)
• Behavior: panel is visible, suggestions appear, but inserting text does nothing in Google Docs

Has anyone worked around this limitation? Is there a simple setting, workaround, or accessibility option to bridge macOS Accessibility input with Google Docs' editor? Thanks a lot!
0
1
933
Aug ’25
Your app's binary includes references to HealthKit components, but the app still does not appear to include any primary features that require health or fitness data.
Your app's binary includes references to HealthKit components, but the app still does not appear to include any primary features that require health or fitness data.

Next Steps

To resolve this issue, please remove any HealthKit functionality from the app, as well as any references to this app's interactivity with HealthKit from the app or its metadata. This includes removing any HealthKit-related keys in the app's Info.plist or InfoPlist.strings files, as well as removing any calls to HealthKit APIs, including those from third-party platforms, from the app.
1
1
255
Oct ’25
iOS: How to maintain good app icon contrast in grayscale mode?
I'm developing an iOS app, and I've noticed that when the user enables Accessibility → Display & Text Size → Color Filters → Grayscale, my app icon loses a lot of visual contrast. The original colored version looks fine, but in grayscale it appears "flat" and harder to distinguish, unlike a pure black-and-white design.

What I want to achieve: ensure the app icon remains visually clear and high-contrast even when iOS renders it in grayscale; ideally, provide an alternate "high-contrast" app icon version when grayscale mode is enabled.

What I've tried:

• Increased color contrast in the original icon design.
• Added outlines and stronger shapes.
• Tested with grayscale filters in design tools.
• Researched the Asset Catalog and alternate icons, but found no documented API to detect or respond to grayscale mode.

Questions:

1. Is there any API in iOS that allows detecting when the system is in grayscale mode, so that I can programmatically switch to an alternate app icon?
2. If not, are there Apple-recommended best practices for designing app icons that still look clear in grayscale?
3. Are there any accessibility guidelines specifically addressing icon design for grayscale or color-blind modes?

Additional info: iOS version tested: iOS 17.5. Development in Swift + SwiftUI, using the Asset Catalog for icons. I am aware that iOS supports alternate icons via setAlternateIconName, but I haven't found a trigger for grayscale mode.
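On the first question: UIKit does expose a grayscale check, UIAccessibility.isGrayscaleEnabled, along with a change notification. A hedged sketch combining it with setAlternateIconName; the alternate icon name "HighContrastIcon" is an assumption and must be declared in the app's Info.plist/asset catalog:

import UIKit

func updateIconForGrayscale() {
    guard UIApplication.shared.supportsAlternateIcons else { return }
    // Passing nil restores the primary icon.
    let iconName = UIAccessibility.isGrayscaleEnabled ? "HighContrastIcon" : nil
    UIApplication.shared.setAlternateIconName(iconName) { error in
        if let error { print("Icon switch failed: \(error)") }
    }
}

// Re-run the check whenever the setting changes:
// NotificationCenter.default.addObserver(
//     forName: UIAccessibility.grayscaleStatusDidChangeNotification,
//     object: nil, queue: .main) { _ in updateIconForGrayscale() }

Be aware that iOS shows a system alert whenever the alternate icon changes, so switching automatically may feel intrusive to users.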
0
1
418
Aug ’25
"Captions" in the Accessibility Nutrition Label for text-based apps
My game app is text-based interactive fiction, containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users. Despite this, in the Accessibility Nutrition Label, I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would give deaf players the wrong impression that they can't enjoy our game; leaving it checked would imply that we do have A/V content with captions included.

In the WWDC video on this (https://developer.apple.com/videos/play/wwdc2025/224/), the presenter says:

"After we completed common tasks, we realized our app doesn't have any video or audio only content. In this case, we aren't going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app."

Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
3
1
123
Jun ’25