Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic


The accessibility app keeps opening by itself on my iPhone 12 Pro. Can anyone please help me?
On my iPhone 12 Pro, an accessibility app keeps opening by itself with the microphone on. It shows a blank screen, and every time I close it, it just reopens. I don't know why it keeps doing this, and it drives me crazy. I'm on the iOS 26 beta, but it was doing this on the previous release as well. Does anyone know what else to do?
0 replies · 0 boosts · 108 views · Jun ’25
Attaching procedural audio to an ARKit SCNNode
I'm developing an ARKit application in which I want to attach procedurally generated audio to detected planes in the environment. Using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, but integrating procedural audio via AVAudioSourceNode produces no sound and no error messages (see also my Stack Overflow post).

Working implementation with a static audio file:

let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)

Attempted implementation with procedural audio:

let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    // Audio generation code
    return noErr
}
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)

In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when wrapped in an SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What does not work is creating procedural audio with an AVAudioNode, as documented here: Apple docs.

Additionally, I explored SwiftShot, the WWDC18 AR game project, which uses SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics function correctly, but the audio does not play.

I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, this method does not appear to be available in Swift.

Any insights or guidance on this matter would be greatly appreciated.
0 replies · 0 boosts · 188 views · Apr ’25
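
For the ARKit procedural-audio question above, a minimal sketch of the baseline the poster says does work: an AVAudioSourceNode driven directly by an AVAudioEngine. The tone parameters and names (toneNode, sampleRate) are illustrative, the render-block pattern follows Apple's published signal-generator example, and the open question remains why the same node goes silent once wrapped in SCNAudioPlayer(avAudioNode:).

import AVFoundation

let sampleRate = 44_100.0
let frequency = 440.0
var phase = 0.0

// Render block fills each requested buffer with a sine wave.
let toneNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    let phaseStep = 2.0 * Double.pi * frequency / sampleRate
    for frame in 0..<Int(frameCount) {
        let sample = Float(sin(phase)) * 0.25   // keep the level comfortable
        phase += phaseStep
        if phase > 2.0 * Double.pi { phase -= 2.0 * Double.pi }
        for buffer in buffers {                  // write the sample to every channel
            let channelData: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
            channelData[frame] = sample
        }
    }
    return noErr
}

// Direct engine hookup: this configuration produces sound.
let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
engine.attach(toneNode)
engine.connect(toneNode, to: engine.mainMixerNode, format: format)
do { try engine.start() } catch { print("Engine failed to start: \(error)") }

When debugging the SceneKit path, it may be worth checking whether the renderer's own engine (the audioEngine property of SCNSceneRenderer) is running, since SCNAudioPlayer is expected to connect the node to that engine rather than to one you create yourself.
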
Accessibility & Inclusion
We are developing Apple Intelligence features for foreign markets and adapting them for iPhone 17 and later models. When the system language and the Siri language are not the same (for example, the system in English and Siri in Chinese), Apple Intelligence can become unusable. Are there any other conditions that could cause Apple Intelligence to be unavailable within an app, even when it has been enabled?
0 replies · 0 boosts · 354 views · 3d
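
For diagnosing this at runtime, one option, assuming the app can adopt the FoundationModels framework introduced alongside Apple Intelligence, is to ask the system model why it is unavailable. The reasons named in the comment are my reading of the documented cases and worth verifying against the current SDK:

import FoundationModels

let model = SystemLanguageModel.default
switch model.availability {
case .available:
    print("Apple Intelligence model is ready")
case .unavailable(let reason):
    // Documented reasons include .deviceNotEligible,
    // .appleIntelligenceNotEnabled, and .modelNotReady.
    print("Unavailable: \(reason)")
}
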
Using WebSocket for BCI Click Input in visionOS - FocusState vs. System-Level Limitations
Hi everyone,

My team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project. It aims to support individuals with locked-in syndrome by using brain-computer interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.

Problem 1: Simulating eye tracking in the Simulator. We are testing onHover with "Send pointer to the device" under I/O > Input in the Simulator, and while it mostly works (a bit laggy), we found that onHover won't function on actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround, or an official Apple-recommended way, to simulate focus-based gaze detection without a real Vision Pro?

Problem 2: A WebSocket-triggered "click" doesn't work outside the app. We successfully use a WebSocket to send a custom signal (a "1" from the brain-signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works. We suspect this is due to sandbox restrictions or a lack of system-level permissions. Is it possible in any way to:
• Trigger interaction events outside the app using custom input (like BCI via WebSocket)?
• Access system-wide click/tap simulation APIs from within visionOS apps?
• Integrate this with accessibility services (like Voice Control or AssistiveTouch)?

We'd appreciate any official guidance, or tips from others building similar accessibility apps with alternative input methods in visionOS. Thanks in advance for any insight you can provide!
0 replies · 0 boosts · 113 views · Apr ’25
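
For the in-app half of Problem 2 above, a minimal sketch of the pattern, assuming URLSessionWebSocketTask as the transport. The endpoint URL and view names are hypothetical, and only elements inside the app can be activated this way, since visionOS exposes no public API for synthesizing system-wide taps:

import SwiftUI

struct MindTapTargetView: View {
    @State private var clickCount = 0
    // Hypothetical local bridge that relays BCI signals as WebSocket text frames.
    private let socket: URLSessionWebSocketTask = {
        let task = URLSession.shared.webSocketTask(with: URL(string: "ws://localhost:8080/bci")!)
        task.resume()
        return task
    }()

    var body: some View {
        Button("Selected target (\(clickCount))") { clickCount += 1 }
            .task { await listen() }
    }

    // Treat every incoming "1" as an activation of the in-app target.
    private func listen() async {
        while let message = try? await socket.receive() {
            if case .string("1") = message {
                await MainActor.run { clickCount += 1 }
            }
        }
    }
}
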
Japanese “Hattori” TTS voice missing from Settings > General > Read & Speak > Voices > Japanese on iOS 26
Steps: open Settings > General > Read & Speak > Voices > Japanese; "Hattori" is not listed and cannot be downloaded.
Expected: Hattori is available to download and select.
Actual: Hattori is absent from the catalog.
Regression: the voice was available on iOS 18.x on the same device.
0 replies · 0 boosts · 387 views · Sep ’25
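
To confirm what the device itself reports, the installed Japanese voices can be enumerated with AVSpeechSynthesisVoice. Note that speechVoices() lists only built-in and downloaded voices, so it can confirm Hattori's absence from the device but not why the download catalog omits it:

import AVFoundation

// Print every installed Japanese text-to-speech voice.
for voice in AVSpeechSynthesisVoice.speechVoices() where voice.language.hasPrefix("ja") {
    print(voice.name, voice.identifier, voice.quality.rawValue)
}
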
App Store Connect – “Unable to Handle This Request” Error
Hello,

I'm currently unable to access App Store Connect. When I try to open https://appstoreconnect.apple.com, I receive the following error message: "appstoreconnect.apple.com is currently unable to handle this request."

I've tried the following steps, but the issue persists:
• Cleared browser cache and cookies
• Tried different browsers (Safari, Chrome)
• Attempted from multiple devices and networks

Is this a known issue, or is there any workaround available? I would appreciate any help or an update on the current status.

Thank you
0 replies · 0 boosts · 104 views · Jun ’25
Add VoiceOver touch gesture guidance for frame iframe in webView and Safari web
Please update Accessibility OS Settings for VoiceOver in iPhone iOS and iPadOS to include frames on the Rotor, and to make web navigation and component gestures easier to find and assign. Please add content to the iPhone and iPad Apple User Guide to use VoiceOver in web navigation with touch gestures. Specifically... iframes. There is no clear guidance in Apple documentation for VoiceOver users in iPhone or iPadOS to access iframes with touch gestures. A common belief as written on AppleVis, other blogs, and internet searches is that iframes in Safari or a webView in an app are only available with explore by touch. If explore by touch is the only option for some interactions, that needs to be included in Apple User Guides. If not, details on equivalent touch gestures for VO that have keyboard interactions in Mac need to be clear for users. VoiceOver for Mac includes a default keyboard interaction of VO-Command-F in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac). A user can include a rotor option for web navigation for iframes. VoiceOver for iPhone and iPad does not include a default swipe gesture assigned to frames. An option is not available for the Rotor. While there is iPhone User Guide guidance that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that for adding this gesture, "Move to the next frame" is tucked into the advanced navigation commands for VoiceOver Accessibility Settings in the OS. At least in my phone, the word "frame" was not searchable despite the All Commands screen using a search bar.
1 reply · 0 boosts · 127 views · Apr ’25
iPhone 14 Plus back camera shows a black screen
I am facing an issue with the back camera on my iPhone 14 Plus: it shows a black screen. My iPhone was manufactured between April 2023 and April 2024, but it is still not eligible for the Apple service program, even though it has the same issue the program covers. Why is it not eligible?
1 reply · 1 boost · 374 views · Jan ’25
Bulgarian Cyrillic alphabet appears to be Russian, but should look Bulgarian
Hello,

If you use the Bulgarian keyboard, you get these characters: явертъуиопюасдфгхйклшщзьцжбнмч

This isn't really right for Bulgaria: in Bulgarian letterforms, т should look like m, д should look like g, and other characters should look like rotated or mirrored Latin characters (e.g., г should look like a backwards s). Compare the Bulgaria Wikipedia page in Bulgarian: https://bg.m.wikipedia.org/wiki/%D0%91%D1%8A%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F with the same page in Russian: https://ru.m.wikipedia.org/wiki/%D0%91%D0%BE%D0%BB%D0%B3%D0%B0%D1%80%D0%B8%D1%8F and notice that the letters look different.

In short, the iOS Bulgarian font is just Russian Cyrillic, and that seems like an unintended bug rather than an intentional stylistic choice.
1 reply · 0 boosts · 329 views · Feb ’25
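
Fonts select Bulgarian letterforms through the OpenType 'locl' feature based on the text's language tag, so one test worth running on the report above, assuming the font in use ships Bulgarian alternates at all, is to tag a string as Bulgarian via the Core Text language attribute and compare the rendering:

import UIKit
import CoreText

// Tag the text as Bulgarian ("bg") so 'locl' alternates can be applied
// by fonts that carry them; whether the system font does is the open question.
let attributed = NSAttributedString(
    string: "явертъуиопюасдфгхйклшщзьцжбнмч",
    attributes: [
        .font: UIFont.systemFont(ofSize: 24),
        NSAttributedString.Key(kCTLanguageAttributeName as String): "bg"
    ]
)
let label = UILabel()
label.attributedText = attributed
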
Unexpected behaviour of hardware keyboard focus in UITests
Hello! I was faced with unexpected behavior of hardware keyboard focus in UITests.

Description of the problem: When running UITests on the iOS Simulator with both "Full Keyboard Access" and "Connect Hardware Keyboard" options enabled, there is a noticeable delay between keyboard actions used for focus management (like pressing Tab or the arrow keys). The delay seems to increase with repeated input, which suggests that events are being queued instead of processed immediately; the reproduction steps below show why I make that assumption.

Steps to reproduce:
1. Launch the iOS Simulator.
2. Enable both "Full Keyboard Access" and "Connect Hardware Keyboard" in the Simulator settings.
3. Run a UITest on a target application (ideally an endless or long-running test).
4. Once the app has launched, press the Tab key several times and observe the delay in focus movement.
5. Optionally, press the Tab or arrow keys rapidly, then stop the UITest. After stopping, you'll see a burst of rapid focus changes.

Expected results: Keyboard actions (like Tab) are handled immediately and the UI focus updates smoothly during UITests.

Actual results: There was a delay of 4 to 10 seconds (and more) between pressing keys and seeing a response, and all queued keyboard events (used for managing focus) were performed at once after stopping the UITest.

Environment: Xcode Version 16.3 (16E140); Simulator iPhone 16 Pro (iOS 18.4 and 18.1); Simulator iPad Pro 11-inch (M4) (iPadOS 17.5).
1 reply · 2 boosts · 178 views · Apr ’25
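
A repro sketch of the report above as a UITest. Sending Tab via typeText("\t") is an assumption about how hardware-keyboard events are synthesized on the Simulator, and the timing print is only a crude way to observe the queued-event behavior:

import XCTest

final class KeyboardFocusDelayUITests: XCTestCase {
    func testTabFocusLatency() {
        let app = XCUIApplication()
        app.launch()

        for _ in 0..<10 {
            let start = Date()
            app.typeText("\t")   // intended to act as a hardware Tab press
            // In the reported bug, the focus change lands seconds later,
            // or only in a burst after the test stops.
            RunLoop.current.run(until: Date().addingTimeInterval(1))
            print("Tab sent; elapsed \(Date().timeIntervalSince(start))s")
        }
    }
}
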
Feature Request – Bionic Reading Accessibility Setting
I'd love to see Apple implement a Bionic Reading feature as a system-wide accessibility option. This type of reading aid renders the first part of each word in bold to help guide the eyes and improve comprehension. It's been shown to be especially helpful for people with ADHD, dyslexia, and other neurodivergent needs. Having a toggle in Settings > Accessibility would be life-changing. Ideally, it would be:
• Enabled system-wide or per app
• Customizable in how much of each word is bolded
• Available in Safari, Messages, Books, News, etc.
1 reply · 1 boost · 105 views · Apr ’25
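
Until something like the request above ships system-wide, the core transform is easy to prototype inside an app's own text rendering. A minimal sketch, where the 40 percent bold ratio is an arbitrary choice rather than anything Apple-defined:

import UIKit

// Bold roughly the first 40% of each word to approximate the
// "bionic reading" style described above.
func bionicString(_ text: String, font: UIFont = .systemFont(ofSize: 17)) -> NSAttributedString {
    let bold = UIFont.boldSystemFont(ofSize: font.pointSize)
    let result = NSMutableAttributedString()
    for (i, word) in text.split(separator: " ", omittingEmptySubsequences: false).enumerated() {
        if i > 0 { result.append(NSAttributedString(string: " ", attributes: [.font: font])) }
        let boldCount = max(1, Int(Double(word.count) * 0.4))
        let head = String(word.prefix(boldCount))
        let tail = String(word.dropFirst(boldCount))
        result.append(NSAttributedString(string: head, attributes: [.font: bold]))
        result.append(NSAttributedString(string: tail, attributes: [.font: font]))
    }
    return result
}

// Usage: label.attributedText = bionicString("Reading aid for neurodivergent users")
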
How to set an accessibility label on an NSTextAttachment?
I have the following method to insert @mentions into a text field:

func insertMention(user: Token, at range: NSRange) {
    let tokenImage: UIImage = renderMentionToken(text: "@\(user.username)")
    let attachment = NSTextAttachment()
    attachment.image = tokenImage
    attachment.bounds = CGRect(x: 0, y: -3, width: tokenImage.size.width, height: tokenImage.size.height)
    attachment.accessibilityLabel = user.username
    attachment.accessibilityHint = "Mention of \(user.username)"

    let attachmentString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
    attachmentString.addAttribute(.TokenID, value: user.id, range: NSRange(location: 0, length: 1))
    attachmentString.addAttribute(.Tokenname, value: user.username, range: NSRange(location: 0, length: 1))

    let mutableText = NSMutableAttributedString(attributedString: textView.attributedText)
    mutableText.replaceCharacters(in: range, with: attachmentString)
    mutableText.append(NSAttributedString(string: " "))
    textView.attributedText = mutableText
    textView.selectedRange = NSRange(location: range.location + 2, length: 0)

    mentionRange = nil
    tableView.isHidden = true
}

When I use Xcode's Accessibility Inspector to inspect the text input, the inserted token is not read by the inspector; instead, a whitespace is shown for the token. I want to set the accessibility label to the string content of the NSTextAttachment. How can I do that?
1 reply · 1 boost · 853 views · Jul ’25
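
One direction worth testing for the question above, strictly a sketch and not a confirmed fix: move the label onto an NSTextAttachment subclass so the accessibility properties live on the object itself, then verify with the Accessibility Inspector whether the token is announced:

import UIKit

// Hypothetical subclass: the attachment itself reports the mention text.
// Whether VoiceOver consults these properties for inline attachments is
// exactly what needs verifying with the Accessibility Inspector.
final class MentionAttachment: NSTextAttachment {
    var username: String = ""

    override var isAccessibilityElement: Bool {
        get { true }
        set { super.isAccessibilityElement = newValue }
    }

    override var accessibilityLabel: String? {
        get { "@\(username)" }
        set { super.accessibilityLabel = newValue }
    }
}

If that has no effect, a fallback that reads reliably is to keep the literal "@username" characters in the attributed string and draw the token pill as a purely visual decoration.
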
Handling Keyboard Hotkeys and Shortcuts across Multiple Languages
We have a requirement to manage the shortcuts and hotkeys in our application in a way that is intuitive and fully supports multiple languages. Our current understanding is that most universal shortcuts and hotkeys on macOS/iOS are expressed using English/Latin characters, and when a purely foreign-language physical or virtual keyboard is the input device, it is unclear to us how the user would invoke such a hotkey.

Considering that some keyboards have no Latin characters at all, managing shortcuts and hotkeys in those environments becomes a rather difficult task. Taking a very simple example, the shortcut for printing a page is Command/Control + P. This can be an issue on non-English keyboards such as Arabic, where not only is there no letter P, there is also no equivalent phonetic character, since the language itself does not have that sound.

Also, when we want a hotkey to be customizable by the user, how would the user express which key combination they want for a given action?

Given these conditions, and in order to provide the most comprehensive and optimal experience for users in their own language, what does Apple recommend we do for hotkey/shortcut support with non-Latin-script languages?
1 reply · 0 boosts · 396 views · 6d
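
On iOS and iPadOS, part of the question above is handled by UIKeyCommand's automatic localization flags (iOS 15 and later), which let the system remap a Latin key equivalent to the active keyboard layout. A minimal sketch, with the handler purely illustrative:

import UIKit

final class DocumentViewController: UIViewController {
    override var keyCommands: [UIKeyCommand]? {
        let printCommand = UIKeyCommand(
            title: "Print",
            action: #selector(printDocument),
            input: "p",
            modifierFlags: .command
        )
        // Ask the system to relocate the key equivalent on non-Latin
        // layouts and mirror it for right-to-left layouts where appropriate.
        printCommand.allowsAutomaticLocalization = true
        printCommand.allowsAutomaticMirroring = true
        return [printCommand]
    }

    @objc private func printDocument() {
        print("Print invoked")   // placeholder action
    }
}
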
The virtual home button is not displayed in Developer Mode.
I have a question about Developer Mode on iPhone. Currently, the home button on my iPhone SE (2nd generation) is broken, so I use AssistiveTouch to display a virtual home button. However, in Developer Mode, the virtual home button does not appear, making it impossible to enable Developer Mode. Is there any way to enable Developer Mode in this situation?
1 reply · 2 boosts · 258 views · Feb ’25
The camera preview cannot be displayed full screen
I downloaded the official camera sample code (https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview). It is a .swiftpm package, so I created a SwiftUI project, copied the sample code into it, built it, and ran it on an iPhone 13 for testing. I found black empty areas at the top and bottom of the application interface, meaning the camera preview does not fill the screen. I have tried many approaches but cannot get a full-screen preview. How can I modify the code?
1 reply · 0 boosts · 180 views · Apr ’25
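
For an Image-based viewfinder like the one in that sample, the usual fix is to scale the frame to fill and extend under the safe areas. A sketch of the general pattern, with view and property names illustrative rather than the sample's exact code:

import SwiftUI

struct FullScreenViewfinder: View {
    let frameImage: Image   // the latest camera frame from the capture pipeline

    var body: some View {
        GeometryReader { geo in
            frameImage
                .resizable()
                .scaledToFill()   // crop the edges instead of letterboxing
                .frame(width: geo.size.width, height: geo.size.height)
                .clipped()
        }
        .ignoresSafeArea()        // extend under the notch and home indicator
    }
}

Note that scaledToFill crops whatever part of the frame falls outside the screen's aspect ratio; that trade-off is inherent to filling a 19.5:9 display with a 4:3 camera feed.
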