Hello!
I ran into unexpected behavior of hardware keyboard focus in UITests.
A clear description of the problem
When running UITests on the iOS Simulator with both "Full Keyboard Access" and "Connect Hardware Keyboard" options enabled, there is a noticeable delay between keyboard actions used for focus management (like pressing Tab or the arrow keys). The delay seems to increase with repeated input, which suggests that events are being queued instead of processed immediately.
I'll explain below why I believe this.
A step-by-step set of instructions to reproduce the problem
Launch the iOS Simulator.
Enable both "Full Keyboard Access" and "Connect Hardware Keyboard" in the Simulator settings.
Run a UITest on a target application (ideally an endless or long-running test; a sketch of one follows these steps).
Once the app is launched, press the Tab key several times.
Observe the delay in focus movement.
Optionally, press the Tab or arrow keys rapidly, then stop the UITest.
After stopping, you’ll see a burst of rapid focus changes.
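For step 3, any test that simply keeps the app in the foreground long enough will do. A minimal sketch (the class and test names are mine, not from the original report):

import XCTest

final class KeyboardFocusReproTests: XCTestCase {
    func testKeepAppAliveForManualTabPresses() {
        let app = XCUIApplication()
        app.launch()
        // Keep the test (and the app) running for five minutes so the
        // hardware Tab / arrow keys can be pressed by hand.
        let never = expectation(description: "never fulfilled")
        _ = XCTWaiter.wait(for: [never], timeout: 300)
    }
}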
What results you expected
I expected keyboard actions (like Tab) to be handled immediately and the UI focus to update smoothly during UITests.
What results you saw
There was a 4–10 second (and sometimes longer) delay between pressing keys and seeing a response. All of the queued keyboard events (used for managing focus) are then performed at once after stopping the UITest.
The version of Xcode you are using
Xcode: Version 16.3 (16E140)
Simulator: iPhone 16 Pro (iOS 18.4 and 18.1)
Simulator: iPad Pro 11-inch (M4) (iPadOS 17.5)
Hey all — hoping someone here has dealt with this before.
I’m testing an iOS app via TestFlight, and when I originally got access, I didn’t have an iPhone. So I signed in with my Apple ID on my girlfriend’s iPhone and used TestFlight there. Everything worked fine.
Now I finally have my own iPhone (iPhone 16), downloaded TestFlight, signed in with the same Apple ID, and had the developer resend the invite. But when I tap "Open in TestFlight" from the invite email, I get this error:
“Couldn’t load app because your Apple account has already been associated to this app.”
The dev tried removing me as a tester and re-adding me, I’ve deleted TestFlight from both phones, rebooted, reinstalled, waited in between — still no luck. Even tried opening the invite link in Safari instead of Mail.
Is there any way to get Apple to fully reset the association with the old device so I can use TestFlight on my new iPhone? Or do I really need to make a new Apple ID just to get around this?
Any help would be huge — thanks!
I’m seeing a layout issue in SwiftUI on iOS 26 that only reproduces with specific Accessibility Motion settings.
Steps to reproduce
1. Open Settings → Accessibility → Motion.
2. Enable Reduce Motion and Prefer Cross-Fade Transitions.
3. Launch an app with a SwiftUI TextField.
4. Tap the field to show the keyboard.
5. Dismiss the keyboard (tap outside, swipe down, etc.).
Expected:
After the keyboard is dismissed, the view’s bottom safe area / layout should return to normal.
Actual:
The view continues to reserve space equal to the keyboard height — as if the keyboard were still visible. UI anchored to the safe area remains shifted upward until the view is reloaded.
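A minimal view that shows the symptom (this sketch is mine; any SwiftUI screen with content anchored to the bottom safe area should behave the same way):

import SwiftUI

struct KeyboardInsetRepro: View {
    @State private var text = ""

    var body: some View {
        VStack {
            Spacer()
            // Anchored to the bottom safe area. After the keyboard is
            // dismissed this should drop back down, but with Reduce
            // Motion + Prefer Cross-Fade Transitions enabled it stays
            // shifted up by the keyboard height.
            TextField("Type here", text: $text)
                .textFieldStyle(.roundedBorder)
                .padding()
        }
    }
}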
I'm facing a bizarre issue with Apple's Accessibility APIs. I am registering an AXObserver that listens for, among other things, the kAXSelectedTextChangedNotification. For many new users, the kAXSelectedTextChangedNotification is not triggered, even though they have enabled Accessibility permission for the app. Other notifications are getting through (kAXWindowMovedNotification, kAXWindowResizedNotification, kAXValueChangedNotification, etc. - full list here), just not the kAXSelectedTextChangedNotification!
We've found that we can reproduce the error by removing accessibility permission for the app and rebooting our computers. After restarting and reenabling accessibility permissions, the kAXSelectedTextChangedNotification was not received, even though other notifications were fine.
Strangely, the issue can be resolved by launching Apple's Accessibility Inspector app on an impacted computer. Once the Accessibility Inspector is loaded, the kAXSelectedTextChangedNotifications start coming through as expected. This implies to me that either:
We are missing some needed setup when starting the observers. Accessibility Inspector gets it right, thus ‘starting’ the system properly.
Accessibility Inspector is using some Apple private APIs that we don’t have access to.
Things I’ve tried:
I've tried subscribing the AXSelectedTextChangedNotification to different AXUIElements, including the SystemWide element, the Application element, and children elements from the AXApplication. None of these received the kAXSelectedTextChangedNotification, until Accessibility Inspector is booted up. No surprises here, as Apple's documentation confirms that you should add the notification to the root Application AXUIElement if you want to receive notifications for all its children.
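For reference, a minimal sketch of the observer setup being described (assuming a known pid for the target app; error handling trimmed):

import ApplicationServices

func startObserver(for pid: pid_t) -> AXObserver? {
    var observer: AXObserver?
    let callback: AXObserverCallback = { _, element, notification, _ in
        print("Received \(notification) from \(element)")
    }
    guard AXObserverCreate(pid, callback, &observer) == .success,
          let observer else { return nil }

    // Subscribe on the root application element, per the documentation.
    // This is the subscription that stays silent for
    // kAXSelectedTextChangedNotification until Accessibility Inspector
    // is launched.
    let appElement = AXUIElementCreateApplication(pid)
    AXObserverAddNotification(observer, appElement,
                              kAXSelectedTextChangedNotification as CFString,
                              nil)
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       .defaultMode)
    return observer
}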
I had a theory that the issue might be due to my code calling AXUIElementCreateApplication multiple times, possibly creating multiple "Applications" in Apple's Accessibility implementation. If that’s the case, the notifications might be sent to the wrong application AXUIElement. However, refactoring my code to only call AXUIElementCreateApplication once didn't resolve the issue.
I thought the issue may be caused by subscribing the AXSelectedTextChangedNotification on the high-level application element (at odds with Apple's documentation). I've tried traversing the child AXUIElements until we find one with the kAXSelectedTextAttribute and then subscribing to that. This did not resolve the issue. I don’t think it's the correct path to continue exploring, given that the notifications are received correctly after AccessibilityInspector is launched.
There is one exception to the above: if I add the kAXSelectedTextChangedNotification listener to a specific text field AXUIElement, I do receive the notification on that text field. However, this is not practical; I need a solution that will work for all text fields within an app. The Accessibility Inspector appears to be doing something that causes the selected-text-changed notifications to be correctly passed up to the high-level application AXUIElement.
Another thought is that I could traverse the entire Accessibility hierarchy and add listeners to every subview that has the kAXSelectedTextAttribute. However, I don't like this as a long-term solution. It will be slow and incomplete: new elements get added and removed frequently. I just want the kAXSelectedTextChangedNotification to be received by the high-level Application AXUIElement, which the documentation suggests it should be. I also have evidence that this can work, since notifications start coming through after Accessibility Inspector is launched. It's just a matter of discovering how to replicate whatever Accessibility Inspector is doing.
An interesting wrinkle: I implemented the 'traverse' strategy above, but was surprised by how few elements were in the hierarchy. Most apps only go down ~2-3 levels, which didn't seem right to me. Perhaps the Accessibility tree isn't fully initialized? I tried adding a 5-second delay to allow more initialization time, but it didn't change anything.
Does anyone have any ideas? Here's our file.
Hey everyone!
I am developing a screen time limit app to help people spend less time in distracting apps.
It works this way: users choose the apps that are unhealthy (distracting) for them, and on the other side productivity apps. In the app you can exchange time spent on healthy habits for time to scroll or use the distracting apps.
Social media loved this idea, and the app already has 100k followers without even being launched yet.
So I am waiting on just one feature permission from Apple, and they have not given me any answer since I applied 3 weeks ago.
There are a lot of similar apps on the market, and this feature exists in other screen time limit apps.
Why is app blocking permission needed?
Time Exchange Functionality:
Users independently select which apps are productive and which are distracting for them.
The system blocks the "negative" apps until the user accumulates enough time in the "positive" ones. This encourages healthy device usage (a sketch of how this looks with Apple's Screen Time API follows this list).
Full User Control:
All apps to be blocked are manually selected by the user in the settings.
The extension does not impose any restrictions without explicit permission.
Transparency and Security:
Blocking happens locally, with no data collected about app usage.
We adhere to Apple’s privacy policy.
Compliance with App Store Guidelines:
We understand that app blocking is a sensitive feature, but in our case it:
Is used for the benefit of the user (digital detox, productivity improvement).
Does not interfere with system processes or other developers’ apps.
Does not misuse access to APIs.
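As referenced above, a hedged sketch of how this kind of blocking is typically implemented with Apple's Screen Time API. It assumes the Family Controls entitlement (presumably the permission being waited on); the function names are illustrative:

import FamilyControls
import ManagedSettings

let store = ManagedSettingsStore()

// Shield the user-selected "negative" apps until enough "positive"
// time has been earned.
func block(_ selection: FamilyActivitySelection) {
    store.shield.applications = selection.applicationTokens
}

// Lift the shield once the user has exchanged enough healthy time.
func unblockAll() {
    store.shield.applications = nil
}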
My question to the forum is:
Did you have similar problems, and how did you resolve them?
Are there any ways to speed up the process or contact someone from the approval team directly?
Should I give up and release it on Android?
I am very disappointed and frustrated. Hope to get some useful tips.
Thank you very much!
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked... I tried to go through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message.
Does anyone have any ideas what the problem could be?
Could you please make Visual Intelligence available for the Action button on the iPhone 16e? It could be limited to iPhones that use the A18 and future-generation Apple chips.
I have updated to iPadOS 26, and my pointer is still the circle one, not the arrow cursor.
Hello,
I am writing this post because the Apple Developer Program enrollment process is clearly malfunctioning, and I have reached a point where this situation is unacceptable.
First payment
I initially purchased the Apple Developer Program on December 3rd, 2025 at 16:03 (Turkey time).
The payment was fully completed, confirmed by my bank, and I received the official Apple Store receipt.
• Order ID: W1557478965
• Amount: 1029 TRY
• Status: Completed / Posted
Despite this, my account continued to show:
• “Purchase your membership”
• Enrollment status: Pending
• No access to App Store Connect
After several days with no response from Apple Support and no activation, I assumed something had gone wrong on Apple’s side.
Second payment
Because I was completely blocked and received no reply from support or the forums, I made a second payment to rule out any payment failure.
• Order ID: W1694587309
• Amount: 1029 TRY
• Status: Completed / Posted
Current situation
At this point:
• Two separate payments
• Two unique Apple Store order IDs
• Zero activation
• Zero response from Apple Support
• Enrollment still Pending
• App Store Connect still inaccessible
Support case details:
• Apple Support Case ID: 102769533427
• Multiple follow-ups sent
• No reply
• No action taken
This is no longer a delay — this is a system-level failure.
I have paid twice for a single Developer Program membership and received nothing in return: no activation, no explanation, and no support.
I am formally requesting manual intervention by Apple staff to:
1. Immediately activate my Apple Developer Program membership
2. Investigate and resolve the duplicate payment (refund or clarification)
3. Explain why a paid enrollment can remain blocked with no support response
If this forum is monitored by Apple employees, this issue requires urgent escalation.
This situation should not happen in a paid developer program.
Thank you.
A common UI idiom in Apple's first party iOS apps is a circle icon with three dots in the upper right of the screen. This serves as a pop-up menu of more options. Some examples include:
Apple Music, Library tab
Photos, Album view
Reminders
In all these cases, VoiceOver reads this element as "More, Button".
In my SwiftUI app, I've implemented a visually identical button.
Menu {
    Button("Menu Item 1") { }
    Button("Menu Item 2") { }
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
.accessibilityLabel("More")
However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise in line with the apps mentioned above?
I’d love to see Apple implement a Bionic Reading feature as a system-wide accessibility option. This type of reading aid highlights the first part of each word in bold to help guide the eyes and improve comprehension.
It’s been shown to be especially helpful for people with ADHD, dyslexia, and other neurodivergent needs. Having a toggle in Settings > Accessibility would be life-changing.
Ideally, it could be:
• Enabled system-wide, or per-app
• Customizable in how much of each word is bolded
• Available in Safari, Messages, Books, News, etc.
Hello!
I was doing some accessibility testing for my app and found out that when the user switches the text size, all of the data in the text fields is reset, which causes major disruption.
I've tried looking for documentation, but all I've found is information on how to dynamically scale the UI for different text sizes, which I've already implemented.
My guess is that every time Dynamic Type registers a change, it redraws my UI instead of just updating it.
How can I make sure the data is not reset when the text size changes?
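For concreteness, a minimal sketch (assuming SwiftUI; the names are illustrative) of keeping field contents in a model object that survives view reconstruction, which is the behavior I'm after:

import SwiftUI

// The text survives Dynamic Type changes because @StateObject keeps
// the same FormModel instance while SwiftUI rebuilds the view tree.
final class FormModel: ObservableObject {
    @Published var name = ""
}

struct FormView: View {
    @StateObject private var model = FormModel()

    var body: some View {
        Form {
            TextField("Name", text: $model.name)
        }
        // If something re-creates the model or re-identifies the view
        // on a size change (e.g. an .id(...) keyed to the type size),
        // the field resets; keeping identity stable avoids that.
    }
}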
The only way I found to make accessibility focus work correctly in the detent of a fullscreen cover is to apply the focus manually. The issue is that in ContentView the grabber works, while in the fullscreen cover it does not. Is there something I am missing, or is this a bug? I also don't understand why I need to apply focus in the fullscreen cover when in ContentView I do not.
import SwiftUI

struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")
                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .accessibilityHidden(bottomSheetShowing)
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}

struct FullscreenView2: View {
    @State var bottomSheetShowing = false

    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        //.accessibilityHidden(bottomSheetShowing)
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
    // @AccessibilityFocusState var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)
            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
        // .onAppear {
        //     // Set initial accessibility focus when the sheet appears
        //     DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        //         isFocused = true
        //     }
        // }
        // .accessibilityFocused($isFocused)
    }
}
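For completeness, the manual-focus workaround mentioned at the top is the commented-out code in BottomSheetView. Enabled, the relevant pieces look like this (the one-second delay is arbitrary):

@AccessibilityFocusState private var isFocused: Bool

var body: some View {
    VStack(spacing: 20) {
        // ... sheet content as above ...
    }
    .accessibilityFocused($isFocused)
    .onAppear {
        // Push VoiceOver focus into the sheet once it is on screen.
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
            isFocused = true
        }
    }
}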
When my macOS Cocoa app displays a modal alert with beginSheetModal(for:completionHandler:), VoiceOver sometimes seems to focus on an "illegal" upper level, where any attempts at navigation will give the unhelpful response "Alert, dialog", until you "drill down" with VO + shift + down or switch apps. After that, things will work as expected.
Is this a known bug? Does it happen to anybody else, or am I doing something wrong?
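For reference, a minimal sketch of the presentation in question (the message text and window handling are illustrative, not from my app):

import AppKit

func showAlert(on window: NSWindow) {
    let alert = NSAlert()
    alert.messageText = "Something happened"
    alert.addButton(withTitle: "OK")
    // After this presentation, VoiceOver occasionally lands on a level
    // that only announces "Alert, dialog" until you drill down with
    // VO-Shift-Down or switch apps.
    alert.beginSheetModal(for: window) { _ in }
}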
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.
Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.
Particularly affects:
Third-party speech synthesis voices (CereProc, Grammatek, etc.)
Apps relying on automatic voice selection based on user preferences
Example:
// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)
Interesting observation: The new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.
Tested methods:
AVSpeechSynthesisVoice(language:)
AVSpeechUtterance auto-selection
Reflection for new APIs
All return the system default voice, not the user's preference.
Filed: FB20271264
Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
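Not a fix, but a sketch of a partial workaround, under the assumption that enumerating installed voices still works: prefer a non-default-quality voice for the language when one is installed. This cannot recover which voice the user actually picked in Settings:

import AVFoundation

func fallbackVoice(for language: String) -> AVSpeechSynthesisVoice? {
    let candidates = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.language == language }
    // Prefer enhanced/premium (often third-party) voices over the
    // bundled default; otherwise fall back to whatever is available.
    return candidates.first { $0.quality != .default } ?? candidates.first
}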
Before I start: this could just be me and a handful of people, but I like to reorganize my phone screen to my needs based on what's going on in life. I was just thinking it would be easier if you could get rid of all the folders at once and then reorganize, or something easier than the long, extensive process it is now.
Hello,
I'm observing a persistent and frustrating issue with an accessibility feature called Guided Access that seems to affect many users across different devices and iOS versions.
Problem
The triple-click gesture (side or home button) to activate Guided Access intermittently stops working after the device has been in normal use for a few days (typically 2-7 days) without a restart.
I have done some debugging for Apple in FB16094026 but received no updates after 6 months. So I'm posting here in the hope that this will be solved sooner. A core accessibility feature shouldn't require daily device restarts to function reliably.
Details:
Guided Access is correctly enabled in Settings > Accessibility.
Initially, the triple-click works perfectly.
After a period of normal device use (2-7 days), the triple-click no longer triggers Guided Access in any app.
Restarting the device temporarily resolves the issue, and Guided Access triple-click works again immediately after a reboot. However, the problem recurs after continued use.
Simply toggling the Guided Access setting on/off does NOT fix it.
Additional observation: Even trying to select Guided Access manually via the Accessibility Shortcut menu (if multiple shortcuts are enabled) sometimes fails to launch the feature when in this state.
Affected:
iPhones and iPads
Observed on iOS/iPadOS 16, 17, and now 18, indicating it's a long-standing bug.
Impact:
Guided Access is a crucial accessibility feature for many users (for focus, special needs, parental controls, etc.). Its unreliable activation significantly disrupts the daily workflows of users who rely on this function. This issue appears to be widespread, with many reports across forums like Apple Support Communities and Reddit.
For example, this post received over 1k upvotes.
To see more examples please refer to FB16094026.
Could Apple please investigate this bug urgently? Thanks.
I have added multiple bank accounts in App Store Connect, but their status is "Not in Use". What's wrong here?
Could there be a haptic or sound cue for knowing when location services and the camera were last used, for the accessibility of the blind (sound) and deaf (haptic) populations?
Also, the grey notification, rather than the purple one, for location services should appear for the full 24 hours after an application has used it, if the correct description is within the copy of Settings.
The green light that lets them know an application has switched to the camera, and the fading orange light, could both have subtle, simple click sounds, like a shutter: a bigger haptic with a softer sound, all editable in Settings, of course.
Hey folks, I would like to ask for help on this topic:
I think this is exactly the same problem Combobox not working with VoiceOver after… - Apple Community.
VoiceOver also breaks the combobox from the official ARIA W3C website https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VO is turned off, I can use the up/down arrow to go through the menu items from the dropdown, but when VO is turned on, the up/down arrows cannot access the dropdown menu items.
Is there an official tutorial on how to control it using VoiceOver?
Kind regards,
Jakub