In SwiftUI, the date picker component is failing the colour contrast accessibility audit. The following code is used to create the date picker:
import SwiftUI

struct ContentView: View {
    // Start the selection at the earliest allowed date (14 days from today)
    // so the initial value falls inside the picker's range.
    @State private var date = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()

    var body: some View {
        // Allowed range: 14 days from now through 4 years from now.
        let minDate = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        let maxDate = Calendar.current.date(byAdding: .year, value: 4, to: Date()) ?? Date()

        DatePicker(
            "Start Date",
            selection: $date,
            in: minDate ... maxDate,
            displayedComponents: [.date]
        )
        .datePickerStyle(.graphical)
        .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
    }
}

#Preview {
    ContentView()
}
I am attaching a screenshot of the accessibility failure.
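For reference, one workaround I'm considering (an assumption on my part, not a confirmed fix, and the exact colour value below is made up for illustration): give the graphical picker an explicit higher-contrast tint instead of the default accent colour, then re-run the Accessibility Inspector audit.

// Workaround sketch: a darker tint so selected dates and controls render with a
// higher contrast ratio. The colour value is an assumption; verify it with the
// Accessibility Inspector's colour contrast audit.
DatePicker(
    "Start Date",
    selection: $date,
    in: minDate ... maxDate,
    displayedComponents: [.date]
)
.datePickerStyle(.graphical)
.tint(Color(red: 0.0, green: 0.27, blue: 0.63)) // darker blue than the default accent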
Please excuse me if this is obvious. I'm new to Apple development.
Is there a SwiftUI Accessibility Inspector? I run the standard one, in Xcode 26b3, and it shows me warnings for things that I didn't create in SwiftUI. I presume that "SwiftUI" is primarily implemented using macros and that these things are either generated or boilerplate lower-level things. But if so, then why would they trip Accessibility Inspector warnings? Is there something I can do from SwiftUI to clear them?
Or... is there a demangler somewhere that will translate from these names into something this human might recognize?
I'm targeting macOS, btw, if that makes any difference.
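For what it's worth, here is the kind of thing that sometimes clears Inspector warnings when they come from generated or decorative children (an assumption, since I don't know which elements are being flagged here): hide decoration from the accessibility tree, or collapse an auto-generated container into one labelled element.

import SwiftUI

// Sketch, assuming the flagged items are decorative or auto-generated children:
// hide decoration from accessibility, or merge children into a single labelled
// element so the Inspector audits one well-described control.
struct BadgeView: View {
    var body: some View {
        HStack {
            Image(systemName: "checkmark.seal")
                .accessibilityHidden(true)          // decorative only
            Text("Verified")
        }
        .accessibilityElement(children: .combine)   // expose one element, not several
        .accessibilityLabel("Verified account")
    }
}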
Topic: Accessibility & Inclusion, SubTopic: General
I can't take a screenshot using AssistiveTouch after installing iOS 26 beta 2.
While editing the search text using an external keyboard (with VoiceOver on), if I try to navigate to the List with the keyboard, focus jumps back to the search field immediately, preventing selection of list items. It's important to note that VoiceOver navigation alone, without the keyboard, works as expected.
It's as if the List never gains focus: every attempt to move focus lands back on the search field.
The code:
import SwiftUI

struct ContentView: View {
    @State private var searchText = ""
    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    // Filter the list as the user types in the search field.
    var filteredItems: [String] {
        if searchText.isEmpty {
            return items
        } else {
            return items.filter { $0.localizedCaseInsensitiveContains(searchText) }
        }
    }

    var body: some View {
        if #available(iOS 16.0, *) {
            NavigationStack {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        } else {
            NavigationView {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        }
    }
}
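One workaround I want to try (a sketch only; I haven't confirmed it breaks the focus loop): drive VoiceOver focus explicitly with @AccessibilityFocusState and move it to the first result when the search is submitted.

import SwiftUI

// Sketch (untested assumption): shift VoiceOver focus from the search field to
// the first result on submit, so keyboard navigation can continue in the List.
struct FocusWorkaroundView: View {
    @State private var searchText = ""
    @AccessibilityFocusState private var focusedItem: String?
    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    private var filteredItems: [String] {
        searchText.isEmpty ? items : items.filter { $0.localizedCaseInsensitiveContains(searchText) }
    }

    var body: some View {
        NavigationStack {
            List(filteredItems, id: \.self) { item in
                Text(item)
                    .accessibilityFocused($focusedItem, equals: item)
            }
            .navigationTitle("Fruits")
            .searchable(text: $searchText)
            .onSubmit(of: .search) {
                focusedItem = filteredItems.first   // move VoiceOver focus into the list
            }
        }
    }
}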
I am seeing a strange issue where NSObject's accessibilityRespondsToUserInteraction returns true on the Simulator but false on a device.
Checking the same object on the Simulator with the Accessibility Inspector, I see the object's traits reported as image, so why would it return true in that case?
Is there any other way to check whether an item responds to user interaction or is clickable, besides that property and the traits?
(Or is it just another bug?)
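For comparison, the fallback check I've been experimenting with (my own assumption, not a documented equivalent) combines the property with an explicit interactive-trait test:

import UIKit

// Rough fallback (assumption, not a documented replacement): treat an element
// as activatable if the API says so OR it carries an interactive trait.
func isLikelyActivatable(_ element: NSObject) -> Bool {
    let traits = element.accessibilityTraits
    let hasInteractiveTrait = traits.contains(.button)
        || traits.contains(.link)
        || traits.contains(.adjustable)
    if #available(iOS 13.0, *) {
        return element.accessibilityRespondsToUserInteraction || hasInteractiveTrait
    }
    return hasInteractiveTrait
}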
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection.
Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
After installing iOS 26 beta 2 on my iPhone 13, I can't take a screenshot using AssistiveTouch or Back Tap.
Hi everyone,
I've encountered a rare and strange crash in my app that I can't consistently reproduce. The crash seems to occur deep within Apple's internal frameworks, and I can't pinpoint which line of my own code is causing it. Here's the crash stack trace:
#44 AXSpeech
SIGSEGV
SEGV_ACCERR
0 CoreFoundation ___CFCheckCFInfoPACSignature + 4
1 CoreFoundation _CFRunLoopSourceSignal + 28
2 Foundation _performQueueDequeue + 492
3 Foundation ___NSThreadPerformPerform + 88
4 CoreFoundation ___CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
5 CoreFoundation ___CFRunLoopDoSource0 + 176
6 CoreFoundation ___CFRunLoopDoSources0 + 340
7 CoreFoundation ___CFRunLoopRun + 828
8 CoreFoundation _CFRunLoopRunSpecific + 608
9 Foundation -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
10 TextToSpeech _TTSCFAttributedStringCreateStringByBracketingAttributeWithString + 776
11 Foundation ___NSThread__start__ + 732
12 libsystem_pthread.dylib __pthread_start + 136
Sometimes, instead of line 10 referencing _TTSCFAttributedStringCreateStringByBracketingAttributeWithString, it shows:
10 TextToSpeech LogWarning(char const*, ...) + 7288
Has anyone experienced a similar issue or know what might be triggering this crash? Any guidance on how to investigate or resolve this would be greatly appreciated. Thank you!
I am registering my startup for an Apple Developer Account so that we can put our app on the App Store. I do not want to use my personal Apple ID's payment method to pay the $99 annual fee. How can I change the payment method so that I can continue with my enrollment?
Your app's binary includes references to HealthKit components, but the app still does not appear to include any primary features that require health or fitness data.
Next Steps
To resolve this issue, please remove any HealthKit functionality from the app, as well as any references to this app’s interactivity with HealthKit from the app or its metadata. This includes removing any HealthKit-related keys in the app's Info.plist or InfoPlist.strings files, as well as removing any calls to HealthKit APIs, including those from third-party platforms, from the app.
Topic: Accessibility & Inclusion, SubTopic: General
I have a Twitter account that I registered with my Apple ID, but I don't know the PIN and I'm having a problem recovering it. I need help.
privaterelay.appleid.com
Topic: Accessibility & Inclusion, SubTopic: General
On iOS, there is accessibilityLanguage.
When I try to get the frame of an AXUIElementRef using AXUIElementCopyAttributeValue(element, (CFStringRef)attribute, &result), the frames are shifted and rotated for the iOS Simulator.
I get the same (wrong) frames in the Accessibility Inspector when the Mac is selected as the host.
When I switch the host to the iOS Simulator, the frames are correct.
How is the Accessibility Inspector getting the correct frames, and how can I do the same in my app?
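For context, this is roughly how I'm reading the frame today (a sketch using the standard position/size attributes; whether a coordinate-space conversion is what the Inspector does differently is my assumption):

import ApplicationServices

// Sketch: read kAXPositionAttribute / kAXSizeAttribute from an AXUIElement and
// assemble a CGRect. These come back in the host Mac's screen coordinates; how
// the Inspector remaps them into the Simulator's space is the open question.
func frame(of element: AXUIElement) -> CGRect? {
    func copyValue(_ attribute: String) -> AXValue? {
        var value: CFTypeRef?
        guard AXUIElementCopyAttributeValue(element, attribute as CFString, &value) == .success,
              let axValue = value, CFGetTypeID(axValue) == AXValueGetTypeID() else { return nil }
        return (axValue as! AXValue)
    }
    guard let positionValue = copyValue(kAXPositionAttribute),
          let sizeValue = copyValue(kAXSizeAttribute) else { return nil }

    var origin = CGPoint.zero
    var size = CGSize.zero
    guard AXValueGetValue(positionValue, .cgPoint, &origin),
          AXValueGetValue(sizeValue, .cgSize, &size) else { return nil }
    return CGRect(origin: origin, size: size)
}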
Topic: Accessibility & Inclusion, SubTopic: General
Hi everyone,
I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?").
Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar.
I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this.
The Concept: Skeleton-based Normalization
Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input.
Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers).
Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific correction (e.g., "Raise your elbow 10 degrees").
Why this approach?
Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
Privacy: We are processing coordinates, not video streams.
Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.
Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps.
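Roughly the comparison step I have in mind, as a sketch (assuming both the live capture and the reference pose are reduced to arrays of 3D joint positions in the same joint order, e.g. taken from ARSkeleton3D.jointModelTransforms):

import ARKit
import simd

// One frame of a pose: one 3D position per tracked joint, in a fixed joint order.
struct PoseFrame {
    var jointPositions: [SIMD3<Float>]
}

// Hypothetical helper: pull joint positions out of an ARBodyAnchor's skeleton.
func poseFrame(from anchor: ARBodyAnchor) -> PoseFrame {
    let positions = anchor.skeleton.jointModelTransforms.map {
        SIMD3<Float>($0.columns.3.x, $0.columns.3.y, $0.columns.3.z)
    }
    return PoseFrame(jointPositions: positions)
}

// Mean per-joint Euclidean distance between the live pose and the reference.
// A real implementation would likely normalize for body size and compare whole
// gesture sequences (e.g. with dynamic time warping), not single frames.
func poseDistance(_ live: PoseFrame, _ reference: PoseFrame) -> Float {
    precondition(live.jointPositions.count == reference.jointPositions.count)
    let total = zip(live.jointPositions, reference.jointPositions)
        .map { simd_distance($0, $1) }
        .reduce(0, +)
    return total / Float(live.jointPositions.count)
}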
Looking forward to hearing your thoughts.
Hello!
I was doing some accessibility testing for my app and found out that when the user switches the text size, all of the data in the text fields is reset, which causes major disruption.
I've tried looking for documentation, but all I've found is information on how to dynamically scale the UI for different text sizes, which I've already implemented.
My guess is that every time Dynamic Type registers a change, it redraws my UI instead of just updating it.
How can I make sure the data is not reset when the text size changes?
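For reference, the direction I would expect to preserve the field contents (a sketch, assuming UIKit and assuming the current code rebuilds its view hierarchy when the content size category changes): let each field scale its own font in place, so the fields, and the text typed into them, are never recreated.

import UIKit

// Sketch (assumptions above): the field adjusts its font for Dynamic Type on
// its own, so a text-size change never triggers a reload that clears the data.
final class FormViewController: UIViewController {
    private let nameField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        nameField.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: .systemFont(ofSize: 17))
        nameField.adjustsFontForContentSizeCategory = true   // scale in place, no reload
        nameField.borderStyle = .roundedRect
        nameField.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(nameField)
        NSLayoutConstraint.activate([
            nameField.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 16),
            nameField.leadingAnchor.constraint(equalTo: view.layoutMarginsGuide.leadingAnchor),
            nameField.trailingAnchor.constraint(equalTo: view.layoutMarginsGuide.trailingAnchor)
        ])
    }
}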
I want to insert medication data (available from iOS 26) from my app into Apple HealthKit. I tried to get permission to read and write the data, but the app crashed when I requested that permission. Does Apple allow inserting medication data into HealthKit the same way we can add other health and fitness data, or not?
let healthStore = HKHealthStore()

@available(iOS 26.0, *)
@objc func requestAuthorization(_ resolve: @escaping RCTPromiseResolveBlock,
                                rejecter reject: @escaping RCTPromiseRejectBlock) {
    guard HKHealthStore.isHealthDataAvailable() else {
        print("Health data is not available")
        return
    }

    let doseType = HKObjectType.medicationDoseEventType()
    let medType = HKObjectType.userAnnotatedMedicationType()

    // Request share/read authorization for medication dose events, then
    // per-object read authorization for the user-annotated medication type.
    healthStore.requestAuthorization(toShare: [doseType], read: [doseType]) { success, error in
        if let err = error { reject("auth_error", err.localizedDescription, err); return }
        self.healthStore.requestPerObjectReadAuthorization(for: medType, predicate: nil) { s, e in
            if let err2 = e { reject("per_obj_auth", err2.localizedDescription, err2); return }
            resolve(["ok": success && s])
        }
    }
}
Hi, I am not able to renew my Apple Developer Program membership. I do not see any renew button: not in the app, not on the website, not on my iPhone. :(
What should I do?
Topic: Accessibility & Inclusion, SubTopic: General
I want to open a developer account, not a personal one but an organization account for my existing company. I have a DUNS number, a finished website, an official email address, and everything is ready. But when I apply to Apple, they email me asking for a public-facing website in the organization's name, even though all of these requirements have already been met. Why do they not respond to us?
Topic: Accessibility & Inclusion, SubTopic: General
I have been working to remediate PDFs for a client. The documents/forms have many tables. When I correctly tag a table using Foxit Editor Pro, it works beautifully on a PC reading it with NVDA. On a Mac using VoiceOver, the table isn't accessible. It doesn't matter if I try to read it in Adobe Acrobat, Foxit, or Preview. The reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data.
The documents have essentially the same coding behind them as for the web. Why is it that they perform so well on a PC with NVDA, but so poorly with Mac VoiceOver? I am a Quality Assurance Specialist; I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac system?
As a Mac user, it frustrates me that I can't use my preferred system for checking documents to see if they are accessible because VoiceOver doesn't work well. I actually have to recommend to my clients and their customers that they need to use a PC with NVDA or Jaws for these documents to be able to get all the information. Unfortunately, most people aren't able to have, or maintain, both systems. Overall, Mac products are very high quality. This, and other issues with VoiceOver, seems to be a large gap in Apple's offerings and functionality.
I would appreciate a human response to the original email I sent about this on 7/30/2025.
Topic: Accessibility & Inclusion, SubTopic: General
Hi! I have noticed a few glitches, as well as some overall unfortunate limitations, with Assistive Access mode.
Alarms, timers, stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get that sound alert, so I know it is possible. Do I need to download a separate alarm app for it to work?
Cannot make FaceTime calls with favorite contacts.
Find My iPhone cannot jump to the Maps app.
Camera cannot zoom in or out.
Photos cannot be deleted, edited, or shared in a shared album in the photos app.
Photos/videos cannot be sent in messages.
Spotify cannot be accessed from the lock screen.
Apps do not stay open if you lock the phone screen or leave it on too long without touching the screen (it auto-locks).
There is no flashlight option. I downloaded an app to get this feature, but if the screen isn't touched it locks, which shuts off the flashlight in the app until I unlock the phone again.