I have been mulling over this for many years: Uralic and Siberian language user interface support. Ainu of Japan is only supported by writing in roman letters and rendering into Katakana with a few small modified characters; there is no user interface, spell checker, grammar checker, dictionary, or translator. Of course, Ainu has few terms in modern vocabulary, but I am studying the language in order to find words and coin new ones, e.g. iPhone: hoomi-ye-p, "electric speak thing". I am looking for other people who have the same idea.
After replacing macOS Big Sur 11.0 with the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window
AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for the front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
if (err != kAXErrorSuccess) {
    NSLog(@"failed with error: %i", err);
}
NSLog(@"app: %@ frontWindow: %@", app, frontWindow);
The 'frontWindow' reference is never created, and I get error -25204 (kAXErrorCannotComplete). It seems like the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
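One thing worth ruling out first (an assumption on my part, not a confirmed cause): macOS only returns real AX data to processes the user has trusted under System Preferences > Security & Privacy > Privacy > Accessibility, and an OS update can reset that grant; kAXErrorCannotComplete is what untrusted processes typically get back. A minimal Swift check:

import ApplicationServices

// Ask whether this process is trusted for Accessibility, showing the
// system grant dialog if it is not.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
if AXIsProcessTrustedWithOptions(options) {
    print("Accessibility access granted")
} else {
    print("Not trusted yet; grant access under Privacy > Accessibility and relaunch")
}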
I've noticed that VoiceOver reads currency amounts correctly when they are below a thousand.
For higher amounts, for example 12.225,34 €, VoiceOver reads "twelve point two two five thirty four euros".
If the amount is formatted without the thousand separator (12225,34 €), this problem doesn't exist (VoiceOver reads "twelve thousand two hundred and twenty-five euros and thirty-four cents").
Why is the thousand separator a problem for VoiceOver if this formatting is coming from the currency and locale?
This issue exists in English. I changed my device language to Italian and German and in both cases the number was read correctly even with the separator.
Is there a way to make it work in English?
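Not an answer to the root cause, but a workaround sketch (the locale and type names here are my assumptions, not from the post): hand VoiceOver an explicit spoken label built with NumberFormatter, so it never has to parse the grouping separator in the visible string.

import SwiftUI

// Sketch: keep the visible "12.225,34 €" formatting, but give VoiceOver
// a spoken form via an explicit accessibility label.
struct PriceView: View {
    let amount = Decimal(string: "12225.34")! // example amount from the post

    var body: some View {
        Text(format(style: .currency))
            .accessibilityLabel(format(style: .currencyPlural)) // e.g. "12,225.34 euros"
    }

    private func format(style: NumberFormatter.Style) -> String {
        let f = NumberFormatter()
        f.numberStyle = style
        f.locale = Locale(identifier: "de_DE") // assumption: a euro locale
        return f.string(from: amount as NSDecimalNumber) ?? ""
    }
}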
I'm currently testing the announce notifications feature, and I can't seem to find out how to make Siri read aloud the local currency instead of dollars.
My locale is es-CL (Chile). It uses the currency symbol $, which is read as "pesos" locally or "Chilean pesos" elsewhere, and the number 5000.1 is written as 5.000,1.
This is the notification content
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
Siri reads it aloud as "¡Has recibido un pago por 5.000 Dolares!" which translates to "You have received a payment for 5,000 Dollars", instead of the expected "¡Has recibido un pago por 5.000 Pesos!" -> "You have received a payment for 5,000 Pesos"
I've tried changing the development region of the app, interpolating the string with NumberFormatter.localizedString(from: 5000, number: .currency), and using other styles (.currencyAccounting, .currencyISOCode, and .currencyPlural), without good results. The last one seems to work, but it's not ideal: it outputs "5.000 pesos chilenos", which gets read as "5 pesos chilenos", which is not the correct amount (a bug); it's as if you were not in Chile. I also personally prefer a symbol instead of words.
I'm testing with my device, which is set up with the region Chile.
Could someone help me find a solution?
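For concreteness, here is a condensed sketch of the .currencyPlural attempt described above (identifiers illustrative; per the post, Siri still mis-reads the result):

import UserNotifications

// Format the amount with the es_CL locale before interpolating it
// into the notification body.
let formatter = NumberFormatter()
formatter.locale = Locale(identifier: "es_CL")
formatter.numberStyle = .currencyPlural // outputs "5.000 pesos chilenos"

let content = UNMutableNotificationContent()
let amount = formatter.string(from: 5000) ?? "$5.000"
content.body = "¡Has recibido un pago por \(amount)!" // read as "5 pesos chilenos" (bug)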
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Localization
User Notifications
Siri and Voice
I'm using Xcode 15.2 and migrated my (macOS) project to an xcstrings file a while back. Now when I check the xcstrings file, all items are marked as "stale". When I add new localized strings in code, they don't show up in the xcstrings file. The xcstrings file is built correctly (into .lproj/Localizable.strings) when building.
Where can I check which source files are scanned to update the xcstrings status? xcstringstool appears to have a "sync" feature that reads "stringsdata" files, but the xcstringstool help has no information on where the stringsdata files come from.
If I create a new project I can see a "stringsdata" file being generated for each source file in the intermediate build products folder.
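A hedged pointer, from my understanding rather than any documentation: stringsdata files are emitted per source file by the compiler when the "Use Compiler to Extract Swift Strings" build setting (SWIFT_EMIT_LOC_STRINGS) is enabled for the target, and only for string APIs the extractor recognizes, e.g.:

import Foundation

// Patterns the extractor picks up (so they land in the catalog via
// stringsdata); literals passed through custom wrappers typically do not.
let title = String(localized: "settings.title", comment: "Settings window title")
let label = NSLocalizedString("settings.label", comment: "Settings label")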
We currently have an odd issue with VoiceOver spelling a word letter by letter while the same word is spoken as a whole for other items.
The app is in German.
I have a view in SwiftUI whose button traits are removed, then a label "Start Tab 1 von 5" is added. "Tab" is spoken as a whole word here; all fine.
If I change the label to "Tab-Schaltfläche" or, for example, "SimplyGo Tab 3 von 5", then "Tab" is spoken as "T A B", letter by letter. Is there a way to force VoiceOver to speak it as a whole word?
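One thing I would try (a sketch, not a confirmed fix): SwiftUI's speechSpellsOutCharacters modifier (iOS 15+) explicitly opts a label out of character-by-character speech; whether it overrides the heuristic that spells out "Tab" would need testing.

import SwiftUI

struct TabBadge: View {
    var body: some View {
        Text("SimplyGo Tab 3 von 5")
            // Explicitly opt the label out of character-by-character speech.
            .speechSpellsOutCharacters(false)
    }
}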
Topic:
Accessibility & Inclusion
SubTopic:
General
Question:
Hi everyone,
I'm developing a Vision Pro app using the latest visionOS 2, and I've encountered some issues with the new hand gestures introduced in this update. My app is designed to display a UI element when a user's palm is detected. However, the new hand gestures for navigating key functions like Home View, Control Center, and adjusting the volume are interfering with my app's functionality.
What I'm Trying to Achieve
Detect when a user's palm is open and display a UI element.
Ensure that my app's custom hand gestures are not disturbed by the new default gestures in visionOS 2.
Problem
The new hand gestures in visionOS 2 (such as those for Home View, Control Center, and volume adjustment) are activating while my app is open, causing disruptions to my app's functionality. I want to disable these system-level gestures when my app is running.
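For reference, a minimal sketch of the palm-detection side (hand tracking via ARKit on visionOS). To my knowledge there is no public API to suppress the system gestures themselves, so the practical route is making the app's own trigger distinct from them:

import ARKit

// Stream hand anchors so the app can decide when a palm faces the user.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func observeHands() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked else { continue }
        // originFromAnchorTransform is the hand pose; comparing the palm
        // normal against the view direction is one open-palm heuristic.
        print(anchor.chirality, anchor.originFromAnchorTransform)
    }
}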
Just installed iOS 18 Beta 3.
I am seeing AccessibilityUIServer holding the microphone, and this is causing no notification sounds, an inability to use Siri by voice, and a grayed-out volume control.
If I start to play anything with sound, AccessibilityUIServer releases the microphone and I am able to use the app.
Calls still work, since AccessibilityUIServer releases the microphone and the phone rings.
The feedback ID is FB14241838.
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Audio
Accessibility
Sound Analysis
Sound and Haptics
A lot of apps use undocumented App-prefs URLs to help users get to the iOS Settings screen needed to set up the app. In iOS 18, it seems like these all stopped working.
Here are the ones I currently use:
App-prefs:MESSAGES - broken in iOS 18
Used for SMS Protection.
App-prefs:Phone - broken in iOS 18
Used for Live Voicemail, Silence Unknown Callers, and SMS Reporting.
A few paths have specific documented replacements. E.g., for Call Blocking & Identification you can use CXCallDirectoryManager.sharedInstance.openSettings(), and this still works in iOS 18. But I don't see direct replacements for the other paths.
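For anyone landing here, the documented call for that one path looks like this (a sketch; the other Settings paths have no equivalent that I know of):

import CallKit

// Opens the Call Blocking & Identification screen in Settings.
CXCallDirectoryManager.sharedInstance.openSettings { error in
    if let error {
        print("Could not open Settings: \(error.localizedDescription)")
    }
}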
Apple probably doesn't consider this a bug but I filed FB14378568 anyway.
I consider this an accessibility issue because many older or inexperienced users, as well as users with disabilities, have trouble finding the right Settings screen from a textual description alone.
Hello, I am trying to add an accessibility label and hint to a SwiftUI sheet's drag indicator but am not having any luck. Currently, VO reads 'sheet grabber, button, double tap to expand the sheet'. I'd really like VO to also include the current height of the sheet (similar to Apple Maps sheet). Does anyone by chance know how I can target the drag indicator/sheet grabber to do this? Thanks in advance.
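I have not found a way to target the system grabber either. The workaround I would sketch (assumptions: iOS 16+ detents API, names illustrative) is to hide it and supply a custom grabber whose label and value you control:

import SwiftUI

struct SheetContent: View {
    @State private var detent: PresentationDetent = .medium

    var body: some View {
        VStack(spacing: 0) {
            Capsule()
                .fill(.secondary)
                .frame(width: 36, height: 5)
                .padding(8)
                .accessibilityLabel("Sheet grabber")
                .accessibilityValue(detent == .large ? "Full height" : "Half height")
                .accessibilityAddTraits(.isButton)
                .accessibilityAction {
                    // Toggle between detents so the value stays meaningful.
                    detent = (detent == .large) ? .medium : .large
                }
            Spacer()
        }
        .presentationDetents([.medium, .large], selection: $detent)
        .presentationDragIndicator(.hidden) // hide the untargetable system grabber
    }
}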
So basically, using iOS VoiceOver, a radio button is read like this:
aria-label + "check box" + "radio button, unchecked, 1/2, required"
Is this behavior expected for iOS VoiceOver?
Thank you!
Heya,
I'm currently building out my own application for tracking my health information, and I'm hoping to collect my historical data from Apple Health.
Sadly, it would appear that certain things I wish to export don't appear in the export.xml file.
Some of the things that I would expect to find in the export.xml file, that do not currently appear, are as follows:
More data about my medications: currently I can only export a list of my medications; it's not possible to export data such as when which medication was taken.
Logged emotions; there is currently no support for this (that I can find).
Would appreciate some insight into this.
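On the logged emotions point, a hedged sketch: if I understand the iOS 18 wellbeing additions correctly, logged emotions are exposed to apps as HKStateOfMind samples, so querying HealthKit directly may recover what export.xml leaves out (worth verifying the exact names against the current SDK):

import HealthKit

// Fetch state-of-mind (logged emotion) samples via HealthKit (iOS 18+).
func fetchLoggedEmotions() async throws -> [HKStateOfMind] {
    let store = HKHealthStore()
    let type = HKSampleType.stateOfMindType()
    try await store.requestAuthorization(toShare: [], read: [type])

    let descriptor = HKSampleQueryDescriptor(
        predicates: [.stateOfMind()],
        sortDescriptors: [SortDescriptor(\.startDate, order: .reverse)]
    )
    return try await descriptor.result(for: store)
}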
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Developer Tools
Health and Fitness
HealthKit
Hi,
I am trying to include an image in an Apple News template that is clickable, meaning that if a user taps the image, they should be redirected to a different page. I am using action with openURL as below; the image loads fine, but nothing happens when I click on it.
{
  "role": "container",
  "components": [
    {
      "role": "container",
      "components": [
        {
          "role": "image",
          "URL": "https://myImageURL.jpg",
          "action": {
            "type": "openURL",
            "URL": "https://URLToBeRedirected"
          }
        }
      ]
    }
  ]
}
I am using News Preview to preview my article.json. Please let me know of any resolution for this issue.
When VoiceOver is turned on on iOS devices, the video controls (pause/resume, forward, backward) do not work in inline mode; they work fine in fullscreen.
The controls are announced properly by VoiceOver, but pressing them performs no action at all.
We use the HTML5 video tag to display MP4 video in our app, and we have added all the necessary accessibility tags as well.
We also observe the same issue on the w3schools page; attaching the link here for reference: https://www.w3schools.com/html/tryit.asp?filename=tryhtml5_video
Could you please guide us as to why the video controls are not working? Is there anything we need to change, or is an update needed on the Safari side?
Hello everyone,
Yesterday I logged in with my Apple ID on the Apple Developer site. I have no subscription, so some functions are not available for my account. Since I logged in, when I go to Settings > General > Software Update, the beta updates also appear. I wanted to know if there is a way to remove this and to delete my account from Apple Developer. Thank you.
Topic:
Accessibility & Inclusion
SubTopic:
General
I’m developing an app for Vision Pro and have encountered an issue related to the UI layout and model display. Here's a summary of the problem:
I created an anchor window to display text and models in the hand menu UI.
While testing on my Vision Pro, everything works as expected; the text and models do not overlap and appear correctly.
However, after pushing the changes to GitHub and having my client test it, the text and models are overlapping.
Details:
I’m using Reality Composer Pro to load models and set them in the hand menu UI.
All pins are attached to attachmentHandManu, and attachmentHandManu is set to track the hand and show the elements in the hand menu.
I ensure that the attachmentHandManu tracks the hand properly and displays the UI components correctly in my local tests.
Question:
What could be causing the text and models to overlap in the client’s environment but not in mine? Are there any specific settings or configurations I should verify to ensure consistent behavior across different environments? Additionally, what troubleshooting steps can I take to resolve this issue?
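For comparison, here is roughly how I would expect the attachment wiring to look (a sketch; "attachmentHandManu" follows the post's naming, the offsets are illustrative). If positions are computed rather than pinned to fixed offsets like this, differing visionOS versions or build settings between the two environments would be my first suspects:

import SwiftUI
import RealityKit

struct HandMenuView: View {
    var body: some View {
        RealityView { content, attachments in
            // Anchor that keeps the menu tracking the user's palm.
            let handAnchor = AnchorEntity(.hand(.left, location: .palm))
            if let menu = attachments.entity(for: "attachmentHandManu") {
                menu.position = [0, 0.08, 0] // fixed offset so elements cannot overlap
                handAnchor.addChild(menu)
            }
            content.add(handAnchor)
        } attachments: {
            Attachment(id: "attachmentHandManu") {
                Text("Hand menu").padding().glassBackgroundEffect()
            }
        }
    }
}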
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
ARKit
RealityKit
Reality Composer Pro
visionOS
I have created a Wallet non-UI extension for adding a card through Wallet. It works perfectly when I open Wallet on the iPhone. But when I open Wallet from the Watch app (the Watch bridge app on the iPhone, not on the physical Apple Watch), my extension (issuer app) does not show up there. Any idea if I need to set up or configure anything to access the extension through the Watch bridge app's Wallet?
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Extensions
Wallet
iPhone
Watch Complications
I'm developing a macOS app that interacts with Microsoft Teams using the Accessibility API. I've noticed inconsistent behavior when querying UI elements, particularly for the mute button. My queries often fail, while system tools like VoiceOver can consistently access these elements (which are visible on the screen).
In some cases it works well, but in others the UI elements are not visible from my code. When I try Accessibility Inspector, it also initially fails to inspect. However, the Inspector seems to have some "magical" power: when I run it, or when it performs an AX audit, it appears to refresh the AX tree, and then my code occasionally works as well.
Given that VoiceOver can consistently read the screen, I assume the issue is not with the Microsoft Teams app itself (assuming it's based on Electron/React). I mention this because when I interact with the Zoom app, reading the mute status from the app's menu bar works 100% of the time.
What would you recommend I try or explore to improve reliability?
Can I refresh the app's AX tree from my end, from Swift?
Is this a bug in the AX API, or even in Microsoft Teams?
(I have a ready example and demo video, but the forum does not let me upload them here.)
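One avenue worth trying from Swift (based on Electron's accessibility documentation, so treat its relevance to Teams as an assumption): Electron apps build their AX tree lazily, and assistive clients like the Inspector wake them by setting an app-level attribute. Replicating that may be the "refresh" you're after:

import ApplicationServices

// Set Electron's app-level switch that forces the AX tree to be built.
func enableAccessibility(forPID pid: pid_t) {
    let app = AXUIElementCreateApplication(pid)
    let result = AXUIElementSetAttributeValue(
        app,
        "AXManualAccessibility" as CFString,
        kCFBooleanTrue
    )
    if result != .success {
        print("Could not set AXManualAccessibility: \(result.rawValue)")
    }
}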
Hello Team,
I am not able to capture any elements on a physical iPad device in landscape mode.
Elements appear to be offset.
I suspect it considers the screen to be in portrait mode even though it's in landscape mode.
I am using the setup below:
iPad (10th generation) - iOS 17.6.1
Xcode 15.4
Appium 2.11.3
XCUITest driver 7.26.0
Note: I am able to capture the elements in portrait mode on the iPad.
Could you please help me or provide any inputs on this?
Topic:
Accessibility & Inclusion
SubTopic:
General
I have a list of items with swipe actions that I'd like to hide from accessibility, making the actions available via the accessibilityAction modifier instead. Additionally, I would like to make the context menu announceable as an action. This behavior exists in the Apple Podcasts app, but I can't find out how to do it in SwiftUI.
If the user knows to use the triple-tap, they can activate the context menu, but having it as an action on the element would be even better.
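In case it helps, the closest I have gotten is along these lines (a sketch, names illustrative): collapse the row so VoiceOver sees one element, then surface the swipe and context-menu handlers as custom actions in the actions rotor.

import SwiftUI

struct EpisodeList: View {
    var body: some View {
        List(["Episode 1", "Episode 2"], id: \.self) { title in
            Text(title)
                .swipeActions {
                    Button("Archive") { archive(title) }
                }
                .contextMenu {
                    Button("Save for later") { save(title) }
                }
                // Collapse the row so VoiceOver sees one element, then
                // expose both handlers as named accessibility actions.
                .accessibilityElement(children: .combine)
                .accessibilityAction(named: "Archive") { archive(title) }
                .accessibilityAction(named: "Save for later") { save(title) }
        }
    }

    private func archive(_ title: String) { /* ... */ }
    private func save(_ title: String) { /* ... */ }
}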