Voice Control Disabling System Services After Reboot
I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case—what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.
To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services—including FaceTime, Apple Music, and News—stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.
I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction—perhaps between the neural engine and the new Wi-Fi chip—rather than a purely software problem.
I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution. I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion. Thank you for your attention to this matter.
Sincerely,
Donald Spencer Kirby
Dayton, Ohio
}
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                // Fall back to the output sample rate if 16 kHz is not supported
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + "- Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}
void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            // The silence check has to live outside the "speaking" branch,
            // otherwise it can never fire (lastVoiceTime was just reset above).
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}
private float GetAverageVolume()
{
    const int sampleWindow = 128;
    float[] samples = new float[sampleWindow];
    // Read the most recent samples that have already been recorded;
    // reading at the current write position returns stale or empty data.
    int micPosition = Microphone.GetPosition(null) - sampleWindow;
    if (micPosition < 0)
    {
        return 0f;
    }
    microphoneInput.GetData(samples, micPosition);
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
Problem:
When I build and run the app from Xcode, the microphone works fine, and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone. The debug logs indicate no microphone is detected.
Question:
Is there any additional configuration I need to do for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing the microphone access to fail in this scenario?
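For reference, my understanding is that outside of an Xcode run the built app still needs the NSMicrophoneUsageDescription key in its Info.plist (Unity normally fills this from the Microphone Usage Description field in Player Settings) and a granted permission prompt before any input devices are reported. A minimal Swift sketch of the equivalent native-side check, assuming AVAudioSession behaves the same way on this platform:
import Foundation
import AVFAudio

// Sketch: verify/request microphone permission before starting capture.
// Assumes NSMicrophoneUsageDescription is present in the app's Info.plist;
// without it the system denies access and no input devices show up.
func ensureMicrophoneAccess(_ completion: @escaping (Bool) -> Void) {
    let session = AVAudioSession.sharedInstance()
    switch session.recordPermission {
    case .granted:
        completion(true)
    case .denied:
        completion(false)
    case .undetermined:
        session.requestRecordPermission { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    @unknown default:
        completion(false)
    }
}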
Thanks in advance for any insights!
Best,
Siddharth
I'm working on a BLE-connected device that uses ANCS and the system Clock app to receive alarm notification events for hearing-impaired people. It works on iPhone 13 and earlier with the latest iOS 18.x, but starting with iPhone 14 (also on iOS 18.x) the system Clock alarm notification is no longer sent.
Is there any reason for this to be happening?
Is anyone aware of this behaviour?
Any suggestion would be really appreciated.
Cheers
I recently bought a new iPhone 16 and connected it to my car (Hyundai Venue), but I can't see WhatsApp in CarPlay. I researched and found some forum threads, but the suggested steps don't work or don't apply to the latest iOS version.
I have updated both iOS and WhatsApp; nothing has helped to resolve it.
Note: earlier I used a Pixel phone and could see WhatsApp and make calls.
curl -o actions-runner-osx-x64-2.321.0.tar.gz -L https://github.com/actions/runner/releases/download/v2.321.0/actions-runner-osx-x64-2.321.0.tar.gz
echo "b2c91416b3e4d579ae69fc2c381fc50dbda13f1b3fcc283187e2c75d1b173072 actions-runner-osx-x64-2.321.0.tar.gz" | shasum -a 256 -c
tar xzf ./actions-runner-osx-x64-2.321.0.tar.gz
./config.sh --url https://github.com/funds123/tpsdk --token BMOPQPZNQKB57MXE4BDDKKTHMTHJO
./run.sh
Save this configuration as com.github.actions.runner.plist in
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it is not automatically focused by VoiceOver, and in some cases, VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.
Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left — VoiceOver continues to announce items in the presenting view controller.
Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.
Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.
Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
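As a stopgap, the workaround we are considering (a sketch only; it assumes showManageSubscriptions returns once the sheet has been presented, and it leaves out restoring the background when the sheet is dismissed, since detecting dismissal is the awkward part) is to fence the presenting view off from VoiceOver and nudge focus when the sheet comes up:
import UIKit
import StoreKit

// Workaround sketch: hide the presenting view controller's content from
// VoiceOver while the manage-subscriptions sheet is up, and post a
// screen-changed notification so focus moves off the background.
@MainActor
func presentManageSubscriptions(from viewController: UIViewController) async {
    guard let scene = viewController.view.window?.windowScene else { return }
    do {
        try await AppStore.showManageSubscriptions(in: scene)
        viewController.view.accessibilityElementsHidden = true
        UIAccessibility.post(notification: .screenChanged, argument: nil)
    } catch {
        print("showManageSubscriptions failed: \(error)")
    }
}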
In our application we use OTP login. When the Full Keyboard Access accessibility setting is enabled and we enter a digit into an OTP field, on iOS 17 focus moves to the next text field as expected, but on iOS 18 focus stays on the first OTP field and does not move to the next one.
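For reference, a minimal sketch of the manual workaround we are experimenting with (OTPViewController and otpFields are our own illustrative names, not an Apple API): advancing first responder ourselves in the text field delegate instead of relying on the system focus behavior.
import UIKit

// Sketch: advance focus to the next OTP field after a digit is typed,
// instead of relying on the system focus behavior under Full Keyboard Access.
final class OTPViewController: UIViewController, UITextFieldDelegate {
    let otpFields: [UITextField] = (0..<6).map { _ in UITextField() }

    func textField(_ textField: UITextField,
                   shouldChangeCharactersIn range: NSRange,
                   replacementString string: String) -> Bool {
        guard string.count == 1, let index = otpFields.firstIndex(of: textField) else {
            return true // let deletions and pastes go through unchanged
        }
        textField.text = string
        if index + 1 < otpFields.count {
            otpFields[index + 1].becomeFirstResponder()
        } else {
            textField.resignFirstResponder()
        }
        return false
    }
}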
I have a couple of follow-up questions after the "Accessibility technologies group lab".
I know it was briefly mentioned that user feedback is an excellent way to grow inclusivity in an app's design, and that these forums are one avenue for gathering it.
Is inviting folks here on the forum via TestFlight a reasonable approach to this for a solo developer?
Are there other strategies, avenues, or examples to promote user feedback?
This issue keeps coming up again and again. I have restarted my laptop and I have enough storage, but I don't know why it keeps happening!
In our system, when a user enables a mobile hotspot and the system connects to it, the system verifies Wi-Fi availability by sending an HTTP GET request to http://captive.apple.com.
Normally, the server returns:
HTTP Status: 200 (OK)
Content-Type: text/html
This has always been used as a sign of normal connectivity.
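For context, a minimal sketch of this kind of probe (written here with URLSession; our actual client is not Apple code), checking the response body rather than only the Content-Type header, since captive.apple.com normally returns a small HTML page containing "Success":
import Foundation

// Sketch of a connectivity probe against captive.apple.com.
// Relying only on Content-Type is fragile; checking the body for the
// expected "Success" marker is a more tolerant heuristic.
func probeCaptivePortal(completion: @escaping (Bool) -> Void) {
    let url = URL(string: "http://captive.apple.com")!
    URLSession.shared.dataTask(with: url) { data, response, _ in
        let status = (response as? HTTPURLResponse)?.statusCode ?? 0
        let body = data.flatMap { String(data: $0, encoding: .utf8) } ?? ""
        completion(status == 200 && body.contains("Success"))
    }.resume()
}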
Issue:
Since last Friday, the server sometimes responds with:
Content-Type: application/octet-stream
When this occurs, our system determines that the network is unavailable and displays a connection warning (a “!” icon).
Question:
Has Apple recently made any backend or CDN configuration changes to captive.apple.com that could affect the response type?
Any advice on how we can solve this problem?
Thanks!
After replacing macOS Big Sur 11.0 with the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from StackOverflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window
AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue( app, kAXFocusedWindowAttribute, &frontWindow );
if ( err != kAXErrorSuccess ){
NSLog(@"failed with error: %i",err);
}
NSLog(@"app: %@ frontWindow: %@",app,frontWindow);
'frontWindow' reference is never created and I get the error number -25204. It seems like the latest Big Sur 11.5 has revised the Accessibility API or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
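One thing I still need to rule out (a guess on my part, not a confirmed cause): whether my process is still trusted for Accessibility control after the OS update, since AX calls against other apps fail with unhelpful error codes when that trust is missing. A small Swift sketch of the check; the Objective-C equivalent is AXIsProcessTrustedWithOptions:
import ApplicationServices

// Sketch: confirm the calling process is trusted for Accessibility control.
// If this returns false, AXUIElementCopyAttributeValue against other apps
// will fail even though the calling code is correct.
func ensureAccessibilityTrust() -> Bool {
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    return AXIsProcessTrustedWithOptions(options)
}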
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
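One pattern I'm considering (an iOS-flavored sketch, not something I've confirmed works) is to compute accessibilityFrame on demand, so VoiceOver gets the current rect whenever it asks, and to keep the layout-changed notification for when the movement ends:
import UIKit

// Sketch: report the element's current on-screen rect whenever the
// accessibility system queries it, rather than posting layout-changed
// notifications on every frame of the movement.
final class MovingTileView: UIView {
    override var accessibilityFrame: CGRect {
        get { UIAccessibility.convertToScreenCoordinates(bounds, in: self) }
        set { super.accessibilityFrame = newValue }
    }
}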
Hello,
AVSpeechSynthesisVoice has an audioFileSettings attribute:
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]
This is declared in
AVSpeechSynthesisVoice {
...
@available(iOS 13.0, *)
open var audioFileSettings: [String : Any] { get }
@available(iOS 17.0, *)
open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
}
How can we specify the audioFileSettings attributes in an AVSpeechSynthesisProviderVoice?
There is no such field in AVSpeechSynthesisProviderVoice:
AVSpeechSynthesisProviderVoice {
open var name: String { get }
open var identifier: String { get }
open var primaryLanguages: [String] { get }
open var supportedLanguages: [String] { get }
open var voiceSize: Int64
open var version: String
open var gender: AVSpeechSynthesisVoiceGender
open var age: Int
}
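My current assumption (unverified) is that for provider voices the output format is described by the provider's audio unit rather than by the voice object itself. A sketch of what I mean, assuming the synthesizer reads the rendering format from the audio unit's output bus:
import AVFAudio
import AudioToolbox

// Sketch only: declare the PCM format on the provider audio unit's output bus
// (22,050 Hz mono here), on the assumption that this is where the system
// reads the rendering format for a provider voice.
final class MyProviderAudioUnit: AVSpeechSynthesisProviderAudioUnit {
    private var outputBusArray: AUAudioUnitBusArray!

    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions) throws {
        try super.init(componentDescription: componentDescription, options: options)
        let format = AVAudioFormat(standardFormatWithSampleRate: 22_050, channels: 1)!
        let bus = try AUAudioUnitBus(format: format)
        outputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .output, busses: [bus])
    }

    override var outputBusses: AUAudioUnitBusArray { outputBusArray }
}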
Regards
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
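One avenue I'm exploring (a sketch only; I haven't confirmed it keeps braille output intact in every case): scoping the punctuation speech attribute to just the parenthesis characters through an attributed accessibility label in UIKit, so the rest of the expression keeps normal number handling:
import UIKit

// Sketch: attach the speech-punctuation attribute only to the parenthesis
// characters of an attributed accessibility label. The underlying text stays
// "(2+3)×4" for braille and the Characters rotor.
// Offsets assume single-UTF-16-unit characters, which holds for this string.
func mathAccessibilityLabel(for expression: String) -> NSAttributedString {
    let label = NSMutableAttributedString(string: expression)
    for (offset, character) in expression.enumerated() where character == "(" || character == ")" {
        label.addAttribute(.accessibilitySpeechPunctuation,
                           value: true,
                           range: NSRange(location: offset, length: 1))
    }
    return label
}

// Usage sketch on a UILabel:
// mathLabel.accessibilityAttributedLabel = mathAccessibilityLabel(for: "(2+3)×4")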
Is it possible to position windows on the floor by changing some setting? Currently, they cannot be placed on the floor due to drag limitations.
My team is designing an app for retail associates who need to share managed iPads. We keep the device in Guided Access mode on our login app until an auth token is obtained; then the iPad is opened for general use. Upon sign-out we need to re-enter Guided Access mode, and we can do this easily via manual sign-out. But with idle sign-out, i.e. after 60 minutes of inactivity, we need to be able to make a call from the background (even in a locked state) to sign out the user, clear the passcode, and enter Single App Mode before restarting, so that once the device restarts the app is in a locked state again until the next user provides credentials that can obtain a new auth token.
We are struggling to see if this is even possible. Our bosses will be displeased if we tell them it isn't. So anybody with any tips would be very appreciated.
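For completeness, the piece we do know about (though we have not confirmed it can be triggered from a background or locked state, which is the crux of the question): on a supervised device whose MDM profile allows Autonomous Single App Mode for our bundle ID, the app itself can ask to re-enter single app mode. A sketch of that call:
import UIKit

// Sketch: re-enter Single App Mode after idle sign-out. Requires a supervised
// device and an MDM payload that permits Autonomous Single App Mode for this
// bundle identifier; whether the request succeeds from the background is the
// open question.
func lockToLoginScreen(completion: @escaping (Bool) -> Void) {
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        completion(succeeded)
    }
}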
In our application we use a search bar inside a popover view, with the Full Keyboard Access accessibility setting enabled and an external keyboard. Now, when focus is on the search bar and the Tab key is pressed next, the search bar should dismiss and focus needs to shift to the next UI element.
I have a UIImageView as the background of a custom UIView subclass. The image itself does not contain any text. On top of this image view, I have added two UILabels.
To improve accessibility, I converted the entire view into a single accessibility element and set a proper accessibilityLabel. Additionally, I disabled accessibility for the UIImageView and the labels by setting isAccessibilityElement = false.
However, when VoiceOver Recognition's Text Recognition feature is enabled, VoiceOver still detects and announces the text inside the UILabels after reading my custom accessibility properties. This text should not be announced.
It seems that VoiceOver treats the UILabel content as part of the UIImageView. Additionally, when using the Explore Image rotor action, the entire subview is recognized as a single image.
Is this the expected behavior? If so, is there a way to disable VoiceOver’s text recognition for this view while keeping custom accessibility intact?
class BackgroundLabelView: UIView {
    private let backgroundImageView = UIImageView()
    private let backgroundImageView2 = UIImageView()
    private let titleLabel = UILabel()
    private let subtitleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        setupView()
        configureAccessibility()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setupView()
        configureAccessibility()
    }

    // Expose the whole view as a single accessibility element and hide the children.
    private func configureAccessibility() {
        backgroundImageView.isAccessibilityElement = false
        backgroundImageView2.isAccessibilityElement = false
        titleLabel.isAccessibilityElement = false
        subtitleLabel.isAccessibilityElement = false
        isAccessibilityElement = true
        accessibilityTraits = .button
    }

    func configure(backgroundImage: UIImage?, title: String, subtitle: String) {
        backgroundImageView.image = backgroundImage
        titleLabel.text = title
        subtitleLabel.text = subtitle
        accessibilityLabel = "Holiday Offer ," + title + "," + subtitle
    }

    private func setupView() {
        backgroundImageView2.contentMode = .scaleAspectFill
        backgroundImageView2.clipsToBounds = true
        backgroundImageView2.translatesAutoresizingMaskIntoConstraints = false
        backgroundImageView2.image = UIImage(resource: .bannerfestival)
        addSubview(backgroundImageView2)

        backgroundImageView.contentMode = .scaleAspectFit
        backgroundImageView.clipsToBounds = true
        backgroundImageView.translatesAutoresizingMaskIntoConstraints = false
        addSubview(backgroundImageView)

        titleLabel.font = UIFont.systemFont(ofSize: 18, weight: .bold)
        titleLabel.textColor = .white
        titleLabel.translatesAutoresizingMaskIntoConstraints = false
        titleLabel.numberOfLines = 0
        addSubview(titleLabel)

        subtitleLabel.font = UIFont.systemFont(ofSize: 14, weight: .regular)
        subtitleLabel.textColor = .white.withAlphaComponent(0.8)
        subtitleLabel.translatesAutoresizingMaskIntoConstraints = false
        subtitleLabel.numberOfLines = 0
        addSubview(subtitleLabel)

        NSLayoutConstraint.activate([
            backgroundImageView2.leadingAnchor.constraint(equalTo: leadingAnchor),
            backgroundImageView2.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView2.heightAnchor.constraint(equalToConstant: 200),
            backgroundImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
            backgroundImageView.topAnchor.constraint(equalTo: topAnchor),
            backgroundImageView.leadingAnchor.constraint(greaterThanOrEqualTo: leadingAnchor),
            backgroundImageView.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView.bottomAnchor.constraint(equalTo: bottomAnchor),
            titleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            titleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            titleLabel.bottomAnchor.constraint(equalTo: centerYAnchor, constant: -4),
            subtitleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            subtitleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            subtitleLabel.topAnchor.constraint(equalTo: centerYAnchor, constant: 4)
        ])
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        backgroundImageView.layer.cornerRadius = layer.cornerRadius
    }
}
In our application we are using a UIAlertController presented as a popover. When the Full Keyboard Access accessibility setting is enabled and we try to dismiss the alert with the Esc key on an external keyboard, nothing happens. We need to be able to dismiss the UIAlertController with an Esc key press from the external keyboard.
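A sketch of the direction we are trying (HostViewController is our own name; we have not confirmed that Full Keyboard Access routes Esc to the presenting controller): attaching a key command for the Escape key and dismissing the presented alert from there.
import UIKit

// Sketch: let the Esc key on an external keyboard dismiss a UIAlertController
// presented as a popover from this view controller.
final class HostViewController: UIViewController {
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(input: UIKeyCommand.inputEscape,
                      modifierFlags: [],
                      action: #selector(dismissPresentedAlert))]
    }

    @objc private func dismissPresentedAlert() {
        if presentedViewController is UIAlertController {
            dismiss(animated: true)
        }
    }
}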
While it is possible to scroll content using VoiceOver on macOS, I was not able to find any NSAccessibility APIs related to it (such as accessibilityScroll: on iOS).