With the announcement of the new Apple Translate app and offline support, are there any URL schemes that we can use in our apps to send text to the Translate app and have it open and translate the sent text?
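Something along these lines is what I'm hoping is possible (the scheme in this sketch is a pure guess, not a documented one):

import UIKit

// Hypothetical: "translate://" is a guess, not a documented scheme.
func sendToTranslate(_ text: String) {
    guard let encoded = text.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed),
          let url = URL(string: "translate://?text=\(encoded)") else { return }
    // canOpenURL requires the scheme to be listed under LSApplicationQueriesSchemes in Info.plist.
    if UIApplication.shared.canOpenURL(url) {
        UIApplication.shared.open(url)
    }
}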
In my CarPlay navigation app I need to capture voice input and process it into text. Is there a built-in way to do this in CarPlay?
I could not find one, so I used the following, but I am running into an issue where AVAudioSession throws an error when I try to set active to false after I have captured the audio.
public func startRecording(completionHandler: @escaping (_ completion: String?) -> ()) throws {
    // Cancel the previous task if it's running.
    if let recognitionTask = self.recognitionTask {
        recognitionTask.cancel()
        self.recognitionTask = nil
    }

    // Configure the audio session for the app.
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.record, mode: .default, options: [.duckOthers, .interruptSpokenAudioAndMixWithOthers])
    try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    let inputNode = self.audioEngine.inputNode

    // Create and configure the speech recognition request.
    self.recognitionRequest = SFSpeechAudioBufferRecognitionRequest()
    guard let recognitionRequest = self.recognitionRequest else { fatalError("Unable to create an SFSpeechAudioBufferRecognitionRequest object") }
    recognitionRequest.shouldReportPartialResults = true

    // Keep speech recognition data on device.
    recognitionRequest.requiresOnDeviceRecognition = true

    // Create a recognition task for the speech recognition session.
    // Keep a reference to the task so that it can be canceled.
    self.recognitionTask = self.speechRecognizer.recognitionTask(with: recognitionRequest) { result, error in
        var isFinal = false
        if let result = result {
            // Update the text view with the results.
            let textResult = result.bestTranscription.formattedString
            isFinal = result.isFinal
            let confidence = result.bestTranscription.segments[0].confidence
            if confidence > 0.0 {
                isFinal = true
                completionHandler(textResult)
            }
        }
        if error != nil || isFinal {
            // Stop recognizing speech if there is a problem.
            self.audioEngine.stop()
            do {
                try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
            } catch {
                print(error)
            }
            inputNode.removeTap(onBus: 0)
            self.recognitionRequest = nil
            self.recognitionTask = nil
            if error != nil {
                completionHandler(nil)
            }
        }
    }

    // Configure the microphone input.
    let recordingFormat = inputNode.outputFormat(forBus: 0)
    inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
        self.recognitionRequest?.append(buffer)
    }

    self.audioEngine.prepare()
    try self.audioEngine.start()
}
Again, is there a built-in method to capture audio in CarPlay? If not, why would AVAudioSession throw this error?
Error Domain=NSOSStatusErrorDomain Code=560030580 "Session deactivation failed" UserInfo={NSLocalizedDescription=Session deactivation failed}
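For what it's worth, here is the teardown order I'm experimenting with: it ends the request and removes the tap before deactivating the session (just a sketch, using the same properties as above, and not verified to avoid the error):

private func stopRecording() {
    // Tell the recognizer that no more audio is coming before tearing anything down.
    self.recognitionRequest?.endAudio()

    // Remove the tap and stop the engine so nothing is still pulling from the session.
    self.audioEngine.inputNode.removeTap(onBus: 0)
    self.audioEngine.stop()

    self.recognitionTask?.cancel()
    self.recognitionTask = nil
    self.recognitionRequest = nil

    // Deactivate only after all audio work has stopped.
    do {
        try AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
    } catch {
        print("Deactivation failed: \(error)")
    }
}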
I have a CarPlay navigation app and I would like to allow the user to speak an address and have our app search at that location.
The Waze app provides a button to tap; it brings up a CPVoiceControlTemplate, you can speak directions or a location, and it then shows search results with the text you spoke as the title. I assume that app has the same limitations I do, so I am wondering how another app might do this.
It was suggested that I use an App Intent with suggested phrases and then a Shortcut could perform the action. Is there documentation on this somewhere or am I going in the wrong direction here?
Obviously Waze is doing what I want, so there must be a way. Can anyone point me in the right direction?
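To make the question concrete, the flow I'm imagining looks roughly like this: present a CPVoiceControlTemplate while a speech recognizer runs, then dismiss it and search with the transcription (a sketch; startRecording is my speech-capture helper from an earlier post and searchAddress is a hypothetical entry point into our search):

func beginVoiceSearch(on interfaceController: CPInterfaceController) {
    let listeningState = CPVoiceControlState(identifier: "listening",
                                             titleVariants: ["Listening…"],
                                             image: nil,
                                             repeats: true)
    let voiceTemplate = CPVoiceControlTemplate(voiceControlStates: [listeningState])

    interfaceController.presentTemplate(voiceTemplate, animated: true) { _, _ in
        // Start capturing speech once the template is on screen.
        try? self.startRecording { transcription in
            interfaceController.dismissTemplate(animated: true) { _, _ in
                if let transcription = transcription {
                    self.searchAddress(transcription) // hypothetical
                }
            }
        }
    }
}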
I am building a CarPlay navigation app and I would like it to be as hands free as possible.
I need a better understanding of how Siri can work with CarPlay and whether the direction I need to go is App Intents or App Shortcuts.
My goal is to be able to have the user speak to Siri and do things like "open settings" or "zoom in map" and then call a func in my app to do what the user is asking.
Does CarPlay support this? Do I need to use App Intents or App Shortcuts or something else?
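This is roughly the shape I understand an App Intent plus App Shortcut would take, so Siri can trigger it by phrase (a sketch; CarPlayMapController.shared.zoomIn() is a hypothetical call into my app):

import AppIntents

struct ZoomInMapIntent: AppIntent {
    static var title: LocalizedStringResource = "Zoom In Map"

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hypothetical hook into the app's map controller.
        CarPlayMapController.shared.zoomIn()
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: ZoomInMapIntent(),
                    phrases: ["Zoom in the map in \(.applicationName)"],
                    shortTitle: "Zoom In",
                    systemImageName: "plus.magnifyingglass")
    }
}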
With CarPlay, is it possible to programmatically know which side of the screen the status bar is placed on?
I am having an issue with the code that I posted below. I capture voice in my CarPlay app, then allow the user to have it read back to them using AVSpeechUtterance.
This works fine in some cars, but many of my beta testers report no audio being played. I have also experienced this in a rental car, where the audio was either too quiet or didn't play at all.
Does anyone see any issue with the code that I posted? This is for CarPlay specifically.
class CarPlayTextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
    private var speechSynthesizer = AVSpeechSynthesizer()
    static let shared = CarPlayTextToSpeechService()

    /// Completion callback
    private var completionCallback: (() -> Void)?

    override init() {
        super.init()
        speechSynthesizer.delegate = self
    }

    func configureAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .voicePrompt, options: [.duckOthers, .interruptSpokenAudioAndMixWithOthers, .allowBluetoothHFP])
        } catch {
            print("Failed to set audio session category: \(error.localizedDescription)")
        }
    }

    public func speak(_ text: String, completion: (() -> Void)? = nil) {
        self.configureAudioSession()
        // Store the completion callback
        self.completionCallback = completion
        Task(priority: .high) {
            let speechUtterance = AVSpeechUtterance(string: text)
            // preferredLocalLanguageCountryCode is a custom Locale helper (not shown here).
            let langCode = Locale.preferredLocalLanguageCountryCode
            if langCode == "en-US" {
                speechUtterance.voice = AVSpeechSynthesisVoice(identifier: AVSpeechSynthesisVoiceIdentifierAlex)
            } else {
                speechUtterance.voice = AVSpeechSynthesisVoice(language: langCode)
            }
            try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
            speechSynthesizer.speak(speechUtterance)
        }
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
        Task {
            stopSpeech()
            try AVAudioSession.sharedInstance().setActive(false)
        }
        // Call completion callback if available
        self.completionCallback?()
        self.completionCallback = nil
    }

    func stopSpeech() {
        speechSynthesizer.stopSpeaking(at: .immediate)
    }
}
I have a driving task app and am trying to show a CPActionSheetTemplate or a CPAlertTemplate. Both of these crash with:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Unsupported object <CPActionSheetTemplate: 0x6000030319e0> <identifier: C744031B-99F6-4999-AF19-6ED43140502B, userInfo: (null), tabTitle: (null), tabImage: (null), showsTabBadge: 0> passed to pushTemplate:animated:completion:. Allowed classes: {(
CPSearchTemplate,
CPNowPlayingTemplate,
CPPointOfInterestTemplate,
CPListTemplate,
CPInformationTemplate,
CPContactTemplate,
CPGridTemplate,
CPMapTemplate
)}'
This is very strange, because according to the docs, all app types are allowed to show action sheets and alerts.
Why is this crashing?
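For reference, the call that crashes looks roughly like this (titles and actions simplified):

func showConfirmation(on interfaceController: CPInterfaceController) {
    let actionSheet = CPActionSheetTemplate(
        title: "Remove stop?",
        message: "This will recalculate your route.",
        actions: [
            CPAlertAction(title: "Remove", style: .destructive) { _ in },
            CPAlertAction(title: "Cancel", style: .cancel) { _ in }
        ])

    // This is the line that raises the exception above.
    interfaceController.pushTemplate(actionSheet, animated: true) { _, _ in }
}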
I am using a Swift PM module and adding it to a brand new project. This project is Objective-C based, but I would like to use the module within a .swift file as I am working to migrate part of my project to Swift. The .swift file is called from the Objective-C app delegate.
When I do this in a brand new project, it builds correctly.
However, when I add the module to my production app (has been in development for 10 years) in the same way, I get the following error in my MyApp-Swift.h file:
Cannot find interface declaration for 'MBNavigationViewController', superclass of 'CarPlayMapViewController'; did you mean 'UINavigationController'?
I have even created a stripped down version of my app with minimal files and it still does not build.
There must be some build setting or something else that allows the module to work in a brand new project (where the generated MyApp-Swift.h exposes the Swift classes to Obj-C) but not in my older project. What could it be?
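The shape of the code involved is roughly this (the module name is hypothetical; CarPlayMapViewController is the class named in the error):

import TheNavigationModule // hypothetical name for the Swift Package product

// The generated MyApp-Swift.h ends up referring to this class's superclass,
// MBNavigationViewController, which is declared in the package rather than in Obj-C.
class CarPlayMapViewController: MBNavigationViewController {
    // Called from the Objective-C app delegate.
    @objc func start() { }
}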
I am trying to use CPVoiceControlState and include an animated image, but for the life of me I cannot figure out what sort of image it wants. I have tried an animated .gif, an animated .png, a static image, and an image sequence, and none of them animate.
Here is what I am using with an animated .png:
func showLoadingTemplate() {
    enum VoiceControlStates: String {
        case loading = "loading"
    }

    let spinner = UIImage(named: "spinner")

    loadingTemplate = CPVoiceControlTemplate(voiceControlStates: [
        CPVoiceControlState(identifier: VoiceControlStates.loading.rawValue, titleVariants: [NSLocalizedString("Loading...", comment: "CarPlay: Loading")], image: spinner, repeats: true)
    ])
    loadingTemplate?.activateVoiceControlState(withIdentifier: VoiceControlStates.loading.rawValue)

    if let loading = loadingTemplate {
        currentInterfaceController?.presentTemplate(loading, animated: true, completion: { (result: Bool, error: Error?) in
        })
    }
}
This shows the image but it isn't animating.
Can anyone let me know what sort of image needs to be used in order to get it to animate? I have seen animated images working in Waze and Google Maps, so it must be possible.
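One thing I plan to try next is building the image from individual frames with UIImage.animatedImage(with:duration:) rather than loading an animated .png directly (a sketch, not yet verified; "spinner-0" through "spinner-7" are hypothetical asset names):

func makeSpinnerImage() -> UIImage? {
    // Build an animated UIImage from individual frame assets.
    let frames = (0...7).compactMap { UIImage(named: "spinner-\($0)") }
    guard !frames.isEmpty else { return nil }
    return UIImage.animatedImage(with: frames, duration: 1.0)
}

// Then pass the result to the voice control state as before:
// CPVoiceControlState(identifier: ..., titleVariants: ..., image: makeSpinnerImage(), repeats: true)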
I am wondering how I change the measurement units on screen in my CPMapTemplate. In my screenshot below the distance is in miles, but how can I change that to kilometers?
Does this need to come from my route data?
I am not seeing this anywhere in the CarPlay programming guide or in the documentation.
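For context, here is a sketch of supplying the estimate in kilometers explicitly, assuming the template honors the unit of the Measurement you pass, which I haven't confirmed (mapTemplate and trip are my existing objects):

// Whether CPMapTemplate displays the unit as given or converts it per locale
// is exactly what I'm unsure about.
let distance = Measurement(value: 2.4, unit: UnitLength.kilometers)
let estimates = CPTravelEstimates(distanceRemaining: distance, timeRemaining: 180)
mapTemplate.update(estimates, for: trip, with: .green)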
I am using CPNavigationAlert and receive a specific name for the title, so I do not have more than one variant. Sometimes the title variant is longer, which for some reason makes the image very small.
I have tried to make sure to keep displayScale in mind:
let maximumImageSize = CPListItem.maximumImageSize
let displayScale = self.interfaceController?.carTraitCollection.displayScale ?? 2
let imageSize = CGSizeMake(maximumImageSize.width * displayScale, maximumImageSize.width * displayScale)
let image = CarPlayMapUtility().getIconAlertImage(item: item, frame: imageSize)
If the title variant is shorter, the image is displayed correctly. If it is longer, the image might be extremely small or not shown at all. Is this expected?
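The alert itself is built roughly like this (simplified; image comes from the code above and mapTemplate is my existing map template):

let alert = CPNavigationAlert(
    titleVariants: ["Speed camera reported ahead on Main Street"], // single, sometimes long, variant
    subtitleVariants: [],
    image: image,
    primaryAction: CPAlertAction(title: "Dismiss", style: .cancel) { _ in },
    secondaryAction: nil,
    duration: 10)
mapTemplate.present(navigationAlert: alert, animated: true)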
I have a UINavigationController whose second view controller contains a Google Maps view. In iOS 18 you could navigate around the map normally, and swiping your finger from the left side of the screen simply moved the map; this worked correctly.
However, in iOS 26, swiping from left to right anywhere but the far right side of the screen causes the UI to try to pop the view controller.
This does not happen when I add Apple Maps or other map frameworks to the VC.
Is there a way to disable the "swipe from middle" in iOS 26?
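The only related knob I know of is the classic edge pop gesture, and I'm not sure it covers the new full-width swipe (sketch):

// Disable the standard interactive pop gesture while the map screen is visible.
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    navigationController?.interactivePopGestureRecognizer?.isEnabled = false
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    navigationController?.interactivePopGestureRecognizer?.isEnabled = true
}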
In my app, I have a mode that rotates the MKMapView camera heading based on Core Location heading updates. This works great.
[UIView animateWithDuration:0.5 delay:0 options:UIViewAnimationOptionAllowUserInteraction animations:^{
    self.mapView.camera.heading = theHeading; // Comes from CLHeading
} completion:nil];
At the same time, I have two annotations, one is the current location, that I always want to keep on screen. So as the user gets closer to one of the annotations, the map will continue to zoom in.
MKMapRect flyTo = MKMapRectNull;
if (location) {
    MKMapPoint annotationPoint = MKMapPointForCoordinate(location.coordinate);
    MKMapRect pointRect = MKMapRectMake(annotationPoint.x, annotationPoint.y, 0, 0);
    if (MKMapRectIsNull(flyTo)) {
        flyTo = pointRect;
    } else {
        flyTo = MKMapRectUnion(flyTo, pointRect);
    }
}
if (CLLocationCoordinate2DIsValid(self.selectedCoordinate)) {
    MKMapPoint annotationPoint = MKMapPointForCoordinate(self.selectedCoordinate);
    MKMapRect pointRect = MKMapRectMake(annotationPoint.x, annotationPoint.y, 0, 0);
    if (MKMapRectIsNull(flyTo)) {
        flyTo = pointRect;
    } else {
        flyTo = MKMapRectUnion(flyTo, pointRect);
    }
}
[self.mapView setVisibleMapRect:flyTo edgePadding:UIEdgeInsetsMake(30, 30, 30, 30) animated:animated];
The Problem
The big issue here is that as the user rotates the device and the heading updates, the visibleMapRect is also updated to keep both annotations on the map. This causes the map to animate back to north-up and then quickly back to the rotated camera heading, which creates a very jerky movement that is not what I want.
Is it possible to set the visibleMapRect and the camera heading at the same time without this negative side effect?
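One direction I'm considering is driving both the framing and the rotation through a single MKMapCamera update instead of animating heading and visibleMapRect separately. A rough sketch, in Swift for brevity (the distance calculation is only an approximation of the edge padding I use today):

func updateCamera(on mapView: MKMapView,
                  userCoordinate: CLLocationCoordinate2D,
                  selectedCoordinate: CLLocationCoordinate2D,
                  heading: CLLocationDirection) {
    // Midpoint between the two annotations (rough, fine for nearby points).
    let center = CLLocationCoordinate2D(
        latitude: (userCoordinate.latitude + selectedCoordinate.latitude) / 2,
        longitude: (userCoordinate.longitude + selectedCoordinate.longitude) / 2)

    // Camera distance proportional to the separation, clamped to a minimum.
    let separation = CLLocation(latitude: userCoordinate.latitude, longitude: userCoordinate.longitude)
        .distance(from: CLLocation(latitude: selectedCoordinate.latitude, longitude: selectedCoordinate.longitude))
    let distance = max(separation * 2.0, 250)

    let camera = MKMapCamera(lookingAtCenter: center,
                             fromDistance: distance,
                             pitch: 0,
                             heading: heading)
    mapView.setCamera(camera, animated: true)
}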
In our iOS app we use an MKTileOverlay subclass and set tileSize to CGSizeMake(512, 512). This allows tiled maps such as OpenStreetMap to display larger for users who need larger font sizes.
Since iOS 15 this no longer works: the tiles are no longer rendered large but appear small, similar to how it looked before we added this feature.
Once tileSize is set, we implement loadTileAtPath:(MKTileOverlayPath)path result:(void (^)(NSData *, NSError *))result to return the larger image NSData via the result block.
Did something change in iOS 15 with MKTileOverlay and tileSize that would require changes on our end?
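For reference, the subclass is set up roughly like this (shown in Swift for brevity, with the tile fetching simplified):

class LargeTileOverlay: MKTileOverlay {
    override init(urlTemplate: String?) {
        super.init(urlTemplate: urlTemplate)
        // Request 512x512 tiles so map labels render larger.
        tileSize = CGSize(width: 512, height: 512)
    }

    override func loadTile(at path: MKTileOverlayPath,
                           result: @escaping (Data?, Error?) -> Void) {
        // Fetch the tile image and hand back its data via the result block.
        URLSession.shared.dataTask(with: url(forTilePath: path)) { data, _, error in
            result(data, error)
        }.resume()
    }
}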
In the past, it seemed that if you killed the app from the app switcher, state restoration data would not persist, and starting the app again would not load any stored state. But now (at least in iOS 17) that is no longer the case.
There are situations where the old state might cause issues, which we obviously need to fix, but users should be able to clear the state data.
I am not using sessions and am using the older methods - application:shouldSaveApplicationState: and application:shouldRestoreApplicationState:.
My question is, how can I tell my users to reset/clear state restoration data if needed?
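These are the delegate methods in use, along with the only reset lever I can think of: gating restoration on a flag a user could toggle (a sketch; the UserDefaults key is hypothetical):

func application(_ application: UIApplication,
                 shouldSaveApplicationState coder: NSCoder) -> Bool {
    return true
}

func application(_ application: UIApplication,
                 shouldRestoreApplicationState coder: NSCoder) -> Bool {
    // If the user asked to clear saved state, skip restoring it on this launch.
    if UserDefaults.standard.bool(forKey: "resetStateRestoration") {
        UserDefaults.standard.set(false, forKey: "resetStateRestoration")
        return false
    }
    return true
}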