Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Question regarding CarPlay Integration for a Note/Voice Recording App
Hello everyone,

I am currently working on an app project aimed at users who want to quickly and easily capture their ideas and notes while on the go. The basic concept is to develop an iOS app where users can store both typed notes and voice recordings – essentially a "brain dump" solution. The core functionality (storing, editing, synchronizing via CloudKit, etc.) will be handled within the iOS app. In addition, I plan to integrate a CarPlay extension that allows the driver to start and stop a recording – ideally through a minimalist interface featuring a large record button and a "Done" button. Since the iPhone is often not within immediate reach in the car, the CarPlay integration should serve as a quick trigger to initiate the recording in the iOS app.

My questions are as follows:

- Has anyone had experience implementing a CarPlay extension for an app that primarily handles notes and voice recordings, rather than falling into the traditional categories like navigation, audio, or communication?
- Has such a concept ever been approved by Apple, or are there known hurdles and guidelines that must be observed?
- Are there alternative approaches to implementing CarPlay integration in this context in a compliant and effective manner?

I would greatly appreciate any feedback, shared experiences, and tips on best practices. Thank you in advance and best regards!
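For reference, if the app did obtain a CarPlay entitlement, the minimalist screen described above could be sketched with a grid template roughly as follows. The scene delegate, the Recorder type, and the assumption that such an app category would pass App Review are all illustrative, not confirmed:

    import CarPlay
    import UIKit

    // Stand-in for the app's own recording controller (assumption, not a real API).
    final class Recorder {
        static let shared = Recorder()
        func start() { /* begin audio capture in the iOS app */ }
        func stop()  { /* finish and save the recording */ }
    }

    // Hypothetical CarPlay scene delegate showing one large "Record" button and a "Done" button.
    final class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
        var interfaceController: CPInterfaceController?

        func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                      didConnect interfaceController: CPInterfaceController) {
            self.interfaceController = interfaceController

            let record = CPGridButton(titleVariants: ["Record"],
                                      image: UIImage(systemName: "mic.circle.fill")!) { _ in
                Recorder.shared.start()
            }
            let done = CPGridButton(titleVariants: ["Done"],
                                    image: UIImage(systemName: "stop.circle")!) { _ in
                Recorder.shared.stop()
            }
            let template = CPGridTemplate(title: "Brain Dump", gridButtons: [record, done])
            interfaceController.setRootTemplate(template, animated: true, completion: nil)
        }
    }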
Replies: 1 · Boosts: 0 · Views: 464 · Activity: Feb ’25
Macro mode in AVCaptureDevice (custom camera)
Hi, I would like to use macro mode for the custom camera using AVCaptureDevice in my project. This feature might help to automatically adjust and switch between lenses to get a close-up, clear image. It looks like this feature is not available and there are no open APIs to achieve macro mode from Apple. Is there a way to get this functionality in the custom camera without losing image quality? Please let me know if this is possible. Thank you, Adil Thamarasseri
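There is no public "macro mode" switch, but close-focus lens switching can be approximated by capturing from a virtual device and letting the system choose the constituent camera. A minimal sketch using the iOS 15+ constituent-device APIs, assuming hardware with the dual-wide or triple camera; whether this matches the system Camera app's macro quality is exactly the open question here:

    import AVFoundation

    // Sketch: prefer a virtual device so the system may switch to the ultra-wide
    // camera at close focus distances (macro-like behavior).
    func makeBackCameraInput() throws -> AVCaptureDeviceInput? {
        let discovery = AVCaptureDevice.DiscoverySession(
            deviceTypes: [.builtInTripleCamera, .builtInDualWideCamera, .builtInWideAngleCamera],
            mediaType: .video,
            position: .back)
        guard let device = discovery.devices.first else { return nil }

        try device.lockForConfiguration()
        // Allow automatic switching between constituent cameras (iOS 15+).
        if device.primaryConstituentDeviceSwitchingBehavior != .unsupported {
            device.setPrimaryConstituentDeviceSwitchingBehavior(.auto,
                                                                restrictedSwitchingBehaviorConditions: [])
        }
        device.unlockForConfiguration()

        return try AVCaptureDeviceInput(device: device)
    }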
Replies: 7 · Boosts: 1 · Views: 1.3k · Activity: Feb ’25
AVURLAsset with AVURLAssetHTTPCookiesKey - Cookies not persisting on retry requests
I'm experiencing an unexpected behavior with AVURLAsset and cookies. When setting cookies through the AVURLAssetHTTPCookiesKey option, they seem to be sent only on the initial request but not on retry attempts. Here's my current implementation:

    let cookieProperties: [HTTPCookiePropertyKey: Any] = [
        .name: "sessionCookie",
        .value: "testValue",
        .domain: url.host ?? "",
        .path: "/",
        .secure: true
    ]
    if let cookie = HTTPCookie(properties: cookieProperties) {
        let asset = AVURLAsset(url: url, options: [
            AVURLAssetHTTPCookiesKey: [cookie],
        ])
    }

According to the documentation, AVURLAssetHTTPCookiesKey should apply the cookies to all requests made by this asset. However, when the initial request fails and AVPlayer retries, the cookies are not included in subsequent requests. Only when I store the cookie with HTTPCookieStorage.shared.setCookie does it persist.

Questions:
- Is this the expected behavior?
- If not, what could be causing the cookies to not persist for retry attempts?
- Is using HTTPCookieStorage.shared the recommended approach instead?

Environment: iOS 16+, using AVPlayer with AVURLAsset, streaming HLS content.

Any insights would be greatly appreciated.
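For comparison, the shared-cookie-store workaround mentioned above can be sketched like this. It only restates the approach the poster observed to persist across retries; it is not confirmed as the officially recommended path:

    import AVFoundation
    import Foundation

    // Sketch: keep the per-asset option and also register the cookie globally,
    // so URL loading done on the asset's behalf (including retries) can see it.
    func makeAuthenticatedAsset(for url: URL) -> AVURLAsset {
        let properties: [HTTPCookiePropertyKey: Any] = [
            .name: "sessionCookie",     // values taken from the post above
            .value: "testValue",
            .domain: url.host ?? "",
            .path: "/",
            .secure: true
        ]

        var options: [String: Any] = [:]
        if let cookie = HTTPCookie(properties: properties) {
            options[AVURLAssetHTTPCookiesKey] = [cookie]
            HTTPCookieStorage.shared.setCookie(cookie)
        }
        return AVURLAsset(url: url, options: options)
    }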
Replies: 0 · Boosts: 0 · Views: 347 · Activity: Feb ’25
How can I add support for Apple Music lyrics sharing in my app?
I noticed that Instagram and iMessage support receiving shared lyrics from Apple Music. Specifically, when users long-press lyrics, a sheet pops up showing iMessage and Instagram. Clicking on either app generates a beautifully formatted lyrics image. I've looked through MusicKit documentation but couldn't find any related APIs. How can I implement this functionality in my app?
Replies: 1 · Boosts: 1 · Views: 429 · Activity: Feb ’25
ShazamKit with AirPods
Hi guys, I'm using ShazamKit in my iOS app and successfully capturing the currently playing track details when using the device's (iPhone) built-in mic. When I test with AirPods, though, my app cannot both send the output through the AirPods and capture that same output with the AirPods mic for ShazamKit recognition. I believe this must be possible, because the ShazamKit widget on iOS can do this. Is it restricted in some way for third-party apps? If not, I'd appreciate some guidance on how to achieve this in Swift code. Thanks in advance.
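For context, the capture path that already works with the built-in mic presumably looks something like the sketch below (an input-node tap feeding SHSession); the open question is purely about routing when AirPods are both the output and the microphone:

    import AVFoundation
    import ShazamKit

    final class Matcher: NSObject, SHSessionDelegate {
        private let session = SHSession()
        private let engine = AVAudioEngine()

        func start() throws {
            session.delegate = self
            let input = engine.inputNode
            let format = input.outputFormat(forBus: 0)
            // Feed microphone buffers straight into ShazamKit.
            input.installTap(onBus: 0, bufferSize: 2048, format: format) { [weak self] buffer, time in
                self?.session.matchStreamingBuffer(buffer, at: time)
            }
            try engine.start()
        }

        func session(_ session: SHSession, didFind match: SHMatch) {
            print("Matched:", match.mediaItems.first?.title ?? "unknown")
        }

        func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
            print("No match:", error?.localizedDescription ?? "none")
        }
    }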
Replies: 1 · Boosts: 1 · Views: 571 · Activity: Feb ’25
AVPlayer live playback pause is not working on tvOS 18
In our Apple TV application, we use the native AVPlayer for live playback functionality. Until tvOS 17.6 and during the tvOS 18 beta, the Pause/Resume feature worked as expected, allowing us to pause live playback. However, after updating to tvOS 18.1, the pause functionality no longer works. The same app still works fine on tvOS 17, but on tvOS 18, attempting to pause live playback has no effect. We reviewed the tvOS 18 release notes but couldn't find any relevant changes or deprecations related to AVPlayer or live playback behavior. Has there been any change in the handling of live playback or the Pause/Resume functionality in tvOS 18.1? Any guidance or suggestions to address this issue would be greatly appreciated. Thank you!
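One diagnostic worth capturing before filing a bug (an assumption, not a confirmed tvOS 18 change) is whether the live item still advertises a seekable window when pause stops working, since pausing a live stream depends on that window:

    import AVFoundation

    // Sketch: log the DVR window of the current live item.
    func logSeekableWindow(of player: AVPlayer) {
        guard let item = player.currentItem else { return }
        let ranges = item.seekableTimeRanges.map { $0.timeRangeValue }
        let window = ranges.map { CMTimeGetSeconds($0.duration) }.reduce(0, +)
        print("Seekable window: \(window) s, timeControlStatus: \(player.timeControlStatus.rawValue)")
    }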
Replies: 6 · Boosts: 9 · Views: 780 · Activity: Feb ’25
Audio stops in an ongoing AVCaptureSession video recording when a Push To Talk call initializes its audio session
We have a Push To Talk application which allows users to record video and audio. When a user is recording a video using AVCaptureSession and receives a Push To Talk call, from the moment the call is received the audio in the video being captured stops, while the video capture is still in progress. After the PTT call is completed, we have tried restarting the audio session; no errors are printed, but the audio does not restart in the video capture. We have also tried adding a new input to the AVCaptureSession, but we receive an error that results in the video capture stopping:

    [OS-PLT] [CameraManager] Movie file finished with error: Error Domain=AVFoundationErrorDomain Code=-11818 "Recording Stopped"
    UserInfo={AVErrorRecordingSuccessfullyFinishedKey=true, NSLocalizedDescription=Recording Stopped,
    NSLocalizedRecoverySuggestion=Stop any other actions using the recording device and try again.,
    AVErrorRecordingFailureDomainKey=1, NSUnderlyingError=0x3026bff60 {Error Domain=NSOSStatusErrorDomain Code=-16414 "(null)"}}, success

We have also raised a Feedback ticket on this: https://feedbackassistant.apple.com/feedback/16050598
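It may help to confirm whether the capture session reports an interruption when the PTT call takes over the audio hardware. A small sketch of the relevant observers, offered as a diagnostic only, not a confirmed fix:

    import AVFoundation

    // Sketch: observe capture-session interruptions around the PTT call.
    func observeInterruptions(of session: AVCaptureSession) -> [NSObjectProtocol] {
        let center = NotificationCenter.default
        let began = center.addObserver(forName: AVCaptureSession.wasInterruptedNotification,
                                       object: session, queue: .main) { note in
            let reason = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int
            print("Capture session interrupted, reason:", reason ?? -1)
        }
        let ended = center.addObserver(forName: AVCaptureSession.interruptionEndedNotification,
                                       object: session, queue: .main) { _ in
            print("Capture session interruption ended; check the audio connection or restart recording here.")
        }
        return [began, ended]   // keep these tokens alive as long as the observers are needed
    }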
Replies: 1 · Boosts: 0 · Views: 583 · Activity: Feb ’25
AVAudioEngine Stop Method
Hi all! I have been experiencing some issues when using AVAudioEngine to play audio and record input while doing a voice chat (through the PTT interface). I noticed that if I connect any players to the audio graph OR call start, the audio session becomes active (this is on iOS). I don't see anything in the docs or the header files in AVFoundation, but is it possible that calling the stop method on an engine deactivates the audio session too? In a normal app this behavior seems logical, but when using PTT all activation and deactivation of the audio session must go through the framework and its delegate methods. The issue I am debugging is that when the engine with the input node tapped gets stopped, and there is a gap between the input and when the server replies with inbound audio to be played, something seems to be getting the hardware/audio session into a jammed state. Thanks for any feedback and/or confirmation on this behavior!
Replies: 2 · Boosts: 0 · Views: 638 · Activity: Feb ’25
PTTFramework w/ AVAudioSession
Hi all, I have spent a lot of time reading the tech note and watching the WWDC video that introduce the PTTFramework on iOS. I currently have a custom setup where I am using AVAudioEngine to schedule and play buffers that are being streamed through a call. I am looking to use the PTTFramework to allow a user to trigger this push-to-talk behavior from the lock screen and the various places within the system UI that it provides.

However, I am unsure what the correct behavior is regarding the handling of the audio session. Right now I am using .playback when there is no active voice transmission, so that devices such as AirPods can be in A2DP mode where applicable, and then transitioning to the .playAndRecord category only when the mic input should become active. Following this change in my AVAudioEngine manager, I am then manually activating and deactivating the audio session when the engine is either playing/recording or idle. The documentation states that you should not attempt to activate or deactivate your audio session directly, but allow the framework to handle it. Does that mean that I need to either call the request-to-transmit delegate function or set an active participant on the channel manager first, and then wait for the didBecomeActive delegate method to trigger before I actually attempt to play or record any audio? (I am using the fullDuplex mode currently.) I noticed that the delegate method will only trigger if the audio session wasn't active before doing one of the above (setting an active participant, requesting transmit).

Lastly, the PTTFramework documentation mentions that we get support for PTT devices, and I notice that the didBeginTransmittingFrom delegate callback has a handsfreeButton case. Is there any documentation or resources for what is actually supported out of the box for this? I am currently handling a lot of the push-to-talk behavior over Bluetooth LE, and wanted to make sure there wasn't overlap with what the system provides. Thank you!
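A minimal sketch of the ordering described above, assuming the PushToTalk framework's activation callbacks drive the engine (the class below is illustrative, and the remaining required PTChannelManagerDelegate methods are omitted from the sketch):

    import PushToTalk
    import AVFAudio

    final class VoiceController: NSObject {
        let engine = AVAudioEngine()

        // Called by the PushToTalk framework once it has activated the audio session
        // (via PTChannelManagerDelegate conformance, other methods omitted here).
        func channelManager(_ channelManager: PTChannelManager, didActivate audioSession: AVAudioSession) {
            try? engine.start()     // only start playing/recording once the framework says so
        }

        func channelManager(_ channelManager: PTChannelManager, didDeactivate audioSession: AVAudioSession) {
            engine.stop()           // stop the engine; do not call setActive(false) yourself
        }
    }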
Replies: 2 · Boosts: 0 · Views: 597 · Activity: Feb ’25
MPRemoteCommandCenter not updating play/pause button to proper state on iOS
So I'm using AVAudioEngine. When playing audio I become the 'now playing' app using the MPNowPlayingInfoCenter/MPRemoteCommandCenter APIs. When configuring MPRemoteCommandCenter I add a play/pause command target via -addTargetWithHandler on the togglePlayPauseCommand property.

Now I also have a play/pause button in my app's UI. When I pause playback from my app's UI (which means I'm the active app, I'm in the foreground), what I do is this: I pause the AVAudioPlayerNode I'm using with AVAudioEngine. I do not stop, reset, etc. the AVAudioEngine; I only pause the player node. My thought process is that the user just pressed pause and it is very likely that they will hit 'play' to resume playback in the near future, because my app is in the foreground and the user just hit the pause button. Now if my app moves to the background and I receive a memory warning, I presume it would make sense to tear down the engine or pause it. Perhaps I'm wrong about this?

When I initially hit the play button from my app's UI I also activate my AVAudioSession. I do this in a high-priority NSOperation, since the documentation warns that "we recommend that applications not activate their session from a thread where a long blocking operation will be problematic."

So now I'm playing and I hit pause from my app's UI. Then I quickly bring up the "Now Playing" center and I see I'm the "Now Playing" app, but the play/pause button is showing the pause icon instead of the play icon even though I'm in the paused state. I do set MPNowPlayingInfoCenter's playbackState to MPNowPlayingPlaybackStatePaused when I pause. Not surprisingly, this doesn't work; the documentation states it is for macOS only.

So the only way to get MPRemoteCommandCenter to show the "play" image for the play/pause button is to deactivate my AVAudioSession when I pause playback? Since I change the active state of my audio session in an NSOperation (because the documentation recommends not activating the session from a thread where a long blocking operation would be problematic), the play/pause toggle in the remote command center won't immediately update, since I'm doing it on another thread. IMO it feels inappropriate for a play/pause button to wait on an NSOperation activating the audio session before updating its UI when I already know my play/paused state; it should update right away like the button in my app does. Wouldn't it be nicer to just use MPNowPlayingInfoCenter's playbackState property on iOS too? If I'm no longer the now playing app/active audio session it doesn't matter, since I'm not in the now playing UI; just ignore it.

Also, is it recommended that I deactivate my audio session explicitly every time the user pauses audio in my app (when I'm in the foreground)? When I do deactivate the audio session I get an error: AVAudioSessionErrorCodeIsBusy (but the button in the Now Playing center updates to the proper image). I do this:

    - (void)pause {
        [self.playerNode pause];
        [self runOperationToDeactivateAudioSession];
        // This does nothing on iOS:
        MPNowPlayingInfoCenter *nowPlayingCenter = [MPNowPlayingInfoCenter defaultCenter];
        nowPlayingCenter.playbackState = MPNowPlayingPlaybackStatePaused;
    }

So in -runOperationToDeactivateAudioSession I get the AVAudioSessionErrorCodeIsBusy. According to the documentation: "Starting in iOS 8, if the session has running I/Os at the time that deactivation is requested, the session will be deactivated, but the method will return NO and populate the NSError with the code property set to AVAudioSessionErrorCodeIsBusy to indicate the misuse of the API." So pausing the player node isn't enough to meet the deactivation criteria; I guess I have to pause or stop the audio engine. I could probably wait until I receive a scene-went-to-background notification or something before deactivating my audio session (which is async, so the button may not update to the correct image in time).

This seems like a lot of code to have to write to get a play/pause toggle to update, especially in an iPad multi-window scene environment. What's the recommended approach? Should I always pause the AudioEngine instead of the player node? Should I always explicitly deactivate my audio session when the user pauses playback from my app's UI even if I'm in the foreground? I personally like the idea of just being able to set [MPNowPlayingInfoCenter defaultCenter].playbackState = MPNowPlayingPlaybackStatePaused; but maybe that's because that would just make things easier on me. This does feel overcomplicated, though. If anyone can share some tips on how I should handle this, I'd appreciate it.
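One pattern that is often suggested for this situation (offered as an assumption, not confirmed as Apple's recommended approach here) is to drive the system play/pause state on iOS through the now-playing info's playback rate rather than through session deactivation: republish MPNowPlayingInfoPropertyPlaybackRate as 0 together with the current elapsed time whenever playback pauses. In Swift, for brevity:

    import MediaPlayer

    // Sketch: reflect the paused state by republishing now-playing info with rate 0.
    func publishPlaybackState(paused: Bool, elapsed: TimeInterval) {
        let center = MPNowPlayingInfoCenter.default()
        var info = center.nowPlayingInfo ?? [:]
        info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsed
        info[MPNowPlayingInfoPropertyPlaybackRate] = paused ? 0.0 : 1.0
        center.nowPlayingInfo = info
    }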
Replies: 4 · Boosts: 0 · Views: 688 · Activity: Feb ’25
[MusicKit] Check for availability of songs
Songs can be unavailable (greyed out) in Apple Music. How can I check if a song is unavailable via the MusicKit framework? Obviously the playback will fail with MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play", but how can I know that in advance? I need to check the availability of hundreds of albums, and therefore initiating a playback for each of them is not an option.

Things I have tried:
- Checking if the release date property is set to a future date. This filters out all future releases but doesn't solve the problem for already released songs.
- Checking if the duration is 0. This does not work, since the duration of unavailable songs does not have to be 0.
- Initiating a playback and checking for the "Failed to prepare to play" error. This is not suitable for a huge number of albums.

I couldn't find a solution yet, but somehow other third-party apps are able to ignore/not show these albums. I believe the Apple Music app only displays albums where at least one song is available. I am using this function to fetch all albums of an artist:

    private func fetchAlbumsFor(_ artist: Artist) async throws -> [Album] {
        let artistWithAlbums = try await artist.with(.albums)
        var allAlbums = [Album]()
        guard var currentBatch = artistWithAlbums.albums else { return [] }
        allAlbums.append(contentsOf: currentBatch)
        while currentBatch.hasNextBatch {
            if let nextBatch = try await currentBatch.nextBatch() {
                currentBatch = nextBatch
                allAlbums.append(contentsOf: nextBatch)
            } else {
                break
            }
        }
        return allAlbums
    }

Here is an example album where I am unable to detect its unavailability (at least in Germany): https://music.apple.com/de/album/die-haferhorde-immer-den-n%C3%BCstern-nach-h%C3%B6rspiel-zu-band-3/1755774804 Furthermore, I was unable to navigate to this album via the Apple Music app directly.

Thanks for any help.

Edit: Apparently this album is not included in an Apple Music subscription but can be bought separately. The question remains: how can I check that?
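One heuristic worth trying (hedged; not verified against the linked album) is that MusicKit items without playParameters are generally not playable with a subscription, so availability could be approximated by loading an album's tracks and checking whether any of them carries play parameters:

    import MusicKit

    // Sketch: treat an album as "available" if at least one track has play parameters.
    func isProbablyPlayable(_ album: Album) async throws -> Bool {
        let detailed = try await album.with(.tracks)
        guard let tracks = detailed.tracks else { return false }
        return tracks.contains { track in
            if case .song(let song) = track { return song.playParameters != nil }
            return false
        }
    }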
Replies: 1 · Boosts: 0 · Views: 573 · Activity: Feb ’25
Help for a plugin audio unit
Hello all, it seems that it's "very easy" (😬) to implement a little Swift code inside the prepared AU using Xcode 16.2 on Sequoia 15.1.1 and a Mac Studio M1 Ultra, but my issue is that I finally don't know... where. The documentation says that I have to find the AudioUnitViewController.swift file and then modify the render block:

    audioUnit.renderBlock = { (numFrames, ioData) in
        // Process audio here
    }

in the automatically generated Xcode project, but I didn't find such a file... If somebody can show me which file needs to be modified, I'll be very grateful! Thank you very much. J
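In the Audio Unit Extension template generated by recent Xcode versions, the per-sample processing normally lives in the AUAudioUnit subclass (or its DSP kernel) rather than in the view controller. The sketch below shows the general shape of an internalRenderBlock override; the class name and pass-through behavior are assumptions for illustration, not the template's exact contents:

    import AudioToolbox

    // Sketch: a minimal render block on an AUAudioUnit subclass that pulls input
    // into the output buffers (pass-through); real DSP would modify outputData here.
    final class MyEffectAudioUnit: AUAudioUnit {
        override var internalRenderBlock: AUInternalRenderBlock {
            { actionFlags, timestamp, frameCount, outputBusNumber, outputData, realtimeEventListHead, pullInputBlock in
                guard let pullInput = pullInputBlock else { return kAudioUnitErr_NoConnection }
                var pullFlags = AudioUnitRenderActionFlags()
                let status = pullInput(&pullFlags, timestamp, frameCount, 0, outputData)
                // Process the samples in outputData here (gain, filtering, etc.).
                return status
            }
        }
    }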
Replies: 1 · Boosts: 0 · Views: 440 · Activity: Feb ’25
Why is the volume very low when using the real-time recording and playback feature with AEC?
I've been researching how to achieve a simultaneous record-and-playback effect in iOS similar to the hands-free (speakerphone) calling behavior in the system Phone app. How can this be implemented? I tried using the voice chat recording method, but found that the volume of the speaker output is too low. How should this issue be addressed? I couldn't find a suitable API. Could you provide me with some documentation or sample code? Thank you.
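For reference, the usual speakerphone-style configuration (voice processing/AEC active, output forced to the built-in loudspeaker) looks roughly like the sketch below; whether it resolves the low-volume symptom in this particular setup is not certain:

    import AVFAudio

    // Sketch: voice-chat session routed to the loud (bottom) speaker.
    func configureSpeakerphoneSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord,
                                mode: .voiceChat,                 // enables echo cancellation
                                options: [.defaultToSpeaker, .allowBluetooth])
        try session.setActive(true)
        // If the route still ends up on the quiet receiver, force the speaker:
        try session.overrideOutputAudioPort(.speaker)
    }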
Replies: 1 · Boosts: 0 · Views: 418 · Activity: Feb ’25
Populating Now Playing with Objective-C
Hello. I am attempting to display the music inside of my app in Now Playing. I've tried a few different methods and keep running into unknown issues. I'm new to Objective-C and Apple development, so I'm at a loss for how to continue. Currently, I have an external call to viewDidLoad upon initialization. Then, when I'm ready to play the music, I call playMusic. I have it hardcoded to play an mp3 called "1". I believe I have all the signing set up, as the music keeps playing after I exit the app. However, there is nothing in Now Playing. There are no errors or issues that I can see while the app is running. This is the only file I have in Xcode relating to this feature. Please let me know where I'm going wrong or if there is another object I need to use!

    #import <Foundation/Foundation.h>
    #import <UIKit/UIKit.h>
    #import <MediaPlayer/MediaPlayer.h>
    #import <AVFoundation/AVFoundation.h>

    @interface ViewController : UIViewController <AVAudioPlayerDelegate>
    @property (nonatomic, strong) AVPlayer *player;
    @property (nonatomic, strong) MPRemoteCommandCenter *commandCenter;
    @property (nonatomic, strong) MPMusicPlayerController *controller;
    @property (nonatomic, strong) MPNowPlayingSession *nowPlayingSession;
    @end

    @implementation ViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        NSLog(@"viewDidLoad started.");
        [self setupAudioSession];
        [self initializePlayer];
        [self createNowPlayingSession];
        [self configureNowPlayingInfo];
        NSLog(@"viewDidLoad completed.");
    }

    - (void)setupAudioSession {
        AVAudioSession *audioSession = [AVAudioSession sharedInstance];
        NSError *setCategoryError = nil;
        if (![audioSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError]) {
            NSLog(@"Error setting category: %@", [setCategoryError localizedDescription]);
        } else {
            NSLog(@"Audio session category set.");
        }
        NSError *activationError = nil;
        if (![audioSession setActive:YES error:&activationError]) {
            NSLog(@"Error activating audio session: %@", [activationError localizedDescription]);
        } else {
            NSLog(@"Audio session activated.");
        }
    }

    - (void)initializePlayer {
        NSString *soundFilePath = [NSString stringWithFormat:@"%@/base/game/%@", [[NSBundle mainBundle] resourcePath], @"bgm/1.mp3"];
        if (!soundFilePath) {
            NSLog(@"Audio file not found.");
            return;
        }
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        self.player = [AVPlayer playerWithURL:soundFileURL];
        NSLog(@"Player initialized with URL: %@", soundFileURL);
    }

    - (void)createNowPlayingSession {
        self.nowPlayingSession = [[MPNowPlayingSession alloc] initWithPlayers:@[self.player]];
        NSLog(@"Now Playing Session created with players: %@", self.nowPlayingSession.players);
    }

    - (void)configureNowPlayingInfo {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        CMTime duration = self.player.currentItem.duration;
        Float64 durationSeconds = CMTimeGetSeconds(duration);
        CMTime currentTime = self.player.currentTime;
        Float64 currentTimeSeconds = CMTimeGetSeconds(currentTime);
        NSDictionary *nowPlayingInfo = @{
            MPMediaItemPropertyTitle: @"Example Title",
            MPMediaItemPropertyArtist: @"Example Artist",
            MPMediaItemPropertyPlaybackDuration: @(durationSeconds),
            MPNowPlayingInfoPropertyElapsedPlaybackTime: @(currentTimeSeconds),
            MPNowPlayingInfoPropertyPlaybackRate: @(self.player.rate)
        };
        infoCenter.nowPlayingInfo = nowPlayingInfo;
        NSLog(@"Now Playing info configured: %@", nowPlayingInfo);
    }

    - (void)playMusic {
        [self.player play];
        [self createNowPlayingSession];
        [self configureNowPlayingInfo];
    }

    - (void)pauseMusic {
        [self.player pause];
        [self configureNowPlayingInfo];
    }

    @end
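One thing that is often missing in setups like this (offered as an assumption, not a verified diagnosis) is that the MPNowPlayingSession is never asked to become active and has no remote command handlers registered. In Swift, for brevity, that step looks roughly like:

    import MediaPlayer

    // Sketch: after creating the MPNowPlayingSession, register at least one remote
    // command and ask the session to become the active now-playing session (iOS 16+).
    func activate(_ session: MPNowPlayingSession) {
        _ = session.remoteCommandCenter.playCommand.addTarget { _ in .success }
        _ = session.remoteCommandCenter.pauseCommand.addTarget { _ in .success }
        session.becomeActiveIfPossible { accepted in
            print("Now Playing session active:", accepted)
        }
    }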
Replies: 2 · Boosts: 0 · Views: 565 · Activity: Feb ’25
AVQueuePlayer/AVPlayer rate property is not being changed every time I assign a new value to it.
I have used AVQueuePlayer in my music app to play a sequence of audios from a remote server. This is how I have defined the player in my ViewModel:

    private var cancellables = Set<AnyCancellable>()
    private let audioSession = AVAudioSession.sharedInstance()
    private var avQueuePlayer: AVQueuePlayer?
    @Published var playbackSpeed: Float = 1.0

Before starting playback, I make sure that the audio session is set properly; the code snippet used for that is:

    do {
        try audioSession.setCategory(.playback, mode: .default, options: [])
        try audioSession.setActive(true, options: [])
    } catch {
        return
    }

And this is the function I am using to update the playback speed:

    func updatePlaybackSpeed(_ newSpeed: Float) {
        if newSpeed > 0.0, newSpeed <= 2.0 {
            playbackSpeed = newSpeed
            avQueuePlayer?.rate = newSpeed
            print("requested speed is \(newSpeed) and actual speed is \(String(describing: avQueuePlayer?.rate))")
        }
    }

Sometimes the player plays at whatever speed was set; e.g. once I got "requested speed is 1.5 and actual speed is 1.5" and the player also seemed to play at a speed of 1.5. But another time I got "requested speed is 2.0 and actual speed is 2.0", yet the player still seemed to play at a speed of 1.0.

To observe changes in rate, I used this:

    private func observeRateChanges() {
        guard let avQueuePlayer = self.avQueuePlayer else { return }
        NotificationCenter.default.publisher(for: AVQueuePlayer.rateDidChangeNotification, object: avQueuePlayer)
            .compactMap { $0.userInfo?[AVPlayer.rateDidChangeReasonKey] as? AVPlayer.RateDidChangeReason }
            .sink { reason in
                switch reason {
                case .appBackgrounded:
                    print("The app transitioned to the background.")
                case .audioSessionInterrupted:
                    print("The system interrupts the app’s audio session.")
                case .setRateCalled:
                    print("The app set the player’s rate.")
                case .setRateFailed:
                    print("An attempt to change the player’s rate failed.")
                default:
                    break
                }
            }
            .store(in: &cancellables)
    }

When the rate was set properly, I got "The app set the player's rate." from the above function, but when it wasn't, I got "An attempt to change the player's rate failed." Now I am not able to understand why the rate is not being set, and if updatePlaybackSpeed printed "requested speed is 2.0 and actual speed is 2.0", why does the player seem to play at a speed of 1.0?
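One avenue to rule out (an assumption based on general AVPlayer behavior, not a confirmed diagnosis of this report): the rate property only takes effect while the item can actually play at that rate, and it can be reset on item transitions in a queue. On iOS 16 and later, setting defaultRate alongside rate keeps the preferred speed across those transitions:

    import AVFoundation

    // Sketch: apply a speed so it survives item changes and playback restarts (iOS 16+).
    func apply(speed: Float, to player: AVQueuePlayer) {
        guard speed > 0.0, speed <= 2.0 else { return }
        if #available(iOS 16.0, *) {
            player.defaultRate = speed   // used whenever playback (re)starts
        }
        if player.timeControlStatus == .playing {
            player.rate = speed          // change the current item's speed immediately
        }
    }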
Replies: 2 · Boosts: 0 · Views: 399 · Activity: Feb ’25
Non-sendable type AVMediaSelectionGroup
Hi all, we are trying to migrate our project to Swift 6. The project uses AVPlayer on the MainActor, and selecting audio and subtitle tracks no longer compiles.

    Task { @MainActor in
        let group = try await item.asset.loadMediaSelectionGroup(for: AVMediaCharacteristic.audible)

gives the error: Non-sendable type 'AVMediaSelectionGroup?' returned by implicitly asynchronous call to nonisolated function cannot cross actor boundary.

A second example:

    if #available(iOS 15.0, *) {
        player?.currentItem?.asset.loadMediaSelectionGroup(for: AVMediaCharacteristic.audible, completionHandler: { group, error in
            if error != nil {
                return
            }
            if let groupWrp = group {
                DispatchQueue.main.async {
                    self.setupAudio(groupWrp, audio: audioLang)
                }
            }
        })
    }

gives the error: Sending 'groupWrp' risks causing data races.
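One pattern that sidesteps the diagnostic (a sketch only; it assumes Sendable metadata is enough for the UI, and the actual option selection still has to happen in the same isolation domain as the player item) is to keep the non-Sendable AVMediaSelectionGroup inside one nonisolated helper and hand back only Sendable values:

    import AVFoundation

    // Sketch: keep the non-Sendable group inside one function; return only Sendable data.
    struct MediaOptionInfo: Sendable {
        let displayName: String
        let localeIdentifier: String?
    }

    func loadAudibleOptions(from asset: AVURLAsset) async throws -> [MediaOptionInfo] {
        guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return [] }
        return group.options.map { option in
            MediaOptionInfo(displayName: option.displayName,
                            localeIdentifier: option.locale?.identifier)
        }
    }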
Replies: 1 · Boosts: 0 · Views: 501 · Activity: Feb ’25
Problem with UVC Device Access on visionOS
No external cameras show up in the app on visionOS. We use this sample code as a basis for our tests: https://developer.apple.com/documentation/visionos/displaying-video-from-connected-devices We also received the needed entitlement from Apple, but every camera we tried so far does not show up on visionOS.

We tried the following devices and hubs:
- Insta360 X4
- Somikon Endoscope Camera: USB HD Endoscope Camera
- EMEET Full HD Webcam - C960
- BENFEI Video/Audio Capture Card, 4K HDMI to USB C/A
- Logitech C920 HD PRO Webcam
- Anker PowerConf C200
- Insta360 GO 3S
- Anker 341 USB-C Hub
- UGREEN Revodok Pro 10Gbps USB-C Hub

All Vision Pro devices we tried run visionOS 2.3. When trying the same code on iPad we can actually use external cameras.

Steps to reproduce: Start the app on a Vision Pro device and connect an external camera. The connected camera does not show up in the dropdown.

Development environment: Xcode 16.2, macOS 15.3
Run-time configuration: iOS 18.3, visionOS 2.3
Replies: 2 · Boosts: 0 · Views: 613 · Activity: Feb ’25