Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post · Replies · Boosts · Views · Activity

Accessing External Timecode from Blackmagic ProDock in Custom App
Hi everyone, I’m exploring using the iPhone 17 Pro with the Blackmagic ProDock in a custom capture app. The genlock functionality seems accessible via AVExternalSyncDevice and related APIs, which is great. I’m specifically curious about external timecode coming in from the ProDock:
• Is there a public way to access the timecode feed in a custom app via AVFoundation or another Apple API?
• If so, what is the recommended approach to read or apply that timecode during capture?
• Are there any current limitations or entitlements required to access timecode from the ProDock in a third-party app?
I’m excited to start integrating synchronized capture in my app, and any guidance or sample patterns would be greatly appreciated. Thanks in advance! — [Artem]
2 replies · 0 boosts · 261 views · Oct ’25
VTLowLatencyFrameInterpolationConfiguration supported dimensions
Are there limits on the supported dimensions for VTLowLatencyFrameInterpolationConfiguration? Querying VTLowLatencyFrameInterpolationConfiguration.maximumDimensions and VTLowLatencyFrameInterpolationConfiguration.minimumDimensions returns nil.

When I try the WWDC sample project EnhancingYourAppWithMachineLearningBasedVideoEffects with a 4K video, the statement `try frameProcessor.startSession(configuration: configuration)` executes, but `try await frameProcessor.process(parameters: parameters)` throws:

Error Domain=VTFrameProcessorErrorDomain Code=-19730 "Processor is not initialized" UserInfo={NSLocalizedDescription=Processor is not initialized}

Also, why is VTLowLatencyFrameInterpolationConfiguration able to run while the app is backgrounded, but VTFrameRateConversionParameters can't (due to GPU usage)?
2 replies · 0 boosts · 297 views · 1w
Control system video effects support for CMIO extension
We're distributing a virtual camera with our app that does not profit in the slightest from automatically applied system video effects, either on the video going in (physical camera device) or going out (virtual camera device).

I'm aware of setting NSCameraReactionEffectGesturesEnabledDefault in Info.plist and of determining the active video effects via the AVCaptureDevice API. Those are obviously crutches, because having to tell users to go look for and click around in menu bar apps is the opposite of a great UX.

To make our product's video output more deterministic, I'm looking for a way to tell the CMIO subsystem that our virtual camera does not support any of the system video effects. I'm seeing properties like AVCaptureDevice.Format.isPortraitEffectSupported and AVCaptureDevice.Format.isStudioLightSupported, whose documentation refers to the format's ability to support these effects. Since we're setting a CMFormatDescription via CMIOExtensionStreamSource.formats, I was hoping to find something in the extension APIs, but haven't been successful so far. Can this be done?
2 replies · 0 boosts · 153 views · Nov ’25
Attaching color properties to CVPixelBufferRef
I believe this should work:

```objc
CFMutableDictionaryRef attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                                         &kCFTypeDictionaryKeyCallBacks,
                                                         &kCFTypeDictionaryValueCallBacks);
CFDictionaryAddValue(attrs, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_709_2);
CFDictionaryAddValue(attrs, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_709_2);
CFDictionaryAddValue(attrs, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_709_2);

CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32ARGB, attrs, &pixelBuffer);

assert(CFDictionaryGetCount(CVBufferGetAttachments(pixelBuffer, kCVAttachmentMode_ShouldPropagate)) > 0);
```

But that last assert fails, so it appears the color info does not get attached. kCVImageBufferColorPrimariesKey and the others are not one of the keys listed under BufferAttributeKeys, but I think they're supposed to be allowed because they're listed by CMVideoFormatDescriptionGetExtensionKeysCommonWithImageBuffers(). I'm hoping that putting the color matrix info in there will control how AVAssetWriter converts the RGB to YCbCr.
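A possible workaround, sketched in Swift under the assumption that the goal is simply to get the attachments onto the buffer (not an official answer): attach the color information after creation with CVBufferSetAttachment instead of passing it in the attributes dictionary:

```swift
import CoreVideo

// Create the buffer first, then attach color information directly.
var pixelBuffer: CVPixelBuffer?
CVPixelBufferCreate(kCFAllocatorDefault, 1920, 1080,
                    kCVPixelFormatType_32ARGB, nil, &pixelBuffer)

if let buffer = pixelBuffer {
    // These attachments propagate to sample buffers created from this image buffer.
    CVBufferSetAttachment(buffer, kCVImageBufferColorPrimariesKey,
                          kCVImageBufferColorPrimaries_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(buffer, kCVImageBufferTransferFunctionKey,
                          kCVImageBufferTransferFunction_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(buffer, kCVImageBufferYCbCrMatrixKey,
                          kCVImageBufferYCbCrMatrix_ITU_R_709_2, .shouldPropagate)
}
```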
2 replies · 0 boosts · 308 views · 3w
Is Apple Log open to developers for 3rd party apps?
Hello! I am building a video camera app and trying to implement Apple Log for iPhone 15 Pro and 16 Pro. I am not seeing a lot of documentation on it, and I notice that the number of apps on the App Store that use it is rather limited (fewer than five, to be exact). Is Apple Log recording a feature that is accessible to developers? Here is a link to the documentation: https://developer.apple.com/documentation/avfoundation/avcapturecolorspace/applelog
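For reference, a minimal sketch of how enabling Apple Log is generally expected to work, assuming a device format that advertises .appleLog in its supportedColorSpaces (and a capture session with automaticallyConfiguresCaptureDeviceForWideColor set to false, so the session doesn't override the choice):

```swift
import AVFoundation

func enableAppleLog(on device: AVCaptureDevice) throws {
    // Find a capture format that advertises Apple Log support.
    guard let format = device.formats.first(where: {
        $0.supportedColorSpaces.contains(.appleLog)
    }) else { return } // no Apple Log on this device

    try device.lockForConfiguration()
    device.activeFormat = format
    device.activeColorSpace = .appleLog
    device.unlockForConfiguration()
}
```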
1 reply · 0 boosts · 494 views · Jan ’25
AVAssetWriter: appending audio/video streams concurrently in a real-time recording setup
I see in most of the old sample code from Apple that, when using AVAssetWriter to append audio, video, and metadata samples in a real-time camera recording setup, calls to .append(sampleBuffer) are either synchronised using an NSLock, or all the samples are sent to the asset writer on the same dispatch queue, thereby preventing concurrent writes. However, I can't find any documentation saying that calls to assetWriterInput.append(sampleBuffer) for different media types, such as audio and video, must not be made concurrently. Is it not valid for these methods to be executed in parallel? For instance:

`videoSamplesAssetWriterInput.append(videoSampleBuffer)` from DispatchQueue 1
`audioSamplesAssetWriterInput.append(audioSampleBuffer)` from DispatchQueue 2
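A minimal sketch of the usual defensive pattern, assuming nothing beyond public AVFoundation API: funnel every append through one serial queue so appends can never interleave, regardless of which capture callback produced the sample:

```swift
import AVFoundation

// One serial queue per writer; both audio and video appends hop onto it,
// so the writer never sees concurrent calls.
let writerQueue = DispatchQueue(label: "com.example.assetwriter") // hypothetical label

func append(_ sampleBuffer: CMSampleBuffer, to input: AVAssetWriterInput) {
    writerQueue.async {
        // isReadyForMoreMediaData must be honoured; production code would
        // buffer or drop samples when the input is not ready.
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }
}
```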
1 reply · 0 boosts · 635 views · Jan ’25
Exporting Higher Resolution Spatial Videos in Final Cut
How do we export a spatial video at a higher resolution than 2200 x 2200? For example, I want to export a video that I edited in a spatial timeline as 4096 x 4096 resolution with spatial metadata to output as an MV-HEVC file. I looked under both Final Cut and Compressor export settings, but couldn't find a way to do this. Thanks for your help!
1 reply · 0 boosts · 482 views · Jan ’25
Help diagnosing crash with only system code in its stack trace
Hello, I work on a video streaming app, and I have been investigating a crash that we are seeing quite frequently (it is our #2 crasher at the moment). The stack trace indicates that an AVPictureInPictureController is being deallocated on a background thread. This leads to dangling Auto Layout constraints getting cleaned up, which in turn results in an exception being thrown from the framework about the layout engine being accessed from a background thread.

Our internal analytics indicate that the crash occurs after the user comes back to the app after putting it in the background, and switches playback from Picture in Picture mode back to the app's regular playback UI. What has me puzzled here is that, as I'm sure you know, we have no control over the PiP UI. It is entirely system-provided, and there is none of our app's code in the stack trace. The whole process is even initiated by a KVO on an AVPlayerController property, and our app doesn't use that class directly anywhere. So how did we manage to cause this process to happen on a background thread?

Add to this the fact that the bug only appeared once we switched to compiling with Xcode 16, and it is overwhelmingly present only on devices running iOS 18. These factors lead me to believe that this is probably an OS issue. But before I go to file a feedback, I thought I would post here in case anyone has any ideas. I have attached an instance of the crash log to this post: 2025-01-08_17-32-45.7003_-0800-ea8d5c3323e0f1fc059cf83f6ec86377bdae1788.crash
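Until the root cause is known, one defensive pattern worth considering (a sketch, assuming the app owns a strong reference to the controller somewhere) is to guarantee that the last strong reference to the AVPictureInPictureController is released on the main thread:

```swift
import AVKit

final class PlayerCoordinator {
    private var pipController: AVPictureInPictureController?

    func tearDownPictureInPicture() {
        let controller = pipController
        pipController = nil
        // The closure's captured reference is released on the main queue,
        // so the controller's UIKit-backed internals deallocate on main.
        DispatchQueue.main.async {
            _ = controller
        }
    }
}
```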
1 reply · 0 boosts · 436 views · Jan ’25
How to start AVPictureInPicture when video is paused
I have an AVPlayer with an AVPictureInPictureController. Playing video in the app and Picture in Picture work, except in one situation: when I pause the video in the application and then switch to the background, PiP does not activate. What am I doing wrong?

```swift
import UIKit
import AVKit
import AVFoundation

class ViewControllerSec: UIViewController, AVPictureInPictureControllerDelegate {
    var pipPlayer: AVPlayer!
    var avCanvas: UIView!
    var pipCanvas: AVPlayerLayer?
    var pipController: AVPictureInPictureController!
    var mainViewControler: UIViewController!
    var playerItem: AVPlayerItem!
    var videoAvasset: AVAsset!

    public func link(to parentViewController: UIViewController) {
        mainViewControler = parentViewController
        setup()
    }

    @objc func appWillResignActiveNotification(application: UIApplication) {
        guard let pipController = pipController else {
            print("PiP not supported")
            return
        }
        print("PIP isSuspend: \(pipController.isPictureInPictureSuspended)")
        print("PIP isPossible: \(pipController.isPictureInPicturePossible)")
        if playerItem.status == .readyToPlay {
            if pipPlayer.rate == 0 {
                pipPlayer.play()
            }
            pipController.startPictureInPicture() // ---> Error in log: Failed to start picture in picture.
        } else {
            print("Player not ready for PiP.")
        }
    }

    private func setupAudio() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .moviePlayback)
            try session.setActive(true)
        } catch {
            print("Audio session setup failed: \(error.localizedDescription)")
        }
    }

    @objc func playerItemDidFailToPlayToEnd(_ notification: Notification) {
        if let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error {
            print("Failed to play to end: \(error.localizedDescription)")
        }
    }

    func setup() {
        setupAudio()
        guard let videoURL = URL(string: "https://demo.unified-streaming.com/k8s/features/stable/video/tears-of-steel/tears-of-steel.mp4/.m3u8") else { return }
        videoAvasset = AVAsset(url: videoURL)
        playerItem = AVPlayerItem(asset: videoAvasset)
        addPlayerObservers()
        pipPlayer = AVPlayer(playerItem: playerItem)
        avCanvas = UIView(frame: view.bounds)
        pipCanvas = AVPlayerLayer(player: pipPlayer)
        guard let pipCanvas else { return }
        pipCanvas.frame = avCanvas.bounds
        //pipCanvas.videoGravity = .resizeAspectFill
        mainViewControler.view.addSubview(avCanvas)
        avCanvas.layer.addSublayer(pipCanvas)
        if AVPictureInPictureController.isPictureInPictureSupported() {
            pipController = AVPictureInPictureController(playerLayer: pipCanvas)
            pipController?.delegate = self
            pipController?.canStartPictureInPictureAutomaticallyFromInline = true
        }
        let playButton = UIButton(frame: CGRect(x: 20, y: 50, width: 100, height: 50))
        playButton.setTitle("Play", for: .normal)
        playButton.backgroundColor = .blue
        playButton.addTarget(self, action: #selector(playTapped), for: .touchUpInside)
        mainViewControler.view.addSubview(playButton)
        let pauseButton = UIButton(frame: CGRect(x: 140, y: 50, width: 100, height: 50))
        pauseButton.setTitle("Pause", for: .normal)
        pauseButton.backgroundColor = .red
        pauseButton.addTarget(self, action: #selector(pauseTapped), for: .touchUpInside)
        mainViewControler.view.addSubview(pauseButton)
        let pipButton = UIButton(frame: CGRect(x: 260, y: 50, width: 150, height: 50))
        pipButton.setTitle("Start PiP", for: .normal)
        pipButton.backgroundColor = .green
        pipButton.addTarget(self, action: #selector(startPictureInPicture), for: .touchUpInside)
        mainViewControler.view.addSubview(pipButton)
        print("Error: \(String(describing: pipPlayer.error?.localizedDescription))")
        NotificationCenter.default.addObserver(forName: UIApplication.didEnterBackgroundNotification, object: nil, queue: nil) { [weak self] _ in
            guard let self = self else { return }
            if self.pipPlayer.rate == 0 {
                self.pipPlayer.play()
                self.pipController?.startPictureInPicture()
            }
        }
    }

    func addPlayerObservers() {
        playerItem?.addObserver(self, forKeyPath: "status", options: [.old, .new], context: nil)
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying(_:)), name: .AVPlayerItemDidPlayToEndTime, object: playerItem)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "status" {
            if let statusNumber = change?[.newKey] as? NSNumber {
                let status = AVPlayer.Status(rawValue: statusNumber.intValue)!
                switch status {
                case .readyToPlay:
                    print("Player is ready to play")
                case .failed:
                    print("Player failed: \(String(describing: playerItem?.error))")
                case .unknown:
                    print("Player status is unknown")
                @unknown default:
                    fatalError()
                }
            }
        }
    }

    @objc func playerDidFinishPlaying(_ notification: Notification) {
        print("Video finished playing.")
    }

    deinit {
        playerItem?.removeObserver(self, forKeyPath: "status")
        NotificationCenter.default.removeObserver(self)
    }

    @objc func playTapped() { pipPlayer.play() }
    @objc func pauseTapped() { pipPlayer.pause() }

    @objc func startPictureInPicture() {
        if let pipController = pipController, !pipController.isPictureInPictureActive {
            pipController.startPictureInPicture()
        }
    }

    @objc func stopPictureInPicture() {
        if let pipController = pipController, pipController.isPictureInPictureActive {
            pipController.stopPictureInPicture()
        }
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: Error) {
        print("Failed to start PiP: \(error.localizedDescription)")
        if let underlyingError = (error as NSError).userInfo[NSUnderlyingErrorKey] {
            print("Underlying error: \(underlyingError)")
        }
    }
}
```
1 reply · 0 boosts · 546 views · Jan ’25
AVAssetExportSession in iOS 18 - Thread 11: "*** -[AVAssetExportSession exportAsynchronouslyWithCompletionHandler:] Cannot call exportAsynchronouslyWithCompletionHandler: more than once."
I’m experiencing a crash at runtime when trying to extract audio from a video. This issue occurs on both iOS 18 and earlier versions. The crash is caused by the following error:

*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVAssetExportSession exportAsynchronouslyWithCompletionHandler:] Cannot call exportAsynchronouslyWithCompletionHandler: more than once.'
*** First throw call stack:
(0x1875475ec 0x184ae1244 0x1994c49c0 0x217193358 0x217199899 0x192e208b9 0x217192fd9 0x30204c88d 0x3019e5155 0x301e5fb41 0x301af7add 0x301aff97d 0x301af888d 0x301aff27d 0x301ab5fa5 0x301ab6101 0x192e5ee39)
libc++abi: terminating due to uncaught exception of type NSException

My previous code worked fine, but it's crashing with Swift 6. Does anyone know a solution for this?

Previous code:

```swift
func extractAudioFromVideo(from videoURL: URL,
                           exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil,
                           completion: @escaping (Swift.Result<URL, Error>) -> Void) {
    let asset = AVAsset(url: videoURL)
    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }
    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)
    exportSession.outputFileType = .m4a
    exportSession.outputURL = outputURL
    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler = exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }
    // Periodically check the progress of the export session
    let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        audioExportProgressPublisher.send(exportSession.progress)
    }
    // Export the audio track asynchronously
    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .completed:
            completion(.success(outputURL))
        case .failed:
            completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
        case .cancelled:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
        default:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
        }
        // Invalidate the timer when the export session completes or is cancelled
        timer.invalidate()
    }
}
```

New code:

```swift
func extractAudioFromVideo(from videoURL: URL,
                           exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil,
                           completion: @escaping (Swift.Result<URL, Error>) -> Void) async {
    let asset = AVAsset(url: videoURL)
    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }
    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)
    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }
    if #available(iOS 18.0, *) {
        do {
            try await exportSession.export(to: outputURL, as: .m4a)
            let states = exportSession.states(updateInterval: 0.1)
            for await state in states {
                switch state {
                case .pending, .waiting:
                    break
                case .exporting(progress: let progress):
                    print("Exporting: \(progress.fractionCompleted)")
                    if progress.isFinished {
                        completion(.success(outputURL))
                    } else if progress.isCancelled {
                        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
                    } else {
                        audioExportProgressPublisher.send(Float(progress.fractionCompleted))
                    }
                }
            }
        } catch let error {
            print(error.localizedDescription)
        }
    } else {
        // Periodically check the progress of the export session
        let publishTimer = Timer.publish(every: 0.1, on: .main, in: .common)
            .autoconnect()
            .sink { [weak exportSession] _ in
                guard let exportSession else { return }
                audioExportProgressPublisher.send(exportSession.progress)
            }
        exportSession.outputFileType = .m4a
        exportSession.outputURL = outputURL
        await exportSession.export()
        switch exportSession.status {
        case .completed:
            completion(.success(outputURL))
        case .failed:
            completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
        case .cancelled:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
        default:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
        }
        // Cancel the timer when the export session completes or is cancelled
        publishTimer.cancel()
    }
}
```
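A minimal sketch of one possible direction (an assumption: the exception suggests the same session ends up exported twice): create a fresh AVAssetExportSession per call, and on iOS 18 monitor states(updateInterval:) from a concurrent task while export(to:as:) runs, since iterating states only after export returns starts the progress loop too late:

```swift
import AVFoundation

@available(iOS 18.0, *)
func exportAudio(from asset: AVAsset, to outputURL: URL) async throws {
    // A brand-new session per export; sessions are single-use.
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetAppleM4A) else {
        throw CocoaError(.fileWriteUnknown)
    }
    // Observe progress concurrently; the sequence finishes when the export does.
    let monitor = Task {
        for await state in session.states(updateInterval: 0.1) {
            if case .exporting(let progress) = state {
                print("Exporting: \(progress.fractionCompleted)")
            }
        }
    }
    defer { monitor.cancel() }
    try await session.export(to: outputURL, as: .m4a)
}
```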
1 reply · 0 boosts · 698 views · Jan ’25
Non-sendable type AVMediaSelectionGroup
Hi all, we're trying to migrate a project to Swift 6. The project uses AVPlayer on the MainActor, and audio and subtitle selection no longer compiles.

```swift
Task { @MainActor in
    let group = try await item.asset.loadMediaSelectionGroup(for: AVMediaCharacteristic.audible)
```

gets the error: Non-sendable type 'AVMediaSelectionGroup?' returned by implicitly asynchronous call to nonisolated function cannot cross actor boundary.

And a second example:

```swift
if #available(iOS 15.0, *) {
    player?.currentItem?.asset.loadMediaSelectionGroup(for: AVMediaCharacteristic.audible, completionHandler: { group, error in
        if error != nil {
            return
        }
        if let groupWrp = group {
            DispatchQueue.main.async {
                self.setupAudio(groupWrp, audio: audioLang)
            }
        }
    })
}
```

gets the error: Sending 'groupWrp' risks causing data races.
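A common stopgap while AVFoundation's Sendable annotations catch up (an assumption, not an official fix) is to import the framework with @preconcurrency, which downgrades these diagnostics, and to keep the loaded group on a single actor:

```swift
@preconcurrency import AVFoundation

@MainActor
func loadAudioOptions(for item: AVPlayerItem) async throws -> [AVMediaSelectionOption] {
    // With @preconcurrency, the non-Sendable AVMediaSelectionGroup? return
    // value no longer trips strict concurrency checking on the MainActor.
    guard let group = try await item.asset.loadMediaSelectionGroup(for: .audible) else {
        return []
    }
    return group.options
}
```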
1 reply · 0 boosts · 499 views · Feb ’25
When recording video with AVCaptureSession, initialising the audio session for a Push To Talk call stops the audio of the ongoing recording while the video capture continues
We have a Push To Talk application which allows the user to record video and audio. When the user is recording a video using AVCaptureSession and receives a Push To Talk call, the audio in the video being captured stops from the moment the call is received, while the video capture continues. After the PTT call completes, we have tried restarting the audio session; no errors are printed, but we still don't see the audio restart in the video capture. We have also tried adding a new input to the AVCaptureSession, but we receive an error that stops the video capture:

[OS-PLT] [CameraManager] Movie file finished with error: Error Domain=AVFoundationErrorDomain Code=-11818 "Recording Stopped" UserInfo={AVErrorRecordingSuccessfullyFinishedKey=true, NSLocalizedDescription=Recording Stopped, NSLocalizedRecoverySuggestion=Stop any other actions using the recording device and try again., AVErrorRecordingFailureDomainKey=1, NSUnderlyingError=0x3026bff60 {Error Domain=NSOSStatusErrorDomain Code=-16414 "(null)"}}, success

We have also raised a Feedback ticket: https://feedbackassistant.apple.com/feedback/16050598
1 reply · 0 boosts · 582 views · Feb ’25
Use Metal to convert an HDR pixel buffer to an SDR pixel buffer
I have seen some demos that convert HDR video to SDR pixel buffers using AVAssetReader, AVVideoComposition, AVComposition, and AVFoundation. But in some cases, I want to render HDR pixel buffers and record video.

```objc
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([videoDevice isVideoHDRSupported]) {
    NSError *error = nil;
    if ([videoDevice lockForConfiguration:&error]) {
        videoDevice.automaticallyAdjustsVideoHDREnabled = NO;
        videoDevice.videoHDREnabled = YES; // enable HDR
        [videoDevice unlockForConfiguration];
    } else {
        NSLog(@"Error: %@", error.localizedDescription);
    }
}
```

Real-time processing of HDR data requires processing the video frame data (for example, applying filters) while ensuring that the processing chain supports 10-bit color depth and HDR metadata, and using the image buffers for object tracking, etc. How can I solve this problem?
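As a starting point, a minimal sketch (assuming the downstream Metal pipeline can consume 10-bit biplanar buffers) of asking AVCaptureVideoDataOutput for a 10-bit, HDR-friendly pixel format:

```swift
import AVFoundation

let videoOutput = AVCaptureVideoDataOutput()
// 'x420': 10-bit 4:2:0 biplanar, video range. Confirm the format is
// actually offered before assigning it.
let tenBitFormat = kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
if videoOutput.availableVideoPixelFormatTypes.contains(tenBitFormat) {
    videoOutput.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: tenBitFormat
    ]
}
```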
1 reply · 0 boosts · 459 views · Jan ’25
Inconsistent FPS (20 FPS Issue) While Recording Video Using AVCaptureSession.
Hi, I am recording video in my app and setting the FPS using the code below, but sometimes the video is recorded at 20 FPS. Can someone please tell me what I am doing wrong?

```swift
private func eightBitVariantOfFormat() -> AVCaptureDevice.Format? {
    let activeFormat = self.videoDeviceInput.device.activeFormat
    let fpsToBeSupported: Int = 60
    debugPrint("fpsToBeSupported - \(fpsToBeSupported)" as AnyObject)
    let allSupportedFormats = self.videoDeviceInput.device.formats
    debugPrint("all formats - \(allSupportedFormats)" as AnyObject)
    let activeDimensions = CMVideoFormatDescriptionGetDimensions(activeFormat.formatDescription)
    debugPrint("activeDimensions - \(activeDimensions)" as AnyObject)
    let filterBasedOnDimensions = allSupportedFormats.filter({
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).width == activeDimensions.width) &&
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).height == activeDimensions.height)
    })
    if filterBasedOnDimensions.isEmpty {
        // Dimension not found. Required format not found to handle.
        debugPrint("Dimension not found" as AnyObject)
        return activeFormat
    }
    debugPrint("filterBasedOnDimensions - \(filterBasedOnDimensions)" as AnyObject)
    let filterBasedOnMaxFrameRate = filterBasedOnDimensions.compactMap({ format in
        let videoSupportedFrameRateRanges = format.videoSupportedFrameRateRanges
        if !videoSupportedFrameRateRanges.isEmpty {
            let contains = videoSupportedFrameRateRanges.contains(where: { Int($0.maxFrameRate) >= fpsToBeSupported })
            if contains {
                return format
            } else {
                return nil
            }
        } else {
            return nil
        }
    })
    debugPrint("allFormatsToBeSupported - \(filterBasedOnMaxFrameRate)" as AnyObject)
    guard !filterBasedOnMaxFrameRate.isEmpty else {
        debugPrint("Taking default active format as nothing found when filtered using desired FPS" as AnyObject)
        return activeFormat
    }
    var formatToBeUsed: AVCaptureDevice.Format!
    if let four_two_zero_v = filterBasedOnMaxFrameRate.first(where: { CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange }) {
        // 'vide'/'420v'
        formatToBeUsed = four_two_zero_v
    } else {
        // Take the first one from the above array.
        formatToBeUsed = filterBasedOnMaxFrameRate.first
    }
    do {
        try self.videoDeviceInput.device.lockForConfiguration()
        self.videoDeviceInput.device.activeFormat = formatToBeUsed
        self.videoDeviceInput.device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        self.videoDeviceInput.device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        if videoDeviceInput.device.isFocusModeSupported(.continuousAutoFocus) {
            self.videoDeviceInput.device.focusMode = AVCaptureDevice.FocusMode.continuousAutoFocus
        }
        self.videoDeviceInput.device.unlockForConfiguration()
    } catch let error {
        debugPrint("\(error)" as AnyObject)
    }
    return formatToBeUsed
}
```
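One thing worth checking (an assumption based on documented AVCaptureSession behaviour, not a confirmed diagnosis): changing sessionPreset after assigning activeFormat silently replaces the format, so the frame-duration lock should be re-applied after the session is fully configured. A minimal sketch:

```swift
import AVFoundation

// Re-assert the frame rate after session configuration is committed,
// since later sessionPreset changes can replace activeFormat.
func lockFrameRate(_ fps: Int32, on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    let duration = CMTime(value: 1, timescale: fps)
    device.activeVideoMinFrameDuration = duration
    device.activeVideoMaxFrameDuration = duration
}
```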
1 reply · 0 boosts · 396 views · Mar ’25
Issue: FPS Drops to 30 After Trimming Video in Photos App
Hi, I am recording a video at 240 FPS within my application and saving it to the Photos app. The recorded video retains 240 FPS in the Photos app. However, after trimming the video using the Photos app and importing it back into my app, the FPS is reduced to 30 FPS.

Steps to Reproduce:
1. Record a video inside the application at 240 FPS.
2. Save the recorded video to the Photos app.
3. Verify that the video retains 240 FPS in the Photos app.
4. Trim the video using the built-in Photos app editor.
5. Import the trimmed video back into the application. The FPS of the imported video is now reduced to 30 FPS.

Code Used for Importing Video:

I am using the following code to fetch the video from the Photos app:

```swift
let options: PHVideoRequestOptions = PHVideoRequestOptions()
options.version = .current // Using `.original` preserves FPS, but I need `.current` for other changes
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

PHImageManager.default().requestAVAsset(forVideo: self, options: options) { (avAsset, audioMix, info) in
    if let urlAsset = avAsset as? AVURLAsset {
        completionHandler(urlAsset.url, self)
    } else {
        self.askForOriginal(completionHandler: completionHandler)
    }
}
```

Observations:
- The original video retains 240 FPS until it is trimmed in the Photos app.
- After trimming, the FPS automatically drops to 30 FPS when imported back into the app.
- If I use options.version = .original, the FPS is preserved, but I need .current to apply other modifications.

Questions:
- Is this an expected behavior of PHImageManager when requesting a video with options.version = .current?
- Is there a way to preserve the original FPS while still using .current?
- Are there any workarounds to extract the trimmed video without FPS reduction?

Any insights or solutions would be greatly appreciated. Thanks in advance!
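One workaround that may be worth testing (hedged: whether the edited rendition keeps 240 FPS depends on how Photos rendered the trim) is to read the edited file directly through PHAssetResource instead of requestAVAsset:

```swift
import Photos

func fetchEditedVideo(for asset: PHAsset, to destination: URL,
                      completion: @escaping (Error?) -> Void) {
    let resources = PHAssetResource.assetResources(for: asset)
    // .fullSizeVideo is the rendered edit; fall back to the original .video.
    guard let resource = resources.first(where: { $0.type == .fullSizeVideo })
            ?? resources.first(where: { $0.type == .video }) else {
        completion(CocoaError(.fileNoSuchFile))
        return
    }
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true
    PHAssetResourceManager.default().writeData(for: resource, toFile: destination,
                                               options: options, completionHandler: completion)
}
```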
1 reply · 0 boosts · 359 views · Mar ’25
Issue with AVAssetDownloadURLSession - Only One Audio Language track Downloaded
We are using AVAssetDownloadURLSession to download content with multiple audio tracks, but we are facing an issue where only one audio language is downloaded, despite explicitly requesting multiple audio languages. All subtitle variants, however, are downloaded successfully.

Issue Details:

Observed Behaviour: When initiating a download using AVAssetDownloadURLSession, only one audio track (Hindi, in this case) is downloaded, even though the content contains multiple audio tracks.

Expected Behaviour: All requested audio tracks should be downloaded, similar to how subtitle variants are successfully downloaded.

Please find the sample app implementation details here: https://drive.google.com/file/d/1DLcBGNnuWFYsY0cipzxpIHqZYUDJujmN/view?usp=sharing

The manifest file for the asset looks something like this:

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A1",NAME="Hindi",LANGUAGE="hi",URI="indexHindi/Hindi.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A2",NAME="Bengali",LANGUAGE="bn",URI="indexBengali/Bengali.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A3",NAME="Kannada",LANGUAGE="kn",URI="indexKannada/Kannada.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A4",NAME="Malayalam",LANGUAGE="ml",URI="indexMalayalam/Malayalam.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A5",NAME="Tamil",LANGUAGE="ta",URI="indexTamil/Tamil.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A6",NAME="Telugu",LANGUAGE="te",URI="indexTelugu/Telugu.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",DEFAULT=YES,AUTOSELECT=YES,FORCED=NO,LANGUAGE="en",URI="subtitle_en/sub_en_vtt.m3u8"
```
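For comparison, a sketch of the approach often suggested for multiple audio renditions, assuming the older AVAggregateAssetDownloadTask API (newer OS versions express the same idea through AVAssetDownloadConfiguration): pass one AVMediaSelection per desired audio language so the session downloads all of them:

```swift
import AVFoundation

func downloadAllAudioRenditions(asset: AVURLAsset,
                                session: AVAssetDownloadURLSession) async throws {
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }

    // Build one media selection per audio option (Hindi, Bengali, ...).
    var selections: [AVMediaSelection] = []
    for option in group.options {
        let selection = asset.preferredMediaSelection.mutableCopy() as! AVMutableMediaSelection
        selection.select(option, in: group)
        selections.append(selection)
    }

    let task = session.aggregateAssetDownloadTask(with: asset,
                                                  mediaSelections: selections,
                                                  assetTitle: "Multi-audio asset", // hypothetical title
                                                  assetArtworkData: nil,
                                                  options: nil)
    task?.resume()
}
```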
1 reply · 6 boosts · 554 views · Mar ’25