Dive into the world of video on Apple platforms, exploring ways to integrate video functionality into your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic


When a Push To Talk call initialises the audio session during an AVCaptureSession video recording, the recording's audio stops while video capture continues
We have a Push To Talk application that allows the user to record video and audio. When the user is recording a video using AVCaptureSession and receives a Push To Talk call, the audio in the video being captured stops from the moment the PTT call is received, while the video capture continues. After the PTT call completes, we have tried restarting the audio session; no errors are printed, but the audio still does not resume in the video capture. We have also tried adding a new input to the AVCaptureSession, but we receive an error that stops the video capture:

```
[OS-PLT] [CameraManager] Movie file finished with error: Error Domain=AVFoundationErrorDomain Code=-11818 "Recording Stopped" UserInfo={AVErrorRecordingSuccessfullyFinishedKey=true, NSLocalizedDescription=Recording Stopped, NSLocalizedRecoverySuggestion=Stop any other actions using the recording device and try again., AVErrorRecordingFailureDomainKey=1, NSUnderlyingError=0x3026bff60 {Error Domain=NSOSStatusErrorDomain Code=-16414 "(null)"}}, success
```

We have also filed a Feedback ticket: https://feedbackassistant.apple.com/feedback/16050598
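For reference, a minimal sketch of the input-reconfiguration pattern described above (the function name and `session` parameter are illustrative, not from the post). Note that changing inputs while an AVCaptureMovieFileOutput recording is in flight typically finishes the movie file, which is consistent with the -11818 "Recording Stopped" error quoted above, so this would have to happen before (re)starting a recording:

```swift
import AVFoundation

// Illustrative sketch: re-add the microphone input after the PTT call ends.
// `session` stands in for the app's AVCaptureSession.
func restoreAudioInput(on session: AVCaptureSession) throws {
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // Remove any stale audio inputs left over from before the PTT call.
    for input in session.inputs {
        if let deviceInput = input as? AVCaptureDeviceInput,
           deviceInput.device.hasMediaType(.audio) {
            session.removeInput(deviceInput)
        }
    }

    // Re-attach the default microphone.
    guard let mic = AVCaptureDevice.default(for: .audio) else { return }
    let audioInput = try AVCaptureDeviceInput(device: mic)
    if session.canAddInput(audioInput) {
        session.addInput(audioInput)
    }
}
```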
1 reply · 0 boosts · 581 views · Feb ’25
Non-sendable type AVMediaSelectionGroup
Hi all, we are trying to migrate our project to Swift 6. The project uses AVPlayer on the MainActor, and audio/subtitle selection no longer compiles.

```swift
Task { @MainActor in
    let group = try await item.asset.loadMediaSelectionGroup(for: AVMediaCharacteristic.audible)
}
```

This produces the error: "Non-sendable type 'AVMediaSelectionGroup?' returned by implicitly asynchronous call to nonisolated function cannot cross actor boundary."

A second example:

```swift
if #available(iOS 15.0, *) {
    player?.currentItem?.asset.loadMediaSelectionGroup(for: AVMediaCharacteristic.audible, completionHandler: { group, error in
        if error != nil { return }
        if let groupWrp = group {
            DispatchQueue.main.async {
                self.setupAudio(groupWrp, audio: audioLang)
            }
        }
    })
}
```

This produces the error: "Sending 'groupWrp' risks causing data races."
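One commonly suggested mitigation, sketched below under the assumption that all selection work stays on the main actor: a `@preconcurrency` import suppresses the Sendable diagnostics for AVFoundation types that have not yet been audited for Swift 6 concurrency. The class and method names here are illustrative:

```swift
// Suppress Sendable checking for AVFoundation types that are not yet
// concurrency-audited (a common Swift 6 migration stopgap).
@preconcurrency import AVFoundation

@MainActor
final class MediaSelectionLoader {
    // Illustrative: load and consume the group inside the same isolation domain.
    func selectAudio(for item: AVPlayerItem) async throws {
        guard let group = try await item.asset.loadMediaSelectionGroup(for: .audible) else { return }
        // `group` never leaves the main actor; pick the desired option here.
        if let option = group.options.first {
            item.select(option, in: group)
        }
    }
}
```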
1 reply · 0 boosts · 498 views · Feb ’25
Missing Depth Frames When Recording with AVCaptureVideoDataOutputSampleBufferDelegate/AVCaptureDataOutputSynchronizerDelegate and AVAssetWriter
I’ve tried both AVCaptureVideoDataOutputSampleBufferDelegate (captureOutput) and AVCaptureDataOutputSynchronizerDelegate (dataOutputSynchronizer), but the number of depth frames and saved timestamps is significantly lower than the number of frames in the .mp4 file written by AVAssetWriter. In my code, I save:

- Timestamps for each frame to a metadata file
- Depth frames to a binary file
- Video to an .mp4 file

If I record a 4-second video at 30 fps, the .mp4 file correctly plays for 4 seconds, but the number of stored timestamps and depth frames is much lower, around 70 frames instead of the expected 120. Does anyone know why this mismatch happens?

```swift
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                            didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    // Read all outputs
    guard let syncedDepthData = synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData,
          let syncedVideoData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
        // only work on synced pairs
        return
    }

    if syncedDepthData.depthDataWasDropped || syncedVideoData.sampleBufferWasDropped {
        return
    }

    let depthData = syncedDepthData.depthData
    let depthPixelBuffer = depthData.depthDataMap
    let sampleBuffer = syncedVideoData.sampleBuffer

    guard let videoPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
          let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) else {
        return
    }

    addToPreviewStream?(CIImage(cvPixelBuffer: videoPixelBuffer))

    if !canWrite() {
        return
    }

    // Extract the presentation timestamp (PTS) from the sample buffer
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // sessionAtSourceTime is the first buffer we will write to the file
    if self.sessionAtSourceTime == nil {
        // Make sure we don't start recording until the buffer reaches the correct time
        // (buffer is always behind, this will fix the difference in time)
        guard sampleBuffer.presentationTimeStamp >= self.recordFromTime! else { return }
        self.sessionAtSourceTime = sampleBuffer.presentationTimeStamp
        self.videoWriter!.startSession(atSourceTime: sampleBuffer.presentationTimeStamp)
    }

    if self.videoWriterInput!.isReadyForMoreMediaData {
        self.videoWriterInput!.append(sampleBuffer)
        self.videoTimestamps.append(
            Timestamp(
                frame: videoTimestamps.count,
                value: timestamp.value,
                timescale: timestamp.timescale
            )
        )
        let ddm = depthData.depthDataMap
        depthCapture.addDepthData(pixelBuffer: ddm, timestamp: timestamp)
    }
}
```
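A likely contributing factor: depth is typically produced at a lower native rate than video on these devices, both outputs drop late frames by default, and AVCaptureDataOutputSynchronizer only delivers matched pairs, so fewer depth records than video frames is expected. A hedged sketch of the knobs involved (the function is illustrative; the output and device names mirror the post):

```swift
import AVFoundation

// Illustrative sketch: settings that affect how many depth frames arrive.
func configureDepthDelivery(depthDataOutput: AVCaptureDepthDataOutput,
                            videoDataOutput: AVCaptureVideoDataOutput,
                            device: AVCaptureDevice) throws {
    // Late frames are silently dropped by default; keeping them raises the count.
    depthDataOutput.alwaysDiscardsLateDepthData = false
    videoDataOutput.alwaysDiscardsLateVideoFrames = false

    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    // Depth often tops out below the video rate; check what the format allows.
    if let range = device.activeDepthDataFormat?.videoSupportedFrameRateRanges.first {
        print("Depth supports up to \(range.maxFrameRate) fps") // frequently < 30
    }
}
```

If the active depth format maxes out below 30 fps, a 1:1 match with a 30 fps video stream is not achievable regardless of the discard settings.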
3 replies · 0 boosts · 446 views · Feb ’25
Build a vision capture system for apple vision pro
Hi, I am a newbie here. We have been given a task to build a robotic vision system to capture an immersive video in a hazed environment, which will later be played on an Apple Vision Pro. I am thinking of starting with 2 or 4 basic CMOS camera sensors, such as the IMX378, AR0144, or VD66GY, and designing an FPGA-based circuit to synchronously capture and store raw frame-by-frame data. Some initial frame processing, such as demosaicing and filtering, can also be done by the FPGA. Then I would use software post-processing to convert the data into a video format compatible with Apple Vision Pro. Will this idea work? I can handle the raw data capture, but I'm unsure whether this approach is feasible and what post-processing software I should use. Thanks a lot for your suggestions! Charlie
0 replies · 0 boosts · 309 views · Feb ’25
How to write RGB & Depth Frames Without Losing Synchronization
I’m currently working on a project where I capture both depth frames and RGB frames using AVCaptureDataOutputSynchronizer. Depth frames are stored as raw binary data, and RGB frames are saved with AVAssetWriter. The issue I’m facing is that AVAssetWriter appears to enforce a fixed frame rate, adding or discarding frames to maintain it (as I understand it). This desynchronizes the depth and RGB frames, which is a problem because I need each depth frame to be matched exactly with the RGB frame captured alongside it. How can I ensure that the RGB frames are saved without AVAssetWriter modifying the frame count?
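One way to verify and repair the pairing, sketched under the assumption that AVAssetWriter preserves the presentation timestamps of buffers it actually appends: key both streams on the buffer PTS and match frames by timestamp afterwards rather than by index. All names below are illustrative:

```swift
import CoreMedia

// Illustrative sketch: log the presentation timestamp of every appended video
// frame and every stored depth frame, then match by PTS after recording.
struct FrameKey: Hashable {
    let value: CMTimeValue
    let timescale: CMTimeScale
    init(_ time: CMTime) {
        value = time.value
        timescale = time.timescale
    }
}

final class SyncLog {
    private(set) var videoKeys: [FrameKey] = []
    private(set) var depthKeys: [FrameKey] = []

    func logVideo(_ sampleBuffer: CMSampleBuffer) {
        videoKeys.append(FrameKey(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)))
    }

    func logDepth(at timestamp: CMTime) {
        depthKeys.append(FrameKey(timestamp))
    }

    // Since the synchronizer delivers matched pairs, every depth key should
    // also appear among the video keys; anything left over was dropped.
    func unmatchedDepthKeys() -> [FrameKey] {
        let video = Set(videoKeys)
        return depthKeys.filter { !video.contains($0) }
    }
}
```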
0 replies · 0 boosts · 361 views · Feb ’25
Reference White Calculation for HDR Video Rendering in Metal
Our multimedia application Boinx FotoMagico displays media files of various kinds with a Metal rendering engine. At the moment we still use the .bgra8Unorm pixel format and sRGB color space and only render in SDR, which is increasingly a problem, as much of today's video content is HDR (e.g. videos shot on an iPhone). For that reason we would like to switch to EDR rendering with the .rgba16Float pixel format and extendedLinearDisplayP3 color space.

We have already worked out how to do this for HDR image files, but we still have a technical problem when rendering HDR video files. We are using AVFoundation to get the video frames as CVPixelBuffers and convert them to MTLTexture using a CVMetalTextureCache. The MTLTextures are then further processed in various compute shaders before being rendered to screen. However, the pixel values in the texture are not what we expected: video frames appear too bright/overexposed.

In the WWDC21 session "Explore HDR rendering with EDR", Ken Greenebaum mentioned: “AVFoundation does not presently decode HDR formats, such as HDR10, to EDR. Consequently, these need to be adapted for use with EDR rendering. This conversion is straightforward and involves two steps. First, converting to linear light by applying the inverse transfer function. And second, dividing by the medium's reference white.” https://developer.apple.com/videos/play/wwdc2021/10161?time=1498

However, the session does not explain how to get or calculate the correct value for "reference white". We could not find any relevant info on the web, which is why we need DTS assistance. We need code that calculates the correct reference white value for any kind of video, whether SDR or HDR, regardless of codec and encoding. I assume Ken Greenebaum is the best Apple engineer to ask in this case, because he recorded most of the EDR-related WWDC sessions in recent years.

We have written a small test app that renders a short sample video (HLG encoding). The window contains two views. The upper view uses an AVPlayerLayer and renders the video natively, just like QuickTime Player. The video content looks correct here. BTW, the window background is SDR white, so that bright EDR pixels can be clearly identified, e.g. the clouds just above the mountains in the upper left corner of the sample video. You may need to lower display brightness a bit if these clouds do not appear brighter than the white window background. The bottom view uses a CAMetalLayer and low-level Metal rendering. The CVPixelBuffers we receive from AVFoundation still need to be scaled down so that SDR reference white reaches pixel value 1.0. Entering a value of 9.0 to 10.0 for reference white in the text field makes it look about right on my Studio Display, but that is just experimental for this sample video file. We need code to calculate the correct value for reference white for any kind of video file!

We have a couple of questions:

- SDR videos should probably use 1.0 as reference white, as their encoded pixel values can already be used as is. Is this assumption correct?
- Will different HDR video encodings (HLG, PQ, etc.) lead to different values for reference white?
- Is the value for reference white constant throughout a video, or can it vary over time, either scene by scene or even frame by frame?
- If it can vary, does the CVPixelBuffer of the current video frame contain all the metadata necessary to calculate the correct value?
- Does NSScreen.maximumExtendedDynamicRangeColorComponentValue also influence the reference white value?

The attached sample project is structured so that the only piece of code that needs to be modified is the ViewController.sdrReferenceWhiteValue() function. Please read the comments and the #warning in this function; this is where the code for calculating the reference white value should be inserted. Here is the download link for the sample project: https://www.dropbox.com/scl/fi/4w5gmftav5xhbixu9u6pb/HDRMetalTest.zip?rlkey=n8cm02soux3rx03vplgo6h1lm&dl=0
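Not an authoritative answer, but for orientation: ITU-R BT.2408 puts nominal reference white at about 203 nits, which suggests per-transfer-function scale factors along the lines of the sketch below. The constants are assumptions (and do not obviously reproduce the empirical 9 to 10 factor mentioned above), so treat this only as a starting point:

```swift
import CoreVideo

// Illustrative sketch: derive a candidate reference-white divisor from the
// frame's transfer-function attachment. Constants follow the BT.2408
// convention (reference white ≈ 203 nits) and are assumptions, not Apple docs.
func candidateReferenceWhite(for pixelBuffer: CVPixelBuffer) -> Float {
    guard let attachment = CVBufferCopyAttachment(pixelBuffer,
                                                  kCVImageBufferTransferFunctionKey,
                                                  nil),
          let transferFunction = attachment as? String else {
        return 1.0 // No attachment: treat as SDR; pixel values are used as-is.
    }
    if transferFunction == (kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ as String) {
        // PQ is absolute: after the inverse EOTF, 1.0 corresponds to 10,000 nits.
        return 10_000.0 / 203.0
    } else if transferFunction == (kCVImageBufferTransferFunction_ITU_R_2100_HLG as String) {
        // HLG is relative; assuming a 1,000-nit nominal peak per BT.2100.
        return 1_000.0 / 203.0
    }
    return 1.0 // BT.709 / sRGB content: SDR reference white is already 1.0.
}
```

Since the attachment is read per frame, a per-frame (or at least per-format-change) value is possible in principle, which speaks to the "can it vary over time" question above.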
9 replies · 2 boosts · 666 views · Feb ’25
H.265 Decoding with VideoToolbox
I am creating an app that decodes H.265 elementary streams on iOS. I use VideoToolbox to decode from H.265 to NV12, and the decoded data is enqueued in an AVSampleBufferDisplayLayer as CMSampleBuffers. However, nothing is displayed in the VideoPlayerView; it remains black. The decoding in VideoToolbox is successful. I confirmed this by saving the NV12 data from the CMSampleBuffer to a file and displaying it with a tool. Why is nothing displayed in the VideoPlayerView? I can provide other source code as well.

```swift
//
//  VideoPlayerView.swift
//  H265Decoder
//
//  Created by Kohshin Tokunaga on 2025/02/15.
//

import SwiftUI
import AVFoundation

struct VideoPlayerView: UIViewRepresentable {

    // Return an H265Player as the coordinator, and start playback there.
    func makeCoordinator() -> H265Player {
        H265Player()
    }

    func makeUIView(context: Context) -> UIView {
        let uiView = UIView(frame: .zero)

        // Base layer for attaching sublayers
        uiView.backgroundColor = .black // Screen background color (for iOS)

        // Create the display layer and add it to uiView.layer
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.bounds
        displayLayer.backgroundColor = UIColor.clear.cgColor
        uiView.layer.addSublayer(displayLayer)

        // Start playback
        context.coordinator.startPlayback()

        return uiView
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Reset the frame of the AVSampleBufferDisplayLayer when the view's size changes.
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.layer.bounds

        // Optionally update the layer's background color, etc.
        uiView.backgroundColor = .black
        displayLayer.backgroundColor = UIColor.clear.cgColor

        // Flush transactions if necessary
        CATransaction.flush()
    }
}

//
//  H265Player.swift
//  H265Decoder
//
//  Created by Kohshin Tokunaga on 2025/02/15.
//

import Foundation
import AVFoundation
import CoreMedia

class H265Player: NSObject, VideoDecoderDelegate {

    let displayLayer = AVSampleBufferDisplayLayer()
    private var decoder: H265Decoder?

    override init() {
        super.init()
        // Initial configuration for the display layer
        displayLayer.videoGravity = .resizeAspect
        // Initialize the decoder (delegate = self)
        decoder = H265Decoder(delegate: self)
        // For simple playback, set isBaseline to true
        decoder?.isBaseline = true
    }

    func startPlayback() {
        // Load the file "temp2.h265"
        guard let url = Bundle.main.url(forResource: "temp2", withExtension: "h265") else {
            print("File not found")
            return
        }
        do {
            let data = try Data(contentsOf: url)
            // Set FPS and video size as needed
            let packet = VideoPacket(data: data,
                                     type: .h265,
                                     fps: 30,
                                     videoSize: CGSize(width: 1080, height: 1920))
            // Decode as a single packet
            decoder?.decodeOnePacket(packet)
        } catch {
            print("Failed to load file: \(error)")
        }
    }

    // MARK: - VideoDecoderDelegate
    func decodeOutput(video: CMSampleBuffer) {
        // When decoding is complete, send the output to AVSampleBufferDisplayLayer
        displayLayer.enqueue(video)
    }

    func decodeOutput(error: DecodeError) {
        print("Decoding error: \(error)")
    }
}
```
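One frequent cause of a black AVSampleBufferDisplayLayer despite successful decodes is timing: without a running timebase, enqueued buffers may never be presented. A hedged check is to mark each decoded buffer for immediate display before enqueueing it (and to inspect `displayLayer.status` and `displayLayer.error` for failures). The helper name below is illustrative:

```swift
import CoreMedia

// Illustrative check: mark a decoded sample buffer so the display layer
// presents it immediately instead of waiting on a timebase.
func markForImmediateDisplay(_ sampleBuffer: CMSampleBuffer) {
    guard let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer,
                                                                    createIfNecessary: true),
          CFArrayGetCount(attachments) > 0 else { return }
    let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0),
                             to: CFMutableDictionary.self)
    CFDictionarySetValue(dict,
                         Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                         Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
}
```

Calling this in `decodeOutput(video:)` before `displayLayer.enqueue(video)` would isolate whether the problem is presentation timing rather than the decode path.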
0 replies · 0 boosts · 362 views · Feb ’25
Alternative for crashing API MPMediaItemArtwork
When setting the now-playing info for media in MPNowPlayingInfoCenter, we can set artwork. But it seems the Apple API for creating the artwork crashes on iOS 18 (FB15145734). On iOS 17 this gave the warning that the completion handler was not run on the main thread. I've sought help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.

```swift
.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()

        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}
```

I'm wondering if there is an alternative method to set the now-playing artwork.
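A possible workaround, sketched here as an assumption rather than a confirmed fix for FB15145734: avoid sharing the NSImage across threads by extracting an immutable CGImage up front and building a fresh NSImage inside the request handler, which may be called off the main thread:

```swift
import AppKit
import MediaPlayer

// Illustrative sketch: capture an immutable CGImage before constructing the
// artwork, so the (possibly non-main-thread) request handler never touches
// the original NSImage.
func makeArtwork(from image: NSImage) -> MPMediaItemArtwork? {
    var rect = NSRect(origin: .zero, size: image.size)
    guard let cgImage = image.cgImage(forProposedRect: &rect, context: nil, hints: nil) else {
        return nil
    }
    let size = image.size
    return MPMediaItemArtwork(boundsSize: size) { _ in
        // CGImage is immutable and safe to read from any thread.
        NSImage(cgImage: cgImage, size: size)
    }
}
```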
4 replies · 0 boosts · 903 views · Feb ’25
AVContentKeySession reuse
Context

We develop an iOS/Apple TV app that plays HLS+FP Live streams (custom playback UI), some of which use the same FairPlay content key id. All FairPlay content keys are requested from the same content key server.

Implementation

Despite Apple's documentation warning not to reuse AVContentKeySessions, we use only one AVContentKeySession for all channels, which allows the system to reuse the content key when a content key id is met again. As seen in another thread, people seem to think this is OK.

Issue

When reusing the AVContentKeySession and the user quickly tunes channels multiple times (up to 2 or 3 times per second using gestures), an inconsistency may occur where the content key request for a previous stream is passed to the delegate after a new stream is already being prepared and its AVURLAsset has already been assigned as the content key session's AVContentKeyRecipient. Note that the previous content key recipient is removed before the new one is added. We have also received crash reports (though I haven't experienced a crash myself) when performing multiple channel tunings, which makes us think that the AVContentKeySession should definitely not be reused.

Note: on the other hand, if a new AVContentKeySession is used for each stream, the system systematically requests a content key even if previous streams have used the same content key id. In that case neither the crash nor the inconsistency is observed, but the number of calls to the content key server increases dramatically.

Questions

Should AVContentKeySessions definitely not be reused? Otherwise, how should the inconsistency described above be handled?
0 replies · 0 boosts · 413 views · Feb ’25
CoreMediaErrorDomain -12035 error when playing a FairPlay-protected HLS stream on iOS 18+ through the Apple Lightning Digital AV Adapter
Our iOS/AppleTV video content playback app uses AVPlayer to play HLS video streams and supports both custom and system playback UIs. The FairPlay content key is retrieved using AVContentKeySession. AirPlay is supported too. When the iPhone is connected to a TV through the Lightning Apple Digital AV Adapter (A1438), the app is mirrored as expected.

Problem: when using an iPhone or iPad on iOS 18.1.1, FairPlay-protected HLS streams are not played, and a CoreMediaErrorDomain -12035 error is received by the AVPlayerItem. Also, once the issue has occurred, the mirroring freezes (the TV indefinitely displays the app playback screen) although the app works fine on the iOS device. The content key retrieval works as expected (I can see that two content key requests are made by the system, probably one for local playback and one for the adapter, as when AirPlaying), and the error is thrown after providing the AVContentKeyResponse. Unfortunately, and as far as I know, there is no documentation on CoreMediaErrorDomain errors, so I don't know what -12035 means.

The issue does not occur:
- on an iPhone on iOS 17.7 (even with FairPlay-protected HLS streams)
- when playing DRM-free video content (whatever the iOS version)
- when using the USB-C AV Adapter (whatever the iOS version)

Also worth noting: the issue does not occur with other video playback apps such as Apple TV or Netflix, although I don't have any details on the kind of streams these apps play or the way the FairPlay content key is retrieved (if any), so I don't know if that is relevant.
5 replies · 2 boosts · 1.4k views · Feb ’25
Issue: FPS Drops to 30 After Trimming Video in Photos App
Hi, I am recording a video at 240 FPS within my application and saving it to the Photos app. The recorded video retains 240 FPS in the Photos app. However, after trimming the video using the Photos app and importing it back into my app, the FPS is reduced to 30 FPS.

Steps to Reproduce:
1. Record a video inside the application at 240 FPS.
2. Save the recorded video to the Photos app.
3. Verify that the video retains 240 FPS in the Photos app.
4. Trim the video using the built-in Photos app editor.
5. Import the trimmed video back into the application.
6. The FPS of the imported video is now reduced to 30 FPS.

Code Used for Importing Video:

```swift
let options: PHVideoRequestOptions = PHVideoRequestOptions()
options.version = .current // Using `.original` preserves FPS, but I need `.current` for other changes
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

PHImageManager.default().requestAVAsset(forVideo: self, options: options) { (avAsset, audioMix, info) in
    if let urlAsset = avAsset as? AVURLAsset {
        completionHandler(urlAsset.url, self)
    } else {
        self.askForOriginal(completionHandler: completionHandler)
    }
}
```

Observations:
- The original video retains 240 FPS until it is trimmed in the Photos app.
- After trimming, the FPS automatically drops to 30 FPS when imported back into the app.
- If I use options.version = .original, the FPS is preserved, but I need .current to apply other modifications.

Questions:
- Is this expected behavior of PHImageManager when requesting a video with options.version = .current?
- Is there a way to preserve the original FPS while still using .current?
- Are there any workarounds to extract the trimmed video without the FPS reduction?

Any insights or solutions would be greatly appreciated. Thanks in advance!
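A quick way to confirm where the rate drops, sketched with the modern async loading API (the function name is illustrative): inspect the video track's nominal frame rate for the asset returned with `.current` versus `.original`. If the `.current` rendition itself reports 30 fps, the Photos trim produced a re-encoded 30 fps rendition, and no request option would recover 240 fps from it:

```swift
import AVFoundation

// Illustrative diagnostic: report the frame rate the asset actually carries.
func logFrameRate(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return }
    let fps = try await track.load(.nominalFrameRate)
    print("nominalFrameRate: \(fps)")
}
```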
1 reply · 0 boosts · 359 views · Mar ’25
SDAVAssetExportSession or AVAssetWriter fail to process iPhone 16 video with spatial audio tracks
I have been using SDAVAssetExportSession to compress videos in an app I am building. Everything went very smoothly until I got my new iPhone 16: on that device the spatial audio camera setting is turned on by default, and SDAVAssetExportSession starts to fail. I know it has something to do with the audio settings. The current settings look like this:

```swift
exportSession.audioSettings = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100,
    AVEncoderBitRateKey: 128000
]
```

These settings are passed to the underlying AVAssetReader and AVAssetWriter objects. I am not experienced in this area and have had a hard time figuring this out. Does anyone know how to set up AVAssetReader or AVAssetWriter to process video with spatial audio tracks? Thanks in advance.
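A sketch of one possible direction (an assumption, not a confirmed fix): have the reader side downmix whatever channel layout the capture produced to stereo PCM via AVAssetReaderAudioMixOutput, so the 2-channel AAC writer settings match what they are fed. The function name is illustrative:

```swift
import AVFoundation

// Illustrative sketch: downmix the source audio (e.g. a spatial/multichannel
// track) to stereo PCM on the reader side before the AAC encode.
func makeAudioOutput(for asset: AVAsset) async throws -> AVAssetReaderAudioMixOutput? {
    let audioTracks = try await asset.loadTracks(withMediaType: .audio)
    guard !audioTracks.isEmpty else { return nil }

    // Reader outputs must be linear PCM; 2 channels requests a stereo downmix.
    let pcmSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44_100
    ]
    let output = AVAssetReaderAudioMixOutput(audioTracks: audioTracks,
                                             audioSettings: pcmSettings)
    output.alwaysCopiesSampleData = false
    return output
}
```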
1 reply · 0 boosts · 289 views · Mar ’25
Issue with AVAssetDownloadURLSession - Only One Audio Language Track Downloaded
We are using AVAssetDownloadURLSession to download content with multiple audio tracks, but we are facing an issue where only one audio language is downloaded, despite explicitly requesting multiple audio languages. All subtitle variants, however, are downloaded successfully.

Issue Details:
- Observed Behaviour: When initiating a download using AVAssetDownloadURLSession, only one audio track (Hindi, in this case) is downloaded, even though the content contains multiple audio tracks.
- Expected Behaviour: All requested audio tracks should be downloaded, similar to how subtitle variants are successfully downloaded.

Sample app implementation details: https://drive.google.com/file/d/1DLcBGNnuWFYsY0cipzxpIHqZYUDJujmN/view?usp=sharing

The manifest file for the asset looks something like this:

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A1",NAME="Hindi",LANGUAGE="hi",URI="indexHindi/Hindi.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A2",NAME="Bengali",LANGUAGE="bn",URI="indexBengali/Bengali.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A3",NAME="Kannada",LANGUAGE="kn",URI="indexKannada/Kannada.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A4",NAME="Malayalam",LANGUAGE="ml",URI="indexMalayalam/Malayalam.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A5",NAME="Tamil",LANGUAGE="ta",URI="indexTamil/Tamil.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A6",NAME="Telugu",LANGUAGE="te",URI="indexTelugu/Telugu.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",DEFAULT=YES,AUTOSELECT=YES,FORCED=NO,LANGUAGE="en",URI="subtitle_en/sub_en_vtt.m3u8"
```
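For reference, a plain AVAssetDownloadTask downloads only the default media selections; the API intended for downloading multiple renditions is the aggregate download task, which takes one AVMediaSelection per desired option (on newer systems, AVAssetDownloadConfiguration plays the same role). A sketch, with illustrative names:

```swift
import AVFoundation

// Illustrative sketch: request every audible rendition explicitly.
// `downloadSession` stands in for a configured AVAssetDownloadURLSession.
func downloadAllAudio(asset: AVURLAsset,
                      downloadSession: AVAssetDownloadURLSession) async throws {
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }
    let preferred = try await asset.load(.preferredMediaSelection)

    // Build one AVMediaSelection per audio option.
    var selections: [AVMediaSelection] = []
    for option in group.options {
        guard let selection = preferred.mutableCopy() as? AVMutableMediaSelection else { continue }
        selection.select(option, in: group)
        selections.append(selection)
    }

    // An aggregate task downloads all requested media selections for the asset.
    let task = downloadSession.aggregateAssetDownloadTask(with: asset,
                                                          mediaSelections: selections,
                                                          assetTitle: "Multi-audio asset",
                                                          assetArtworkData: nil)
    task?.resume()
}
```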
1 reply · 6 boosts · 554 views · Mar ’25
Inconsistent FPS (20 FPS Issue) While Recording Video Using AVCaptureSession.
Hi, I am recording video in my app and setting the FPS with the code below, but sometimes the video is recorded at 20 FPS. Can someone please let me know what I am doing wrong?

```swift
private func eightBitVariantOfFormat() -> AVCaptureDevice.Format? {
    let activeFormat = self.videoDeviceInput.device.activeFormat
    let fpsToBeSupported: Int = 60
    debugPrint("fpsToBeSupported - \(fpsToBeSupported)" as AnyObject)

    let allSupportedFormats = self.videoDeviceInput.device.formats
    debugPrint("all formats - \(allSupportedFormats)" as AnyObject)

    let activeDimensions = CMVideoFormatDescriptionGetDimensions(activeFormat.formatDescription)
    debugPrint("activeDimensions - \(activeDimensions)" as AnyObject)

    let filterBasedOnDimensions = allSupportedFormats.filter({
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).width == activeDimensions.width) &&
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).height == activeDimensions.height)
    })
    if filterBasedOnDimensions.isEmpty {
        // Dimension not found. Required format not found to handle.
        debugPrint("Dimension not found" as AnyObject)
        return activeFormat
    }
    debugPrint("filterBasedOnDimensions - \(filterBasedOnDimensions)" as AnyObject)

    let filterBasedOnMaxFrameRate = filterBasedOnDimensions.compactMap({ format -> AVCaptureDevice.Format? in
        let videoSupportedFrameRateRanges = format.videoSupportedFrameRateRanges
        if !videoSupportedFrameRateRanges.isEmpty {
            let contains = videoSupportedFrameRateRanges.contains(where: { Int($0.maxFrameRate) >= fpsToBeSupported })
            if contains {
                return format
            } else {
                return nil
            }
        } else {
            return nil
        }
    })
    debugPrint("allFormatsToBeSupported - \(filterBasedOnMaxFrameRate)" as AnyObject)

    guard !filterBasedOnMaxFrameRate.isEmpty else {
        debugPrint("Taking default active format as nothing found when filtered using desired FPS" as AnyObject)
        return activeFormat
    }

    var formatToBeUsed: AVCaptureDevice.Format!
    if let four_two_zero_v = filterBasedOnMaxFrameRate.first(where: {
        CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    }) { // 'vide'/'420v'
        formatToBeUsed = four_two_zero_v
    } else {
        // Take the first one from the above array.
        formatToBeUsed = filterBasedOnMaxFrameRate.first
    }

    do {
        try self.videoDeviceInput.device.lockForConfiguration()
        self.videoDeviceInput.device.activeFormat = formatToBeUsed
        self.videoDeviceInput.device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        self.videoDeviceInput.device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        if videoDeviceInput.device.isFocusModeSupported(.continuousAutoFocus) {
            self.videoDeviceInput.device.focusMode = AVCaptureDevice.FocusMode.continuousAutoFocus
        }
        self.videoDeviceInput.device.unlockForConfiguration()
    } catch let error {
        debugPrint("\(error)" as AnyObject)
    }

    return formatToBeUsed
}
```
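A common culprit for 60 fps capture silently dropping to around 20 fps is auto exposure lengthening the per-frame exposure time in low light. One hedged mitigation, sketched below with an illustrative function name, is to cap the maximum exposure duration to the frame duration (at the cost of darker, noisier frames); HDR video is a secondary suspect worth ruling out:

```swift
import AVFoundation

// Illustrative sketch: pin frame timing so auto exposure cannot stretch frames.
func capFrameTiming(for device: AVCaptureDevice, fps: Int32) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    let frameDuration = CMTimeMake(value: 1, timescale: fps)
    device.activeVideoMinFrameDuration = frameDuration
    device.activeVideoMaxFrameDuration = frameDuration

    // In low light, auto exposure may exceed 1/fps seconds per frame, which
    // caps the effective rate (e.g. ~20 fps). Limit it to the frame duration.
    device.activeMaxExposureDuration = frameDuration

    // Secondary suspect: HDR video can constrain frame timing on some formats.
    if device.activeFormat.isVideoHDRSupported {
        device.automaticallyAdjustsVideoHDREnabled = false
        device.isVideoHDREnabled = false
    }
}
```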
1 reply · 0 boosts · 396 views · Mar ’25
AVPlayer: Significant Delays and Asset Loss When Playing Partially Downloaded HLS Content Offline
We're experiencing significant issues with AVPlayer when attempting to play partially downloaded HLS content in offline mode. Our app downloads HLS video content for offline viewing, but users encounter the following problems:

- Excessive Loading Delay: When offline, AVPlayer attempts to load resources for up to 60 seconds before playing the locally available segments.
- Asset Loss: Sometimes AVPlayer completely loses the asset reference and fails to play the video on subsequent attempts.
- Inconsistent Behavior: The same partially downloaded asset might play immediately in one session but take 30+ seconds in another.
- Network Activity Despite Offline Settings: Despite configuring options to prevent network usage, AVPlayer still appears to attempt network connections.

These issues severely impact our offline user experience, especially for users with intermittent connectivity.

Technical Details

Implementation Context

Our app downloads HLS videos for offline viewing using AVAssetDownloadTask. We store the downloaded content locally and maintain a dictionary mapping file identifiers to local paths. When attempting to play these videos offline, we experience the described issues.

Current Implementation

Here's our current implementation for playing the videos:

```objc
- (void)presentNativeAvplayerForVideo:(Video *)video navContext:(NavContext *)context {
    NSString *localPath = video.localHlsPath;
    if (localPath) {
        NSURL *videoURL = [NSURL URLWithString:localPath];
        NSDictionary *options = @{
            AVURLAssetPreferPreciseDurationAndTimingKey: @YES,
            AVURLAssetAllowsCellularAccessKey: @NO,
            AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
            AVURLAssetAllowsConstrainedNetworkAccessKey: @NO
        };
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:options];
        AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
        NSArray *keys = @[@"duration", @"tracks"];
        [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
                AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
                playerViewController.player = player;
                [player play];
            });
        }];
        playerViewController.modalPresentationStyle = UIModalPresentationFullScreen;
        [context presentViewController:playerViewController animated:YES completion:nil];
    }
}
```

Attempted Solutions

We've tried several approaches to mitigate these issues.

Modified asset options:

```objc
NSDictionary *options = @{
    AVURLAssetPreferPreciseDurationAndTimingKey: @NO, // Changed to NO
    AVURLAssetAllowsCellularAccessKey: @NO,
    AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
    AVURLAssetAllowsConstrainedNetworkAccessKey: @NO,
    AVAssetReferenceRestrictionsKey: @(AVAssetReferenceRestrictionForbidRemoteReferenceToLocal)
};
```

Skipped asynchronous key loading:

```objc
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset automaticallyLoadedAssetKeys:nil];
```

Modified player settings:

```objc
player.automaticallyWaitsToMinimizeStalling = NO;
[playerItem setPreferredForwardBufferDuration:2.0];
```

Added network resource restrictions:

```objc
playerItem.canUseNetworkResourcesForLiveStreamingWhilePaused = NO;
```

Used file URLs instead of HTTP URLs where possible.

Despite these attempts, the issues persist.

Expected vs. Actual Behavior

Expected Behavior:
- AVPlayer should immediately begin playback of locally available HLS segments.
- When offline, it should not attempt to load from the network for more than a few seconds.
- Once an asset is successfully played, it should be reliably available for future playback.

Actual Behavior:
- AVPlayer waits 10-60 seconds before playing locally available segments.
- Network activity is observed despite all network-restricting options.
- Sometimes the player fails completely to play a previously available asset.
- Behavior is inconsistent between playback attempts with the same asset.

Questions:
1. What is the recommended approach for playing partially downloaded HLS content offline with minimal delay?
2. Is there a way to force AVPlayer to immediately use available local segments without attempting to load from the network?
3. Are there any known issues with AVPlayer losing references to locally stored HLS assets?
4. What diagnostic steps would you recommend to track down the specific cause of these delays?
5. Does AVFoundation have specific timeouts for offline HLS playback that could be configured?

Any guidance would be greatly appreciated, as this issue is significantly impacting our user experience.

Device Information
- iOS Versions Tested: 14.5 - 18.1
- Device Models: iPhone 12, iPhone 13, iPhone 14, iPhone 15
- Xcode Version: 15.3-16.2.1
1 reply · 0 boosts · 482 views · Mar ’25
Using the AVPlayer to play encrypted streams causes the system to reboot
When we use AVPlayer to play DRM-encrypted streams on iOS 17.6.1, playback does not work normally, and there is a high probability of a system restart. This is the relevant core error log:

```
error 14:47:53.323369+0800 audiomxd [AirPlayError] carManager_copyProperty_block_invoke:499: got error -12784/0xFFFFCE10 kCMBaseObjectError_PropertyNotFound
error 14:47:53.323414+0800 audiomxd [SPEndpointManagerFactory] SidePlay Endpoint Manager creation failed with -72390/0xFFFEE53A
error 14:47:53.364949+0800 audiomxd [APBrowserCarSessionHelper] [0xF6AA] [Bonjour/WiFi] Unrecognized ConnectivityHelper event 101
error 14:47:53.375313+0800 audiomxd AddInstanceForFactory: No factory registered for id <CFUUID 0xa5c5118c0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
```
1 reply · 0 boosts · 247 views · Mar ’25
AVPlayer handle 403 error
Hello, is there a way to handle a 403 error returned by the server, e.g. when a token expires? I cannot find any information about this, and everything I tried wasn't working (addObserver, NotificationCenter with .AVPlayerItemNewErrorLogEntry, AVPlayerItemPlaybackStalled, ...). Thank you very much.
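For reference, a minimal version of the error-log route (which the post says was attempted; a frequent pitfall is observing the wrong object, or reading the log before the failing request has landed). AVPlayer does not surface HTTP status codes directly, so the 403 has to be fished out of the AVPlayerItemErrorLog. The class and callback names are illustrative:

```swift
import AVFoundation

// Illustrative sketch: detect an HTTP 403 via the item's error log.
// Token refresh and retry are left to app logic.
final class PlayerErrorWatcher {
    private var observer: NSObjectProtocol?

    func watch(_ item: AVPlayerItem, onForbidden: @escaping () -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewErrorLogEntry,
            object: item,           // observe the specific item, not nil
            queue: .main) { _ in
            guard let events = item.errorLog()?.events else { return }
            if events.contains(where: { $0.errorStatusCode == 403 }) {
                onForbidden()       // e.g. refresh the token, replace the item
            }
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```

Observing AVPlayerItem.status for .failed (fatal errors) alongside the error log covers the case where the 403 kills playback outright rather than only logging an entry.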
0 replies · 0 boosts · 138 views · Mar ’25