Streaming

Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Streaming Documentation

Posts under Streaming subtopic

Each entry below lists its replies, boosts, views, and latest activity.

Generating a new FPS certificate (SDK 26) alongside an existing SDK 4 certificate
Hi, Our client currently has an FPS deployment certificate generated with SDK version 4 that is still actively used in production. They would like to generate an additional certificate using SDK version 26. Before doing so, they just want to confirm: Will the existing SDK 4 certificate remain unaffected and still visible in the Apple Developer portal? Any considerations they should keep in mind? Thanks!
Replies: 0 · Boosts: 0 · Views: 39 · Activity: 5h
On iOS 26, HLS alternate audio track selection behaves inconsistently
Summary

On iOS 26, HLS alternate audio track selection behaves inconsistently on both VOD and live streams: the French track falls back to the DEFAULT=YES (English) track after manual selection, and in some cases switching to a non-default track appears to work but it is then impossible to switch back to English.

Environment

iOS version: 26
Players affected: native Safari on iOS 26 and THEOplayer (issue also reproducible on THEOplayer's own demo page)
Stream type: HLS/CMAF with demuxed alternate audio renditions (CMFC container)
Affected stream types: both VOD and live streaming
Issue NOT present on iOS 17/18

Manifest

#EXTM3U
#EXT-X-VERSION:4
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-STREAM-INF:BANDWIDTH=8987973,AVERAGE-BANDWIDTH=8987973,VIDEO-RANGE=SDR,CODECS="avc1.640028",RESOLUTION=1920x1080,FRAME-RATE=29.970,AUDIO="program_audio"
video_1080p.m3u8
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="program_audio",LANGUAGE="en",ASSOC-LANGUAGE="en",NAME="English",AUTOSELECT=YES,DEFAULT=YES,CHANNELS="2",URI="audio_ENG.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="program_audio",LANGUAGE="de",ASSOC-LANGUAGE="de",NAME="Deutsch",AUTOSELECT=YES,DEFAULT=NO,CHANNELS="2",URI="audio_DEU.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="program_audio",LANGUAGE="fr",ASSOC-LANGUAGE="fr",NAME="Francais",AUTOSELECT=YES,DEFAULT=NO,CHANNELS="2",URI="audio_FRA.m3u8"

Steps to Reproduce

1. Load the HLS manifest (VOD or live) in Safari on iOS 26, or in any AVFoundation-backed player
2. Start playback — English plays correctly as DEFAULT
3. Manually select "Francais" from the audio track selector
4. Observe that English audio continues playing (French does not play)
5. In a separate scenario: manually select "Deutsch" — German plays correctly
6. Attempt to switch back to English — English does not resume; audio remains on the previously selected track

Expected behavior

Selecting any track should immediately switch to that language
Switching back to English (DEFAULT=YES) should work at any time
Behavior should be consistent across VOD and live streams

Actual behavior

Two distinct anomalies observed, reproducible on both VOD and live streams, in both native Safari and THEOplayer:

French-specific fallback: selecting the French track causes playback to fall back to English. This does not happen with German.
Cannot return to English: in cases where a non-default track plays correctly, attempting to switch back to the DEFAULT=YES track (English) fails — the previous non-default track continues playing.

The fact that the issue reproduces in native Safari confirms this is an AVFoundation/WebKit-level regression, not a third-party player bug.

What we have already verified and ruled out

LANGUAGE codes are BCP-47 compliant (en, fr, de) ✓
EXT-X-VERSION:4 is present ✓
Audio codec removed from STREAM-INF CODECS (video-only) ✓
ASSOC-LANGUAGE attribute added matching LANGUAGE value ✓
Container metadata verified via ffprobe: mdhd box correctly contains language tags (e.g. "fra") ✓
Audio segment content verified via ffplay: correct audio in each language file ✓
French audio source file contains correct French audio content ✓
Issue reproduces in native Safari on iOS 26, confirming it is not a THEOplayer-specific bug
Issue does NOT reproduce on iOS 17/18 with the same manifest and segments

Additional notes

The VOD stream is packaged with AWS MediaConvert, CMAF output group, SEGMENTED_FILES, AAC-LC codec (mp4a.40.2), 128kbps, 48kHz stereo. English uses AudioTrackType ALTERNATE_AUDIO_AUTO_SELECT_DEFAULT; French and German use ALTERNATE_AUDIO_AUTO_SELECT. The live stream uses AWS MediaPackage with a similar CMAF/HLS output configuration.

Please advise whether this is a known regression in AVFoundation on iOS 26 and whether a fix is planned.
Replies: 1 · Boosts: 0 · Views: 64 · Activity: 2d
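For readers reproducing the report above outside Safari, this is roughly how a manual audio switch is driven through AVFoundation. A minimal sketch, assuming an AVPlayerItem named item that is playing the stream and BCP-47 tags matching the manifest quoted above; the function name is illustrative.

import AVFoundation

// Select an alternate audio rendition by BCP-47 language tag.
func selectAudio(language: String, in item: AVPlayerItem) async throws {
    guard let group = try await item.asset.loadMediaSelectionGroup(for: .audible) else { return }
    // Narrow the group's options to the requested locale (e.g. "fr", "de", "en").
    let matches = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        with: [Locale(identifier: language)]
    )
    // Passing nil instead would revert to automatic selection, i.e. the DEFAULT=YES rendition.
    item.select(matches.first, in: group)
}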
Swift Array Out of Bounds Crash in VTFrameProcessor when using VTLowLatencyFrameInterpolationParameters
Hi everyone,

Our team is encountering a reproducible crash when using VTLowLatencyFrameInterpolation on iOS 26.3 while processing a live LL-HLS input stream.

🤖 Environment

Device: iPhone 16
OS: iOS 26.3
Xcode: Xcode 26.3
Framework: VideoToolbox

💥 Crash Details

The application crashes with the following fatal error:

Fatal error: Swift/ContiguousArrayBuffer.swift:184: Array index out of range

The stack trace highlights the following:

VTLowLatencyFrameInterpolationImplementation processWithParameters:frameOutputHandler:
Called from VTFrameProcessor.process(parameters:)

Here is the simplified implementation block where the crash occurs. (Note: PrismSampleBuffer and PrismLLFIError are our internal custom wrapper types.)

// Create `VTFrameProcessorFrame` for the source (previous) frame.
let sourcePTS = sourceSampleBuffer.presentationTimeStamp
var sourceFrame: VTFrameProcessorFrame?
if let pixelBuffer = sourceSampleBuffer.imageBuffer {
    sourceFrame = VTFrameProcessorFrame(buffer: pixelBuffer, presentationTimeStamp: sourcePTS)
}

// Validate the source VTFrameProcessorFrame.
guard let sourceFrame else {
    throw PrismLLFIError.missingImageBuffer
}

// Create `VTFrameProcessorFrame` for the next frame.
let nextPTS = nextSampleBuffer.presentationTimeStamp
var nextFrame: VTFrameProcessorFrame?
if let pixelBuffer = nextSampleBuffer.imageBuffer {
    nextFrame = VTFrameProcessorFrame(buffer: pixelBuffer, presentationTimeStamp: nextPTS)
}

// Validate the next VTFrameProcessorFrame.
guard let nextFrame else {
    throw PrismLLFIError.missingImageBuffer
}

// Calculate interpolation intervals and allocate destination frame buffers.
let intervals = interpolationIntervals()
let destinationFrames = try framesBetween(firstPTS: sourcePTS, lastPTS: nextPTS, interpolationIntervals: intervals)
let interpolationPhase: [Float] = intervals.map { Float($0) }

// Create VTLowLatencyFrameInterpolationParameters.
// This sets up the configuration required for temporal frame interpolation between the previous and current source frames.
guard let parameters = VTLowLatencyFrameInterpolationParameters(
    sourceFrame: nextFrame,
    previousFrame: sourceFrame,
    interpolationPhase: interpolationPhase,
    destinationFrames: destinationFrames
) else {
    throw PrismLLFIError.failedToCreateParameters
}

try await send(sourceSampleBuffer)

// Process the frames.
// Using progressive callback here to get the next processed frame as soon as it's ready,
// preventing the system from waiting for the entire batch to finish.
for try await readOnlyFrame in self.frameProcessor.process(parameters: parameters) {
    // Create an interpolated sample buffer based on the output frame.
    let newSampleBuffer: PrismSampleBuffer = try readOnlyFrame.frame.withUnsafeBuffer { pixelBuffer in
        try PrismLowLatencyFrameInterpolation.createSampleBuffer(from: pixelBuffer, readOnlyFrame.timeStamp)
    }
    // Pass the newly generated frame to the output stream.
    try await send(newSampleBuffer)
}

🙋 Questions

1. Are there any known limitations or bugs regarding VTLowLatencyFrameInterpolation when handling live 60fps streams?
2. Are there any undocumented constraints we should be aware of regarding source/previous frame timing, pixel buffer attributes, or how destinationFrames and interpolationPhase arrays must be allocated?
3. Is a "warm-up" sequence recommended after startSession() before making the first process(parameters:) call?
Replies: 1 · Boosts: 0 · Views: 434 · Activity: 5d
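One thing worth checking for a crash like the one above is that the two arrays handed to VTLowLatencyFrameInterpolationParameters stay in lockstep. A hedged pre-flight sketch, not a confirmed fix; the function name is chosen here for illustration and the names mirror those in the post.

import VideoToolbox

// The interpolation parameters pair one phase value with one destination frame,
// so a count mismatch is one plausible way to index past the end of an array
// inside the processor.
func validateInterpolationInputs(
    interpolationPhase: [Float],
    destinationFrames: [VTFrameProcessorFrame]
) -> Bool {
    guard interpolationPhase.count == destinationFrames.count else { return false }
    // Each phase is expected to be a fraction of the previous-to-source interval.
    return interpolationPhase.allSatisfy { $0 > 0 && $0 < 1 }
}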
Offline Fairplay Error -42650
We have implemented offline FairPlay playback and it works fine. But at times, when trying to play back the offline downloaded content, we get the following error: "An unknown error occurred (-42650)". We tried looking up the error in the documentation but couldn't find anything relevant. What could possibly be causing this error?
Replies: 6 · Boosts: 2 · Views: 3.2k · Activity: 1w
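For context on where an error like the one above usually surfaces, here is a minimal sketch of serving a previously persisted FairPlay key from an AVContentKeySessionDelegate. loadPersistedKey() is a hypothetical storage helper and the error handling is only illustrative; stale or invalidated key data at this point is one common source of offline playback failures.

import AVFoundation

final class OfflineKeyDelegate: NSObject, AVContentKeySessionDelegate {
    // Required delegate callback; online requests would be handled or converted elsewhere.
    func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest) { }

    // Called for persistable (offline) key requests.
    func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVPersistableContentKeyRequest) {
        if let keyData = loadPersistedKey() {
            // Respond with the stored persistable key data.
            let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: keyData)
            keyRequest.processContentKeyResponse(response)
        } else {
            keyRequest.processContentKeyResponseError(CocoaError(.fileNoSuchFile))
        }
    }

    private func loadPersistedKey() -> Data? { nil } // hypothetical storage lookup
}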
Unity iOS (Metal) → WebRTC (Unity WebRTC) video stream to remote Unity client: PeerConnection connects but receiver renders black frames
Hello! My name is Mason Prather. I'm a graduate student at Kennesaw State University and a Research Engineer working in XR environments through my Graduate Research Assistant role. I’m currently building a research prototype that connects a mobile companion application to a VR headset. The mobile application is built in Unity and deployed on iOS, and it streams video frames to a remote Unity client using WebRTC.

Environment

Device: iPhone 15
OS: iOS 26.3 (tested on physical device, not Simulator)
Engine: Unity 2022.3.57f1
Graphics API: Metal
Streaming Technology: WebRTC (Unity WebRTC package)
Architecture: Mobile Unity app streaming video frames to a remote Unity client
Receiver Device: Meta Quest Pro headset (Unity application)
Networking: LAN (UDP discovery + TCP signaling)
Video Source: Unity RenderTexture

Goal

The goal of the system is to allow a VR user to view media stored on their phone inside a VR environment. The iOS app:

renders or captures media content
converts frames into a WebRTC video track
streams the video to the headset

Current Status

Connection setup works correctly. Observed behavior:

Signaling connection successful
ICE candidate exchange successful
PeerConnection state becomes Connected
Video track created successfully

However, the receiving application displays black frames.

iOS App Details

The video source originates from a Unity RenderTexture. Inside the phone application:

RenderTexture displays correctly
Frames appear correct locally

But the receiving peer does not display the frames.

Relevant Components

Unity WebRTC package
iOS Metal rendering pipeline
Custom TCP signaling
LAN discovery via UDP

Expected Behavior

Rendered frames should transmit via WebRTC and appear on the remote device.

Actual Behavior

The remote video track is active, but the rendered frames appear black on the receiving client.

Questions

1. Are there known issues involving Unity WebRTC + iOS Metal texture capture?
2. Are there specific pixel format requirements when streaming textures from Unity on iOS?
3. Could the issue relate to texture readback limitations or GPU synchronization?

I am more than happy to provide screenshots and console logs upon request. If anyone has experience streaming Unity video frames via WebRTC on iOS, I would greatly appreciate any guidance.
Replies: 0 · Boosts: 0 · Views: 192 · Activity: 1w
Clarification on SPC Version 3 Availability and Requirements (SDK 26 Certificate Bundle)
Hello, I’m using a valid certificate bundle generated with SDK 26 (combined RSA‑1024 + RSA‑2048). However, all my devices currently still generate SPC v2 during playback, including my iPhone 16 under iOS 26.2. Apple staff mentioned that future iOS versions will send SPC v3 when using an SDK 26 certificate bundle. Could you please clarify: Which iOS/macOS versions will first support SPC v3? Are there any additional client‑side requirements (Safari version, playback APIs, headers, etc.) to trigger SPC v3? Is there any way to test SPC v3 today, e.g., using beta builds? Thank you!
Replies: 1 · Boosts: 1 · Views: 437 · Activity: 2w
Offline HLS with FairPlay?
Hello, We have been working on FPS for a while and managed to play an encrypted asset with it, using version 2 of the FairPlay Streaming Server SDK (2.03). Now we are working on "Download and Play Offline" functionality. For this I have downloaded FairPlay Streaming Server SDK 3.0; in the SDK folder there is an "OfflineHLSGuide_withFPS.pdf" document beside "FairPlayStreaming_PG.pdf", which explains how offline playback works and what needs to be done. In that document ("OfflineHLSGuide_withFPS.pdf") there is a new version of the Content Key Duration TLLV which is slightly different from the structure shown in "FairPlayStreaming_PG.pdf". For the Content Key Duration TLLV (0x47acf6a418cd091a), "FairPlayStreaming_PG" shows "Lease Duration" in bytes 16-19, but the OfflineHLSGuide document says "Reserved" for bytes 16-19. Where has Lease Duration gone for offline? Or should I use one Content Key Duration TLLV for non-persistent keys and the other version for persistent (offline) keys? Thanks
Replies: 5 · Boosts: 0 · Views: 2.4k · Activity: 2w
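As background for the byte-offset question above, FairPlay TLLV blocks share one outer envelope (tag, block length, value length, value, padding); the interpretation of the value bytes, including offsets 16-19 discussed above, depends on which guide applies. The sketch below parses just that envelope and is an illustration based on the published layout, not SDK code.

import Foundation

struct TLLV {
    let tag: UInt64
    let value: Data
}

// Big-endian integer helpers.
func beUInt32(_ data: Data, at offset: Int) -> UInt32 {
    data.subdata(in: offset..<offset + 4).reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
}
func beUInt64(_ data: Data, at offset: Int) -> UInt64 {
    data.subdata(in: offset..<offset + 8).reduce(UInt64(0)) { ($0 << 8) | UInt64($1) }
}

// Parse one TLLV starting at `offset` and advance `offset` past the block:
// 8-byte tag, 4-byte block length (value + padding), 4-byte value length, value, padding.
func parseTLLV(_ data: Data, at offset: inout Int) -> TLLV? {
    guard offset + 16 <= data.count else { return nil }
    let tag = beUInt64(data, at: offset)
    let blockLength = Int(beUInt32(data, at: offset + 8))
    let valueLength = Int(beUInt32(data, at: offset + 12))
    guard valueLength <= blockLength, offset + 16 + blockLength <= data.count else { return nil }
    let value = data.subdata(in: (offset + 16)..<(offset + 16 + valueLength))
    offset += 16 + blockLength
    return TLLV(tag: tag, value: value)
}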
Inquiry regarding CoreMediaErrorDomain Code=-15517 during LL-HLS Live Playback
Hello, I am currently developing a live streaming application using AVPlayer to play LL-HLS (Low-Latency HLS) content. During our testing phase, we consistently encountered the following error in the logs: CoreMediaErrorDomain Code=-15517 The challenge we are facing is that the error description is quite vague. It only provides cryptic messages such as "Key not found" or "No value information," which makes it extremely difficult to identify the root cause or perform a deep-dive analysis. I have searched through the official Apple Developer documentation and technical notes, but I couldn’t find any specific reference to what Code -15517 signifies in the context of LL-HLS or CoreMedia. Regarding this issue, I have the following questions: What is the specific meaning of this error code (-15517)? Does it relate to missing tags in the HLS manifest, or is it an internal state issue within the AVPlayer stack? Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored. Is there any additional logging or debugging tool you would recommend to further investigate "Key not found" issues in LL-HLS? Any insights or guidance from the community or Apple engineers would be greatly appreciated. Thank you in advance for your help.
Replies: 1 · Boosts: 0 · Views: 166 · Activity: 3w
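Since the post above asks about additional logging, one low-effort option is AVPlayerItem's built-in error log, which often carries a more descriptive comment and URI than the surfaced CoreMedia code. A small sketch; the player item is an assumed input.

import AVFoundation

// Dump AVPlayerItem's accumulated error log events for the LL-HLS item in question.
func dumpErrorLog(for item: AVPlayerItem) {
    guard let events = item.errorLog()?.events else { return }
    for event in events {
        print("domain:", event.errorDomain,
              "code:", event.errorStatusCode,
              "comment:", event.errorComment ?? "-",
              "uri:", event.uri ?? "-")
    }
}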
The audio of FairPlay protected content can be captured - Safari on iOS
Hi, Has anyone been able to protect the audio part of FairPlay protected content from being captured as part of screen recording on Safari/iOS (PWA and/or online web app)? We have tried many things but could not prevent the audio from being recorded. Same app and content on Safari/Mac does not allow audio to be recorded. Any tips?
Replies: 0 · Boosts: 0 · Views: 130 · Activity: 3w
A mistake in FairPlay Streaming SDK 26 sample code on comparing ProtocolVersionUsed value?
Hello, I am reviewing the sample code of FairPlay Streaming SDK 26 and found a place that I think is a mistake. The code is for the server, in both the Swift and Rust versions. There is an if statement that compares "ProtocolVersionUsed" (spcData.versionUsed) against the SPCVersion1 constant; since "ProtocolVersionUsed" and the SPC version are different things, shouldn't it be using a different constant value?

[createContentKeyPayload.swift]
// Fallback to version 1 if content can have encrypted slice headers, which need to be decrypted separately. Slice headers are not encrypted when using CBCS.
if serverCtx.spcContainer.spcData.versionUsed == base_constants.SPCVersion.v1.rawValue &&

[createContentKeyPayload.rs]
// Fallback to version 1 if content can have encrypted slice headers, which need to be decrypted separately. Slice headers are not encrypted when using CBCS.
if (serverCtx.spcContainer.spcData.versionUsed == SPCVersion::v1 as u32) &&

Thank you.
Replies: 0 · Boosts: 0 · Views: 88 · Activity: 3w
Inquiry regarding CoreMediaErrorDomain Code=-12880 in LL-HLS playback
Hello, I am developing a custom player SDK based on AVPlayer. While testing LL-HLS streams, I intermittently encounter the following error: Error Domain=CoreMediaErrorDomain Code=-12880 Since I cannot find documentation for this specific code, could you please clarify its meaning? Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored. Any insights would be appreciated. Thank you.
Replies: 1 · Boosts: 0 · Views: 159 · Activity: Feb ’26
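Related to the intermittent error above, it can help to capture error-log entries as they arrive rather than polling after the fact. A minimal sketch using the new-error-log-entry notification; the player item is an assumed input.

import AVFoundation

// Observe new error log entries on the given AVPlayerItem and print the most
// recent event whenever one is appended. Keep the returned token alive.
func observeErrorLog(of item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: AVPlayerItem.newErrorLogEntryNotification,
        object: item,
        queue: .main
    ) { notification in
        guard let item = notification.object as? AVPlayerItem,
              let event = item.errorLog()?.events.last else { return }
        print("error log:", event.errorDomain, event.errorStatusCode, event.errorComment ?? "-")
    }
}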
Crash when trying to get originatingRecipient
According to the documentation (https://developer.apple.com/documentation/avfoundation/avcontentkeyrequest/originatingrecipient?changes=_3&language=objc), starting with iOS 18.4 I can get the AVContentKeyRecipient from an AVContentKeyRequest. But when I try to get it, I get a crash. What could be the issue? I want to note that I add the asset to the AVContentKeySession using the addContentKeyRecipient method (https://developer.apple.com/documentation/avfoundation/avcontentkeysession/addcontentkeyrecipient(_:)?changes=_3&language=objc).
Replies: 1 · Boosts: 0 · Views: 251 · Activity: Feb ’26
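A hedged note on the crash report above: since the property only exists from iOS 18.4, reads on earlier systems need an availability guard. A minimal sketch of the access pattern as documented, not a diagnosis of the crash; the key request is an assumed input from the delegate callback.

import AVFoundation

// Guarded access to the key request's originating recipient (iOS 18.4+).
func recipient(of keyRequest: AVContentKeyRequest) -> (any AVContentKeyRecipient)? {
    if #available(iOS 18.4, *) {
        return keyRequest.originatingRecipient
    }
    return nil
}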
FairPlay: SDK4 vs SDK26 credentials/certificate for iOS/tvOS client apps
Hi We’re updating our KSM to support SPC v2/v3 and currently operate with both legacy SDK4 credentials (ASK + 1024 cert) and SDK26 credentials (certificate bundle + provisioning data + 1024/2048 keys). Our client apps run across a wide range of iOS/tvOS versions, so we want to follow Apple’s recommended client strategy for certificate selection. The docs describe SHA‑1 vs SHA‑256 in the SPC header, but do not specify which OS versions should use SDK4 vs SDK26 credentials. Could you clarify: Is there an official minimum iOS/tvOS version where you recommend SDK26 credentials for client apps? For older OS versions (e.g. iOS 15), is SDK4 still the recommended choice for client apps? Are there any official migration guidelines for client apps moving from SDK4 to SDK26 credentials? Thanks in advance.
Replies: 2 · Boosts: 0 · Views: 252 · Activity: Feb ’26
Cannot generate 2048-bit FairPlay Streaming certificate
Hello, I have a problem generating a 2048-bit FairPlay Streaming certificate. I tried generating an SDK v26.x certificate in two ways: (1) using an existing certificate, and (2) creating a new certificate. However, in both cases, Apple gives me a certificate bundle containing a 1024-bit certificate (fps_certificate.bin), even though I uploaded a 2048-bit CSR when creating the certificate. Just to note, I created an SDK v4.x certificate a few years ago. Has anyone run into the same issue? Or am I missing something?
Replies: 5 · Boosts: 0 · Views: 1k · Activity: Feb ’26
Inquiry about Low-Latency Frame Interpolation & Super Resolution using VTFrameProcessor
Hello, I have implemented Low-Latency Frame Interpolation using the VTFrameProcessor framework, based on the sample code from https://developer.apple.com/kr/videos/play/wwdc2025/300. It is currently working well for both LIVE and VOD streams. However, I have a few questions regarding the lifecycle management and synchronization of this feature:

1. Common Questions (Applicable to both Frame Interpolation & Super Resolution)

1.1 Dynamic Toggling
Do you recommend enabling/disabling these features dynamically during playback? Or is it better practice to configure them only during the initial setup/preparation phase? If dynamic toggling is supported, are there any recommended patterns for managing VTFrameProcessor session lifecycle (e.g., startSession / endSession timing)?

1.2 Synchronization Method
I am currently using CADisplayLink to fetch frames from AVPlayerItemVideoOutput and perform processing. Is CADisplayLink the recommended approach for real-time frame acquisition with VTFrameProcessor? If the feature needs to be toggled on/off during active playback, are there any concerns or alternative approaches you would recommend?

1.3 Supported Resolution/Quality Range
What are the minimum and maximum video resolutions supported for each feature? Are there any aspect ratio restrictions (e.g., does it support 1:1 square videos)? Is there a recommended resolution range for optimal performance and quality?

2. Frame Interpolation Specific Questions

2.1 LIVE Stream Support
Is Low-Latency Frame Interpolation suitable for LIVE streaming scenarios where latency is critical? Are there any special considerations for LIVE vs VOD?

3. Super Resolution Specific Questions

3.1 Adaptive Bitrate (ABR) Stream Support
In ABR (HLS/DASH) streams, the video resolution can change dynamically during playback. Is VTLowLatencySuperResolutionScaler compatible with ABR streams where resolution changes mid-playback? If resolution changes occur, should I recreate the VTLowLatencySuperResolutionScalerConfiguration and restart the session, or does the API handle this automatically?

3.2 Small/Square Resolution Issue
I observed that 144x144 (1:1 square) videos fail with error: "VTFrameProcessorErrorDomain Code=-19730: processWithSourceFrame within VCPFrameSuperResolutionProcessor failed" However, 480x270 (16:9) videos work correctly. minimumDimensions reports 96x96, but 144x144 still fails. Is there an undocumented restriction on aspect ratio or a practical minimum resolution?

3.3 Scale Factor Selection
supportedScaleFactors returns [2.0, 4.0] for most resolutions. Is there a recommended scale factor for balancing quality and performance? Are there scenarios where 4.0x should be avoided?

The documentation on this specific topic seems limited, so I would appreciate any insights or advice. Thank you.
Replies: 0 · Boosts: 0 · Views: 403 · Activity: Feb ’26
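For readers unfamiliar with the synchronization approach described in section 1.2 above, this is the usual shape of a display-link-driven pull from AVPlayerItemVideoOutput. A minimal sketch with illustrative names; the hand-off to a VTFrameProcessor session is left abstract.

import AVFoundation
import UIKit

// Pull decoded frames from AVPlayerItemVideoOutput on each display refresh.
final class FramePuller {
    private let output: AVPlayerItemVideoOutput
    private var displayLink: CADisplayLink?

    init(item: AVPlayerItem) {
        output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
        item.add(output)
    }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(tick))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Ask the output which item time corresponds to the upcoming display time.
        let itemTime = output.itemTime(forHostTime: CACurrentMediaTime() + link.duration)
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) else { return }
        // Hand `pixelBuffer` to the frame processor session here (omitted).
        _ = pixelBuffer
    }
}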
Facing issue with fairplay Streaming server SDK 26.0.0
I am trying to build the server for testing on Linux (AlmaLinux 9 VM):

NAME="AlmaLinux"
VERSION="9.7 (Moss Jungle Cat)"
ID="almalinux"
ID_LIKE="rhel centos fedora"
VERSION_ID="9.7"
PLATFORM_ID="platform:el9"
PRETTY_NAME="AlmaLinux 9.7 (Moss Jungle Cat)"
ANSI_COLOR="0;34"

[azuki@AlmaDevVM ~]$ uname -m
x86_64

I have tried the following steps:

Before starting, ensured that Swift 6 is installed. Referred to https://www.swift.org/install/ for instructions.

Build the library

In Terminal, used the following commands to compile the Swift library:

cd Development/Key_Server_Module/Swift
swift build -Xbuild-tools-swiftc -DTEST_CREDENTIALS

After building the library, ran test cases to ensure the library behaves as expected. All unit tests are passing with the development credentials.

Since I was using an x86_64 machine:
export LD_LIBRARY_PATH=./Sources/prebuilt/x86_64-unknown-linux-gnu/

Run all tests:
swift test -Xbuild-tools-swiftc -DTEST_CREDENTIALS --disable-swift-testing

Build the server: Apache

Before starting, ensured the following:

a. Installed Apache HTTPD and the dev tools, using the following command:
yum install httpd httpd-devel redhat-rpm-config

b. After this, integrated the Swift library built above into the Apache server environment. Used the following command to build the server using apxs (since I was using an x86_64 machine):
apxs -i -a -c -Wl,-L${PWD}/.build/x86_64-unknown-linux-gnu/debug/ -Wl,-lswift_fpssdk -Wl,-L${PWD}/Sources/prebuilt/x86_64-unknown-linux-gnu -lfpscrypto -Wl,-R${PWD}/.build/x86_64-unknown-linux-gnu/debug server_setup/mod_fps.c

c. Next, copied the dependent libraries to the Apache modules folder using these commands (x86_64 machine):
cp Sources/prebuilt/x86_64-unknown-linux-gnu/libfpscrypto.so /usr/lib64/httpd/modules/libfpscrypto.so
cp .build/x86_64-unknown-linux-gnu/debug/libswift_fpssdk.so /usr/lib64/httpd/modules/libswift_fpssdk.so

d. Configured Apache HTTPD by adding the module and handler to the Apache HTTPD configuration (/etc/httpd/conf/httpd.conf). Note that the apxs command may automatically add the LoadModule line in the previous step.

Listen 8080
LoadFile /usr/lib64/httpd/modules/libfpscrypto.so
LoadFile /usr/lib64/httpd/modules/libswift_fpssdk.so
LoadModule fps_module /usr/lib64/httpd/modules/mod_fps.so
<Location "/fps">
SetHandler fps_handler

Copied the credentials to the Apache modules folder:
cp -r ../credentials /usr/lib64/httpd/modules/
export FPS_CERT_PATH= /usr/lib64/httpd/modules/credentials/test_certificates.json

e. Run your server. You can run the Apache HTTPD server with the configured module by using the following command:
httpd -D FOREGROUND

No issues seen up to this step.

Get SDK version:
[azuki@AlmaDevVM Key_Server_Module]$ curl localhost:8080/fps/v
26.0.0

But when I try to generate a license:
[azuki@AlmaDevVM Key_Server_Module]$ curl -d ../Test_Inputs/iOS/spc_ios_hd_lease_2048.json localhost:8080/fps
{"fairplay-streaming-response":{"create-ckc":[{"id":1,"status":-42601}]}}

Can you please suggest what I might be missing here?
Replies: 8 · Boosts: 0 · Views: 1.1k · Activity: Feb ’26
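One detail worth double-checking when exercising the /fps endpoint above: the request body must be the JSON document itself, not the path to it. A small Swift sketch that reads the test input and posts its contents; the file path and endpoint are taken from the post, and this is only an illustration, not SDK code.

import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking   // URLSession on Linux
#endif

// Read the SPC test input and POST its JSON contents to the local module.
let body = try! Data(contentsOf: URL(fileURLWithPath: "../Test_Inputs/iOS/spc_ios_hd_lease_2048.json"))
var request = URLRequest(url: URL(string: "http://localhost:8080/fps")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = body

let done = DispatchSemaphore(value: 0)
URLSession.shared.dataTask(with: request) { data, _, _ in
    if let data { print(String(decoding: data, as: UTF8.self)) }
    done.signal()
}.resume()
done.wait()

The equivalent curl invocation would use -d @../Test_Inputs/iOS/spc_ios_hd_lease_2048.json so that curl sends the file's contents rather than the literal path string.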
What does CoreMediaErrorDomain code -15418 indicate during LL-HLS live playback?
Hello, I am currently developing a video player using a custom AVPlayer-based SDK and testing LL-HLS live streaming. I encountered a specific error, CoreMediaErrorDomain -15418, during playback. I have searched through the official documentation and the forums, but I could not find any information regarding this error code. I would like to inquire about the following:

Description & Cause: What does the error code -15418 specifically represent in the context of CoreMedia and LL-HLS?
Severity: Is this a critical error that halts playback, or is it merely a warning?

Environment Details:
iOS Version: iOS 26.2
Device: iPhone 15 Pro Max
Stream Type: LL-HLS (Low-Latency HLS)
Impact: Quality drops

Any insights or references to documentation would be greatly appreciated. Thank you.
Replies: 1 · Boosts: 0 · Views: 329 · Activity: Jan ’26
-46250 error when calling `makeSecureTokenForExpirationDateOfPersistableContentKey`
Hi there, We're working on offline playback of DRM tracks. The persistent keys (also known as track licenses) for offline playback are stored locally on the device and are served from cache when a user initiates playback of a downloaded track. Our persistent keys have a limited validity time and need to be refreshed when they expire. To prevent a situation where a persistent key expires while the user is offline, we've decided to eagerly refresh these keys one week before their expiration date. To make that happen we need to be able to obtain the expiration date of the given track license. We've been attempting to use the makeSecureTokenForExpirationDateOfPersistableContentKey API to facilitate this process. The documentation states that this API returns a secret token representing the persistent key, which we can then exchange with our license server for the expiration date: https://developer.apple.com/documentation/avfoundation/avcontentkeysession/makesecuretokenforexpirationdate(ofpersistablecontentkey:completionhandler:)?language=objc However, every time we call makeSecureTokenForExpirationDateOfPersistableContentKey, we receive an error with code -46250. We haven't been able to find any public references or documentation for this specific error code, which is preventing us from troubleshooting the issue. We are conducting our tests on a physical device, as the simulator does not support FairPlay playback. We don't use dual expiry approach. Is our understanding of how to obtain the expiration timestamp correct? Are we using the makeSecureTokenForExpirationDateOfPersistableContentKey API as it was intended? What does the -46250 error code mean, and what steps should we take to fix our FairPlay implementation to make this work? Thanks in advance for your assistance.
Replies: 2 · Boosts: 1 · Views: 377 · Activity: Jan ’26
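For reference alongside the question above, the call shape being described is roughly the following. A minimal sketch assuming a configured AVContentKeySession and previously stored persistable key data; the exchange of the returned token with the license server is left out, and the error fallback is illustrative.

import AVFoundation

// Ask the session for an opaque token representing the stored persistable key;
// the token is then exchanged with the license server for the key's expiration date.
func requestExpirationToken(session: AVContentKeySession,
                            persistableKeyData: Data,
                            completion: @escaping (Result<Data, Error>) -> Void) {
    session.makeSecureTokenForExpirationDate(ofPersistableContentKey: persistableKeyData) { token, error in
        if let token {
            completion(.success(token))   // send this token to the license server
        } else {
            completion(.failure(error ?? CocoaError(.featureUnsupported)))
        }
    }
}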