I am trying to build a server for testing on Linux (AlmaLinux 9 VM):
NAME="AlmaLinux"
VERSION="9.7 (Moss Jungle Cat)"
ID="almalinux"
ID_LIKE="rhel centos fedora"
VERSION_ID="9.7"
PLATFORM_ID="platform:el9"
PRETTY_NAME="AlmaLinux 9.7 (Moss Jungle Cat)"
ANSI_COLOR="0;34"
[azuki@AlmaDevVM ~]$ uname -m
x86_64
I have tried the following steps:
Before starting, I ensured that Swift 6 was installed, following the instructions at https://www.swift.org/install/.
Build the library
In Terminal, I used the following commands to compile the Swift library:
cd Development/Key_Server_Module/Swift
swift build -Xbuild-tools-swiftc -DTEST_CREDENTIALS
After building the library, I ran the test cases to ensure the library behaves as expected. All unit tests pass with the development credentials.
• Since I was using an x86_64 machine:
export LD_LIBRARY_PATH=./Sources/prebuilt/x86_64-unknown-linux-gnu/
Ran all tests:
swift test -Xbuild-tools-swiftc -DTEST_CREDENTIALS --disable-swift-testing
Build the server: Apache
Before starting, ensured the following:
a. Installed Apache HTTPD and the dev tools, using the following command:
yum install httpd httpd-devel redhat-rpm-config
b. After this, integrated the Swift library built above into the Apache server environment. Used the following command to build the module with apxs:
• Since I was using an x86_64 machine:
apxs -i -a -c \
  -Wl,-L${PWD}/.build/x86_64-unknown-linux-gnu/debug/ \
  -Wl,-lswift_fpssdk \
  -Wl,-L${PWD}/Sources/prebuilt/x86_64-unknown-linux-gnu -lfpscrypto \
  -Wl,-R${PWD}/.build/x86_64-unknown-linux-gnu/debug \
  server_setup/mod_fps.c
c. Next, copied the dependent libraries to the Apache modules folder using these commands:
• If using an x86_64 machine:
cp Sources/prebuilt/x86_64-unknown-linux-gnu/libfpscrypto.so /usr/lib64/httpd/modules/libfpscrypto.so
cp .build/x86_64-unknown-linux-gnu/debug/libswift_fpssdk.so /usr/lib64/httpd/modules/libswift_fpssdk.so
d. Configured Apache HTTPD
Added the module and handler to the Apache HTTPD configuration (/etc/httpd/conf/httpd.conf). Note that the apxs command in the previous step may have automatically added the LoadModule line.
Listen 8080
LoadFile /usr/lib64/httpd/modules/libfpscrypto.so
LoadFile /usr/lib64/httpd/modules/libswift_fpssdk.so
LoadModule fps_module /usr/lib64/httpd/modules/mod_fps.so
<Location "/fps">
SetHandler fps_handler
</Location>
Copied the credentials to the Apache modules folder and set the certificate path:
cp -r ../credentials /usr/lib64/httpd/modules/
export FPS_CERT_PATH=/usr/lib64/httpd/modules/credentials/test_certificates.json
e. Run your server
You can run the Apache HTTPD server with the configured module by using the following command:
httpd -D FOREGROUND
No issues seen up to this step.
Get SDK version
[azuki@AlmaDevVM Key_Server_Module]$ curl localhost:8080/fps/v
26.0.0
But when I try to generate a license:
[azuki@AlmaDevVM Key_Server_Module]$ curl -d ../Test_Inputs/iOS/spc_ios_hd_lease_2048.json localhost:8080/fps
{"fairplay-streaming-response":{"create-ckc":[{"id":1,"status":-42601}]}}
Can you please suggest what I might be missing here?
Hello,
I have a problem generating a 2048-bit FairPlay Streaming certificate.
I tried generating an SDK v26.x certificate in two ways.
(1) Use existing certificate
(2) Create new certificate
However, in both cases, Apple gives me a certificate bundle containing a 1024-bit certificate.
(fps_certificate.bin)
I uploaded a 2048-bit CSR when creating the certificate.
Just to note, I created an SDK v4.x certificate a few years ago.
Has anyone bumped into the same issue?
Or am I missing something?
Hello,
I am currently developing a video player using a custom AVPlayer-based SDK and testing LL-HLS live streaming.
I encountered a specific error, CoreMediaErrorDomain -15418, during playback. I have searched through the official documentation and the forums, but I could not find any information regarding this error code.
I would like to inquire about the following:
Description & Cause: What does the error code -15418 specifically represent in the context of CoreMedia and LL-HLS?
Severity: Is this a critical error that halts playback, or is it merely a warning?
Environment Details:
iOS Version: iOS 26.2
Device: iPhone 15 Pro Max
Stream Type: LL-HLS (Low-Latency HLS)
Impact: Quality drops
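For context, here is how we capture the error entry from the player item's error log (a minimal sketch; player setup elided, the function name is ours):

import AVFoundation

// Observe new error-log entries on the item driving LL-HLS playback, so
// codes like -15418 are captured with their domain, comment, and URI.
func observeErrorLog(for playerItem: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: playerItem,
        queue: .main
    ) { _ in
        guard let event = playerItem.errorLog()?.events.last else { return }
        print("domain=\(event.errorDomain) code=\(event.errorStatusCode)",
              "comment=\(event.errorComment ?? "-") uri=\(event.uri ?? "-")")
    }
}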
Any insights or references to documentation would be greatly appreciated.
Thank you.
Hi there,
We're working on offline playback of DRM tracks. The persistent keys (also known as track licenses) for offline playback are stored locally on the device and are served from cache when a user initiates playback of a downloaded track.
Our persistent keys have a limited validity time and need to be refreshed when they expire. To prevent a situation where a persistent key expires while the user is offline, we've decided to eagerly refresh these keys one week before their expiration date. To make that happen we need to be able to obtain the expiration date of the given track license.
We've been attempting to use the makeSecureTokenForExpirationDateOfPersistableContentKey API to facilitate this process. The documentation states that this API returns a secret token representing the persistent key, which we can then exchange with our license server for the expiration date: https://developer.apple.com/documentation/avfoundation/avcontentkeysession/makesecuretokenforexpirationdate(ofpersistablecontentkey:completionhandler:)?language=objc
However, every time we call makeSecureTokenForExpirationDateOfPersistableContentKey, we receive an error with code -46250. We haven't been able to find any public references or documentation for this specific error code, which is preventing us from troubleshooting the issue. We are conducting our tests on a physical device, as the simulator does not support FairPlay playback. We don't use the dual-expiry approach.
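For reference, this is roughly how we invoke it (a minimal sketch; session is our .fairPlayStreaming AVContentKeySession, and sendTokenToLicenseServer is our own helper):

import AVFoundation

// persistableKeyData is the persistent key previously saved to disk.
func fetchExpirationToken(session: AVContentKeySession, persistableKeyData: Data) {
    session.makeSecureToken(forExpirationDateOfPersistableContentKey: persistableKeyData) { token, error in
        if let error = error as NSError? {
            // This is where we consistently see code -46250.
            print("Secure token request failed: \(error.domain) \(error.code)")
            return
        }
        guard let token = token else { return }
        // Exchange the token with our license server for the expiration date.
        sendTokenToLicenseServer(token)
    }
}

func sendTokenToLicenseServer(_ token: Data) { /* our own networking; elided */ }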
Is our understanding of how to obtain the expiration timestamp correct? Are we using the makeSecureTokenForExpirationDateOfPersistableContentKey API as it was intended? What does the -46250 error code mean, and what steps should we take to fix our FairPlay implementation to make this work?
Thanks in advance for your assistance.
Hello, our application is unable to output FairPlay-protected content over HDMI to a TV via the official Lightning HDMI AV Adapter. Checking the console log from mediaplaybackd, we found that a CoreMediaErrorDomain Code=-19156 is raised, but we cannot find what this error code means.
default 11:18:15.121584+0800 mediaplaybackd keyboss ckb_customURLReadCallback: 0x7fa62f800 60/0 customURLReqID 4 isComplete 1 err -19156 error <private> (0) dokeyCallbacksExist 0
default 11:18:15.121670+0800 mediaplaybackd keyboss ckb_processErrorForRequest: 0x7fa62f800 60/0 handler 4 err 0
default 11:18:15.121752+0800 mediaplaybackd <<<< FigCustomURLHandling >>>> curll_cancelRequestOnQueue: 0x7fa031360: requestID: 4
default 11:18:15.121932+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 reqFin err Error Domain=CoreMediaErrorDomain Code=-19156 (-19156) dokeyCallbacksExist 0
default 11:18:15.122025+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 retry
default 11:18:15.123195+0800 mediaplaybackd <<<< FigCPECryptorPKD >>>> PostKeyRequestErrorOccurred: 0x7fab7be80 029592C2-093D-400D-B57F-7AB06CC292D1 key request error: Error Domain=CoreMediaErrorDomain Code=-19160 (-19160)
The ASk is used by the KSM to derive the dASk, which is then used to decrypt the SK...R1.
If the only thing we give the client is the certificate, how does it encrypt the SK...R1 so that the server is able to process it?
It would be nice to know how this works generally, because I've been getting questions about it and can't provide a helpful answer.
Thanks in advance.
Hello Apple team and developer community,
I am preparing a visionOS app for a fair environment, where we want to automatically stream the current experience to a nearby monitor via AirPlay, without requiring guests or staff to manually interact with the Control Center or AirPlay pickers all the time.
The goal is to provide a smooth, frictionless setup so attendees can focus on the demo, not the configuration.
Feature Request:
A supported API or method to programmatically start/stop AirPlay video streaming (mirroring or external playback) from within a visionOS app, allowing the current experience to be instantly displayed on an external monitor or Apple TV for the audience.
Context & Rationale:
In a trade fair or exhibition setting, rapid guest turnaround and minimal staff intervention are crucial. Having to manually guide each visitor through AirPlay setup is impractical.
As I understand it, AVRoutePickerView can be used for this on iOS/macOS, but it is not available on visionOS. Enabling similar automated streaming on visionOS would make the device far more suitable for live demos and public showcases.
Questions:
Are there any supported workarounds or best practices for enabling automated screen streaming or AirPlay initiation on visionOS in public demo environments that I missed?
Is Apple considering adding programmatic AirPlay control or accessibility features to support such use cases in future visionOS releases?
Thank you for considering this request! If there are recommended patterns, entitlements, or accessibility solutions we could explore for trade fair scenarios, your guidance would be greatly appreciated.
Best regards,
Julian Zürn - IPI, HS Kempten
I want to develop an app for real-time streaming of spatial video from one Apple Vision Pro to another for playback there, e.g. as MV-HEVC. Is this possible? If so, how can it be done?
Hi,
I'm trying to create a FairPlay Streaming Certificate for the SDK 26.x version.
Worth mentioning that we already have two certificates (1024- and 2048-bit), and we are only given the option to reuse our previous 1024-bit certificate (which we do not want, because we want a 2048-bit cert).
Our main issue is that when I upload a new CSR file, the "Continue" button stays gray and we cannot move forward in the process.
The CSR file has been created with this command:
openssl req -out csr_2048.csr -new -newkey rsa:2048 \
  -keyout priv_key_2048.pem \
  -subj /CN=SubjectName/OU=OrganizationalUnit/O=Organization/C=US
Any help would be appreciated.
Thanks in advance
Best,
We're troubleshooting ScreenCaptureKit (SCK) issues. They occur in a relatively small share of sessions, but the lack of context, and of any way to advise the customer on how to make the behavior more predictable and reliable, is problematic.
Generally, there are two distinct issues, which may or may not have the same root cause:
Failure to establish an SCK session. This usually manifests within the app as the SCShareableContent.getWithCompletionHandler call either never invoking the completion handler, or taking a prohibitively long time (we usually give it 3-10 seconds before giving up). In the system log it may look like this:
(log omitted - suspecting it triggers the content filter)
Note the 6-second delay to completion of fetchShareableContentWithOption (normally a 30-40 ms operation).
Sometimes we'd see the stream established, but some minutes (or even hours) into the recording we'd stop receiving frames.
Both scenarios are more likely to occur when disk space is low, with reliable repro of problem #2 below 8 GB of free space (in that case, we've seen replayd silently dropping the session without ever notifying the client ... improving the API could go a long way there). However, among recent occurrences, while most machines had less than 100 GB available, we've seen it on machines with as much as 500 GB free.
Unfortunately, it's almost never reproducible in the dev environment, so we have to rely on the diagnostics we can collect in the field, which have turned up nothing obvious yet.
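For what it's worth, our give-up logic around the fetch is shaped roughly like this (a minimal sketch using the async variant; the timeout value and error type are ours):

import Foundation
import ScreenCaptureKit

enum FetchError: Error { case timedOut }

// Race the shareable-content fetch against a timeout, since the completion
// sometimes never arrives, or arrives only after several seconds.
func fetchShareableContent(timeout: TimeInterval = 10) async throws -> SCShareableContent {
    try await withThrowingTaskGroup(of: SCShareableContent.self) { group in
        group.addTask {
            try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
        }
        group.addTask {
            try await Task.sleep(nanoseconds: UInt64(timeout * 1_000_000_000))
            throw FetchError.timedOut
        }
        guard let content = try await group.next() else { throw FetchError.timedOut }
        group.cancelAll()
        return content
    }
}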
I'd like to better understand the root cause of both scenarios, and/or what specific framework behaviors can cause them.
Hi,
I understand that AVPlayer/AVFoundation doesn’t natively play MPEG-DASH manifests (.mpd) today, while HLS is supported and widely documented by Apple.
I’m not asking for roadmap commitments, but I’d like to understand whether there is any publicly documented rationale for not supporting DASH/MPD in AVFoundation (e.g., technical constraints, platform integration, DRM ecosystem, power/performance considerations, etc.).
Questions:
Is there any Apple statement / documentation explaining why DASH (MPD) isn’t supported in AVFoundation?
Is Apple’s recommended approach still “provide HLS for Apple clients” (potentially sharing CMAF segments and generating separate manifests)?
If there’s no public rationale, is filing Feedback Assistant the best channel for requesting MPD playback support?
Thanks!
Hello, I am developing a custom player SDK using AVPlayer to support HLS and LL-HLS live streaming. I have some questions about the internal logic of AVPlayer regarding ABR, as this information is not explicitly covered in the documentation.
ABR Switching Logic: Does AVPlayer trigger bitrate switching primarily based on stall occurrences (buffer starvation)? I am curious if the switching logic is reactive to stalls or if it proactively switches to prevent them based on throughput estimation.
Developer Controls for ABR: To influence or control the ABR selection, are preferredPeakBitRate and preferredForwardBufferDuration the only properties available to developers? Are there any other recommended APIs to assist with ABR decisions?
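For context, these are the knobs we currently set (a minimal sketch; the values are arbitrary examples):

import AVFoundation

func applyABRHints(to item: AVPlayerItem) {
    // Caps the peak bitrate of the variants AVPlayer may select (0 = no cap).
    item.preferredPeakBitRate = 4_000_000
    // Limits how far ahead AVPlayer buffers (0 = let the player decide).
    item.preferredForwardBufferDuration = 10
    // A related knob: cap variant selection by resolution instead of bitrate.
    item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
}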
Thank you for your help.
We have the application 'ADS Smart', a companion application for our ADS Dashcam. We offer a feature that lets users stream the live footage of the dashcam cameras through the app. Currently, we are experiencing a delay of 30+ seconds before the live stream appears, i.e. the first frame of the live footage takes around 30+ seconds to display in the app. We are using the MobileVLCKit library to stream the videos in the app.
The current flow of the code:
Flutter triggers the native playback via a method channel
The Dart side calls the iOS method channel <identifier_name>/ts_player with method playTSFromURL passing:
url (e.g. rtsp://.... for live),
playerId
viewId (stable ID used to host native UI)
showControls
optional localIp
AppDelegate receives the call and prepares networking
Entry point: AppDelegate.tsChannel handler for "playTSFromURL" in AppDelegate.swift.
It resolves the Wi‑Fi interface and local IP if possible:
Sets VLC_SOURCE_ADDRESS to the Wi‑Fi IP (when available) to prefer Wi‑Fi for the stream.
Uses NWPathMonitor and direct interface inspection to find the Wi‑Fi interface (e.g., en0) and IP.
Kicks off best-effort route priming to the dashcam IP/ports (non-blocking), see establishWiFiRoutePriority.
AppDelegate chooses the right player implementation
createPlayerForURL(_:) decides:
RTSP (rtsp://..) --> use the VLCKit-backed player (the TSStreamPlayer class provides a VLC-backed video player for iOS, handling playback of Transport Stream (TS) URLs with strict main-thread UI updates, view safety, and stream management, using MobileVLCKit)
.ts files --> use VLCKit-based player for playing already recorded videos in the app.
If the selected player supports extras (e.g. TSStreamPlayerExtras), it sets
localIp (if resolved)
Wi-Fi interface name
AppDelegate creates the native 'platform view' container and overlay
platformView(for:parent:showControls:):
Creates a container UIView attached to the Flutter root view
Adds a dedicated child videoHost[viewId], the host UIView for rendering video.
If showControls == true, adds TSPlayerControlsOverlay over the video and wires overlay callbacks back to Flutter via controlChannel (/player_controls)
If showControls == false, adds a minimal back button and wires it to onGalleryBack.
The player starts playback inside the host view
Calls player.playTSFromURL(urlString, in: host) { success, error in ... } on the main thread.
For RTSP/TS streams: this is handled by TSStreamPlayer(VLCKit).
Success/failure is reported back to Flutter
The completion closure invoked in step 5 returns true on first real playback or an error message on failure.
The method channel result responds:
true --> Flutter knows playback started
FlutterError --> Flutter can show an error
Stopping and cleanup
"stop" on tsChannel stops and disposes the player(s).
"removePlatformView" removes the overlay, back button, the host, and the container, and disposes any remaining players.
I am attaching the logs of the app while running.
The actual issue is that when the iOS device is connected to the dashcam's Wi-Fi, iOS uses mobile data for the app's live-streaming traffic, even though Wi-Fi is the intended communication channel.
The iOS device takes approximately 30 seconds to display the first frame of live footage in the app. Despite being connected to the dashcam's Wi-Fi, the device only settles on the Wi-Fi interface (en0) after multiple attempts, causing the live footage to appear only after this delay.
So, how can we configure this so that the live footage from the dashcam cameras is displayed within 2 to 3 seconds on the iOS device?
ios.txt
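For reference, the Wi-Fi interface discovery described in step 2 is shaped roughly like this (a minimal sketch; the dashcam-specific route priming and VLC wiring are elided):

import Network

// A path monitor restricted to Wi-Fi; used to find the interface name
// (e.g. en0) and confirm a usable Wi-Fi path before starting the stream.
let wifiMonitor = NWPathMonitor(requiredInterfaceType: .wifi)
wifiMonitor.pathUpdateHandler = { path in
    guard path.status == .satisfied else { return }
    for interface in path.availableInterfaces where interface.type == .wifi {
        print("Wi-Fi interface: \(interface.name)") // e.g. "en0"
    }
}
wifiMonitor.start(queue: DispatchQueue(label: "wifi.monitor"))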
I am working on the screen-record function on Apple Vision Pro. When I use a broadcast upload extension, after I click the record button, the Xcode console shows the exception:
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
We create and configure the project as follows:
Create an Apple Vision Pro project.
Create a Broadcast Upload Extension Target.
Add App Group for Project Target and Extension Target, both use the same identifier.
Add "Main Camera Access", "Passthrough in Screen Capture" Capabilities for all targets.
Add "NSScreenCaptureUsageDescription", "NSMicrophoneUsageDescription" in Plist.
Add record button in view
Run a debug build on the Apple Vision Pro device; after clicking the record button, the exception above is thrown.
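For completeness, the record button in step 6 is added like this (a minimal sketch; we assume RPSystemBroadcastPickerView behaves on visionOS as it does on iOS, and the extension bundle identifier is a placeholder):

import ReplayKit
import SwiftUI

struct BroadcastButton: UIViewRepresentable {
    func makeUIView(context: Context) -> RPSystemBroadcastPickerView {
        let picker = RPSystemBroadcastPickerView(frame: .zero)
        // Point the picker at our broadcast upload extension (placeholder ID).
        picker.preferredExtension = "com.example.app.BroadcastExtension"
        picker.showsMicrophoneButton = true
        return picker
    }
    func updateUIView(_ uiView: RPSystemBroadcastPickerView, context: Context) {}
}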
Hi,
After updating to iOS 26, our app is facing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error - Domain[CoreMediaErrorDomain]:Code[-15628]:Desc[The operation couldn’t be completed.]:Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
Environment:
iOS version: ios 26
React Native: 0.69
Video library: react-native-video (AVPlayer under the hood)
Stream type: HLS (m3u8) with segment (.ts) files
Observed behaviour:
Playback works initially on iOS 26.
On iOS 26, the stream fails at runtime after a few seconds/minutes (not on first load).
Network logs show 307 redirects on some segment requests. After this, AVPlayer throws the above error.
Playback fails intermittently on slow/unstable networks.
We are trying to port our code to Apple TV on tvOS 17.6. While running the sample, we get the error CoreMediaErrorDomain -42681. We understand that this error occurs when the FairPlay license (CKC) returned by the server contains incompatible or malformed version information that the iOS/tvOS FairPlay CDM cannot parse.
Can you please specify what FairPlay version number tvOS 17.6 expects, and which fields are mandatory in the FPS version metadata?
Hello there,
I'm trying to implement a feature which uses AirPlay with Apple TV. I want to disconnect from the device programmatically when something happens; by "something" I mean a situation where the user wants to stop broadcasting (for example, closing the PiP window on their phone). I use this snippet:
let audioSession = AVAudioSession.sharedInstance()
try audioSession.setCategory(.playAndRecord, options: .defaultToSpeaker)
try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
It works fine sometimes but not always (it works on iOS 18 but it doesn't on iOS 17 or ). So I thought it was a bug and created a ticket in Feedback Assistant (FB21220013). Support told me to write a post on the forum.
Our license service is based on version 4.5.4, and we make use of the sample .c/.h files for building the license service.
We are told that version 4.5.4 is going to be deprecated in 2026 and we should migrate to latest SDK version 26.
When we explored the SDK, we noticed that only Python- and Swift-based SDKs are provided.
Does Apple also provide a C/C++-based SDK, as it would be easier for us to integrate?
If yes, please share the SDK package and sample license service solution.
We are experiencing an issue related to DepthData from the TrueDepth camera on a specific device.
On December 1, we tested with the complainant's device (iPhone 14 / iOS 26.0.1) and observed that the depth image is received with empty values.
However, the same implementation works normally on iPhone 17 Pro Max (iOS 26.1) and iPhone 13 Pro Max (iOS 26.0.1), where depth data is delivered correctly.
In the problematic case:
TrueDepth camera is active
Face ID works normally
The app receives a DepthData object, but all values are empty (0), not nil
Because the DepthData object is not nil, it is difficult to detect the issue through software fallback handling (see the sketch below).
We developed the feature with reference to the following Apple sample:
https://developer.apple.com/documentation/AVFoundation/streaming-depth-data-from-the-truedepth-camera
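The fallback check we're considering scans the depth map for any nonzero value (a minimal sketch, assuming the map converts to a Float32 depth/disparity format):

import AVFoundation

func depthMapIsEmpty(_ depthData: AVDepthData) -> Bool {
    // Normalize to Float32 so the scan below is format-independent.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let buffer = converted.depthDataMap
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return true }
    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(buffer)
    for y in 0..<height {
        let row = (base + y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width where row[x] != 0 { return false }
    }
    return true
}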
We would like to ask:
Are there known cases where Face ID functions normally but DepthData from the TrueDepth camera is returned as empty values?
If so, is there a recommended approach for identifying or handling this situation?
Any guidance from Apple engineers or the community would be greatly appreciated.
Thank you.
On iOS 17 we had no problem playing Apple FairPlay-encrypted content with keys delivered from our key server, running first on FairPlay Streaming Server SDK 5.1 and subsequently on FairPlay Streaming Server SDK 26. The app is built and deployed using Xcode Version 26.1.1 (17B100) with no changes to the code, and, as expected, the content continued to be successfully decrypted and played (so far so good). However, as soon as a device was updated to iOS 26, that device would no longer play the encrypted content.
Devices remaining on iOS 17 continue to work normally, and the debugging logs provide a sanity check confirming that. Is anyone else experiencing this issue?
Here's the code (you should be able to drop it into a fresh iOS Xcode project and provide a server URL, content URL, and certificate).