Hello there,
I'm trying to implement a feature that uses AirPlay with Apple TV. I want to disconnect from the device programmatically when something happens; by "something" I mean a situation where the user wants to stop broadcasting (for example, closes the PiP window on their phone). I use this snippet:
try audioSession.setCategory(.playAndRecord, options: .defaultToSpeaker)
try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
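For context, here is the full call site, a minimal sketch of what runs when the user closes the PiP window (the session is the shared instance; re-routing to the built-in speaker is what drops the AirPlay route):
import AVFoundation

func stopBroadcastingToAirPlay() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        // Forcing the route back to the built-in speaker disconnects AirPlay.
        try audioSession.setCategory(.playAndRecord, options: .defaultToSpeaker)
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    } catch {
        print("Failed to re-route audio session:", error)
    }
}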
It works fine sometimes but not always (it works on iOS 18 but not on iOS 17). So I thought it was a bug and filed a ticket in Feedback Assistant (FB21220013). Support told me to write a post on the forum.
Our license service is based on version 4.5.4, and we make use of the sample .c/.h files for building the license service.
We are told that version 4.5.4 is going to be deprecated in 2026 and that we should migrate to the latest SDK, version 26.
When we explored the SDK, we noticed that only Python- and Swift-based SDKs are provided.
Does Apple also provide a C/C++-based SDK, as that would be easier for us to integrate?
If yes, please share the SDK package and a sample license service solution.
We are experiencing an issue related to DepthData from the TrueDepth camera on a specific device.
On December 1, we tested with the reporting user's device (iPhone 14, iOS 26.0.1) and observed that the depth image is received with empty values.
However, the same implementation works normally on iPhone 17 Pro Max (iOS 26.1) and iPhone 13 Pro Max (iOS 26.0.1), where depth data is delivered correctly.
In the problematic case:
TrueDepth camera is active
Face ID works normally
The app receives a DepthData object, but all values are empty (0), not nil
Because the DepthData object is not nil, this makes it difficult to detect the issue through software fallback handling.
We developed the feature with reference to the following Apple sample:
https://developer.apple.com/documentation/AVFoundation/streaming-depth-data-from-the-truedepth-camera
We would like to ask:
Are there known cases where Face ID functions normally but DepthData from the TrueDepth camera is returned as empty values?
If so, is there a recommended approach for identifying or handling this situation?
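In case it helps, this is the minimal sketch of the all-zero check we are experimenting with as a fallback (it assumes the map can be converted to DepthFloat32; the function name is ours):
import AVFoundation

// Returns true when every sample in the depth map is zero.
func depthDataIsAllZero(_ depthData: AVDepthData) -> Bool {
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let buffer = converted.depthDataMap
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return true }
    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(buffer)
    for row in 0..<height {
        let rowPtr = (base + row * rowBytes).assumingMemoryBound(to: Float32.self)
        for col in 0..<width where rowPtr[col] != 0 {
            return false // found a non-zero depth sample
        }
    }
    return true
}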
Any guidance from Apple engineers or the community would be greatly appreciated.
Thank you.
On iOS 17 we've had no problem playing FairPlay-encrypted content with keys delivered from our key server running on FairPlay Streaming Server SDK 5.1 and subsequently FairPlay Streaming Server SDK 26. The app is built and deployed using Xcode 26.1.1 (17B100) with no changes to the code and, as expected, the content continued to be successfully decrypted and played (so far so good). However, as soon as a device was updated to iOS 26, that device would no longer play the encrypted content.
Devices remaining on iOS 17 continue to work normally, and the debugging logs are a sanity check that confirms it. Is anyone else experiencing this issue?
Here's the code (you should be able to drop it into a fresh iOS Xcode project and provide a server URL, content URL, and certificate).
Hi
Is it possible to have a playlist that starts with a stream in the clear, then switches to a DRM-encrypted period, and then switches back to the clear?
Can I just do the following? (I've removed the video segment lines; I'm only interested in the parts where I want to signal the new DRM region.)
#EXT-X-MAP:URI="video_2_10000000_t17586401730000000_init.mp4"
#EXT-X-KEY:METHOD=NONE
...
#EXT-X-MAP:URI="video_2_10000000_t17587374640000000_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://5df0b36ac4bb4d0ff954a73b502ac332",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
...
#EXT-X-MAP:URI="video_2_10000000_t17587376740000000_init.mp4?"
#EXT-X-KEY:METHOD=NONE
Should I insert discontinuity tags, or something else?
Right now what I can observe is that I get some audio drops when I try to do this.
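For reference, the variant I'm considering would insert a discontinuity at each key change, along these lines (same URIs as above; a sketch, not validated):
#EXT-X-MAP:URI="video_2_10000000_t17586401730000000_init.mp4"
#EXT-X-KEY:METHOD=NONE
...
#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="video_2_10000000_t17587374640000000_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://5df0b36ac4bb4d0ff954a73b502ac332",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
...
#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="video_2_10000000_t17587376740000000_init.mp4"
#EXT-X-KEY:METHOD=NONE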
Quotes are displayed incorrectly in subtitles of AVPlayerViewController when streaming VOD content using HLS.
A single quote ' (escaped as &apos;) is displayed as "apos;".
A double quote " (escaped as &quot;) is displayed as "quot;".
The escaping follows the WebVTT specification.
The same stream works fine in VLC player, showing quotes correctly in subtitles.
The subtitle VTT files use:
Content-Type: text/vtt
WEBVTT
X-TIMESTAMP-MAP=LOCAL:490014:06:04.000,MPEGTS:158764568760056
example line:
490014:05:46.000 --> 490014:05:50.440 align:start line:83% position:14%
lære dig endnu bedre at kende."
and the playlist has:
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",LANGUAGE="da",NAME="Dansk",AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.transcribes-spoken-dialog,public.accessibility.describes-music-and-sound",URI="subs/dan_5/playlist.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=780000,CODECS="mp4a.40.5,avc1.42c01e",RESOLUTION=256x144,AUDIO="audio-aac",SUBTITLES="subs"
Adding 'wvtt' to the CODECS list in the playlist does not make a difference.
Is this a known bug? Is there a workaround?
I suppose AVAssetResourceLoaderDelegate could be used to intercept and parse the subtitle files, but that seems like quite a hack and not really what it is intended for.
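If I did go that route, the unescaping step itself would be simple; a minimal sketch (the function name is mine, and the entity list is just the common XML ones):
import Foundation

func unescapeVTTEntities(_ cueText: String) -> String {
    // Order matters: handle "&amp;" last so "&amp;quot;" does not
    // get double-unescaped into a bare quote.
    let entities: [(String, String)] = [
        ("&lt;", "<"),
        ("&gt;", ">"),
        ("&apos;", "'"),
        ("&quot;", "\""),
        ("&nbsp;", "\u{00A0}"),
        ("&amp;", "&"),
    ]
    var result = cueText
    for (entity, replacement) in entities {
        result = result.replacingOccurrences(of: entity, with: replacement)
    }
    return result
}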
I am working on a screen recording feature on Apple Vision Pro. When I use a broadcast upload extension and tap the record button, the Xcode console shows this error:
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
We create and configure the project as follows:
Create an Apple Vision Pro project.
Create a Broadcast Upload Extension target.
Add an App Group for the project target and the extension target, both using the same identifier.
Add the "Main Camera Access" and "Passthrough in Screen Capture" capabilities for all targets.
Add "NSScreenCaptureUsageDescription" and "NSMicrophoneUsageDescription" to the Info.plist.
Add a record button in the view.
Run a debug build on the Apple Vision Pro device; after tapping the record button, the error above is thrown.
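For completeness, the extension code is just the template SampleHandler; a minimal sketch of its shape (nothing custom added yet):
import ReplayKit

class SampleHandler: RPBroadcastSampleHandler {
    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // Broadcast has started; no custom setup yet.
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            break // video frames would be forwarded here
        case .audioApp, .audioMic:
            break // the -19224 log appears before any audio arrives
        @unknown default:
            break
        }
    }
}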
Hello,
I'm investigating an issue with LL-HLS playback using AVPlayer, specifically during DVR Live seeking (seeking to a past time).
I noticed that in certain seeking scenarios, AVPlayer sends a Blocking Playlist Reload request that includes the _HLS_msn parameter but is missing the _HLS_part parameter.
While I understand this is compliant with the HLS spec, I would like to know the specific criteria AVPlayer uses to decide when to drop the _HLS_part parameter. Does AVPlayer intentionally omit the part info when it determines that loading a specific partial segment is unnecessary during a seek operation?
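For concreteness, hypothetical examples of the two request shapes we observe (the paths are made up):
GET /lowlatency/media.m3u8?_HLS_msn=1234&_HLS_part=2   (steady-state at the live edge)
GET /lowlatency/media.m3u8?_HLS_msn=1230               (after seeking back into the DVR window; no _HLS_part)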
Clarification on this behavior would help us greatly in debugging our stream delivery.
Thanks in advance.
Macs do not support Multi-Stream Transport (MST), which prevents using a single DisplayPort or USB-C port to daisy-chain multiple external monitors in extended display mode. As a result, the virtual multi-display modes do not work correctly on the Mac.
Just updated my computer, phone, and dev tools to the latest versions of everything. Now when I run my app in a previously working simulator (iPhone 16 with iOS 18.5) I get:
Failed retrieving MusicKit tokens: fetching the developer token is not supported in the simulator when running on this version of macOS; please upgrade your Mac to macOS Ventura.
Also:
<ICCloudServiceStatusMonitor: 0x600003320e60>: Invoking 1 completion handler for MusicKit tokens. error=<ICError.DeveloperTokenFetchingFailed (-8200) "Failed to fetch media token from <AMSMediaTokenService: 0x6000029049a0>." { underlyingErrors: [ <AMSErrorDomain.300 "Token request encoding failed The token request encoder finished with an error." { userInfo: { AMSDescription : "Token request encoding failed", AMSFailureReason : "The token request encoder finished with an error." }; underlyingErrors: [ <AMSErrorDomain.5 "Anisette Failed Platform not supported" { userInfo: { AMSDescription : "Anisette Failed", AMSFailureReason : "Platform not supported" };
Anybody know what gives here? The Ventura message is absurd because I'm on Tahoe 26.1. The same code works on a physical phone running iOS 26.
Hello Apple team and developer community,
I am preparing a visionOS app for a fair environment, where we want to automatically stream the current experience to a nearby monitor via AirPlay, without requiring guests or staff to manually interact with the Control Center or AirPlay pickers all the time.
The goal is to provide a smooth, frictionless setup so attendees can focus on the demo, not the configuration.
Feature Request:
A supported API or method to programmatically start/stop AirPlay video streaming (mirroring or external playback) from within a visionOS app, allowing the current experience to be instantly displayed on an external monitor or Apple TV for the audience.
Context & Rationale:
In a trade fair or exhibition setting, rapid guest turnaround and minimal staff intervention are crucial. Having to manually guide each visitor through AirPlay setup is impractical.
As I understand it, AVRoutePickerView can be used for this on iOS/macOS, but it is not available on visionOS. Enabling similar automated streaming on visionOS would make the device far more suitable for live demos and public showcases.
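For reference, the closest existing pattern on iOS, which still requires a user tap and has no visionOS counterpart (a minimal UIKit sketch):
import AVKit
import UIKit

// Embeds the AirPlay picker in the app UI. Still user-initiated; there is
// no public API to start AirPlay mirroring without a tap.
func addRoutePicker(to hostView: UIView) {
    let routePicker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
    routePicker.prioritizesVideoDevices = true // surface video routes such as Apple TV first
    hostView.addSubview(routePicker)
}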
Questions:
Are there any supported workarounds or best practices for enabling automated screen streaming or AirPlay initiation on visionOS in public demo environments that I missed?
Is Apple considering adding programmatic AirPlay control or accessibility features to support such use cases in future visionOS releases?
Thank you for considering this request! If there are recommended patterns, entitlements, or accessibility solutions we could explore for trade fair scenarios, your guidance would be greatly appreciated.
Best regards,
Julian Zürn - IPI, HS Kempten
Hi,
While I don't normally use FairPlay, I got an email that is so strangely worded I am wondering whether it makes sense to people who do know FairPlay, or whether it has a typo.
As I read it: you can only generate certificates for SDK 4, yet SDK 4 is no longer supported?
(Also, "will not expire" is imprecise phrasing; the certificates presumably will expire eventually. I think it meant to say they are still valid / are not invalidated.)
Hello, we have a video streaming app with HLS VOD content. We supply 1080p and 4K content to users. Users could watch 1080p content before tvOS 26, but they can no longer watch it after updating to tvOS 26. We have not changed anything on the HLS playlist side, and the application version is unchanged. This problem occurs only on the Apple TV HD (4th generation, A1625) running tvOS 26; there is no problem on newer Apple TV devices. Would you help us resolve this problem? Thanks in advance.
Hi, I submitted the FairPlay Streaming Credentials Approval request, but it's been 15 days and I haven't received a response yet. Do you happen to know how long they usually take to reply to these requests?
For devices that are still on iOS 17, playing FairPlay-encrypted content still works fine. For devices that I've upgraded to iOS 26, playing the same content in the same app no longer works. I can advance and see the stream frames by tapping +10 or scrubbing, so I know the content is being decrypted, but tapping the play button of AVPlayer for an AVPlayerItem now does nothing on iOS 26. Is this a breaking change, or is there a stricter requirement that I now have to implement?
The operation couldn’t be completed. (CoreMediaErrorDomain error -19156.)
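For anyone hitting the same thing, this is the minimal observation sketch I used to confirm that play() silently stalls (standard AVFoundation KVO; streamURL is a placeholder):
import AVFoundation

// streamURL is a placeholder for our FairPlay HLS URL.
let streamURL = URL(string: "https://example.com/master.m3u8")!
let item = AVPlayerItem(url: streamURL)
let player = AVPlayer(playerItem: item)

// Keep the observations alive for as long as the player exists.
let statusObservation = item.observe(\.status) { item, _ in
    if item.status == .failed {
        print("Item failed:", item.error ?? "unknown error")
    }
}
let waitObservation = player.observe(\.timeControlStatus) { player, _ in
    // A silent no-op play() shows up as .waitingToPlayAtSpecifiedRate
    // together with reasonForWaitingToPlay.
    print("timeControlStatus:", player.timeControlStatus.rawValue,
          "reason:", player.reasonForWaitingToPlay?.rawValue ?? "none")
}
player.play()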
Hi everyone,
After updating my Apple TV HD (model A1625) to tvOS 26, I've noticed a significant spike in CPU usage, up to 3× higher than before the update: from around 40% to 120%.
Model: Apple TV HD (A1625)
tvOS version: 26 (stable release) and the 26.1 beta
The app downgrades the stream due to the lack of CPU power.
If anyone else is experiencing this, please share your findings or workarounds.
Would love to hear from Apple engineers or other developers if this is a known regression or if there’s a recommended fix.
Thanks!
We have a Low-Latency HLS stream, and on iOS 26, even though the available bandwidth is sufficient, AVPlayer still selects a low-bandwidth rendition (e.g., RESOLUTION=640x360) for playback instead of a higher-bandwidth one (e.g., RESOLUTION=1920x1080) when using AVPlayerViewController with AVPlayer.
This works fine on iOS 18 and earlier versions. What could be the solution to this issue?
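For what it's worth, this is how we confirm which rendition was selected and what throughput AVPlayer measured (a short sketch using the item's access log):
import AVFoundation

func logSelectedVariant(for player: AVPlayer) {
    if let event = player.currentItem?.accessLog()?.events.last {
        // indicatedBitrate = BANDWIDTH of the selected variant;
        // observedBitrate = throughput AVPlayer measured on the network.
        print("indicated:", event.indicatedBitrate, "observed:", event.observedBitrate)
    }
}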
Hey there,
We've been seeing a high rate of 403 Invalid Authentication errors on the v1/me/library/artists endpoint for a few days now.
Does anyone have the same issue?
Hi,
After updating to iOS 26, our app is experiencing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error:
Domain [CoreMediaErrorDomain]
Code [-15628]
Description [The operation couldn’t be completed.]
Underlying Error Domain [(null)]
Code [0]
Description [(null)]
Environment:
iOS version: iOS 26
Stream type: HLS (m3u8) with segment (.ts) files
Observed behaviour:
We don’t have concrete steps to reproduce the issue, but so far we have observed that this error tends to occur under poor network conditions.
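For anyone trying to correlate this with their own delivery, this is the sketch we use to dump the item's error log when playback fails (standard AVFoundation; the function name is ours):
import AVFoundation

func dumpErrorLog(for player: AVPlayer) {
    guard let events = player.currentItem?.errorLog()?.events else { return }
    for event in events {
        // errorDomain/errorStatusCode match the CoreMediaErrorDomain -15628 we see.
        print(event.errorDomain, event.errorStatusCode, event.errorComment ?? "no comment")
    }
}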