Hi!
I am writing a browser extension that lets you control media playback on a music service website. Unfortunately, Safari does not support tracking changes to the audible property in the tabs.onUpdated event. Is there an alternative to this event? I'm looking for a way to detect when playback on the music service website is automatically interrupted.
Thank you.
We are encountering an issue with AVPlayer when there is an EXT-X-DISCONTINUITY misalignment between audio and video, produced after the insertion of gaps.
The objective is to introduce an EXT-X-DISCONTINUITY in the audio playlist after some missing segments (EXT-X-GAP) whose durations are aligned with the video segment durations, in order to handle irregular audio durations.
Please find below an example of corresponding video and audio playlists:
video:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045027,LOCAL=2025-05-09T12:38:32.369100Z
#EXT-X-MAP:URI="hls/StreamingBasic-video=979200.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.369111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524632.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524633.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524634.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524635.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524638.m4s
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.383111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524639.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524640.m4s
audio:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045867,LOCAL=2025-05-09T12:38:32.378400Z
#EXT-X-MAP:URI="hls/StreamingBasic-audio_99500_eng=98800.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.378444Z
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524632.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524633.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524634.m4s
#EXTINF:1.984, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524635.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524638.m4s
#EXT-X-DISCONTINUITY
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.778444Z
#EXTINF:1.6213, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524639.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524640.m4s
In this case playback is broken with AVPlayer.
Does this conform to HTTP Live Streaming?
Is it an AVPlayer bug?
What are the guidelines for handling such gaps?
Overlay changes color in HDR video
When I'm trying to add an overlay to a video with AVMutableVideoComposition and the video is HDR, the overlay colors change and white becomes grey.
[Screenshots: the original HDR video, the result from the code with the wrong overlay colors, and the result when reducing to SDR, which shows the correct overlay colors.]
I'm creating the overlay with a CGContext.
class CustomHdrCompositor: NSObject, AVVideoCompositing {
    private let coreImageContext = CIContext(options: [CIContextOption.cacheIntermediates: false])
    let combinedFilter = CIFilter(name: "CISourceOverCompositing")!

    var sourcePixelBufferAttributes: [String: Any]? =
        [String(kCVPixelBufferPixelFormatTypeKey): [kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange]]
    var requiredPixelBufferAttributesForRenderContext: [String: Any] =
        [String(kCVPixelBufferPixelFormatTypeKey): [kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange]]
    var supportsWideColorSourceFrames = true
    var supportsHDRSourceFrames = true

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let outputPixelBuffer = request.renderContext.newPixelBuffer() else {
            print("No valid pixel buffer found. Returning.")
            request.finish(with: CustomCompositorError.ciFilterFailedToProduceOutputImage)
            return
        }
        guard let requiredTrackIDs = request.videoCompositionInstruction.requiredSourceTrackIDs, !requiredTrackIDs.isEmpty else {
            print("No valid track IDs found in composition instruction.")
            return
        }
        let sourceCount = requiredTrackIDs.count
        if sourceCount > 1 {
            request.finish(with: CustomCompositorError.notSupportingMoreThanOneSources)
            return
        }
        if sourceCount == 1 {
            let sourceID = requiredTrackIDs[0]
            let sourceBuffer = request.sourceFrame(byTrackID: sourceID.value(of: Int32.self)!)!
            let sourceCIImage = CIImage(cvPixelBuffer: sourceBuffer)
            // Overlay image for the current timestamp, composited over the HDR source frame.
            let textImage = TextLayerPlayer.instance.getTextLayerAtTimesStamp(ts: request.compositionTime.seconds)
            combinedFilter.setValue(textImage, forKey: "inputImage")
            combinedFilter.setValue(sourceCIImage, forKey: "inputBackgroundImage")
            if let outputImage = combinedFilter.outputImage {
                let renderDestination = CIRenderDestination(pixelBuffer: outputPixelBuffer)
                do {
                    try coreImageContext.startTask(toRender: outputImage, to: renderDestination)
                } catch {
                    print("Rendering failed: \(error)")
                }
            }
        }
        request.finish(withComposedVideoFrame: outputPixelBuffer)
    }
}
func regularCompositionHdr(asset: AVAsset) -> AVVideoComposition {
    self.isHdr = checkHdr(asset: asset)
    let composition = AVMutableVideoComposition()
    composition.colorPrimaries = AVVideoColorPrimaries_ITU_R_2020
    composition.colorTransferFunction = AVVideoTransferFunction_ITU_R_2100_HLG
    composition.colorYCbCrMatrix = AVVideoYCbCrMatrix_ITU_R_2020
    composition.renderSize = assetSize
    composition.frameDuration = CMTime(value: 1, timescale: 30)
    composition.customVideoCompositorClass = CustomHdrCompositor.self
    composition.perFrameHDRDisplayMetadataPolicy = .propagate
    return composition
}
I'm using this function to convert the transparent CGImage into a CIImage that supports HDR:
func convertToHDRCIImage(from cgImage: CGImage,
                         maxBrightness: CGFloat = 3.0) -> CIImage? {
    // Create a CIImage from the input CGImage
    let baseImage = CIImage(cgImage: cgImage)

    // Create HDR color adjustment filter
    let colorAdjust = CIFilter(name: "CIColorMatrix")!
    colorAdjust.setValue(baseImage, forKey: kCIInputImageKey)

    // Calculate HDR multipliers based on maxBrightness
    // This will maintain color ratios while increasing brightness
    colorAdjust.setValue(CIVector(x: maxBrightness, y: 0, z: 0, w: 0), forKey: "inputRVector")
    colorAdjust.setValue(CIVector(x: 0, y: maxBrightness, z: 0, w: 0), forKey: "inputGVector")
    colorAdjust.setValue(CIVector(x: 0, y: 0, z: maxBrightness, w: 0), forKey: "inputBVector")
    // Maintain alpha channel
    colorAdjust.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")

    guard let adjustedImage = colorAdjust.outputImage else {
        return nil
    }

    // Apply color space transformation using CIImage's colorSpace property
    let transformedImage = adjustedImage.matchedFromWorkingSpace(to: hdrWorkingSpace)!

    // Create context with HDR color space
    let context = CIContext(options: [
        .workingColorSpace: hdrColorSpace,
        .outputColorSpace: hdrColorSpace
    ])

    // Get the image bounds
    let bounds = transformedImage.extent

    // Create a new pixel buffer with HDR format
    var pixelBuffer: CVPixelBuffer?
    let pixelBufferAttributes = [
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_64RGBAHalf,
        kCVPixelBufferMetalCompatibilityKey: true
    ] as CFDictionary
    CVPixelBufferCreate(kCFAllocatorDefault,
                        Int(bounds.width),
                        Int(bounds.height),
                        kCVPixelFormatType_64RGBAHalf,
                        pixelBufferAttributes,
                        &pixelBuffer)
    guard let destinationBuffer = pixelBuffer else {
        return nil
    }

    // Render the adjusted image into the HDR pixel buffer
    context.render(transformedImage,
                   to: destinationBuffer,
                   bounds: bounds,
                   colorSpace: hdrColorSpace)

    // Create final CIImage from the HDR pixel buffer
    let finalImage = CIImage(cvPixelBuffer: destinationBuffer,
                             options: [.colorSpace: hdrColorSpace])
    return finalImage
}
When reducing the HDR to SDR, the overlay keeps the correct colors, but that also removes the HDR effect, which I want to keep.
Topic:
Media Technologies
SubTopic:
Video
When multiple identical songs are added to a playlist, Playlist.Entry.id uses a suffix-based identifier (e.g. songID_0, songID_1, etc.). Removing one entry causes others to shift, changing their .id values. This leads to diffing errors and collection view crashes in SwiftUI or UIKit when entries are updated.
Steps to Reproduce:
Add the same song to a playlist multiple times.
Observe .id.rawValue of entries (e.g. i.SONGID_0, i.SONGID_1).
Remove one entry.
Fetch playlist again — note the other IDs have shifted.
FB18879062
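A client-side mitigation we are considering while this is open (a minimal sketch, not MusicKit API; StableEntryIdentities and songID are hypothetical names): diff on locally assigned identities that are matched to the previous snapshot by song ID, so surviving duplicates keep their identity even when the suffix-based entry IDs shift.
import Foundation

// Hypothetical sketch: maintain local identities keyed by song ID, reusing previously
// assigned UUIDs so surviving duplicates keep their identity when the suffix-based
// entry IDs shift between fetches. `songID` stands in for the entry's catalog/library ID.
final class StableEntryIdentities {
    private var previous: [String: [UUID]] = [:]

    func identities(forSongIDs songIDs: [String]) -> [UUID] {
        var available = previous           // IDs handed out last time, grouped by song
        var current: [String: [UUID]] = [:]
        var result: [UUID] = []
        for songID in songIDs {
            // Reuse an identity from the last snapshot if one is left, otherwise mint a new one.
            let id: UUID
            if var pool = available[songID], !pool.isEmpty {
                id = pool.removeFirst()
                available[songID] = pool
            } else {
                id = UUID()
            }
            current[songID, default: []].append(id)
            result.append(id)
        }
        previous = current
        return result
    }
}
Feeding these UUIDs to the diffable data source instead of .id.rawValue avoids the shifting-identifier crashes, at the cost of treating duplicate entries as interchangeable.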
Hello,
Our users have started to see a new fatal AVPlayer error during playback starting with iOS/tvOS 18.0. The error is defined as "CoreMediaErrorDomain Code=-15486".
We have not been able to reproduce this issue locally within our development team.
Is there any documentation on the cause of this error or steps to recover from this error?
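For context, here is the kind of diagnostics we plan to capture (a sketch using standard AVFoundation notifications; nothing in it is specific to -15486), so we can attach the underlying error and the item's error log to reports:
import AVFoundation

// Capture the underlying playback error and the item's error log when playback fails.
final class PlaybackDiagnostics {
    private var observers: [NSObjectProtocol] = []

    func observe(_ item: AVPlayerItem) {
        let center = NotificationCenter.default
        observers.append(center.addObserver(forName: AVPlayerItem.failedToPlayToEndTimeNotification,
                                            object: item, queue: .main) { note in
            let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError
            print("Playback failed: \(error?.domain ?? "?") code=\(error?.code ?? 0)")
        })
        observers.append(center.addObserver(forName: AVPlayerItem.newErrorLogEntryNotification,
                                            object: item, queue: .main) { _ in
            // The error log often contains the server/CDN response that triggered the failure.
            for event in item.errorLog()?.events ?? [] {
                print("errorLog: domain=\(event.errorDomain) code=\(event.errorStatusCode) comment=\(event.errorComment ?? "-")")
            }
        })
    }
}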
Thank you,
Howard
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
FairPlay Streaming
HTTP Live Streaming
AVFoundation
I use the Apple Music API to poll my listening history at regular intervals.
Every morning between 5:30AM and 7:30AM, I observe a strange pattern in the API responses. During this window, one or more of the regular polling intervals returns a response that differs significantly from the prior history response, even though I had no listening activity at that time.
I'm using this endpoint: https://api.music.apple.com/v1/me/recent/played/tracks?types=songs,library-songs&include[library-songs]=catalog&include[songs]=albums,artists
Here’s a concrete example from this morning:
Time: 5:45AM
Fetch 1 Tracks (subset):
1799261990, 1739657416, 1786317143, 1784288789, 1743250261, 1738681804, 1789325498, 1743036755, ...
Time: 5:50AM
Fetch 2 Tracks (subset):
1799261990, 1739657416, 1786317143, 1623924746, 1635185172, 1574004238, 1198763630, 1621299055, ...
Time: 5:55AM
Fetch 3 Tracks (subset):
1799261990, 1739657416, 1786317143, 1784288789, 1743250261, 1738681804, 1789325498, 1743036755, ...
At 5:50, a materially different history is returned, and then the next poll reverts to the prior history. I've listened to all of the tracks in each set, but the 5:50 response drops some recent tracks and returns some from further back in the history.
I've connected other accounts and the behavior is consistent and repeatable every day across them. It appears the API is temporarily returning a different (possibly outdated or cached?) view of the user's history during that early morning window.
Has anyone seen this behavior before?
Is this a known issue with the Apple Music API or MusicKit backend? I'd love any insights into what might cause this, or recommendations on how to work around it.
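In the meantime, the client-side workaround we are experimenting with (a hedged sketch; names like isLikelyTransientRegression and headWindow are illustrative, not API): flag a poll result as suspect when it drops recently seen tracks without adding anything new at the head, and skip it instead of overwriting local history.
// Treat a poll result as suspect if it drops track IDs from the head of the previous
// poll without adding new ones, and skip it rather than updating local history.
func isLikelyTransientRegression(previous: [String], current: [String], headWindow: Int = 10) -> Bool {
    let previousHead = Array(previous.prefix(headWindow))
    let currentSet = Set(current)
    let droppedFromHead = previousHead.filter { !currentSet.contains($0) }
    let hasNewHeadEntry = current.first.map { !previous.contains($0) } ?? false
    // Recently played tracks vanished and nothing new appeared at the head:
    // likely a stale/cached view rather than real listening activity.
    return !droppedFromHead.isEmpty && !hasNewHeadEntry
}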
Hello, we have an HLS streaming app on Apple TV. Our streams are DRM-protected. We have a problem with streams when the source device is turned off. For example, a user starts to watch our HLS DRM-protected content. After some time, the user turns off the device (it can be a monitor or a TV connected via HDMI). Our app does not detect that the HDMI source device has been turned off. Is there any way in Swift to detect that the connected HDMI device has been turned off?
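For reference, the only signal we have thought of experimenting with so far (a speculative sketch; whether it fires at all depends on the TV/monitor) is an audio route change when the display powers down:
import AVFoundation

// Speculative: observe audio route changes as a possible proxy for the HDMI sink going away.
final class HDMIRouteMonitor {
    private var observer: NSObjectProtocol?

    func start(onPossibleDisconnect: @escaping () -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil, queue: .main
        ) { note in
            guard let rawReason = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
                  let reason = AVAudioSession.RouteChangeReason(rawValue: rawReason) else { return }
            if reason == .oldDeviceUnavailable {
                onPossibleDisconnect()
            }
        }
    }
}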
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
FairPlay Streaming
Swift
Apple TV
HTTP Live Streaming
Our Final Cut Pro workflow extension, built with the ProExtensionHost framework, uses an advanced NSPasteboardItemDataProvider system with multi-version FCPXML support (1.9, 1.10, 1.13) and proper relative path UIDs for Motion templates. We've implemented a clip wrapper approach with placeholder assets and elements containing effects to enable direct timeline drag functionality. However, drag and drop from our Final Cut Pro workflow extension directly to the timeline is still not working, despite a proper element structure in our FCPXML. Our implementation creates valid clip elements with effects applied, but the Final Cut Pro timeline doesn't accept them during drag operations from our ProExtensionHost-based workflow extension.
Steps to Reproduce:
Create Final Cut Pro workflow extension using ProExtensionHost framework with NSPasteboardItemDataProvider implementation
Generate FCPXML with proper element structure:
Expected Result: Clip should be accepted by timeline and effect applied from workflow extension
Actual Result: Timeline rejects drag operation from ProExtensionHost-based workflow extension
Question: Are there additional requirements or ProExtensionHost API calls needed beyond standard NSPasteboardItemDataProvider for Final Cut Pro workflow extension timeline drag functionality?
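For reference, this is roughly the pasteboard wiring we use (a trimmed sketch; the pasteboard type string is a placeholder for whatever type Final Cut Pro actually expects, and it does not by itself answer whether extra ProExtensionHost calls are needed):
import AppKit

// Lazily provides FCPXML data when Final Cut Pro asks for it during the drop.
final class FCPXMLDragProvider: NSObject, NSPasteboardItemDataProvider {
    // Placeholder type string -- substitute the type Final Cut Pro actually expects.
    let fcpxmlType = NSPasteboard.PasteboardType("com.example.fcpxml-placeholder")
    let fcpxml: String

    init(fcpxml: String) {
        self.fcpxml = fcpxml
    }

    func makePasteboardItem() -> NSPasteboardItem {
        let item = NSPasteboardItem()
        // Data is provided lazily when the drop target requests this type.
        _ = item.setDataProvider(self, forTypes: [fcpxmlType])
        return item
    }

    func pasteboard(_ pasteboard: NSPasteboard?, item: NSPasteboardItem,
                    provideDataForType type: NSPasteboard.PasteboardType) {
        if type == fcpxmlType {
            item.setData(Data(fcpxml.utf8), forType: type)
        }
    }
}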
I am writing an iOS app to present a slide show of assets in a Photo album, in a random order, including videos and live photos. I have got it all working quite nicely but for a Live Photo, I need to know what effect is selected (Live, Loop, Bounce, Long Exposure, Live Off) to display the image correctly. I can't find any mention of getting this information in the documentation. Anyone know how to do this? Thanks in advance.
Adrian.
(Xcode 16.1 iOS 18.0)
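One API that seems at least related is PHAsset.playbackStyle, which reports how Photos intends to play the asset; a minimal sketch of reading it follows (whether it reflects the chosen effect would need testing, and it likely cannot distinguish Loop from Bounce):
import Photos

// Map the asset's intended playback style to a description of how to present it.
func describePlayback(for asset: PHAsset) -> String {
    switch asset.playbackStyle {
    case .livePhoto:      return "Play as a Live Photo"
    case .videoLooping:   return "Play as a looping video (Loop/Bounce-style)"
    case .video:          return "Play as a regular video"
    case .imageAnimated:  return "Play as an animated image"
    case .image:          return "Show as a still image (e.g. a Long Exposure result)"
    case .unsupported:    return "Unsupported"
    @unknown default:     return "Unknown"
    }
}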
Topic:
Media Technologies
SubTopic:
Photos & Camera
Hi,
I need to read whether the transport is playing or stopped, but my current method that works for VST does not work for AU.
Is there an LPX (Logic Pro X) resource available for developers anywhere?
if (auto* playHead = processor->getPlayHead())
{
    juce::AudioPlayHead::CurrentPositionInfo posInfo;
    if (playHead->getCurrentPosition (posInfo))
    {
        bool isCurrentlyPlaying = posInfo.isPlaying;
        if (isCurrentlyPlaying != wasTransportPlaying)
        {
            if (isCurrentlyPlaying)
            {
                wasTransportPlaying = isCurrentlyPlaying;
                startAllTimers();
            }
            else
            {
                wasTransportPlaying = isCurrentlyPlaying;
                stopAllTimers();
            }
        }
    }
}
thanks :)
I have an iPad app that I want to run on Apple Silicon macs.
Everything works fine except for VNDocumentCameraViewController. According to the docs this class is available on:
iOS 13.0+ iPadOS 13.0+ Mac Catalyst 13.1+ visionOS 1.0+
yet when I try using it, I get "Document camera is not available" on my Mac Studio running macOS 15.2.
Is this expected behaviour?
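For reference, a minimal runtime-check sketch (standard VisionKit API) that gates presentation on isSupported rather than presenting and failing:
import UIKit
import VisionKit

// Only present the document camera if the current environment supports it
// (on Apple Silicon Macs running an iPad app it may simply report false).
func presentDocumentCameraIfAvailable(from presenter: UIViewController,
                                      delegate: VNDocumentCameraViewControllerDelegate) {
    guard VNDocumentCameraViewController.isSupported else {
        print("Document camera is not supported in this environment.")
        return
    }
    let scanner = VNDocumentCameraViewController()
    scanner.delegate = delegate
    presenter.present(scanner, animated: true)
}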
Thanks
I am working to update a live blog in Apple News. As far as I know, there is an update endpoint to update content in Apple News. Is there any feature in Apple News to trigger an event when the original content is updated and to pull the updated content?
Context
We develop an iOS/Apple TV app that allows to play HLS+FP Live streams (custom playback UI), some of which use the same FairPlay content key id. All FairPlay content keys are requested to the same content key server.
Implementation
Despite Apple documentation warning not to reuse AVContentKeySessions, we use only one AVContentKeySession for all channels, which allows the system to reuse the content key when a content key id is met again. As seen in another thread, people seem to think this is OK.
Issue
When reusing the AVContentKeySession and the user quickly tunes channels multiple times (up to 2 or 3 times per second using gestures), an inconsistency may occur where the content key request for a previous stream is delivered to the delegate after a new stream is already being prepared and its AVURLAsset has already been assigned as the content key session's AVContentKeyRecipient. Note that the previous content key recipient is removed before the new one is added.
Crashes have also been reported to us (though I haven't experienced them myself) when performing multiple channel tunings, which makes us think that the AVContentKeySession should definitely not be reused.
Note: On the other hand if a new AVContentKeySession is used for each stream, the system systematically requests a content key even if previous streams have used the same content key id. In this case, neither the crash nor the inconsistency issue are observed but it dramatically increases the number of calls to the content key server.
Questions
Should AVContentKeySessions definitely not be reused? Otherwise, how to handle the inconsistency issue described above?
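If reuse turns out to be unsupported, this is roughly the per-stream alternative described above (a sketch; keyDelegate and keyQueue are assumed to exist elsewhere), which trades more key-server traffic for isolation between streams:
import AVFoundation

// One AVContentKeySession per AVURLAsset, so a late key request can never target
// a recipient that belongs to a different (already replaced) stream.
func makeAsset(for url: URL,
               keyDelegate: AVContentKeySessionDelegate,
               keyQueue: DispatchQueue) -> (AVURLAsset, AVContentKeySession) {
    let asset = AVURLAsset(url: url)
    let session = AVContentKeySession(keySystem: .fairPlayStreaming)
    session.setDelegate(keyDelegate, queue: keyQueue)
    session.addContentKeyRecipient(asset)
    return (asset, session)
}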
Before you post "the camera doesn't work on the Simulator": that's no longer true. I've made a solution that makes the Simulator believe there's an actual hardware device connected, allowing users to stream the macOS camera to the iOS Simulator (for more info, see RocketSim's documentation: https://docs.rocketsim.app/features/hzQMSrSga7BGWvxdNVdwYs/simulator-camera-support/58tQ5jvevLNSnyUEA7VgAv)
Now, it works for VNDocumentCameraViewController, but when I try opening DataScannerViewController, I directly run into:
Failed to start scanning: The operation couldn’t be completed. (VisionKit.DataScannerViewController.ScanningUnavailable error 0.)
My question:
How does this view controller determine whether scanning is available?
Is there perhaps a certain capability the available AVCaptureDevices need to support?
Any direction would be helpful for me to make this work for developers, making them build apps faster!
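For reference, here is the availability probe I am using while investigating (standard VisionKit class properties):
import VisionKit

// Scanning is gated on both class-level checks; logging them may reveal which
// condition the Simulator camera bridge fails to satisfy.
@MainActor
func logDataScannerAvailability() {
    // isSupported: device/platform capability; isAvailable: runtime state
    // (camera access granted, no content restrictions, etc.).
    print("DataScanner isSupported: \(DataScannerViewController.isSupported)")
    print("DataScanner isAvailable: \(DataScannerViewController.isAvailable)")
}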
Is it possible to use AVExternalStorageDevice to access external storage from a connected camera or USB drive (via a USB-C or Lightning connector) on an iPad/iPhone?
I have tested the following code on an iPhone 14 (iOS 18.1.1) and an iPad Gen 10 (18.3.1), and both return false for:
// returns false on iPhone 14, iPad gen 10
print(AVExternalStorageDeviceDiscoverySession.isSupported)
The following code returns null, when I try to access the external storage discovery session.
// returns null on iOS devices
print(AVExternalStorageDeviceDiscoverySession.shared)
The following returns false, without displaying a permission dialog:
AVExternalStorageDevice.requestAccess(completionHandler: { (granted: Bool) in
    // returns false with no permission dialog
    print(granted)
})
What type of iOS devices are supported by AVExternalStorageDeviceDiscoverySession?
What situations is it intended for (e.g. connecting to a camera via the external storage protocol, accessing photos from an SD card with an adapter, accessing photos from a USB drive)?
Is there any sample code for using the AVExternalStorage API?
I'm using AVFoundation to make a multi-track editor app, which can insert multiple tracks and clips, including scaling some clips to change their speed (I'm also not sure whether AVFoundation is the best choice for me). After scaling with the scaleTimeRange API, there are short bursts of noise during playback. Also, sometimes playback of the AVMutableComposition with AVPlayer and AVPlayerItem is fine, but after exporting with AVAssetReader there are short bursts of noise in the resulting file... Not sure why.
Here is the example project, which can build and run directly. https://github.com/luckysmg/daily_images/raw/refs/heads/main/TestDemo.zip
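One thing I am considering trying (a hedged guess, not a confirmed fix): pinning the audio time-pitch algorithm explicitly on both the playback and export paths, since retimed clips are processed through it and the chosen algorithm can affect artifacts at clip boundaries.
import AVFoundation

// Pin the time-pitch algorithm used for rate-changed audio on playback and on export.
func configureTimePitch(playerItem: AVPlayerItem, readerAudioOutput: AVAssetReaderAudioMixOutput) {
    // .spectral favors quality, .timeDomain is cheaper, .varispeed changes pitch with rate.
    playerItem.audioTimePitchAlgorithm = .spectral
    readerAudioOutput.audioTimePitchAlgorithm = .spectral
}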
Hi. I am working on an audio app for iOS. I have added the CPNowPlayingPlaybackRateButton to my CPNowPlayingTemplate.
When the button is clicked, my handler changes the rate in the AVPlayer and updates the MPNowPlayingInfoCenter to the new rate, for example, 2.0.
Throughout, the CarPlay button always displays "0x". I am wondering how to get this UI to accurately reflect the playback rate the user has selected, as always displaying 0x is a poor user experience.
You may suggest MPChangePlaybackRateCommand is relevant here, but I have not been able to get that to work either, and judging by posts online, not many other people have either. I have made a post about that here: https://developer.apple.com/forums/thread/773099
Is this a known Apple bug? Is there a way to get the UI to accurately reflect the playback rate of my audio?
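For completeness, this is roughly how I update the Now Playing info when the rate changes (a sketch; the keys are standard MPNowPlayingInfoCenter properties), in case something here is wrong or missing:
import MediaPlayer

// Keep both the current and default playback rate in the Now Playing info in sync
// with the AVPlayer, refreshing elapsed time at the moment the rate changes.
func updateNowPlayingRate(to rate: Double, elapsed: TimeInterval) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyPlaybackRate] = rate
    info[MPNowPlayingInfoPropertyDefaultPlaybackRate] = rate
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsed
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}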
Kind regards.
Topic:
Media Technologies
SubTopic:
Audio
Hello everyone,
I’m new to Swift development and have been working on an audio module that plays a specific sound at regular intervals - similar to a workout timer that signals switching exercises every few minutes.
Following AVFoundation documentation, I’m configuring my audio session like this:
let session = AVAudioSession.sharedInstance()
try session.setCategory(
    .playback,
    mode: .default,
    options: [.interruptSpokenAudioAndMixWithOthers, .duckOthers]
)
self.engine.attach(self.player)
self.engine.connect(self.player, to: self.engine.outputNode, format: self.audioFormat)
try? session.setActive(true)
When it’s time to play cues, I schedule playback on a DispatchQueue:
// scheduleAudio uses DispatchQueue
self.scheduleAudio(at: interval.start) {
    do {
        try audio.engine.start()
        audio.node.play()
        for sample in interval.samples {
            audio.node.scheduleBuffer(sample.buffer, at: AVAudioTime(hostTime: sample.hostTime))
        }
    } catch {
        print("Audio activation failed: \(error)")
    }
}
This works perfectly in the foreground. But once the app goes into the background, the scheduled callback runs, yet the audio engine fails to start, resulting in an error with code 561015905.
Interestingly, if the app is already playing audio before going to the background, the scheduled sounds continue to play as expected.
I have added the required background audio mode to my Info.plist file by including the UIBackgroundModes key with the value audio.
Is there anything else I should configure? What is the best practice to play periodic audio when the app runs in the background? How do apps like turn-by-turn navigation handle continuous audio playback in the background?
Any advice or pointers would be greatly appreciated!
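For what it's worth, error 561015905 appears to correspond to '!pla' (AVAudioSession.ErrorCode.cannotStartPlaying). A sketch of the approach I am considering based on that reading (an assumption, not a confirmed fix): start the engine while the app is still in the foreground and keep it running, so the background callback only schedules buffers.
import AVFoundation

// Start the session and engine while foregrounded; background callbacks only schedule buffers.
final class CueEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()

    func prepareWhileInForeground(format: AVAudioFormat) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default,
                                options: [.interruptSpokenAudioAndMixWithOthers, .duckOthers])
        try session.setActive(true)
        engine.attach(player)
        engine.connect(player, to: engine.outputNode, format: format)
        try engine.start()          // started while the app is still in the foreground
        player.play()
    }

    func schedule(buffer: AVAudioPCMBuffer, at hostTime: UInt64) {
        // The engine is already running; scheduling is safe from a background callback.
        player.scheduleBuffer(buffer, at: AVAudioTime(hostTime: hostTime))
    }
}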
Hi. I am working on an audio app for iOS. I have implemented UI and handling which allows the user to change playback rate of audio. When the user selects a different rate, I update the rate property on my AVQueuePlayer. This is working well on device.
When I use Airplay, it works for some devices and not for others. Some devices won't change playback rate and will always play at 1x speed.
Is this possibly a limitation of those 3rd-party devices? Or is there something I'm missing/should check? Would love to get playback rate changes working across all Airplay devices with our app.
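For reference, a small check I am considering (assuming the standard AVPlayerItem capability flags) to decide whether to offer speed controls for the current item and route:
import AVFoundation

// canPlaySlowForward covers 0 < rate < 1, canPlayFastForward covers rate > 1.
func supportsRateChanges(_ item: AVPlayerItem) -> Bool {
    return item.canPlaySlowForward && item.canPlayFastForward
}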
Kind regards.
Since macOS 26, Apple Music has had inconsistent drops in the quality of some tracks, seemingly at random. I don't know if others have experienced it. It doesn't happen on the speakers or when connected via Bluetooth, but the AUX I/O has it quite often. It is more noticeable on headphones with 48 kHz and higher frequency bandwidth.
Here is the feedback number: FB18062589