Hi,
I'm working on an audio mixing app that comes with bundled audio units providing some of the app's core functionality.
For the next release of that app, we are planning to make two changes:
make the app sandboxed
package the bundled audio units as .appex bundles instead of .component bundles, so we don't have to handle installing them at the correct location in the file system
When trying this new approach, we run into a problem where [[AVAudioUnitEffect alloc] initWithAudioComponentDescription:] crashes while trying to load our audio unit, with the exception:
AVAEInternal.h:109 [AUInterface.mm:468:AUInterfaceBaseV3: (AudioComponentInstanceNew(comp, &_auv2)): error -10863
Our audio unit has the sandboxSafe flag enabled and loads fine when the host app is not sandboxed, so I'm guessing I got the bundle ID/code-signing requirements for the .appex right.
It seems that my .appex isn't even loaded and that the system rejects it because of its metadata. Maybe there is something wrong with the Info.plist generated by JUCE?
"BuildMachineOSBuild" => "23H222"
"CFBundleDisplayName" => "elgato_sample_recorder"
"CFBundleExecutable" => "ElgatoSampleRecorder"
"CFBundleIdentifier" => "com.iwascoding.EffectLoader.samplerecorderAUv3"
"CFBundleName" => "elgato_sample_recorder"
"CFBundlePackageType" => "XPC!"
"CFBundleShortVersionString" => "1.0.0.0"
"CFBundleSignature" => "????"
"CFBundleSupportedPlatforms" => [
0 => "MacOSX"
]
"CFBundleVersion" => "1.0.0.0"
"DTCompiler" => "com.apple.compilers.llvm.clang.1_0"
"DTPlatformBuild" => "24C94"
"DTPlatformName" => "macosx"
"DTPlatformVersion" => "15.2"
"DTSDKBuild" => "24C94"
"DTSDKName" => "macosx15.2"
"DTXcode" => "1620"
"DTXcodeBuild" => "16C5032a"
"LSMinimumSystemVersion" => "10.13"
"NSExtension" => {
"NSExtensionAttributes" => {
"AudioComponents" => [
0 => {
"description" => "Elgato Sample Recorder"
"factoryFunction" => "elgato_sample_recorderAUFactoryAUv3"
"manufacturer" => "Manu"
"name" => "Elgato: Elgato Sample Recorder"
"sandboxSafe" => 1
"subtype" => "Znyk"
"tags" => [
0 => "Effects"
]
"type" => "aufx"
"version" => 65536
}
]
}
"NSExtensionPointIdentifier" => "com.apple.AudioUnit-UI"
"NSExtensionPrincipalClass" => "elgato_sample_recorderAUFactoryAUv3"
}
"NSHighResolutionCapable" => 1
}
Any ideas what I am missing?
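For reference, a minimal sketch of the asynchronous, out-of-process instantiation path that AUv3 extensions normally require. The component description values are taken from the AudioComponents entry above; whether this alone avoids the -10863 error is exactly what's in question:

import AVFoundation

let desc = AudioComponentDescription(
    componentType: kAudioUnitType_Effect,   // 'aufx'
    componentSubType: 0x5A6E796B,           // 'Znyk'
    componentManufacturer: 0x4D616E75,      // 'Manu'
    componentFlags: 0,
    componentFlagsMask: 0
)

// AUv3 app extensions should be instantiated asynchronously and out of process.
AVAudioUnit.instantiate(with: desc, options: [.loadOutOfProcess]) { unit, error in
    if let error = error {
        print("Instantiation failed: \(error)")
    } else if let unit = unit {
        print("Loaded \(unit.auAudioUnit.componentName ?? "unit")")
    }
}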
The following is my playground code. Any of the Apple audio units show the plugin view, but anything else (e.g. Kontakt, Spitfire, etc.) does not. It does not error; the area where the visual is expected is just blank.
import AppKit
import PlaygroundSupport
import AudioToolbox
import AVFoundation
import CoreAudioKit

let manager = AVAudioUnitComponentManager.shared()
let description = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice,
                                            componentSubType: 0,
                                            componentManufacturer: 0,
                                            componentFlags: 0,
                                            componentFlagsMask: 0)

var deviceComponents = manager.components(matching: description)
var names = deviceComponents.map { $0.name }

let pluginName: String = "AUSampler" // This works
//let pluginName: String = "Kontakt" // This does not

var plugin = deviceComponents.filter { $0.name.contains(pluginName) }.first!
print("Plugin name: \(plugin.name)")

var customViewController: NSViewController?

AVAudioUnit.instantiate(with: plugin.audioComponentDescription, options: []) { avAudioUnit, error in
    var ilip = avAudioUnit!.auAudioUnit.isLoadedInProcess
    print("Loaded in process: \(ilip)")
    guard error == nil else {
        print("Error: \(error!.localizedDescription)")
        return
    }
    print("AudioUnit successfully created.")
    let audioUnit = avAudioUnit!.auAudioUnit
    audioUnit.requestViewController { vc in
        if let viewCtrl = vc {
            customViewController = viewCtrl
            var b = viewCtrl.view.bounds
            PlaygroundPage.current.liveView = viewCtrl
            print("Successfully added view controller.")
        } else {
            print("Failed to load controller.")
        }
    }
}
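One diagnostic worth adding, as a minimal sketch continuing from the playground above (assuming the third-party plug-in really does ship a custom view): query providesUserInterface on the instantiated AUAudioUnit, so a blank live view can be told apart from a unit that simply returns no view controller in this context.

// Hypothetical diagnostic: does the instantiated unit claim to provide a UI at all?
AVAudioUnit.instantiate(with: plugin.audioComponentDescription, options: []) { avAudioUnit, error in
    guard let au = avAudioUnit?.auAudioUnit, error == nil else { return }
    print("providesUserInterface: \(au.providesUserInterface)")
    au.requestViewController { vc in
        print("View controller returned: \(vc != nil)")
    }
}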
In our logging tools (Firebase) I see a lot of errors reported when users are playing content and the app transitions to the background. An AVPlayerItemFailedToPlayToEndTime notification is fired with an error containing codes like -1102 and 1852797029, which seem to correspond to NSURLErrorNoPermissionsToReadFile and kCMIOHardwareIllegalOperationError respectively. To me, it looks like these might have something to do with caching logic.
The items being played are HLS streams, and we use AVAssetDownloadTask to make streamed content available offline. Our setup is similar to the sample provided here: https://developer.apple.com/documentation/avfoundation/using-avfoundation-to-play-and-persist-http-live-streams. Whenever an item is selected for playback, the app checks whether a cached version is available and, if so, gets the URL of the stored file as in the "localAssetForStream()" method in the example; otherwise it gets the asset from a currently running AVAssetDownloadTask for the item, or else starts a new AVAssetDownloadTask and returns an AVAsset from that task to play.
This seems to work fine, and I can't reproduce the issues our users and our logging tools are reporting.
Is there some case I am missing where AVAssetDownloadTask and associated AVAssets might become unreadable when the app transitions to the background? Or do these errors indicate a different problem entirely?
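For reference, a minimal sketch of the kind of observer that surfaces these errors (illustrative, not our exact logging code), pulling the underlying NSError out of the notification so the domain and code can be reported:

import AVFoundation

let observer = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemFailedToPlayToEndTime,
    object: nil,
    queue: .main
) { notification in
    // The failing error is delivered in the notification's userInfo.
    if let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError {
        print("Failed to play to end: domain=\(error.domain) code=\(error.code)")
        print("Underlying error: \(String(describing: error.userInfo[NSUnderlyingErrorKey]))")
    }
}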
Every time I put my AirPods in and connect them to my phone, my Mac, or my iPad, they disconnect without reason, pause songs I'm in the middle of playing, and only partially reconnect in one pod. This has been happening since the iOS 18.3 update on my devices, and it's getting really frustrating.
How can I implement a custom CIFilter that works like Lightroom's Color Grading tool, with separate adjustments for shadows, midtones, highlights, and a global area?
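No filter code yet, but the core of such a tool is a per-pixel weighting of shadows, midtones, and highlights by luminance, with a tint applied per band plus a global offset. A minimal sketch of that math in plain Swift (illustrative only; in a real CIFilter this logic would live inside a Core Image Metal kernel, and the tint parameters are hypothetical):

import simd

// Smooth weights for shadows / midtones / highlights derived from luminance.
func gradingWeights(luma: Float) -> (shadows: Float, midtones: Float, highlights: Float) {
    let shadows = 1 - smoothstep(0.0, 0.5, luma)
    let highlights = smoothstep(0.5, 1.0, luma)
    let midtones = max(0, 1 - shadows - highlights)
    return (shadows, midtones, highlights)
}

func smoothstep(_ edge0: Float, _ edge1: Float, _ x: Float) -> Float {
    let t = min(max((x - edge0) / (edge1 - edge0), 0), 1)
    return t * t * (3 - 2 * t)
}

// Apply per-band tints (as chosen in the UI) plus a global tint to one RGB pixel.
func grade(pixel: SIMD3<Float>, shadowTint: SIMD3<Float>, midTint: SIMD3<Float>,
           highlightTint: SIMD3<Float>, globalTint: SIMD3<Float>) -> SIMD3<Float> {
    let luma = dot(pixel, SIMD3<Float>(0.2126, 0.7152, 0.0722)) // Rec. 709 luminance
    let w = gradingWeights(luma: luma)
    return pixel
        + shadowTint * w.shadows
        + midTint * w.midtones
        + highlightTint * w.highlights
        + globalTint
}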
I want my iOS app to be able to use USB client mode to send LiDAR data and camera frames to another device. What are my options for doing this? I've found IOUSBHost for host mode, but I want to see all my options. The device I want driving the bus is a Meta Quest 3, which, for clarity, is essentially an Android device. The iPhone is to be used as a sensor hub, sending data to the Quest 3 for further processing.
As a workaround, I could let the iPhone drive the bus and have the Quest 3 use Android's accessory mode, which lets other devices drive the USB bus. But there are more USB devices I want to attach for my project, and doing this would make that more difficult, so I want to avoid it.
Context
We develop an iOS/Apple TV app that plays HLS+FairPlay live streams (with a custom playback UI), some of which use the same FairPlay content key ID. All FairPlay content keys are requested from the same content key server.
Implementation
Despite Apple documentation warning not to reuse AVContentKeySessions, we use only one AVContentKeySession for all channels, which allows the system to reuse the content key when a content key ID is met again. As seen in another thread, people seem to think this is OK.
Issue
When reusing the AVContentKeySession and the user quickly tunes channels multiple times (up to 2 or 3 times per second using gestures), an inconsistency may occur where the content key request for a previous stream is delivered to the delegate after a new stream is already being prepared and its AVURLAsset has already been added as the content key session's AVContentKeyRecipient. Note that the previous content key recipient is removed before the new one is added.
We have also received crash reports (though I haven't experienced a crash myself) when performing multiple channel tunings, which makes us think that the AVContentKeySession should definitely not be reused.
Note: On the other hand, if a new AVContentKeySession is used for each stream, the system systematically requests a content key even if previous streams have used the same content key ID. In that case, neither the crash nor the inconsistency is observed, but it dramatically increases the number of calls to the content key server.
Questions
Should AVContentKeySessions definitely not be reused? If they can be reused, how should we handle the inconsistency issue described above?
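For context, the tuning sequence looks roughly like this; a minimal sketch (type and method names other than the AVContentKeySession calls are illustrative), showing the single shared session with the recipient swapped on each zap:

import AVFoundation

final class ChannelTuner {
    // One session reused across channels so previously loaded keys can be reused.
    let keySession = AVContentKeySession(keySystem: .fairPlayStreaming)
    private var currentAsset: AVURLAsset?

    func tune(to url: URL) {
        // Detach the previous stream before attaching the new one.
        if let previous = currentAsset {
            keySession.removeContentKeyRecipient(previous)
        }
        let asset = AVURLAsset(url: url)
        keySession.addContentKeyRecipient(asset)
        currentAsset = asset
        // ... hand the asset to an AVPlayerItem / AVPlayer as usual.
    }
}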
Let's consider the following code.
I've created an actor that loads a list of .mp3 files from a Bundle and then makes them available for audio playback.
Unfortunately, I'm experiencing a memory leak at the play method:
player.play()
From Instruments I get:
_malloc_type_malloc_outlined libsystem_malloc.dylib
start_wqthread libsystem_pthread.dylib
private actor AudioActor {
    enum Failure: Error {
        case soundsNotLoaded([AudioPlayerClient.Sound: Error])
    }

    enum Player {
        case music(AVAudioPlayer)
    }

    var players: [Sound: Player] = [:]
    let bundles: [Bundle]

    init(bundles: UncheckedSendable<[Bundle]>) {
        self.bundles = bundles.wrappedValue
    }

    func load(sounds: [Sound]) throws {
        try AVAudioSession.sharedInstance().setActive(true, options: [])
        var errors: [Sound: Error] = [:]
        for sound in sounds {
            // Search each bundle for the sound file.
            guard let url = bundles
                .compactMap({ $0.url(forResource: sound.name, withExtension: "mp3") })
                .first
            else { continue }
            do {
                self.players[sound] = try .music(AVAudioPlayer(contentsOf: url))
            } catch {
                errors[sound] = error
            }
        }
        guard errors.isEmpty
        else { throw Failure.soundsNotLoaded(errors) }
    }

    func play(sound: Sound, loops: Int?) throws {
        guard let player = self.players[sound]
        else { return }
        switch player {
        case let .music(player):
            player.numberOfLoops = loops ?? -1
            player.play()
        }
    }

    func stop(sound: Sound) throws {
        guard let player = self.players[sound]
        else { throw Failure.soundsNotLoaded([:]) }
        switch player {
        case let .music(player):
            player.stop()
        }
    }
}
We noticed that the behaviour of FairPlay license expiration changed from iOS 16.x to some iOS 17 version, the latest iOS 18.3.2, and Safari.
On iOS 16.x, playback stops when the license expires, but on iOS 17.x and later the video continues with no audio and no error fired.
On the latest Safari both video and audio continue.
Has anything changed in the latest FairPlay, and how should we adapt to this on the license server side?
Thanks
According to the docs:
The first time your app performs an operation that requires [photo library] authorization, the system automatically and asynchronously prompts the user for it.
(https://developer.apple.com/documentation/photokit/delivering-an-enhanced-privacy-experience-in-your-photos-app)
I.e. it's not necessary for the app to call PHPhotoLibrary.requestAuthorization.
This does seem to be what happens when my app runs on an iPhone or iPad; the prompt is shown. But when it runs on a Mac in "designed for iPad" mode, the permission dialog is not presented. Instead the code continues to see status == .notDetermined.
That's today, on macOS 15.3. It may have worked in the past.
Is anyone else seeing issues with this? Should I call requestAuthorization explicitly? (Would that actually work?)
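For completeness, the explicit request in question would be roughly this (a sketch; whether it actually presents the dialog in "Designed for iPad" mode is exactly what's unclear):

import Photos

// The explicit authorization request being considered.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .authorized, .limited:
        // Proceed with library access.
        break
    default:
        // Handle denial or restriction.
        break
    }
}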
Here we need to change the cookie every 120 seconds while playing. With Apple's AVPlayer we can't modify cookies after initialization, so we followed the approach of using a resource loader delegate to pass the cookie as a header value.
What I notice is that the playlist file (.m3u8) gets downloaded correctly, and some chunks of the media file (.m4a) also get downloaded. I know that the .ts file is downloaded because I can see the GET request completing on the web server with status 200. I also set a breakpoint at the following line:
loadingRequest.dataRequest?.respond(with: data)
Immediately afterwards I get the following error from the AVPlayer status:
"The operation could not be completed. An unknown error occurred (-12881) From core media"
I need confirmation on why I am unable to load HLS using the resource loader.
Is it possible to update the cookie value while AVPlayer keeps playing continuously?
override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    let urlString = "localhost://demo.unified-streaming.com/k8s/features/stable/video/tears-of-steel/tears-of-steel.ism/.m3u8"
    guard let url = URL(string: urlString) else {
        print("Invalid URL")
        return
    }

    // Create cookie to prepare for player asset
    let cookie = HTTPCookie(properties: [
        .name: "dazn-token",
        .value: "cookie value",
        .domain: url.host() ?? "",
        .path: "/",
        .discard: true
    ])

    // Create cookie key to set AVURLAsset
    let options = [AVURLAssetHTTPCookiesKey: [cookie]]
    let asset = AVURLAsset(url: url, options: options)

    proxy = ReverseProxyResourceLoader()
    proxy?.cookie = "exampleCookie"

    // Set resource loader delegate to monitor the chunks
    asset.resourceLoader.setDelegate(proxy, queue: DispatchQueue.global())

    // Load asset keys asynchronously (e.g., "playable")
    let keys = ["playable"]

    // Initialize the AVPlayer with the URL
    let playerItem = AVPlayerItem(asset: asset)
    self.player = AVPlayer(playerItem: playerItem)

    playerItem.addObserver(self, forKeyPath: "status", options: [.new, .initial], context: nil)
    // Observe 'error' property (if needed)
    playerItem.addObserver(self, forKeyPath: "error", options: [.new], context: nil)

    let contentKeySessionDelegate = ContentKeyDelegate()
    // Initialize AVContentKeySession
    let contentKeySession = AVContentKeySession(keySystem: .clearKey)
    self.contentKeySession = contentKeySession
    contentKeySession.setDelegate(contentKeySessionDelegate, queue: DispatchQueue.main)

    // Associate the asset with the content key session
    contentKeySession.addContentKeyRecipient(asset)

    // Create a layer for the AVPlayer and add it to the view
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.frame = view.bounds
    playerLayer?.videoGravity = .resizeAspect
    if let playerLayer = playerLayer {
        view.layer.addSublayer(playerLayer)
    }

    NotificationCenter.default.addObserver(
        self,
        selector: #selector(playerDidFinishPlaying),
        name: .AVPlayerItemDidPlayToEndTime,
        object: player?.currentItem
    )

    // Start playback
    player?.play()
}

// Update cookie whenever needed
func updateCookie() {
    proxy?.cookie = "update exampleCookie"
}

@objc private func playerDidFinishPlaying(notification: Notification) {
    print("Playback finished!")
    // Optionally, handle end-of-playback actions here
}
//
// ReverseProxyResourceLoader.swift
// HLSDemo
//
// Created by Gajje.Venkatarao on 12/12/24.
//
import Foundation
import AVKit
import AVFoundation

class ReverseProxyResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    var cookie = ""

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        resourceLoader.preloadsEligibleContentKeys = true

        guard let interceptedURL = loadingRequest.request.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Invalid URL"]))
            return false
        }

        if interceptedURL.scheme == "skd" {
            print("Token updated Cookie:", interceptedURL)
            return false
        }

        var components = URLComponents(url: interceptedURL, resolvingAgainstBaseURL: false)
        components?.scheme = "https" // Replace with the original scheme
        guard let originalURL = components?.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to map URL"]))
            return false
        }

        var request = URLRequest(url: originalURL)
        request.httpMethod = "GET"

        if let storeCookie = HTTPCookie(properties: [
            .name: "dazn-token",
            .value: cookie,
            .domain: originalURL.host ?? "",
            .path: "/",
            .discard: true
        ]) {
            HTTPCookieStorage.shared.setCookie(storeCookie)
        }

        let headers = loadingRequest.request.allHTTPHeaderFields ?? [:]
        for (key, value) in headers {
            request.addValue(value, forHTTPHeaderField: key)
        }
        request.addValue(cookie, forHTTPHeaderField: "Cookie")

        URLSession.shared.configuration.httpShouldSetCookies = true
        request.httpShouldHandleCookies = true

        // Note: the task is created with originalURL directly, so the headers set on
        // `request` above (including the Cookie header) are never attached to this request.
        let task = URLSession.shared.dataTask(with: originalURL) { data, response, error in
            if let error = error {
                print("Error Received:", error)
                loadingRequest.finishLoading(with: error)
                return
            }
            print(originalURL)
            guard let data = data, let url = response?.url else {
                loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1, userInfo: [NSLocalizedDescriptionKey: "No data received"]))
                return
            }
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }
        task.resume()
        return true
    }
}
Example project
It seems that 5G Network Slicing is currently only available for URLRequest / URLSession / low-level networking and not for AVPlayer. Is it possible to have the app use the default slice at launch and only enable Network Slicing when streaming video via AVPlayer?
I have spent a long time refactoring lots of older Swift code to compile without error in Swift 6.
The app is a v3 audio unit host and audio unit.
Having installed Sonoma and Xcode 16, I compile the code using Swift 6 and it compiles and runs without any warnings or errors.
My host will load my AU no problem.
LOGIC PRO is still the ONLY audio unit host that will load native Mac v3 audio units, so I like to test my code using Logic.
In Sonoma with Xcode 16...
My AU passes the most stringent auval tests, both in Terminal and in Logic Pro.
If I compile the AU source with Swift 5, Logic will see the AU, load it, and run it without problems.
But when I compile the AU with Swift 6, Logic sees the AU, scans it, and verifies that it passes the tests, but will not load it. In Xcode I see a log message that a "helper application failed to run", but the debugger never connects to the AU and I don't think Logic even gets as far as instantiating it.
So... what is causing this? I'm stumped.
Developing AUv3 is a brain-aching maze of undocumented hurdles, and I'm hoping someone might have found a solution for this one. Meanwhile I guess my only option is to continue using the Swift 5 compiler.
(A little note: all the DSP code is written in C/C++; Swift is used mainly for the user interface and for some offline threaded work.)
Hello,
I am experiencing slow image retrieval when using the requestImageForAsset:targetSize:contentMode:options:resultHandler: method in my application. The delay is significantly impacting the performance of my app.
Here are the details of my implementation:
for (PHAsset *asset in assets) {
    @autoreleasepool {
        PHImageManager *imageManager = [PHImageManager defaultManager];
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.synchronous = YES;
        options.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
        options.resizeMode = PHImageRequestOptionsResizeModeNone;

        [imageManager requestImageForAsset:asset
                                targetSize:CGSizeMake(100, 100)
                               contentMode:PHImageContentModeAspectFill
                                   options:options
                             resultHandler:^(UIImage *thumbnail, NSDictionary *info) {
            CameraRollCellDto *cellDto = [[CameraRollCellDto alloc] init];
            cellDto.index = index;
            cellDto.thumbnail = thumbnail;
            cellDto.propertyDate = asset.creationDate;
            if (self.segmentedTorikomi.selectedSegmentIndex == SEG_INDEX_IKKATSU) {
                cellDto.isSelected = YES;
            } else {
                cellDto.isSelected = NO;
            }
            [list addObject:cellDto];
        }];
        index++;
    }
}
Has anyone else encountered this issue? Are there any known solutions or optimizations that can help improve the speed of image retrieval using this method?
Thank you for your assistance.
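For comparison, a sketch of the asynchronous, caching variant that is often recommended for thumbnail grids (the PhotoKit calls are real; the surrounding function shape is an assumption):

import Photos
import UIKit

let cachingManager = PHCachingImageManager()
let thumbnailOptions: PHImageRequestOptions = {
    let options = PHImageRequestOptions()
    options.isSynchronous = false            // let requests run off the calling thread
    options.deliveryMode = .opportunistic    // fast degraded image first, better one later
    options.resizeMode = .fast
    return options
}()

// Warm the cache for the assets about to be displayed.
func prefetchThumbnails(for assets: [PHAsset], size: CGSize) {
    cachingManager.startCachingImages(for: assets, targetSize: size,
                                      contentMode: .aspectFill, options: thumbnailOptions)
}

// Request one thumbnail; the handler may be called twice with .opportunistic delivery.
func thumbnail(for asset: PHAsset, size: CGSize, completion: @escaping (UIImage?) -> Void) {
    cachingManager.requestImage(for: asset, targetSize: size,
                                contentMode: .aspectFill, options: thumbnailOptions) { image, _ in
        completion(image)
    }
}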
As of iOS 18, as far as I can tell, there still appear to be no AVPlayer options that allow users to toggle the caption/subtitle track on and off. Does anyone know of a way to do this with AVPlayer or with SwiftUI's VideoPlayer?
The following code reproduces this issue. It can be pasted into an app playground. This is a random video and a random vtt file I found on the internet.
import SwiftUI
import AVKit
import UIKit

struct ContentView: View {
    private let video = URL(string: "https://server15700.contentdm.oclc.org/dmwebservices/index.php?q=dmGetStreamingFile/p15700coll2/15.mp4/byte/json")!
    private let captions = URL(string: "https://gist.githubusercontent.com/samdutton/ca37f3adaf4e23679957b8083e061177/raw/e19399fbccbc069a2af4266e5120ae6bad62699a/sample.vtt")!

    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayerView(player: player)
                .frame(maxWidth: .infinity, maxHeight: 200)
        }
        .task {
            // Captions won't work for some reason
            player = try? await loadPlayer(video: video, captions: captions)
        }
    }
}

private struct VideoPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer?

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.modalPresentationStyle = .overFullScreen
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        uiViewController.player = player
    }
}

private func loadPlayer(video: URL, captions: URL?) async throws -> AVPlayer {
    let videoAsset = AVURLAsset(url: video)
    let videoPlusSubtitles = AVMutableComposition()

    try await videoPlusSubtitles.add(videoAsset, withMediaType: .video)
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .audio)

    if let captions {
        let captionAsset = AVURLAsset(url: captions)
        // Must add as .text. .closedCaption and .subtitle don't work?
        try await videoPlusSubtitles.add(captionAsset, withMediaType: .text)
    }

    return await AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
}

private extension AVMutableComposition {
    func add(_ asset: AVAsset, withMediaType mediaType: AVMediaType) async throws {
        let duration = try await asset.load(.duration)
        try await asset.loadTracks(withMediaType: mediaType).first.map { track in
            let newTrack = self.addMutableTrack(withMediaType: mediaType, preferredTrackID: kCMPersistentTrackID_Invalid)
            let range = CMTimeRangeMake(start: .zero, duration: duration)
            try newTrack?.insertTimeRange(range, of: track, at: .zero)
        }
    }
}
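For comparison, the route that usually exposes a caption toggle without an AVMutableComposition is the legible media selection group on the player item; a minimal sketch, assuming the stream itself (e.g. an HLS playlist) advertises subtitle renditions:

import AVFoundation

func setCaptions(_ enabled: Bool, for item: AVPlayerItem) async throws {
    // The legible group contains subtitle/caption options advertised by the asset.
    guard let group = try await item.asset.loadMediaSelectionGroup(for: .legible) else { return }
    if enabled {
        // Pick the first option here; a real app would match the user's locale.
        item.select(group.options.first, in: group)
    } else {
        item.select(nil, in: group) // hides captions
    }
}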
I am reaching out regarding an issue with my Apple FairPlay Streaming Certificate. To generate the certificate signing request (CSR), I used the following OpenSSL commands:
openssl genrsa -out private_key.pem 1024
openssl req -new -key private_key.pem -out request.csr
However, according to the guide provided by Apple and instructions from my DRM provider, I should have used:
openssl genrsa -aes256 -out privatekey.pem 1024
openssl req -new -sha1 -key privatekey.pem -out certreq.csr -subj "/CN=SubjectName /OU=OrganizationalUnit /O=Organization /C=US"
I suspect this discrepancy might be causing the issue with my FairPlay certificate. After obtaining the fairplay.cer file and importing it into Keychain Access, I noticed the following:
When I expand the certificate in Keychain Access, I can only see a public key and no private key.
As a result, I am unable to export the certificate as a .p12 file, as this option is disabled.
As per my DRM provider's instructions, I need to export the certificate along with the corresponding private key as a .p12 file with a password. Since the private key is not visible in Keychain Access, I am unable to proceed further.
I have read the FairPlay Streaming Overview but could not find any reasons as to why this issue is occurring or guidance on the procedure to revoke a certificate.
Additionally, I came across the terms and conditions which mentioned reaching out to product-security at Apple for assistance in revoking corrupt certificates. However, despite reaching out, I have not received a response.
Any help on how to proceed will be great!
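If the private_key.pem from the first set of commands is still available, one possible way to produce the .p12 without going through Keychain Access is OpenSSL itself; a sketch, assuming fairplay.cer is DER-encoded (the export password is whatever the DRM provider expects):

openssl x509 -inform der -in fairplay.cer -out fairplay_cert.pem
openssl pkcs12 -export -in fairplay_cert.pem -inkey private_key.pem -out fairplay.p12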
Hi,
I am trying to enable the default MIDINetworkSession in a Catalyst app on macOS like this:
MIDINetworkSession.default().isEnabled = true
MIDINetworkSession.default().connectionPolicy = .anyone
In the App Sandbox I have both incoming and outgoing network connections enabled. I also added the NSLocalNetworkUsageDescription key to the Info.plist. Bonjour services are also added to the Info.plist:
NSBonjourServices
_apple-midi._udp.
Nevertheless the session stays disabled. Running the same code works just fine on iOS.
Is there any special setup I need to do on macOS to enable the MIDINetworkSession?
Thanks!
On an iOS 18 phone, I use AVCaptureSession to capture HDR in the x420 format. The output CMSampleBuffer is in the HLG color space, and the propagated attachments contain kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey. Now I use CAMetalLayer to render the CVPixelBuffer to the screen, but the image is brighter than with AVSampleBufferDisplayLayer.
Here is my code.
- (void)_updateColorSpaceIfNeed:(CVPixelBufferRef)pixelBuffer {
    CAMetalLayer *layer = (CAMetalLayer *)_mtkView.layer;
    if (![layer isKindOfClass:CAMetalLayer.class]) return;
    layer.wantsExtendedDynamicRangeContent = YES;

    CFDataRef ambientViewingEnvironment = (CFDataRef)CVBufferCopyAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, NULL);
    NSData *data = (__bridge NSData *)ambientViewingEnvironment;
    if (ambientViewingEnvironment) CFRelease(ambientViewingEnvironment);

    CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadataWithAmbientViewingEnvironment:data];
    // CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadata];
    layer.EDRMetadata = metadata;
    layer.pixelFormat = MTLPixelFormatRGBA16Float;

    CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_2100_HLG);
    layer.colorspace = colorspace;
    if (colorspace) CGColorSpaceRelease(colorspace);
}
Why does the CAEDRMetadata class have "HLGMetadataWithAmbientViewingEnvironment:" and "HLGMetadata" methods, but does not provide the "HLGMetadataWithAmbientViewingEnvironment:sceneIllumination" method?
I want to know how kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey affect tone mapping. Is there any documentation I can refer to?
I need to implement a solution, through an API or a custom driver, to completely block out the built-in speakers and microphone of the Mac, because I need other apps to use specified external devices as audio input and output. Is there a way to achieve this? What I mean is that even in System Preferences it should not be possible to choose the built-in microphone and speakers; only my external device can be used.
In SwiftUI there is a built-in component for displaying album artworks called Artwork but there is no equivalent for UIKit.
My current approach is to use the .url() method to read image's URL and download the image or read it from the disk but the performance is much worse than it was previously with MPMediaItem's artworkImage method.
let artworkQueue = DispatchQueue(
    label: "MusicKit-ArtworkQueue",
    qos: .default,
    attributes: .concurrent
)
let artworkSemaphore = DispatchSemaphore(value: 5)

extension Song {
    func artworkImage(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        artworkQueue.async {
            artworkSemaphore.wait()
            defer {
                artworkSemaphore.signal()
            }
            let imageURL = artwork?.url(
                width: Int(size.width),
                height: Int(size.height)
            )
            // I hate doing this as it might very well break in the future
            guard let imageURL, imageURL.scheme == "musicKit"
            else {
                return completion(nil)
            }
            guard let imageData = try? Data(contentsOf: imageURL),
                  let image = UIImage(data: imageData) else {
                return completion(nil)
            }
            completion(image)
        }
    }
}
I really dislike this approach because it feels hacky, but it somewhat works. You might ask what the semaphore is for. Well, without it I noticed that MusicKit was choking after reading too many artworks at once.
Can someone from Apple please provide us with an example on how to use MusicKit with UIKit properly?
Ideally (IMO) we would have a method defined on Song and other MusicKit structures that returns the image for us, just like MPMediaItem had the .artwork() method. It would make our lives so much easier.
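For what it's worth, here is a less blocking variant of the same idea, as a sketch under the same assumptions (it still relies on the artwork URL and on Data(contentsOf:) accepting the musicKit:// scheme, exactly as the code above does):

import MusicKit
import UIKit

extension Song {
    func artworkImage(for size: CGSize) async -> UIImage? {
        guard let url = artwork?.url(width: Int(size.width), height: Int(size.height)) else {
            return nil
        }
        // Same synchronous read as the original, just pushed off the caller's thread.
        return await Task.detached(priority: .utility) { () -> UIImage? in
            guard let data = try? Data(contentsOf: url) else { return nil }
            return UIImage(data: data)
        }.value
    }
}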