In SwiftUI there is a built-in component for displaying album artwork, ArtworkImage, but there is no equivalent for UIKit.
My current approach is to use the .url(width:height:) method to get the image's URL and then download the image (or read it from disk), but the performance is much worse than it used to be with MPMediaItem's artwork property.
import MusicKit
import UIKit

let artworkQueue = DispatchQueue(
    label: "MusicKit-ArtworkQueue",
    qos: .default,
    attributes: .concurrent
)
// Limit the number of simultaneous artwork loads (see note below).
let artworkSemaphore = DispatchSemaphore(value: 5)

extension Song {
    func artworkImage(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        artworkQueue.async {
            artworkSemaphore.wait()
            defer {
                artworkSemaphore.signal()
            }
            let imageURL = artwork?.url(
                width: Int(size.width),
                height: Int(size.height)
            )
            // I hate doing this as it might very well break in the future.
            guard let imageURL, imageURL.scheme == "musicKit" else {
                return completion(nil)
            }
            guard let imageData = try? Data(contentsOf: imageURL),
                  let image = UIImage(data: imageData) else {
                return completion(nil)
            }
            completion(image)
        }
    }
}
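For context, a typical call site looks like this (a minimal sketch; cell is a placeholder for whatever view is being configured, and the hop to the main queue is needed because the completion fires on the artwork queue):

song.artworkImage(for: CGSize(width: 120, height: 120)) { image in
    DispatchQueue.main.async {
        cell.imageView?.image = image
    }
}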
I really dislike this approach because it feels hacky, but it somewhat works. You might ask what the semaphore is for: without it, I noticed MusicKit choking after reading too many artworks at once.
Can someone from Apple please provide us with an example of how to use MusicKit with UIKit properly?
Ideally (IMO) we would have a method defined on Song and the other MusicKit structures that returns the image for us, just like MPMediaItem's artwork property. It would make our lives so much easier.
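Something like this, say (hypothetical usage of an API that does not exist today):

// Hypothetical: what a built-in accessor could look like.
let image = try await song.artworkImage(fitting: CGSize(width: 120, height: 120))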
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error message saying:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter whether or not the songs are downloaded to the device.
I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, and that makes that initializer unusable for me.
I tried everything I could think of, including falling back to MPMusicPlayerController and passing it an array of MPMusicPlayerPlayParameters, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks.compactMap { track in
    // Track is an enum, so unwrap its .song case.
    guard case .song(let song) = track else { return nil }
    return QueueEntry(song)
}

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // "Prepare to play failed"
    } catch {
        print(error)
    }
}
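The only workaround I've found so far is to seed the queue with a smaller batch and append the rest once playback has started. This is just a sketch of the idea, assuming Queue.insert(_:position:) with .tail accepts a sequence of songs and that a batch of 100 stays under whatever limit the remote call has; songs stands for the same [Song] values unwrapped from tracks above:

let batchSize = 100
Task(priority: .high) { [player] in
    do {
        player.queue = .init(for: songs.prefix(batchSize))
        try await player.play()
        // Append the remaining songs in chunks so no single remote call
        // has to prepare hundreds of items at once.
        let remainder = songs.dropFirst(batchSize)
        var start = remainder.startIndex
        while start < remainder.endIndex {
            let end = min(start + batchSize, remainder.endIndex)
            try await player.queue.insert(remainder[start..<end], position: .tail)
            start = end
        }
    } catch {
        print(error)
    }
}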
Hi,
I just generated an HDR10 MV-HEVC file; the mediainfo output is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
Then I uploaded the segment files and prog_index.m3u8 to a web server.
I found that I cannot play the HLS stream in Safari.
The URL is http://ip/vod/prog_index.m3u8.
I also found that if I remove the "Transfer characteristics: PQ" tag when generating the MV-HEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
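One thing I have not ruled out: Safari may need the stream declared through a multivariant playlist whose EXT-X-STREAM-INF carries VIDEO-RANGE=PQ, rather than being pointed at prog_index.m3u8 directly. Something like the following, where the BANDWIDTH and CODECS values are placeholders to be replaced with the real ones:

#EXTM3U
#EXT-X-VERSION:7
#EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.b0",RESOLUTION=1920x1080,VIDEO-RANGE=PQ
prog_index.m3u8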
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now playing item would be set to nil and you would receive a notification that the player stopped.
In iOS 12 and later, nowPlayingItem still contains the current song and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference.
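For now, the best I've managed is a heuristic along these lines. It's only a sketch, and the currentPlaybackTime check is my own assumption; it still can't distinguish a remote pause that happens at the very start of a track:

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    // Heuristic: paused with playback time at 0 *may* mean the queue
    // finished, but a remote pause can produce the same state.
    if player.playbackState == .paused, player.currentPlaybackTime == 0 {
        print("Possibly reached end of playback")
    }
}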
It would be nice if they added a notification that playback was done (similar to the other players).
Any suggestions?
I work for a tracking company and we have our own tracking platform. We are looking for a solution that uses tag devices for animals, like AirTags, but running on our platform.
Is there a way to allow my platform to interface with the Find My network to get the location data of my tags?
So I have been using the iOS 18 beta, and I would like to say first and foremost: really great update. I love the timed messages and the car sickness stuff, that's awesome. But my only problem with the update is Screen Time. I have Screen Time on my phone, set up so that I can ignore the limit once I run out. But since the new update I can press the button all I want and it won't give me more screen time. I imagine this is a minor fix, and I would really appreciate it being fixed soon. Other than that small detail: amazing update, love what you're doing over there at Apple!
I have a macOS app that uses screen capture logic. It was originally coded using the Quartz CG API:
if let cgimage = CGDisplayCreateImage(CGMainDisplayID(), rect: cgRect) {...}
and this worked as expected, even when capturing a screen rect that began on the secondary monitor and ended on the primary monitor (or was entirely contained on the secondary monitor).
However, that API is now deprecated and you're supposed to use ScreenCaptureKit instead, so I have attempted to convert the code. The trial code is:
let scConfig = SCStreamConfiguration()
scConfig.sourceRect = drect
scConfig.width = Int(drect.width)
scConfig.height = Int(drect.height)
SCScreenshotManager.captureImage(contentFilter: sFilter, configuration: scConfig) { image, error in
    if let cgim = image {
        print("image dims \(cgim.width), \(cgim.height), requested: \(drect)")
        self.writeToFile2(cgim)
    } else {
        print("SCREEN CAP failed")
    }
}
...
where sFilter was previously set based on the main screen display (with no exclusions). This code also "works" as long as the capture rect is entirely on the primary monitor, but it fails if the rect spans both monitors or is fully contained on the secondary monitor. (By fails, I mean it produces an empty image.)
So my question is: How to use ScreenCaptureKit to obtain screen shot of rectangle that spans dual monitors?
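For what it's worth, the only approach I've come up with is to capture each display separately and stitch the pieces together, since an SCContentFilter targets a single display. A rough sketch, assuming a uniform 2x backing scale and that display.frame and the capture rect share the same global coordinate space (the coordinate conversion is the part I'm least sure about):

import ScreenCaptureKit
import CoreGraphics

func captureSpanningRect(_ rect: CGRect) async throws -> CGImage? {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    let scale = 2 // assumption: same backing scale on every display
    guard let context = CGContext(
        data: nil,
        width: Int(rect.width) * scale, height: Int(rect.height) * scale,
        bitsPerComponent: 8, bytesPerRow: 0,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }

    for display in content.displays {
        let overlap = rect.intersection(display.frame)
        guard !overlap.isNull, !overlap.isEmpty else { continue }
        let filter = SCContentFilter(display: display, excludingWindows: [])
        let config = SCStreamConfiguration()
        // sourceRect is expressed in the display's own coordinate space.
        config.sourceRect = CGRect(
            x: overlap.origin.x - display.frame.origin.x,
            y: overlap.origin.y - display.frame.origin.y,
            width: overlap.width, height: overlap.height)
        config.width = Int(overlap.width) * scale
        config.height = Int(overlap.height) * scale
        let piece = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: config)
        // Draw the piece at its offset within the requested rect
        // (CGContext is bottom-left origin, so flip the y offset).
        let origin = CGPoint(
            x: (overlap.origin.x - rect.origin.x) * CGFloat(scale),
            y: (rect.maxY - overlap.maxY) * CGFloat(scale))
        context.draw(piece, in: CGRect(origin: origin, size: CGSize(
            width: overlap.width * CGFloat(scale),
            height: overlap.height * CGFloat(scale))))
    }
    return context.makeImage()
}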
Hi,
I’ve encountered a bug related to including tracks as a relationship in the playlists list. The issue arises when there is more than one playlist.
Specifically:
Single Playlist: The functionality works as expected.
Multiple Playlists: The application crashes.
Please let me know if you need additional information or if there are any updates on this issue.
Thank you!
curl --request GET \
--url 'https://api.music.apple.com/v1/me/library/playlists?include=tracks' \
--header 'Authorization: Bearer {token}' \
--header 'Music-User-Token: {token}'
I was wondering if anyone could assist with the following query.
Apple's Private Relay functionality requires companies to register all email-sending subdomains for the service to function properly. With 26 markets and 3 subdomains per market for one department, plus another department with around 20 markets and even more subdomains, we exceed the limit of 100 sending domains.
As a result, we're unable to register all the domains currently being used to send emails to our customers.
Does anyone have recommendations to overcome this?
It's simple to reproduce. The bug is simply that when you queue a bunch of songs to play, it will always queue fewer than you gave it.
Here, I'm attempting to play an Apple-curated playlist; it will only queue a subset, usually fewer than 15, and as few as 1 out of 100. Use the system's forward and backward controls to test it out.
Here is the code; just paste it into the ContentView file and make sure you have the capability to run it.
import SwiftUI
import MusicKit

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play Music") {
                Task {
                    await playMusic()
                }
            }
        }
    }
}

func getOnlySongsFromTracks(tracks: MusicItemCollection<Track>?) async throws -> MusicItemCollection<Song>? {
    var songs: [Song]?
    if let t = tracks {
        songs = [Song]()
        for track in t {
            if case let .song(song) = track {
                songs?.append(song)
                print("track is song \(track.debugDescription)")
            } else {
                print("track not song \(track.debugDescription)")
            }
        }
    }
    if let songs = songs {
        return MusicItemCollection(songs)
    }
    return nil
}

func playMusic() async {
    // Request authorization
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization denied.")
        return
    }
    do {
        // Perform a hardcoded search for a playlist
        let searchTerm = "2000"
        let request = MusicCatalogSearchRequest(term: searchTerm, types: [Playlist.self])
        let response = try await request.response()
        guard let playlist = response.playlists.first else {
            print("No playlists found for the search term '\(searchTerm)'.")
            return
        }
        // Fetch the songs in the playlist
        let detailedPlaylist = try await playlist.with([.tracks])
        guard let songCollection = try await getOnlySongsFromTracks(tracks: detailedPlaylist.tracks) else {
            print("no songs found")
            return
        }
        guard let t = detailedPlaylist.tracks else {
            print("no tracks")
            return
        }
        // Create a queue and play
        let musicPlayer = ApplicationMusicPlayer.shared
        musicPlayer.queue = ApplicationMusicPlayer.Queue(for: t)
        try await musicPlayer.play()
        print("Now playing playlist: \(playlist.name)")
    } catch {
        print("An error occurred: \(error.localizedDescription)")
    }
}
I have Sync 3.4 on my Ford F-150, and CarPlay does not work with iOS 18.1. Sync says I don't have anything connected. Something is up.
I'm experiencing an issue with QuickLook in iOS 18 that affects .reality files with audio playback. When I open a .reality file that includes audio, the audio track plays twice: once from the moment the file is opened, and again from the start of the animation. This results in duplicate audio playback.
I've tested this issue on multiple devices running iOS 16, 17, and 18, and the problem only occurs on iOS 18. I've tried restarting the devices and checking for any software updates, but the issue persists.
Steps to reproduce:
Open a .reality file with audio playback in QuickLook on an iOS 18 device.
Observe the audio playback.
Expected result:
The audio track should play only once, from the start of the animation.
Actual result:
The audio track plays twice, once from the moment the file is opened and again from the start of the animation.
Device and iOS version:
I've tested this issue on an iPhone 12 Pro and an iPhone 13 Pro running iOS 18, an iPhone 13 running iOS 16, and an iPhone 11 Pro running iOS 17.
Hello,
I noticed that SFSpeechRecognizer is broken on iOS 18. During a recognition task, it keeps dropping the recognized text on every pause. For example, if you say "how are you fine", it will drop the "how are you" part and only give you "fine" as the result.
Say "how are you <pause> fine"
// iOS 17 ✅ (perfect final result)
How
How are
How are you
How are you.
How are you. Fine.
// iOS 18 ❌
How
How are
How are you
How are you
Fine
(The text before the pause is dropped, and punctuation is no longer recognized.)
Reproducing the issue:
Download the official sample project.
Run it on an iOS 18 device or simulator.
Say "how are you fine"
Only "fine" will be displayed.
How can I obtain the invoice order ID for a user's purchase? For example, "MT2345678".
I'm attempting to use AVExternalStorageDevice.requestAccess on iOS 18 using Xcode 16.
When calling requestAccess, a dialog does appear, but the completionHandler closure is never called to indicate whether access was granted. If using the async version, the function just never returns.
Calling requestAccess also results in a mediaServicesWereReset (-11819) error without fail.
Supposedly, "the system only presents the dialog to a person the first time your app calls the method." That also doesn't appear to be the case. The dialog appears every time requestAccess is called, regardless of previous invocations and whether "Allow" or "Don't Allow" was selected.
The dialog itself says "You can change this in Privacy settings." I cannot find this permission anywhere in the Settings app, neither under Privacy & Security nor under the app-specific settings page.
Has anyone else experienced these issues? Am I missing something here? I did suspect permissions issues and tried adding a NSRemovableVolumesUsageDescription entry to the app. This did not appear to change anything.
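For reference, the async form I'm calling is simply this; in my testing on iOS 18, the print on the last line is never reached:

import AVFoundation

Task {
    let granted = await AVExternalStorageDevice.requestAccess()
    print("External storage access granted: \(granted)") // never reached
}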
I am facing an issue with the voicemail feature. When someone calls me, the call goes to voicemail only if I tap the voicemail icon. If I don't respond to the incoming call at all, the caller never gets the option to record a voicemail. I have tried switching voicemail off and on. It works on other devices (iPhone 14 and 15, for my family members), just not for me. How can I resolve this?
Hello,
I am trying to get video from an HDMI USB capture card and show it in a preview layer at 60 fps. The device I am using (ShadowCast 2) supports 1080p at 60 fps in "yuvs" and "420v".
This is my code to build the preview layer, with uninteresting stuff stripped away and error handling removed.
I am using an AVFrameRateRange because the capture device does not directly support 60.00 fps, but rather <AVFrameRateRange: 0x600000875680 60.00 - 60.00 (1000000 / 60000240 - 1000000 / 60000240)>.
@Observable
final class AVFoundationService: AVService {
    // Live View
    private let session: AVCaptureSession = .init()

    var previewLayer: AVCaptureVideoPreviewLayer {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspect
        return layer
    }

    var activeVideoDevice: AVCaptureDevice? {
        // TODO: implement correct logic
        // (videoDevices is populated elsewhere; stripped for brevity)
        if let device = videoDevices.first(where: { $0.localizedName.contains("Shadow") }) {
            return device
        }
        return AVCaptureDevice.default(for: .video)
    }

    func setupStreamDemo(completion: @escaping (Error?) -> Void) {
        session.beginConfiguration()
        if let device = activeVideoDevice {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("explode")
                }
                for format in device.formats {
                    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
                    if dimensions.width == 1920 && dimensions.height == 1080
                        && format.formatDescription.mediaSubType.description == "'yuvs'" {
                        let foundFPS = format.videoSupportedFrameRateRanges.first {
                            // Fixed: the original compared minFrameRate twice.
                            Int($0.minFrameRate) == 60 && Int($0.maxFrameRate) == 60
                        }
                        try device.lockForConfiguration()
                        device.activeFormat = format
                        device.activeVideoMinFrameDuration = foundFPS!.minFrameDuration
                        device.activeVideoMaxFrameDuration = foundFPS!.minFrameDuration
                        device.unlockForConfiguration()
                    }
                }
            } catch {
                return completion(error)
            }
        }
        session.commitConfiguration()
        session.startRunning()
        completion(nil)
    }
}
I am using the following code in SwiftUI to show the AVCaptureVideoPreviewLayer.
struct VideoPreviewView: NSViewRepresentable {
    private let previewLayer: AVCaptureVideoPreviewLayer

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        view.layer = self.previewLayer
        view.layer?.frame = view.bounds
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {
        if let layer = nsView.layer as? AVCaptureVideoPreviewLayer {
            layer.session = self.previewLayer.session
        }
    }
}
When I now run my app, it ignores whatever I set on device.activeVideoMinFrameDuration and/or device.activeVideoMaxFrameDuration. If I set it to 10 fps, it runs at 30; if I set 60, it runs at 30.
If, in parallel to my app, I start QuickTime and begin a recording from my USB capture card, it switches to 60 fps mode.
I am on macOS Sequoia 15.0 with Xcode 16.0.
What am I doing wrong?
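One thing I still plan to try, on the theory that commitConfiguration() or startRunning() reverts the device format: re-asserting the frame durations after the session is already running. To be clear, this is a guess, not a confirmed fix:

// After session.commitConfiguration() and session.startRunning():
if let device = activeVideoDevice,
   let range = device.activeFormat.videoSupportedFrameRateRanges.first(where: { $0.maxFrameRate >= 60 }) {
    try? device.lockForConfiguration()
    // Re-apply the 60 fps durations in case the commit reset them.
    device.activeVideoMinFrameDuration = range.minFrameDuration
    device.activeVideoMaxFrameDuration = range.minFrameDuration
    device.unlockForConfiguration()
}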
I've been using CGWindowListCreateImage, which automatically creates an image with the size of the captured window.
But SCScreenshotManager.captureImage(contentFilter:configuration:) always creates images with the width and height specified in the provided SCStreamConfiguration. I could set the size explicitly by reading SCWindow.frame or SCContentFilter.contentRect and multiplying the width and height by SCContentFilter.pointPixelScale, but that won't work if I want to keep the window shadow with SCStreamConfiguration.ignoreShadowsSingleWindow = false.
Is there a way, and what's the best way, to take full-resolution screenshots at the correct size?
import Cocoa
import ScreenCaptureKit

class ViewController: NSViewController {
    @IBOutlet weak var imageView: NSImageView!

    override func viewDidAppear() {
        imageView.imageScaling = .scaleProportionallyUpOrDown
        view.wantsLayer = true
        view.layer!.backgroundColor = .init(red: 1, green: 0, blue: 0, alpha: 1)

        Task {
            let windows = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true).windows
            let window = windows[0]
            let filter = SCContentFilter(desktopIndependentWindow: window)

            let configuration = SCStreamConfiguration()
            configuration.ignoreShadowsSingleWindow = false
            configuration.showsCursor = false
            configuration.width = Int(Float(filter.contentRect.width) * filter.pointPixelScale)
            configuration.height = Int(Float(filter.contentRect.height) * filter.pointPixelScale)
            print(filter.contentRect)

            let windowImage = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: configuration)
            imageView.image = NSImage(cgImage: windowImage, size: CGSize(width: windowImage.width, height: windowImage.height))
        }
    }
}
Since updating to iOS 18.1, WhatsApp no longer allows me to reply from my car and will only read one message at a time.
I have an app that gets data from Music.app with both the iTunesLibrary and MusicKit.
iTunesLibrary has ITLibArtist.sortName and ITLibAlbum.sortTitle and ITLibAlbum.sortAlbumArtist.
I can’t seem to find an equivalent in MusicKit. How are those properties obtained using MusicKit? Thanks.
FYI, I have filed FB15554956 on this. You can also see my code at https://github.com/bolsinga/itunes_json