Hi, is there a way to display the animated cover art in an app? Not every album has animated cover art, but for the ones that do, I would like to display it instead of the static artwork.
Thank you :)
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error message saying:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter whether the songs are downloaded to the device or not.
I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, which makes that initializer unusable for me.
I tried everything I could think of, including falling back to MPMusicPlayerController and passing an array of MPMusicPlayerPlayParameters to it, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks.compactMap {
    guard let song = $0 as? Song else { return nil }
    return QueueEntry(song)
}

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
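One workaround worth trying, offered only as a sketch: seed the queue with a small batch and insert the remaining songs after playback starts, so the player never has to prepare hundreds of items at once. This assumes MusicPlayer.Queue's insert(_:position:) handles a large batch better than the queue initializer does, which I have not verified:

// Hypothetical batching workaround: start small, then append the rest.
let songs: [Song] = tracks.compactMap { $0 as? Song }
let firstBatch = Array(songs.prefix(100))
let remainder = Array(songs.dropFirst(100))

Task { [player] in
    do {
        // Prepare and start playback with a small initial queue.
        player.queue = ApplicationMusicPlayer.Queue(for: firstBatch, startingAt: nil)
        try await player.play()
        // Append the remaining songs to the end of the live queue.
        try await player.queue.insert(remainder, position: .tail)
    } catch {
        print(error)
    }
}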
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player had stopped.
In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference.
It would be nice if a notification for playback completion were added (similar to the other players).
Any suggestions?
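One heuristic, offered as a sketch rather than a reliable solution: when the state changes to paused, compare the current playback position against the item's duration; a pause within a second of the end is probably completion rather than a user pause. This assumes the player still reports a meaningful currentPlaybackTime at that moment:

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return }

    // If we paused within a second of the track's end, treat it as
    // playback having finished rather than a user-initiated pause.
    let remaining = item.playbackDuration - player.currentPlaybackTime
    if remaining < 1.0 {
        print("Playback most likely completed")
    } else {
        print("User (or remote) paused at \(player.currentPlaybackTime)s")
    }
}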
I am building an app for macOS and I am trying to implement code to add songs to a library playlist (included below). The issue I am having is that if I use MusicKit to load a user's library playlists, the ID for the playlist (which is just a string of numbers) does not work with the Add Tracks to a Library Playlist endpoint of the Apple Music API. If I retrieve the playlists from the Apple Music API and use that playlist ID (which is different from the ID I get from MusicKit), my code works fine and adds the song to the playlist. The problem is that getting a user's library playlists from the Apple Music API does not give me all of the library playlists that I get when using MusicKit, and it also does not give me artwork for playlists that have the collage of album covers, so I would prefer to use MusicKit to get the playlists.
I have also tried retrieving a single playlist via the Apple Music API using the playlist ID from MusicKit, and it does not work; I get an error that the resource cannot be found. Since this is a macOS app, I cannot use MusicKit to add songs to library playlists.
Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and have some way to use the playlist ID to add songs to that playlist. Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if I originally get the library playlist's ID from a playlist retrieved through the Apple Music API.
Also, does anyone know why the playlist IDs are not universal and differ between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter whether I use MusicKit or the Apple Music API; the IDs are in the correct format for the Apple Music API and work with my code. Thanks everyone for any and all help!
func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    let playlistID = playlist.id

    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }

    // Authorization Header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }

    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data

        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()

        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")

            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
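On the missing playlists specifically: the Apple Music API's library playlists endpoint paginates (25 items per page by default, 100 maximum), so a single request will not return a large library. If the discrepancy is just pagination, walking the offset until a page comes back short should surface everything. A rough sketch, assuming a MusicDataRequest-based fetch and decoding only the IDs:

import MusicKit

// Sketch: page through /v1/me/library/playlists with `limit`/`offset`
// until a page comes back smaller than the page size.
func fetchAllLibraryPlaylistIDs() async throws -> [String] {
    var ids: [String] = []
    var offset = 0
    let pageSize = 100 // the endpoint's maximum `limit`

    while true {
        let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists?limit=\(pageSize)&offset=\(offset)")!
        let response = try await MusicDataRequest(urlRequest: URLRequest(url: url)).response()

        // Minimal decoding: only the `id` of each resource is needed here.
        struct Page: Decodable {
            struct Resource: Decodable { let id: String }
            let data: [Resource]
        }
        let page = try JSONDecoder().decode(Page.self, from: response.data)
        ids.append(contentsOf: page.data.map(\.id))

        if page.data.count < pageSize { break }
        offset += pageSize
    }
    return ids
}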
Hi,
On macOS Sonoma and earlier, I used my script together with a LaunchDaemon to continuously record my screen.
However, after upgrading to macOS Sequoia, the LaunchDaemon no longer works for screen recording; it only works when I use a LaunchAgent.
For my workflow, running the ffmpeg process as root from a LaunchDaemon is much more efficient than running it as a regular user. Does anyone know how to run a script in the background using a LaunchDaemon on macOS Sequoia?
Here are my LaunchDaemon plist and script:
LaunchDaemon plist:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>local.ScreenRecord</string>
    <key>Disabled</key>
    <false/>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
    <key>Nice</key>
    <integer>-20</integer>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>/usr/local/distrib/record.bash</string>
    </array>
</dict>
</plist>
Script:
## !!! START CONFIGURATION !!!
#
FFMPEG_LOC="/usr/local/distrib/ffmpeg"
FFMPEG_INPUT="-hide_banner -f avfoundation -capture_cursor 1 -pixel_format uyvy422"
FFMPEG_VSET="-codec libx264 -r 22 -crf 26 -preset veryfast -b:v 5M -maxrate 7M -bufsize 14M"
TIME_REC="-t 600"
FOLDER_REC="/usr/local/distrib/REC/"
#
## !!! END CONFIGURATION !!!
#
# Find the avfoundation device index of each capture screen
MON_ID=$($FFMPEG_LOC -hide_banner -f avfoundation -list_devices true -i "" 2>&1 | awk -F'[]|[]' '/Capture\ screen/ {print $4}')
#
if [ -n "$MON_ID" ]; then
    # Timestamp for the file name
    DATE_REC=$(date +"%m-%d-%Y_%H-%M-%S")
    # Index of the output video file
    OUTPUT_NUM=0
    # Full command to record
    FFMPEG_FULL_COMMAND=""
    for MON_NUM in $MON_ID; do
        FFMPEG_FULL_COMMAND=""$FFMPEG_FULL_COMMAND" "$FFMPEG_INPUT" -i "$MON_NUM" "$FFMPEG_VSET" "$TIME_REC" -map "$OUTPUT_NUM" "$FOLDER_REC""$DATE_REC"_Mon"$MON_NUM".mkv"
        let OUTPUT_NUM=$OUTPUT_NUM+1
    done
    $FFMPEG_LOC $FFMPEG_FULL_COMMAND
else
    echo "No monitors"
    sleep 5
fi
Hello everyone,
I am currently working on an app project aimed at users who want to quickly and easily capture their ideas and notes while on the go. The basic concept is to develop an iOS app where users can store both typed notes and voice recordings – essentially a "brain dump" solution. The core functionality (storing, editing, synchronizing via CloudKit, etc.) will be handled within the iOS app.
In addition, I plan to integrate a CarPlay extension that allows the driver to start and stop a recording – ideally through a minimalist interface featuring a large record button and a "Done" button. Since the iPhone is often not within immediate reach in the car, the CarPlay integration should serve as a quick trigger to initiate the recording in the iOS app.
My questions are as follows:
Has anyone had experience implementing a CarPlay extension for an app that primarily handles notes and voice recordings, rather than falling into the traditional categories like navigation, audio, or communication?
Has such a concept ever been approved by Apple, or are there known hurdles and guidelines that must be observed?
Are there alternative approaches to implementing CarPlay integration in this context in a compliant and effective manner?
I would greatly appreciate any feedback, shared experiences, and tips on best practices.
Thank you in advance and best regards!
I noticed that Instagram and iMessage support receiving shared lyrics from Apple Music. Specifically, when users long-press lyrics, a sheet pops up showing iMessage and Instagram. Clicking on either app generates a beautifully formatted lyrics image.
I've looked through the MusicKit documentation but couldn't find any related APIs.
How can I implement this functionality in my app?
I am developing an iOS application that supports screen mirroring to Google TV (or Chromecast with Google TV). My goal is to mirror the iPhone/iPad screen in real time to a Google TV device.
What I Have Tried So Far
I have explored multiple approaches but haven't found a direct way to achieve low-latency screen mirroring. Here are some of my findings:
Google Cast SDK:
Google Cast SDK is primarily designed for casting media (videos, images, audio) rather than real-time mirroring. It supports custom receiver applications, but there are no direct APIs for full screen mirroring. Casting a recorded video is possible, but it introduces latency and is not real-time.
ReplayKit for Screen Capture:
RPScreenRecorder.shared().startCapture(handler: ...) allows capturing the iPhone screen as a video stream. However, sending this stream to Google TV in real time is a challenge. I could potentially encode the video as HLS and stream it, but the delay is significant. (A minimal capture sketch follows after this list.)
RTSP/UDP Streaming:
Some third-party libraries support RTSP/UDP streaming for real-time screen sharing. Google TV does not natively support RTSP, making this approach difficult.
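For reference, here is a minimal ReplayKit capture sketch showing where the video frames become available; everything downstream of the handler (encoding and transport to the receiver) is the open question and is only hinted at in the comments:

import ReplayKit

let recorder = RPScreenRecorder.shared()

recorder.startCapture(handler: { sampleBuffer, bufferType, error in
    guard error == nil else { return }
    switch bufferType {
    case .video:
        // Each CMSampleBuffer here is one screen frame. For low latency it
        // would need hardware encoding (e.g. VideoToolbox H.264/HEVC) and
        // a transport the receiver actually supports.
        break
    case .audioApp, .audioMic:
        break
    @unknown default:
        break
    }
}, completionHandler: { error in
    if let error { print("Capture failed to start: \(error)") }
})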
My Questions:
Is it possible to achieve real-time screen mirroring on Google TV using the Google Cast SDK?
Does Google TV support WebRTC or any low-latency streaming protocol usable from iOS?
Are there alternative approaches to mirroring an iOS screen to Google TV with minimal latency?
I would appreciate any guidance, code examples, or references to relevant documentation.
I have created an app where you can speak using SFSpeechRecognizer; it recognizes your speech as text, translates it, and then speaks the result back using speech synthesis. All locales for SFSpeechRecognizer, and switching between them, work fine while the app is in the foreground. But after I turn off the screen (the app is still running, I just turned the screen off) and try to create a new recognitionTask, the recognition task receives this error: "User denied access to speech recognition". The weird thing is that it only happens with some languages: the error occurs with the Croatian or Hungarian locale but not with the English or Spanish locale.
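One pattern worth checking, offered only as a guess: the failing locales may be exactly the ones without on-device recognition support, leaving them dependent on the server-based recognizer, which may behave differently once the screen locks. A quick diagnostic to compare your locales:

import Speech

// Log which locales can run recognition on-device; the locale list here
// just mirrors the ones mentioned above.
for identifier in ["en-US", "es-ES", "hr-HR", "hu-HU"] {
    if let recognizer = SFSpeechRecognizer(locale: Locale(identifier: identifier)) {
        print("\(identifier): onDevice=\(recognizer.supportsOnDeviceRecognition)")
    } else {
        print("\(identifier): recognizer unavailable")
    }
}

If the on-device flag differs between the working and failing locales, experimenting with requiresOnDeviceRecognition (or keeping the device awake during recognition) would be the next thing to test.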
On an iOS 18 phone, I use AVCaptureSession to capture HDR in the x420 format. The output CMSampleBuffer is in the HLG color space, and the propagated attachments contain kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey. I now use a CAMetalLayer to render the CVPixelBuffer to the screen, but the result is brighter than the same buffer displayed by AVSampleBufferDisplayLayer.
Here is my code.
- (void)_updateColorSpaceIfNeed:(CVPixelBufferRef)pixelBuffer {
    CAMetalLayer *layer = (CAMetalLayer *)_mtkView.layer;
    if (![layer isKindOfClass:CAMetalLayer.class]) return;

    layer.wantsExtendedDynamicRangeContent = YES;

    // Transfer ownership of the copied attachment to ARC so the data stays
    // alive while it is used (a plain __bridge followed by CFRelease would
    // leave `data` dangling).
    NSData *data = (__bridge_transfer NSData *)CVBufferCopyAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, NULL);

    CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadataWithAmbientViewingEnvironment:data];
    // CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadata];
    layer.EDRMetadata = metadata;
    layer.pixelFormat = MTLPixelFormatRGBA16Float;

    CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_2100_HLG);
    layer.colorspace = colorspace;
    if (colorspace) CGColorSpaceRelease(colorspace);
}
Why does the CAEDRMetadata class have "HLGMetadataWithAmbientViewingEnvironment:" and "HLGMetadata" methods, but does not provide the "HLGMetadataWithAmbientViewingEnvironment:sceneIllumination" method?
I want to know how kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey affect tone mapping. Is there any documentation I can refer to?
My iOS app can access the iPhone media library, but when running on an M4 Mac it fails with Error Domain=NSCocoaErrorDomain Code=257 (a file read permission error).
Hello Apple,
I've been using iOS for many years and never had any issues with the Urdu keyboard, but since the 18.4 beta update some words no longer compose correctly. For example, my friend's name, "راعنیہ", can no longer be typed as one word; the keyboard keeps separating it into "راعنی ہ". It's frustrating, and it's not just one name; many other words don't compose properly either. On top of that, the new font and the reduced spacing strain my eyes while reading and typing. I hope Apple fixes this ASAP.
Thank you
Hello Apple Developers,
I’m reaching out to the community with a concept that I truly believe could be a natural fit for the Apple ecosystem:
A privacy-focused, iOS-exclusive dating app designed to enhance connections between Apple users while staying true to Apple’s commitment to security and user privacy.
The idea is to create an iOS-only dating platform that fosters relationships between users who are part of the Apple ecosystem. The app would integrate seamlessly with Apple’s services (iMessage, FaceTime, Siri, etc.) and provide a premium user experience, where privacy is a priority.
Apple users already prefer to communicate using Apple services (iMessage, FaceTime). A dating app designed specifically for iOS users would deepen this ecosystem lock-in, making it easier for Apple customers to connect within a trusted space.
Apple is already known for its privacy focus, and an iOS-exclusive dating app would build upon that reputation. It would ensure secure, private interactions, minimizing the risks associated with data sharing in most dating apps today.
The app could integrate directly with features like iCloud, Apple Pay (for date-night bookings), and Siri (for matchmaking suggestions), offering users a truly native iOS experience.
While the app would remain free to use, here are a few potential monetization methods:
Bundling with Apple One/iCloud+ for premium matchmaking features.
Apple Pay-based date-night deals with local partners.
I’d love to hear your thoughts on whether Apple might be open to this idea. Would there be any challenges from a technical or business perspective in creating a dating app exclusively for iOS users?
I’m looking forward to hearing from you all, and thank you for your time and insights.
Yours Truly,
CapNKirk
P.S. This is an idea. But I do not care who uses, implements, or executes this idea. I just want to see Apple take advantage of it.
When using the MusicKit SDK for Android 1.1.2, I found that MediaContainerType only defines three types:
NONE = 0;
ALBUM = 1;
PLAYLIST = 2;
The RADIO_STATION type is not defined.
However, the documentation for com.apple.android.music.playback.model states that the RADIO_STATION type is supported.
This causes an error after I pass in a station ID:
MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException
May I ask how to solve this problem?
I use startCaptureWithHandler to record the screen and AVAssetWriter's appendSampleBuffer: to save the audio and video, but when the saved file is played back the audio and video are out of sync.
I don't know if it's an AVAssetWriterInput setup problem; here is my code:
NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
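The input settings shown above rarely cause drift on their own; a more common culprit with ReplayKit capture is the writer's session start time. If startSessionAtSourceTime: is called with a time unrelated to the presentation timestamp of the first appended buffer, every later buffer lands at an offset. A sketch of the usual pattern (in Swift, with a hypothetical flag guarding the one-time session start):

import AVFoundation

var sessionStarted = false

func append(_ sampleBuffer: CMSampleBuffer,
            to writer: AVAssetWriter,
            input: AVAssetWriterInput) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // Start the session exactly at the first buffer's timestamp, so audio
    // and video share the same time base from the first frame onward.
    if !sessionStarted {
        writer.startSession(atSourceTime: pts)
        sessionStarted = true
    }

    if input.isReadyForMoreMediaData {
        input.append(sampleBuffer)
    }
}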
Hello,
How do I find the Apple Music catalog ID for songs in an Apple Music playlist?
I'm building an iOS app that uses MusicKit and the Apple Music API.
For this example, assume my app simply lets users upload their Apple Music playlists; when a user opens a specific playlist, I want to find the catalog ID for each song.
Currently all the songs return song IDs in this format: "i.PkdJvPXI2AJgm8".
I believe these are library IDs, not catalog IDs.
I'd like a front-end solution using MusicKit, but perhaps a back-end solution using the Apple Music REST API is required.
Any recommendations would be appreciated!
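IDs beginning with "i." are indeed library identifiers. One approach, sketched below with MusicDataRequest and offered as an assumption to verify rather than a confirmed recipe: the Apple Music API exposes a catalog relationship on library songs, so you can ask for the catalog equivalent of each library song (the decoding here is deliberately minimal):

import MusicKit

// Sketch: resolve a library song ID (e.g. "i.PkdJvPXI2AJgm8") to its
// catalog counterpart via the library song's `catalog` relationship.
func catalogID(forLibrarySongID libraryID: String) async throws -> String? {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/songs/\(libraryID)/catalog")!
    let response = try await MusicDataRequest(urlRequest: URLRequest(url: url)).response()

    struct Body: Decodable {
        struct Resource: Decodable { let id: String }
        let data: [Resource]
    }
    return try JSONDecoder().decode(Body.self, from: response.data).data.first?.id
}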
Hey, just wondering if anyone knows whether the MusicKit SDK for Android will be updated soon to support the new 16 KB memory page size requirement coming with Android 15.
Google is going to require all apps targeting Android 15+ to support it starting November 2025.
Has anyone heard anything from Apple or the SDK team about this?
Thanks!
I’m spot-checking some of the data I’m extracting from the Apple Music Feed parquet files and finding numerous issues of invalid album IDs.
For example, just looking at any album with a primary artist id of 163043, I see a few albums that are not available at music.apple.com/us/album/NNN. These include:
981094158 - Time and the River
1803443737 - Celebration (Live New York ’80)
1525426873 - Anything You Want: The Warner-Reprise-Elektra Years
I notice that none of these album IDs are returned by the general Music API, so I'm a bit confused why they would exist in the parquet files at all.
Thanks.
I maintain a couple of CoreImage libraries that provide custom Metal kernel backed CIFilters. In iOS/iPadOS 26, the CIColorKernel.apply() method invoked in the CIFilter subclass fails to add the coreimage::destination parameter to the Metal function call:
-[CIColorKernel applyWithExtent:arguments:options:] argument count mismatch for kernel 'FractalNoise3D', expected 13 but saw 12.
I've compiled the code with Xcode 26 and deployed to iOS 18 devices without any breakage, so this is definitely an iOS 26 problem, not an Xcode problem.
Library here: https://github.com/JoshuaSullivan/SimplexNoiseFilter
Feedback ID: FB17874311