Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under General subtopic

Can individual Apple Developer accounts stream full tracks with MusicKit?
I have implemented fetching Apple Music preview songs using a Swift framework integrated into a Unity app. My goal now is to fetch full tracks from a user's Apple Music library and play them inside Unity. To do this, I understand that I need to handle authentication, generate a Developer Token, and then obtain a Music User Token to access the user's Apple Music content.

Currently, I have an Individual Apple Developer account (not an Organization account). Based on my research, it seems that:

- With an Individual account, I can implement this functionality and even upload builds to TestFlight for internal testing.
- However, when releasing the app publicly on the App Store, full-track playback may be restricted for Individual accounts and allowed only for Organization accounts.

👉 Can you confirm whether this understanding is correct?
👉 Specifically, can an Individual account fetch and play full-length tracks from a subscribed Apple Music user's library (at least for internal/TestFlight testing)?
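
For reference, here is a minimal sketch of the native Swift side of the flow I have in mind (assuming MusicKit handles the developer token automatically for a MusicKit-enabled App ID; the function name and the single-song queue are just illustrative):

```swift
import MusicKit

// Minimal sketch: request authorization, fetch one song from the user's
// library, and play the full track. Assumes an active Apple Music
// subscription and a MusicKit-enabled App ID (the developer token is
// handled automatically by MusicKit on device).
func playFirstLibrarySong() async throws {
    // Ask the user for permission to access their Apple Music data.
    guard await MusicAuthorization.request() == .authorized else { return }

    // Fetch a song from the user's library.
    var request = MusicLibraryRequest<Song>()
    request.limit = 1
    let response = try await request.response()
    guard let song = response.items.first else { return }

    // Queue the full track and start playback.
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [song])
    try await player.play()
}
```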
0 replies · 0 boosts · 179 views · Sep ’25

401 Unauthorized when attempting to access Apple Music Feed API
Hello, I am trying to access the Apple Music Feed API, but I am receiving a 401 Unauthorized error whenever I call it.

I have tried using my own code to generate a JWT and call the API directly (the same code can call the standard Apple Music API successfully):

```
> GET /v1/feed/song/latest HTTP/2
> Host: api.media.apple.com
> user-agent: insomnia/2023.5.8
> authorization: Bearer [REDACTED]
> accept: */*

< HTTP/2 401
< content-type: application/json; charset=utf-8
< content-length: 0
< x-apple-jingle-correlation-key: AV5IOHBNM2UUJVOFQ4HZ2TGF6Q
< x-daiquiri-instance: daiquiri:10001:daiquiri-all-shared-ext-7bb7c9b9bb-r459v:7987:25RELEASE91:daiquiri-amp-kubernetes-shared-ext-ak8s-prod-pv4-amp-daiquiri-ingress-prod
```

I also tried the Apple-provided Python example code, which gives me authentication errors as well:

```
$ python3 ./apple_music_feed_example.py --key-id NMBH[...] --team-id 3TNZ[...] --secret-key-file-path "/Users/foxt/Documents/am-feed/NMBH[...].p8" --out-dir .
running....
INFO:__main__:Sending requests to https://api.media.apple.com
INFO:__main__:Getting the latest export for feed artist
Exception: Authentication Failed. Did you provide the correct team id, key id, and p8 file?
```

Does this API need to be enabled on my account separately from the main Apple Music API? The documentation reads to me as if anyone with an Apple Developer Program membership can use this API, and I did not see any information regarding any other requirements.
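
For reference, the equivalent request in Swift looks roughly like this (a minimal sketch; the developer token is assumed to be generated elsewhere and is the same one that works against the standard Apple Music API):

```swift
import Foundation

// Minimal sketch of the Feed API call shown in the trace above.
// `developerToken` is assumed to be a valid ES256-signed developer token.
func fetchLatestSongFeed(developerToken: String) async throws {
    var request = URLRequest(url: URL(string: "https://api.media.apple.com/v1/feed/song/latest")!)
    request.httpMethod = "GET"
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")

    let (data, response) = try await URLSession.shared.data(for: request)
    if let http = response as? HTTPURLResponse {
        // Expecting 200 with export metadata; currently this returns 401.
        print("Status:", http.statusCode)
    }
    print(String(data: data, encoding: .utf8) ?? "")
}
```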
0 replies · 0 boosts · 413 views · Sep ’25

MPNowPlayingInfoCenter playbackState fails to update after losing audio focus on macOS
My Environment:
- Device: Mac (Apple Silicon, arm64)
- OS: macOS 15.6.1

Description:
I'm developing a music app and have encountered an issue where I cannot update the playbackState in MPNowPlayingInfoCenter after my app loses audio focus to another app. Even though my app correctly calls [MPNowPlayingInfoCenter defaultCenter].playbackState = .paused, the system's Now Playing UI (Control Center, Lock Screen, AirPods controls) does not reflect this change. The UI remains stuck until the app that currently holds audio focus also changes its playback state. I've observed the same behavior in other third-party music apps from the App Store, which suggests it might be a system-level issue.

Steps to Reproduce (using the two most popular music apps on the Chinese App Store, NetEase Cloud Music and QQ Music; call them App A and App B):
1. Start playback in App A.
2. Start playback in App B. (App B now has audio focus, and App A is still playing.)
3. Attempt to pause App A via the system's Control Center or its own UI.

Observed Behavior:
App A's audio stream stops, but in the system's Now Playing controls App A still appears to be playing. The progress bar continues to advance, and the pause button becomes unresponsive. If you then pause App B, the Now Playing UI for App A immediately corrects itself and displays the proper "paused" state.

My Questions:
1. Is there a specific procedure required to update MPNowPlayingInfoCenter when an app is not the current "Now Playing" application?
2. Is this a known issue or expected behavior in macOS?
3. Are there any official workarounds or solutions to ensure the UI updates correctly?
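
For reference, this is roughly how the app updates the now playing state when the user pauses (a simplified sketch; the real app also wires up MPRemoteCommandCenter handlers):

```swift
import MediaPlayer

// Simplified sketch of how the now-playing state is updated when playback
// pauses. In the real app this is called from the pause action and from the
// remote command handlers.
func updateNowPlaying(isPlaying: Bool, title: String, duration: TimeInterval, position: TimeInterval) {
    let center = MPNowPlayingInfoCenter.default()

    var info = center.nowPlayingInfo ?? [:]
    info[MPMediaItemPropertyTitle] = title
    info[MPMediaItemPropertyPlaybackDuration] = duration
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = position
    info[MPNowPlayingInfoPropertyPlaybackRate] = isPlaying ? 1.0 : 0.0
    center.nowPlayingInfo = info

    // On macOS the playback state must be set explicitly as well;
    // this is the call that appears to have no effect after losing focus.
    center.playbackState = isPlaying ? .playing : .paused
}
```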
0 replies · 0 boosts · 169 views · Sep ’25

Failed to change the TTS language to CN or TW
I have a question about TTS. My device's default language is zh-HK (Cantonese); the device is an iPhone 16 Pro running iOS 18.6.

I created a function speakMandarin and I want the device to speak zh-CN (Putonghua); however, the device only speaks zh-HK (Cantonese), even though I already set the AVSpeechSynthesisVoice language to zh-CN.

```swift
func speakMandarin(text: String) {
    print("speakMandarin, \(text)")
    lastError = nil // Reset error

    // Stop any ongoing speech before starting new
    if synthesizer.isSpeaking {
        synthesizer.stopSpeaking(at: .immediate)
    }

    // Configure speech utterance
    let utterance = AVSpeechUtterance(ssmlRepresentation: text)!
    utterance.rate = 0.5 // Natural speaking speed
    utterance.pitchMultiplier = 1.0
    utterance.volume = 1.0
    utterance.voice = AVSpeechSynthesisVoice(language: "zh-CN")

    let preferredLanguages = ["zh-CN", "zh-TW"]
    var selectedVoice: AVSpeechSynthesisVoice?
    for lang in preferredLanguages {
        if let voice = AVSpeechSynthesisVoice(language: lang) {
            print(lang)
            selectedVoice = voice
            utterance.voice = voice
            break
        }
    }

    // If no Mandarin voice found, use system default
    if selectedVoice == nil {
        selectedVoice = AVSpeechSynthesisVoice(language: nil)
        lastError = "未偵測到普通話語音包,將使用系統預設語音" // "No Mandarin voice pack detected; the system default voice will be used"
        print(lastError)
    }

    utterance.voice = selectedVoice
    print(utterance)
    synthesizer.speak(utterance)
}
```

Here is my log:

```
speakMandarin, <speak>你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?</speak>
zh-CN
[AVSpeechUtterance 0x1194efb80] String: 你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?
Voice: [AVSpeechSynthesisVoice 0x104ceff90] Language: zh-CN, Name: Tingting, Quality: Default [com.apple.voice.compact.zh-CN.Tingting]
Rate: 0.50
Volume: 1.00
Pitch Multiplier: 1.00
Delays: Pre: 0.00(s) Post: 0.00(s)
```
0 replies · 0 boosts · 310 views · Sep ’25

Clarification on SFSpeechRecognizer system alert message and service URLs for whitelisting
Hello Apple Engineers,

I am developing a feature that uses SFSpeechRecognizer (import Speech), and I have two questions:

1. After adding the NSSpeechRecognitionUsageDescription key to my Info.plist, when I initialize an SFSpeechRecognizer instance the system shows an authorization alert containing a predefined message from Apple: "Speech data from this app will be sent to Apple to process your requests. This will also help Apple improve its speech recognition technology." Is it possible to remove or customize this message?

2. My app runs in a network environment with a whitelist. I need to know which URL the SFSpeechRecognizer instance connects to, and which port it uses, so that I can add them to the whitelist.

Thank you very much for your support!
Best regards,
Yu Cheng
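
For context, the alert appears around the speech recognition authorization request, roughly like this (a minimal sketch; error handling and the recognition request itself are omitted):

```swift
import Speech

// Minimal sketch of the authorization flow. NSSpeechRecognitionUsageDescription
// must be present in Info.plist; the alert text quoted above is appended by the
// system and is not controlled by the app.
func requestSpeechAuthorization() {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized else {
            print("Speech recognition not authorized:", status.rawValue)
            return
        }
        // Safe to create and use a recognizer once authorized.
        let recognizer = SFSpeechRecognizer()
        print("Recognizer available:", recognizer?.isAvailable ?? false)
    }
}
```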
0 replies · 0 boosts · 136 views · Sep ’25

[AVFCore] iOS 26.0 EXC_BAD_ACCESS from _customCompositorShouldCancelPendingFrames
Hi, I'm working on video editing software that lets you composite and export videos. I use a custom compositor to apply my effects, etc. In my crash dashboard, I am seeing a report of an EXC_BAD_ACCESS crash from objc_msgSend. Below is the stack trace:

```
libobjc.A.dylib          objc_msgSend
libdispatch.dylib        _dispatch_sync_invoke_and_complete_recurse
libdispatch.dylib        _dispatch_sync_f_slow
[symbolication failed]
libdispatch.dylib        _dispatch_client_callout
libdispatch.dylib        _dispatch_lane_barrier_sync_invoke_and_complete
AVFCore                  -[AVCustomVideoCompositorSession(AVCustomVideoCompositorSession_FigCallbackHandling) _customCompositorShouldCancelPendingFrames]
AVFCore                  _customCompositorShouldCancelPendingFramesCallback
MediaToolbox             remoteVideoCompositor_HandleVideoCompositorClientMessage
CoreMedia                __figXPCConnection_CallClientMessageHandlers_block_invoke
libdispatch.dylib        _dispatch_call_block_and_release
libdispatch.dylib        _dispatch_client_callout
libdispatch.dylib        _dispatch_lane_serial_drain
libdispatch.dylib        _dispatch_lane_invoke
libdispatch.dylib        _dispatch_root_queue_drain_deferred_wlh
libdispatch.dylib        _dispatch_workloop_worker_thread
libsystem_pthread.dylib  _pthread_wqthread
libsystem_pthread.dylib  start_wqthread
```

What stood out to me is that this is only being reported from iOS 26.0+ devices. Part of the stack trace failed to be symbolicated ([symbolication failed]), but I'm 90% confident that the failing frame is Apple code, not my app's code. I cannot reproduce this locally.

Is this a known issue? What are the possible root causes, and how can I verify or eliminate them?

Thanks,
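
For reference, the relevant shape of my custom compositor is roughly the following (a simplified sketch of an AVVideoCompositing implementation, not my actual rendering code; the class name is illustrative):

```swift
import AVFoundation

// Simplified sketch of the custom compositor structure; the crashing AVFCore
// callback ultimately drives cancelAllPendingVideoCompositionRequests() below.
final class SketchCompositor: NSObject, AVVideoCompositing {
    private let renderQueue = DispatchQueue(label: "compositor.render")

    var sourcePixelBufferAttributes: [String: Any]? {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }
    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        renderQueue.async {
            // The real implementation composites the source frames here.
            if let buffer = request.renderContext.newPixelBuffer() {
                request.finish(withComposedVideoFrame: buffer)
            } else {
                request.finish(with: NSError(domain: "Compositor", code: -1))
            }
        }
    }

    func cancelAllPendingVideoCompositionRequests() {
        // Synchronously drain the render queue so no request outlives the cancel.
        renderQueue.sync {}
    }
}
```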
0 replies · 0 boosts · 147 views · Oct ’25

Apple Music treats Asian artists with romanized names as two different artists
Hello, I'm trying to write a shortcut using Toolbox Pro that gets triggered by an accessibility trigger and then favorites the currently playing song. It's working pretty well, but I noticed that for some artists, especially Asian ones, it simply doesn't work.

While debugging, I noticed that the tool uses the same song ID, artist ID, and everything else it should to search for the song and favorite it. However, Apple Music treats artists with romanized names as two separate artists!

https://music.apple.com/br/artist/王菲/41760704
https://music.apple.com/br/artist/faye-wong/41760704?l=en-GB

You can see that the ID is the same (41760704). It seems that when I search for the artist, the first artist (王菲) is returned, so when I open the artist's pages on the web I can see a star next to the song name, meaning it got a like. However, the romanized artist (faye-wong) doesn't have a like on the same song. This is very weird, right?
0 replies · 0 boosts · 98 views · Oct ’25

Mute behavior of Volume button on AVPlayerViewController iOS 26
On older iOS versions, when the user taps the Mute/Volume button on AVPlayerViewController to unmute, the system restores the device's volume to the level it had before the user muted. On iOS 26, when the user taps the unmute button on screen, the volume starts from 0 instead of being restored (it is still restored if the user unmutes by pressing the physical volume buttons).

As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView, which I cannot control, so this is system behavior. But I have received complaints that this is a bug, and I did not find any documentation describing this change in the Mute button behavior. I need some basis to explain this situation. Thank you.
0 replies · 1 boost · 167 views · Oct ’25

Best Approach for Monitoring Music Playback State Across Multiple Apps?
Hey Swift community! I'm exploring building a macOS app that needs to monitor what's currently playing in music apps like Spotify and Apple Music (track info, playback position, play/pause state). I'm trying to figure out the most efficient architecture before diving in.

The Goal: Monitor playback state across multiple music players and react to changes in real time, ideally with minimal CPU overhead since this would run continuously in the background.

Approaches I'm considering:
- AppleScript / ScriptingBridge
- Distributed notifications
- Native frameworks (Apple Music only)

Questions:
- What's the recommended way to do this on macOS?
- Are distributed notifications reliable enough to avoid polling entirely?
- Is there a performance difference between AppleScript and ScriptingBridge for IPC?
- For Apple Music specifically, should I use MusicKit, MediaPlayer, or stick with AppleScript?
- Are there other approaches I'm missing?
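
To make the distributed notifications option concrete, here is a minimal sketch of what I have in mind for Spotify (the notification name is the one Spotify's macOS client is commonly reported to post, and the userInfo keys shown are assumptions I still need to verify):

```swift
import Foundation

// Minimal sketch of the distributed-notification approach for Spotify.
// "com.spotify.client.PlaybackStateChanged" and the userInfo keys
// ("Player State", "Name", "Artist") are assumptions to be verified.
final class SpotifyObserver {
    private var token: NSObjectProtocol?

    func start() {
        token = DistributedNotificationCenter.default().addObserver(
            forName: Notification.Name("com.spotify.client.PlaybackStateChanged"),
            object: nil,
            queue: .main
        ) { note in
            let info = note.userInfo ?? [:]
            print("State:", info["Player State"] ?? "?",
                  "Track:", info["Name"] ?? "?",
                  "Artist:", info["Artist"] ?? "?")
        }
    }

    func stop() {
        if let token { DistributedNotificationCenter.default().removeObserver(token) }
    }
}
```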
0 replies · 0 boosts · 96 views · Nov ’25

Massive amounts of leaked memory with the tvOS 26 system player user interface
Hi,

We identified massive amounts of leaked memory with the tvOS 26 standard player user interface as soon as chapters (navigation markers) are involved. Artwork images associated with chapters are no longer released correctly, leaking memory in chunks of several MiB. Over time, apps are terminated by the system due to excessive memory consumption.

The issue was reported to Apple as a tvOS 26 regression (huge memory leaks associated with navigation marker artwork displayed in the tvOS standard user interface), filed under FB21160665.
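
For context, the chapters are supplied to the system player roughly like this (a simplified sketch of the navigation marker setup; the chapter title, time range, and artwork source are placeholders, not our production code):

```swift
import AVFoundation
import CoreMedia

// Simplified sketch of how chapter (navigation marker) artwork is attached to
// an AVPlayerItem for the tvOS system player UI. Title, time range, and the
// artwork data source are placeholders.
func attachChapters(to item: AVPlayerItem, artworkData: Data) {
    let titleItem = AVMutableMetadataItem()
    titleItem.identifier = .commonIdentifierTitle
    titleItem.value = "Chapter 1" as NSString
    titleItem.extendedLanguageTag = "und"

    let artworkItem = AVMutableMetadataItem()
    artworkItem.identifier = .commonIdentifierArtwork
    artworkItem.value = artworkData as NSData
    artworkItem.dataType = kCMMetadataBaseDataType_JPEG as String
    artworkItem.extendedLanguageTag = "und"

    let range = CMTimeRange(start: .zero, duration: CMTime(seconds: 60, preferredTimescale: 600))
    let marker = AVTimedMetadataGroup(items: [titleItem, artworkItem], timeRange: range)

    item.navigationMarkerGroups = [
        AVNavigationMarkersGroup(title: "Chapters", timedNavigationMarkers: [marker])
    ]
}
```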
0 replies · 0 boosts · 182 views · Nov ’25

Unexpected artist names in song table
Hi team,

In the Apple Music Feed datasets, we've noticed some unexpected values in the song and album tables. The primaryartists column from either song or album may contain a "non-default" artist name, such as the katakana name shown in the example below:

```
select id, name, namedefault, primaryartists from amf_song where id = '1698723329'

id         | name                 | namedefault | primaryartists
1698723329 | {default=California} | California  | [{id=1264818718, name=チャペル・ローン}]

select * from amf_artist where id = '1264818718'

id         | name                                        | namedefault   | namepronunciation
1264818718 | {default=Chappell Roan, ja=チャペル・ローン} | Chappell Roan | {ja=チャペルローン}
```

Shouldn't the primaryartists column show the namedefault instead of the Japanese-language version? When can we expect this bug to be resolved?

Thanks,
0 replies · 2 boosts · 315 views · Dec ’25

Repeat song listens not queryable
Hi all, I've been working on some personal programming projects and have gotten into using the Apple Music API. I'm currently trying to get a list of recent songs using the /v1/me/recent/played/tracks endpoint, and it's working well. However, I know there are some songs I've listened to multiple times in a row, and those repeated plays do not show up as separate entries when querying this endpoint. I'm only seeing a list of the distinct songs I've listened to lately, not a true list of the most recent plays on my account.

Is this intended behavior, or am I going about something incorrectly here? My query uses that endpoint and specifies the types to be only [songs]. Thanks in advance for any ideas or insight.
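
For reference, the query is essentially the following (a minimal Swift sketch; both tokens are assumed to already be valid, and the function just prints the raw JSON):

```swift
import Foundation

// Minimal sketch of the recent-tracks query described above. Only the
// `types=songs` filter is applied; pagination is omitted.
func fetchRecentlyPlayedSongs(developerToken: String, userToken: String) async throws {
    var components = URLComponents(string: "https://api.music.apple.com/v1/me/recent/played/tracks")!
    components.queryItems = [URLQueryItem(name: "types", value: "songs")]

    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")

    let (data, _) = try await URLSession.shared.data(for: request)
    // Each element in "data" is one recently played track; repeated plays of
    // the same song appear to be collapsed into a single entry.
    print(String(data: data, encoding: .utf8) ?? "")
}
```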
0 replies · 0 boosts · 254 views · Dec ’25

Apple Music playlist creation works but DELETE returns 401, and MusicKit write APIs are unavailable on macOS. How can I build a playlist editor on macOS?
I'm trying to build a playlist editor on macOS. I can create playlists via the Apple Music HTTP API, but DELETE always returns 401, even immediately after creation with the same tokens.

Minimal repro:

```bash
#!/usr/bin/env bash
set -euo pipefail

BASE_URL="https://api.music.apple.com/v1"
PLAYLIST_NAME="${PLAYLIST_NAME:-blah}"
: "${APPLE_MUSIC_DEV_TOKEN:?}"
: "${APPLE_MUSIC_USER_TOKEN:?}"

create_body="$(mktemp)"
delete_body="$(mktemp)"
trap 'rm -f "$create_body" "$delete_body"' EXIT

curl -sS --compressed -o "$create_body" -w "Create status: %{http_code}\n" \
  -X POST "${BASE_URL}/me/library/playlists" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json" \
  -d "{\"attributes\":{\"name\":\"${PLAYLIST_NAME}\"}}"

playlist_id="$(python3 - "$create_body" <<'PY'
import json, sys

with open(sys.argv[1], "r", encoding="utf-8") as f:
    data = json.load(f)

print(data["data"][0]["id"])
PY
)"

curl -sS --compressed -o "$delete_body" -w "Delete status: %{http_code}\n" \
  -X DELETE "${BASE_URL}/me/library/playlists/${playlist_id}" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json"
```

I capture the response bodies like this:

```bash
cat "$create_body"
cat "$delete_body"
```

Result:
- Create: 201
- Delete: 401

I also checked the latest macOS SDK's MusicKit interfaces, and MusicLibrary.createPlaylist/edit/add(to:) are marked @available(macOS, unavailable), so I can't create or delete via MusicKit on macOS either.

Question: How can I implement a playlist editor on macOS (create/delete/modify) if:
- MusicKit write APIs are unavailable on macOS, and
- the HTTP API can create but DELETE returns 401?

Any guidance or official workaround would be hugely appreciated.
0 replies · 0 boosts · 133 views · 1w

Apple Music API no longer returns standalone singles as “single” albums
I've been using the Apple Music API for quite a while now, and a recent change must have happened that is quite disruptive.

On many occasions, artists release singles from an album as part of promoting that album. For recent examples, Harry Styles released "Aperture" (a single) to promote his upcoming album "Kiss All The Time. Disco, Occasionally". Similarly, Bruno Mars released "I Just Might", a single from the upcoming album "The Romantic".

Previously, those would be returned by the "artists/{artist_id}/albums" endpoint with a "- Single" suffix. It seems a recent change happened where they only appear as playable tracks inside the album. This behavior is also evident in the Apple Music app itself: those singles no longer appear under "Singles & EPs". Instead, they are only visible if the single becomes popular enough to be shown in "Top Songs"; otherwise one would have to know to tap on the (future) album to discover whether there are released singles. Meanwhile, Spotify's API returns those as singles properly, just like the Apple Music API used to.

This change must be recent, but the question is whether it is intentional, and if so, how the API can be used from here on out to extract those singles and represent them.
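
For reference, this is roughly how I have been extracting singles so far (a minimal sketch; the storefront, token, and the "- Single" name check are placeholders based on the behavior described above, and pagination is omitted):

```swift
import Foundation

// Minimal sketch: fetch an artist's albums and keep the ones whose names end
// in "- Single", which is how standalone singles used to be represented.
func fetchSingles(artistID: String, developerToken: String) async throws -> [String] {
    let url = URL(string: "https://api.music.apple.com/v1/catalog/us/artists/\(artistID)/albums")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")

    let (data, _) = try await URLSession.shared.data(for: request)
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let albums = json?["data"] as? [[String: Any]] ?? []

    // Albums representing standalone singles used to carry a "- Single" suffix.
    return albums.compactMap { album in
        let attributes = album["attributes"] as? [String: Any]
        let name = attributes?["name"] as? String
        return (name?.hasSuffix("- Single") == true) ? name : nil
    }
}
```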
0 replies · 1 boost · 148 views · 6d