Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under General subtopic

Post · Replies · Boosts · Views · Activity

401 Unauthorized when attempting to access Apple Music Feed API
Hello, I am trying to access the Apple Music Feed API, but I am receiving a 401 Unauthorized error whenever I call it. I have tried using my own code to generate a JWT and call the API directly (this code can call the standard Apple Music API successfully):

> GET /v1/feed/song/latest HTTP/2
> Host: api.media.apple.com
> user-agent: insomnia/2023.5.8
> authorization: Bearer [REDACTED]
> accept: */*
< HTTP/2 401
< content-type: application/json; charset=utf-8
< content-length: 0
< x-apple-jingle-correlation-key: AV5IOHBNM2UUJVOFQ4HZ2TGF6Q
< x-daiquiri-instance: daiquiri:10001:daiquiri-all-shared-ext-7bb7c9b9bb-r459v:7987:25RELEASE91:daiquiri-amp-kubernetes-shared-ext-ak8s-prod-pv4-amp-daiquiri-ingress-prod

I have also tried the Apple-provided Python example code, which gives me authentication errors too:

$ python3 ./apple_music_feed_example.py --key-id NMBH[...] --team-id 3TNZ[...] --secret-key-file-path "/Users/foxt/Documents/am-feed/NMBH[...].p8" --out-dir .
running....
INFO:__main__:Sending requests to https://api.media.apple.com
INFO:__main__:Getting the latest export for feed artist
Exception: Authentication Failed. Did you provide the correct team id, key id, and p8 file?

Does this API need to be enabled on my account separately from the main Apple Music API? The documentation reads to me as if anyone with an Apple Developer Programme membership can use this API, and I did not see any information regarding any other requirements.
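For reference, here is roughly what my own token-generation code does (a minimal Swift sketch using CryptoKit; the key ID, team ID, and .p8 path are placeholders, and error handling is trimmed):

import Foundation
import CryptoKit

// Build an ES256-signed developer token (sketch; identifiers below are placeholders).
func makeDeveloperToken(keyID: String, teamID: String, p8PEM: String) throws -> String {
    func base64url(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }
    let now = Int(Date().timeIntervalSince1970)
    let header = try JSONSerialization.data(withJSONObject: ["alg": "ES256", "kid": keyID])
    let claims = try JSONSerialization.data(withJSONObject: ["iss": teamID, "iat": now, "exp": now + 3600])
    let signingInput = base64url(header) + "." + base64url(claims)
    let key = try P256.Signing.PrivateKey(pemRepresentation: p8PEM)
    let signature = try key.signature(for: Data(signingInput.utf8))
    return signingInput + "." + base64url(signature.rawRepresentation)
}

// The resulting token authenticates fine against api.music.apple.com,
// but the same token gets a 401 from api.media.apple.com.
let pem = try String(contentsOfFile: "/path/to/AuthKey_XXXXXXXXXX.p8", encoding: .utf8)
let token = try makeDeveloperToken(keyID: "MY_KEY_ID", teamID: "MY_TEAM_ID", p8PEM: pem)
var feedRequest = URLRequest(url: URL(string: "https://api.media.apple.com/v1/feed/song/latest")!)
feedRequest.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")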
0
0
390
Sep ’25
MPNowPlayingInfoCenter playbackState fails to update after losing audio focus on macOS
My Environment:
Device: Mac (Apple Silicon, arm64)
OS: macOS 15.6.1

Description:
I'm developing a music app and have encountered an issue where I cannot update the playbackState in MPNowPlayingInfoCenter after my app loses audio focus to another app. Even though my app correctly calls [MPNowPlayingInfoCenter defaultCenter].playbackState = .paused, the system's Now Playing UI (Control Center, Lock Screen, AirPods controls) does not reflect the change. The UI remains stuck until the app that currently holds audio focus also changes its playback state. I've observed the same behavior in other third-party music apps from the App Store, which suggests it might be a system-level issue.

Steps to Reproduce:
Use the two most popular music apps in the Chinese App Store, NetEase Cloud Music and QQ Music (let's call them App A and App B):
1. Start playback in App A.
2. Start playback in App B. (App B now has audio focus, and App A is still playing.)
3. Attempt to pause App A via the system's Control Center or its own UI.

Observed Behavior:
App A's audio stream stops, but in the system's Now Playing controls, App A still appears to be playing. The progress bar continues to advance, and the pause button becomes unresponsive. If you then pause App B, the Now Playing UI for App A immediately corrects itself and displays the proper "paused" state.

My Questions:
1. Is there a specific procedure required to update MPNowPlayingInfoCenter when an app is not the current "Now Playing" application?
2. Is this a known issue or expected behavior on macOS?
3. Are there any official workarounds or solutions to ensure the UI updates correctly?
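For context, this is essentially what the app does when it pauses (a minimal sketch assuming an AVPlayer-based player; the real app also wires up MPRemoteCommandCenter handlers):

import AVFoundation
import MediaPlayer

final class PlaybackController {
    let player = AVPlayer()

    func pause() {
        player.pause()

        // Keep the Now Playing info in sync with the paused state.
        var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
        info[MPNowPlayingInfoPropertyPlaybackRate] = 0.0
        info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = player.currentTime().seconds
        MPNowPlayingInfoCenter.default().nowPlayingInfo = info

        // macOS-specific: explicitly publish the paused state.
        MPNowPlayingInfoCenter.default().playbackState = .paused
    }
}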
0
0
148
Sep ’25
Failed to change the TTS language to CN or TW
I have some questions about TTS. My device's default language is zh-HK (Cantonese). The device is an iPhone 16 Pro on iOS 18.6.

I created a function speakMandarin because I want the device to speak zh-CN (Putonghua); however, the device only speaks zh-HK (Cantonese), even though I already set the AVSpeechSynthesisVoice language to zh-CN.

func speakMandarin(text: String) {
    print("speakMandarin, \(text)")
    lastError = nil // Reset error

    // Stop any ongoing speech before starting new
    if synthesizer.isSpeaking {
        synthesizer.stopSpeaking(at: .immediate)
    }

    // Configure speech utterance
    let utterance = AVSpeechUtterance(ssmlRepresentation: text)!
    utterance.rate = 0.5 // Natural speaking speed
    utterance.pitchMultiplier = 1.0
    utterance.volume = 1.0
    utterance.voice = AVSpeechSynthesisVoice(language: "zh-CN")

    let preferredLanguages = ["zh-CN", "zh-TW"]
    var selectedVoice: AVSpeechSynthesisVoice?
    for lang in preferredLanguages {
        if let voice = AVSpeechSynthesisVoice(language: lang) {
            print(lang)
            selectedVoice = voice
            utterance.voice = voice
            break
        }
    }

    // If no Mandarin voice found, use system default
    if selectedVoice == nil {
        selectedVoice = AVSpeechSynthesisVoice(language: nil)
        lastError = "未偵測到普通話語音包,將使用系統預設語音" // "No Mandarin voice detected; the system default voice will be used"
        print(lastError)
    }
    utterance.voice = selectedVoice

    print(utterance)
    synthesizer.speak(utterance)
}

Here is my log:

speakMandarin, <speak>你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?</speak>
zh-CN
[AVSpeechUtterance 0x1194efb80] String: 你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?
Voice: [AVSpeechSynthesisVoice 0x104ceff90] Language: zh-CN, Name: Tingting, Quality: Default [com.apple.voice.compact.zh-CN.Tingting]
Rate: 0.50
Volume: 1.00
Pitch Multiplier: 1.00
Delays: Pre: 0.00(s) Post: 0.00(s)
0
0
234
Sep ’25
Clarification on SFSpeechRecognizer system alert message and service URLs for whitelisting
Hello Apple Engineers,

I am developing a feature related to SpeechRecognizer (import Speech), and I have two questions:

1. After adding the NSSpeechRecognitionUsageDescription key in my Info.plist, when I initialize an SFSpeechRecognizer instance, the system shows an authorization alert. The alert contains a pre-defined message from Apple: “Speech data from this app will be sent to Apple to process your requests. This will also help Apple improve its speech recognition technology.” Is it possible to remove or customize this message?

2. My app runs in a network environment with a whitelist. I need to know which URL the SFSpeechRecognizer instance connects to, and which port it uses, so that I can add it to the whitelist.

Thank you very much for your support!
Best regards,
Yu Cheng
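P.S. For context, the code path in question is just the standard setup (a minimal sketch; the locale is only an example):

import Speech

// Creating the recognizer and requesting authorization; the system alert with
// Apple's fixed privacy message appears during this flow.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
print("Recognizer locale: \(recognizer?.locale.identifier ?? "none")")

SFSpeechRecognizer.requestAuthorization { status in
    print("Speech recognition authorization status: \(status.rawValue)")
}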
0
0
66
Sep ’25
ShazamKit for Android and 16 KB native library alignment
Hello, I'm working on a Flutter app targeting both Android and iOS, in which I implemented ShazamKit. To do that, I first tried the flutter_shazam_kit package, but since it's not maintained anymore, I forked it here and tried to update it to meet the Google Play Store requirements, as you can see here: https://github.com/mregnauld/flutter_shazam_kit/tree/fix-16k

Unfortunately, after trying everything, my app still doesn't meet the (not so) new 16 KB native library alignment requirement. I'm also 100% sure it comes from this package, because the error message disappears if I remove it from my app. After investigating, the problem seems to come from ShazamKit for Android (available here: https://developer.apple.com/download/all/?q=Android%20ShazamKit), and especially from the .so files inside the .aar file.

Is there anything I can do to fix that, or should I wait until the ShazamKit team fixes it? I'm totally stuck, so any help is highly appreciated. Thanks.
3
0
554
Oct ’25
[AVFCore] iOS 26.0 EXC_BAD_ACCESS from _customCompositorShouldCancelPendingFrames
Hi, I'm working on video editing software that lets you composite and export videos. I use a custom compositor to apply my effects, etc. In my crash dashboard, I am seeing a report of an EXC_BAD_ACCESS crash from objc_msgSend. Below is the stack trace.

libobjc.A.dylib objc_msgSend
libdispatch.dylib _dispatch_sync_invoke_and_complete_recurse
libdispatch.dylib _dispatch_sync_f_slow
[symbolication failed]
libdispatch.dylib _dispatch_client_callout
libdispatch.dylib _dispatch_lane_barrier_sync_invoke_and_complete
AVFCore -[AVCustomVideoCompositorSession(AVCustomVideoCompositorSession_FigCallbackHandling) _customCompositorShouldCancelPendingFrames]
AVFCore _customCompositorShouldCancelPendingFramesCallback
MediaToolbox remoteVideoCompositor_HandleVideoCompositorClientMessage
CoreMedia __figXPCConnection_CallClientMessageHandlers_block_invoke
libdispatch.dylib _dispatch_call_block_and_release
libdispatch.dylib _dispatch_client_callout
libdispatch.dylib _dispatch_lane_serial_drain
libdispatch.dylib _dispatch_lane_invoke
libdispatch.dylib _dispatch_root_queue_drain_deferred_wlh
libdispatch.dylib _dispatch_workloop_worker_thread
libsystem_pthread.dylib _pthread_wqthread
libsystem_pthread.dylib start_wqthread

What stood out to me is that this is only being reported from iOS 26.0+ devices. Part of the stack trace failed to be symbolicated ([symbolication failed]), but I'm 90% confident this is Apple code, not my app's code. I cannot reproduce this locally. Is this a known issue? What are the possible root causes, and how can I verify or eliminate them? Thanks,
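For context, my compositor is a standard AVVideoCompositing implementation along these lines (a heavily simplified sketch, not my actual effect code):

import AVFoundation
import CoreVideo

// Simplified pass-through compositor. The crashing callback in the stack trace above
// is what ends up invoking cancelAllPendingVideoCompositionRequests() on an object like this.
final class PassthroughCompositor: NSObject, AVVideoCompositing {
    private let renderQueue = DispatchQueue(label: "compositor.render")

    var sourcePixelBufferAttributes: [String: Any]? {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }

    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // Nothing cached in this sketch.
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        renderQueue.async {
            // Real code renders effects here; this just passes the first source frame through.
            if let trackID = request.sourceTrackIDs.first?.int32Value,
               let frame = request.sourceFrame(byTrackID: trackID) {
                request.finish(withComposedVideoFrame: frame)
            } else {
                request.finish(with: NSError(domain: "Compositor", code: -1))
            }
        }
    }

    func cancelAllPendingVideoCompositionRequests() {
        // Called by AVFoundation (via the callback in the trace) to drop pending work.
        renderQueue.sync { }
    }
}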
0
0
82
Oct ’25
How to fetch a library song via MusicKit or the Apple Music API if the ID is purely numeric?
If I fetch a library playlist like the generated "Favorites" playlist via MusicKit like this:

guard let initialTracks = try await playlist.with([.tracks]).tracks else { return nil }

I get a list of tracks like this:

...
TrackID: i.e5gmPS6rZ856
TrackID: i.4ZQMxU0OxNg0
TrackID: i.J198KH4P85K4
TrackID: i.J1AaRC4P85K4
TrackID: i.4BPqWt0OxNg0
TrackID: 4473570282773028026
TrackID: 4473570282773028025
TrackID: 4015088256684964387
TrackID: 4473570282773028024
TrackID: 7541557725362154249
TrackID: 4473570282773028027

I save the IDs for later use, but when I want to fetch them again, only the ones with IDs that start with "i." work:

static func getLibrarySong(from id: String) async -> Song? {
    var request = MusicLibraryRequest<Song>()
    request.filter(matching: \.id, equalTo: MusicItemID(id))
    do {
        let response = try await request.response()
        return response.items.first
    } catch { ... }
}

Or via the Apple Music API endpoint:

static func getLibrarySongFromAPI(with id: String) async -> Song? {
    guard let url = AppleMusicURL.getURL(for: .getSongById, id: id) else { return nil }
    do {
        let dataRequest = MusicDataRequest(urlRequest: URLRequest(url: url))
        let dataResponse = try await dataRequest.response()
        let response = try JSONDecoder().decode(SongsResponse.self, from: dataResponse.data)
        return response.data.first
    } catch { ... }
}

Both functions above won't work for the purely numeric IDs like 4473570282773028024, so it seems the ID is wrong, but how do I make it work? Otherwise I can fetch all the songs fine, in the catalog or in the library, but these few songs can't be fetched individually; they only come back from the try await playlist.with([.tracks]) fetch, which gets the whole playlist. But obviously this isn't always possible. Thanks in advance!
1
0
555
Oct ’25
Apple Music treats Asian artists with romanized names as two different artists
Hello, I'm trying to write a shortcut using Toolbox Pro that gets triggered by an accessibility trigger and then favorites the currently playing song. It's working pretty well, but I noticed that for some artists, especially Asian ones, it simply doesn't work. While debugging, I noticed that the tool uses the correct song ID, artist ID, and everything else it should to search for the song and favorite it. However, I noticed that Apple Music treats artists with romanized names as two separate artists!

https://music.apple.com/br/artist/王菲/41760704
https://music.apple.com/br/artist/faye-wong/41760704?l=en-GB

You can see that the ID is the same (41760704). It seems that, when I search for the artist, the first artist (王菲) is returned, so that when I open the artist's pages on the web I can see a star next to the song name, meaning that it got a like. However, the romanized artist (faye-wong) doesn't have a like on the same song. This is very weird, right?
0
0
62
Oct ’25
FaceTime Hang Up
When I’m on FaceTime, my phone will randomly end my call. I have an iPhone 17 on iOS 26.1. Sometimes I won’t even be touching my phone screen and it’ll hang up. I’m not sure if this is a universal issue or just a me problem. It’s getting really annoying.
1
0
145
Oct ’25
Best Approach for Monitoring Music Playback State Across Multiple Apps?
Hey Swift community! I'm exploring building a macOS app that needs to monitor what's currently playing in music apps like Spotify and Apple Music (track info, playback position, play/pause state). I'm trying to figure out the most efficient architecture before diving in.

The Goal: Monitor playback state across multiple music players and react to changes in real time, ideally with minimal CPU overhead, since this would run continuously in the background.

Approaches I'm considering:
- AppleScript / ScriptingBridge
- Distributed notifications
- Native frameworks (Apple Music only)

What's the recommended way to do this on macOS? Are distributed notifications reliable enough to avoid polling entirely? Is there a performance difference between AppleScript and ScriptingBridge for IPC? For Apple Music specifically, should I use MusicKit, MediaPlayer, or stick with AppleScript? Are there other approaches I'm missing?
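For anyone curious, the distributed-notification approach I've been prototyping looks roughly like this (a sketch; the notification names are ones I've seen other projects use, not something I've confirmed in official documentation):

import Foundation

let center = DistributedNotificationCenter.default()

// Music.app posts player-info notifications on track / state changes.
center.addObserver(forName: Notification.Name("com.apple.Music.playerInfo"),
                   object: nil, queue: .main) { note in
    print("Music.app state change:", note.userInfo ?? [:])
}

// Spotify posts a similar notification with track metadata and player state.
center.addObserver(forName: Notification.Name("com.spotify.client.PlaybackStateChanged"),
                   object: nil, queue: .main) { note in
    print("Spotify state change:", note.userInfo ?? [:])
}

// Keep the prototype alive to receive notifications.
RunLoop.main.run()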
0
0
49
3w
PDFKit doesn't return the correct page
Hello, we are occasionally seeing incorrect behavior from the PDFDocument method:

func page(at index: Int) -> PDFPage?

With certain PDF files, this method returns the wrong PDFPage. This occurs on iOS 18.3, 18.5, and 18.6.2 (and maybe on other versions). Try this PDF for instance (page 81 is returned when index = 2): https://drive.google.com/open?id=1MHm2wjfsbWB8OiRmARUMmvODYxp4DIqP&usp=drive_fs

Also, I should mention that this doesn't occur systematically with this PDF. When making a copy of the file, we don't observe the issue. Could this be linked to some kind of internal cache issue?
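For reference, the lookup on our side is as simple as this (a minimal sketch; the file URL is a placeholder):

import PDFKit

// Ask for page index 2 and inspect what comes back.
if let document = PDFDocument(url: URL(fileURLWithPath: "/path/to/document.pdf")),
   let page = document.page(at: 2) {
    // With the file above, this sometimes reports page 81 instead of the third page.
    print("label:", page.label ?? "nil", "index:", document.index(for: page))
}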
0
0
76
3w
Massive amounts of leaked memory with the tvOS 26 system player user interface
Hi, We identified massive amounts of leaked memory with the tvOS 26 standard player user interface as soon as chapters (navigation markers) are involved. Artwork images associated with chapters are not correctly released anymore, leaking memory in chunks of several MiBs. Over time, apps will be terminated by the system due to excessive memory consumption. The issue was reported to Apple as a tvOS 26 regression: huge memory leaks associated with navigation marker artworks displayed in the tvOS standard user interface, filed under FB21160665.
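For reference, the chapters are supplied to the player item in the standard way, roughly like this (a simplified sketch; times, titles, and artwork data are placeholders):

import AVKit
import AVFoundation

// Attach navigation markers (chapters) with artwork to a player item.
func addChapterMarkers(to item: AVPlayerItem, artworkData: Data) {
    func metadataItem(_ identifier: AVMetadataIdentifier, value: NSCopying & NSObjectProtocol) -> AVMetadataItem {
        let entry = AVMutableMetadataItem()
        entry.identifier = identifier
        entry.value = value
        entry.extendedLanguageTag = "und"
        return entry
    }

    let chapter = AVTimedMetadataGroup(
        items: [
            metadataItem(.commonIdentifierTitle, value: "Chapter 1" as NSString),
            // The chapter artwork is what appears to never be released on tvOS 26.
            metadataItem(.commonIdentifierArtwork, value: artworkData as NSData)
        ],
        timeRange: CMTimeRange(start: .zero, duration: CMTime(seconds: 60, preferredTimescale: 600))
    )

    item.navigationMarkerGroups = [
        AVNavigationMarkersGroup(title: "Chapters", timedNavigationMarkers: [chapter])
    ]
}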
0
0
137
2w
Repeat song listens not queryable
Hi all, I've been working on some personal programming projects and have gotten into using the Apple Music API. I'm currently looking to get a list of recent songs using the /v1/me/recent/played/tracks endpoint and it's working well. However, I know there are some songs I've listened to multiple times in a row, and those are not showing up as unique tracks when querying this endpoint. I'm only seeing a list of the different songs I've listened to lately, not a true list of the most recent plays on my account. Is this intended behavior or am I going about something incorrectly here? My query is using that endpoint & specifying the types to be only [songs]. Thanks in advance for any ideas or insight.
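For reference, the request I'm making has roughly this shape (a Swift/MusicKit sketch of it; my actual code differs, but the endpoint and the types=songs query are the same):

import Foundation
import MusicKit

// Fetch recently played tracks, limited to songs, and pull out their IDs.
func recentlyPlayedSongIDs() async throws -> [String] {
    let url = URL(string: "https://api.music.apple.com/v1/me/recent/played/tracks?types=songs")!
    let response = try await MusicDataRequest(urlRequest: URLRequest(url: url)).response()

    struct Page: Decodable {
        struct Item: Decodable { let id: String }
        let data: [Item]
    }
    // Repeated listens of the same song appear to be collapsed into a single entry here.
    return try JSONDecoder().decode(Page.self, from: response.data).data.map(\.id)
}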
0
0
198
5d
Editing a Library Playlist (MusicKit: iOS 16 beta)
I've just begun to dip my toes into the iOS 16 waters. One of the first things I've attempted is to edit a library playlist using:

try await MusicLibrary.shared.edit(targetPlaylist, items: tracksToAdd)

where targetPlaylist is of type MusicItemCollection<MusicKit.Playlist>.Element and tracksToAdd is of type [Track].

The targetPlaylist was created using the new iOS 16 way, here:

let newPlaylist = try await MusicLibrary.shared.createPlaylist(name: name, description: description)

tracksToAdd is derived by performing a MusicLibraryRequest on a specific playlist ID, and then doing something like this:

if let tracksToAdd = try await playlist.with(.tracks).tracks {
    // add tracks to target playlist
}

My problem is that when I attempt the edit, I am faced with a rather sad-looking crash.

libdispatch.dylib`dispatch_group_leave.cold.1:
    0x10b43d62c <+0>:  mov    x8, #0x0
    0x10b43d630 <+4>:  stp    x20, x21, [sp, #-0x10]!
    0x10b43d634 <+8>:  adrp   x20, 6
    0x10b43d638 <+12>: add    x20, x20, #0xfbf          ; "BUG IN CLIENT OF LIBDISPATCH: Unbalanced call to dispatch_group_leave()"
    0x10b43d63c <+16>: adrp   x21, 40
    0x10b43d640 <+20>: add    x21, x21, #0x260          ; gCRAnnotations
    0x10b43d644 <+24>: str    x20, [x21, #0x8]
    0x10b43d648 <+28>: str    x8, [x21, #0x38]
    0x10b43d64c <+32>: ldp    x20, x21, [sp], #0x10
->  0x10b43d650 <+36>: brk    #0x1

I assume that I must be doing something wrong, but I frankly have no idea how to troubleshoot this. Any help would be most appreciated. Thanks. @david-apple?
11
0
3.3k
May ’25