Integrate photo, audio, and video content into your apps.

Posts under Media tag

44 Posts

Post

Replies

Boosts

Views

Activity

_MPRemoteCommandEventDispatch crashes on iOS 26.x devices.
I'm seeing crashes in _MPRemoteCommandEventDispatch on iOS 26.x devices in 3 apps. According to the Bugsnag logs, the exception is:

NSInternalInconsistencyException: event dispatch <_MPRemoteCommandEventDispatch: <MPRemoteCommandEvent: 0x11c049500 commandID=THV0 command=<MPRemoteCommand: 0x109ad1ea0 type=Play (0) enabled=YES handlers=[0x109b6a310]> sourceID=(null) ([HostedRoutingSessionDataSource] handleControlSendingCommand<2W5E>)> state:201> deallocated without calling continuation

I attached a log from the Xcode Organizer matching the Bugsnag crash: mpr_remote_command_event.crash

When I set a breakpoint on -[_MPRemoteCommandEventDispatch dealloc], I can see it is hit every time I tap the play or pause button on the lock screen.

Thread 0 Crashed:
0 libsystem_kernel.dylib 0x00000002370420cc __pthread_kill + 8 (:-1)
1 libsystem_pthread.dylib 0x00000001e975c810 pthread_kill + 268 (pthread.c:1721)
2 libsystem_c.dylib 0x0000000198f8ff64 abort + 124 (abort.c:122)
3 libc++abi.dylib 0x000000018a7cf808 __abort_message + 132 (abort_message.cpp:66)
4 libc++abi.dylib 0x000000018a7be484 demangling_terminate_handler() + 304 (cxa_default_handlers.cpp:76)
5 libobjc.A.dylib 0x000000018a6cff78 _objc_terminate() + 156 (objc-exception.mm:496)
6 xxxxxxxxxxxxxx 0x00000001003a7db8 CPPExceptionTerminate() + 416 (BSG_KSCrashSentry_CPPException.mm:156)
7 libc++abi.dylib 0x000000018a7cebdc std::__terminate(void (*)()) + 16 (cxa_handlers.cpp:59)
8 libc++abi.dylib 0x000000018a7ceb80 std::terminate() + 108 (cxa_handlers.cpp:88)
9 CoreFoundation 0x000000018d7341c4 __CFRunLoopPerCalloutARPEnd + 256 (CFRunLoop.c:769)
10 CoreFoundation 0x000000018d70bb5c __CFRunLoopRun + 1976 (CFRunLoop.c:3179)
11 CoreFoundation 0x000000018d70aa6c _CFRunLoopRunSpecificWithOptions + 532 (CFRunLoop.c:3462)
12 GraphicsServices 0x000000022e31c498 GSEventRunModal + 120 (GSEvent.c:2049)
13 UIKitCore 0x00000001930ceba4 -[UIApplication _run] + 792 (UIApplication.m:3902)
14 UIKitCore 0x0000000193077a78 UIApplicationMain + 336 (UIApplication.m:5577)
15 xxxxxxxxxxxxxx 0x00000001000c0134 main + 308 (main.swift:15)
16 dyld 0x000000018a722e28 start + 7116 (dyldMain.cpp:1477)

Is the crash happening when the app is being terminated? Thank you!
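The "deallocated without calling continuation" wording suggests a remote command event being released before any handler reported a status; that is an inference from the message, not a confirmed diagnosis. For reference, a minimal sketch (the controller class and player property are assumptions, not taken from this app) of handlers that return an MPRemoteCommandHandlerStatus on every path:

import AVFoundation
import MediaPlayer

// Every code path returns a status so the event's continuation is always completed.
final class RemoteCommandController {
    private let player = AVPlayer() // stand-in for the app's real player

    func registerHandlers() {
        let center = MPRemoteCommandCenter.shared()

        _ = center.playCommand.addTarget { [weak self] _ in
            guard let self else { return .commandFailed } // still reports a status
            self.player.play()
            return .success
        }

        _ = center.pauseCommand.addTarget { [weak self] _ in
            guard let self else { return .commandFailed }
            self.player.pause()
            return .success
        }
    }
}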
1
1
355
5d
PHImageManager.requestImageDataAndOrientation callback is never called
I occasionally receive reports from users that photo import from the Photos library gets stuck and the progress appears to stop indefinitely. I'm using the following APIs:

func fetchAsset(_ asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.resizeMode = .exact
    options.isSynchronous = false
    options.isNetworkAccessAllowed = true
    options.progressHandler = { (progress, error, stop, info) in
        // 🚨 never called
    }
    let requestId = PHImageManager.default().requestImageDataAndOrientation(
        for: asset,
        options: options
    ) { data, _, _, info in
        // 🚨 never called
    }
}

Due to repeated reports, I added detailed logs inside the callback closures. Based on the logs, the request keeps waiting without any callbacks being invoked — neither the progressHandler nor the completion block of requestImageDataAndOrientation is called. This happens not only with the PHImageManager approach, but also when using PHAsset with PHContentEditingInputRequestOptions — the completion callback is not invoked either.

func fetchAssetByContentEditingInput(_ asset: PHAsset) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    asset.requestContentEditingInput(with: nil) { contentEditingInput, info in
        // 🚨 never called
    }
}

I suspect this is related to iCloud Photos. Here is what I confirmed from affected users:
1. Using the native picker, the iCloud download proceeds normally and the photo can be attached. However, using the PHImageManager-based approach in my app, the same photo cannot be attached.
2. Even after verifying that the photo has been fully downloaded from iCloud (e.g., by trying "Export Unmodified Originals" in the Photos app as described here: https://support.apple.com/en-us/111762, and confirming the iCloud download progress completed), the callback is still not invoked for that asset.

Detailed flow for (1):
• I asked the user to attach the problematic photo (the one where callbacks never fire) using the native photo picker (UIImagePickerController).
• The UI showed "Downloading from iCloud" progress.
• The progress advanced and the photo was attached successfully.
• Then I asked the user to attach the same photo again using my custom photo picker (which uses the PHImageManager APIs mentioned above).
• The progress did not advance.
• No callbacks were invoked.
• The operation waited indefinitely and never completed.

Workaround / current behavior:
• If I ask users to force-quit and relaunch the app and try again, about 6 out of 10 users can attach successfully afterward.
• The remaining ~4 out of 10 users still cannot attach even after relaunching.
• For users who are not fixed immediately after relaunching, the issue seems to resolve on its own after some time.

I've seen similar reports elsewhere, so I'm wondering if Apple is already aware of an internal issue related to this. If there is any known information, guidance, or recommended workaround, I would appreciate it.
I also logged the properties of affected PHAssets (metadata) when the issue occurs, and I can share them below if that helps troubleshooting:
[size=3.91MB] [PHAssetMediaSubtype(rawValue: 528)+DepthEffect | userLibrary | (4284x5712) | adjusted=true]
[size=3.91MB] [PHAssetMediaSubtype(rawValue: 528)+DepthEffect | userLibrary | (4284x5712) | adjusted=true]
[size=2.72MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
[size=2.72MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
[size=2.49MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
[size=2.49MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
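In case it helps others hitting the same hang, here is a minimal sketch (the 30-second deadline and the cancel-then-report behavior are assumptions, not a recommended fix) of cancelling a stuck request so the UI can surface an error or retry instead of waiting forever:

import Photos

func fetchAssetWithTimeout(_ asset: PHAsset,
                           timeout: TimeInterval = 30,
                           completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true

    var finished = false
    let requestID = PHImageManager.default().requestImageDataAndOrientation(
        for: asset, options: options
    ) { data, _, _, _ in
        DispatchQueue.main.async {
            guard !finished else { return }
            finished = true
            completion(data)
        }
    }

    // If neither the progress handler nor the completion ever fires,
    // cancel the request after `timeout` and report failure.
    DispatchQueue.main.asyncAfter(deadline: .now() + timeout) {
        guard !finished else { return }
        finished = true
        PHImageManager.default().cancelImageRequest(requestID)
        completion(nil)
    }
}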
2
0
170
1w
Unable to Capture 24MP Photos
Hello, I'm wondering how to capture 24MP photos. I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality. When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings. What could be the issue?

// Objective-C
// setup
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions = [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

// capturing
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
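One thing that may be worth double-checking (a sketch of my understanding, not a confirmed diagnosis): the 24MP entry has to be selected explicitly and applied to the output while the 24MP-capable format is active, and the output should already be attached to the session when its limit is raised. In Swift, with the names from the post reused as assumptions:

import AVFoundation

// Sketch only: `session`, `videoDevice`, and `photoOutput` mirror the names in the post;
// whether your configuration order matches is an assumption to verify.
func configureFor24MP(session: AVCaptureSession,
                      videoDevice: AVCaptureDevice,
                      photoOutput: AVCapturePhotoOutput) {
    session.beginConfiguration()
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

    photoOutput.maxPhotoQualityPrioritization = .quality

    // Pick the largest supported dimension explicitly rather than relying on array order.
    if let largest = videoDevice.activeFormat.supportedMaxPhotoDimensions
        .max(by: { Int($0.width) * Int($0.height) < Int($1.width) * Int($1.height) }) {
        photoOutput.maxPhotoDimensions = largest
    }
    session.commitConfiguration()
}

func capture24MP(photoOutput: AVCapturePhotoOutput, delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = photoOutput.maxPhotoDimensions
    settings.photoQualityPrioritization = .quality
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}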
2
0
784
1w
LiveKit SDK + VisionPro
I'm developing an app for visionOS and testing it on AVP (visionOS 26), on iOS 17 and 26 devices, and in simulators (visionOS 2.5). The idea of the app is random video calls. For video calls I use the LiveKit SDK; at the moment the SDK version is 2.11.0. Maybe this will help clarify where the problem is:

A user presses the "Start" button, which calls matchingViewModel.startMatching(). After that, matchingViewModel.connectionState changes to .searching. Another user presses "Start" and the same thing happens. Then the API returns information for both users (myInfo, partners, roomId, myLiveKitToken). When a user receives the room information, matchingViewModel.connectionState changes to .connecting. At this point, the connection to LiveKit should happen.

In the MatchingWrapperView file, the handleChangeRoomIdOrPartners method checks whether it should connect to LiveKit (when partner information and a room ID are available) or disconnect from the room (when the partner ends the call). The connectRoom method handles connecting to the LiveKit room, enables the camera, and runs emitHasConnectedToLiveKit() (matchingViewModel.connectionState changes to .connected, and information is sent to the other partner that the user has joined LiveKit).

When testing a visionOS device together with another visionOS device, an iPhone, or the visionOS simulator, most of the time the video from the iPhone is not shown on the visionOS device. Less often, the video from the visionOS device is not shown on the iPhone or in the simulator. When testing iPhone + iPhone + visionOS simulator, everything usually works fine. Occasionally the video doesn't appear, but this happens much less often.

Maybe you know why this issue occurs much more frequently on visionOS? Here is all the code for the core functionality. If you need any additional code, please let me know. RoomModel.swift
2
0
79
2w
iOS React Native: Can two WebRTC stacks (Wazo & Jitsi) share media?
Hi everyone, I'm building a React Native iOS app where I'm integrating Wazo (native WebRTC) and Jitsi (WebView / WebRTC).

Use case:
Wazo is used to maintain a background call session (mainly signaling + audio keep-alive). Jitsi is used in the foreground for video calls.

Problem:
When Jitsi starts, it takes control of the microphone and camera. The Wazo call disconnects after ~5 minutes (likely due to a media / audio session conflict). Even if Wazo audio/video is muted or tracks are disabled, the session still drops.

My questions:
Is it officially supported or recommended to run two WebRTC stacks (Wazo + Jitsi) simultaneously on iOS?
Can Wazo stay connected without active audio/video tracks while Jitsi uses the mic/camera?
Is there a way to release Wazo media streams temporarily (but keep signaling alive) while Jitsi is loading or active?
Are there any AVAudioSession / background mode limitations on iOS that make this impossible by design?
If this is not supported, what is the recommended architecture (single WebRTC pipeline, switching media ownership, etc.)?

Environment:
iOS (React Native)
Wazo SDK (native WebRTC)
Jitsi Meet (WebView)
CallKit + PushKit enabled

Any guidance, documentation, or real-world experience would be greatly appreciated. Thanks in advance 🙏
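On the AVAudioSession question specifically, here is a minimal sketch of configuring the one shared session so that activating a second audio client does not tear down the first; the category, mode, and options are assumptions, and whether the Wazo or Jitsi SDKs tolerate them is untested:

import AVFAudio

// Configure the shared audio session before either stack activates audio.
// .mixWithOthers asks iOS not to interrupt other audio; both SDKs must still
// cooperate on microphone ownership, which this alone cannot guarantee.
func configureSharedAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.mixWithOthers, .allowBluetooth, .defaultToSpeaker])
    try session.setActive(true)
}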
1
0
337
3w
Live streaming issue with RTSP
We have the application 'ADS Smart', a companion application for our ADS Dashcam. We offer a feature that lets users stream the live footage of the dashcam cameras through the app. Currently, we are experiencing a delay of 30+ seconds before the live stream appears, i.e., the first frame of the live footage takes around 30+ seconds to display in the app. We are using the MobileVLCKit library to stream the videos in the app.

The current flow of the code:

1. Flutter triggers the native playback via a method channel. The Dart side calls the iOS method channel <identifier_name>/ts_player with method playTSFromURL, passing: url (e.g. rtsp://.... for live), playerId, viewId (a stable ID used to host native UI), showControls, and an optional localIp.

2. AppDelegate receives the call and prepares networking. Entry point: the AppDelegate.tsChannel handler for "playTSFromURL" in AppDelegate.swift. It resolves the Wi‑Fi interface and local IP if possible: it sets VLC_SOURCE_ADDRESS to the Wi‑Fi IP (when available) to prefer Wi‑Fi for the stream, uses NWPathMonitor and direct interface inspection to find the Wi‑Fi interface (e.g., en0) and IP, and kicks off best-effort route priming to the dashcam IP/ports (non-blocking); see establishWiFiRoutePriority.

3. AppDelegate chooses the right player implementation. createPlayerForURL(_:) decides: RTSP (rtsp://...) → use the VLCKit-backed player (class TSStreamPlayer, which provides a VLC-backed video player for iOS, handling playback of Transport Stream (TS) URLs with strict main-thread UI updates, view safety, and stream management, using MobileVLCKit). .ts files → use the VLCKit-based player for playing already recorded videos in the app. If the selected player supports extras (e.g. TSStreamPlayerExtras), it sets the local IP (if resolved) and the Wi‑Fi interface name.

4. AppDelegate creates the native platform view container and overlay. platformView(for:parent:showControls:) creates a container UIView attached to the Flutter root view and adds a dedicated child videoHost[viewId] — the host UIView for rendering video. If showControls == true, it adds TSPlayerControlsOverlay over the video and wires overlay callbacks back to Flutter via controlChannel (/player_controls). If showControls == false, it adds a minimal back button and wires it to onGalleryBack.

5. The player starts playback inside the host view. It calls player.playTSFromURL(urlString, in: host) { success, error in ... } on the main thread. For RTSP/TS streams this is handled by TSStreamPlayer (VLCKit).

6. Success/failure is reported back to Flutter. The completion closure invoked in step 5 returns true on first real playback or an error message on failure. The method channel result responds: true → Flutter knows playback started; FlutterError → Flutter can show an error.

7. Stopping and cleanup: "stop" on tsChannel stops and disposes the player(s). "removePlatformView" removes the overlay, back button, host, and container, and disposes any remaining players.

I am attaching the logs of the app while running. The actual issue is that when the iOS device is connected to the dashcam's Wi‑Fi, iOS uses mobile data for the app's live-streaming-related traffic even though Wi‑Fi is the intended communication channel. The iOS device takes approximately 30 seconds to display the first frame of live footage in the app. Despite being connected to the dashcam's Wi‑Fi, the iOS device only settles on the Wi‑Fi interface (en0) after multiple attempts, causing the live footage to appear in the app only after this delay.

So, how can we set up the configuration so that the live footage from the dashcam cameras is displayed within 2 to 3 seconds on the iOS device? ios.txt
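Not a confirmed fix, but for any connections the app itself opens to the dashcam (MobileVLCKit opens its own sockets, which this does not affect), here is a sketch of pinning them to the Wi‑Fi interface with Network.framework so they never wait on a cellular attempt; the host and port are placeholders:

import Network

// Pin a connection to Wi‑Fi so iOS does not try cellular first.
func makeWiFiOnlyConnection(host: String, port: UInt16) -> NWConnection {
    let params = NWParameters.tcp
    params.requiredInterfaceType = .wifi          // never use another interface
    params.prohibitedInterfaceTypes = [.cellular] // belt and suspenders

    let connection = NWConnection(
        host: NWEndpoint.Host(host),
        port: NWEndpoint.Port(rawValue: port)!,
        using: params
    )
    connection.stateUpdateHandler = { state in
        print("dashcam connection state: \(state)")
    }
    connection.start(queue: .main)
    return connection
}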
0
0
208
4w
Is there an error in SpatialAudioCLI?
Hi, everyone. I downloaded the source code EditingSpatialAudioWithAnAudioMix.zip from https://developer.apple.com/documentation/Cinematic/editing-spatial-audio-with-an-audio-mix. When I ran one of the actions, named "process", on the command line, the program crashed. From the source code, I found that the value of componentType is set to kAudioUnitType_FormatConverter:

// The actual `AudioUnit`.
public var auAudioMix = AVAudioUnitEffect()

init() {
    // Generate a component description for the audio unit.
    let componentDescription = AudioComponentDescription(
        componentType: kAudioUnitType_FormatConverter,
        componentSubType: kAudioUnitSubType_AUAudioMix,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)
    auAudioMix = AVAudioUnitEffect(audioComponentDescription: componentDescription)
}

But according to the documentation at https://developer.apple.com/documentation/avfaudio/avaudiouniteffect/init(audiocomponentdescription:), it seems that componentType cannot be set to kAudioUnitType_FormatConverter. Has anyone else encountered this problem?
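If it is the AVAudioUnitEffect initializer that is asserting, one alternative worth trying (a sketch only; I have not verified it against this sample project) is instantiating the unit through AVAudioUnit.instantiate, which accepts an arbitrary component description:

import AVFAudio
import AudioToolbox

// Instantiate the AUAudioMix component without going through AVAudioUnitEffect,
// whose documented component types do not appear to include kAudioUnitType_FormatConverter.
let componentDescription = AudioComponentDescription(
    componentType: kAudioUnitType_FormatConverter,
    componentSubType: kAudioUnitSubType_AUAudioMix,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0)

AVAudioUnit.instantiate(with: componentDescription, options: []) { audioUnit, error in
    guard let audioUnit else {
        print("Failed to instantiate AUAudioMix: \(String(describing: error))")
        return
    }
    // Attach `audioUnit` to an AVAudioEngine here in place of the AVAudioUnitEffect.
    print("Instantiated \(audioUnit.name)")
}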
1
0
159
Nov ’25
PhotosPickerItem.itemIdentifier always nil
I'm using the SwiftUI Photos picker to select videos from the user's Photos library and then opening the video using the PhotosPickerItem. I'm looking for a way to allow the user to open the same video on their other devices, as the app uses SwiftData and CloudKit to provide access to a recently watched list of videos.

The URL from the PhotosPickerItem appears to be device specific, so I was looking to see if I could use the itemIdentifier, and then the init that takes an itemIdentifier, to create the PhotosPickerItem on the other devices. The itemIdentifier, however, is always nil and so can't be used in this way.

Is there an alternative approach whereby the user can open a video using a PhotosPickerItem and that item would be viewable on their other devices via an item identifier or a URL that is device agnostic? This approach should also not involve copying the video into other storage, as that would simply expand the use of the user's iCloud storage, providing a less than ideal user experience. If the user has opened the video from their Photos library, there should be a way to allow the same user (e.g. the same Apple ID) to use the same app on another device to open the video again.
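On the nil itemIdentifier specifically: as far as I understand (worth verifying against the PhotosUI documentation), the identifier is only populated when the picker is created with an explicit photo library, which also requires library authorization. A minimal sketch of that initializer:

import SwiftUI
import PhotosUI

struct VideoPickerView: View {
    @State private var selection: PhotosPickerItem?

    var body: some View {
        // Passing photoLibrary: .shared() is what allows itemIdentifier to be non-nil;
        // without it the picker runs in the permission-free mode with no identifiers.
        PhotosPicker(selection: $selection,
                     matching: .videos,
                     photoLibrary: .shared()) {
            Text("Choose a video")
        }
        .onChange(of: selection) { _, newItem in
            print("itemIdentifier:", newItem?.itemIdentifier ?? "nil")
        }
    }
}

Whether that identifier is stable across the user's devices is a separate question; it identifies the asset in the local library, so it may still not solve the cross-device case on its own.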
1
0
412
Nov ’25
FaceTime Screen-Share Audio and Video Experience
FaceTime's screen-share audio balance is insanely absurd right now. Whenever I share media, the system audio that gets sent through FaceTime is a tiny whisper even at full volume (or even when connected to my speaker or headphones). The moment anyone on the call makes any noise at all, the shared audio ducks so hard it disappears, while the voice (or rustling or air-conditioning noise) spikes to painful levels. It's impossible to watch or listen to anything together.

Also, the feature where FaceTime would shrink to a square during screen sharing has been completely removed. That was a good feature and I'm really confused why it's gone. Now the FaceTime window stays as a long rectangle that covers part of the content I'm trying to share (unless I use full-screen tiling, but then I can't pull up any other windows during the call) and can't be made smaller than about a third of the screen. You can't resize the window or adjust its dimensions, so it ends up blocking the actual media you're trying to watch.

Here are some feature requests/fixes that would greatly improve the FaceTime screen-share experience:
• An option to adjust the shared media volume independently of call audio.
• A way to disable or tone down the extreme automatic audio ducking while screen sharing.
• Reintroduce the minimized "floating square" mode, or allow full manual resizing and repositioning of the FaceTime window during screen-share sessions.

Overall, this setup makes FaceTime screen sharing basically unusable. The audio balance is so inconsistent that it's easier to switch to Zoom or Google Meet, which both handle shared sound correctly and let you move the call window out of the way. Until these issues are fixed, there's no practical reason to use FaceTime for shared viewing at all.
1
0
263
Nov ’25
PHAsset asset.mediaSubtypes.contains(.photoHDR) always returns false
We are working on HDR images and found this problem: asset.mediaSubtypes.contains(.photoHDR) always returns false.

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    guard !results.isEmpty else { return }
    let fetchResult = PHAsset.fetchAssets(withLocalIdentifiers: results.map({ $0.assetIdentifier! }), options: nil)
    var selectedAssets: [PHAsset] = []
    fetchResult.enumerateObjects { asset, _, _ in
        selectedAssets.append(asset)
    }
    for i in selectedAssets.enumerated() {
        let asset: PHAsset = i.element
        print("\(i.offset): ")
        print(PHAssetMediaSubtype.photoHDR.rawValue)
        print(asset.mediaSubtypes)
        print("photoHDR \(PHAssetMediaSubtype.photoHDR.rawValue): \(asset.mediaSubtypes.contains(.photoHDR))")
        print("photoLive \(PHAssetMediaSubtype.photoLive.rawValue): \(asset.mediaSubtypes.contains(.photoLive))")
        print("photoPanorama \(PHAssetMediaSubtype.photoPanorama.rawValue): \(asset.mediaSubtypes.contains(.photoPanorama))")
        print("photoScreenshot \(PHAssetMediaSubtype.photoScreenshot.rawValue): \(asset.mediaSubtypes.contains(.photoScreenshot))")
        print("videoHighFrameRate \(PHAssetMediaSubtype.videoHighFrameRate.rawValue): \(asset.mediaSubtypes.contains(.videoHighFrameRate))")
        print("videoScreenRecording \(PHAssetMediaSubtype.videoScreenRecording.rawValue): \(asset.mediaSubtypes.contains(.videoScreenRecording))")
    }
}
1
0
414
Nov ’25
Recording Constant Frame Rate at 4K 60 fps mode
Hi, we have created an app which allows recording 4K 60 fps videos in the app. We have noted that sometimes the recording switches to 20 fps (the value 20 is constant) even though the resolution setting is 4K 60 fps. We are using AVCaptureDevice to invoke the camera. Has anyone experienced this problem before? What is unique about 20 fps? Why does it drop from 60 fps to 20 fps and not to other values?
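For comparison, here is a sketch of explicitly pinning the device to a constant 60 fps. One common cause of a lower effective rate is auto exposure lengthening the frame duration when the minimum isn't locked, though that is an assumption rather than a diagnosis of the specific 20 fps value:

import AVFoundation

// Lock the capture device to a constant 60 fps so auto exposure cannot lower the rate.
// `device` is assumed to already be on a 4K format whose frame-rate ranges include 60 fps.
func lockTo60fps(device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    let sixty = CMTime(value: 1, timescale: 60)
    device.activeVideoMinFrameDuration = sixty
    device.activeVideoMaxFrameDuration = sixty
}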
1
0
241
Oct ’25
Windows Apple Music: how to enumerate the local library or export it? Is Library.musicdb documented / API available?
Environment
Windows 11 [edition/build]: [e.g., 23H2, 22631.x]
Apple Music for Windows version: [e.g., 1.x.x from Microsoft Store]
Library folder: C:\Users\<user>\Music\Apple Music\Apple Music Library.musiclibrary

Summary
I need a supported way to programmatically enumerate the local Apple Music library on Windows (track file paths, playlists, etc.) for reconciliation with the on-disk Media folder. On macOS this used to be straightforward via scripting/export; on Windows I can't find an equivalent.

What I'm seeing in the library bundle
Library.musicdb → not SQLite. First 4 bytes: 68 66 6D 61 ("hfma").
Library Preferences.musicdb → also starts with "hfma".
artwork.sqlite → SQLite, but appears to be an artwork cache only (no track file paths).
Extras.itdb → has a SQLite format 3 header but (from a quick scan) no track locations.
Genius.itdb → not a SQLite database on this machine.

What I've tried
Attempted to open Library.musicdb with SQLite providers → error: "file is not a database."
Binary/string scans (ASCII, UTF-16LE/BE, null-stripped) of Library.musicdb → did not reveal file paths or obvious plist/XML/JSON blobs.
The Windows Apple Music UI doesn't appear to expose "Export Library / Export Playlist" like legacy iTunes did, and I can't find a public API for local library enumeration on Windows.

What I'm trying to accomplish
Read local track entries (absolute or relative paths), detect broken links, and reconcile against the Media folder. A read-only solution is fine; I do not need to modify the library.

Questions for Apple
1. Is the Library.musicdb file format documented anywhere, or is there a supported SDK/API to enumerate the local library on Windows?
2. Is there a supported export mechanism (CLI, UI, or API) in Windows Apple Music to dump the local library and/or playlists (XML/CSV/JSON)?
3. Is there a Windows-specific equivalent to the old iTunes COM automation, or any MusicKit surface that can return local library items (not the streaming catalog) and their file locations?
4. If none of the above exist today, is there a recommended workaround from Apple for library reconciliation on Windows (e.g., documented support for importing M3U/M3U8 to rebuild the local library from disk)?
5. Are there any plans/timeline for adding Windows feature parity with iTunes/Music on macOS for exporting or scripting the local library?

Why this matters
For large personal libraries, users occasionally end up with orphaned files on disk or broken links in the app. Without an export or API, it's difficult to audit and fix at scale on Windows.

Reference details (in case it helps triage)
Library.musicdb header bytes: 68-66-6D-61-A0-00-00-00-10-26-34-00-15-00-01-00 (ASCII shows hfma…).
artwork.sqlite is readable but doesn't contain track file paths (appears limited to artwork).
I can supply a minimal repro tool and logs if that's helpful.

Feature request (if no current API)
Add an official Export Library/Playlists action to Windows Apple Music, or provide a read-only Windows API (or schema doc) that surfaces track file locations and playlist membership from the local library.

Thanks in advance for any guidance or pointers to docs I might have missed.
0
0
226
Sep ’25
ICDeviceBrowser authorization requests always return ICAuthorizationStatusNotDetermined on iOS 26.1 beta
Area
ImageCaptureCore / ICDeviceBrowser

Description
On iOS 26.1 beta, calling requestControlAuthorization() or requestContentsAuthorization() always returns .notDetermined and never transitions to .authorized or .denied. This prevents apps from properly accessing device control or contents authorization. The issue occurs regardless of device state or prior requests.

Steps to Reproduce
1. Create and start an ICDeviceBrowser instance.
2. Call requestControlAuthorization() or requestContentsAuthorization().
3. Inspect the returned ICAuthorizationStatus.

Expected Result
• The system should prompt the user if necessary.
• A final status of either .authorized or .denied should be returned.

Actual Result
• The completion handler always reports .notDetermined.
• No user prompt appears and the status does not change.

Version / Build
• iOS 26.1 beta
• Xcode

Hardware
• [iPhone 15 Pro, iPad Pro (M2)]

Impact
This regression blocks development and testing of features relying on ImageCaptureCore. Applications depending on device browsing and content access cannot proceed, which significantly affects workflows involving external device integration.

Notes
This appears to be a regression compared to earlier iOS releases.
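A minimal repro sketch of the steps above; the completion-handler spellings are my assumption of how the methods named in the post are called, and the delegate stubs exist only to satisfy the browser:

import ImageCaptureCore

final class BrowserAuthProbe: NSObject, ICDeviceBrowserDelegate {
    private let browser = ICDeviceBrowser()

    func run() {
        browser.delegate = self
        browser.start()

        browser.requestControlAuthorization { status in
            print("control authorization:", status)   // expected: .authorized or .denied
        }
        browser.requestContentsAuthorization { status in
            print("contents authorization:", status)  // observed: stays .notDetermined
        }
    }

    // ICDeviceBrowserDelegate requirements (unused in this probe).
    func deviceBrowser(_ browser: ICDeviceBrowser, didAdd device: ICDevice, moreComing: Bool) {}
    func deviceBrowser(_ browser: ICDeviceBrowser, didRemove device: ICDevice, moreGoing: Bool) {}
}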
1
0
250
Sep ’25
On iOS 26, in our video playback app (using AVPlayer), the sound and video are out of sync when resuming playback after seeking.
Our app plays TS files on an iPhone. The app fragments the TS files, creates an M3U8 playlist, converts them to HLS (HTTP Live Streaming), and then uses AVPlayer to play the video content. On a device running iOS 26, after starting playback and seeking, resuming playback causes the video and audio to be out of sync (by about 2-3 seconds depending on the situation). This also occurs on iPadOS/macOS 26. This issue was not observed prior to iOS 18.

We are trying to fix this issue on the app side, but we have the following questions:
1. The behavior of AVPlayer is different between iOS 26 and previous versions. Is there a change that could explain this? Or is it a bug?
2. We tried pausing before seeking, but it didn't seem to have any effect. Are there any APIs or workarounds that can improve this?

We would appreciate it if you could point us to any other helpful documents or URLs.
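Not a confirmed workaround, but one variable worth isolating is the seek tolerance; here is a sketch of a frame-accurate seek that only resumes playback once the seek reports completion:

import AVFoundation

// Seek with zero tolerance and resume only after the seek finishes,
// to rule out coarse seeking as the source of the A/V offset.
func accurateSeek(player: AVPlayer, to seconds: Double) {
    player.pause()
    let target = CMTime(seconds: seconds, preferredTimescale: 600)
    player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
        guard finished else { return }
        player.play()
    }
}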
0
0
327
Sep ’25
MPMediaItemPropertyArtwork crashes on Swift 6
Hey all! In my personal quest to make future-proof apps moving to Swift 6, one of my apps has a problem when setting an artwork image in MPNowPlayingInfoCenter. Here's what I'm using to set the metadata:

func setMetadata(title: String? = nil, artist: String? = nil, artwork: String? = nil) async throws {
    let defaultArtwork = UIImage(named: "logo")!

    var nowPlayingInfo = [
        MPMediaItemPropertyTitle: title ?? "***",
        MPMediaItemPropertyArtist: artist ?? "***",
        MPMediaItemPropertyArtwork: MPMediaItemArtwork(boundsSize: defaultArtwork.size) { _ in
            defaultArtwork
        }
    ] as [String: Any]

    if let artwork = artwork {
        guard let url = URL(string: artwork) else { return }
        let (data, response) = try await URLSession.shared.data(from: url)
        guard (response as? HTTPURLResponse)?.statusCode == 200 else { return }
        guard let image = UIImage(data: data) else { return }
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in
            image
        }
    }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}

The app crashes when hitting:

MPMediaItemArtwork(boundsSize: defaultArtwork.size) { _ in defaultArtwork }

or

nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in image }

Commenting out these two makes the app work again. Again, no clue why. Thanks in advance.
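If this is the Swift 6 runtime actor-isolation check firing (a guess from the symptoms; the post does not confirm it), one workaround sometimes suggested is to keep the artwork request handler out of main-actor isolation, since MPMediaItemArtwork may invoke it off the main thread. A sketch of that idea:

import MediaPlayer
import UIKit

// Marking the request handler @Sendable prevents it from being inferred as
// @MainActor-isolated inside a main-actor async function; the image is captured by value.
func artwork(for image: UIImage) -> MPMediaItemArtwork {
    MPMediaItemArtwork(boundsSize: image.size) { @Sendable _ in
        image
    }
}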
6
0
3.1k
Sep ’25