Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics
Posts under Media Technologies topic

Each post below is followed by its reply count, boosts, views, and latest activity.

Under certain conditions, using CallKit does not automatically enable the microphone.
Issue: Under certain conditions, using CallKit does not automatically enable the microphone.

Steps to Reproduce:
1. Start an outgoing call, then have the user manually mute the audio.
2. Receive a native incoming call, end the current call, then answer the new incoming call. (This order is important.)
3. End the incoming call.
4. Start another outgoing call and observe the microphone; do not manually mute or unmute.

Actual Behavior: The audio icon indicates that audio is unmuted, but the microphone remains off, and the small yellow dot in the top status bar (which represents the microphone) does not appear.

Expected Behavior: The microphone should be on, consistent with the audio icon, and the small yellow dot should appear in the top status bar.

Devices: iPhone 16 Pro & iPhone 15 Pro, iOS 18.0+

Can it be reproduced using Speakerbox (the CallKit demo)? Yes.
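For anyone comparing against their own setup: below is a minimal sketch of how a CallKit app typically routes mute changes through CXSetMutedCallAction so the app's capture state follows what the system reports. The audio-engine property in the comment is hypothetical; this is not Apple's reference implementation, just a plausible shape of one.

import Foundation
import CallKit

final class CallManager: NSObject, CXProviderDelegate {
    private let callController = CXCallController()

    // Ask CallKit to change the mute state; the system then calls back into the delegate below.
    func setMuted(_ muted: Bool, for callUUID: UUID) {
        let action = CXSetMutedCallAction(call: callUUID, muted: muted)
        callController.requestTransaction(with: [action]) { error in
            if let error { print("Mute request failed: \(error)") }
        }
    }

    // CallKit reports the effective mute state here; the app's audio capture
    // should follow action.isMuted rather than its own cached flag.
    func provider(_ provider: CXProvider, perform action: CXSetMutedCallAction) {
        // e.g. audioEngine.isInputMuted = action.isMuted  (hypothetical property)
        action.fulfill()
    }

    func providerDidReset(_ provider: CXProvider) {
        // Tear down audio; CallKit has invalidated all calls.
    }
}

If the repro above also occurs with this pattern in Speakerbox, that points at the system's microphone handling rather than the app's own mute bookkeeping.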
2
1
448
Mar ’25
MusicKit for Android reports an error when playing stations
While using the MusicKit SDK for Android 1.1.2, I found that MediaContainerType defines only three types: NONE = 0, ALBUM = 1, PLAYLIST = 2; there is no RADIO_STATION type. However, the documentation for com.apple.android.music.playback.model states that the RADIO_STATION type is supported. Because of this, passing in a stations ID causes an error: MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException. How can I resolve this?
1
1
63
May ’25
Uploading PhotoKit Resources in the Background
The introduction of PHBackgroundResourceUploadExtension is a welcome addition in iOS 26.1. I wonder, however, how to attach a debugger and actually get the system to call the extension's process() method. I tried running the extension from both the Photos app and my main app (for testing), but when I take a photo or add photos to the library (saving), the process() method never gets called. Any hints on how to debug PHBackgroundResourceUploadExtension during development would be appreciated.
0
1
155
Nov ’25
Accessing External Timecode from Blackmagic ProDock in Custom App
Hi everyone, I’m exploring using the iPhone 17 Pro with the Blackmagic ProDock in a custom capture app. The genlock functionality seems accessible via AVExternalSyncDevice and related APIs, which is great. I’m specifically curious about external timecode coming in from the ProDock:
• Is there a public way to access the timecode feed in a custom app via AVFoundation or another Apple API?
• If so, what is the recommended approach to read or apply that timecode during capture?
• Are there any current limitations or entitlements required to access timecode from ProDock in a third-party app?
I’m excited to start integrating synchronized capture in my app, and any guidance or sample patterns would be greatly appreciated. Thanks in advance! — [Artem]
2
0
267
Oct ’25
APMP & Photography?
Hi, I'm a fan of the gallery on Vision Pro, which has video as well as still photography, but I'm wondering if Apple has considered adding the projected media tags to HEIC so that we can go that next step from Spatial photos to Immersive photos. I have a device that can give me 12K x 6K fisheye images in HDR, but it can't do it at a frame rate or resolution that's good enough for video, so I want to cut my losses and show off immersive photos instead. Is there something Apple is already working on for APMP stills, or should I create my own app that reads metadata inside a HEIC and interprets it in a similar way to what the "ProjectedMediaConversion" demo does for video? It would be great to have 180VR photos, which could show as Spatial in a gallery view, but going immersive would half-surround you instead of floating in the blurred view. I think that would be a pretty amazing effect.
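In case it helps while APMP stills remain an open question, here is a rough sketch of reading a HEIC's metadata with ImageIO, which is roughly where projection hints would have to be inferred from today. The idea of custom projection keys is an assumption on my part; the ProjectedMediaConversion sample would be the thing to mirror.

import Foundation
import ImageIO

// Returns the image properties of a HEIC so projection-related metadata
// (EXIF, maker notes, custom keys) can be inspected.
func imageProperties(at url: URL) -> [CFString: Any]? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] else {
        return nil
    }
    return props
}

// Hypothetical usage:
// if let props = imageProperties(at: URL(fileURLWithPath: "/path/to/immersive.heic")) {
//     print(props[kCGImagePropertyExifDictionary] ?? "no EXIF")
// }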
2
0
299
Oct ’25
Indicate Packet Loss With AVAudioConverter for OPUS Decoding
I'm using an AVAudioConverter object to decode an OPUS stream for VoIP. The decoding itself works well; however, whenever the stream stalls (no more audio packets are available to decode because of network instability), this is audible as crackling or an abrupt stop in the decoded audio. OPUS can mitigate this by indicating packet loss: you pass a null data pointer to the C library's int opus_decode_float(OpusDecoder *st, const unsigned char *data, opus_int32 len, float *pcm, int frame_size, int decode_fec), see https://opus-codec.org/docs/opus_api-1.2/group__opus__decoder.html#ga9c554b8c0214e24733a299fe53bb3bd2. However, with AVAudioConverter in Swift I'm constructing an AVAudioCompressedBuffer like so:

let compressedBuffer = AVAudioCompressedBuffer(
    format: VoiceEncoder.Constants.networkFormat,
    packetCapacity: 1,
    maximumPacketSize: data.count
)
compressedBuffer.byteLength = UInt32(data.count)
compressedBuffer.packetCount = 1
compressedBuffer.packetDescriptions!.pointee.mDataByteSize = UInt32(data.count)
data.copyBytes(
    to: compressedBuffer.data.assumingMemoryBound(to: UInt8.self),
    count: data.count
)

where data: Data contains the raw OPUS frame to be decoded. How can I indicate data loss in this context and cause the AVAudioConverter to output PCM data whenever no more input data is available?

More context: I'm specifying the audio format like this:

static let frameSize: UInt32 = 960
static let sampleRate: Float64 = 48000.0
static var networkFormatStreamDescription = AudioStreamBasicDescription(
    mSampleRate: sampleRate,
    mFormatID: kAudioFormatOpus,
    mFormatFlags: 0,
    mBytesPerPacket: 0,
    mFramesPerPacket: frameSize,
    mBytesPerFrame: 0,
    mChannelsPerFrame: 1,
    mBitsPerChannel: 0,
    mReserved: 0
)
static let networkFormat = AVAudioFormat(
    streamDescription: &networkFormatStreamDescription
)!

I've tried 1) setting byteLength and packetCount to zero and 2) returning nil while setting .haveData in the AVAudioConverterInputBlock, with no success.
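For context, here is a minimal sketch of the decode call described above (frame size and format taken from the constants shown); the open question is precisely that neither .noDataNow nor an empty buffer appears to trigger Opus packet-loss concealment through this API.

import AVFAudio

func decode(_ compressedBuffer: AVAudioCompressedBuffer,
            with converter: AVAudioConverter,
            into pcmFormat: AVAudioFormat) -> AVAudioPCMBuffer? {
    guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: pcmFormat, frameCapacity: 960) else {
        return nil
    }
    var consumed = false
    var error: NSError?
    let status = converter.convert(to: pcmBuffer, error: &error) { _, outStatus in
        if consumed {
            // Reports that no input is available right now; this stalls the
            // converter rather than asking the decoder to conceal a lost packet.
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return compressedBuffer
    }
    return status == .haveData ? pcmBuffer : nil
}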
1
1
874
May ’25
Process to request the restricted entitlement behind “DJ with Apple Music” (tempo control / time-stretch on Apple Music streams)?
Hi, I’m an iOS developer building an app with a use case that needs advanced playback on Apple Music subscription streams, specifically:
• Real-time tempo change (BPM) during playback — i.e., time-stretch with key-lock, not just crossfade.
• Beat-matched transitions between tracks.
From what I can tell, this capability seems to exist only for approved partners and isn’t available through public MusicKit.
Question: What’s the official request path to be evaluated for that restricted partner entitlement (application form, questionnaire, NDA, or internal team/BD contact)? If the entitlement identifier is internal, how can I get my account routed to the right Apple Music team?
For reference, publicly announced partners include Algoriddim djay, Serato DJ Pro, rekordbox (AlphaTheta), and Engine DJ, all of which appear to implement mixing features that imply advanced playback (tempo/beat-matching) on Apple Music content. I’d prefer not to share product details publicly for the moment and can provide specifics privately if needed. Thanks in advance!
0
1
228
Oct ’25
Why Does AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps Occur on iPhone?
Hi everyone, We're encountering an unexpected issue with our iPhone-only camera app: 👉 TimeMark - Photo Proof, https://apps.apple.com/us/app/timemark-photo-proof/id6446071834

Problem Description: Our app uses a full-screen camera view via AVCaptureSession. In some cases reported by users, the camera fails immediately upon app launch, and we receive this interruption reason: AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps. According to the Apple documentation (https://developer.apple.com/documentation/avfoundation/avcapturesession/interruptionreason/videodevicenotavailablewithmultipleforegroundapps?language=objc), this interruption typically occurs when the app is running in a multi-app layout such as Slide Over, Split View, or Picture in Picture — all of which are iPad-only features. However, this issue is being reported on iPhones, and our app does not support iPad at all. Also noted in the documentation: "Given your present AVCaptureSession configuration, the session may only be run if your app occupies the full screen."

Additional Context:
• The issue occurs immediately on app launch, before the user can interact with the camera.
• We don’t enable multitaskingCameraAccessEnabled.
• We are 100% sure this is happening on iPhone, not iPad.
• It’s hard to reproduce; users report it happening sporadically.
• Locally, we tried playing Picture-in-Picture videos (e.g., Safari/YouTube) before launching our app, but we could not reproduce the issue.

Questions:
1. Why is this interruption reason occurring on iPhone, which doesn’t officially support Slide Over or Split View?
2. Could this be caused by some system-level multitasking or resource contention (e.g., Picture in Picture from FaceTime or Safari)?
3. Would enabling multitaskingCameraAccessEnabled help prevent this issue on iPhone, even though it's designed for iPad?
4. Enabling multitaskingCameraAccessEnabled seems to require enabling UIBackgroundModes → voip. Would adding this background mode cause any App Store review risk or rejection if our app doesn't actually use VoIP functionality?

Any help, insight, or suggestions would be greatly appreciated. Thanks in advance!
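For narrowing this down in the field, here is a small sketch of logging the interruption reason when it fires, so sporadic user reports carry the exact value (standard AVFoundation notification; nothing app-specific is assumed):

import AVFoundation

func observeInterruptions(for session: AVCaptureSession) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVCaptureSessionWasInterrupted,
        object: session,
        queue: .main
    ) { notification in
        if let value = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
           let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
            // Forward `reason` to your logging/analytics so field reports include it.
            print("Capture session interrupted: \(reason)")
        }
    }
}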
3
0
750
Oct ’25
iOS 17 camera capture assertions and issues
Hello, Starting in iOS 17, our application started having issues publishing to our video session. More specifically, the video capture seems to be broken in some, but not all, sessions. What's troubling is that it fails consistently every 4 sessions. It also fails silently, without reporting any problems to the app. We only notice that no frames are being rendered or sent to the remote devices. Here's what shows up in the console:

<<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0)
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:253) - (err=-16453)

Is anyone else seeing this? Any idea what could be the cause? Our sessions work perfectly on iOS 16 and below. Thanks
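Since the failure is silent, it may be worth checking whether the session is posting a runtime error that isn't being observed. A minimal sketch follows (standard notification; not a confirmed explanation for the FigCaptureSourceRemote asserts):

import AVFoundation

func observeRuntimeErrors(for session: AVCaptureSession) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVCaptureSessionRuntimeError,
        object: session,
        queue: .main
    ) { notification in
        let error = notification.userInfo?[AVCaptureSessionErrorKey] as? NSError
        print("Capture session runtime error: \(String(describing: error))")
    }
}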
3
1
1.3k
Oct ’25
After iOS 18.5, the AVFoundation AVCaptureSessionInterruptionReason.videoDeviceNotAvailableWithMultipleForegroundApps error has caused a significant increase in camera black screen issues.
Issue: After the iOS 18.5 release, our app is experiencing a significant increase in AVCaptureSessionInterruptionReason.videoDeviceNotAvailableWithMultipleForegroundApps errors.

Details: Our camera-related code has not been updated recently. However, we've observed that the error rate has significantly increased starting from May 2025, rising from approximately 0.02% (2 in 10,000 users) to 0.1% (1 in 1,000 users). This represents a 5x increase in error occurrence. The frequency has increased noticeably since iOS 18.5. This is affecting our app's camera functionality and user experience.

Questions:
1. Are there any known changes in iOS 18.5 regarding camera access management?
2. What are the recommended best practices to handle this interruption reason?
3. Are there any API changes we should be aware of?

Best, Shay
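On the best-practices question, the usual pattern (not specific to iOS 18.5, and only a sketch under the assumption that the app runs its capture session on a dedicated queue) is to pause the UI when the interruption arrives and resume when the system reports that it ended:

import AVFoundation

final class CaptureInterruptionHandler {
    private let session: AVCaptureSession
    private let sessionQueue: DispatchQueue
    private var token: NSObjectProtocol?

    init(session: AVCaptureSession, sessionQueue: DispatchQueue) {
        self.session = session
        self.sessionQueue = sessionQueue
        token = NotificationCenter.default.addObserver(
            forName: .AVCaptureSessionInterruptionEnded,
            object: session,
            queue: .main
        ) { [weak self] _ in
            // The camera became available again; resume on the session queue.
            self?.sessionQueue.async {
                if let session = self?.session, !session.isRunning {
                    session.startRunning()
                }
            }
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}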
2
1
413
Oct ’25
CoreAudio server plugin gaining write access with SystemConfiguration.framework functions
Hi, our CoreAudio server plugin utilizes SystemConfiguration.framework to store and restore specific shared, system-wide settings. While our application can authenticate to use SystemConfiguration.framework and gain write access to the shared configuration settings, the CoreAudio server plugin obviously can't have any user interaction and therefore does not authenticate. Is it possible to authenticate the CoreAudio server plugin to gain write permissions? Are there any entitlements or other means that would allow this? Thanks!
2
0
108
Apr ’25
Audio session activation occasionally fails from CarPlay
I'm working on adding CarPlay support to an audio app and am running into an issue. Occasionally, when a user opens the app from CarPlay while the main app scene is either not connected or is currently in the background, I will receive an error when attempting to activate the audio session. The code below mimics my setup:

do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error) // NSOSStatusErrorDomain - 560557684: Session activation failed
}

That error code maps to AVAudioSession.ErrorCode.cannotInterruptOthers. Once in this state, all subsequent attempts to play different pieces of content will fail. However, things will start working normally if the user opens the app on their phone and tries again from CarPlay (while the app is in the foreground on their phone). I'm not sure why it would behave this way and want to note that I do have the audio background mode capability enabled. Has anyone else encountered this? Are there any workarounds or changes I could make to prevent this from happening?
0
1
161
Apr ’25
Question about PT Framework channel tone behaviour
I've been wondering whether there is a way to modify or even disable the tones indicating channel states. The behaviour regarding tones seems like a black box with little documentation. During migration to Apple's PT Framework we've noticed that there are a few scenarios where a tone is played that doesn't match certain certifications. For example, moving from one channel to another produces a tone that would fail a test case. I understand the reasoning fully, as it marks that the channel is ready to transmit or receive, but this doesn't mirror the behaviour of TETRA, which is what's wanted in this case. I'm also wondering whether there is any way to directly communicate feedback regarding the PT Framework.
3
0
367
Oct ’25
Terrible performance when using MusicKit's Artwork with UIKit
I'm building a UIKit app that reads the user's Apple Music library and displays it. MusicKit has the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI, I cannot use the ArtworkImage view that is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to load it. My setup is really simple:

extension MusicKit.Song {
    func imageURL(for cgSize: CGSize) -> URL? {
        return artwork?.url(
            width: Int(cgSize.width),
            height: Int(cgSize.height)
        )
    }

    func localImage(for cgSize: CGSize) -> UIImage? {
        guard let url = imageURL(for: cgSize),
              url.scheme == "musicKit",
              let data = try? Data(contentsOf: url) else {
            return nil
        }
        return .init(data: data)
    }
}

Now, every time I access the .artwork property (so a lot of times) the main thread gets blocked and the console output gets bombarded with messages like these:

2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp' error = 1 (Operation not permitted)

My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but in my opinion the backgroundColor property should not be a stored property if that's the case. I am not planning to use it anywhere, and if it is stored it should instead be a computed async property so it doesn't block the caller. I know I can move the call to a background thread, and that fixes the blocked main thread, but the loading times for each artwork are still terribly slow, and that impacts the UX. SwiftUI's ArtworkImage loads the artworks much quicker and without the errors, so there must be a better way to do it.
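As a stopgap for the blocked main thread, here is a sketch of the background load mentioned above; it relies on the imageURL(for:) helper from the post and does not make the underlying mediaartworkd lookups any faster:

import UIKit
import MusicKit

extension MusicKit.Song {
    // Loads the artwork data off the main thread, then builds the UIImage.
    func loadArtworkImage(for size: CGSize) async -> UIImage? {
        guard let url = imageURL(for: size) else { return nil }
        let data = await Task.detached(priority: .userInitiated) {
            try? Data(contentsOf: url)
        }.value
        return data.flatMap(UIImage.init(data:))
    }
}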
5
0
1.5k
Mar ’25
Playlist IDs from MusicKit not working with Apple Music API
I am building an app for macOS and am trying to implement code to add songs to a library playlist (included below). The issue I am having is that if I use MusicKit to load a user's library playlists, the ID for the playlist (which is just a string of numbers) does not work with the "Add tracks to a Library Playlist" endpoint of the Apple Music API. If I retrieve the playlists from the Apple Music API and use that playlist ID (which is different from the ID I get from MusicKit), my code works fine and adds the song to the playlist. The problem is that getting a user's library playlists from the Apple Music API does not give me all of the library playlists that I get when using MusicKit, and it also does not give me artwork for playlists that have the collage of album covers, so I would prefer to use MusicKit to get the playlists. I have also tested retrieving a single playlist using the Apple Music API with the playlist ID from MusicKit, and it does not work; I get an error that the resource cannot be found. Since this is a macOS app I cannot use MusicKit to add songs to library playlists. Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and have some way to use the playlist ID to add songs to that playlist. Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if I originally get the library playlist's ID value from a playlist retrieved from the Apple Music API. Also, does anyone know why the playlist IDs are not universal and are different between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter whether I use MusicKit or the Apple Music API; the IDs are in the correct format for the Apple Music API to use and work with my code. Thanks everyone for any and all help!

func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    let playlistID = playlist.id

    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }

    // Authorization Header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }

    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data

        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()

        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")
            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
1
1
792
Dec ’24
Mute behavior of Volume button on AVPlayerViewController iOS 26
With older iOS versions, when the user taps the Mute/Volume button on AVPlayerViewController to unmute, the system restores the device volume to the level it had before the user muted. On iOS 26, when the user taps the unmute button on screen, the volume starts from 0 instead of being restored (but it is still restored if the user unmutes by pressing the physical volume buttons). As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView that I cannot control, so this is system behavior. But I have received complaints that this is a bug. I did not find documentation describing this change to the Mute button behavior, and I need some basis for explaining the situation. Thank you.
0
1
124
Oct ’25
iOS 26.0 (23A5276f) - Bluetooth Call Audio Broken (AirPods + Car)
iOS 26.0 (23A5276f) – Bluetooth Call Audio Issue. I’m experiencing a Bluetooth audio issue on iOS 26.0 (build 23A5276f). I cannot make or receive phone calls properly using Bluetooth devices — this affects both my car’s Bluetooth system and my AirPods Pro (2nd generation). Notably:
• Regular phone calls have no audio (either I can’t hear the other person, or they can’t hear me).
• WhatsApp and other VoIP apps work fine with the same Bluetooth devices.
• Media playback (music, video, etc.) works without issues over Bluetooth.
It seems this bug is limited to the native Phone app or the system audio routing for regular cellular calls. Please advise if this is a known issue or if a fix is expected in upcoming beta releases.
1
1
284
Jun ’25
ProRAW to CIRAWFilter to HEIF producing borked HDR results
Following WWDC 2023 "Support HDR images in your app", I'm trying to save 48-megapixel ProRAWs (taken on an iPhone 14 Pro Max) as HDR HEICs to the Photo Library. After processing the ProRAW file using CIRAWFilter, whether I use CIContext.heif10Representation() or convert to a CGImage, then UIImage, and use UIImage.heicData(), I get photos that behave oddly in the Photo Library. They appear too dark, and visibly brighten when first viewed, but more problematic is that the photos brighten a great deal more when you edit them with the Photos editor. This is the behavior when using the itur_2100_PQ color space, but itur_2100_HLG behaves similarly, except that it gets dramatically darker when edited. This behavior occurs whether CIRAWFilter.extendedDynamicRangeAmount is set to 0.0, or 2.0, or not set at all. So what am I doing wrong? Here is a minimal iOS app -- well, just the ContentView -- that demonstrates the issue. You also need a .dng ProRAW file included in the project directory named test.dng. I'd love to include such a file, but I can't. Be prepared for a multi-second wait when you save the photo.

import SwiftUI
import Photos

struct ContentView: View {
    let context = CIContext()
    let hdrColorSpace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)!

    var body: some View {
        VStack(spacing: 100) {
            Button("Save Photo From CGImage/UIImage") {
                savePhotoFromUIImage()
            }
            Button("Save Photo From CIImage") {
                savePhotoDirectFromCIImage()
            }
        }.padding(60)
    }

    //convert RAW with CIRAWFilter to CIImage, then convert to CGImage, then UIImage, then HEIF
    private func savePhotoFromUIImage() {
        if let ciImage = processRAW(url: Bundle.main.url(forResource: "test", withExtension: "dng")!) {
            guard let outputCGImage = context.createCGImage(ciImage, from: ciImage.extent, format: .RGB10, colorSpace: hdrColorSpace) else { return }
            let uiImage = UIImage(cgImage: outputCGImage)
            if let heicData = uiImage.heicData() {
                saveHEIFPhotoToLibrary(imageData: heicData)
            } else {
                print("Failed to convert UIImage to HEIC")
            }
        }
    }

    //convert RAW with CIRAWFilter to CIImage, then to HEIF
    private func savePhotoDirectFromCIImage() {
        if let ciImage = processRAW(url: Bundle.main.url(forResource: "test", withExtension: "dng")!) {
            do {
                let heif = try context.heif10Representation(of: ciImage, colorSpace: hdrColorSpace)
                saveHEIFPhotoToLibrary(imageData: heif)
            } catch {
                print("Failed to get HEIF representation from CIContext")
            }
        }
    }

    private func processRAW(url: URL) -> CIImage? {
        guard let coreRawFilter = CIRAWFilter(imageURL: url) else { return nil }
        coreRawFilter.extendedDynamicRangeAmount = 2.0 //the issue persists whether this is not set, or set to 0, or set to, say, 2.0
        guard let ciImage = coreRawFilter.outputImage else { return nil }
        return ciImage
    }

    private func saveHEIFPhotoToLibrary(imageData: Data) {
        PHPhotoLibrary.shared().performChanges({
            let creationRequest = PHAssetCreationRequest.forAsset()
            let options = PHAssetResourceCreationOptions()
            creationRequest.addResource(with: .photo, data: imageData, options: options)
        }) { success, error in
            if let error = error {
                print("Error saving photo: \(error.localizedDescription)")
            } else {
                print("Photo saved.")
            }
        }
    }
}
0
1
586
Jan ’25
How to detect a song end?
I'm playing library items (MPMediaItem) and Apple Music tracks (Track) in MPMusicPlayerApplicationController.applicationQueuePlayer, but I can't use the actual queue functionality because I can't figure out how to get both media types into the same queue. If there's a way to get both types in a single queue, that would solve my problem, but I've given up on that one. Because I can't use a queue, I have to be able to detect when a song ends so that I can put the next song in the queue and play it. The only way I can figure out to detect when a song ends is by watching the playbackState, and I've actually got that pretty much working, but it's really ugly, because you get a playbackState of paused both when a song ends and when a Bluetooth speaker disconnects, etc. The only answer I've been able to find on the internet is to watch MPMusicPlayerControllerNowPlayingItemDidChange and, when that fires and the nowPlayingItem is nil, treat it as a song ending. But that's not the case: when a song ends, the nowPlayingItem remains the same. There's got to be an answer to this problem, right?
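For reference, here is a minimal sketch of the playback-state watching described above; the core problem stands, since .paused fires both at track end and on interruptions or route changes, so this alone cannot distinguish the two:

import MediaPlayer

final class PlaybackEndObserver {
    private let player = MPMusicPlayerController.applicationQueuePlayer
    private var token: NSObjectProtocol?
    var onPossibleTrackEnd: (() -> Void)?

    init() {
        player.beginGeneratingPlaybackNotifications()
        token = NotificationCenter.default.addObserver(
            forName: .MPMusicPlayerControllerPlaybackStateDidChange,
            object: player,
            queue: .main
        ) { [weak self] _ in
            guard let self else { return }
            // .paused also fires on interruptions and route changes
            // (e.g. a Bluetooth speaker disconnecting), so treat this as a hint only.
            if self.player.playbackState == .paused || self.player.playbackState == .stopped {
                self.onPossibleTrackEnd?()
            }
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
        player.endGeneratingPlaybackNotifications()
    }
}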
14
3
5.4k
Jan ’25