Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under General subtopic

Post · Replies · Boosts · Views · Activity

How can I add support for Apple Music lyrics sharing in my app?
I noticed that Instagram and iMessage support receiving shared lyrics from Apple Music. Specifically, when users long-press lyrics, a sheet pops up showing iMessage and Instagram. Tapping either app generates a beautifully formatted lyrics image. I've looked through the MusicKit documentation but couldn't find any related APIs. How can I implement this functionality in my app?
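There doesn't appear to be a documented MusicKit API for this, and iMessage and Instagram may be special-cased integrations. If the sheet is backed by the standard sharing mechanism (an assumption, not confirmed), the receiving side of that pattern is a Share Extension. A minimal, hypothetical sketch of an extension that accepts a shared image such as a rendered lyrics card (class name and handling are illustrative):

// Hypothetical Share Extension view controller that receives an image
// shared from another app. Names and handling are illustrative.
import UIKit
import UniformTypeIdentifiers

class ShareViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        guard
            let item = extensionContext?.inputItems.first as? NSExtensionItem,
            let provider = item.attachments?.first(where: {
                $0.hasItemConformingToTypeIdentifier(UTType.image.identifier)
            })
        else { return }

        provider.loadItem(forTypeIdentifier: UTType.image.identifier, options: nil) { item, error in
            // `item` may arrive as a UIImage, Data, or file URL depending on the sender.
            // Hand it off to the host app here, then finish the extension request.
            self.extensionContext?.completeRequest(returningItems: nil, completionHandler: nil)
        }
    }
}

Whether Apple Music exposes its lyrics cards to third-party share extensions at all is the open question; the sheet showing only iMessage and Instagram suggests it may currently be a private integration.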
Replies: 1 · Boosts: 1 · Views: 447 · Activity: Feb ’25
How to Implement Screen Mirroring in iOS for Google TV?
I am developing an iOS application that supports screen mirroring to Google TV (or Chromecast with Google TV). My goal is to mirror the iPhone/iPad screen in real time to a Google TV device.

What I have tried so far: I have explored multiple approaches but haven't found a direct way to achieve low-latency screen mirroring.

- Google Cast SDK: primarily designed for casting media (videos, images, audio) rather than real-time mirroring. It supports custom receiver applications, but there are no direct APIs for full screen mirroring. Casting a recorded video is possible, but it introduces latency and is not real time.
- ReplayKit for screen capture: RPScreenRecorder.shared().startCapture(handler: ...) allows capturing the iPhone screen as a video stream. However, sending this stream to Google TV in real time is a challenge. I could potentially encode the video as HLS and stream it, but the delay is significant.
- RTSP/UDP streaming: some third-party libraries support RTSP/UDP streaming for real-time screen sharing, but Google TV does not natively support RTSP, making this approach difficult.

My questions:

- Is it possible to achieve real-time screen mirroring on Google TV using the Google Cast SDK?
- Does Google TV support WebRTC or any low-latency streaming protocol that can be used from iOS?
- Are there any alternative approaches to mirror an iOS screen to Google TV with minimal latency?

I would appreciate any guidance, code examples, or references to relevant documentation.
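For the ReplayKit piece mentioned above, a minimal capture sketch in Swift. Note that in-process capture only sees this app's own UI; mirroring the whole device screen requires a Broadcast Upload Extension. The encoder and the transport to the TV (sendToReceiver) are hypothetical placeholders, not part of any Google Cast API:

// Minimal ReplayKit capture sketch. In-process capture records only this
// app's own UI; full device mirroring needs a Broadcast Upload Extension.
import ReplayKit
import CoreMedia

func startMirroring() {
    let recorder = RPScreenRecorder.shared()
    recorder.isMicrophoneEnabled = false

    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        guard error == nil, bufferType == .video else { return }
        // Each CMSampleBuffer is one captured video frame. A real pipeline
        // would push it through a hardware encoder (VideoToolbox) and send
        // the bitstream over whatever protocol the receiver accepts.
        sendToReceiver(sampleBuffer)
    }, completionHandler: { error in
        if let error = error { print("Capture failed to start: \(error)") }
    })
}

func sendToReceiver(_ buffer: CMSampleBuffer) {
    // Hypothetical transport placeholder (e.g. WebRTC or a custom socket).
}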
Replies: 1 · Boosts: 1 · Views: 663 · Activity: Dec ’25
SFSpeechRecognizer throws User denied access to speech recognition
I have created an app where you speak, SFSpeechRecognizer transcribes the speech to text, the text is translated, and the result is spoken back using speech synthesis. All SFSpeechRecognizer locales, and switching between them, work fine while the app is in the foreground. But after I turn off the screen (the app is still running; I only turned the screen off) and try to create a new recognitionTask, the recognition task receives this error: "User denied access to speech recognition." The weird thing is that it only happens with some languages: the error occurs with the Croatian or Hungarian locale for speech recognition but not with the English or Spanish locale.
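A small diagnostic sketch (Swift) that may help narrow this down; it is an assumption about where to look, not a confirmed cause. Locales without on-device support fall back to server-based recognition, which could behave differently once the screen is off:

// Diagnostic sketch: confirm authorization and whether a locale supports
// on-device recognition before starting a task.
import Speech

func checkRecognizer(for localeID: String) {
    SFSpeechRecognizer.requestAuthorization { status in
        print("Speech authorization status:", status.rawValue)
    }
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: localeID)) else {
        print("No recognizer for \(localeID)")
        return
    }
    print(localeID,
          "available:", recognizer.isAvailable,
          "on-device:", recognizer.supportsOnDeviceRecognition)
}

// Example: compare a failing locale with a working one.
// checkRecognizer(for: "hr-HR")   // Croatian
// checkRecognizer(for: "en-US")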
Replies: 1 · Boosts: 0 · Views: 386 · Activity: Mar ’25
HDR video metadata
On an iOS 18 phone, I use AVCaptureSession to capture HDR with the x420 format. The output CMSampleBuffer is in the HLG color space, and the propagated attachments contain kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey. I now use CAMetalLayer to render the CVPixelBuffer to the screen, but the result is brighter than with AVSampleBufferDisplayLayer. Here is my code:

- (void)_updateColorSpaceIfNeed:(CVPixelBufferRef)pixelBuffer {
    CAMetalLayer *layer = (CAMetalLayer *)_mtkView.layer;
    if (![layer isKindOfClass:CAMetalLayer.class]) return;
    layer.wantsExtendedDynamicRangeContent = YES;

    CFDataRef ambientViewingEnvironment = (CFDataRef)CVBufferCopyAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, NULL);
    NSData *data = (__bridge_transfer NSData *)ambientViewingEnvironment;

    CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadataWithAmbientViewingEnvironment:data];
    // CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadata];
    layer.EDRMetadata = metadata;
    layer.pixelFormat = MTLPixelFormatRGBA16Float;

    CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_2100_HLG);
    layer.colorspace = colorspace;
    if (colorspace) CGColorSpaceRelease(colorspace);
}

(Note: the copied attachment is transferred with __bridge_transfer rather than released before use, so the NSData stays valid while CAEDRMetadata reads it.)

Why does the CAEDRMetadata class provide HLGMetadataWithAmbientViewingEnvironment: and HLGMetadata, but no HLGMetadataWithAmbientViewingEnvironment:sceneIllumination: variant? I want to understand how kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey affect tone mapping. Is there any documentation I can refer to?
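A short Swift sketch for checking which HDR attachments are actually present on each frame; it is purely diagnostic and does not answer the tone-mapping question:

// Diagnostic sketch: log the HDR-related attachments on each frame.
import CoreVideo
import Foundation

func logHDRAttachments(_ pixelBuffer: CVPixelBuffer) {
    if let ambient = CVBufferCopyAttachment(pixelBuffer,
                                            kCVImageBufferAmbientViewingEnvironmentKey,
                                            nil) as? Data {
        print("ambientViewingEnvironment bytes:", ambient.count)
    }
    if let scene = CVBufferCopyAttachment(pixelBuffer,
                                          kCVImageBufferSceneIlluminationKey,
                                          nil) {
        print("sceneIllumination:", scene)
    }
}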
Replies: 1 · Boosts: 0 · Views: 463 · Activity: Mar ’25
Urdu Language Keyboard Bug
Hello Apple, I've been using iOS for many years and never had any issues with the Urdu keyboard, but since the 18.4 beta update some words are no longer composed correctly. For example, my friend's name is "راعنیہ", but the updated keyboard cannot type it as one word and keeps separating it as "راعنی ہ". It is frustrating to type like this, and it is not just one word; many other words are affected as well. The new font and the reduced spacing also strain my eyes while reading or typing. I hope Apple fixes this as soon as possible. Thank you.
Replies: 1 · Boosts: 0 · Views: 488 · Activity: Feb ’25
Proposal for a Privacy-Focused, iOS-Exclusive Dating App
Hello Apple Developers,

I'm reaching out to the community with a concept that I truly believe could be a natural fit for the Apple ecosystem: a privacy-focused, iOS-exclusive dating app designed to enhance connections between Apple users while staying true to Apple's commitment to security and user privacy.

The idea is to create an iOS-only dating platform that fosters relationships between users who are part of the Apple ecosystem. The app would integrate seamlessly with Apple's services (iMessage, FaceTime, Siri, etc.) and provide a premium user experience where privacy is a priority.

Apple users already prefer to communicate using Apple services (iMessage, FaceTime). A dating app designed specifically for iOS users would deepen this ecosystem lock-in, making it easier for Apple customers to connect within a trusted space. Apple is already known for its privacy focus, and an iOS-exclusive dating app would build upon that reputation. It would ensure secure, private interactions, minimizing the risks associated with data sharing in most dating apps today. The app could integrate directly with features like iCloud, Apple Pay (for date-night bookings), and Siri (for matchmaking suggestions), offering users a truly native iOS experience.

While the app would remain free to use, here are a few potential monetization methods:

- Bundling with Apple One/iCloud+ for premium matchmaking features.
- Apple Pay-based date-night deals with local partners.

I'd love to hear your thoughts on whether Apple might be open to this idea. Would there be any challenges from a technical or business perspective in creating a dating app exclusively for iOS users? I'm looking forward to hearing from you all, and thank you for your time and insights.

Yours truly,
CapNKirk

P.S. This is an idea, but I do not care who uses, implements, or executes it. I just want to see Apple take advantage of it.
Replies: 1 · Boosts: 0 · Views: 510 · Activity: Mar ’25
MusicKit for Android reports an error when playing stations
When using the MusicKit SDK for Android 1.1.2, I found that MediaContainerType only defines three types: NONE = 0, ALBUM = 1, PLAYLIST = 2. The RADIO_STATION type is not defined. However, the documentation for com.apple.android.music.playback.model states that the RADIO_STATION type is supported. Because of this, passing in a station ID results in an error:

MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException

How can I solve this problem?
Replies: 1 · Boosts: 1 · Views: 85 · Activity: Apr ’25
MusicKit for Android reports an error when playing stations
When using the MusicKit SDK for Android 1.1.2, I found that MediaContainerType only defines three types: NONE = 0, ALBUM = 1, PLAYLIST = 2. The RADIO_STATION type is not defined. However, the documentation for com.apple.android.music.playback.model states that the RADIO_STATION type is supported. This causes an error after I pass in a station ID:

MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException

How can I solve this problem?
Replies: 1 · Boosts: 1 · Views: 74 · Activity: May ’25
Screen recording audio and video out of sync
I use startCaptureWithHandler to record the screen and AVAssetWriter's appendSampleBuffer: to save the audio and video, but when the saved file is played back, the audio and video are out of sync. I don't know if it's an AVAssetWriterInput setup problem. Here is my code:

NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
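One common source of drift in this kind of setup is the writer session's time base rather than the input settings shown above. A sketch of the usual timing pattern (Swift; assetWriter, videoInput, and audioInput are assumed to be configured as in the post):

// Timing sketch for ReplayKit capture feeding an AVAssetWriter. The key
// point is starting the session at the PTS of the first buffer and letting
// both tracks share that time base.
import AVFoundation
import ReplayKit

var sessionStarted = false

func handle(_ sampleBuffer: CMSampleBuffer, of type: RPSampleBufferType,
            assetWriter: AVAssetWriter,
            videoInput: AVAssetWriterInput,
            audioInput: AVAssetWriterInput) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    if !sessionStarted {
        guard assetWriter.startWriting() else { return }
        assetWriter.startSession(atSourceTime: pts)   // one time base for audio and video
        sessionStarted = true
    }

    switch type {
    case .video:
        if videoInput.isReadyForMoreMediaData, !videoInput.append(sampleBuffer) {
            print("video append failed")
        }
    case .audioApp, .audioMic:
        if audioInput.isReadyForMoreMediaData, !audioInput.append(sampleBuffer) {
            print("audio append failed")
        }
    @unknown default:
        break
    }
}

If many buffers are dropped because isReadyForMoreMediaData is false on one input, that can also show up as drift, so it is worth logging how often each branch drops data.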
Replies: 1 · Boosts: 0 · Views: 122 · Activity: Apr ’25
How do I find the catalog ID for songs in an Apple Music playlist? (MusicKit & Apple Music API)
Hello, how do I find the Apple Music catalog ID for songs in an Apple Music playlist? I'm building an iOS app that uses MusicKit and the Apple Music API. For this example, assume that my app simply allows users to upload their Apple Music playlists, and when a user opens a specific playlist, I want to find the catalog ID for each song. Currently all the songs return song IDs in the format "i.PkdJvPXI2AJgm8". I believe these are library IDs, not catalog IDs. I'd prefer a front-end solution using MusicKit, but perhaps a back-end solution using the Apple Music REST API is required. Any recommendations would be appreciated!
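One possible route is the Apple Music API's library-to-catalog relationship, called from MusicKit via MusicDataRequest. The /catalog path below is an assumption based on the library-songs resource documentation, so verify it against the current docs. A sketch in Swift:

// Resolve a library song ID (e.g. "i.PkdJvPXI2AJgm8") to its catalog song.
// The "/catalog" relationship path is an assumption; check the Apple Music API docs.
import Foundation
import MusicKit

func catalogSong(forLibraryID libraryID: String) async throws -> Song? {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/songs/\(libraryID)/catalog")!
    let request = MusicDataRequest(urlRequest: URLRequest(url: url))
    let response = try await request.response()
    let songs = try JSONDecoder().decode(MusicItemCollection<Song>.self, from: response.data)
    return songs.first   // `first?.id` would be the catalog ID
}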
Replies: 1 · Boosts: 0 · Views: 110 · Activity: Jun ’25
ShazamKit Android SDK – 16 KB Page Size Support?
Hey, just wondering if anyone knows whether the ShazamKit SDK for Android will be updated soon to support the new 16 KB memory page size requirement coming with Android 15. Google is going to require all apps targeting Android 15+ to support it starting November 2025. Has anyone heard anything from Apple or the SDK team about this? Thanks!
Replies: 1 · Boosts: 0 · Views: 185 · Activity: May ’25
Invalid album IDs in parquet files
I’m spot-checking some of the data I’m extracting from the Apple Music Feed parquet files and finding numerous invalid album IDs. For example, looking at albums with a primary artist ID of 163043, I see a few that are not available at music.apple.com/us/album/NNN. These include:

981094158 - Time and the River
1803443737 - Celebration (Live New York ’80)
1525426873 - Anything You Want: The Warner-Reprise-Elektra Years

I notice that none of these album IDs are returned by the general Apple Music API either, so I’m a bit confused why they would exist in the parquet files at all. Thanks.
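A small MusicKit sketch (Swift) that automates the spot-check described above by asking the catalog for those IDs directly; IDs missing from the response are presumably unavailable:

// Batch-check the feed's album IDs against the catalog; IDs quoted above.
import MusicKit

func checkFeedAlbumIDs() async throws {
    let ids: [MusicItemID] = ["981094158", "1803443737", "1525426873"]
    let request = MusicCatalogResourceRequest<Album>(matching: \.id, memberOf: ids)
    let response = try await request.response()
    let found = Set(response.items.map(\.id))
    for id in ids where !found.contains(id) {
        print("Album \(id) was not returned by the catalog")
    }
}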
Replies: 1 · Boosts: 0 · Views: 118 · Activity: May ’25
[CoreImage] OS 26 breaks Metal kernels for CIFilters
I maintain a couple of CoreImage libraries that provide custom Metal kernel-backed CIFilters. In iOS/iPadOS 26, the CIColorKernel.apply() method invoked in the CIFilter subclass fails to add the coreimage::destination parameter to the Metal function call:

-[CIColorKernel applyWithExtent:arguments:options:] argument count mismatch for kernel 'FractalNoise3D', expected 13 but saw 12.

I've compiled the code with Xcode 26 and deployed to iOS 18 devices without any breakage, so this is definitely an iOS problem, not an Xcode problem.

Library here: https://github.com/JoshuaSullivan/SimplexNoiseFilter
Feedback ID: FB17874311
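For readers without the library handy, the simplified call pattern involved looks roughly like the sketch below; the kernel name and arguments are illustrative, not the library's actual code:

// Simplified, hypothetical filter showing the failing call pattern: a
// Metal-backed CIColorKernel applied from a CIFilter subclass.
import CoreImage
import Foundation

final class NoiseFilter: CIFilter {
    static let kernel: CIColorKernel = {
        // Force-unwraps keep the sketch short; real code handles errors.
        let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIColorKernel(functionName: "fractalNoise", fromMetalLibraryData: data)
    }()

    var inputImage: CIImage?

    override var outputImage: CIImage? {
        guard let input = inputImage else { return nil }
        // Works on iOS 18; on the OS 26 betas the same kind of call reports
        // an argument-count mismatch, as described above.
        return Self.kernel.apply(extent: input.extent, arguments: [input, 0.5])
    }
}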
Replies: 1 · Boosts: 0 · Views: 172 · Activity: Jun ’25
_MediaPlayer_AppIntents compilation error for iOS 26
I'm getting an interesting error when attempting to compile my app in Xcode 26 beta:

error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'

Not sure what to try in order to fix this issue.
Replies: 1 · Boosts: 1 · Views: 150 · Activity: Jun ’25
ffmpeg xcframework not working on Mac, but working correctly on iOS
I have an app (currently in development) that needs to use ffmpeg, so I searched for how to embed ffmpeg in Apple apps and found this article: https://doc.qt.io/qt-6/qtmultimedia-building-ffmpeg-ios.html. It works correctly for iOS but not for macOS (I have made macOS-specific changes using ChatGPT and traditional web searching). Drive link with the files and the instructions I'm following: https://drive.google.com/drive/folders/11wqlvb8SU2thMSfII4_Xm3Kc2fPSCZed?usp=share_link. Could someone from Apple, or anyone in general, help me figure out what I'm doing wrong?
Replies: 1 · Boosts: 0 · Views: 186 · Activity: Jun ’25
Apple Music / MusicKit and Simulator
It's been an ask for a few years, and I'm wondering whether there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the Simulator. I develop for the Vision Pro, so the usual fix of running on the device is a bit of a hard ask. At the very least, a small sample library that works in the Simulator would be welcome (similar to how Photos works). Cheers
Replies: 1 · Boosts: 1 · Views: 183 · Activity: Jul ’25
AVSpeechSynthesizer reads Mandarin as Cantonese (iOS 26 beta 3)
In iOS 26, AVSpeechSynthesizer reads Mandarin with Cantonese pronunciation. No matter how I set the language, or change my phone's system settings, it doesn't work.

let utterance = AVSpeechUtterance(string: "你好啊")
//let voice = AVSpeechSynthesisVoice(language: "zh-CN") // does not work
let voice = AVSpeechSynthesisVoice(language: "zh-Hans") // does not work either
utterance.voice = voice
let synth = AVSpeechSynthesizer()
synth.speak(utterance)
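A possible workaround sketch (Swift): enumerate the installed voices and assign one explicitly instead of relying on the language-code lookup. Whether this sidesteps the Cantonese pronunciation on the iOS 26 betas is untested here:

// Workaround sketch: pick an installed Mandarin voice explicitly.
import AVFoundation

func speakMandarin(_ text: String) {
    let mandarinVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.language == "zh-CN" }
    for voice in mandarinVoices {
        print(voice.identifier, voice.name)   // inspect what is actually installed
    }

    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = mandarinVoices.first ?? AVSpeechSynthesisVoice(language: "zh-CN")
    // In real code, keep a strong reference to the synthesizer while it speaks.
    let synthesizer = AVSpeechSynthesizer()
    synthesizer.speak(utterance)
}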
Replies: 1 · Boosts: 0 · Views: 299 · Activity: Aug ’25
iTunes Search API no longer returning explicit results?
My app has been using the iTunes Search API (itunes.apple.com/search) for a few years, but at some point over the last week or so (late Sept. 2025) it stopped returning track results with explicit content, regardless of whether I provide "explicit=Yes" (which is the default anyway, according to the API documentation: https://performance-partners.apple.com/search-api). Has anyone else experienced this with the API, and have you figured out a workaround? FYI, I also use the more robust Apple Music API in another part of my app, which isn't affected by this issue, so I know it's technically an alternative; I just need to stick with the iTunes Search API in this particular case. Thanks.
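For reference, a minimal Swift sketch of the request in question, built from the documented parameters; it can be useful for reproducing the behavior outside the app:

// Reproduce the search request with the documented parameters.
import Foundation

func searchSongs(term: String) async throws -> Data {
    var components = URLComponents(string: "https://itunes.apple.com/search")!
    components.queryItems = [
        URLQueryItem(name: "term", value: term),
        URLQueryItem(name: "media", value: "music"),
        URLQueryItem(name: "entity", value: "song"),
        URLQueryItem(name: "explicit", value: "Yes"),   // the documented default
        URLQueryItem(name: "limit", value: "25"),
    ]
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    return data   // JSON with "resultCount" and "results"
}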
Replies: 1 · Boosts: 4 · Views: 315 · Activity: Oct ’25