Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Playing music with MusicKit.js in Chrome and Firefox
I'm unable to play music using MusicKit.js in the Chrome or Firefox browsers, even using the Apple guide here: https://js-cdn.music.apple.com/musickit/v1/index.html - I've added my Music Developer Token and song/album URL, but it only works in Safari, not in Chrome or Firefox. I'm unsure if this is a global issue, or if there is something I need to do to enable playback in other browsers, but as it stands it's not working for me. Thanks for any help in advance!
Replies: 5 · Boosts: 0 · Views: 946 · Created: Jan ’25
Easy way to output live
Hey everyone 😊, I am building an app that includes a live camera feed preview. That's all I need to do, alongside identifying the images with Create ML's image classification. I don't need to capture images at all. I've seen some very complicated tutorials; I just want to use a couple of lines of code.
Replies: 6 · Boosts: 0 · Views: 656 · Created: Jan ’25
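For the live-preview question above, a minimal sketch of what a camera feed preview can look like in Swift, assuming a UIKit view controller and an NSCameraUsageDescription entry in Info.plist; the Create ML classification step is left out, since the poster says capture isn't needed:

import UIKit
import AVFoundation

final class PreviewViewController: UIViewController {
    private let session = AVCaptureSession()
    private let previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Use the default camera as the session input.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Display the live feed in a preview layer.
        previewLayer.session = session
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)

        // startRunning() blocks, so call it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}

To classify frames without capturing stills, an AVCaptureVideoDataOutput delegate could hand sample buffers to the Create ML / Vision model.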
Apple Music Won't Play using the latest version of Xcode/macOS
I have tried everything. The songs load onto the playlists and on searches, but when prompted to play, they just won't play. I have a wrapper since my main player (which carries the buttons for play/rewind/forward/etc.) is in Objective-C.

//
//  ApplePlayerWrapper.swift
//  UniversallyMac
//
//  Created by Dorian Mattar on 11/10/24.
//

import Foundation
import MusicKit
import MediaPlayer

@objc public class MusicKitWrapper: NSObject {
    @objc public static let shared = MusicKitWrapper()
    private let player = ApplicationMusicPlayer.shared

    // Play the current track
    @objc public func play() {
        guard !player.queue.entries.isEmpty else {
            print("Queue is empty. Cannot start playback.")
            return
        }
        logPlayerState(message: "Before play")
        Task {
            do {
                try await player.prepareToPlay()
                try await player.play()
                print("Playback started successfully.")
            } catch {
                if let nsError = error as NSError? {
                    print("NSError Code: \(nsError.code), Domain: \(nsError.domain)")
                }
            }
            logPlayerState(message: "After play")
        }
    }

    // Log the current player state
    @objc public func logPlayerState(message: String = "") {
        print("Player State - \(message):")
        print("Playback Status: \(player.state.playbackStatus)")
        print("Queue Count: \(player.queue.entries.count)")
        // Only log current track details if the player is playing
        if player.state.playbackStatus == .playing {
            if let currentEntry = player.queue.currentEntry {
                print("Current Track: \(currentEntry.title)")
                print("Current Position: \(player.playbackTime) seconds")
                print("Track Length: \(currentEntry.endTime ?? 0.0) seconds")
            } else {
                print("No current track.")
            }
        } else {
            print("No track is playing.")
        }
        print("----------")
    }

    // Debug the queue
    @objc public func debugQueue() {
        print("Debugging Queue:")
        for (index, entry) in player.queue.entries.enumerated() {
            print("\(index): \(entry.title)")
        }
    }

    // Ensure track availability in the queue
    public func queueTracks(_ tracks: [Track]) {
        Task {
            do {
                for track in tracks {
                    // Validate Play Parameters
                    guard let playParameters = track.playParameters else {
                        print("Track \(track.title) has no Play Parameters.")
                        continue
                    }
                    // Log the Play Parameters
                    print("Track Title: \(track.title)")
                    print("Play Parameters: \(playParameters)")
                    print("Raw Values: \(track.id.rawValue)")
                    // Ensure the ID is valid
                    if track.id.rawValue.isEmpty {
                        print("Track \(track.title) has an invalid or empty ID in Play Parameters.")
                        continue
                    }
                    // Queue the track
                    try await player.queue.insert(track, position: .afterCurrentEntry)
                    print("Queued track: \(track.title)")
                }
                print("Tracks successfully added to the queue.")
            } catch {
                print("Error queuing tracks: \(error)")
            }
            debugQueue()
        }
    }

    // Clear the current queue
    @objc public func resetMusicPlayer() {
        Task {
            player.stop()
            player.queue.entries.removeAll()
            print("Queue cleared.")
            print("Apple Music player reset successfully.")
        }
    }
}

I opened an Apple Dev. ticket, but I'm trying here as well. Thanks!
Replies: 1 · Boosts: 0 · Views: 462 · Created: Jan ’25
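One thing the wrapper above does not show is a MusicKit authorization or subscription check; if the app doesn't do that elsewhere, those are common reasons ApplicationMusicPlayer stays silent. A hedged sketch of that pre-flight step (ensureReadyToPlay() is a hypothetical helper, not part of the poster's code):

import MusicKit

// Hypothetical helper: confirm authorization and playback capability before
// handing tracks to ApplicationMusicPlayer.
func ensureReadyToPlay() async -> Bool {
    // Ask the user for MusicKit access if we haven't already.
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("MusicKit not authorized: \(status)")
        return false
    }
    do {
        // Full-track playback also requires an active Apple Music subscription.
        let subscription = try await MusicSubscription.current
        guard subscription.canPlayCatalogContent else {
            print("This account cannot play catalog content.")
            return false
        }
        return true
    } catch {
        print("Could not read subscription info: \(error)")
        return false
    }
}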
Rear View Camera Installed – Now CarPlay Audio Stops After 15 Seconds via Bluetooth
I recently installed a rear-view camera in my car, and ever since, I've been experiencing a frustrating issue with my CarPlay. After about 15 seconds of playing audio via Bluetooth, the sound stops coming out of the speakers, even though the song continues to run in the background. For context, my stereo system is an aftermarket unit that I installed to enable CarPlay functionality. Everything worked perfectly before adding the rear-view camera. Unfortunately, my unit does not have a port for a wired connection, so I can't test the audio using a cable. Has anyone experienced a similar issue? Could the camera installation be interfering with the Bluetooth or audio system somehow? Any advice or troubleshooting tips would be greatly appreciated!
Replies: 1 · Boosts: 0 · Views: 314 · Created: Jan ’25
ShazamKit supported for iOS apps that can run on Mac silicon?
I am having issues getting my iOS app, which uses ShazamKit, to work on a Mac with Apple silicon. When uploading the archive to App Store Connect I get ITMS-90863: Macs with Apple silicon support issue - The app links with libraries that aren’t present in macOS: /usr/lib/swift/libswiftShazamKit.dylib. Is ShazamKit not supported for iOS apps that can run on Macs with Apple silicon? Or is there something I should fix in my setup / deployment?
Replies: 26 · Boosts: 0 · Views: 1k · Created: Jan ’25
SystemMusicPlayer.item nil when state = .playing
I've been working with MusicKit without enrolling in a developer account and I hadn't run into any issues until I noticed nil on the SystemMusicPlayer item for some songs, and I don't understand why. Are some songs blocked from the framework? Or is it somehow a limitation of not having registered MusicKit to the bundle ID? I am planning on using MusicKit properly in prod and this is just a test app for a package I'm working on. These are the nil songs, which I got from the Discovery Station:
https://music.apple.com/ca/album/okay/950816298?i=950816304
https://music.apple.com/ca/album/youre-so-cool/1670485433?i=1670485446
Replies: 0 · Boosts: 0 · Views: 416 · Created: Jan ’25
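For the question above, a small diagnostic sketch (an idea, not a confirmed explanation): fetch each song by its catalog ID and look at its playParameters, since items without play parameters generally cannot be played, and regional availability is one possible cause of the difference between songs:

import MusicKit

// Fetch a catalog song by ID and report whether it carries play parameters.
func inspectSong(withID id: MusicItemID) async throws {
    var request = MusicCatalogResourceRequest<Song>(matching: \.id, equalTo: id)
    request.limit = 1
    let response = try await request.response()
    guard let song = response.items.first else {
        print("Song \(id) was not found in the current storefront.")
        return
    }
    print("\(song.title): playParameters = \(String(describing: song.playParameters))")
}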
Compatibility Issue Between 360 Cameras and iPad Pro M4
Hello everyone, I need some help with this. If you know anything about it, please comment.

Overview
We are planning to develop an app using the “Support external cameras in your iPadOS app” feature introduced in iPadOS 17. Before implementing this feature, the iPad needs to recognize external cameras. However, among the iPad models compatible with iPadOS 17, we have found that some of the iPads owned by our development team can recognize external cameras, while others cannot. If you have any reports regarding compatibility issues or information on how to resolve these problems, please share them with us.

Detailed explanation
The results of our investigation are as follows.

External camera used: a 360-degree camera
Device / Firmware
RICOH Theta X / 2.61.0 (2024/12/26, latest)
RICOH Theta Z1

Tested iPad / Firmware / Status
12.9-inch iPad Pro (3rd generation) / iOS 17.5.1 / OK
11-inch iPad Pro (M4) / iOS 18.2 / NG

Verification method
Step 1: Power on the iPad and the external camera, ensuring both are ready for connection.
Step 2: Connect the iPad and the external camera using a USB-C cable.
Step 3: Launch FaceTime on the iPad and check the displayed camera feed. If the external camera is recognized, the feed from the external camera will be displayed.
Replies: 1 · Boosts: 0 · Views: 513 · Created: Jan ’25
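A small probe that may help narrow down the iPad-to-iPad difference described above: list the external cameras the system currently recognizes, assuming iPadOS 17 or later. Whether a given 360 camera appears also depends on it exposing a standard UVC video interface over USB-C:

import AVFoundation

func listExternalCameras() {
    // .external is the device type introduced for external cameras in iPadOS 17.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified)
    if discovery.devices.isEmpty {
        print("No external cameras recognized.")
    } else {
        for device in discovery.devices {
            print("Found external camera: \(device.localizedName)")
        }
    }
}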
AVAudioEngine Voice Processing Fails with Mismatched Input/Output Devices: AggregateDevice Channel Count Mismatch
I'm encountering errors while using AVAudioEngine with voice processing enabled (setVoiceProcessingEnabled(true)) in scenarios where the input and output audio devices are not the same. This issue arises specifically with mismatched devices, preventing the application from functioning as expected.

Works: paired devices (e.g., MacBook Pro mic → MacBook Pro speakers)
Fails: mismatched devices (e.g., AirPods mic → MacBook Pro speakers)

When using paired input and output devices, the setup works as expected (example: MacBook Pro microphone → MacBook Pro speakers). When using mismatched devices, AVAudioEngine setup fails during aggregate device construction (example: AirPods microphone → MacBook Pro speakers), and the error logs indicate a channel count mismatch. Here are the partial logs; due to the content limit, I cannot post the entire logs.

AUVPAggregate.cpp:1000 client-side input and output formats do not match (err=-10875)
AUVPAggregate.cpp:1036 err=-10875
AVAEInternal.h:109 [AVAudioEngineGraph.mm:1344:Initialize: (err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)): error -10875
AggregateDevice.mm:329 Failed expectation of constructed aggregate (312): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331 Failed expectation of constructed aggregate (312): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
AggregateDevice.mm:182 error fetching default pair
AggregateDevice.mm:329 Failed expectation of constructed aggregate (336): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331 Failed expectation of constructed aggregate (336): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
AUHAL.cpp:1782 ca_verify_noerr: [AudioDeviceSetProperty(mDeviceID, NULL, 0, isInput, kAudioDevicePropertyIOProcStreamUsage, theSize, theStreamUsage), 560227702]
AudioHardware-mac-imp.cpp:3484 AudioDeviceSetProperty: no device with given ID
AUHAL.cpp:1782 ca_verify_noerr: [AudioDeviceSetProperty(mDeviceID, NULL, 0, isInput, kAudioDevicePropertyIOProcStreamUsage, theSize, theStreamUsage), 560227702]
AggregateDevice.mm:182 error fetching default pair
AggregateDevice.mm:329 Failed expectation of constructed aggregate (348): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331 Failed expectation of constructed aggregate (348): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)

Questions:
1. Is it possible to use voice processing with different input/output devices? If yes, are there any specific configurations required to handle mismatched devices?
2. How can we resolve channel count mismatch errors during aggregate device construction? Are there settings or API adjustments to enforce compatibility between input/output devices?
3. Are there any workarounds or alternative approaches to achieve voice processing functionality with mismatched devices? For instance, can we force an intermediate channel configuration or downmix input/output formats?
Replies: 3 · Boosts: 2 · Views: 690 · Created: Jan ’25
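For reference against the failure above, a minimal sketch of the baseline voice-processing setup that works with paired devices; whether it can be made to work with mismatched input/output devices is exactly the open question in the post:

import AVFoundation

func startVoiceProcessingEngine() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    // Voice processing must be enabled before the engine starts; it ties the
    // input and output units together into a single processing unit.
    try engine.inputNode.setVoiceProcessingEnabled(true)
    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
    try engine.start()
    return engine
}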
Unable to generate .p12 file from fairplay.cer
I am reaching out regarding an issue with my Apple FairPlay Streaming certificate. To generate the certificate signing request (CSR), I used the following OpenSSL commands:

openssl genrsa -out private_key.pem 1024
openssl req -new -key private_key.pem -out request.csr

However, according to the guide provided by Apple and the instructions from my DRM provider, I should have used:

openssl genrsa -aes256 -out privatekey.pem 1024
openssl req -new -sha1 -key privatekey.pem -out certreq.csr -subj "/CN=SubjectName /OU=OrganizationalUnit /O=Organization /C=US"

I suspect this discrepancy might be causing the issue with my FairPlay certificate. After obtaining the fairplay.cer file and importing it into Keychain Access, I noticed the following: when I expand the certificate in Keychain Access, I can only see a public key and no private key. As a result, I am unable to export the certificate as a .p12 file, as this option is disabled. As per my DRM provider's instructions, I need to export the certificate along with the corresponding private key as a .p12 file with a password. Since the private key is not visible in Keychain Access, I am unable to proceed further.

I have read the FairPlay Streaming overview but could not find any reason why this issue is occurring, or guidance on the procedure to revoke a certificate. Additionally, I came across the terms and conditions, which mention reaching out to product-security at Apple for assistance in revoking corrupt certificates. However, despite reaching out, I have not received a response. Any help on how to proceed would be great!
Replies: 0 · Boosts: 0 · Views: 535 · Created: Jan ’25
Exporting Higher Resolution Spatial Videos in Final Cut
How do we export a spatial video at a higher resolution than 2200 x 2200? For example, I want to export a video that I edited in a spatial timeline as 4096 x 4096 resolution with spatial metadata to output as an MV-HEVC file. I looked under both Final Cut and Compressor export settings, but couldn't find a way to do this. Thanks for your help!
Replies: 1 · Boosts: 0 · Views: 483 · Created: Jan ’25
Help Needed: How to Make iOS Timer More Stable?
I’m currently developing an iOS metronome app using DispatchSourceTimer as the timer. The interval is set very small, around 50 milliseconds, and I’m using CFAbsoluteTimeGetCurrent to calculate the elapsed time to ensure the beat is played within a ±0.003-second margin. The problem is that once the app goes to the background, the timing becomes unstable—it slows down noticeably, then recovers after 1–2 seconds. When coming back to the foreground, it suddenly speeds up, and again, it takes 1–2 seconds to return to normal. It feels like the app is randomly “powering off” and then “overclocking.” It’s super frustrating. I’ve noticed that some metronome apps in the App Store have similar issues, but there’s one called “Professional Metronome” that’s rock solid with no such problems. What kind of magic are they using? Any experts out there who can help? Thanks in advance! P.S. I’ve already enabled background audio permissions. The professional metronome that has no issues: https://link.zhihu.com/?target=https%3A//apps.apple.com/cn/app/pro-metronome-%25E4%25B8%2593%25E4%25B8%259A%25E8%258A%2582%25E6%258B%258D%25E5%2599%25A8/id477960671
Replies: 2 · Boosts: 0 · Views: 555 · Created: Jan ’25
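A common alternative to driving a metronome from DispatchSourceTimer is to schedule click buffers ahead of time on an AVAudioPlayerNode at sample-accurate positions, so timing follows the audio clock rather than the run loop (which the system throttles in the background). A sketch only, assuming the engine is already running, the player node is attached and connected, and clickBuffer is a hypothetical preloaded AVAudioPCMBuffer containing one click sound:

import AVFoundation

func scheduleClicks(player: AVAudioPlayerNode,
                    clickBuffer: AVAudioPCMBuffer,
                    bpm: Double,
                    beats: Int) {
    let sampleRate = clickBuffer.format.sampleRate
    let samplesPerBeat = AVAudioFramePosition(sampleRate * 60.0 / bpm)
    for beat in 0..<beats {
        // Positions are in the player's own sample timeline, starting at 0 on play().
        let when = AVAudioTime(sampleTime: AVAudioFramePosition(beat) * samplesPerBeat,
                               atRate: sampleRate)
        player.scheduleBuffer(clickBuffer, at: when, options: [], completionHandler: nil)
    }
    player.play()
}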
The operation couldn’t be completed error during live restart playback in native AVPlayer
In our Apple TV application, we are using the native AVPlayer for live playback functionality. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.

Error: The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:

Scenario: The live event is scheduled from 7:00 AM to 8:00 AM. Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real time. As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays an error, and playback continues in the background.
Replies: 1 · Boosts: 0 · Views: 825 · Created: Jan ’25
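When chasing intermittent CoreMediaErrorDomain errors like the one above, the item's HLS error log often carries more detail than the top-level error. A diagnostic sketch (the failed-to-play notification may not fire in every case, but the error log can be read at any time):

import AVFoundation

func observeFailures(of item: AVPlayerItem) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: .AVPlayerItemFailedToPlayToEndTime,
        object: item,
        queue: .main) { note in
            let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
            print("Playback failed: \(String(describing: error))")
            // Dump the HLS error log for playlist/segment-level detail.
            if let log = item.errorLog() {
                for event in log.events {
                    print("errorLog: \(event.errorStatusCode) \(event.errorComment ?? "")")
                }
            }
        }
}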
Is there a way to filter PHPickerViewController by the creation date of the assets?
Our app filters the photo library to a certain date range for ease of picking photos. However, to do this, we have to require full permissions to the photo library. We would like to use PHPickerViewController and have it filter the results by the assets' creation date; that would allow us to use it. I see other filter options, but not this one. And if it isn't there, is this something that is being considered or on a roadmap?
Replies: 1 · Boosts: 0 · Views: 563 · Created: Jan ’25
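As far as the current PHPickerFilter options go, there does not appear to be a date-based filter, so a date-limited selection still seems to require fetching assets directly, which is what forces the library permission the post is trying to avoid. For completeness, a sketch of that direct fetch:

import Photos

func fetchAssets(from start: Date, to end: Date) -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    // Limit results to the requested creation-date window.
    options.predicate = NSPredicate(format: "creationDate >= %@ AND creationDate <= %@",
                                    start as NSDate, end as NSDate)
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(with: .image, options: options)
}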
Slow Image Retrieval Using requestImageForAsset:targetSize:contentMode:options:resultHandler:
Hello, I am experiencing slow image retrieval when using the requestImageForAsset:targetSize:contentMode:options:resultHandler: method in my application. The delay is significantly impacting the performance of my app. Here are the details of my implementation:

for (PHAsset *asset in assets) {
    @autoreleasepool {
        PHImageManager *imageManager = [PHImageManager defaultManager];
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.synchronous = YES;
        options.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
        options.resizeMode = PHImageRequestOptionsResizeModeNone;

        [imageManager requestImageForAsset:asset
                                targetSize:CGSizeMake(100, 100)
                               contentMode:PHImageContentModeAspectFill
                                   options:options
                             resultHandler:^(UIImage *thumbnail, NSDictionary *info) {
            CameraRollCellDto *cellDto = [[CameraRollCellDto alloc] init];
            cellDto.index = index;
            cellDto.thumbnail = thumbnail;
            cellDto.propertyDate = asset.creationDate;
            if (self.segmentedTorikomi.selectedSegmentIndex == SEG_INDEX_IKKATSU) {
                cellDto.isSelected = YES;
            } else {
                cellDto.isSelected = NO;
            }
            [list addObject:cellDto];
        }];
        index++;
    }
}

Has anyone else encountered this issue? Are there any known solutions or optimizations that can help improve the speed of image retrieval using this method? Thank you for your assistance.
Replies: 0 · Boosts: 0 · Views: 300 · Created: Jan ’25
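The usual speed-ups for the loop above are to drop synchronous delivery, use opportunistic mode so a fast thumbnail arrives first, and pre-warm a PHCachingImageManager for the whole batch. A sketch of that approach (written in Swift, mirroring the Objective-C names above):

import Photos
import UIKit

let cachingManager = PHCachingImageManager()

func loadThumbnails(for assets: [PHAsset],
                    targetSize: CGSize = CGSize(width: 100, height: 100)) {
    let options = PHImageRequestOptions()
    options.isSynchronous = false          // don't block the calling thread
    options.deliveryMode = .opportunistic  // quick degraded image first, better one later
    options.resizeMode = .fast

    // Warm the cache for the whole batch up front.
    cachingManager.startCachingImages(for: assets,
                                      targetSize: targetSize,
                                      contentMode: .aspectFill,
                                      options: options)

    for asset in assets {
        cachingManager.requestImage(for: asset,
                                    targetSize: targetSize,
                                    contentMode: .aspectFill,
                                    options: options) { image, _ in
            // Build the cell DTO / update the UI on the main thread here.
            _ = image
        }
    }
}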
Open an app from the photos share sheet
I'd like to add a share extension to my app (an Action app extension, I think). The extension would appear when users share a photo in the Photos app (and, ideally, Safari). If you tapped my app icon on the share sheet, iOS would pass the photo to my app and switch the user from Photos or Safari to my full app, with the shared photo(s) available for my app to work with.

I know this is possible, because Instagram (a third-party app) works exactly like this. If you look at an image in the Photos app, tap Share and then tap Instagram, iOS will background the Photos app, activate the Instagram app and let you edit and post your photo in the main Instagram app.

It seems like NSExtensionContext's open(_:completionHandler:) might do this if I add a custom URL to my main app, but the documentation for that says: "Each extension point determines whether to support this method, or under which conditions to support this method. In iOS, the Today and iMessage app extension points support this method." That would rule out an Action, Photo Editing or Share extension. But then how does Instagram do this, and how can I achieve the same in my app?

I know that it's possible for an Action, Photo Editing or Share extension to open as a mini-app on top of the app providing the content. But coordinating the IPC for that is much, much more work (for my particular app) than just switching the user over to the app, with full access to all the functionality and data that my main app usually has access to.
Replies: 0 · Boosts: 0 · Views: 457 · Created: Jan ’25
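For the open(_:completionHandler:) route the post mentions, a sketch of the call from an extension view controller; the documentation quoted in the post is the caveat, so whether this is honored from an Action or Share extension is unverified, and myapp://import is a hypothetical URL scheme registered by the containing app:

import UIKit

final class ShareViewController: UIViewController {
    func openContainingApp() {
        guard let url = URL(string: "myapp://import") else { return }
        // May be ignored outside the Today / iMessage extension points.
        extensionContext?.open(url) { success in
            print("open(_:) returned \(success)")
        }
    }
}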