Hello,
I'm trying to receive parquet files using the example provided in the documentation. I've done all the required steps but constantly receive error 500 with "Upstream Service Error". Looking at the issues list, it seems this error has existed for months. Is it possible to get it working?
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
Who can help me make an app for iOS with Xcode?
Topic: Media Technologies
SubTopic: Streaming
iPadOS 18.3 beta 3 (22D5055b) fixed the issue for me and my 7th generation iPad.
Topic: Media Technologies
SubTopic: Audio
Our team conducted security testing and found one vulnerability in FairPlay license acquisition.
Our QA engineer manually changed the device's system date and time (setting it 4 days into the future) and was able to successfully obtain a license response and initiate playback on an iOS device. However, on an Android device, the license acquisition failed.
Can you please tell us whether Time Manipulation Detection is available in the FairPlay SDK?
In our Apple Vision Pro (AVP) project, a picker pops up that should filter for spatial videos only. When one of the spatial videos is selected, the selection result comes back empty. How can we obtain the video the user selected and get the file's path and URL?
The code is as follows:
PhotosPicker(selection: $selectedItem, matching: .videos) {
    Text("Choose a spatial photo or video")
}
func loadTransferable(from imageSelection: PhotosPickerItem) -> Progress {
    return imageSelection.loadTransferable(type: URL.self) { result in
        DispatchQueue.main.async {
            // guard imageSelection == self.imageSelection else { return }
            print("Loaded selection: \(result)")
            switch result {
            case .success(let url?):
                self.selectSpatialVideoURL = url
                print("Got video URL: \(url)")
            case .success(nil):
                // Handle the success case with an empty value.
                break
            case .failure(let error):
                print("Spatial video error: \(error)")
                // Handle the failure case with the provided error.
            }
        }
    }
}
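If loading URL.self keeps coming back empty, one approach we are considering (an assumption on our side, modeled on Apple's photo-picker sample code; the Movie type name is our own) is a custom Transferable that copies the received file out of the picker's temporary location:

import Foundation
import CoreTransferable
import UniformTypeIdentifiers

// Hypothetical helper: receives the picked video as a file and copies it to
// a location the app can keep reading after the transfer completes.
struct Movie: Transferable {
    let url: URL

    static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(contentType: .movie) { movie in
            SentTransferredFile(movie.url)
        } importing: { received in
            // Copy out of the picker's sandbox before the source file goes away.
            let copy = URL.documentsDirectory.appending(path: received.file.lastPathComponent)
            try? FileManager.default.removeItem(at: copy)
            try FileManager.default.copyItem(at: received.file, to: copy)
            return Movie(url: copy)
        }
    }
}

The loadTransferable call above would then use Movie.self instead of URL.self.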
Hello, my company is developing a product that will send data to/from the phone over cable and Wi-Fi. I have three questions:
Do we need an MFi authentication chip in our product if we plan to send video and commands to the iPhone/iPad over a USB or Lightning cable?
Likewise, do we need an MFi authentication chip for communication over Wi-Fi? (Informal research suggests the answer to this one is no.)
And do we even still need MFi certification at all for Wi-Fi comms? (We are not using HomeKit.)
Thank you!
Topic:
Media Technologies
SubTopic:
Photos & Camera
I'd like to add a share extension to my app (an Action app extension, I think). The extension would appear when users share a photo in the Photos app (and, ideally, Safari). If you tapped my app icon on the share sheet, iOS would pass the photo to my app and switch the user from Photos or Safari to my full app, with the shared photo(s) available for my app to work with.
I know this is possible, because Instagram (a third-party app) works exactly like this. If you look at an image in the Photos app, tap Share and then tap Instagram, iOS will background the Photos app, activate the Instagram app and let you edit and post your photo in the main Instagram app.
It seems like NSExtensionContext#open(_:completionHandler:) might do this if I add a custom URL scheme to my main app, but the documentation for that says:
Each extension point determines whether to support this method, or under which conditions to support this method. In iOS, the Today and iMessage app extension points support this method.
That would rule out an Action, Photo Editing or Share extension. But then how does Instagram do this, and how can I achieve the same in my app?
I know that it's possible for an Action, Photo Editing or Share extension to open as a mini-app on top of the app providing the content. But coordinating the IPC for that is much, much more work (for my particular app) than just switching the user over to the app, with full access to all the functionality and data that my main app usually has access to.
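For concreteness, here is the kind of call I mean, from inside the extension's view controller; the myapp://share URL is a placeholder for my app's custom scheme:

// Sketch: ask iOS to open the host app from the extension (placeholder URL).
if let url = URL(string: "myapp://share") {
    extensionContext?.open(url) { success in
        // Reportedly returns false / does nothing on unsupported extension points.
        print("open succeeded: \(success)")
    }
}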
I want to modify a photo's EXIF information, which means passing the original image through CGImageDestinationAddImageFromSource and supplying the modified EXIF information in the properties parameter.
Here is the complete process:
static func setImageMetadata(asset: PHAsset, exif: [String: Any]) {
    let options = PHContentEditingInputRequestOptions()
    options.canHandleAdjustmentData = { (adjustmentData) -> Bool in
        return true
    }
    asset.requestContentEditingInput(with: options, completionHandler: { input, info in
        guard let inputNN = input else {
            return
        }
        guard let url = inputNN.fullSizeImageURL else {
            return
        }
        let output = PHContentEditingOutput(contentEditingInput: inputNN)
        let adjustmentData = PHAdjustmentData(formatIdentifier: AppInfo.appBundleId(), formatVersion: AppInfo.appVersion(), data: Data())
        output.adjustmentData = adjustmentData
        let outputURL = output.renderedContentURL
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
            return
        }
        guard let dest = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else {
            return
        }
        CGImageDestinationAddImageFromSource(dest, source, 0, exif as CFDictionary)
        let d = CGImageDestinationFinalize(dest)
        // d is true, and I checked the content of outputURL: the image was
        // written correctly, converts to a UIImage, and looks fine.
        PHPhotoLibrary.shared().performChanges {
            let changeReq = PHAssetChangeRequest(for: asset)
            changeReq.contentEditingOutput = output
        } completionHandler: { succ, err in
            if !succ {
                print(err) // 3303 here, always!
            }
        }
    })
}
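One detail I am not sure about (this is an assumption, not a confirmed fix): CGImageDestinationAddImageFromSource takes a whole image-properties dictionary, so the EXIF tags may need to be nested under kCGImagePropertyExifDictionary rather than passed as the top-level dictionary:

// Sketch: wrap the EXIF tags in a top-level properties dictionary.
let properties: [CFString: Any] = [kCGImagePropertyExifDictionary: exif]
CGImageDestinationAddImageFromSource(dest, source, 0, properties as CFDictionary)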
I am developing an app that uses MusicKit to play music; I then need spoken words played to the user while ducking the audio coming from MusicKit (ApplicationMusicPlayer).
The built-in Siri voices are not of sufficient quality, so I am using an external service to create an MP3 file and then playing it back using AVAudioSession.
Sample code below.
The problem I am having is that .duckOthers is not ducking the ApplicationMusicPlayer output.
Is this a bug, or am I doing this wrong?
// Configure the audio session for system-wide ducking
try AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio, options: [.duckOthers, .mixWithOthers])
try AVAudioSession.sharedInstance().setActive(true)
// Request a small IO buffer duration (note: this affects latency, not the ducking level)
try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.005)
// Create and configure the audio player
self.audioPlayer = try AVAudioPlayer(data: audioData)
self.audioPlayer?.delegate = self
self.audioPlayer?.volume = 1.0 // Ensure full volume for speech
self.audioPlayer?.prepareToPlay()
// Configure player settings for maximum clarity
self.audioPlayer?.enableRate = false
self.audioPlayer?.pan = 0.0 // Center the audio
self.audioPlayer?.play()
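For reference, here is the overall flow I am attempting, condensed to what I understand to be the standard duck/unduck pattern (deactivating with .notifyOthersOnDeactivation is what should restore the music volume once speech ends):

import AVFoundation

// Sketch of the duck/unduck flow; whether it applies to the
// ApplicationMusicPlayer output is exactly my question.
func playSpeech(_ audioData: Data) throws {
    let session = AVAudioSession.sharedInstance()
    // .duckOthers implicitly mixes with other audio while this session is active.
    try session.setCategory(.playback, mode: .spokenAudio, options: [.duckOthers])
    try session.setActive(true) // other audio should duck here
    audioPlayer = try AVAudioPlayer(data: audioData)
    audioPlayer?.delegate = self
    audioPlayer?.play()
}

// AVAudioPlayerDelegate: restore other audio once speech finishes.
func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    try? AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
}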
Hi there,
Can anyone tell me how to possibly get approved as an Apple News Publisher in 2025?
We attempted in 2024, but received this message from Apple support:
"Thank you for your interest in Apple News. At this time, we're not accepting new applications."
When I inquired further, I got this second response:
"Apple News is no longer accepting unsolicited applications. To learn more about Apple News requirements, visit the Apple News support page. If you have any feedback, please use this form to send us your comments. Keep in mind that while we read all feedback, we are unable to respond to each submission individually."
My questions are:
Is this still the case? (Especially for a legitimate local news outlet.)
Is there a link to apply as a news publisher? I don't seem to have that option at all.
Thanks for any feedback.
I prefer to use the album fetched from the library instead of the catalog, since this is faster. If I do so, how can I check whether all tracks of an album have been added to the library? In that case I'd like to fetch the catalog version or throw an error (for example, when offline).
Using .with(.tracks) on the library album fetches the tracks added to the library.
The trackCount property is referring to the tracks that can be fetched from the library.
The isComplete property is always nil when fetching from the library.
One possible way is checking the trackNumber and discCount properties. However, this only detects an incomplete album when a song that is missing comes before one that is present; an album missing only its last tracks slips through (see the sketch below). I'd like to be able to handle this edge case as well.
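A sketch of that heuristic (assuming trackNumber is populated for library tracks; multi-disc albums would need the same check per disc):

import MusicKit

// Flags an album as incomplete only when the fetched track numbers don't
// form a contiguous 1...n run, so missing trailing tracks are not detected.
func looksIncomplete(_ tracks: MusicItemCollection<Track>) -> Bool {
    let numbers = tracks.compactMap { $0.trackNumber }.sorted()
    guard !numbers.isEmpty else { return true }
    return numbers != Array(1...numbers.count)
}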
Is there currently a way to do this? I'd prefer not to rely on the Apple Music catalog, since this is supposed to work offline as well. Fetching and storing all track IDs while connected and later comparing against them would work, but that would potentially mean storing tens of thousands of track IDs.
Thank you
I can't play video content with HEVC and DRM. Tested HEVC only: OK. Tested DRM+AVC: OK.
Tested two players (Clappr/Stevie and BitMovin).
The master, the variants, and the EXT-X-MAP segments download OK, the DRM keys are delivered OK, and then, for instance with the BitMovin player:
[BMP] [Player] [Error] Event: SourceError, Data: {"code":2001,"data":{"message":"The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","code":-12927},"message":"Source Error. The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","timestamp":1740320663.4505711,"type":"onSourceError"} code: 2001 [Data code: -12927, message: The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.), underlying error: Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"]
4k-master.m3u8.txt
4k.m3u8.txt
4k-audio.m3u8.txt
I need to make an app for iOS. How do I get code for it in Xcode?
Topic: Media Technologies
SubTopic: Streaming
ApplicationMusicPlayer with a queue created from a playlist crashes at random shortly after skipping back or forth using the controls embedded in the notification, with this error in the console log: applicationController: xpc service connection interrupted.
I've noticed that the issue occurs more frequently the shorter the time between skips is. Since ApplicationMusicPlayer runs in a remote process, the main app does not crash, but the music stops playing without any exception and the playback controls become unresponsive.
Here is how I'm initiating the queue:
let entries = playlist
    .with(.entries).entries!
    .map { ApplicationMusicPlayer.Queue.Entry($0) }

ApplicationMusicPlayer.shared.queue = .init(
    entries, startingAt: entries.last
)
Please give me some tips on how to solve this.
EDIT:
The issue does not occur when navigating quickly through the station.
Hi. I work on an audio app for iOS which is successfully using the MPRemoteCommandCenter for commands like next, back, skip forward, skip backward etc.
I am trying to implement playback rate controls in my app (so that users can change the playback speed of audio to 0.5x or 2x for example).
While the above commands work, the changePlaybackRateCommand does not seem to. I have enabled the command, given it a target/handler, and set supported rates. With the other commands, this changed the UI on the lock screen, in Control Center, etc., by adding the control for the command (a next button for the next command, for example). However, it does not seem to do anything for the playback rate command.
I can implement my own "rate button" UI and rate change handling, but I'm wondering if this is a known bug within Apple? Looking online, it seems other people face the same issue and haven't been able to get this command to work. Why is this API provided if it doesn't seem to do anything? Is there something I'm missing?
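For reference, my setup is essentially the following, where player stands in for my app's actual player object:

import MediaPlayer

// Sketch of the changePlaybackRateCommand setup described above.
let command = MPRemoteCommandCenter.shared().changePlaybackRateCommand
command.isEnabled = true
command.supportedPlaybackRates = [0.5, 1.0, 1.5, 2.0]
command.addTarget { event in
    guard let event = event as? MPChangePlaybackRateCommandEvent else {
        return .commandFailed
    }
    player.rate = event.playbackRate // `player` is my app's player (assumption)
    return .success
}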
Kind regards.
Topic: Media Technologies
SubTopic: Audio
Hi there, I have an app which accesses the media library to save and load files. Since iOS 18.2, access to the media library has stopped working.
Now I've noticed that our app doesn't show in the list of apps with access to Files (Privacy & Security -> Files & Folders).
Weirder still, one iPhone on iOS 18.3.1 can access the files, but others on the same iOS version 18.3.1 cannot. Testing on Simulators (Mac) works fine as well.
My Info.plist has had the keys to access the media library for a long time and hasn't changed (at least in the last 4 years), including the key "Privacy - Media Library Usage Description".
I've also noticed that the popup requesting access to the media library, shown when using the app for the first time, doesn't appear anymore. We request access to the network (Wi-Fi) and that message still shows up, but not the media library one.
I'm using Visual Studio with Xamarin on a Mac.
I'd really appreciate any help you can give, because this is very odd behavior and it started with iOS 18.2.
Hello,
I'm observing an intermittent memory leak being reported in the iOS Simulator when initializing and starting an AVAudioEngine. Even with minimal setup—just attaching a single AVAudioPlayerNode and connecting it to the mainMixerNode—Xcode's memory diagnostics and Instruments sometimes flag a leak.
Here is a simplified version of the code I'm using:
// This function is called when the user taps a button in the view controller:
#import "ViewController.h"
@interface ViewController ()
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
}
- (IBAction)myButtonAction:(id)sender {
NSLog(@"Test");
soundCreate();
}
@end
// media.m
#import <AVFoundation/AVFoundation.h>

static AVAudioEngine *audioEngine = nil;

void soundCreate(void)
{
    if (audioEngine != nil)
        return;

    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    audioEngine = [[AVAudioEngine alloc] init];
    AVAudioPlayerNode *playerNode = [[AVAudioPlayerNode alloc] init];
    [audioEngine attachNode:playerNode];
    [audioEngine connect:playerNode to:[audioEngine mainMixerNode] format:nil];
    [audioEngine startAndReturnError:nil];
}
In the memory leak report, the following call stack is repeated, seemingly in a loop:
ListenerMap::InsertEvent(XAudioUnitEvent const&, ListenerBinding*) AudioToolboxCore
ListenerMap::AddParameter(AUListener*, void*, XAudioUnitEvent const&) AudioToolboxCore
AUListenerAddParameter AudioToolboxCore
addOrRemoveParameterListeners(OpaqueAudioComponentInstance*, AUListenerBase*, AUParameterTree*, bool) AudioToolboxCore
0x180178ddf
I found some documentation about kexts, but I heard development has since moved to dexts (DriverKit extensions).
So I was wondering whether a dext can completely take over what a kext used to do.
https://developer.apple.com/documentation/kernel/implementing_drivers_system_extensions_and_kexts
That page includes this note:
Important
In macOS 11 and later, the kernel doesn’t load a kext if an equivalent DriverKit solution exists. You may continue to use kexts in macOS 10.15 and earlier.
Topic: Media Technologies
SubTopic: General
A few months ago, I had the opportunity to receive a 2018 iMac, and I’ve been using it to create content for my social media. I was truly impressed by the power of its processors. Even with this older model, I’ve been able to grow my presence online—something I couldn’t achieve with newer computers from other brands that I previously purchased.
I would love to become a promoter of your brand in the gaming world. All I ask for is technological support with more recent equipment and a minimal payment for collaborating with you. I am genuinely interested in being part of your company and leveraging the potential and reputation of Apple to reach even greater heights.
Topic: Media Technologies
SubTopic: Streaming
Tags: GameplayKit, External Graphics Processors, Developer Tools
I've been trying to use AVMIDIControlChangeEvent with a bankSelect message type to change the instrument the sequencer uses on an AVMusicTrack, with no luck.
I started with Apple's AVAEMixerSample, converting the initial setup/loading and the portions dealing with the sequencer to Swift. I got that working and playing the "bluesyRiff", and then modified it to play individual notes. So my createAndSetupSequencer looked like:
func createAndSetupSequencer() {
    sequencer = AVAudioSequencer(audioEngine: engine)
    // guard let midiFileURL = Bundle.main.url(forResource: "bluesyRiff", withExtension: "mid") else {
    //     print("failed guard trying to get URL for bluesyRiff")
    //     return
    // }
    let track = sequencer.createAndAppendTrack()
    var currTime = 1.0
    for i: UInt32 in 0...8 {
        let newNoteEvent = AVMIDINoteEvent(channel: 0, key: 60 + i, velocity: 64, duration: 2.0)
        track.addEvent(newNoteEvent, at: AVMusicTimeStamp(currTime))
        currTime += 2.0
    }
}
The notes played, so I then also replaced the gs_instruments sound bank with GeneralUser GS MuseScore v1.442, first by trying:

guard let soundBankURL = Bundle.main.url(forResource: "GeneralUser GS MuseScore v1.442", withExtension: "sf2") else {
    return
}
do {
    try sampler.loadSoundBankInstrument(at: soundBankURL, program: 0x001C, bankMSB: 0x79, bankLSB: 0x08)
} catch {
    ...
}
This appears to work: the instrument (bankLSB 8, "Funk Guitar") plays. If I change to bankLSB: 0x00 I get the "Palm Muted Guitar". So I know that the soundfont has these instruments.
Stuff goes off the rails when I try to change the instruments in createAndSetupSequencer. Putting
let programChange = AVMIDIProgramChangeEvent(channel: 0, programNumber: 0x001C)
let bankChange = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.bankSelect, value: 0x00)
track.addEvent(programChange, at: AVMusicTimeStamp(1.0))
track.addEvent(bankChange, at: AVMusicTimeStamp(1.0))
just before my add-note loop doesn't produce any change. Loading bankLSB 8 (Funk) in sampler.loadSoundBankInstrument and trying to change with bankSelect 0 (Palm Muted) in createAndSetupSequencer results in instrument 8 (Funk) playing, not Palm Muted.
Loading bankLSB 0 (Palm Muted) and trying to change with bankSelect 8 (Funk) doesn't work either; 0 (Palm Muted) plays.
I also tried sampler.loadInstrument(at: soundBankURL), and then I always get the first instrument in the sound font file (piano) no matter what values I put in my programChange/bankChange.
I've also changed the time in track.addEvent to 0, 1.0, 3.0, etc., with no success.
sampler.loadSoundBankInstrument specifies two UInt8 parameters, bankMSB and bankLSB, while the AVMIDIControlChangeEvent bankSelect value is a UInt32, suggesting it might be some combination of bankMSB and bankLSB. But the documentation makes no mention of what this should look like. I tried various combinations (0x7908, 0x0879, etc.) to no avail.
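For what it's worth, in raw MIDI a bank select is two controller messages (CC0 for the MSB, CC32 for the LSB) followed by a program change, which may be why a single UInt32 bankSelect value is ambiguous here. Outside the sequencer I can at least switch programs directly on the sampler with the AVAudioUnitMIDIInstrument API (a sketch using the values from the loads above, not a sequenced solution):

// Immediate (non-sequenced) program change with explicit bank bytes.
sampler.sendProgramChange(0x1C, bankMSB: 0x79, bankLSB: 0x08, onChannel: 0)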
I will also point out that I am able to successfully execute other control change events. For example, adding
if i == 1 {
    let portamentoOnEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamento, value: 0xFF)
    track.addEvent(portamentoOnEvent, at: AVMusicTimeStamp(currTime))
    let portamentoRateEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamentoTime, value: 64)
    track.addEvent(portamentoRateEvent, at: AVMusicTimeStamp(currTime))
}
does produce a change in the sound. (As an aside, a definition of portamento time other than "the rate of portamento" would be welcome. Is it notes/second? Hz/minute? beats/hour?)
I was able to get the instrument to change in a different program using MusicPlayer and a series of MusicTrackNewMIDIChannelEvents on a track, but those operate on a MusicTrack, not the AVMusicTrack the sequencer uses.
Has anyone been successful in switching instruments through an AVMIDIControlChangeEvent or have any feedback on how to do this?