Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic. Each post is listed with its reply count, boost count, view count, and most recent activity.

AVCaptureDevice rotationCoordinator modifying CALayer on switching devices
I am trying to use the AVCaptureDevice.rotationCoordinator API to observe rotation angles for preview and capture, and there seems to be an issue with the API when it is used with an arbitrary CALayer (one that is not an AVCaptureVideoPreviewLayer) and the cameras are switched. Here is my setup. The function below is defined in an actor class called CameraManager that performs the setup of the rotationCoordinator.

func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
    guard let device = sessionConfiguration.activeVideoInput?.device,
          let displayLayer = displayLayer else { return }

    cancellables.removeAll()

    rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    guard let coordinator = rotationCoordinator else { return }

    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated {
                callback(radians)
            }
        }
        .store(in: &cancellables)
}

This works the very first time, but when I switch cameras and call this function again, it throws a runtime error saying the view's layer was modified from a non-main thread. This happens at the very line where the rotation coordinator is being recreated. It is not clear why initializing the rotation coordinator should modify CALayer properties right in its init method.

Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace: (
0 UIKitCore 0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
1 UIKitCore 0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
2 QuartzCore 0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
3 QuartzCore 0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
4 QuartzCore 0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
5 QuartzCore 0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
6 AVFCapture 0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
7 MyApp.debug.dylib 0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
8 MyApp.debug.dylib 0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
9 MyApp.debug.dylib 0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
10 MyApp.debug.dylib 0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
11 MyApp.debug.dylib 0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
12 MyApp.debug.dylib 0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
13 libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
)
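
A minimal sketch of one possible workaround, assuming the coordinator can be recreated on the main actor (the device, displayLayer, and RotationCoordinator names mirror the post; this is an illustrative hop to the main thread, not a verified fix):

import AVFoundation
import QuartzCore

// Create the RotationCoordinator on the main actor, since its initializer
// appears to touch the supplied layer. The calling actor (e.g. CameraManager)
// awaits the result and then wires up the Combine subscription as before.
func makeRotationCoordinator(device: AVCaptureDevice,
                             layer: CALayer) async -> AVCaptureDevice.RotationCoordinator {
    await MainActor.run {
        AVCaptureDevice.RotationCoordinator(device: device, previewLayer: layer)
    }
}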
Replies: 2 · Boosts: 0 · Views: 570 · Feb ’25
Missing Depth Frames When Recording with AVCaptureVideoDataOutputSampleBufferDelegate/AVCaptureDataOutputSynchronizerDelegate and AVAssetWriter
I’ve tried both AVCaptureVideoDataOutputSampleBufferDelegate (captureOutput) and AVCaptureDataOutputSynchronizerDelegate (dataOutputSynchronizer), but the number of depth frames and saved timestamps is significantly lower than the number of frames in the .mp4 file written by AVAssetWriter. In my code, I save:

- Timestamps for each frame to a metadata file
- Depth frames to a binary file
- Video to an .mp4 file

If I record a 4-second video at 30fps, the .mp4 file correctly plays for 4 seconds, but the number of stored timestamps and depth frames is much lower: around 70 frames instead of the expected 120. Does anyone know why this mismatch happens?

func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                            didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    // Read all outputs
    guard let syncedDepthData: AVCaptureSynchronizedDepthData = synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData,
          let syncedVideoData: AVCaptureSynchronizedSampleBufferData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
        // only work on synced pairs
        return
    }

    if syncedDepthData.depthDataWasDropped || syncedVideoData.sampleBufferWasDropped {
        return
    }

    let depthData = syncedDepthData.depthData
    let depthPixelBuffer = depthData.depthDataMap
    let sampleBuffer = syncedVideoData.sampleBuffer

    guard let videoPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
          let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) else {
        return
    }

    addToPreviewStream?(CIImage(cvPixelBuffer: videoPixelBuffer))

    if !canWrite() {
        return
    }

    // Extract the presentation timestamp (PTS) from the sample buffer
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // sessionAtSourceTime is the first buffer we will write to the file
    if self.sessionAtSourceTime == nil {
        // Make sure we don't start recording until the buffer reaches the correct time
        // (buffer is always behind, this will fix the difference in time)
        guard sampleBuffer.presentationTimeStamp >= self.recordFromTime! else { return }
        self.sessionAtSourceTime = sampleBuffer.presentationTimeStamp
        self.videoWriter!.startSession(atSourceTime: sampleBuffer.presentationTimeStamp)
    }

    if self.videoWriterInput!.isReadyForMoreMediaData {
        self.videoWriterInput!.append(sampleBuffer)
        self.videoTimestamps.append(
            Timestamp(
                frame: videoTimestamps.count,
                value: timestamp.value,
                timescale: timestamp.timescale
            )
        )
        let ddm = depthData.depthDataMap
        depthCapture.addDepthData(pixelBuffer: ddm, timestamp: timestamp)
    }
}
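
One thing worth ruling out (a diagnostic sketch, not a confirmed explanation): on many devices the active depth data format cannot deliver 30 depth maps per second, and the synchronizer then only emits pairs at the depth rate, which by itself would account for roughly 70 frames over 4 seconds. Assuming videoDevice, depthDataOutput, and videoDataOutput are the objects from your session configuration, this logs the negotiated depth rate and keeps late data instead of dropping it:

import AVFoundation

func inspectDepthThroughput(videoDevice: AVCaptureDevice,
                            depthDataOutput: AVCaptureDepthDataOutput,
                            videoDataOutput: AVCaptureVideoDataOutput) {
    // Frame-rate ranges of the depth format actually in use; these are often below 30 fps.
    if let depthFormat = videoDevice.activeDepthDataFormat {
        for range in depthFormat.videoSupportedFrameRateRanges {
            print("depth fps range:", range.minFrameRate, "-", range.maxFrameRate)
        }
    }
    // Minimum frame duration currently applied to depth delivery.
    print("activeDepthDataMinFrameDuration:", videoDevice.activeDepthDataMinFrameDuration.seconds)

    // Deliver late buffers instead of silently discarding them (at the cost of latency/memory).
    depthDataOutput.alwaysDiscardsLateDepthData = false
    videoDataOutput.alwaysDiscardsLateVideoFrames = false
}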
Replies: 3 · Boosts: 0 · Views: 447 · Feb ’25
Why do personal music recommendations contain no more than 12 items?
Hi! I get a personal recommendations MusicItemCollection using this code:

func getRecommendations() async throws -> MusicItemCollection<MusicPersonalRecommendation> {
    let request = MusicPersonalRecommendationsRequest()
    let response = try await request.response()
    let recommendations = response.recommendations
    return recommendations
}

However, all recommendations contain no more than 12 MusicItems, while the Music app provides many more for some recommendations; for example, for the "You recently listened" recommendation, the Music app displays 40 items. Each recommendation has an items property that contains a collection of music items, MusicItemCollection<MusicPersonalRecommendation.Item>, and the hasNextBatch property for these collections is always false. I expected that loading additional items would be available for some collections. Please tell me if I'm doing something wrong, or is this a MusicKit bug? Thank you!
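
I don't have a definitive answer, but when the typed MusicKit request doesn't expose a knob, one way to investigate is to call the underlying Apple Music API endpoint through MusicDataRequest and inspect the raw JSON. A minimal sketch (the /v1/me/recommendations route is the documented endpoint; whether any of its query parameters raises the per-recommendation item count is an assumption to verify):

import Foundation
import MusicKit

func fetchRawRecommendations() async throws -> Data {
    // MusicDataRequest reuses MusicKit's developer/user tokens for the raw call.
    let url = URL(string: "https://api.music.apple.com/v1/me/recommendations?limit=30")!
    let request = MusicDataRequest(urlRequest: URLRequest(url: url))
    let response = try await request.response()
    // Inspect the JSON to see how many items each recommendation actually carries server-side.
    return response.data
}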
Replies: 1 · Boosts: 0 · Views: 454 · Feb ’25
AVPlayer error: Too many open files
For some users in production, there's a high probability that after launching the app, using AVPlayer to play any local audio resource results in the following error. Restarting the app doesn't help.

issue: [error: Error Domain=AVFoundationErrorDomain Code=-11800 "这项操作无法完成" UserInfo={NSLocalizedFailureReason=发生未知错误(24), NSLocalizedDescription=这项操作无法完成, NSUnderlyingError=0x30311f270 {Error Domain=NSPOSIXErrorDomain Code=24 "Too many open files"}}

(The localized strings translate to "The operation could not be completed" and "An unknown error occurred (24)".)

I've checked the code, and there aren't actually multiple AVPlayers playing simultaneously. What could be causing this?
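
Error 24 is POSIX EMFILE, meaning the process has exhausted its file-descriptor limit, so the leak may come from something other than the player itself (file handles, sockets, caches). A small diagnostic sketch that counts the descriptors currently open in the process; logging it over time can reveal growth before playback starts failing:

import Darwin

func openFileDescriptorCount() -> Int {
    var count = 0
    // Probe every descriptor slot; fcntl fails with EBADF for slots that are not open.
    for fd in 0..<getdtablesize() {
        if fcntl(fd, F_GETFD) != -1 {
            count += 1
        }
    }
    return count
}

// Example: print("open file descriptors:", openFileDescriptorCount())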
Replies: 0 · Boosts: 0 · Views: 417 · Feb ’25
Build a vision capture system for Apple Vision Pro
Hi, I am a newbie here. We have been given a task to build a robotic vision system to capture an immersive video in a hazed environment, which will later be played on Apple Vision Pro. I am thinking of starting with 2 or 4 basic CMOS camera sensors, such as the IMX378, AR0144, or VD66GY, and designing an FPGA-based circuit to synchronously capture and store raw frame-by-frame data. Some initial frame processing, such as demosaicing and filtering, can also be done by the FPGA. Then I would use software for post-processing to convert the data into a video format compatible with Apple Vision Pro. Will this idea work? I can handle the raw data capture, but I'm unsure whether this approach is feasible and what post-processing software I should use. Thanks a lot for your suggestions! Charlie
Replies: 0 · Boosts: 0 · Views: 311 · Feb ’25
How to write RGB & Depth Frames Without Losing Synchronization
I’m currently working on a project where I capture both depth frames and RGB frames using AVCaptureDataOutputSynchronizer. Depth frames are stored as raw binary data and RGB frames are saved with AVAssetWriter. The issue I’m facing is that AVAssetWriter enforces a fixed framerate, meaning it adds or discards frames to maintain that rate (as I understand it). This causes a desynchronization between the depth and RGB frames, which is a problem because I need each depth frame to be exactly matched with the corresponding RGB frame as they were captured. How can I ensure that the RGB frames are saved without AVAssetWriter modifying the frame count?
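
For what it's worth, AVAssetWriter does not by itself resample to a fixed frame rate when sample buffers are appended manually; each appended buffer is written with the presentation timestamp it was given, unless a pixel-buffer adaptor with synthesized timestamps or an explicit frame-duration/compression override is in play. A sketch of a writer-input configuration that preserves capture timestamps (the codec and dimensions are illustrative assumptions):

import AVFoundation

func makeVideoWriterInput(width: Int, height: Int) -> AVAssetWriterInput {
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
        // No expected-frame-rate or frame-duration overrides here, so appended
        // sample buffers keep their original presentation timestamps.
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true // capture-style, timestamp-driven appends
    return input
}

// When appending, record the same PTS that goes to the writer so the depth file
// and the movie share a single clock:
// let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)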
Replies: 0 · Boosts: 0 · Views: 363 · Feb ’25
Music Keeps cutting off
Every time I put my AirPods in and connect them to my phone, my Mac, or my iPad, they disconnect without reason, pause songs I'm in the middle of playing, and only partially reconnect in one pod. This has been happening since the iOS 18.3 update on my devices, and it's getting really frustrating.
Replies: 1 · Boosts: 0 · Views: 316 · Feb ’25
Does PHAsset localIdentifier change after device transfer?
Hi, I’m working on a photo backup app. I track the PHAsset localIdentifier to determine which photos have been backed up and which haven’t. Recently, I’ve noticed that two users seem to have experienced localIdentifier changes after transferring data to a new iPhone using Quick Start. Additionally, others on Stack Overflow have mentioned that the localIdentifier sometimes changes after updating the iOS version.
https://stackoverflow.com/questions/40094728/phobject-localidentifier-reliability
I’d like to confirm the reliability of the localIdentifier after an iOS version upgrade or device transfer. Can I continue using these locally stored localIdentifiers? Or is there another recommended approach, such as using PHCloudIdentifier?
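
For context: PHCloudIdentifier exists precisely because localIdentifier is not guaranteed to be stable across devices or library migrations. A sketch of mapping stored local identifiers to cloud identifiers with the documented PHPhotoLibrary API (error handling abbreviated):

import Photos

func cloudIdentifiers(forLocalIdentifiers localIDs: [String]) -> [String: PHCloudIdentifier] {
    var result: [String: PHCloudIdentifier] = [:]
    // One Result per input identifier; failures cover assets that no longer resolve.
    let mappings = PHPhotoLibrary.shared().cloudIdentifierMappings(forLocalIdentifiers: localIDs)
    for (localID, mapping) in mappings {
        if case .success(let cloudID) = mapping {
            result[localID] = cloudID
        }
    }
    return result
}

// On the new device, convert back with:
// PHPhotoLibrary.shared().localIdentifierMappings(for: storedCloudIdentifiers)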
Replies: 2 · Boosts: 0 · Views: 383 · Feb ’25
Unknown photos were added to my iPhone Photos app.
Several unknown pictures have been added to my iPhone Photos library. I'm developing an iOS app in SwiftUI, and I'm sure that my in-development app never asked for permission to save photos to the device, and that these pictures were not taken by me, downloaded from the web, or synced from my iCloud. Why does this happen? Has my account or my iPhone been attacked or hacked? By the way, my device has debug mode turned on; could this setting lead to some vulnerability, or could some debug tool add photos to my albums? The attached pictures are the unknown photos in my Photos app. I've searched for this photo with Google and found that many developers on Stack Overflow use this picture when they run into trouble with image display or other manipulation. Where does this photo come from? Why does it exist in my photo albums?
Replies: 1 · Boosts: 0 · Views: 451 · Feb ’25
MusicKit media player missing output device selection
Hi all, I am working on a DJ playout app (macOS). The app has a few AVAudioPlayerNodes combined with the ApplicationMusicPlayer from MusicKit. I can route the output of the AVAudioPlayerNodes to a hardware device so that the audio files are directed to their own dedicated output on my Mac. The ApplicationMusicPlayer, however, follows the default output, and this is pretty annoying. Has anyone found a solution to get the ApplicationMusicPlayer routed to a specific output device? Thanks, Pancras
Replies: 2 · Boosts: 0 · Views: 568 · Feb ’25
Reference White Calculation for HDR Video Rendering in Metal
Our multimedia application Boinx FotoMagico displays media files of various kinds with a Metal rendering engine. At the moment we still use the .bgra8Unorm pixel format and sRGB color space and only render in SDR, which is increasingly a problem, as much of today's video content is HDR (e.g. videos shot on an iPhone). For that reason we would like to switch to EDR rendering with the .rgba16Float pixel format and extendedLinearDisplayP3 color space.

We have already worked out how to do this for HDR image files, but we still have a technical problem when rendering HDR video files. We are using AVFoundation to get the video frames as CVPixelBuffers and convert them to MTLTexture using a CVMetalTextureCache. The MTLTextures are then further processed in various compute shaders before being rendered to screen. However, the pixel values in the texture are not what we expected: video frames appear too bright/overexposed.

In the WWDC21 session "Explore HDR rendering with EDR", Ken Greenebaum mentioned: “AVFoundation does not presently decode HDR formats, such as HDR10, to EDR. Consequently, these need to be adapted for use with EDR rendering. This conversion is straightforward and involves two steps. First, converting to linear light by applying the inverse transfer function. And second, dividing by the medium's reference white.” https://developer.apple.com/videos/play/wwdc2021/10161?time=1498

However, the session does not explain how to get or calculate the correct value for "reference white". We could not find any relevant info on the web. This is why we need DTS assistance. We need the code that calculates the correct value for reference white for any kind of video, whether it is SDR or HDR, and regardless of codec and encoding. I assume that Ken Greenebaum is the best Apple engineer to ask in this case, because he recorded most of the EDR-related WWDC sessions in recent years?

We have written a small test app that renders a short sample video (HLG encoding). The window contains two views. The upper view uses an AVPlayerLayer and renders the video natively, just like QuickTime Player. The video content looks correct here. BTW, the window background is SDR white, so that bright EDR pixels can be clearly identified, e.g. the clouds just above the mountains in the upper left corner of the sample video. You may need to lower display brightness a bit if these clouds do not appear brighter than the white window background. The bottom view uses a CAMetalLayer and low-level Metal rendering. The CVPixelBuffers we receive from AVFoundation still need to be scaled down so that SDR reference white reaches pixel value 1.0. Entering a value of 9.0 to 10.0 for reference white in the text field makes it look about right on my Studio Display. But that is just experimental for this sample video file. We need code to calculate the correct value for reference white for any kind of video file!

We have a couple of questions:

- SDR videos should probably use 1.0 as reference white, as their encoded pixel values can already be used as is? Is this assumption correct?
- Will different HDR video encodings (HLG, PQ, etc.) lead to different values for reference white?
- Is the value for reference white constant throughout a video, or can it vary over time, either scene by scene, or even frame by frame? If it can vary, does the CVPixelBuffer of the current video frame contain all the necessary metadata to calculate the correct value?
- Does NSScreen.maximumExtendedDynamicRangeColorComponentValue also influence the reference white value?

The attached sample project is structured in a way that the only piece of code that needs to be modified is the ViewController.sdrReferenceWhiteValue() function. Please read the comments and the #warning in this function. This is where the code for calculating the reference white value should be inserted.

Here is the download link for the sample project: https://www.dropbox.com/scl/fi/4w5gmftav5xhbixu9u6pb/HDRMetalTest.zip?rlkey=n8cm02soux3rx03vplgo6h1lm&dl=0
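
Not an answer to the DTS request, but one building block: the transfer characteristics of each decoded frame are attached to the CVPixelBuffer, so the renderer can at least distinguish SDR, HLG, and PQ content before choosing a divisor. A sketch of reading that attachment (the CoreVideo keys are real; the returned numbers are placeholders to be replaced with properly derived reference-white values, not Apple-confirmed constants):

import CoreVideo

func approximateReferenceWhiteDivisor(for pixelBuffer: CVPixelBuffer) -> Float {
    // Transfer function attached by AVFoundation to the decoded frame, if any.
    let attachment = CVBufferGetAttachment(pixelBuffer, kCVImageBufferTransferFunctionKey, nil)?
        .takeUnretainedValue()
    guard let transferFunction = attachment as? String else {
        return 1.0 // no attachment: treat as SDR
    }

    if transferFunction == (kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ as String) {
        return 10.0 // placeholder for PQ content; derive the real value from your pipeline
    }
    if transferFunction == (kCVImageBufferTransferFunction_ITU_R_2100_HLG as String) {
        return 9.0  // placeholder matching the experimentally "about right" value from the post
    }
    return 1.0      // SDR transfer functions are assumed to already sit at reference white
}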
Replies: 9 · Boosts: 2 · Views: 671 · Feb ’25
AVSpeechSynthesizer & Bluetooth Issues
Hello, I have a CarPlay navigation app and use AVSpeechSynthesizer to speak directions to the user. Everything works great on my CarPlay simulator as well as when plugged into my GMC truck. However, I found out yesterday that for one of my users with a Ford truck the audio would cut in and out. After much troubleshooting, I was able to replicate this on my own truck when using Bluetooth to connect to CarPlay. My user was also using Bluetooth. Has anyone else experienced this? Is there a fix for the problem?

import SwiftUI
import AVFoundation

class TextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
    private var speechSynthesizer = AVSpeechSynthesizer()
    static let shared = TextToSpeechService()

    override init() {
        super.init()
        speechSynthesizer.delegate = self
    }

    func configureAudioSession() {
        speechSynthesizer.delegate = self
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .voicePrompt, options: [.mixWithOthers, .allowBluetooth])
        } catch {
            print("Failed to set audio session category: \(error.localizedDescription)")
        }
    }

    func speak(_ text: String) {
        Task(priority: .high) {
            let speechUtterance = AVSpeechUtterance(string: text)
            speechUtterance.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
            try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
            speechSynthesizer.speak(speechUtterance)
        }
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
        Task {
            stopSpeech()
            try AVAudioSession.sharedInstance().setActive(false)
        }
    }

    func stopSpeech() {
        speechSynthesizer.stopSpeaking(at: .immediate)
    }
}
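
A diagnostic sketch that may help narrow this down: logging audio route changes shows whether the dropouts coincide with the head unit flipping between Bluetooth profiles or between the CarPlay and Bluetooth routes (the logger uses standard AVAudioSession API; the interpretation is the assumption):

import AVFoundation

final class AudioRouteLogger {
    private var observer: NSObjectProtocol?

    func start() {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { note in
            // Why the route changed (new device, category change, override, ...).
            let reasonValue = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt ?? 0
            let reason = AVAudioSession.RouteChangeReason(rawValue: reasonValue)
            // Where audio is currently going (e.g. carAudio, bluetoothA2DP, bluetoothHFP).
            let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
                .map { "\($0.portType.rawValue) (\($0.portName))" }
            print("route change reason:", reason.map(String.init(describing:)) ?? "unknown", "outputs:", outputs)
        }
    }

    func stop() {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}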
Replies: 0 · Boosts: 0 · Views: 432 · Feb ’25
HDR10 MVHEVC cannot play on Safari
Hi, I just generated an HDR10 MVHEVC file; the mediainfo is below:

Color range: Limited
Color primaries: BT.2020
Transfer characteristics: PQ
Matrix coefficients: BT.2020 non-constant
Codec configuration box: hvcC+lhvC

Then I generate the segment files with the command below:

mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov

Then I upload the segment files and prog_index.m3u8 to a web server. I find that I cannot play the HLS stream in Safari; the URL is http://ip/vod/prog_index.m3u8

I also checked that if I remove the "Transfer characteristics: PQ" tag when generating the MVHEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream can play in Safari. Is there any way to play HLS PQ video in Safari? Thanks.
Replies: 2 · Boosts: 1 · Views: 825 · Feb ’25
Multiview HLS with HDR
I have an HDR10+ encoded video that, if loaded as a mov, plays back on the Apple Vision Pro, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro and I only get back a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, but the m3u8 will play back on the Mac using VLC and not on the Apple Vision Pro. The relevant part of the m3u8 is:

#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Replies: 3 · Boosts: 2 · Views: 1.1k · Feb ’25
How to receive AVMetricEvent performance data?
I would like to look at AVMetricEvent data during video playback, so I have added this code to a test video player app:

let playerItem: AVPlayerItem = ...
let allMetrics = playerItem.allMetrics()

Task.init {
    print("metrics task started")
    do {
        for try await metricEvent in allMetrics {
            print("metric event: \(metricEvent.description)")
        }
    } catch {
        print("unexpected metric iterator error \(error)")
    }
}

Running this in the Simulator on iPhone 16 Pro (18.0) does not result in any "metric event" diagnostic messages being printed when the video associated with this AVPlayerItem is playing. Only the "metrics task started" diagnostic message is seen. What am I doing wrong that prevents metric data from being received?
Replies: 2 · Boosts: 0 · Views: 383 · Feb ’25
On iOS 18, Mandarin is read aloud as Cantonese
Please include the line below in follow-up emails for this request.
Case-ID: 11089799

When using AVSpeechUtterance and setting it to play in Mandarin, if Siri is set to Cantonese on iOS 18, it will be played in Cantonese. There is no such issue on iOS 17 and 16.

1. let utterance = AVSpeechUtterance(string: textView.text)
   let voice = AVSpeechSynthesisVoice(language: "zh-CN")
   utterance.voice = voice
2. In the phone settings, Siri is set to Cantonese.
Replies: 3 · Boosts: 1 · Views: 537 · Feb ’25
3D scanning and 3D model with texture mapping
I am new here and would appreciate help with coding, or an explanation of what to use in Swift, for an app that can capture LiDAR scans and RGB data from taken pictures, generate a 3D mesh, and create an .OBJ, .MTL, and .JPEG file set for further manipulation of the 3D model. I am able to create a 3D mesh and an .OBJ file from LiDAR scanning, but I cannot generate the .MTL and .JPEG for the texture of the 3D model.
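
A sketch of one possible direction, assuming the mesh is already available as a ModelIO MDLAsset whose material references a base-color texture file: ModelIO's .obj exporter writes the companion .mtl alongside the .obj, while producing the JPEG itself (projecting the RGB captures onto the mesh UVs) still has to be done separately:

import Foundation
import ModelIO

func exportOBJWithMaterial(asset: MDLAsset, to directory: URL) throws {
    // ModelIO writes model.obj plus model.mtl when the asset's materials reference
    // texture files; the referenced JPEG must already exist on disk.
    guard MDLAsset.canExportFileExtension("obj") else {
        throw NSError(domain: "OBJExport", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "OBJ export unavailable"])
    }
    let objURL = directory.appendingPathComponent("model.obj")
    try asset.export(to: objURL)
}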
Replies: 1 · Boosts: 0 · Views: 386 · Feb ’25
[Crash Report] PHAssetResourceManager.writeDataForAssetResource causes app crash on iOS 18
We are encountering a critical, intermittently occurring crash when accessing photo data using PHAssetResourceManager.writeDataForAssetResource on iOS 18. The problem does not arise on iOS 17 or earlier versions. We have been unable to identify a consistent reproduction path. Based on user feedback, the issue seems to involve Live Photo and RAW image files. Our investigation has revealed that the crash occurs in the +[PISchema identifier] method of the PhotoImaging framework. When called manually, this method causes a crash on iOS 18 but works without issues on iOS 17.

Reproduction Steps:
1. Fetch PHAsset.
2. Get PHAssetResource by [PHAssetResource assetResourcesForAsset:].
3. Call [PHAssetResourceManager writeDataForAssetResource:toFile:options:completionHandler:].

Crash Log:

Incident Identifier: CFD60092-FDB1-43B4-BA42-3F507F7B8B96
CrashReporter Key: 260b4780989083a54e0cb451930fe9a3bed64862
Hardware Model: iPhone13,4
AppStoreTools: 16C5031b
AppVariant: 1:iPhone13,4:18
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Date/Time: 2025-02-15 19:07:57.7054 +0800
Launch Time: 2025-02-15 19:07:55.4106 +0800
OS Version: iPhone OS 18.3.1 (22D72)
Release Type: User
Baseband Version: 5.20.03
Report Version: 104
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: mCloud_iPhone [11109]
Triggered by Thread: 11
Application Specific Information: abort() called
Thread 11 name: Dispatch queue: com.apple.NSXPCConnection.m-user.com.apple.photos.service
Thread 11 Crashed:
0 libsystem_kernel.dylib 0x1e850b2d4 __pthread_kill + 8
1 libsystem_pthread.dylib 0x221b4959c pthread_kill + 268
2 libsystem_c.dylib 0x19ec24b08 abort + 128
3 NeutrinoCore 0x1bdcdbdec -[NUAssertionPolicyAbort notifyAssertion:] + 68
4 NeutrinoCore 0x1bdcdbbf4 -[NUAssertionPolicyComposite notifyAssertion:] + 160
5 NeutrinoCore 0x1bdcdc098 -[NUAssertionPolicyUnique notifyAssertion:] + 176
6 NeutrinoCore 0x1bdcdb524 -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7 NeutrinoCore 0x1bdcdc4bc _NUAssertFailHandler + 176
8 NeutrinoCore 0x1bdc8ea98 -[NUIdentifier initWithNamespace:name:version:] + 2352
9 NeutrinoCore 0x1bdc8eba8 -[NUIdentifier initWithName:version:] + 84
10 NeutrinoCore 0x1bdc8ec10 -[NUIdentifier initWithName:] + 68
11 PhotoImaging 0x1bda54ce4 +[PISchema identifier] + 36
12 PhotoImaging 0x1bda550fc +[PISchema registeredPhotosSchemaIdentifier] + 32
13 PhotoImaging 0x1bd9d7128 +[PIPhotoEditHelper newComposition] + 28
14 PhotoImaging 0x1bd940798 +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15 PhotoImaging 0x1bd9412ec +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16 PhotoLibraryServices 0x1afabf75c -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1856
17 PhotoLibraryServices 0x1afabffe4 +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18 Photos 0x1af4ac360 __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19 Photos 0x1af4ac67c -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20 Photos 0x1af4ac4b0 -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21 Photos 0x1af4abb8c -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 404
22 Photos 0x1af4a911c -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23 Photos 0x1af2c1d10 -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24 Photos 0x1af2c11e8 -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 88
25 Photos 0x1af505184 -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26 Photos 0x1af5050a0 __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27 PhotoLibraryServicesCore 0x1b001be8c __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]_block_invoke.86 + 864
28 CoreFoundation 0x196dd8e34 __invoking___ + 148
29 CoreFoundation 0x196dd7e7c -[NSInvocation invoke] + 428
30 Foundation 0x195a64ae0 __NSXPCCONNECTION_IS_CALLING_OUT_TO_EXPORTED_OBJECT__ + 16
31 Foundation 0x195a63514 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 532
32 Foundation 0x195a6653c __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33 libxpc.dylib 0x221babb80 _xpc_connection_reply_callout + 116
34 libxpc.dylib 0x221b9e2d0 _xpc_connection_call_reply_async + 80
35 libdispatch.dylib 0x19eb6b028 _dispatch_client_callout3 + 20
36 libdispatch.dylib 0x19eb88b64 _dispatch_mach_msg_async_reply_invoke + 340
37 libdispatch.dylib 0x19eb7242c _dispatch_lane_serial_drain + 352
38 libdispatch.dylib 0x19eb73158 _dispatch_lane_invoke + 432
39 libdispatch.dylib 0x19eb7e38c _dispatch_root_queue_drain_deferred_wlh + 288
40 libdispatch.dylib 0x19eb7dbd8 _dispatch_workloop_worker_thread + 540
41 libsystem_pthread.dylib 0x221b44680 _pthread_wqthread + 288
42 libsystem_pthread.dylib 0x221b42474 start_wqthread + 8
Replies: 0 · Boosts: 1 · Views: 380 · Feb ’25