Reply to Why does CADisplayLink of an external UIScreen drift in time?
Thanks for your help. The bug report ID is FB19273833.

There are two very important updates in the bug report, which I found later, after posting on the Developer Forums:

1. My original post was incorrect: the drifting happens when using 60 or 30 FPS as the preferred frame rate of the CADisplayLink. When I changed the frame rate to 15 FPS, there was no drifting.
2. The drifting happens only when not using the SceneDelegate-based lifecycle. When I created a new SceneDelegate-based app, the CADisplayLink's callback was called without drifting even at 30 and 60 FPS.

Nevertheless, this potential bug led me to question the reliability of the dongle. The dongle seems to use the AirPlay protocol under the hood, so it likely receives an H.264 video stream from the iDevice, which it then decodes and sends to the external monitor via HDMI.

Because the dongle is an active component, I'd like to ask whether I can rely on iOS and the dongle to stream the video at a constant frame rate without drifting over time. That is, when the external screen runs at 60 FPS, do iOS and the dongle guarantee that the average frame duration will always be approximately 0.016667 seconds, rather than continuously increasing or decreasing? I don't mind frame drops, as I believe a frame drop should be detectable from the timestamp values in the CADisplayLink's callback (see the sketch below). I just want to know whether the frame rate is guaranteed to stay constant (driftless) and whether the dongle runs at the native frame rate of the external monitor. If the external monitor ran at 59.94 FPS instead of exactly 60 FPS, would the dongle really detect this and stream the video frames at the monitor's native frame rate, even though it is not a multiple of the iDevice's native frame rate?
Topic: Media Technologies SubTopic: General
Aug ’25
Reply to AVFoundation on visionOS?
Thank you, @Developer Tools Engineer. It seems that not even ARKit will let developers access any sensor data: https://developer.apple.com/videos/play/wwdc2023/10082?time=262

In the WWDC video you shared, it was mentioned that only a "composite .front camera" will be available via AVCaptureDevice.DiscoverySession. May I ask what is meant by a "composite camera"? Does it mean that visionOS will provide a stream of RGB-only frames captured by a virtual camera, which is created by internally combining data from several hardware cameras? Or does "composite" mean "multiple modalities" (i.e., AVFoundation providing access to RGB and depth output via a virtual composite camera)?

Is AVFoundation going to be the only framework on visionOS capable of providing access to (composite) camera data, as suggested by the video you shared?
Topic: Media Technologies SubTopic: Audio
Jun ’23