I second this question. How did the developers at Apple intend for audio and video to be transmitted? How can I transmit video if I'm using, for example, WebRTC in the main application? Many people work around this by serializing the CVPixelBuffer into raw data and passing it to the app... But why not allow an IOSurface to be passed directly? The mechanism exists, yet the bootstrap-port and XPC APIs are closed off on iOS.
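For anyone unfamiliar with the workaround mentioned above, here is a minimal sketch of what "serializing CVPixelBuffer into data" typically looks like: locking the buffer and copying each plane's bytes into a Data blob that can then be shipped over whatever IPC channel you have (App Group socket, etc.). The function name is illustrative, and it assumes a planar format like NV12; a real implementation would also have to transmit width, height, bytes-per-row, and pixel format so the receiver can reconstruct the buffer.

```swift
import CoreVideo
import Foundation

// Illustrative sketch: copy a CVPixelBuffer's plane bytes into Data.
// Assumes a planar pixel format (e.g. NV12). Metadata (dimensions,
// bytes per row, format) must be sent separately for reconstruction.
func serialize(pixelBuffer: CVPixelBuffer) -> Data {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    var data = Data()
    for plane in 0..<CVPixelBufferGetPlaneCount(pixelBuffer) {
        guard let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, plane) else { continue }
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane)
        // Copies height * bytesPerRow bytes per plane, including row padding.
        data.append(Data(bytes: base, count: height * bytesPerRow))
    }
    return data
}
```

This copy is exactly the overhead being complained about: the pixel data already lives in an IOSurface-backed buffer, and being able to hand that surface across the process boundary would avoid the memcpy entirely.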
Topic: Media Technologies
SubTopic: General