Hi everyone,
I’m building a React Native iOS app that integrates Wazo (native WebRTC) and Jitsi (WebRTC inside a WebView).
Use case:
Wazo is used to maintain a background call session (mainly signaling + audio keep-alive).
Jitsi is used in the foreground for video calls.
Problem:
When Jitsi starts, it takes control of the microphone and camera.
The Wazo call disconnects after ~5 minutes (likely due to a media/audio session conflict).
Even when Wazo’s audio/video is muted or its tracks are disabled, the session still drops.
My questions:
Is it officially supported or recommended to run two WebRTC stacks (Wazo + Jitsi) simultaneously on iOS?
Can Wazo stay connected without active audio/video tracks while Jitsi uses mic/camera?
Is there a way to release Wazo’s media streams temporarily (but keep signaling alive) while Jitsi is loading or active? (See the sketch after this list.)
Are there any AVAudioSession / background mode limitations on iOS that make this impossible by design?
If this is not supported, what is the recommended architecture (single WebRTC pipeline, switching media ownership, etc.)?
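To make the last two questions concrete, here is the kind of approach I have been experimenting with. This is only a sketch built on public WebRTC/AVFoundation APIs, not anything from the Wazo or Jitsi docs, and releaseWazoMedia / configureSharedAudioSession are my own names:

```swift
import AVFoundation
import WebRTC

// Sketch: detach the local tracks from the Wazo peer connection so the
// mic/camera are freed for Jitsi, while the negotiated connection
// (ICE/DTLS) and the signaling layer stay alive.
func releaseWazoMedia(from peerConnection: RTCPeerConnection) {
    for sender in peerConnection.senders {
        sender.track = nil  // stops capturing/sending; the m-line stays negotiated
    }
}

// Assumption: opting into .mixWithOthers so Jitsi activating its audio does
// not deactivate Wazo's session. With CallKit, session activation is normally
// driven by the provider delegate, which may override this.
func configureSharedAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.mixWithOthers, .allowBluetooth])
    try session.setActive(true)
}
```

My unverified guess is that the ~5-minute drop is a media-inactivity or keep-alive timeout, so I don’t know whether keeping the connection negotiated with no tracks attached is enough.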
Environment:
iOS (React Native)
Wazo SDK (native WebRTC)
Jitsi Meet (WebView)
CallKit + PushKit enabled
Any guidance, documentation, or real-world experience would be greatly appreciated.
Thanks in advance 🙏
Hello,
I am implementing video calling on iOS and need to support Picture in Picture (PiP) behavior similar to FaceTime or WhatsApp.
What works
Audio continues correctly in background
CallKit UI works as expected
Video works correctly while the app is in the foreground
What I’m trying to achieve
When the user presses the Home button or switches apps, I want to show a system Picture in Picture window (floating video outside the app).
Current setup
Video is rendered via WebRTC
The video is displayed inside a WKWebView (HTML / JavaScript)
In-app PiP (floating video inside the app) works only while the app is foregrounded
When the app backgrounds, the video disappears (only audio remains)
Questions
1. Does iOS support system Picture in Picture for:
WebRTC video
WKWebView / HTML video
2. Is AVPictureInPictureController limited to:
AVPlayerLayer
AVSampleBufferDisplayLayer
3. If PiP requires native rendering:
Is it mandatory to render WebRTC frames natively using AVSampleBufferDisplayLayer?
Is PiP explicitly unsupported for WebView / HTML video?
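On questions 2 and 3, my current understanding from Apple’s “Adopting Picture in Picture for video calls” guidance is that, since iOS 15, AVPictureInPictureController can also be driven by an AVSampleBufferDisplayLayer through AVPictureInPictureController.ContentSource. A minimal sketch of the wiring (CallPiPController is my own name; installing the layer in a view and feeding it frames are omitted):

```swift
import AVKit
import CoreMedia

// Minimal sketch (iOS 15+) of the AVSampleBufferDisplayLayer route. Assumes
// the layer is installed in a visible view and frames are enqueued on it.
final class CallPiPController: NSObject, AVPictureInPictureSampleBufferPlaybackDelegate {
    let displayLayer = AVSampleBufferDisplayLayer()
    private var pipController: AVPictureInPictureController?

    func setUpPiP() {
        guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
        let source = AVPictureInPictureController.ContentSource(
            sampleBufferDisplayLayer: displayLayer,
            playbackDelegate: self)
        let controller = AVPictureInPictureController(contentSource: source)
        // Lets the system start PiP when the user backgrounds the app mid-call.
        controller.canStartPictureInPictureAutomaticallyFromInline = true
        pipController = controller
    }

    // MARK: AVPictureInPictureSampleBufferPlaybackDelegate
    // Live-call semantics: infinite time range, never paused, no skipping.
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    setPlaying playing: Bool) {}

    func pictureInPictureControllerTimeRangeForPlayback(
        _ controller: AVPictureInPictureController) -> CMTimeRange {
        CMTimeRange(start: .negativeInfinity, end: .positiveInfinity)
    }

    func pictureInPictureControllerIsPlaybackPaused(
        _ controller: AVPictureInPictureController) -> Bool { false }

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {}

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```

As far as I can tell, this route requires native rendering; I have not found a public API that promotes a WKWebView’s getUserMedia-backed video element into the system PiP window.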
📌 Clarification
Apps like FaceTime and WhatsApp are able to show PiP outside the app.
I want to understand whether this behavior is achievable only with native video pipelines, or if WebView-based video is fundamentally restricted by iOS.
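If the native pipeline is the only supported route, my understanding is that the missing piece is wrapping each decoded frame’s CVPixelBuffer in a CMSampleBuffer and enqueueing it on the AVSampleBufferDisplayLayer. A minimal sketch, assuming frames arrive as CVPixelBuffers (e.g. from an RTCVideoFrame backed by an RTCCVPixelBuffer); makeSampleBuffer is my own helper name:

```swift
import CoreMedia

// Sketch: wrap a decoded frame (CVPixelBuffer) in a CMSampleBuffer with
// "display now" timing, suitable for AVSampleBufferDisplayLayer.enqueue(_:).
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer) -> CMSampleBuffer? {
    var format: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescriptionOut: &format)
    guard let formatDescription = format else { return nil }

    // Live video: stamp with the host clock's "now"; no duration or decode
    // timestamp is needed.
    var timing = CMSampleTimingInfo(
        duration: .invalid,
        presentationTimeStamp: CMClockGetTime(CMClockGetHostTimeClock()),
        decodeTimeStamp: .invalid)

    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescription: formatDescription,
        sampleTiming: &timing,
        sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```

If that is correct, WebView-based video would be restricted simply because the frames never surface to native code in a form the system PiP window can consume.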
Any official clarification or documentation reference would be appreciated.
Thank you.