ReplayKit


Record or stream video from the screen and audio from the app and microphone using ReplayKit.

Posts under ReplayKit tag

28 Posts

Post | Replies | Boosts | Views | Activity

Having serious trouble with ReplayKit
Hi everyone, I'm totally new to this and am just having fun making an app for myself. I'm attempting to get a Broadcast Upload extension working, but whatever I do I can't get ReplayKit to work. I keep getting this error in Xcode:

Provisioning profile "Project v6 Broadcast Upload Development" doesn't include the com.apple.developer.replaykit.broadcast entitlement.

What I've tried:
- Created separate App IDs for each target (explicit App IDs, not wildcard)
- Enabled the App Groups capability on all three App IDs in the Apple Developer portal
- Selected the correct App Group for all App IDs
- Added the App Groups capability in Xcode for all targets and all build configurations
- Created an entitlements file with com.apple.developer.replaykit.broadcast: true for the Broadcast Upload extension
- Recreated provisioning profiles multiple times
- Used manual code signing with the correct certificates

I'm completely lost. I reached out directly to Apple Developer Support and they just told me to come here... Any help would be greatly appreciated.
0
0
132
1w
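
For reference, a Broadcast Upload extension .entitlements file carrying the key named in the error above might look like the following sketch; the App Group identifier here is a hypothetical placeholder:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <!-- The key named in the provisioning error above -->
        <key>com.apple.developer.replaykit.broadcast</key>
        <true/>
        <!-- Hypothetical App Group shared with the host app -->
        <key>com.apple.security.application-groups</key>
        <array>
            <string>group.com.example.projectv6</string>
        </array>
    </dict>
    </plist>

Note that this file only declares the entitlement on the binary; the provisioning profile generated for the extension's App ID must also include it, which is exactly what the error message is reporting.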
Screen Recorder plus Front Camera
I want to build an app for iOS using React Native, preferably Expo. The app will be for recording user experiences with technology: the SLUDGE that they face while navigating through it. I want basic login and signup. The main feature would be two recording modes: first, record the screen and the front camera simultaneously; second, record the back camera and the front camera simultaneously. I can then patch the two outputs (the screen recording and the front-camera clip) together later in post-processing. I want to know if this is possible, as I was told that React Native and Expo do not support it yet. If not, is there any library or another approach to make this app come alive?
0
0
38
3w
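
On the native side, ReplayKit can capture the screen while exposing a front-camera preview in the same session; below is a minimal Swift sketch under that assumption (the class name is hypothetical, and React Native/Expo would need a custom native module to reach this API). Simultaneous back-plus-front camera capture is a separate question for AVFoundation's multi-camera APIs, which ReplayKit itself does not address.

    import ReplayKit
    import UIKit

    final class ScreenPlusCameraRecorder {
        private let recorder = RPScreenRecorder.shared()

        // Starts capturing screen frames; the front camera is exposed as a
        // preview view the host app can overlay (and record in post).
        // Requires NSCameraUsageDescription / NSMicrophoneUsageDescription
        // in the app's Info.plist.
        func start(attachPreviewTo container: UIView) {
            recorder.isCameraEnabled = true          // front camera preview
            recorder.isMicrophoneEnabled = true
            recorder.startCapture(handler: { sampleBuffer, bufferType, error in
                // Screen (and mic) sample buffers arrive here; feed them to
                // an AVAssetWriter or an encoder of your choice.
            }, completionHandler: { error in
                guard error == nil else { return }
                // The camera preview only exists once capture has started.
                DispatchQueue.main.async {
                    if let preview = self.recorder.cameraPreviewView {
                        container.addSubview(preview)
                    }
                }
            })
        }
    }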
Remote control
Hi everyone, I'm working on a concept for an iOS app that would allow a user to remotely control an enterprise iOS device, similar to how AnyDesk or TeamViewer work on desktop. I understand that apps like TeamViewer for iOS offer screen sharing and some level of control, but not full control. Before I invest further in development, I'd like to clarify a few points:
- Is there any official Apple-supported way (public or private API) to allow remote control of an iOS device?
- Has Apple ever approved apps that allow true remote control of iOS (not just screen sharing)?
- If full control is not allowed, what are the permitted alternatives (e.g. screen broadcast via ReplayKit, remote assistance mode, etc.)?
- Would such an app be considered for enterprise distribution only (via MDM), or is there a potential App Store path?
Any insight or experience from developers who've tried this would be very appreciated. Thanks!
0
0
118
Jul ’25
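
Regarding the permitted screen-broadcast alternative raised in the questions above: the sanctioned entry point is RPSystemBroadcastPickerView, which always requires the user to start the broadcast themselves. A minimal Swift sketch (the extension bundle identifier is a hypothetical placeholder):

    import ReplayKit
    import UIKit

    // Presents the system broadcast picker, the user-consent gate for
    // streaming the screen to a Broadcast Upload extension.
    func makeBroadcastPicker() -> RPSystemBroadcastPickerView {
        let picker = RPSystemBroadcastPickerView(
            frame: CGRect(x: 0, y: 0, width: 60, height: 60))
        // Hypothetical bundle ID of your Broadcast Upload extension.
        picker.preferredExtension = "com.example.myapp.broadcast"
        picker.showsMicrophoneButton = true
        return picker
    }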
Broadcast Extension sound is not received after another app uses the mic on iOS
Hello all, I have an application that uses a ReplayKit Broadcast Upload extension to record the entire screen plus microphone sound to a file on iOS. Up until some months ago, everything was smooth. Currently I am facing the following issue: if another app uses the microphone, I lose the sound and it never comes back.

To debug this, I added a log to processSampleBuffer that prints each time it receives a buffer of type .audioMic. I start the recording and everything works as expected. Later on, I open an app that uses the microphone, and from then on I get no logs for .audioMic. I stop the recording in that app, but the sound never comes back. As a result, my video file has no sound at all.

In the same context I also noticed that even with the Photos app broadcast extension, if you start recording a video and then use the keyboard's speech-to-text (STT) feature, the sound gets spliced: while STT is on there is no sound, and whatever comes after STT stops is joined directly to the sound from before STT started, so I guess this is something general. In the same research, I also saw that the Google Meet app does not allow the microphone to be used by another app while you are in a meeting (even STT is grayed out).

I would like to know my options here:
- What can I do to get a valid video file with sound?
- How can I prevent other apps from using the microphone while my app is recording? Is there an entitlement for that? How does Google Meet do it?

P.S. I have added an observer for the session's interruption notifications; the .began case runs, but the .ended case does not, so I cannot set the AVAudioSession active again.
0
0
157
Jul ’25
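
For reference, a standard AVAudioSession interruption observer looks roughly like the sketch below; as the post notes, the .ended notification may simply never arrive in this scenario, which is the crux of the problem:

    import AVFoundation

    // Sketch of the usual interruption-handling pattern; the .ended branch
    // is shown for completeness even though the post reports it never runs.
    final class InterruptionObserver: NSObject {
        override init() {
            super.init()
            NotificationCenter.default.addObserver(
                self,
                selector: #selector(handleInterruption(_:)),
                name: AVAudioSession.interruptionNotification,
                object: AVAudioSession.sharedInstance())
        }

        @objc private func handleInterruption(_ note: Notification) {
            guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
            switch type {
            case .began:
                // Another client took the mic; .audioMic buffers stop arriving.
                break
            case .ended:
                // Per the post, this may never fire in the broadcast context.
                try? AVAudioSession.sharedInstance().setActive(true)
            @unknown default:
                break
            }
        }
    }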
Broadcast Upload Extension - Screen Sharing Fails to Start After Countdown (No Errors Logged)
Hello everyone, I'm working on implementing a screen sharing feature using RPSystemBroadcastPickerView and a Broadcast Upload Extension to share the entire app screen in an iOS application. The Broadcast Upload Extension is set up following Apple's ReplayKit guidelines. However, I'm encountering an issue during the broadcast startup sequence:

❗ Problem Description
- The Screen Broadcast UI appears as expected
- I tap "Start Broadcast"
- The countdown (3 → 2 → 1) completes
- Then it immediately reverts to the "Start Broadcast" screen, and screen sharing does not begin
- No error messages are displayed
- None of the extension lifecycle methods (broadcastStarted(withSetupInfo:), processSampleBuffer, etc.) are called
- There are no logs or crash reports, neither in the main app nor in the extension

✅ What Has Been Verified
- Info.plist of the Broadcast Upload Extension includes: NSExtensionPointIdentifier = com.apple.broadcast-services-upload, NSExtensionPrincipalClass set correctly, RPBroadcastProcessMode = RPBroadcastProcessModeSampleBuffer
- preferredExtension is set properly to the extension's bundle identifier
- The extension is listed in the main app's build settings under "Frameworks, Libraries, and Embedded Content"

⚠️ Additional Concern
We noticed that in Xcode (latest version), the Broadcast Upload Extension is listed under "Embedded Frameworks" with the setting "Embed Without Signing", and there is no option to change it to "Embed & Sign". We're wondering if this could be the reason the extension fails to launch correctly at runtime, despite being detected by the broadcast picker.

❓ Questions
- Has anyone faced similar issues where the broadcast never starts despite correct setup?
- Could "Embed Without Signing" be causing the system to silently cancel or ignore the extension at runtime?
- Are there any provisioning profile or entitlement requirements specific to Broadcast Upload Extensions that might trigger this behavior silently?

Any insights, suggestions, or workarounds would be greatly appreciated. Thank you in advance!
0
0
137
Jul ’25
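
For context, the lifecycle methods named in the post live in the extension's RPBroadcastSampleHandler subclass (the class pointed to by NSExtensionPrincipalClass); a minimal sketch of what the system would call if the broadcast actually started:

    import ReplayKit
    import CoreMedia

    // Minimal principal class for a Broadcast Upload extension; if the
    // broadcast starts, broadcastStarted(withSetupInfo:) fires first.
    class SampleHandler: RPBroadcastSampleHandler {
        override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
            // Called once the countdown finishes and the system launches
            // the extension process.
        }

        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                          with sampleBufferType: RPSampleBufferType) {
            switch sampleBufferType {
            case .video: break     // screen frames
            case .audioApp: break  // app audio
            case .audioMic: break  // microphone audio
            @unknown default: break
            }
        }
    }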
Broadcast-Upload Entitlement
Hi, I am developing an iOS app that includes a ReplayKit Broadcast Upload Extension, which requires the com.apple.developer.broadcast-upload entitlement. The app is intended for internal development and testing on my own devices and is not yet distributed on the App Store. Even after setting com.apple.developer.broadcast-upload = true in my .entitlements file and linking it in Build Settings > Code Signing Entitlements, my downloaded provisioning profile still does not contain the broadcast-upload entitlement. May I know if I need explicit approval from Apple to add the broadcast-upload entitlement, even if it's just for testing on my own devices? Thanks.
2
0
177
Jul ’25
Assistance with Capturing WebView Snapshot in Broadcast Upload Extension Without Adding to View Hierarchy
Hello, I am currently developing a game streaming application using ReplayKit and a Broadcast Upload Extension. I would like to ask for your assistance regarding capturing snapshots of a WKWebView in the upload extension without adding it to a visible view hierarchy.

From my understanding, calling takeSnapshot(with:) on a WKWebView that is not added to the view hierarchy generally works for simple web pages. However, for more complex web content, such as animations or WebGL, the snapshot returns a blank or static image. I believe this is because rendering such content requires access to the GPU, which is not fully available when the web view is off-screen.

That said, I've observed that certain apps are able to capture live animated web content inside their broadcast upload extensions, even when the main app is terminated. This suggests that the snapshot is not being generated by the main app or by a remote server, especially since the network activity confirms the content is served locally (via localhost or a local IP).

Given this, I believe there must be a way to achieve GPU-accelerated rendering for WKWebView directly within the upload extension context, without attaching it to the app's UI. I would greatly appreciate any guidance, APIs, or recommended techniques that could help me achieve this behavior correctly and within system limitations.

Thank you in advance for your support. I look forward to your advice. Warm regards,
0
0
53
Jun ’25
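
For context, the off-screen snapshot path the post describes looks roughly like the sketch below (the URL and dimensions are placeholders); as the post says, this tends to work for static pages and return blank frames for GPU-backed content:

    import UIKit
    import WebKit

    // Off-screen web view: created but never added to a window's view hierarchy.
    let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 1280, height: 720))

    func snapshotOffscreen(url: URL, completion: @escaping (UIImage?) -> Void) {
        webView.load(URLRequest(url: url))
        // In real code, wait for WKNavigationDelegate's didFinish callback
        // before snapshotting, instead of capturing immediately.
        let config = WKSnapshotConfiguration()
        config.rect = webView.bounds
        webView.takeSnapshot(with: config) { image, error in
            // Simple pages yield a bitmap; GPU-backed content (WebGL,
            // animations) may come back blank when rendered off-screen.
            completion(error == nil ? image : nil)
        }
    }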
How to prevent the main app from being terminated by the system during long-term system-level recording
After logging in to the main app, I turn on screen recording, then switch to another app's interface to perform operations. After ten-odd minutes, when returning to the main app, I find that it has been force-quit by the system, and subsequent operations cannot be carried out.
1
0
59
May ’25
App crashed in _CFRelease.cold.1
In my app, I implemented screen recording functionality, but there was an unexpected crash:

    0  CoreFoundation           _CFRelease.cold.1 + 16
    1  CoreFoundation           ___CFTypeCollectionRelease
    2  ReplayKit                ___56-[RPScreenRecorder captureHandlerWithSample:timingData:]_block_invoke + 148
    3  libdispatch.dylib        __dispatch_call_block_and_release + 32
    4  libdispatch.dylib        __dispatch_client_callout + 16
    5  libdispatch.dylib        __dispatch_lane_serial_drain + 740
    6  libdispatch.dylib        __dispatch_lane_invoke + 388
    7  libdispatch.dylib        __dispatch_root_queue_drain_deferred_wlh + 292
    8  libdispatch.dylib        __dispatch_workloop_worker_thread + 540
    9  libsystem_pthread.dylib  __pthread_wqthread + 292
4
0
98
Jul ’25
Screen recording audio and video out of sync
I use startCaptureWithHandler: to record the screen and AVAssetWriter's appendSampleBuffer: to save the audio and video, but when the saved file is played back, the audio and video are out of sync. I don't know if it's an AVAssetWriterInput setup problem; here is my code:

    NSDictionary *audioCompressionSettings = @{
        AVEncoderBitRatePerChannelKey : @(64000),
        AVFormatIDKey : @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey : @(2),
        AVSampleRateKey : @(44100)
    };
    AVAssetWriterInput *audioAssetWriterInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:audioCompressionSettings];
    audioAssetWriterInput.expectsMediaDataInRealTime = YES;
    [_assetWriter addInput:audioAssetWriterInput];

    NSDictionary *videoCompressSetting = @{
        AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
        AVVideoMaxKeyFrameIntervalKey : @(30),
        AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
    };
    NSDictionary *codecSetting = @{
        AVVideoCodecKey : AVVideoCodecTypeH264,
        AVVideoScalingModeKey : AVVideoScalingModeResize,
        AVVideoWidthKey : @(screenWidth * 2),
        AVVideoHeightKey : @(screenHeight * 2),
        AVVideoCompressionPropertiesKey : videoCompressSetting
    };
    AVAssetWriterInput *videoAssetWriterInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:codecSetting];
    videoAssetWriterInput.expectsMediaDataInRealTime = YES;
    [_assetWriter addInput:videoAssetWriterInput];
1
0
60
Apr ’25
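
One frequent cause of drift in this kind of setup, not visible in the input configuration above, is anchoring the writer session to a time unrelated to the first buffer's presentation timestamp. A hedged Swift sketch of the usual pattern, in case it helps:

    import AVFoundation
    import CoreMedia

    // Typical pattern: anchor the AVAssetWriter session to the PTS of the
    // first captured sample buffer, then append both tracks on that timeline.
    func appendFirstBuffer(_ sampleBuffer: CMSampleBuffer,
                           writer: AVAssetWriter,
                           sessionStarted: inout Bool) {
        if !sessionStarted {
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            writer.startWriting()
            writer.startSession(atSourceTime: pts)   // not .zero
            sessionStarted = true
        }
        // ...then route the buffer to the matching AVAssetWriterInput.
    }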
ReplayKit stop screen recording failed; recording status is false
I want to record the screen, and then call stopCaptureWithHandler:(nullable void (^)(NSError *_Nullable error))handler to stop recording and save the file. Before calling it, I check [RPScreenRecorder sharedRecorder].recording, and the value is false. It's weird: the screen is currently being recorded! I wonder if the value of [RPScreenRecorder sharedRecorder].recording affects the method stopCaptureWithHandler:. Here is my code:

    - (void)startCaptureScreen {
        [[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
            // code
        } completionHandler:^(NSError * _Nullable error) {
            // code
        }];
    }

    - (void)stopRecordingHandler {
        if ([[RPScreenRecorder sharedRecorder] isRecording]) {
            // deal with error; sometimes isRecording is false
        } else {
            [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
            }];
        }
    }
0
0
52
Apr ’25
ReplayKit: screen recording method returns nil sampleBuffer
I want to record the screen in my app using startCaptureWithHandler:completionHandler:. The sampleBuffer is supposed to exist, but it has become nil. Not only that, there's another problem: when I want to stop recording and save the video, I check [RPScreenRecorder sharedRecorder].recording first, and it is sometimes false. These problems are unusual and unexpected on iOS 18.3.2 (iPhone XS Max). Here is my code:

    - (void)startCaptureScreen {
        NSLog(@"AKA++ startCaptureScreen");
        if ([[RPScreenRecorder sharedRecorder] isRecording]) {
            return;
        }
        // Screen recording
        [[RPScreenRecorder sharedRecorder] setMicrophoneEnabled:YES];
        NSLog(@"AKA++ MicrophoneEnabled AAAA startCaptureScreen");
        [[RPScreenRecorder sharedRecorder] setCameraEnabled:YES];
        [[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
            if (self.assetWriter == nil) {
                if (self.AVAssetWriterStatus == 0) {
                    [self setupAssetWriterAndStartWith:sampleBuffer];
                }
            }
            if (self.AVAssetWriterStatus != 2) {
                return;
            }
            if (error) {
                // deal with error
                return;
            }
            if (self.assetWriter.status != AVAssetWriterStatusWriting) {
                [self assetWriterAppendSampleBufferFailWith:bufferType];
                return;
            }
            if (bufferType == RPSampleBufferTypeVideo) {
                if (self.assetWriter.status == 0 || self.assetWriter.status > 2) {
                } else if (self.videoAssetWriterInput.readyForMoreMediaData == YES) {
                    BOOL success = [self.videoAssetWriterInput appendSampleBuffer:sampleBuffer];
                }
            }
            if (bufferType == RPSampleBufferTypeAudioMic) {
                if (self.assetWriter.status == 0 || self.assetWriter.status > 2) {
                } else if (self.audioAssetWriterInput.readyForMoreMediaData == YES) {
                    BOOL success = [self.audioAssetWriterInput appendSampleBuffer:sampleBuffer];
                }
            }
        } completionHandler:^(NSError * _Nullable error) {
            // deal with error
        }];
    }

And then, when I want to save it:

    - (void)stopRecording {
        if ([[RPScreenRecorder sharedRecorder] isRecording]) {
            // The problem is sporadic: the recording action failed, which confuses me
        }
        [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
            if (!error) {
                // post message
            }
        }];
    }
0
0
42
Apr ’25
ReplayKit start and stop capture breaks and gives me an error when switching from Immersive to Mixed and back
Hi, I'm developing a virtual camera system using ReplayKit to capture scene video by directly accessing raw video buffers. The capture mechanism works flawlessly when repeatedly starting and stopping video capture within a continuous immersive environment. However, a critical issue arises when the immersive space is interrupted:

Step 1: Enter the immersive environment and start and stop video capture (multiple times, with no issues)
Step 2: Press the crown button to exit the immersive environment
Step 3: Return to the immersive space
Step 4: Attempt to start video capture

At this point, the startCapture method throws an unexpected error, disrupting the video capture workflow. This is the Xcode error that I see:

[ERROR] -[RPScreenRecorder startCaptureWithHandler:completionHandler:]_block_invoke_2:500 failed to start due to error: Error Domain=com.apple.ReplayKit.RPRecordingErrorDomain Code=-5803 "Recording failed to start" UserInfo={NSLocalizedDescription=Recording failed to start}

I have tried every way of calling stopCapture I can think of, including onDisappear and other methods, and nothing seems to solve this.
3
0
257
Mar ’25
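
One pattern worth comparing against (it may well overlap what was already tried) is tying capture teardown to the SwiftUI scene phase, so that a crown-press exit always stops capture before the next start attempt. A hypothetical sketch:

    import SwiftUI
    import ReplayKit

    // Hypothetical modifier: stop ReplayKit capture whenever the scene
    // leaves the active phase (e.g. the user presses the crown).
    struct CaptureLifecycleModifier: ViewModifier {
        @Environment(\.scenePhase) private var scenePhase

        func body(content: Content) -> some View {
            content.onChange(of: scenePhase) { _, newPhase in
                if newPhase != .active, RPScreenRecorder.shared().isRecording {
                    RPScreenRecorder.shared().stopCapture { error in
                        // Log and clear local state so the next start is clean.
                    }
                }
            }
        }
    }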
Unable to find broadcast extension
Issue Summary: In our Flutter application, we use Tencent's TRTC API for voice and video communication. While the broadcast functionality operates correctly on Android, it fails to respond on iOS devices. Attempting to initiate a broadcast results in no action, and long-pressing the broadcast button does not reveal the broadcast extension.

Steps to Reproduce:
1. Add a Broadcast Upload Extension: in Xcode, navigate to File > New > Target, select Broadcast Upload Extension, and add it to the project.
2. Build the project: attempting to build produces the error "Cycle inside Runner; building could produce unreliable results."
3. Resolve the build cycle error: go to the project's Build Phases, locate the Embed App Extensions phase, move it just below Copy Bundle Resources, and ensure the "Copy only when installing" option is selected. Rebuild the project; the cycle error is resolved.
4. Test broadcast functionality: install the app on an iOS device and tap the broadcast button; observe no response. Long-press the screen-recording button in Control Center (the pull-down menu from the top right); the broadcast extension is not listed.
5. Isolate the issue: create a new Flutter project and repeat the above steps to add the broadcast upload extension. The issue persists: broadcast functionality remains unresponsive on iOS.
1
0
498
Feb ’25
Screen sharing application - URGENT question
There are different kinds of screen-sharing applications, all using different APIs: the API used by AnyDesk, for example, or TeamViewer, which doesn't require light signals. I wonder if this is more appropriate for a corporate application, i.e. MDM. Could a screen-sharing application be created and validated by Apple that displays no light signals, and that can access the user's screen whenever the person wants after an initial acceptance? In other words, the user accepts to share his screen once, but won't be notified to accept the next time. Or is this impossible on iOS? I'd be honored to have some answers.
3
0
469
Feb ’25
How to Implement Screen Mirroring in iOS for Google TV?
I am developing an iOS application that supports screen mirroring to Google TV (or Chromecast with Google TV). My goal is to mirror the iPhone/iPad screen in real time to a Google TV device.

What I Have Tried So Far
I have explored multiple approaches but haven't found a direct way to achieve low-latency screen mirroring. Here are some of my findings:

Google Cast SDK:
- The Google Cast SDK is primarily designed for casting media (videos, images, audio) rather than real-time mirroring.
- It supports custom receiver applications, but there are no direct APIs for full screen mirroring.
- Casting a recorded video is possible, but it introduces latency and is not real-time.

ReplayKit for Screen Capture:
- RPScreenRecorder.shared().startCapture(handler: ...) allows capturing the iPhone screen as a video stream.
- However, sending this stream to Google TV in real time is a challenge. I could potentially encode the video as HLS and stream it, but the delay is significant.

RTSP/UDP Streaming:
- Some third-party libraries support RTSP/UDP streaming for real-time screen sharing.
- Google TV does not natively support RTSP, making this approach difficult.

My Questions:
- Is it possible to achieve real-time screen mirroring on Google TV using the Google Cast SDK?
- Does Google TV support WebRTC or any other low-latency streaming protocol usable from iOS?
- Are there any alternative approaches to mirror an iOS screen to Google TV with minimal latency?

I would appreciate any guidance, code examples, or references to relevant documentation.
0
1
417
Feb ’25
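
For reference, the ReplayKit capture half of the pipeline described above might look like the following sketch; the frame hand-off is a hypothetical placeholder, since the encoder and transport to Google TV are exactly the open questions:

    import ReplayKit
    import CoreMedia

    // Captures screen frames in-process; each CMSampleBuffer would then be
    // handed to an encoder/transport of your choice (the open question above).
    func startMirrorCapture(onFrame: @escaping (CMSampleBuffer) -> Void) {
        let recorder = RPScreenRecorder.shared()
        guard recorder.isAvailable else { return }
        recorder.startCapture(handler: { sampleBuffer, bufferType, error in
            guard error == nil, bufferType == .video else { return }
            onFrame(sampleBuffer)   // e.g. feed VideoToolbox, then your transport
        }, completionHandler: { error in
            if let error { print("capture failed to start: \(error)") }
        })
    }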