I want to record the screen in my app. In the method startCaptureWithHandler:completionHandler:, the sampleBuffer is supposed to exist, but it comes back nil. This unusual problem appears on iOS 18.3.2 on an iPhone XS Max.
I want to record the screen in my app. In the method startCaptureWithHandler:completionHandler:, the sampleBuffer is supposed to exist, but it comes back nil. Not only that, there is another problem: when I want to stop recording and save the video, I first check [RPScreenRecorder sharedRecorder].recording, and it is sometimes false. These problems are unusual and unexpected, and happen on iOS 18.3.2 on an iPhone XS Max. Here is my code:
- (void)startCaptureScreen {
    NSLog(@"AKA++ startCaptureScreen");
    if ([[RPScreenRecorder sharedRecorder] isRecording]) {
        return;
    }
    // Screen recording
    [[RPScreenRecorder sharedRecorder] setMicrophoneEnabled:YES];
    NSLog(@"AKA++ MicrophoneEnabled AAAA startCaptureScreen");
    [[RPScreenRecorder sharedRecorder] setCameraEnabled:YES];
    [[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
        if (self.assetWriter == nil) {
            if (self.AVAssetWriterStatus == 0) {
                [self setupAssetWriterAndStartWith:sampleBuffer];
            }
        }
        if (self.AVAssetWriterStatus != 2) {
            return;
        }
        if (error) {
            // deal with error
            return;
        }
        if (self.assetWriter.status != AVAssetWriterStatusWriting) {
            [self assetWriterAppendSampleBufferFailWith:bufferType];
            return;
        }
        if (bufferType == RPSampleBufferTypeVideo) {
            if (self.assetWriter.status == 0 || self.assetWriter.status > 2) {
                // writer is in an unusable state; drop the buffer
            } else if (self.videoAssetWriterInput.readyForMoreMediaData == YES) {
                BOOL success = [self.videoAssetWriterInput appendSampleBuffer:sampleBuffer];
                if (!success) {
                    // deal with append failure
                }
            }
        }
        if (bufferType == RPSampleBufferTypeAudioMic) {
            if (self.assetWriter.status == 0 || self.assetWriter.status > 2) {
                // writer is in an unusable state; drop the buffer
            } else if (self.audioAssetWriterInput.readyForMoreMediaData == YES) {
                BOOL success = [self.audioAssetWriterInput appendSampleBuffer:sampleBuffer];
                if (!success) {
                    // deal with append failure
                }
            }
        }
    } completionHandler:^(NSError * _Nullable error) {
        // deal with error
    }];
}
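Inside the capture handler I am also considering an early guard before touching the asset writer. This is just a sketch; the idea that the buffer should be checked for NULL and with CMSampleBufferDataIsReady is my own defensive assumption, placed at the top of the handler block:

// Sketch: bail out early if the handler delivered an error or an unusable buffer.
// The NULL / CMSampleBufferDataIsReady checks are my own defensive assumption.
if (error != nil || sampleBuffer == NULL || !CMSampleBufferDataIsReady(sampleBuffer)) {
    return;
}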
And then, when I want to save it:
- (void)stopRecording {
    if ([[RPScreenRecorder sharedRecorder] isRecording]) {
        // The problem is sporadic: recording sometimes fails here, which confuses me
    }
    [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
        if (!error) {
            // post message
        }
    }];
}
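For completeness, this is roughly how I intend to finish the AVAssetWriter once the capture has stopped. Finishing the inputs inside the stop handler is my assumption, not something I have confirmed fixes the issue, and the method name below is just for illustration:

- (void)stopRecordingAndFinishWriter {
    __weak typeof(self) weakSelf = self;
    [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
        if (error) {
            // deal with error
            return;
        }
        // Sketch: only finish the writer if it is actually writing.
        if (weakSelf.assetWriter.status == AVAssetWriterStatusWriting) {
            [weakSelf.videoAssetWriterInput markAsFinished];
            [weakSelf.audioAssetWriterInput markAsFinished];
            [weakSelf.assetWriter finishWritingWithCompletionHandler:^{
                // post message / save the file
            }];
        }
    }];
}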
I want to record the screen, and then call stopCaptureWithHandler:(nullable void (^)(NSError *_Nullable error))handler to stop recording and save the file. Before calling it, I check [RPScreenRecorder sharedRecorder].recording, and the value is false. It's weird, because the screen is currently being recorded!
I wonder if the value of [RPScreenRecorder sharedRecorder].recording affects the method stopCaptureWithHandler:.
- (void)startCaptureScreen {
    [[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
        // code
    } completionHandler:^(NSError * _Nullable error) {
        // code
    }];
}

- (void)stopRecordingHandler {
    if ([[RPScreenRecorder sharedRecorder] isRecording]) {
        // deal with error; sometimes isRecording is false here
    } else {
        [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
        }];
    }
}
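The variant I am testing now calls stopCaptureWithHandler: unconditionally and only logs isRecording instead of gating the stop call on it; whether that is the intended usage is an assumption on my part:

- (void)stopRecordingHandler {
    // Sketch: log isRecording for debugging, but do not gate the stop call on it;
    // rely on the error delivered to the handler instead.
    NSLog(@"AKA++ isRecording before stop: %d", [[RPScreenRecorder sharedRecorder] isRecording]);
    [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
        if (error) {
            NSLog(@"AKA++ stopCapture error: %@", error);
        }
    }];
}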
I use startCaptureWithHandler: to record the screen and AVAssetWriter's appendSampleBuffer: to save the audio and video, but when the saved file is played back, the audio and video are out of sync. I don't know if it's an AVAssetWriterInput setup problem. Here is my code:
NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
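For reference, setupAssetWriterAndStartWith: (called from the capture handler above) roughly does the following. Starting the session at the first buffer's presentation timestamp is my assumption about how to keep the two tracks on the same timeline; the rest of the writer creation is elided:

- (void)setupAssetWriterAndStartWith:(CMSampleBufferRef)sampleBuffer {
    // ... create _assetWriter with an output URL and add the two inputs shown above ...
    if ([_assetWriter startWriting]) {
        // Sketch/assumption: use the first buffer's PTS as the session start so that
        // audio and video samples are written against the same timeline origin.
        CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [_assetWriter startSessionAtSourceTime:startTime];
        self.AVAssetWriterStatus = 2; // matches the flag checked in the capture handler
    }
}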
In my app, I implemented screen recording functionality, but there was an unexpected crash. Here is the stack trace:
0   CoreFoundation            _CFRelease.cold.1 + 16
1   CoreFoundation            ___CFTypeCollectionRelease
2   ReplayKit                 ___56-[RPScreenRecorder captureHandlerWithSample:timingData:]_block_invoke + 148
3   libdispatch.dylib         __dispatch_call_block_and_release + 32
4   libdispatch.dylib         __dispatch_client_callout + 16
5   libdispatch.dylib         __dispatch_lane_serial_drain + 740
6   libdispatch.dylib         __dispatch_lane_invoke + 388
7   libdispatch.dylib         __dispatch_root_queue_drain_deferred_wlh + 292
8   libdispatch.dylib         __dispatch_workloop_worker_thread + 540
9   libsystem_pthread.dylib   __pthread_wqthread + 292
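The top frames are a CFRelease inside ReplayKit's captureHandlerWithSample:timingData: block, so my working theory is an over-released CMSampleBufferRef. If the buffer is handed off to another queue before being appended, I assume it needs to be retained for that duration, roughly like this (self.writerQueue is a hypothetical serial queue, not something in the code above):

// Sketch/assumption: keep the sample buffer alive while it is used outside the handler.
CFRetain(sampleBuffer);
dispatch_async(self.writerQueue, ^{ // hypothetical serial queue for writer access
    if (self.videoAssetWriterInput.readyForMoreMediaData) {
        BOOL success = [self.videoAssetWriterInput appendSampleBuffer:sampleBuffer];
        if (!success) {
            // deal with append failure
        }
    }
    CFRelease(sampleBuffer);
});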