
WebRTC Audio/Video Calls: Capturing with a Custom RTCVideoCapturer Camera


In a previous post we already implemented WebRTC against an ossrs server for live video calls. In practice, however, the methods provided by the RTCCameraVideoCapturer class do not let us change or adjust camera settings such as the torch, so we need a custom RTCVideoCapturer that captures the frames ourselves.
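
For example, once we manage the AVCaptureDevice ourselves, it can be locked and reconfigured directly; a minimal sketch for toggling the torch (assuming device is the AVCaptureDevice held by the custom capturer):

// Sketch: adjust a camera setting (the torch) on a device we manage ourselves.
NSError *error = nil;
if (device.hasTorch && [device lockForConfiguration:&error]) {
    if ([device isTorchModeSupported:AVCaptureTorchModeOn]) {
        device.torchMode = AVCaptureTorchModeOn;  // AVCaptureTorchModeOff to switch back
    }
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for torch configuration: %@", error);
}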


For the iOS WebRTC + ossrs setup that implements the live video call, see:
https://blog.csdn.net/gloryFlow/article/details/132262724

Here we implement the custom RTCVideoCapturer.

1. Classes the custom camera needs

The classes to know (a minimal wiring sketch follows this list):

  • AVCaptureSession
    AVCaptureSession is the iOS object that manages and coordinates the flow of data from input devices to outputs.

  • AVCaptureDevice
    AVCaptureDevice represents a physical capture device such as the camera.

  • AVCaptureDeviceInput
    AVCaptureDeviceInput captures input data from an AVCaptureDevice.

  • AVCaptureMetadataOutput
    AVCaptureMetadataOutput is a capture output that processes timed metadata (for example detected faces) produced by the AVCaptureSession.

  • AVCaptureVideoDataOutput
    AVCaptureVideoDataOutput is the capture output that delivers video frames.

  • AVCaptureVideoPreviewLayer
    AVCaptureVideoPreviewLayer is the preview layer for captured video and is used to display it on screen.
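
To make the relationships concrete, here is a minimal wiring sketch (illustrative only; the article's real setup lives in SDCustomRTCCameraCapturer below):

// Minimal sketch: session -> device -> input -> output -> preview layer.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// The sample buffer delegate receives every captured video frame.
[output setSampleBufferDelegate:self queue:dispatch_queue_create("capture.queue", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
// Add previewLayer as a sublayer of a view's layer to show the camera preview.
[session startRunning];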

2. Custom RTCVideoCapturer camera capture

For the custom camera, we add an AVCaptureVideoDataOutput to the AVCaptureSession:

self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];
[self.dataOutput setAlwaysDiscardsLateVideoFrames:YES];
self.dataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)};
self.dataOutput.alwaysDiscardsLateVideoFrames = YES;
[self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];
if ([self.session canAddOutput:self.dataOutput]) {
    [self.session addOutput:self.dataOutput];
} else {
    NSLog(@"Could not add video data output to the session");
    return nil;
}

We also need to implement the AVCaptureVideoDataOutputSampleBufferDelegate callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;

In that callback we convert the CMSampleBufferRef to a CVPixelBufferRef, build an RTCVideoFrame from the (optionally processed) CVPixelBufferRef, and hand it to WebRTC by calling the didCaptureVideoFrame method on the localVideoSource.
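
In outline, the hand-off to WebRTC looks like this (a condensed sketch of the code shown later; the rotation value assumes the capture connection is already set to portrait):

// Sketch: wrap the camera's pixel buffer in an RTCVideoFrame and feed the local video source.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (pixelBuffer == NULL) {
    return;
}
RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
// RTCVideoRotation_0 because the capturer configures its video connection for portrait output.
RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                          rotation:RTCVideoRotation_0
                                                       timeStampNs:timeStampNs];
[self.localVideoSource capturer:capturer didCaptureVideoFrame:videoFrame];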

The complete code is as follows.

SDCustomRTCCameraCapturer.h

#import <AVFoundation/AVFoundation.h>
#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>

@protocol SDCustomRTCCameraCapturerDelegate <NSObject>

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

@end

@interface SDCustomRTCCameraCapturer : RTCVideoCapturer

@property(nonatomic, weak) id<SDCustomRTCCameraCapturerDelegate> delegate;

// Capture session that is used for capturing. Valid from initialization to dealloc.
@property(nonatomic, strong) AVCaptureSession *captureSession;

// Returns list of available capture devices that support video capture.
+ (NSArray<AVCaptureDevice *> *)captureDevices;

// Returns list of formats that are supported by this class for this device.
+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device;

// Returns the most efficient supported output pixel format for this capturer.
- (FourCharCode)preferredOutputPixelFormat;

// Starts the capture session asynchronously and notifies callback on completion.
// The device will capture video in the format given in the `format` parameter. If the pixel format
// in `format` is supported by the WebRTC pipeline, the same pixel format will be used for the
// output. Otherwise, the format returned by `preferredOutputPixelFormat` will be used.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps
             completionHandler:(nullable void (^)(NSError *))completionHandler;

// Stops the capture session asynchronously and notifies callback on completion.
- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler;

// Starts the capture session asynchronously.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps;

// Stops the capture session asynchronously.
- (void)stopCapture;

#pragma mark - Custom camera

@property (nonatomic, readonly) dispatch_queue_t bufferQueue;
@property (nonatomic, assign) AVCaptureDevicePosition devicePosition; // default AVCaptureDevicePositionFront
@property (nonatomic, assign) AVCaptureVideoOrientation videoOrientation;
@property (nonatomic, assign) BOOL needVideoMirrored;
@property (nonatomic, strong, readonly) AVCaptureConnection *videoConnection;
@property (nonatomic, copy) NSString *sessionPreset;  // default 640x480
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@property (nonatomic, assign) BOOL bSessionPause;
@property (nonatomic, assign) int iExpectedFPS;
@property (nonatomic, readwrite, strong) NSDictionary *videoCompressingSettings;

- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePosition
                        sessionPresset:(AVCaptureSessionPreset)sessionPreset
                                   fps:(int)iFPS
                         needYuvOutput:(BOOL)needYuvOutput;

- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame;
- (void)setISOValue:(float)value;
- (void)startRunning;
- (void)stopRunning;
- (void)rotateCamera;
- (void)rotateCamera:(BOOL)isUseFrontCamera;
- (void)setWhiteBalance;
- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit;

@end

SDCustomRTCCameraCapturer.m

#import "SDCustomRTCCameraCapturer.h"
#import <UIKit/UIKit.h>
#import "EFMachineVersion.h"//#import "base/RTCLogging.h"
//#import "base/RTCVideoFrameBuffer.h"
//#import "components/video_frame_buffer/RTCCVPixelBuffer.h"
//
//#if TARGET_OS_IPHONE
//#import "helpers/UIDevice+RTCDevice.h"
//#endif
//
//#import "helpers/AVCaptureSession+DevicePosition.h"
//#import "helpers/RTCDispatcher+Private.h"typedef NS_ENUM(NSUInteger, STExposureModel) {STExposureModelPositive5,STExposureModelPositive4,STExposureModelPositive3,STExposureModelPositive2,STExposureModelPositive1,STExposureModel0,STExposureModelNegative1,STExposureModelNegative2,STExposureModelNegative3,STExposureModelNegative4,STExposureModelNegative5,STExposureModelNegative6,STExposureModelNegative7,STExposureModelNegative8,
};static char * kEffectsCamera = "EffectsCamera";static STExposureModel currentExposureMode;@interface SDCustomRTCCameraCapturer ()<AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, AVCaptureMetadataOutputObjectsDelegate>
@property(nonatomic, readonly) dispatch_queue_t frameQueue;
@property(nonatomic, strong) AVCaptureDevice *currentDevice;
@property(nonatomic, assign) BOOL hasRetriedOnFatalError;
@property(nonatomic, assign) BOOL isRunning;
// Will the session be running once all asynchronous operations have been completed?
@property(nonatomic, assign) BOOL willBeRunning;@property (nonatomic, strong) AVCaptureDeviceInput *deviceInput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *dataOutput;
@property (nonatomic, strong) AVCaptureMetadataOutput *metaOutput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;@property (nonatomic, readwrite) dispatch_queue_t bufferQueue;@property (nonatomic, strong, readwrite) AVCaptureConnection *videoConnection;@property (nonatomic, strong) AVCaptureDevice *videoDevice;
@property (nonatomic, strong) AVCaptureSession *session;@end@implementation SDCustomRTCCameraCapturer {AVCaptureVideoDataOutput *_videoDataOutput;AVCaptureSession *_captureSession;FourCharCode _preferredOutputPixelFormat;FourCharCode _outputPixelFormat;RTCVideoRotation _rotation;float _autoISOValue;
#if TARGET_OS_IPHONEUIDeviceOrientation _orientation;BOOL _generatingOrientationNotifications;
#endif
}@synthesize frameQueue = _frameQueue;
@synthesize captureSession = _captureSession;
@synthesize currentDevice = _currentDevice;
@synthesize hasRetriedOnFatalError = _hasRetriedOnFatalError;
@synthesize isRunning = _isRunning;
@synthesize willBeRunning = _willBeRunning;- (instancetype)init {return [self initWithDelegate:nil captureSession:[[AVCaptureSession alloc] init]];
}- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegate {return [self initWithDelegate:delegate captureSession:[[AVCaptureSession alloc] init]];
}// This initializer is used for testing.
- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegatecaptureSession:(AVCaptureSession *)captureSession {if (self = [super initWithDelegate:delegate]) {// Create the capture session and all relevant inputs and outputs. We need// to do this in init because the application may want the capture session// before we start the capturer for e.g. AVCapturePreviewLayer. All objects// created here are retained until dealloc and never recreated.if (![self setupCaptureSession:captureSession]) {return nil;}NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
#if TARGET_OS_IPHONE_orientation = UIDeviceOrientationPortrait;_rotation = RTCVideoRotation_90;[center addObserver:selfselector:@selector(deviceOrientationDidChange:)name:UIDeviceOrientationDidChangeNotificationobject:nil];[center addObserver:selfselector:@selector(handleCaptureSessionInterruption:)name:AVCaptureSessionWasInterruptedNotificationobject:_captureSession];[center addObserver:selfselector:@selector(handleCaptureSessionInterruptionEnded:)name:AVCaptureSessionInterruptionEndedNotificationobject:_captureSession];[center addObserver:selfselector:@selector(handleApplicationDidBecomeActive:)name:UIApplicationDidBecomeActiveNotificationobject:[UIApplication sharedApplication]];
#endif[center addObserver:selfselector:@selector(handleCaptureSessionRuntimeError:)name:AVCaptureSessionRuntimeErrorNotificationobject:_captureSession];[center addObserver:selfselector:@selector(handleCaptureSessionDidStartRunning:)name:AVCaptureSessionDidStartRunningNotificationobject:_captureSession];[center addObserver:selfselector:@selector(handleCaptureSessionDidStopRunning:)name:AVCaptureSessionDidStopRunningNotificationobject:_captureSession];}return self;
}//- (void)dealloc {
//  NSAssert(
//      !_willBeRunning,
//      @"Session was still running in RTCCameraVideoCapturer dealloc. Forgot to call stopCapture?");
//  [[NSNotificationCenter defaultCenter] removeObserver:self];
//}+ (NSArray<AVCaptureDevice *> *)captureDevices {
#if defined(WEBRTC_IOS) && defined(__IPHONE_10_0) && \__IPHONE_OS_VERSION_MIN_REQUIRED >= __IPHONE_10_0AVCaptureDeviceDiscoverySession *session = [AVCaptureDeviceDiscoverySessiondiscoverySessionWithDeviceTypes:@[ AVCaptureDeviceTypeBuiltInWideAngleCamera ]mediaType:AVMediaTypeVideoposition:AVCaptureDevicePositionUnspecified];return session.devices;
#elsereturn [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#endif
}+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device {// Support opening the device in any format. We make sure it's converted to a format we// can handle, if needed, in the method `-setupVideoDataOutput`.return device.formats;
}- (FourCharCode)preferredOutputPixelFormat {return _preferredOutputPixelFormat;
}- (void)startCaptureWithDevice:(AVCaptureDevice *)deviceformat:(AVCaptureDeviceFormat *)formatfps:(NSInteger)fps {[self startCaptureWithDevice:device format:format fps:fps completionHandler:nil];
}- (void)stopCapture {[self stopCaptureWithCompletionHandler:nil];
}- (void)startCaptureWithDevice:(AVCaptureDevice *)deviceformat:(AVCaptureDeviceFormat *)formatfps:(NSInteger)fpscompletionHandler:(nullable void (^)(NSError *))completionHandler {_willBeRunning = YES;[RTCDispatcherdispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{RTCLogInfo("startCaptureWithDevice %@ @ %ld fps", format, (long)fps);#if TARGET_OS_IPHONEdispatch_async(dispatch_get_main_queue(), ^{if (!self->_generatingOrientationNotifications) {[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];self->_generatingOrientationNotifications = YES;}});
#endifself.currentDevice = device;NSError *error = nil;if (![self.currentDevice lockForConfiguration:&error]) {RTCLogError(@"Failed to lock device %@. Error: %@",self.currentDevice,error.userInfo);if (completionHandler) {completionHandler(error);}self.willBeRunning = NO;return;}[self reconfigureCaptureSessionInput];[self updateOrientation];[self updateDeviceCaptureFormat:format fps:fps];[self updateVideoDataOutputPixelFormat:format];[self.captureSession startRunning];[self.currentDevice unlockForConfiguration];self.isRunning = YES;if (completionHandler) {completionHandler(nil);}}];
}- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler {_willBeRunning = NO;[RTCDispatcherdispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{RTCLogInfo("Stop");self.currentDevice = nil;for (AVCaptureDeviceInput *oldInput in [self.captureSession.inputs copy]) {[self.captureSession removeInput:oldInput];}[self.captureSession stopRunning];#if TARGET_OS_IPHONEdispatch_async(dispatch_get_main_queue(), ^{if (self->_generatingOrientationNotifications) {[[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications];self->_generatingOrientationNotifications = NO;}});
#endifself.isRunning = NO;if (completionHandler) {completionHandler();}}];
}#pragma mark iOS notifications#if TARGET_OS_IPHONE
- (void)deviceOrientationDidChange:(NSNotification *)notification {[RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{[self updateOrientation];}];
}
#endif#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate//- (void)captureOutput:(AVCaptureOutput *)captureOutput
//    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
//           fromConnection:(AVCaptureConnection *)connection {
//  NSParameterAssert(captureOutput == _videoDataOutput);
//
//  if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
//      !CMSampleBufferDataIsReady(sampleBuffer)) {
//    return;
//  }
//
//  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//  if (pixelBuffer == nil) {
//    return;
//  }
//
//#if TARGET_OS_IPHONE
//  // Default to portrait orientation on iPhone.
//  BOOL usingFrontCamera = NO;
//  // Check the image's EXIF for the camera the image came from as the image could have been
//  // delayed as we set alwaysDiscardsLateVideoFrames to NO.
//
//    AVCaptureDeviceInput *deviceInput =
//        (AVCaptureDeviceInput *)((AVCaptureInputPort *)connection.inputPorts.firstObject).input;
//    usingFrontCamera = AVCaptureDevicePositionFront == deviceInput.device.position;
//
//    switch (_orientation) {
//    case UIDeviceOrientationPortrait:
//      _rotation = RTCVideoRotation_90;
//      break;
//    case UIDeviceOrientationPortraitUpsideDown:
//      _rotation = RTCVideoRotation_270;
//      break;
//    case UIDeviceOrientationLandscapeLeft:
//      _rotation = usingFrontCamera ? RTCVideoRotation_180 : RTCVideoRotation_0;
//      break;
//    case UIDeviceOrientationLandscapeRight:
//      _rotation = usingFrontCamera ? RTCVideoRotation_0 : RTCVideoRotation_180;
//      break;
//    case UIDeviceOrientationFaceUp:
//    case UIDeviceOrientationFaceDown:
//    case UIDeviceOrientationUnknown:
//      // Ignore.
//      break;
//  }
//#else
//  // No rotation on Mac.
//  _rotation = RTCVideoRotation_0;
//#endif
//
//    if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {
//        [self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];
//    }
//}#pragma mark - AVCaptureSession notifications- (void)handleCaptureSessionInterruption:(NSNotification *)notification {NSString *reasonString = nil;
#if TARGET_OS_IPHONENSNumber *reason = notification.userInfo[AVCaptureSessionInterruptionReasonKey];if (reason) {switch (reason.intValue) {case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableInBackground:reasonString = @"VideoDeviceNotAvailableInBackground";break;case AVCaptureSessionInterruptionReasonAudioDeviceInUseByAnotherClient:reasonString = @"AudioDeviceInUseByAnotherClient";break;case AVCaptureSessionInterruptionReasonVideoDeviceInUseByAnotherClient:reasonString = @"VideoDeviceInUseByAnotherClient";break;case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps:reasonString = @"VideoDeviceNotAvailableWithMultipleForegroundApps";break;}}
#endifRTCLog(@"Capture session interrupted: %@", reasonString);
}- (void)handleCaptureSessionInterruptionEnded:(NSNotification *)notification {RTCLog(@"Capture session interruption ended.");
}- (void)handleCaptureSessionRuntimeError:(NSNotification *)notification {NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];RTCLogError(@"Capture session runtime error: %@", error);[RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{
#if TARGET_OS_IPHONEif (error.code == AVErrorMediaServicesWereReset) {[self handleNonFatalError];} else {[self handleFatalError];}
#else[self handleFatalError];
#endif}];
}- (void)handleCaptureSessionDidStartRunning:(NSNotification *)notification {RTCLog(@"Capture session started.");[RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{// If we successfully restarted after an unknown error,// allow future retries on fatal errors.self.hasRetriedOnFatalError = NO;}];
}- (void)handleCaptureSessionDidStopRunning:(NSNotification *)notification {RTCLog(@"Capture session stopped.");
}- (void)handleFatalError {[RTCDispatcherdispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{if (!self.hasRetriedOnFatalError) {RTCLogWarning(@"Attempting to recover from fatal capture error.");[self handleNonFatalError];self.hasRetriedOnFatalError = YES;} else {RTCLogError(@"Previous fatal error recovery failed.");}}];
}- (void)handleNonFatalError {[RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{RTCLog(@"Restarting capture session after error.");if (self.isRunning) {[self.captureSession startRunning];}}];
}#if TARGET_OS_IPHONE#pragma mark - UIApplication notifications- (void)handleApplicationDidBecomeActive:(NSNotification *)notification {[RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSessionblock:^{if (self.isRunning && !self.captureSession.isRunning) {RTCLog(@"Restarting capture session on active.");[self.captureSession startRunning];}}];
}#endif  // TARGET_OS_IPHONE#pragma mark - Private- (dispatch_queue_t)frameQueue {if (!_frameQueue) {_frameQueue =dispatch_queue_create("org.webrtc.cameravideocapturer.video", DISPATCH_QUEUE_SERIAL);dispatch_set_target_queue(_frameQueue,dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));}return _frameQueue;
}- (BOOL)setupCaptureSession:(AVCaptureSession *)captureSession {NSAssert(_captureSession == nil, @"Setup capture session called twice.");_captureSession = captureSession;
#if defined(WEBRTC_IOS)_captureSession.sessionPreset = AVCaptureSessionPresetInputPriority;_captureSession.usesApplicationAudioSession = NO;
#endif[self setupVideoDataOutput];// Add the output.if (![_captureSession canAddOutput:_videoDataOutput]) {RTCLogError(@"Video data output unsupported.");return NO;}[_captureSession addOutput:_videoDataOutput];return YES;
}- (void)setupVideoDataOutput {NSAssert(_videoDataOutput == nil, @"Setup video data output called twice.");AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];// `videoDataOutput.availableVideoCVPixelFormatTypes` returns the pixel formats supported by the// device with the most efficient output format first. Find the first format that we support.NSSet<NSNumber *> *supportedPixelFormats = [RTCCVPixelBuffer supportedPixelFormats];NSMutableOrderedSet *availablePixelFormats =[NSMutableOrderedSet orderedSetWithArray:videoDataOutput.availableVideoCVPixelFormatTypes];[availablePixelFormats intersectSet:supportedPixelFormats];NSNumber *pixelFormat = availablePixelFormats.firstObject;NSAssert(pixelFormat, @"Output device has no supported formats.");_preferredOutputPixelFormat = [pixelFormat unsignedIntValue];_outputPixelFormat = _preferredOutputPixelFormat;
//  videoDataOutput.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : pixelFormat};//    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};videoDataOutput.alwaysDiscardsLateVideoFrames = YES;[videoDataOutput setSampleBufferDelegate:self queue:self.frameQueue];_videoDataOutput = videoDataOutput;//    // 设置视频方向和旋转
//    AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
//    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//    [connection setVideoMirrored:NO];
}- (void)updateVideoDataOutputPixelFormat:(AVCaptureDeviceFormat *)format {
//  FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format.formatDescription);
//  if (![[RTCCVPixelBuffer supportedPixelFormats] containsObject:@(mediaSubType)]) {
//    mediaSubType = _preferredOutputPixelFormat;
//  }
//
//  if (mediaSubType != _outputPixelFormat) {
//    _outputPixelFormat = mediaSubType;
//    _videoDataOutput.videoSettings =
//        @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(mediaSubType) };
//  }
}#pragma mark - Private, called inside capture queue- (void)updateDeviceCaptureFormat:(AVCaptureDeviceFormat *)format fps:(NSInteger)fps {NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],@"updateDeviceCaptureFormat must be called on the capture queue.");@try {_currentDevice.activeFormat = format;_currentDevice.activeVideoMinFrameDuration = CMTimeMake(1, fps);} @catch (NSException *exception) {RTCLogError(@"Failed to set active format!\n User info:%@", exception.userInfo);return;}
}- (void)reconfigureCaptureSessionInput {NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],@"reconfigureCaptureSessionInput must be called on the capture queue.");NSError *error = nil;AVCaptureDeviceInput *input =[AVCaptureDeviceInput deviceInputWithDevice:_currentDevice error:&error];if (!input) {RTCLogError(@"Failed to create front camera input: %@", error.localizedDescription);return;}[_captureSession beginConfiguration];for (AVCaptureDeviceInput *oldInput in [_captureSession.inputs copy]) {[_captureSession removeInput:oldInput];}if ([_captureSession canAddInput:input]) {[_captureSession addInput:input];} else {RTCLogError(@"Cannot add camera as an input to the session.");}[_captureSession commitConfiguration];
}- (void)updateOrientation {NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],@"updateOrientation must be called on the capture queue.");
#if TARGET_OS_IPHONE_orientation = [UIDevice currentDevice].orientation;
#endif
}- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePositionsessionPresset:(AVCaptureSessionPreset)sessionPresetfps:(int)iFPSneedYuvOutput:(BOOL)needYuvOutput
{self = [super init];if (self) {self.bSessionPause = YES;self.bufferQueue = dispatch_queue_create("STCameraBufferQueue", NULL);self.session = [[AVCaptureSession alloc] init];self.videoDevice = [self cameraDeviceWithPosition:iDevicePosition];_devicePosition = iDevicePosition;NSError *error = nil;self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.videoDeviceerror:&error];if (!self.deviceInput || error) {NSLog(@"create input error");return nil;}self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];[self.dataOutput setAlwaysDiscardsLateVideoFrames:YES];self.dataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)};self.dataOutput.alwaysDiscardsLateVideoFrames = YES;[self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];self.metaOutput = [[AVCaptureMetadataOutput alloc] init];[self.metaOutput setMetadataObjectsDelegate:self queue:self.bufferQueue];self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];self.stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};if ([self.stillImageOutput respondsToSelector:@selector(setHighResolutionStillImageOutputEnabled:)]) {self.stillImageOutput.highResolutionStillImageOutputEnabled = YES;}[self.session beginConfiguration];if ([self.session canAddInput:self.deviceInput]) {[self.session addInput:self.deviceInput];}else{NSLog( @"Could not add device input to the session" );return nil;}if ([self.session canSetSessionPreset:sessionPreset]) {[self.session setSessionPreset:sessionPreset];_sessionPreset = sessionPreset;}else if([self.session canSetSessionPreset:AVCaptureSessionPreset1920x1080]){[self.session setSessionPreset:AVCaptureSessionPreset1920x1080];_sessionPreset = AVCaptureSessionPreset1920x1080;}else if([self.session canSetSessionPreset:AVCaptureSessionPreset1280x720]){[self.session setSessionPreset:AVCaptureSessionPreset1280x720];_sessionPreset = AVCaptureSessionPreset1280x720;}else{[self.session setSessionPreset:AVCaptureSessionPreset640x480];_sessionPreset = AVCaptureSessionPreset640x480;}if ([self.session canAddOutput:self.dataOutput]) {[self.session addOutput:self.dataOutput];}else{NSLog( @"Could not add video data output to the session" );return nil;}if ([self.session canAddOutput:self.metaOutput]) {[self.session addOutput:self.metaOutput];self.metaOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace].copy;}if ([self.session canAddOutput:self.stillImageOutput]) {[self.session addOutput:self.stillImageOutput];}else {NSLog(@"Could not add still image output to the session");}self.videoConnection =  [self.dataOutput connectionWithMediaType:AVMediaTypeVideo];if ([self.videoConnection isVideoOrientationSupported]) {[self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];self.videoOrientation = AVCaptureVideoOrientationPortrait;}if ([self.videoConnection isVideoMirroringSupported]) {[self.videoConnection setVideoMirrored:YES];self.needVideoMirrored = YES;}if ([_videoDevice lockForConfiguration:NULL] == YES) {
//            _videoDevice.activeFormat = bestFormat;_videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, iFPS);_videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, iFPS);[_videoDevice unlockForConfiguration];}[self.session commitConfiguration];NSMutableDictionary *tmpSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] mutableCopy];if (!EFMachineVersion.isiPhone5sOrLater) {NSNumber *tmpSettingValue = tmpSettings[AVVideoHeightKey];tmpSettings[AVVideoHeightKey] = tmpSettings[AVVideoWidthKey];tmpSettings[AVVideoWidthKey] = tmpSettingValue;}self.videoCompressingSettings = [tmpSettings copy];self.iExpectedFPS = iFPS;[self addObservers];}return self;
}- (void)rotateCamera {if (self.devicePosition == AVCaptureDevicePositionFront) {self.devicePosition = AVCaptureDevicePositionBack;}else{self.devicePosition = AVCaptureDevicePositionFront;}
}- (void)rotateCamera:(BOOL)isUseFrontCamera {if (isUseFrontCamera) {self.devicePosition = AVCaptureDevicePositionFront;}else{self.devicePosition = AVCaptureDevicePositionBack;}
}- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame {BOOL isFrontCamera = self.devicePosition == AVCaptureDevicePositionFront;float fX = point.y / frame.size.height;float fY = isFrontCamera ? point.x / frame.size.width : (1 - point.x / frame.size.width);[self focusWithMode:self.videoDevice.focusMode exposureMode:self.videoDevice.exposureMode atPoint:CGPointMake(fX, fY)];
}- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{NSError *error = nil;AVCaptureDevice * device = self.videoDevice;if ( [device lockForConfiguration:&error] ) {device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
//        - (void)setISOValue:(float)value{[self setISOValue:0];
//device.exposureTargetBias// Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation// Call -set(Focus/Exposure)Mode: to apply the new point of interestif ( focusMode != AVCaptureFocusModeLocked && device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode] ) {device.focusPointOfInterest = point;device.focusMode = focusMode;}if ( exposureMode != AVCaptureExposureModeCustom && device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode] ) {device.exposurePointOfInterest = point;device.exposureMode = exposureMode;}device.subjectAreaChangeMonitoringEnabled = YES;[device unlockForConfiguration];}
}- (void)setWhiteBalance{[self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {if ([captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) {[captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];}}];
}- (void)changeDeviceProperty:(void(^)(AVCaptureDevice *))propertyChange{AVCaptureDevice *captureDevice= self.videoDevice;NSError *error;if ([captureDevice lockForConfiguration:&error]) {propertyChange(captureDevice);[captureDevice unlockForConfiguration];}else{NSLog(@"设置设备属性过程发生错误,错误信息:%@",error.localizedDescription);}
}- (void)setISOValue:(float)value exposeDuration:(int)duration{float currentISO = (value < self.videoDevice.activeFormat.minISO) ? self.videoDevice.activeFormat.minISO: value;currentISO = value > self.videoDevice.activeFormat.maxISO ? self.videoDevice.activeFormat.maxISO : value;NSError *error;if ([self.videoDevice lockForConfiguration:&error]){[self.videoDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];[self.videoDevice unlockForConfiguration];}
}- (void)setISOValue:(float)value{
//    float newVlaue = (value - 0.5) * (5.0 / 0.5); // mirror [0,1] to [-8,8]NSLog(@"%f", value);NSError *error = nil;if ( [self.videoDevice lockForConfiguration:&error] ) {[self.videoDevice setExposureTargetBias:value completionHandler:nil];[self.videoDevice unlockForConfiguration];}else {NSLog( @"Could not lock device for configuration: %@", error );}
}- (void)setDevicePosition:(AVCaptureDevicePosition)devicePosition
{if (_devicePosition != devicePosition && devicePosition != AVCaptureDevicePositionUnspecified) {if (_session) {AVCaptureDevice *targetDevice = [self cameraDeviceWithPosition:devicePosition];if (targetDevice && [self judgeCameraAuthorization]) {NSError *error = nil;AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:targetDevice error:&error];if(!deviceInput || error) {NSLog(@"Error creating capture device input: %@", error.localizedDescription);return;}_bSessionPause = YES;[_session beginConfiguration];[_session removeInput:_deviceInput];if ([_session canAddInput:deviceInput]) {[_session addInput:deviceInput];_deviceInput = deviceInput;_videoDevice = targetDevice;_devicePosition = devicePosition;}_videoConnection =  [_dataOutput connectionWithMediaType:AVMediaTypeVideo];if ([_videoConnection isVideoOrientationSupported]) {[_videoConnection setVideoOrientation:_videoOrientation];}if ([_videoConnection isVideoMirroringSupported]) {[_videoConnection setVideoMirrored:devicePosition == AVCaptureDevicePositionFront];}[_session commitConfiguration];[self setSessionPreset:_sessionPreset];_bSessionPause = NO;}}}
}- (void)setSessionPreset:(NSString *)sessionPreset
{if (_session && _sessionPreset) {//        if (![sessionPreset isEqualToString:_sessionPreset]) {_bSessionPause = YES;[_session beginConfiguration];if ([_session canSetSessionPreset:sessionPreset]) {[_session setSessionPreset:sessionPreset];_sessionPreset = sessionPreset;}[_session commitConfiguration];self.videoCompressingSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] copy];//        [self setIExpectedFPS:_iExpectedFPS];_bSessionPause = NO;
//        }}
}- (void)setIExpectedFPS:(int)iExpectedFPS
{_iExpectedFPS = iExpectedFPS;if (iExpectedFPS <= 0 || !_dataOutput.videoSettings || !_videoDevice) {return;}CGFloat fWidth = [[_dataOutput.videoSettings objectForKey:@"Width"] floatValue];CGFloat fHeight = [[_dataOutput.videoSettings objectForKey:@"Height"] floatValue];AVCaptureDeviceFormat *bestFormat = nil;AVFrameRateRange *bestFrameRateRange = nil;for (AVCaptureDeviceFormat *format in [_videoDevice formats]) {CMFormatDescriptionRef description = format.formatDescription;if (CMFormatDescriptionGetMediaSubType(description) != kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {continue;}CMVideoDimensions videoDimension = CMVideoFormatDescriptionGetDimensions(description);if ((videoDimension.width == fWidth && videoDimension.height == fHeight)||(videoDimension.height == fWidth && videoDimension.width == fHeight)) {for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {if (range.maxFrameRate >= bestFrameRateRange.maxFrameRate) {bestFormat = format;bestFrameRateRange = range;}}}}if (bestFormat) {CMTime minFrameDuration;if (bestFrameRateRange.minFrameDuration.timescale / bestFrameRateRange.minFrameDuration.value < iExpectedFPS) {minFrameDuration = bestFrameRateRange.minFrameDuration;}else{minFrameDuration = CMTimeMake(1, iExpectedFPS);}if ([_videoDevice lockForConfiguration:NULL] == YES) {_videoDevice.activeFormat = bestFormat;_videoDevice.activeVideoMinFrameDuration = minFrameDuration;_videoDevice.activeVideoMaxFrameDuration = minFrameDuration;[_videoDevice unlockForConfiguration];}}
}- (void)startRunning
{if (![self judgeCameraAuthorization]) {return;}if (!self.dataOutput) {return;}if (self.session && ![self.session isRunning]) {if (self.bufferQueue) {dispatch_async(self.bufferQueue, ^{[self.session startRunning];});}self.bSessionPause = NO;}
}- (void)stopRunning
{if (self.session && [self.session isRunning]) {if (self.bufferQueue) {dispatch_async(self.bufferQueue, ^{[self.session stopRunning];});}self.bSessionPause = YES;}
}- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit
{CGRect rectRet = rect;if (self.dataOutput.videoSettings) {CGFloat fWidth = [[self.dataOutput.videoSettings objectForKey:@"Width"] floatValue];CGFloat fHeight = [[self.dataOutput.videoSettings objectForKey:@"Height"] floatValue];float fScaleX = fWidth / CGRectGetWidth(rect);float fScaleY = fHeight / CGRectGetHeight(rect);float fScale = bScaleToFit ? fmaxf(fScaleX, fScaleY) : fminf(fScaleX, fScaleY);fWidth /= fScale;fHeight /= fScale;CGFloat fX = rect.origin.x - (fWidth - rect.size.width) / 2.0f;CGFloat fY = rect.origin.y - (fHeight - rect.size.height) / 2.0f;rectRet = CGRectMake(fX, fY, fWidth, fHeight);}return rectRet;
}- (BOOL)judgeCameraAuthorization
{AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];if (authStatus == AVAuthorizationStatusRestricted || authStatus == AVAuthorizationStatusDenied) {return NO;}return YES;
}- (AVCaptureDevice *)cameraDeviceWithPosition:(AVCaptureDevicePosition)position
{AVCaptureDevice *deviceRet = nil;if (position != AVCaptureDevicePositionUnspecified) {NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];for (AVCaptureDevice *device in devices) {if ([device position] == position) {deviceRet = device;}}}return deviceRet;
}- (AVCaptureVideoPreviewLayer *)previewLayer
{if (!_previewLayer) {_previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];}return _previewLayer;
}- (void)snapStillImageCompletionHandler:(void (^)(CMSampleBufferRef imageDataSampleBuffer, NSError *error))handler
{if ([self judgeCameraAuthorization]) {self.bSessionPause = YES;NSString *strSessionPreset = [self.sessionPreset mutableCopy];self.sessionPreset = AVCaptureSessionPresetPhoto;// 改变preset会黑一下[NSThread sleepForTimeInterval:0.3];dispatch_async(self.bufferQueue, ^{[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {self.bSessionPause = NO;self.sessionPreset = strSessionPreset;handler(imageDataSampleBuffer , error);}];} );}
}
BOOL updateExporureModel(STExposureModel model){if (currentExposureMode == model) return NO;currentExposureMode = model;return YES;
}- (void)test:(AVCaptureExposureMode)model{if ([self.videoDevice lockForConfiguration:nil]) {if ([self.videoDevice  isExposureModeSupported:model]) {[self.videoDevice setExposureMode:model];}[self.videoDevice unlockForConfiguration];}
}- (void)setExposureTime:(CMTime)time{if ([self.videoDevice lockForConfiguration:nil]) {if (@available(iOS 12.0, *)) {self.videoDevice.activeMaxExposureDuration = time;} else {// Fallback on earlier versions}[self.videoDevice unlockForConfiguration];}
}
- (void)setFPS:(float)fps{if ([_videoDevice lockForConfiguration:NULL] == YES) {
//            _videoDevice.activeFormat = bestFormat;_videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, fps);_videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, fps);[_videoDevice unlockForConfiguration];}
}- (void)updateExposure:(CMSampleBufferRef)sampleBuffer{CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,sampleBuffer, kCMAttachmentMode_ShouldPropagate);NSDictionary * metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];CFRelease(metadataDict);NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];if(brightnessValue > 2 && updateExporureModel(STExposureModelPositive2)){[self setISOValue:500 exposeDuration:30];[self test:AVCaptureExposureModeContinuousAutoExposure];[self setFPS:30];}else if(brightnessValue > 1 && brightnessValue < 2 && updateExporureModel(STExposureModelPositive1)){[self setISOValue:500 exposeDuration:30];[self test:AVCaptureExposureModeContinuousAutoExposure];[self setFPS:30];}else if(brightnessValue > 0 && brightnessValue < 1 && updateExporureModel(STExposureModel0)){[self setISOValue:500 exposeDuration:30];[self test:AVCaptureExposureModeContinuousAutoExposure];[self setFPS:30];}else if (brightnessValue > -1 && brightnessValue < 0 && updateExporureModel(STExposureModelNegative1)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:40];[self test:AVCaptureExposureModeContinuousAutoExposure];}else if (brightnessValue > -2 && brightnessValue < -1 && updateExporureModel(STExposureModelNegative2)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:35];[self test:AVCaptureExposureModeContinuousAutoExposure];}else if (brightnessValue > -2.5 && brightnessValue < -2 && updateExporureModel(STExposureModelNegative3)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:30];[self test:AVCaptureExposureModeContinuousAutoExposure];}else if (brightnessValue > -3 && brightnessValue < -2.5 && updateExporureModel(STExposureModelNegative4)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:25];[self test:AVCaptureExposureModeContinuousAutoExposure];}else if (brightnessValue > -3.5 && brightnessValue < -3 && updateExporureModel(STExposureModelNegative5)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:20];[self test:AVCaptureExposureModeContinuousAutoExposure];}else if (brightnessValue > -4 && brightnessValue < -3.5 && updateExporureModel(STExposureModelNegative6)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 250 exposeDuration:15];[self test:AVCaptureExposureModeContinuousAutoExposure];}else if (brightnessValue > -5 && brightnessValue < -4 && updateExporureModel(STExposureModelNegative7)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:10];[self test:AVCaptureExposureModeContinuousAutoExposure];}else if(brightnessValue < -5 && updateExporureModel(STExposureModelNegative8)){[self setISOValue:self.videoDevice.activeFormat.maxISO - 150 exposeDuration:5];[self test:AVCaptureExposureModeContinuousAutoExposure];}//    NSLog(@"current brightness %f min iso %f max iso %f min exposure %f max exposure %f", brightnessValue, self.videoDevice.activeFormat.minISO, self.videoDevice.activeFormat.maxISO, CMTimeGetSeconds(self.videoDevice.activeFormat.minExposureDuration),   CMTimeGetSeconds(self.videoDevice.activeFormat.maxExposureDuration));
}- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{if (!self.bSessionPause) {if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {[self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];}
//        if (self.delegate && [self.delegate respondsToSelector:@selector(captureOutput:didOutputSampleBuffer:fromConnection:)]) {
//            //[connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//            [self.delegate captureOutput:captureOutput didOutputSampleBuffer:sampleBuffer fromConnection:connection];
//        }}
//    [self updateExposure:sampleBuffer];
}- (void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection{AVMetadataFaceObject *faceObject = nil;for(AVMetadataObject *object  in metadataObjects){if (AVMetadataObjectTypeFace == object.type) {faceObject = (AVMetadataFaceObject*)object;}}static BOOL hasFace = NO;if (!hasFace && faceObject.faceID) {hasFace = YES;}if (!faceObject.faceID) {hasFace = NO;}
}#pragma mark - Notifications
- (void)addObservers {[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appWillResignActive) name:UIApplicationWillResignActiveNotification object:nil];[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appDidBecomeActive) name:UIApplicationDidBecomeActiveNotification object:nil];[[NSNotificationCenter defaultCenter] addObserver:selfselector:@selector(dealCaptureSessionRuntimeError:)name:AVCaptureSessionRuntimeErrorNotificationobject:self.session];
}- (void)removeObservers {[[NSNotificationCenter defaultCenter] removeObserver:self];
}#pragma mark - Notification
- (void)appWillResignActive {RTCLogInfo("SDCustomRTCCameraCapturer appWillResignActive");
}- (void)appDidBecomeActive {RTCLogInfo("SDCustomRTCCameraCapturer appDidBecomeActive");[self startRunning];
}- (void)dealCaptureSessionRuntimeError:(NSNotification *)notification {NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];RTCLogError(@"SDCustomRTCCameraCapturer dealCaptureSessionRuntimeError error: %@", error);if (error.code == AVErrorMediaServicesWereReset) {[self startRunning];}
}#pragma mark - Dealloc
- (void)dealloc {DebugLog(@"SDCustomRTCCameraCapturer dealloc");[self removeObservers];if (self.session) {self.bSessionPause = YES;[self.session beginConfiguration];[self.session removeOutput:self.dataOutput];[self.session removeInput:self.deviceInput];[self.session commitConfiguration];if ([self.session isRunning]) {[self.session stopRunning];}self.session = nil;}
}@end

3. WebRTC video calls with ossrs

For the iOS WebRTC + ossrs setup that implements the live video call, see:
https://blog.csdn.net/gloryFlow/article/details/132262724

The changes required are listed below.

Use the SDCustomRTCCameraCapturer class in createVideoTrack:

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;

    // On the simulator, fall back to a file-based capturer.
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else {
        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];
        // self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDelegate:self];
    }

    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    return videoTrack;
}

When the local video needs to be rendered on screen, call startCaptureLocalVideo to attach a rendering view to the localVideoTrack; the renderer is an RTCMTLVideoView:

self.localRenderer = [[RTCMTLVideoView alloc] initWithFrame:CGRectZero];
self.localRenderer.delegate = self;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    if (!renderer) {
        return;
    }
    if (!self.videoCapturer) {
        return;
    }

    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];

    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack];
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;

        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        if (!formatNilable) {
            return;
        }
        DebugLog(@"formatNilable:%@", formatNilable);

        NSInteger fps = [self selectFpsForFormat:formatNilable];
        CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoVideoDimensions.width;
        float height = videoVideoDimensions.height;
        DebugLog(@"videoVideoDimensions width:%f,height:%f", width, height);

        [cameraVideoCapturer startRunning];
        // [cameraVideoCapturer startCaptureWithDevice:camera format:formatNilable fps:fps completionHandler:^(NSError *error) {
        //     DebugLog(@"startCaptureWithDevice error:%@", error);
        // }];
        [self changeResolution:width height:height fps:(int)fps];
    }

    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }

    [self.localVideoTrack addRenderer:renderer];
}

The captured camera frames are delivered through - (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

The implementation is as follows:

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    // The capturer's video connection is set to portrait, so no extra rotation is applied here.
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                 rotation:RTCVideoRotation_0
                                                              timeStampNs:timeStampNs];
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}

The complete WebRTCClient code follows.

WebRTCClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>
#import <UIKit/UIKit.h>
#import "SDCustomRTCCameraCapturer.h"#define kSelectedResolution @"kSelectedResolutionIndex"
#define kFramerateLimit 30.0@protocol WebRTCClientDelegate;
@interface WebRTCClient : NSObject@property (nonatomic, weak) id<WebRTCClientDelegate> delegate;/**connect工厂*/
@property (nonatomic, strong) RTCPeerConnectionFactory *factory;/**是否push*/
@property (nonatomic, assign) BOOL isPublish;/**connect*/
@property (nonatomic, strong) RTCPeerConnection *peerConnection;/**RTCAudioSession*/
@property (nonatomic, strong) RTCAudioSession *rtcAudioSession;/**DispatchQueue*/
@property (nonatomic) dispatch_queue_t audioQueue;/**mediaConstrains*/
@property (nonatomic, strong) NSDictionary *mediaConstrains;/**publishMediaConstrains*/
@property (nonatomic, strong) NSDictionary *publishMediaConstrains;/**playMediaConstrains*/
@property (nonatomic, strong) NSDictionary *playMediaConstrains;/**optionalConstraints*/
@property (nonatomic, strong) NSDictionary *optionalConstraints;/**RTCVideoCapturer摄像头采集器*/
@property (nonatomic, strong) RTCVideoCapturer *videoCapturer;/**local语音localAudioTrack*/
@property (nonatomic, strong) RTCAudioTrack *localAudioTrack;/**localVideoTrack*/
@property (nonatomic, strong) RTCVideoTrack *localVideoTrack;/**remoteVideoTrack*/
@property (nonatomic, strong) RTCVideoTrack *remoteVideoTrack;/**RTCVideoRenderer*/
@property (nonatomic, weak) id<RTCVideoRenderer> remoteRenderView;/**localDataChannel*/
@property (nonatomic, strong) RTCDataChannel *localDataChannel;/**localDataChannel*/
@property (nonatomic, strong) RTCDataChannel *remoteDataChannel;/**RTCVideoSource*/
@property (nonatomic, strong) RTCVideoSource *localVideoSource;- (instancetype)initWithPublish:(BOOL)isPublish;- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer;- (void)addIceCandidate:(RTCIceCandidate *)candidate;- (void)answer:(void (^)(RTCSessionDescription *sdp))completionHandler;- (void)offer:(void (^)(RTCSessionDescription *sdp))completionHandler;- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion;- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate;- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps;- (NSArray<NSString *> *)availableVideoResolutions;#pragma mark - switchCamera
- (void)switchCamera:(id<RTCVideoRenderer>)renderer;#pragma mark - Hiden or show Video
- (void)hidenVideo;- (void)showVideo;#pragma mark - Hiden or show Audio
- (void)muteAudio;- (void)unmuteAudio;- (void)speakOff;- (void)speakOn;#pragma mark - 设置视频码率BitrateBps
- (void)setMaxBitrate:(int)maxBitrate;- (void)setMinBitrate:(int)minBitrate;#pragma mark - 设置视频帧率
- (void)setMaxFramerate:(int)maxFramerate;- (void)close;@end@protocol WebRTCClientDelegate <NSObject>// 处理美颜设置
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer;- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate;
- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state;
- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data;@end

WebRTCClient.m

#import "WebRTCClient.h"@interface WebRTCClient ()<RTCPeerConnectionDelegate, RTCDataChannelDelegate, RTCVideoCapturerDelegate, SDCustomRTCCameraCapturerDelegate>@property (nonatomic, assign) BOOL usingFrontCamera;@end@implementation WebRTCClient- (instancetype)initWithPublish:(BOOL)isPublish {self = [super init];if (self) {self.isPublish = isPublish;self.usingFrontCamera = YES;RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.publishMediaConstrains optionalConstraints:self.optionalConstraints];RTCConfiguration *newConfig = [[RTCConfiguration alloc] init];newConfig.sdpSemantics = RTCSdpSemanticsUnifiedPlan;newConfig.continualGatheringPolicy = RTCContinualGatheringPolicyGatherContinually;self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];[self createMediaSenders];[self createMediaReceivers];// srs not support data channel.// self.createDataChannel()[self configureAudioSession];self.peerConnection.delegate = self;}return self;
}- (void)createMediaSenders {if (!self.isPublish) {return;}NSString *streamId = @"stream";// AudioRTCAudioTrack *audioTrack = [self createAudioTrack];self.localAudioTrack = audioTrack;RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;audioTrackTransceiver.streamIds = @[streamId];[self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];// VideoRTCRtpEncodingParameters *encodingParameters = [[RTCRtpEncodingParameters alloc] init];encodingParameters.maxBitrateBps = @(6000000);
//    encodingParameters.bitratePriority = 1.0;RTCVideoTrack *videoTrack = [self createVideoTrack];self.localVideoTrack = videoTrack;RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;videoTrackTransceiver.streamIds = @[streamId];// 设置该属性后,SRS服务无法播放视频画面
//    videoTrackTransceiver.sendEncodings = @[encodingParameters];[self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];[self setDegradationPreference:RTCDegradationPreferenceBalanced];[self setMaxBitrate:6000000];
}- (void)createMediaReceivers {if (!self.isPublish) {return;}if (self.peerConnection.transceivers.count > 0) {RTCRtpTransceiver *transceiver = self.peerConnection.transceivers.firstObject;if (transceiver.mediaType == RTCRtpMediaTypeVideo) {RTCVideoTrack *track = (RTCVideoTrack *)transceiver.receiver.track;self.remoteVideoTrack = track;}}
}- (void)configureAudioSession {[self.rtcAudioSession lockForConfiguration];@try {RTCAudioSessionConfiguration *configuration =[[RTCAudioSessionConfiguration alloc] init];configuration.category = AVAudioSessionCategoryPlayAndRecord;configuration.categoryOptions = AVAudioSessionCategoryOptionDefaultToSpeaker;configuration.mode = AVAudioSessionModeDefault;BOOL hasSucceeded = NO;NSError *error = nil;if (self.rtcAudioSession.isActive) {hasSucceeded = [self.rtcAudioSession setConfiguration:configuration error:&error];} else {hasSucceeded = [self.rtcAudioSession setConfiguration:configurationactive:YESerror:&error];}if (!hasSucceeded) {DebugLog(@"Error setting configuration: %@", error.localizedDescription);}} @catch (NSException *exception) {DebugLog(@"configureAudioSession exception:%@", exception);}[self.rtcAudioSession unlockForConfiguration];
}- (RTCAudioTrack *)createAudioTrack {/// enable google 3A algorithm.NSDictionary *mandatory = @{@"googEchoCancellation": kRTCMediaConstraintsValueTrue,@"googAutoGainControl": kRTCMediaConstraintsValueTrue,@"googNoiseSuppression": kRTCMediaConstraintsValueTrue,};RTCMediaConstraints *audioConstrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatory optionalConstraints:self.optionalConstraints];RTCAudioSource *audioSource = [self.factory audioSourceWithConstraints:audioConstrains];RTCAudioTrack *audioTrack = [self.factory audioTrackWithSource:audioSource trackId:@"audio0"];return audioTrack;
}- (RTCVideoTrack *)createVideoTrack {RTCVideoSource *videoSource = [self.factory videoSource];self.localVideoSource = videoSource;// 如果是模拟器if (TARGET_IPHONE_SIMULATOR) {if (@available(iOS 10, *)) {self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];} else {// Fallback on earlier versions}} else{self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];//        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDelegate:self];}RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];return videoTrack;
}- (void)addIceCandidate:(RTCIceCandidate *)candidate {[self.peerConnection addIceCandidate:candidate];
}- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {if (self.isPublish) {self.mediaConstrains = self.publishMediaConstrains;} else {self.mediaConstrains = self.playMediaConstrains;}RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];DebugLog(@"peerConnection:%@",self.peerConnection);__weak typeof(self) weakSelf = self;[weakSelf.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {if (error) {DebugLog(@"offer offerForConstraints error:%@", error);}if (sdp) {[weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {if (error) {DebugLog(@"offer setLocalDescription error:%@", error);}if (completion) {completion(sdp);}}];}}];
}- (void)answer:(void (^)(RTCSessionDescription *sdp))completion {RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];__weak typeof(self) weakSelf = self;[weakSelf.peerConnection answerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {if (error) {DebugLog(@"answer answerForConstraints error:%@", error);}if (sdp) {[weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {if (error) {DebugLog(@"answer setLocalDescription error:%@", error);}if (completion) {completion(sdp);}}];}}];
}- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion {[self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate {[self.peerConnection addIceCandidate:remoteCandidate];
}- (void)setMaxBitrate:(int)maxBitrate {if (self.peerConnection.senders.count > 0) {RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;RTCRtpParameters *parametersToModify = videoSender.parameters;for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {encoding.maxBitrateBps = @(maxBitrate);}[videoSender setParameters:parametersToModify];}
}- (void)setDegradationPreference:(RTCDegradationPreference)degradationPreference {// RTCDegradationPreferenceMaintainResolutionNSMutableArray *videoSenders = [NSMutableArray arrayWithCapacity:0];for (RTCRtpSender *sender in self.peerConnection.senders) {if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {// [videoSenders addObject:sender];RTCRtpParameters *parameters = sender.parameters;parameters.degradationPreference = [NSNumber numberWithInteger:degradationPreference];[sender setParameters:parameters];[videoSenders addObject:sender];}}for (RTCRtpSender *sender in videoSenders) {RTCRtpParameters *parameters = sender.parameters;parameters.degradationPreference = [NSNumber numberWithInteger:degradationPreference];[sender setParameters:parameters];}
}- (void)setMinBitrate:(int)minBitrate {if (self.peerConnection.senders.count > 0) {RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;RTCRtpParameters *parametersToModify = videoSender.parameters;for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {encoding.minBitrateBps = @(minBitrate);}[videoSender setParameters:parametersToModify];}
}

- (void)setMaxFramerate:(int)maxFramerate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        // Older WebRTC builds do not expose maxFramerate on RTCRtpEncodingParameters; update to a recent version if this does not compile.
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.maxFramerate = @(maxFramerate);
        }
        [videoSender setParameters:parametersToModify];
    }
}#pragma mark - Private
- (AVCaptureDevice *)findDeviceForPosition:(AVCaptureDevicePosition)position {NSArray<AVCaptureDevice *> *captureDevices =[SDCustomRTCCameraCapturer captureDevices];for (AVCaptureDevice *device in captureDevices) {if (device.position == position) {return device;}}return captureDevices[0];
}- (NSArray<NSString *> *)availableVideoResolutions {NSMutableSet<NSArray<NSNumber *> *> *resolutions =[[NSMutableSet<NSArray<NSNumber *> *> alloc] init];for (AVCaptureDevice *device in [SDCustomRTCCameraCapturer captureDevices]) {for (AVCaptureDeviceFormat *format in[SDCustomRTCCameraCapturer supportedFormatsForDevice:device]) {CMVideoDimensions resolution =CMVideoFormatDescriptionGetDimensions(format.formatDescription);NSArray<NSNumber *> *resolutionObject = @[@(resolution.width), @(resolution.height) ];[resolutions addObject:resolutionObject];}}NSArray<NSArray<NSNumber *> *> *sortedResolutions =[[resolutions allObjects] sortedArrayUsingComparator:^NSComparisonResult(NSArray<NSNumber *> *obj1, NSArray<NSNumber *> *obj2) {NSComparisonResult cmp = [obj1.firstObject compare:obj2.firstObject];if (cmp != NSOrderedSame) {return cmp;}return [obj1.lastObject compare:obj2.lastObject];}];NSMutableArray<NSString *> *resolutionStrings = [[NSMutableArray<NSString *> alloc] init];for (NSArray<NSNumber *> *resolution in sortedResolutions) {NSString *resolutionString =[NSString stringWithFormat:@"%@x%@", resolution.firstObject, resolution.lastObject];[resolutionStrings addObject:resolutionString];}return [resolutionStrings copy];
}- (int)videoResolutionComponentAtIndex:(int)index inString:(NSString *)resolution {if (index != 0 && index != 1) {return 0;}NSArray<NSString *> *components = [resolution componentsSeparatedByString:@"x"];if (components.count != 2) {return 0;}return components[index].intValue;
}- (AVCaptureDeviceFormat *)selectFormatForDevice:(AVCaptureDevice *)device {RTCVideoCapturer *capturer = self.videoCapturer;if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;NSArray *availableVideoResolutions = [self availableVideoResolutions];NSString *selectedIndexStr = [[NSUserDefaults standardUserDefaults] objectForKey:kSelectedResolution];int selectedIndex = 0;if (selectedIndexStr && selectedIndexStr.length > 0) {selectedIndex = [selectedIndexStr intValue];}NSString *videoResolution = [availableVideoResolutions objectAtIndex:selectedIndex];DebugLog(@"availableVideoResolutions:%@, videoResolution:%@", availableVideoResolutions, videoResolution);NSArray<AVCaptureDeviceFormat *> *formats =[SDCustomRTCCameraCapturer supportedFormatsForDevice:device];int targetWidth = [self videoResolutionComponentAtIndex:0 inString:videoResolution];int targetHeight = [self videoResolutionComponentAtIndex:1 inString:videoResolution];AVCaptureDeviceFormat *selectedFormat = nil;int currentDiff = INT_MAX;for (AVCaptureDeviceFormat *format in formats) {CMVideoDimensions dimension = CMVideoFormatDescriptionGetDimensions(format.formatDescription);FourCharCode pixelFormat = CMFormatDescriptionGetMediaSubType(format.formatDescription);int diff = abs(targetWidth - dimension.width) + abs(targetHeight - dimension.height);if (diff < currentDiff) {selectedFormat = format;currentDiff = diff;} else if (diff == currentDiff && pixelFormat == [cameraVideoCapturer preferredOutputPixelFormat]) {selectedFormat = format;}}return selectedFormat;}return nil;
}

- (NSInteger)selectFpsForFormat:(AVCaptureDeviceFormat *)format {
    Float64 maxSupportedFramerate = 0;
    for (AVFrameRateRange *fpsRange in format.videoSupportedFrameRateRanges) {
        maxSupportedFramerate = fmax(maxSupportedFramerate, fpsRange.maxFrameRate);
    }
//    return fmin(maxSupportedFramerate, kFramerateLimit);
    return 20;
}

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    if (!renderer) {
        return;
    }
    if (!self.videoCapturer) {
        return;
    }
    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack];
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;
        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        if (!formatNilable) {
            return;
        }
        DebugLog(@"formatNilable:%@", formatNilable);
        NSInteger fps = [self selectFpsForFormat:formatNilable];
        CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoVideoDimensions.width;
        float height = videoVideoDimensions.height;
        DebugLog(@"videoVideoDimensions width:%f,height:%f", width, height);
        [cameraVideoCapturer startRunning];
//        [cameraVideoCapturer startCaptureWithDevice:camera format:formatNilable fps:fps completionHandler:^(NSError *error) {
//            DebugLog(@"startCaptureWithDevice error:%@", error);
//        }];
        [self changeResolution:width height:height fps:(int)fps];
    }
    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }
    [self.localVideoTrack addRenderer:renderer];
}- (void)renderRemoteVideo:(id<RTCVideoRenderer>)renderer {if (!self.isPublish) {return;}self.remoteRenderView = renderer;
}- (RTCDataChannel *)createDataChannel {RTCDataChannelConfiguration *config = [[RTCDataChannelConfiguration alloc] init];RTCDataChannel *dataChannel = [self.peerConnection dataChannelForLabel:@"WebRTCData" configuration:config];if (!dataChannel) {return nil;}dataChannel.delegate = self;self.localDataChannel = dataChannel;return dataChannel;
}- (void)sendData:(NSData *)data {RTCDataBuffer *buffer = [[RTCDataBuffer alloc] initWithData:data isBinary:YES];[self.remoteDataChannel sendData:buffer];
}#pragma mark - switchCamera
- (void)switchCamera:(id<RTCVideoRenderer>)renderer {self.usingFrontCamera = !self.usingFrontCamera;[self startCaptureLocalVideo:renderer];
}#pragma mark - Change resolution
- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps {if (!self.localVideoSource) {RTCVideoSource *videoSource = [self.factory videoSource];[videoSource adaptOutputFormatToWidth:width height:height fps:fps];} else {[self.localVideoSource adaptOutputFormatToWidth:width height:height fps:fps];}return YES;
}#pragma mark - Hide or show video
- (void)hidenVideo {[self setVideoEnabled:NO];self.localVideoTrack.isEnabled = NO;
}- (void)showVideo {[self setVideoEnabled:YES];self.localVideoTrack.isEnabled = YES;
}- (void)setVideoEnabled:(BOOL)isEnabled {[self setTrackEnabled:[RTCVideoTrack class] isEnabled:isEnabled];
}- (void)setTrackEnabled:(Class)track isEnabled:(BOOL)isEnabled {for (RTCRtpTransceiver *transceiver in self.peerConnection.transceivers) {if (transceiver && [transceiver isKindOfClass:track]) {transceiver.sender.track.isEnabled = isEnabled;}}
}#pragma mark - Mute or unmute audio
- (void)muteAudio {[self setAudioEnabled:NO];self.localAudioTrack.isEnabled = NO;
}- (void)unmuteAudio {[self setAudioEnabled:YES];self.localAudioTrack.isEnabled = YES;
}- (void)speakOff {__weak typeof(self) weakSelf = self;dispatch_async(self.audioQueue, ^{[weakSelf.rtcAudioSession lockForConfiguration];@try {NSError *error;[self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];NSError *ooapError;[self.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&ooapError];DebugLog(@"speakOff error:%@, ooapError:%@", error, ooapError);} @catch (NSException *exception) {DebugLog(@"speakOff exception:%@", exception);}[weakSelf.rtcAudioSession unlockForConfiguration];});
}- (void)speakOn {__weak typeof(self) weakSelf = self;dispatch_async(self.audioQueue, ^{[weakSelf.rtcAudioSession lockForConfiguration];@try {NSError *error;[self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];NSError *ooapError;[self.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&ooapError];NSError *activeError;[self.rtcAudioSession setActive:YES error:&activeError];DebugLog(@"speakOn error:%@, ooapError:%@, activeError:%@", error, ooapError, activeError);} @catch (NSException *exception) {DebugLog(@"speakOn exception:%@", exception);}[weakSelf.rtcAudioSession unlockForConfiguration];});
}- (void)setAudioEnabled:(BOOL)isEnabled {[self setTrackEnabled:[RTCAudioTrack class] isEnabled:isEnabled];
}- (void)close {RTCVideoCapturer *capturer = self.videoCapturer;if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;[cameraVideoCapturer stopRunning];}[self.peerConnection close];self.peerConnection = nil;self.videoCapturer = nil;
}#pragma mark - RTCPeerConnectionDelegate
/** Called when the SignalingState changed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
didChangeSignalingState:(RTCSignalingState)stateChanged {DebugLog(@"peerConnection didChangeSignalingState:%ld", (long)stateChanged);
}/** Called when media is received on a new stream from remote peer. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddStream:(RTCMediaStream *)stream {DebugLog(@"peerConnection didAddStream");if (self.isPublish) {return;}NSArray *videoTracks = stream.videoTracks;if (videoTracks && videoTracks.count > 0) {RTCVideoTrack *track = videoTracks.firstObject;self.remoteVideoTrack = track;}if (self.remoteVideoTrack && self.remoteRenderView) {id<RTCVideoRenderer> remoteRenderView = self.remoteRenderView;RTCVideoTrack *remoteVideoTrack = self.remoteVideoTrack;[remoteVideoTrack addRenderer:remoteRenderView];}/**if let audioTrack = stream.audioTracks.first{print("audio track faund")audioTrack.source.volume = 8}*/
}/** Called when a remote peer closes a stream.*  This is not called when RTCSdpSemanticsUnifiedPlan is specified.*/
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveStream:(RTCMediaStream *)stream {DebugLog(@"peerConnection didRemoveStream");
}/** Called when negotiation is needed, for example ICE has restarted. */
- (void)peerConnectionShouldNegotiate:(RTCPeerConnection *)peerConnection {DebugLog(@"peerConnection peerConnectionShouldNegotiate");
}/** Called any time the IceConnectionState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidChangeIceConnectionState:(RTCIceConnectionState)newState {DebugLog(@"peerConnection didChangeIceConnectionState:%ld", newState);if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didChangeConnectionState:)]) {[self.delegate webRTCClient:self didChangeConnectionState:newState];}if (RTCIceConnectionStateConnected == newState || RTCIceConnectionStateChecking == newState) {[self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];}
}/** Called any time the IceGatheringState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidChangeIceGatheringState:(RTCIceGatheringState)newState {DebugLog(@"peerConnection didChangeIceGatheringState:%ld", newState);
}/** New ice candidate has been found. */
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidGenerateIceCandidate:(RTCIceCandidate *)candidate {DebugLog(@"peerConnection didGenerateIceCandidate:%@", candidate);if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didDiscoverLocalCandidate:)]) {[self.delegate webRTCClient:self didDiscoverLocalCandidate:candidate];}
}/** Called when a group of local Ice candidates have been removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidRemoveIceCandidates:(NSArray<RTCIceCandidate *> *)candidates {DebugLog(@"peerConnection didRemoveIceCandidates:%@", candidates);
}/** New data channel has been opened. */
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidOpenDataChannel:(RTCDataChannel *)dataChannel {DebugLog(@"peerConnection didOpenDataChannel:%@", dataChannel);self.remoteDataChannel = dataChannel;
}/** Called when signaling indicates a transceiver will be receiving media from*  the remote endpoint.*  This is only called with RTCSdpSemanticsUnifiedPlan specified.*/
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidStartReceivingOnTransceiver:(RTCRtpTransceiver *)transceiver {DebugLog(@"peerConnection didStartReceivingOnTransceiver:%@", transceiver);
}/** Called when a receiver and its track are created. */
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidAddReceiver:(RTCRtpReceiver *)rtpReceiverstreams:(NSArray<RTCMediaStream *> *)mediaStreams {DebugLog(@"peerConnection didAddReceiver");
}/** Called when the receiver and its track are removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnectiondidRemoveReceiver:(RTCRtpReceiver *)rtpReceiver {DebugLog(@"peerConnection didRemoveReceiver");
}#pragma mark - RTCDataChannelDelegate
/** The data channel state changed. */
- (void)dataChannelDidChangeState:(RTCDataChannel *)dataChannel {DebugLog(@"dataChannelDidChangeState:%@", dataChannel);
}/** The data channel successfully received a data buffer. */
- (void)dataChannel:(RTCDataChannel *)dataChannel
didReceiveMessageWithBuffer:(RTCDataBuffer *)buffer {if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didReceiveData:)]) {[self.delegate webRTCClient:self didReceiveData:buffer.data];}
}

#pragma mark - RTCVideoCapturerDelegate

- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame {
//    DebugLog(@"capturer:%@ didCaptureVideoFrame:%@", capturer, frame);
//    RTCVideoFrame *aFilterVideoFrame;
//    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didCaptureVideoFrame:)]) {
//        aFilterVideoFrame = [self.delegate webRTCClient:self didCaptureVideoFrame:frame];
//    }
//    // Pixel buffers created manually at the C level must be released by hand, otherwise memory grows rapidly:
//    // CVPixelBufferRelease(_buffer)
//    // The pixel buffer of the captured frame can be read via:
//    // ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer
//    if (!aFilterVideoFrame) {
//        aFilterVideoFrame = frame;
//    }
//
//    [self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];
}

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!pixelBuffer) {
        return;
    }
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    // The rotation value here assumes portrait capture; adjust it to match the device orientation if needed.
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_90
                                                             timeStampNs:timeStampNs];
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}#pragma mark - Lazy
- (RTCPeerConnectionFactory *)factory {if (!_factory) {RTCInitializeSSL();RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];_factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];}return _factory;
}- (dispatch_queue_t)audioQueue {if (!_audioQueue) {_audioQueue = dispatch_queue_create("cn.dface.webrtc", NULL);}return _audioQueue;
}- (RTCAudioSession *)rtcAudioSession {if (!_rtcAudioSession) {_rtcAudioSession = [RTCAudioSession sharedInstance];}return _rtcAudioSession;
}- (NSDictionary *)mediaConstrains {if (!_mediaConstrains) {_mediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,kRTCMediaConstraintsValueTrue, @"IceRestart",nil];}return _mediaConstrains;
}- (NSDictionary *)publishMediaConstrains {if (!_publishMediaConstrains) {_publishMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,kRTCMediaConstraintsValueTrue, @"IceRestart",kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                                   @"2160", kRTCMediaConstraintsMinWidth,
//                                   @"3840", kRTCMediaConstraintsMinHeight,
//                                   @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                                   @"1", kRTCMediaConstraintsMaxAspectRatio,nil];}return _publishMediaConstrains;
}- (NSDictionary *)playMediaConstrains {if (!_playMediaConstrains) {_playMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveAudio,kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveVideo,kRTCMediaConstraintsValueTrue, @"IceRestart",kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                                @"2160", kRTCMediaConstraintsMinWidth,
//                                @"3840", kRTCMediaConstraintsMinHeight,
//                                @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                                @"1", kRTCMediaConstraintsMaxAspectRatio,nil];}return _playMediaConstrains;
}- (NSDictionary *)optionalConstraints {if (!_optionalConstraints) {_optionalConstraints = [[NSDictionary alloc] initWithObjectsAndKeys:kRTCMediaConstraintsValueTrue, @"DtlsSrtpKeyAgreement",kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",nil];}return _optionalConstraints;
}- (void)dealloc {[self.peerConnection close];self.peerConnection = nil;
}@end

This completes the custom RTCVideoCapturer camera capture for WebRTC video.
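The startCaptureLocalVideo method above calls setISOValue: on SDCustomRTCCameraCapturer, but the implementation of that method is not shown in this post. The following is only a minimal sketch of what it could look like, assuming the capturer keeps the active camera in a hypothetical currentDevice property; the clamping logic and the use of a custom exposure mode are assumptions, not the original implementation.

- (void)setISOValue:(float)value {
    AVCaptureDevice *device = self.currentDevice; // hypothetical property holding the active AVCaptureDevice
    if (!device) {
        return;
    }
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        // Clamp the requested ISO into the range supported by the active format.
        float minISO = device.activeFormat.minISO;
        float maxISO = device.activeFormat.maxISO;
        float iso = MAX(minISO, MIN(maxISO, value));
        [device setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent
                                              ISO:iso
                                completionHandler:nil];
        [device unlockForConfiguration];
    } else {
        NSLog(@"lockForConfiguration failed: %@", error);
    }
}

Because the custom capturer owns its AVCaptureSession and AVCaptureDevice, other per-device adjustments (torch, white balance, exposure duration) can be exposed in the same way.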

Other
For how the ossrs service was set up, see: https://blog.csdn.net/gloryFlow/article/details/132257196
For the iOS implementation of audio/video calls against ossrs, see: https://blog.csdn.net/gloryFlow/article/details/132262724
For the issue of high-resolution video not rendering in WebRTC calls, see: https://blog.csdn.net/gloryFlow/article/details/132240952
For modifying the bitrate in the SDP, see: https://blog.csdn.net/gloryFlow/article/details/132263021
For GPUImage beauty filters in video calls, see: https://blog.csdn.net/gloryFlow/article/details/132265842
For streaming a local video or an album video over RTC, see: https://blog.csdn.net/gloryFlow/article/details/132267068

IV. Summary

WebRTC audio and video calls: a custom RTCVideoCapturer camera for WebRTC video. The core of the approach is to obtain the CVPixelBufferRef captured by the camera, turn the processed CVPixelBufferRef into an RTCVideoFrame, and hand it to WebRTC by calling the didCaptureVideoFrame method on localVideoSource. This covers a lot of ground and some descriptions may be imprecise; please bear with any inaccuracies.
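The "processed CVPixelBufferRef" step above is where any per-frame processing would plug in. As a minimal sketch (not part of the original code), here is a variant of the sample-buffer delegate that runs each frame through a Core Image filter before wrapping it; the ciContext property, the CISepiaTone filter, the assumption that the camera delivers a writable BGRA buffer, and the fixed RTCVideoRotation_90 value are all illustrative assumptions.

// Requires Core Image: #import <CoreImage/CoreImage.h>
- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!pixelBuffer) {
        return;
    }
    // Example processing step: apply a sepia filter and render the result back into the same buffer.
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:@(0.5) forKey:kCIInputIntensityKey];
    CIImage *outputImage = filter.outputImage;
    [self.ciContext render:outputImage toCVPixelBuffer:pixelBuffer]; // self.ciContext: a reusable CIContext property (assumed)

    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    RTCVideoFrame *frame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                        rotation:RTCVideoRotation_90
                                                     timeStampNs:timeStampNs];
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];
}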

https://blog.csdn.net/gloryFlow/article/details/132308673

A learning log; a little progress every day.
