Comments (22)
@onix-dolzhenko thanks, your code helped quite a bit. I changed a few things: starting/stopping the capture session when recording starts/stops, including the asset URL in the completion block, and making sure the audio samples were synced with the video capture (I was getting good audio in the video, but a black screen instead of the actual video imagery). I don't have time to put together a PR, but I've attached my currently working copy of the source in case you want to update your fork; maybe @alskipp can then test and merge it in as a new feature.
from asscreenrecorder.
Hi - if you send me your email, I can mail you the zipped project (screen recording with sound) so you can post it on GitHub. It would be helpful for many people.
@amitailanciano! I was having so many declaration issues, maybe due to version mismatches!
I am still getting `AVAssetWriter invalid parameter not satisfying CMTIME_IS_NUMERIC`,
and I can't see an audio permission entry added to the .plist.
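On the missing .plist permission: since iOS 10, any app that accesses the microphone must declare a usage description in its Info.plist, otherwise the system terminates the app when the capture session requests the microphone. The entry looks roughly like this (the description string is just an example):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app records audio from the microphone while capturing the screen.</string>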
This won't be the answer you were hoping to hear, but, audio capture will actually be a fairly complicated procedure.
The way screen capture works is to take screenshots in as close to real time as possible whilst appending them to an AVAssetWriter. To add audio to the video the audio will need to be captured separately to a different file. The 2 files will then need to be composed together using AVFoundation to create the final video with audio.
I haven't needed to capture any audio in my projects so I'm not very familiar with the APIs. It might be possible to capture in-app audio using AVFoundation (there's a chance only external audio can be captured with AVFoundation, in which case Core Audio will need to be used).
There's a project which may simplify the audio capture here: https://github.com/TheAmazingAudioEngine/TheAmazingAudioEngine. I've not used this project but it seems to be well documented: http://theamazingaudioengine.com
If it was a quick task to implement audio recording I would happily add it to the project. Unfortunately it's not something I have time to do currently. Good luck with this and do let me know if you have success.
All the best,
Al
Hi @alskipp - I have successfully recorded the sound separately and merged both files into a single video. Thank you for this awesome library.
Now only stuck with this issue - #5
That's great news that you've managed to add audio to the recording. Did you use the audio engine framework, or something else?
Hi, I used a standard audio recorder to record the sound, then merged the video and audio files using AVMutableCompositionTrack.
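For anyone looking for that merge step, here is a minimal sketch of composing a video file and an audio file with AVMutableCompositionTrack. This is my own illustration, not the commenter's actual code; the method name and URL parameters are placeholders.

```objectivec
#import <AVFoundation/AVFoundation.h>

// Merge a video-only file and an audio-only file into a single movie.
- (void)mergeVideoAtURL:(NSURL *)videoURL
             audioAtURL:(NSURL *)audioURL
                  toURL:(NSURL *)outputURL
             completion:(void (^)(NSError *error))completion
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];

    // One mutable track per media type.
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    // Insert both source tracks at time zero, clipped to the video's duration.
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    NSError *error = nil;
    [videoTrack insertTimeRange:range
                        ofTrack:[videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject
                         atTime:kCMTimeZero
                          error:&error];
    [audioTrack insertTimeRange:range
                        ofTrack:[audioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject
                         atTime:kCMTimeZero
                          error:&error];

    // Export the combined composition to disk.
    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (completion) completion(exporter.error);
    }];
}
```

Error handling is kept minimal here; in practice you would check `error` after each `insertTimeRange:` call and inspect `exporter.status` in the completion handler.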
I also have implemented audio by using AVMutableCompositionTrack. I can share the code if you guys are interested in it.
Would be of great help if you guys can share the code. Thanks in advance
Looking for the shared code with audio support
Any chance on that code for adding the sound?
@berio any chance on that sound code?
I'm on the hunt for sound as well, preferably sound from the app rather than the mic. It seems some people have found solutions; would anyone be willing to share them?
Hello guys!
I'm currently working on getting sound from mic/app.
Here my fork: https://github.com/onix-dolzhenko/ASScreenRecorder/tree/development
Thanks for your work @amitailanciano . I reviewed your project and used it in my project. It's wonderful!
Hi @nihtin9mk, Great job! Any chance you can send me the github project that includes the audio recording from the device as well?
Hi @amitailanciano ,
Thanks for sharing your modified file - It works perfectly.
Hi @amitailanciano, I replaced your file, but the video is not being stored.
How can I store the video?
- (void)recorderGesture:(UIGestureRecognizer *)recognizer
{
ASScreenRecorder *recorder = [ASScreenRecorder sharedInstance];
if (recorder.isRecording) {
[recorder stopRecordingWithCompletion:(VideoCompletionBlock)^{ // is it correct to call the method this way?
NSLog(@"Finished recording");
[self playEndSound];
}];
} else {
[recorder startRecording];
NSLog(@"Start recording");
[self playStartSound];
}
}
@amitailanciano Tried your code and got a crash -
'NSInvalidArgumentException', reason: '*** -[AVAssetWriter startSessionAtSourceTime:] invalid parameter not satisfying: ((Boolean)(((startTime).flags & (kCMTimeFlags_Valid | kCMTimeFlags_ImpliedValueFlagsMask)) == kCMTimeFlags_Valid))
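That exception means `startSessionAtSourceTime:` was handed a `CMTime` that isn't numerically valid (for instance, before the first audio sample buffer has arrived). A defensive sketch, not from the project itself, is to validate the timestamp before starting the writer session:

```objectivec
// Only start the writer session once we have a numeric presentation timestamp.
CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (CMTIME_IS_NUMERIC(startTime)) {
    [_videoWriter startSessionAtSourceTime:startTime];
} else {
    NSLog(@"Skipping session start: non-numeric presentation timestamp");
}
```

This matches the `CMTIME_IS_NUMERIC` failure mentioned earlier in the thread; the underlying fix is to make sure the writer session is started only from the audio capture callback, after a valid sample buffer has been received.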
Great work @amitailanciano! Thanks for sharing your code.
Thx, that's really helpful ! @amitailanciano
Please check the code below. It resolves the app crash, but it captures outside (microphone) audio rather than in-app audio.
//
// ASScreenRecorder.m
// ScreenRecorder
//
// Created by Alan Skipp on 23/04/2014.
// Copyright (c) 2014 Alan Skipp. All rights reserved.
//
#import "ASScreenRecorder.h"
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>
#import <AssetsLibrary/AssetsLibrary.h>
@interface ASScreenRecorder () <AVCaptureAudioDataOutputSampleBufferDelegate> {
BOOL isStartAudio;
}
@property (strong, nonatomic) AVAssetWriter *videoWriter;
@property (strong, nonatomic) AVAssetWriterInput *videoWriterInput;
@property (strong, nonatomic) AVAssetWriterInputPixelBufferAdaptor *avAdaptor;
@property (strong, nonatomic) CADisplayLink *displayLink;
@property (strong, nonatomic) NSDictionary *outputBufferPoolAuxAttributes;
@property (nonatomic) AVCaptureDeviceInput *audioCaptureInput;
@property (nonatomic) AVAssetWriterInput *audioInput;
@property (nonatomic) AVCaptureAudioDataOutput *audioCaptureOutput;
@property (nonatomic) AVCaptureSession *captureSession;
@property (nonatomic) NSDictionary *audioSettings;
@property (nonatomic) CMTime firstAudioTimeStamp;
@property (nonatomic) NSDate *startedAt;
@property (nonatomic) CFTimeInterval firstTimeStamp;
@property (nonatomic) BOOL isRecording;
@end
@implementation ASScreenRecorder
{
dispatch_queue_t _audio_capture_queue;
dispatch_queue_t _render_queue;
dispatch_queue_t _append_pixelBuffer_queue;
dispatch_semaphore_t _frameRenderingSemaphore;
dispatch_semaphore_t _pixelAppendSemaphore;
CGSize _viewSize;
CGFloat _scale;
CGColorSpaceRef _rgbColorSpace;
CVPixelBufferPoolRef _outputBufferPool;
}
#pragma mark - initializers
+ (instancetype)sharedInstance {
static dispatch_once_t once;
static ASScreenRecorder *sharedInstance;
dispatch_once(&once, ^{
sharedInstance = [[self alloc] init];
});
return sharedInstance;
}
- (instancetype)init
{
self = [super init];
if (self) {
_viewSize = [UIApplication sharedApplication].delegate.window.bounds.size;
_scale = [UIScreen mainScreen].scale;
// record half size resolution for retina iPads
if ((UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) && _scale > 1) {
_scale = 1.0;
}
_isRecording = NO;
_append_pixelBuffer_queue = dispatch_queue_create("ASScreenRecorder.append_queue", DISPATCH_QUEUE_SERIAL);
_render_queue = dispatch_queue_create("ASScreenRecorder.render_queue", DISPATCH_QUEUE_SERIAL);
dispatch_set_target_queue(_render_queue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
_frameRenderingSemaphore = dispatch_semaphore_create(1);
_pixelAppendSemaphore = dispatch_semaphore_create(1);
[self setUpAudioCapture];
}
return self;
}
#pragma mark - public
- (void)setViewToCapture:(UIView *)viewToCapture {
_viewSize = viewToCapture.bounds.size;
_viewToCapture = viewToCapture;
}
- (void)setVideoURL:(NSURL *)videoURL
{
NSAssert(!_isRecording, @"videoURL can not be changed whilst recording is in progress");
_videoURL = videoURL;
}
- (BOOL)startRecording
{
if (!_isRecording) {
[_captureSession startRunning];
}
return _isRecording;
}
- (void)stopRecordingWithCompletion:(VideoCompletionBlock)completionBlock
{
if (_isRecording) {
[_captureSession stopRunning];
_isRecording = NO;
[_displayLink removeFromRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
[self completeRecordingSession:completionBlock];
}
}
#pragma mark - private
-(void)setUpWriter
{
_rgbColorSpace = CGColorSpaceCreateDeviceRGB();
NSDictionary *bufferAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
(id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES,
(id)kCVPixelBufferWidthKey : @(_viewSize.width * _scale),
(id)kCVPixelBufferHeightKey : @(_viewSize.height * _scale),
(id)kCVPixelBufferBytesPerRowAlignmentKey : @(_viewSize.width * _scale * 4)
};
_outputBufferPool = NULL;
CVPixelBufferPoolCreate(NULL, NULL, (__bridge CFDictionaryRef)(bufferAttributes), &_outputBufferPool);
NSError* error = nil;
_videoWriter = [[AVAssetWriter alloc] initWithURL:self.videoURL ?: [self tempFileURL] fileType:AVFileTypeQuickTimeMovie error:&error];
NSParameterAssert(_videoWriter);
NSInteger pixelNumber = _viewSize.width * _viewSize.height * _scale;
NSDictionary* videoCompression = @{AVVideoAverageBitRateKey: @(pixelNumber * 11.4)};
NSDictionary* videoSettings = @{AVVideoCodecKey: AVVideoCodecH264,
AVVideoWidthKey: [NSNumber numberWithInt:_viewSize.width*_scale],
AVVideoHeightKey: [NSNumber numberWithInt:_viewSize.height*_scale],
AVVideoCompressionPropertiesKey: videoCompression};
_videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
NSParameterAssert(_videoWriterInput);
_videoWriterInput.expectsMediaDataInRealTime = YES;
_videoWriterInput.transform = [self videoTransformForDeviceOrientation];
_avAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_videoWriterInput sourcePixelBufferAttributes:nil];
[_videoWriter addInput:_videoWriterInput];
_audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:_audioSettings];
_audioInput.expectsMediaDataInRealTime = YES;
NSParameterAssert([_videoWriter canAddInput:_audioInput]);
[_videoWriter addInput:_audioInput];
[_videoWriter startWriting];
[_videoWriter startSessionAtSourceTime:_firstAudioTimeStamp];
}
- (CGAffineTransform)videoTransformForDeviceOrientation
{
CGAffineTransform videoTransform;
switch ([UIDevice currentDevice].orientation) {
case UIDeviceOrientationLandscapeLeft:
videoTransform = CGAffineTransformMakeRotation(-M_PI_2);
break;
case UIDeviceOrientationLandscapeRight:
videoTransform = CGAffineTransformMakeRotation(M_PI_2);
break;
case UIDeviceOrientationPortraitUpsideDown:
videoTransform = CGAffineTransformMakeRotation(M_PI);
break;
default:
videoTransform = CGAffineTransformIdentity;
}
return videoTransform;
}
- (NSURL *)tempFileURL
{
NSString *outputPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/screencapture.mp4"];
[self removeTempFilePath:outputPath];
return [NSURL fileURLWithPath:outputPath];
}
- (void)removeTempFilePath:(NSString *)filePath
{
NSFileManager* fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:filePath]) {
NSError* error;
if ([fileManager removeItemAtPath:filePath error:&error] == NO) {
NSLog(@"Could not delete old recording:%@", [error localizedDescription]);
}
}
}
- (void)completeRecordingSession:(VideoCompletionBlock)completionBlock
{
dispatch_async(_render_queue, ^{
dispatch_sync(self->_append_pixelBuffer_queue, ^{
dispatch_sync(self->_audio_capture_queue, ^{
[self->_audioInput markAsFinished];
[self->_videoWriterInput markAsFinished];
[self->_videoWriter finishWritingWithCompletionHandler:^{
void (^completion)(NSURL *url) = ^(NSURL *url) {
[self cleanup];
dispatch_async(dispatch_get_main_queue(), ^{
if (completionBlock) completionBlock(url);
});
};
if (self.videoURL) {
completion(self.videoURL);
} else {
completion(self->_videoWriter.outputURL);
}
}];
});
});
});
}
- (void)cleanup
{
self.avAdaptor = nil;
self.videoWriterInput = nil;
self.videoWriter = nil;
self.firstTimeStamp = 0;
self.startedAt = nil;
self.firstAudioTimeStamp = kCMTimeZero;
self.outputBufferPoolAuxAttributes = nil;
CGColorSpaceRelease(_rgbColorSpace);
CVPixelBufferPoolRelease(_outputBufferPool);
}
- (void)writeVideoFrame
{
// throttle the number of frames to prevent meltdown
// technique gleaned from Brad Larson's answer here: http://stackoverflow.com/a/5956119
if (dispatch_semaphore_wait(_frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0) {
return;
}
dispatch_async(_render_queue, ^{
if (![self->_videoWriterInput isReadyForMoreMediaData]) return;
if (!self.firstTimeStamp) {
self.firstTimeStamp = self->_displayLink.timestamp;
}
CFTimeInterval elapsed = (self->_displayLink.timestamp - self.firstTimeStamp);
CMTime time = CMTimeAdd(self->_firstAudioTimeStamp, CMTimeMakeWithSeconds(elapsed, 1000));
CVPixelBufferRef pixelBuffer = NULL;
CGContextRef bitmapContext = [self createPixelBufferAndBitmapContext:&pixelBuffer];
if (self.delegate) {
[self.delegate writeBackgroundFrameInContext:&bitmapContext];
}
// draw each window into the context (other windows include UIKeyboard, UIAlert)
// FIX: UIKeyboard is currently only rendered correctly in portrait orientation
dispatch_sync(dispatch_get_main_queue(), ^{
UIGraphicsPushContext(bitmapContext); {
if (self->_viewToCapture) {
[self->_viewToCapture drawViewHierarchyInRect:self->_viewToCapture.bounds afterScreenUpdates:NO];
} else {
for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
[window drawViewHierarchyInRect:CGRectMake(0, 0, self->_viewSize.width, self->_viewSize.height) afterScreenUpdates:NO];
}
}
} UIGraphicsPopContext();
});
// append pixelBuffer on an async dispatch_queue, the next frame is rendered whilst this one appends
// must not overwhelm the queue with pixelBuffers, therefore:
// check if _append_pixelBuffer_queue is ready
// if it's not ready, release pixelBuffer and bitmapContext
if (dispatch_semaphore_wait(self->_pixelAppendSemaphore, DISPATCH_TIME_NOW) == 0) {
dispatch_async(self->_append_pixelBuffer_queue, ^{
BOOL success = [self->_avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
if (!success) {
NSLog(@"Warning: Unable to write buffer to video");
}
CGContextRelease(bitmapContext);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
CVPixelBufferRelease(pixelBuffer);
dispatch_semaphore_signal(self->_pixelAppendSemaphore);
});
} else {
CGContextRelease(bitmapContext);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
CVPixelBufferRelease(pixelBuffer);
}
dispatch_semaphore_signal(self->_frameRenderingSemaphore);
});
}
- (CGContextRef)createPixelBufferAndBitmapContext:(CVPixelBufferRef *)pixelBuffer
{
CVPixelBufferPoolCreatePixelBuffer(NULL, _outputBufferPool, pixelBuffer);
CVPixelBufferLockBaseAddress(*pixelBuffer, 0);
CGContextRef bitmapContext = NULL;
bitmapContext = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(*pixelBuffer),
CVPixelBufferGetWidth(*pixelBuffer),
CVPixelBufferGetHeight(*pixelBuffer),
8, CVPixelBufferGetBytesPerRow(*pixelBuffer), _rgbColorSpace,
kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst
);
CGContextScaleCTM(bitmapContext, _scale, _scale);
CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, _viewSize.height);
CGContextConcatCTM(bitmapContext, flipVertical);
return bitmapContext;
}
#pragma mark - audio recording
- (void)setUpAudioCapture
{
NSError *error;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
if (device && device.connected)
NSLog(@"Connected Device: %@", device.localizedName);
else
{
NSLog(@"AVCaptureDevice Failed");
return;
}
// add device inputs
_audioCaptureInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!_audioCaptureInput)
{
NSLog(@"AVCaptureDeviceInput Failed");
return;
}
if (error)
{
NSLog(@"%@", error);
return;
}
// add output for audio
_audioCaptureOutput = [[AVCaptureAudioDataOutput alloc] init];
if (!_audioCaptureOutput)
{
NSLog(@"AVCaptureAudioDataOutput Failed");
return;
}
_audio_capture_queue = dispatch_queue_create("AudioCaptureQueue", NULL);
[_audioCaptureOutput setSampleBufferDelegate:self queue:_audio_capture_queue];
_captureSession = [[AVCaptureSession alloc] init];
if (!_captureSession)
{
NSLog(@"AVCaptureSession Failed");
return;
}
_captureSession.sessionPreset = AVCaptureSessionPresetMedium;
if ([_captureSession canAddInput:_audioCaptureInput])
[_captureSession addInput:_audioCaptureInput];
else
{
NSLog(@"Failed to add input device to capture session");
return;
}
if ([_captureSession canAddOutput:_audioCaptureOutput])
[_captureSession addOutput:_audioCaptureOutput];
else
{
NSLog(@"Failed to add output device to capture session");
return;
}
_audioSettings = [_audioCaptureOutput recommendedAudioSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
NSLog(@"Audio capture session configured");
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
if (captureOutput == _audioCaptureOutput) {
NSLog(@"capture");
if (_startedAt == nil) {
isStartAudio = true;
_startedAt = [NSDate date];
_firstAudioTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
[self setUpWriter];
self.isRecording = (self->_videoWriter.status == AVAssetWriterStatusWriting);
self->_displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(writeVideoFrame)];
[self->_displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}
if (_isRecording && [_audioInput isReadyForMoreMediaData]) {
[_audioInput appendSampleBuffer:sampleBuffer];
}
}
}
Related Issues (20)
- How to record particular view only ? HOT 1
- UIAlertview not visible HOT 2
- Urgent Help: Black video on iOS9? HOT 1
- how to add Audio ? HOT 1
- Warning: Unable to write buffer to video HOT 1
- Not working properly HOT 1
- Appstore? HOT 1
- [FPS] How change FPS of the video to 60/1? HOT 4
- In background Mode HOT 1
- Recored only Subview HOT 1
- After once record,Twice will be crash. HOT 1
- How to Capture AVCaptureVideoPreviewLayer? (camera.previewlayer) HOT 4
- Carsh on my iPhone HOT 1
- cpu very high HOT 4
- record half size resolution for retina iPads????
- unable to capture the camera preview layer HOT 8
- Unable to record youtube full screen video
- Unable to build on Swift 5
- FPS HOT 12
- Possible to include video from AVPlayer? HOT 5