
SCRecorder's Introduction

SCRecorder

A Vine/Instagram-like audio/video recorder and filter framework in Objective-C.

Here is a short list of the cool things you can do:

  • Record multiple video segments
  • Zoom/Focus easily
  • Remove any record segment that you don't want
  • Display the result into a convenient video player
  • Save the record session for later using a serializable NSDictionary (works with NSUserDefaults)
  • Add a configurable and animatable video filter using Core Image
  • Add a UIView as overlay, so you can render anything you want on top of your video
  • Merge and export the video using fine tunings that you choose

Examples for iOS are provided.
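For example, the NSDictionary serialization mentioned in the list above can be sketched like this (the dictionaryRepresentation and recordSessionWithDictionaryRepresentation: method names are assumptions; verify them against the SCRecordSession header of your version):

```objc
SCRecordSession *recordSession = ... // Some instance of a record session

// Persist the session as a serializable dictionary
[[NSUserDefaults standardUserDefaults] setObject:recordSession.dictionaryRepresentation
                                          forKey:@"SavedRecordSession"];

// Later, restore the session to resume working with it
NSDictionary *saved = [[NSUserDefaults standardUserDefaults] objectForKey:@"SavedRecordSession"];
if (saved != nil) {
    recordSession = [SCRecordSession recordSessionWithDictionaryRepresentation:saved];
}
```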

Want an easy way to create filters for this project? Check out https://github.com/rFlex/CoreImageShop

Frameworks needed:

  • CoreVideo
  • AudioToolbox
  • GLKit

Podfile

If you are using CocoaPods, you can use this project with the following Podfile:

platform :ios, '7.0'
pod 'SCRecorder'

Manual install

Drag and drop SCRecorder.xcodeproj into your project. Then, in the Build Phases of your target, add libSCRecorder.a to the "Link Binary with Libraries" section (as done in the example).

Swift

To use the project in Swift, follow either the Podfile or manual install instructions (both work with Swift too). Then, to make SCRecorder accessible from Swift, just add the following line to your bridging header:

#import <SCRecorder/SCRecorder.h>

Easy and quick

SCRecorder is the main class that connects the inputs and outputs together. It processes the audio and video buffers and appends them to an SCRecordSession.

// Create the recorder
SCRecorder *recorder = [SCRecorder recorder]; // You can also use +[SCRecorder sharedRecorder]
	
// Start running the flow of buffers
if (![recorder startRunning]) {
	NSLog(@"Something wrong there: %@", recorder.error);
}

// Create a new session and set it to the recorder
recorder.session = [SCRecordSession recordSession];

// Begin appending video/audio buffers to the session
[recorder record];

// Stop appending video/audio buffers to the session
[recorder pause];

Configuring the recorder

You can configure the input device settings (framerate of the video, whether the flash should be enabled etc...) directly on the SCRecorder.

// Set the AVCaptureSessionPreset for the underlying AVCaptureSession.
recorder.captureSessionPreset = AVCaptureSessionPresetHigh;

// Set the video device to use
recorder.device = AVCaptureDevicePositionFront;

// Set the maximum record duration
recorder.maxRecordDuration = CMTimeMake(10, 1);

// Listen to the messages SCRecorder can send
recorder.delegate = self;

You can configure the video, audio and photo output settings through their configuration instances (SCVideoConfiguration, SCAudioConfiguration, SCPhotoConfiguration), which you can access like this:

// Get the video configuration object
SCVideoConfiguration *video = recorder.videoConfiguration;

// Whether the video should be enabled or not
video.enabled = YES;
// The bitrate of the video output
video.bitrate = 2000000; // 2Mbit/s
// Size of the video output
video.size = CGSizeMake(1280, 720);
// The scaling mode to use if the output aspect ratio differs from the input's
video.scalingMode = AVVideoScalingModeResizeAspectFill;
// The timescale ratio to use. Higher than 1 creates a slow-motion effect, between 0 and 1 a timelapse effect
video.timeScale = 1;
// Whether the output video size should be inferred so that it creates a square video
video.sizeAsSquare = NO;
// The filter to apply to each output video buffer (this does not affect the presentation layer)
video.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"];

// Get the audio configuration object
SCAudioConfiguration *audio = recorder.audioConfiguration;

// Whether the audio should be enabled or not
audio.enabled = YES;
// the bitrate of the audio output
audio.bitrate = 128000; // 128kbit/s
// Number of audio output channels
audio.channelsCount = 1; // Mono output
// The sample rate of the audio output
audio.sampleRate = 0; // Use the same sample rate as the input
// The format of the audio output
audio.format = kAudioFormatMPEG4AAC; // AAC

// Get the photo configuration object
SCPhotoConfiguration *photo = recorder.photoConfiguration;
photo.enabled = NO;

Playing back your recording

SCRecorder provides two easy classes to play a video/audio asset: SCPlayer and SCVideoPlayerView.

SCPlayer is a subclass of AVPlayer that adds some methods to make it easier to use. It also adds the ability to use a filter renderer to apply a live filter on a video.

SCRecordSession *recordSession = ... // Some instance of a record session
	
// Create an instance of SCPlayer
SCPlayer *player = [SCPlayer player];
	
// Set the current playerItem using an asset representing the segments
// of an SCRecordSession
[player setItemByAsset:recordSession.assetRepresentingSegments];
	
UIView *view = ... // Some view that will get the video
	
// Create and add an AVPlayerLayer
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = view.bounds;
[view.layer addSublayer:playerLayer];

// Start playing the asset and render it into the view
[player play];
	
// Render the video directly through a filter
SCFilterImageView *filterView = [[SCFilterImageView alloc] initWithFrame:view.bounds];
filterView.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"];
	
player.SCImageView = filterView;
	
[view addSubview:filterView];

SCVideoPlayerView is a subclass of UIView that holds an SCPlayer. The video buffers are rendered directly in this view. It removes the need to handle the creation of an AVPlayerLayer and makes it really easy to play a video in your app.

SCRecordSession *recordSession = ... // Some instance of a record session
	
SCVideoPlayerView *playerView = ... // Your instance somewhere
	
// Set the current playerItem using an asset representing the segments
// of an SCRecordSession
[playerView.player setItemByAsset:recordSession.assetRepresentingSegments];
	
// Start playing the asset and render it into the view
[playerView.player play];

Editing your recording

SCRecordSession gets the video and audio buffers from the SCRecorder and appends them to an SCRecordSessionSegment. An SCRecordSessionSegment is really just a continuous file. When [SCRecorder pause] is called, the SCRecorder asks the SCRecordSession to asynchronously complete its current record segment. Once done, the segment is added to the [SCRecordSession segments] array. SCRecorder also has [SCRecorder pause:] with a completion handler, which is called once the SCRecordSession has completed the record segment and added it to the segments array.
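The completion-handler variant can be sketched as follows (a minimal example, assuming recorder is an SCRecorder instance that is currently recording):

```objc
// Pause and wait until the current segment has been completed and appended
[recorder pause:^{
    // The finished segment is now the last entry in the segments array
    SCRecordSessionSegment *segment = [recorder.session.segments lastObject];
    NSLog(@"Completed segment with duration %f", CMTimeGetSeconds(segment.duration));
}];
```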

You can add/remove segments easily in a SCRecordSession. You can also merge all the segments into one file.

SCRecordSession *recordSession = ... // An SCRecordSession instance

// Remove the last segment
[recordSession removeLastSegment];

// Add a segment at the end
[recordSession addSegment:[SCRecordSessionSegment segmentWithURL:anURL info:nil]];

// Get duration of the whole record session
CMTime duration = recordSession.duration;

// Get a playable asset representing all the record segments
AVAsset *asset = recordSession.assetRepresentingSegments;

// Get some information about a particular segment
SCRecordSessionSegment *segment = [recordSession.segments firstObject];

// Get thumbnail of this segment
UIImage *thumbnail = segment.thumbnail;

// Get duration of this segment
CMTime duration = segment.duration;

Exporting your recording

You basically have two ways of exporting an SCRecordSession.

First, you can use [SCRecordSession mergeSegmentsUsingPreset:completionHandler:]. This method takes an AVAssetExportPreset as a parameter and uses an AVAssetExportSession under the hood. Although this is the fastest and easiest way of merging the record segments, it provides no control over the output settings.

// Merge all the segments into one file using an AVAssetExportSession
[recordSession mergeSegmentsUsingPreset:AVAssetExportPresetHighestQuality completionHandler:^(NSURL *url, NSError *error) {
	if (error == nil) {
	   	// Easily save to camera roll
		[url saveToCameraRollWithCompletion:^(NSString *path, NSError *saveError) {
		     
		}];
	} else {
		NSLog(@"Bad things happened: %@", error);
	}
}];

You can also use SCAssetExportSession, which is the SCRecorder counterpart of AVAssetExportSession. It provides a lot more options, like configuring the bitrate and the output video size, adding a filter, adding a watermark... This comes at the cost of a little more configuration and more processing time. Like SCRecorder, SCAssetExportSession also holds an SCVideoConfiguration and an SCAudioConfiguration instance (ain't that amazing?).

AVAsset *asset = session.assetRepresentingSegments;
SCAssetExportSession *assetExportSession = [[SCAssetExportSession alloc] initWithAsset:asset];
assetExportSession.outputUrl = recordSession.outputUrl;
assetExportSession.outputFileType = AVFileTypeMPEG4;
assetExportSession.videoConfiguration.filter = [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"];
assetExportSession.videoConfiguration.preset = SCPresetHighestQuality;
assetExportSession.audioConfiguration.preset = SCPresetMediumQuality;
[assetExportSession exportAsynchronouslyWithCompletionHandler: ^{
	if (assetExportSession.error == nil) {
		// We have our video and/or audio file
	} else {
		// Something bad happened
	}
}];

Creating/manipulating filters

SCRecorder comes with a filter API built on top of Core Image. SCFilter is the class that wraps a CIFilter. Each filter can also have a chain of sub filters. When processing an image through a filter, first all its sub filters will process the image then the filter itself. An SCFilter can be saved directly into a file and restored from this file.

SCFilter *blackAndWhite = [SCFilter filterWithCIFilterName:@"CIColorControls"];
[blackAndWhite setParameterValue:@0 forKey:@"inputSaturation"];

SCFilter *exposure = [SCFilter filterWithCIFilterName:@"CIExposureAdjust"];
[exposure setParameterValue:@0.7 forKey:@"inputEV"];

// Manually creating a filter chain
SCFilter *filter = [SCFilter emptyFilter];
[filter addSubFilter:blackAndWhite];
[filter addSubFilter:exposure];

SCVideoConfiguration *videoConfiguration = ... // A video configuration

videoConfiguration.filter = blackAndWhite; // Will render a black and white video
videoConfiguration.filter = exposure; // Will render a video with less exposure
videoConfiguration.filter = filter; // Will render a video with both black and white and less exposure

// Saving to a file
NSError *error = nil;
[filter writeToFile:[NSURL fileURLWithPath:@"some-url.cisf"] error:&error];
if (error == nil) {

}

// Restoring the filter group
SCFilter *restoredFilter = [SCFilter filterWithContentsOfURL:[NSURL fileURLWithPath:@"some-url.cisf"]];

// Processing a UIImage through the filter
UIImage *myImage = ... // Some image
UIImage *processedImage = [restoredFilter UIImageByProcessingUIImage:myImage];

// Save it to the photo library
[processedImage saveToCameraRollWithCompletion: ^(NSError *error) {

}];

If you want to create your own filters easily, you can also check out CoreImageShop, a Mac application that generates serialized SCFilter files directly usable by the filter classes in this project.

Using the filters

SCFilter can either be used in a view to render a filtered image in real time, or in a processing object to render the filter to a file. You can use an SCFilter in the classes described in the sections below.

Animating the filters

Parameters of an SCFilter can be animated. You can, for instance, progressively blur your video. To do so, you add an animation to an SCFilter. Animations are represented as SCFilterAnimation, a model object that describes a ramp from a start value to an end value, applied at a given start time over a given duration.

Some examples:

// Fade from completely blurred to sharp at the beginning of the video
SCFilter *blurFadeFilter = [SCFilter filterWithCIFilterName:@"CIGaussianBlur"];
[blurFadeFilter addAnimationForParameterKey:kCIInputRadiusKey startValue:@100 endValue:@0 startTime:0 duration:0.5];

// Make the video instantly become black and white at 2 seconds for 1 second
SCFilter *blackAndWhite = [SCFilter filterWithCIFilterName:@"CIColorControls"];
[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@1 endValue:@1 startTime:0 duration:2];
[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@0 endValue:@0 startTime:2 duration:1];
[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@1 endValue:@1 startTime:3 duration:1];

Some details about the other provided classes

SCRecorderToolsView

Configurable view that can hold an SCRecorder instance and handles tap to focus and pinch to zoom.

SCImageView

Class that can render a CIImage through either EAGL, Metal or Core Graphics. This class is intended for live rendering of CIImages. If you want to alter the rendering when subclassing, you can override renderedCIImageInRect:.

SCFilterImageView

A subclass of SCImageView that can have a filter. It renders the input CIImage through its SCFilter, if one is set.

SCSwipeableFilterView

A subclass of SCImageView that holds a scroll view and a list of SCFilters. It lets the user swipe between the filters to choose one. The selected filter can be retrieved using -[SCSwipeableFilterView selectedFilter]. This works basically the same way as the Snapchat composition page.

SCPlayer

Player based on Apple's AVPlayer. It adds some convenience methods and the possibility to set a CIImageRenderer that will be used to render the video image buffers. You can combine this class with a CIImageRenderer to render a live filter on a video.

SCVideoPlayerView

A view that renders an SCPlayer easily. It supports tap to play/pause. By default, it holds an SCPlayer instance itself and shares the same lifecycle as this SCPlayer. You can disable this feature by calling +[SCVideoPlayerView setAutoCreatePlayerWhenNeeded:NO].
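As an illustration, wiring an SCSwipeableFilterView to an SCPlayer might look like the sketch below. The filters property name is an assumption based on the class description above, and the SCImageView assignment mirrors the SCPlayer example earlier in this document; check the headers of your version.

```objc
// Sketch: let the user swipe between several filters, Snapchat-style
SCSwipeableFilterView *filterView = [[SCSwipeableFilterView alloc] initWithFrame:view.bounds];
filterView.filters = @[ // assumed property holding the list of SCFilters
    [SCFilter emptyFilter], // no filter
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectChrome"],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectNoir"]
];

// Render the player's video buffers through the swipeable view
player.SCImageView = filterView;
[view addSubview:filterView];

// Later, read back the filter the user settled on
SCFilter *chosen = filterView.selectedFilter;
```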

SCRecorder's People

Contributors

anthopakpak, aprato, cerupcat, coeur, dwalker39, fjcaetano, guodong000, hojustin, jensgrud, jhk115, jlalvarez18, legoless, lilejia, mjgaylord, numen31337, peterpaulis, renjithn, rflex, robwagstaff1984, scorsinsc, squishykid, twomedia, xhzengaib

SCRecorder's Issues

problem

Hello @rFlex,
Thanks for the great source, it works like a charm. However, I have a little problem: how do I use this library to record audio only? Could you give an example?
Jack

Question

Hello @rFlex,
Thanks for the great source, it works like a charm. However, I have a little problem: how do I remove a recorded section of a video, like Instagram does? I saw that your NSArray+SCAdditions is used for this function?

Problem switching camera device while paused on iPhone 4

I have noticed that on my iPhone 4S there is a problem with switching the camera in between calls to pause and record. It does not matter which camera device you start with - when switching to the other camera and resuming record no video will be captured to the session. I have tried this with my own project as well as the sample project you provided and see the same issue.

This behavior works as expected in testing on multiple iPhone 5 devices.

One thing to note is that there is no issue when switching the camera during recording. The problem only appears when you have paused the recorder and then do the switch.

Steps to reproduce on iPhone 4S running 7.1.1:

  1. Start running session normally with either front or back camera
  2. Record video
  3. Pause
  4. Switch the camera device
  5. Resume recording
  6. Repeat steps 3-5 if desired
  7. End session and play output video

Observed:
No video is recorded by the camera that was switched to during the pause

Record Session Terminated Automatically?

For some reason the record session restarts when taking a few clips:

"2014-07-25 19:48:42.315[1210:327713] Initialized audio in record session
2014-07-25 19:48:42.322[1210:327713] Initialized video in record session
2014-07-25 19:48:42.322[1210:327713] Began record segment: (null)
2014-07-25 19:48:42.324[1210:327713] End record segment -1: (null)
2014-07-25 19:48:54.290[1210:327713] Began record segment: (null)
2014-07-25 19:49:07.480[1210:327713] End record segment -1: (null)"

Any ideas?

Q: mp4?

I'm sorry, I want to ask: is it possible to save the video file in an 'mp4' container?

Save video file

Hi,

I tried your iOS example for test purposes. It seems to work great, but it doesn't save the video to the camera roll. It asks for permission the first time and says that the video has been successfully saved, but there is nothing there.

Save Recorded Session Error

I've been trying many different ways to get a video saved after the recording session. First, I'm having problems actually stopping the session. Calling endRunningSession does nothing, and other methods still don't stop the session. Basically, didCompleteRecordSession never gets called.

//
//  RecordChallengeViewController.m
//  Sportsy Beta
//
//  Created by Pirate Andy on 8/25/14.
//  Copyright (c) 2014 Sportsy. All rights reserved.
//

#import "RecordChallengeViewController.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import "SCAudioTools.h"
#import "SCRecorderFocusView.h"
#import "SCRecorder.h"
#import "SCRecordSessionManager.h"
#import "SCTouchDetector.h"
#import "SCAssetExportSession.h"

#import "GridView.h"

#define kVideoPreset AVCaptureSessionPresetHigh

@interface RecordChallengeViewController () {
    SCRecorder *_recorder;
    UIImage *_photo;
    SCRecordSession *_recordSession;
}

@property (strong, nonatomic) SCRecorderFocusView *focusView;

@end

@implementation RecordChallengeViewController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    _secondDelay = 0;

    _recorder = [SCRecorder recorder];
    _recorder.sessionPreset = AVCaptureSessionPreset1280x720;
    _recorder.audioEnabled = YES;
    _recorder.delegate = self;
    _recorder.autoSetVideoOrientation = YES;

    UIView *previewView = self.previewView;
    _recorder.previewView = previewView;


    [self.retakeButton addTarget:self action:@selector(handleRetakeButtonTapped:) forControlEvents:UIControlEventTouchUpInside];
    [self.reverseCamera addTarget:self action:@selector(handleReverseCameraTapped:) forControlEvents:UIControlEventTouchUpInside];

    //[self.previewView addGestureRecognizer:[[SCTouchDetector alloc] initWithTarget:self action:@selector(handleTouchDetected:)]];

    self.focusView = [[SCRecorderFocusView alloc] initWithFrame:previewView.bounds];
    self.focusView.recorder = _recorder;
    [previewView addSubview:self.focusView];

    self.focusView.outsideFocusTargetImage = [UIImage imageNamed:@"record-focus"];
    self.focusView.insideFocusTargetImage = [UIImage imageNamed:@"record-focus"];

    [_recorder openSession:^(NSError *sessionError, NSError *audioError, NSError *videoError, NSError *photoError) {
        NSLog(@"==== Opened session ====");
        NSLog(@"Session error: %@", sessionError.description);
        NSLog(@"Audio error : %@", audioError.description);
        NSLog(@"Video error: %@", videoError.description);
        NSLog(@"Photo error: %@", photoError.description);
        NSLog(@"=======================");
        [self prepareCamera];
    }];

    GridView *gView = [[GridView alloc] initWithFrame:self.view.frame];
    gView.backgroundColor = [UIColor clearColor];
    [self.gridContainerView addSubview:gView];

    // Do any additional setup after loading the view from its nib.
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];

    [_recorder startRunningSession];
}
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];

    [_recorder endRunningSession];
}

- (void) prepareCamera {
    if (_recorder.recordSession == nil) {

        SCRecordSession *session = [SCRecordSession recordSession];
        session.suggestedMaxRecordDuration = CMTimeMakeWithSeconds(5, 10000);

        _recorder.recordSession = session;
    }
}

- (void)showError:(NSError*)error {
    [[[UIAlertView alloc] initWithTitle:@"Something went wrong" message:error.localizedDescription delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
}



- (void)recorder:(SCRecorder *)recorder didCompleteRecordSession:(SCRecordSession *)recordSession {


    void(^completionHandler)(NSURL *video, NSError *error) = ^(NSURL *video, NSError *error) {
        [[UIApplication sharedApplication] endIgnoringInteractionEvents];
        if (error == nil) {
            [[[UIAlertView alloc] initWithTitle:@"Saved to camera roll" message:@"" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
        } else {
            [[[UIAlertView alloc] initWithTitle:@"Failed to save" message:error.localizedDescription delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
        }
    };

    NSURL *video = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingString:@"_output.mp4"]];
    SCAssetExportSession *exportSession = [[SCAssetExportSession alloc] initWithAsset:[_recordSession assetRepresentingRecordSegments]];
    exportSession.sessionPreset = SCAssetExportSessionPresetHighestQuality;
    exportSession.outputUrl = video;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.keepVideoSize = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completionHandler != nil) {
                completionHandler(video, exportSession.error);
            }
            return;
        });
    }];
    DLog(@"Recoreed did complete record session");
}

- (void)recorder:(SCRecorder *)recorder didInitializeAudioInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized audio in record session");
    } else {
        NSLog(@"Failed to initialize audio in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didInitializeVideoInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized video in record session");
    } else {
        NSLog(@"Failed to initialize video in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didBeginRecordSegment:(SCRecordSession *)recordSession error:(NSError *)error {
    NSLog(@"Began record segment: %@", error);
}

- (void)recorder:(SCRecorder *)recorder didEndRecordSegment:(SCRecordSession *)recordSession segmentIndex:(NSInteger)segmentIndex error:(NSError *)error {
    DLog(@"Ended record segment: %@", error);
}

#pragma mark - Focus
- (void)recorderDidStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}

- (void)recorderDidEndFocus:(SCRecorder *)recorder {
    [self.focusView hideFocusAnimation];
}

- (void)recorderWillStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}

#pragma mark - Recorder Actions
- (void) handleStopButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;

    if (recordSession != nil) {
        [self finishSession:recordSession];
    }
}

- (void)finishSession:(SCRecordSession *)recordSession {
    [recordSession endRecordSegment:^(NSInteger segmentIndex, NSError *error) {
        [[SCRecordSessionManager sharedInstance] saveRecordSession:recordSession];

        _recordSession = recordSession;
        [self prepareCamera];
    }];
}


-(IBAction)delayClick:(id)sender {

    CGPoint currentPositionDescription = self.descriptionScrollView.contentOffset;

    if(currentPositionDescription.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPositionDescription.y - self.descriptionView.frame.origin.y) animated:YES];
    }

    CGPoint currentPosition = self.delayScrollView.contentOffset;

    if(currentPosition.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPosition.y - self.delayView.frame.origin.y) animated:YES];
    } else {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, self.delayView.frame.origin.y) animated:YES];
    }

    if(_secondDelay > 0) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        _secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
    } else {
        _secondLabel.text = @"";
    }

}
-(IBAction)micClick:(id)sender {
    if(_recorder.audioEnabled) {
        [_micBtn setImage:[UIImage imageNamed:@"record-mic"] forState:UIControlStateNormal];
        _recorder.audioEnabled = NO;
    } else {
        [_micBtn setImage:[UIImage imageNamed:@"record-mic-on"] forState:UIControlStateNormal];
        _recorder.audioEnabled = YES;
    }
    [self prepareCamera];
}
-(IBAction)gridClick:(id)sender {
    if(self.gridContainerView.hidden) {
        [_gridBtn setImage:[UIImage imageNamed:@"record-grid-on"] forState:UIControlStateNormal];
        self.gridContainerView.hidden = NO;
    } else {
        [_gridBtn setImage:[UIImage imageNamed:@"record-grid"] forState:UIControlStateNormal];
        self.gridContainerView.hidden = YES;
    }
}
-(IBAction)descriptionClick:(id)sender {

    CGPoint currentPositionDelay = self.delayScrollView.contentOffset;

    if(currentPositionDelay.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPositionDelay.y - self.delayView.frame.origin.y) animated:YES];
    }

    CGPoint currentPosition = self.descriptionScrollView.contentOffset;

    if(currentPosition.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPosition.y - self.descriptionView.frame.origin.y) animated:YES];
    } else {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes-on"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, self.descriptionView.frame.origin.y) animated:YES];
    }

}

-(IBAction)delayItemClick:(id)sender {
    UIButton *instanceButton = (UIButton *)sender;
    int tag = instanceButton.tag;
    _secondDelay = tag;
    if(_secondDelay > 0) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        _secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
    } else {
        _secondLabel.text = @"";
    }

    //Disable all selected buttons
    [_fifteenSecondBtn setSelected:NO];
    [_tenSecondBtn setSelected:NO];
    [_fiveSecondBtn setSelected:NO];
    [_noDelayBtn setSelected:NO];

    [instanceButton setSelected:YES];

}

- (void) handleRetakeButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;

    if (recordSession != nil) {
        _recorder.recordSession = nil;

        // If the recordSession was saved, we don't want to completely destroy it
        if ([[SCRecordSessionManager sharedInstance] isSaved:recordSession]) {
            [recordSession endRecordSegment:nil];
        } else {
            [recordSession cancelSession:nil];
        }
    }

    [self prepareCamera];
}

- (void) handleReverseCameraTapped:(id)sender {
    [_recorder switchCaptureDevices];
}

-(IBAction)recordPauseClick:(id)sender {
    if([_recorder isRecording]) {
        [[[UIAlertView alloc] initWithTitle:@"Failed to save" message:@"VIDEO IS RECORDING. ATTEMPTING TO SAVE" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
        [_recordBtn setImage:[UIImage imageNamed:@"record-btn"] forState:UIControlStateNormal];


        SCRecordSession *recordSession = _recorder.recordSession;
        [recordSession endRecordSegment:nil];

        if (recordSession != nil) {
            [self finishSession:recordSession];
        }

    } else {
        [_recorder record];
        [_recordBtn setImage:[UIImage imageNamed:@"record-btn-stop"] forState:UIControlStateNormal];
    }



}

/*
- (void)handleTouchDetected:(SCTouchDetector*)touchDetector {
    if (touchDetector.state == UIGestureRecognizerStateBegan) {
        [_recorder record];
    } else if (touchDetector.state == UIGestureRecognizerStateEnded) {
        [_recorder pause];
    }
}
*/

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end

This is where the error gets thrown.
_reader = [AVAssetReader assetReaderWithAsset:self.inputAsset error:&error];

- (void)exportAsynchronouslyWithCompletionHandler:(void (^)())completionHandler {
    NSError *error = nil;

    [[NSFileManager defaultManager] removeItemAtURL:self.outputUrl error:nil];

    _writer = [AVAssetWriter assetWriterWithURL:self.outputUrl fileType:self.outputFileType error:&error];

    EnsureSuccess(error, completionHandler);

    _reader = [AVAssetReader assetReaderWithAsset:self.inputAsset error:&error];
    EnsureSuccess(error, completionHandler);

Brand new implementation

Hey guys,

I just pushed the brand new implementation. I renamed the project to SCRecorder. A lot of things changed (actually, pretty much everything changed). You will have to learn the library from scratch again, sorry about that :(.

Now let's talk about the good things!

  • I spent a lot of time trying to make this new implementation totally stable, so you shouldn't experience crashes. We all know they can still happen in any piece of software, though, so if you experience something that I didn't expect, feel free to let everyone know so we can try to find a solution!
  • As Instagram and Vine do, you can now delete parts of your video while recording. A part is called a "recordSegment"; see the documentation for more details.
  • You can start a record session in 2014, and finish it 4 years later.
  • You can play a record session, edit it and resume it.
  • The new implementation might be a little more complicated to use, even though I tried to keep it as straightforward as possible despite all the new possibilities it adds for you guys.

If you have any questions, feel free to ask in this topic!

Cheers,

Simon

Retake button in example app doesn't reset captured video/audio

I'm trying to follow the example app and it looks like the retake button isn't implemented. Looking at the code, I see that it only resets the timer label and ensures that a session exists. Can you please provide some sample code demonstrating the best way to implement a retake on the current session?

To reproduce:
Tap to record some video (segment A)
Tap retake button
Tap to record more video (segment B)
Tap stop

Expected:
The final output consists of only video segment B

Observed:
The final output is both segment A and segment B
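
A retake can be implemented without discarding the whole session by deleting every recorded segment before recording again. This is only a sketch, not a confirmed fix for the example app; it assumes recordSegments and removeSegmentAtIndex:deleteFile: behave as described elsewhere in this tracker:

```objective-c
// Hypothetical retake handler: drop every segment so the next
// recording starts from an empty session. Removing from the end
// avoids index shifting while segments are deleted.
- (void)handleRetakeButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;

    while (recordSession.recordSegments.count > 0) {
        [recordSession removeSegmentAtIndex:recordSession.recordSegments.count - 1
                                 deleteFile:YES];
    }
}
```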

Error with removeSegmentAtIndex:deleteFile:

I added the library with CocoaPods: pod 'SCRecorder', '~> 2.0.14'
and I use the method:
removeSegmentAtIndex:deleteFile:
to remove a segment.

After removing a segment, I call updateTimeRecordedLabel and refresh the displayed time with this code:
if (_recorder.recordSession != nil) {
    currentTime = _recorder.recordSession.currentRecordDuration;
}

self.timeRecordedLabel.text = [NSString stringWithFormat:@"%.2f S", CMTimeGetSeconds(currentTime)];

But the time is wrong: it does not change after removing one segment. Only when I call removeSegmentAtIndex: twice does the time change!

How to change video dimensions and quality

Great Library,
But is there a way to change the 1080x1920 output to something smaller?
Like Vine, whose videos are square and not very big, which is great for uploading to a server
because of the file size.

Also, I have a question regarding the temp directory. When is the /tmp directory flushed? I see all the videos stored there since the very beginning.
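
For the size question, this era of the library exposes a videoSize property on SCRecordSession (it appears in another report on this page). A sketch, assuming that property is honored by the export:

```objective-c
// Sketch: request square, Vine-style output instead of 1080x1920.
SCRecordSession *session = [SCRecordSession recordSession];
session.videoSize = CGSizeMake(480, 480); // square, much smaller files
_recorder.recordSession = session;
```

As for /tmp, iOS only purges the temp directory opportunistically, so the app should delete finished segment files itself (or pass deleteFile:YES when removing segments).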

Any way to filter images with SCFilter?

Hey guys.

I have everything working video-wise, but I was wondering whether (and how) one can apply the Snapchat-like filtering of SCFilterGroup to an image rather than a video.

Any help would be great.

Thanks
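
Since the SCFilter classes wrap Core Image, a still image can be filtered with the same underlying CIFilter machinery directly. A sketch using plain Core Image, independent of SCRecorder (the filter name is arbitrary; any CIFilter works the same way):

```objective-c
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Apply a Core Image filter to a UIImage and return the result.
static UIImage *SCFilteredImage(UIImage *input) {
    CIImage *ciInput = [CIImage imageWithCGImage:input.CGImage];

    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:ciInput forKey:kCIInputImageKey];
    [sepia setValue:@0.8 forKey:kCIInputIntensityKey];

    CIImage *ciOutput = [sepia valueForKey:kCIOutputImageKey];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciOutput fromRect:ciOutput.extent];

    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}
```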

Error: Remove any record segment that you don't want

Hi,

I followed your guide to remove a record segment, like this:

[recordSession removeSegmentAtIndex:1 deleteFile:YES];
or
[recordSession removeSegmentAtIndex:1 deleteFile:NO];

I have 2 cases with errors:

  1. Record 2 segments (open the camera, tap and hold 2 times), then remove both segments, continue recording, then convert the video --> can't convert.
  2. Record 3 segments (open the camera, tap and hold 3 times), then remove the third segment, then record 1 additional segment, then convert the video --> the resulting video has only 2 segments (the first and the second, not the fourth).

I'm looking forward to your help!

Thanks and best regards!

[Suggestion] Tap to pause current video frame

Hi There,
The library is amazing, however it would be even better if you could implement tap to pause on the current frame. E.g. while viewing a Vine, if you tap on the video it pauses itself and resumes when you tap it again.
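
A sketch of the suggested behavior; it assumes SCPlayer exposes the standard AVPlayer play/pause/rate API:

```objective-c
// Toggle playback when the player view is tapped.
- (void)handlePlayerTap:(UITapGestureRecognizer *)recognizer {
    if (self.player.rate > 0) {
        [self.player pause];   // currently playing -> freeze the frame
    } else {
        [self.player play];    // currently paused -> resume
    }
}
```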

Hi,

The app crashes when I tap the stop button. How can I resolve it?

The crash log is:
Unknown class SCVideoPlayerView in Interface Builder file.
2013-12-20 19:01:47.469 SCAudioVideoRecorderExample[6685:907] -[UIView player]: unrecognized selector sent to instance 0x208944f0
2013-12-20 19:01:47.472 SCAudioVideoRecorderExample[6685:907] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[UIView player]: unrecognized selector sent to instance 0x208944f0'
*** First throw call stack:
(0x342ef2a3 0x3bfd397f 0x342f2e07 0x342f1531 0x34248f68 0x112d35 0x36116595 0x3616b14b 0x3616b091 0x3616af75 0x3616ae99 0x3616a5d9 0x3616a4c1 0x36158b93 0x36158833 0x10d791 0x10ce7d 0x11d369 0x3c3eb11f 0x3c3ea4b7 0x3c3ef1bd 0x342c2f3b 0x34235ebd 0x34235d49 0x37dfd2eb 0x3614b301 0x10ae5d 0x3c40ab20)
libc++abi.dylib: terminate called throwing an exception

Thanks in advance.

Audio not initialized, video does not record

We are using SCRecorder and we can reproduce an issue on some devices. We see that when running a session, this line:

https://github.com/rFlex/SCRecorder/blob/master/Library/Sources/SCRecorder.m#L329

will be in a state where _hasAudio = YES but recordSession.audioInitialized = NO.

In this state, video will not record. Perhaps this is our implementation, but on review it seems quite standard. We use lazy instantiation, which looks like:

- (SCRecordSession *)session
{
    if(!_session) {
        _session = [SCRecordSession recordSession];
        _session.videoSize = CGSizeMake(320, 320);
        _session.suggestedMaxRecordDuration = CMTimeMakeWithSeconds(kMaxRecordingSeconds, 1);
    }
    return _session;
}


- (SCRecorder *)camera
{
    if(!_camera) {
        _camera = [SCRecorder recorder];
        _camera.recordSession = self.session;
        _camera.sessionPreset = AVCaptureSessionPresetInputPriority;
        _camera.videoOrientation = AVCaptureVideoOrientationPortrait;
        _camera.videoEnabled = YES;
        _camera.audioEnabled = YES;
        _camera.device = AVCaptureDevicePositionFront;
        _camera.flashMode = SCFlashModeOff;
        _camera.previewView = self.previewView;
        _camera.delegate = self;
    }
    return _camera;
}

Any ideas why we get this intermittent issue with recording?

Accessing the AVCaptureSession on separate queue

There doesn't seem to be any pattern in how you access the capture session on your _dispatchQueue.

For example, in openSession:, you do this on the current queue:

AVCaptureSession *session = [[AVCaptureSession alloc] init];
 _beginSessionConfigurationCount = 0;
 _captureSession = session;

But then later on you do:

if (!_captureSession.isRunning) {
     dispatch_async(_dispatchQueue, ^{
          [_captureSession startRunning];

Why are you accessing the capture session from different queues? At first I ignored it thinking it was harmless, but I noticed that many times when I tried using the recorder, the entire iOS media server would crash, and this notification would be posted:

AVAudioSessionMediaServicesWereResetNotification

I forked your repo, and moved all access of the capture session to its respective queue, and I no longer see this issue.

On a separate note, you don't seem to handle any of the error notifications, such as:

AVCaptureSessionRuntimeErrorNotification
AVCaptureSessionWasInterruptedNotification
AVAudioSessionMediaServicesWereResetNotification
AVAudioSessionMediaServicesWereLostNotification
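
A sketch of what handling these could look like at the application level while the library doesn't do it (the selector names are hypothetical):

```objective-c
// Observe the capture/audio error notifications listed above so
// the app can at least log them and attempt to recover.
- (void)registerForCaptureNotifications {
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
    [center addObserver:self
               selector:@selector(sessionRuntimeError:)
                   name:AVCaptureSessionRuntimeErrorNotification
                 object:nil];
    [center addObserver:self
               selector:@selector(mediaServicesWereReset:)
                   name:AVAudioSessionMediaServicesWereResetNotification
                 object:nil];
}

- (void)sessionRuntimeError:(NSNotification *)notification {
    NSError *error = notification.userInfo[AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
}

- (void)mediaServicesWereReset:(NSNotification *)notification {
    // The media server restarted; capture objects must be rebuilt.
    NSLog(@"Media services were reset; reopening the session");
}
```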

Getting slow framerate with iPhone 4

I'm getting a slow framerate while visualizing the video feed, as well as in the generated video while it is being played. This does not happen on the iPhone 5 or iPhone 4S.

Which property should I configure to help the iPhone 4's processor produce optimal video?

Thank you very much.
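
One knob worth trying is the capture preset. A sketch, assuming the view controller keeps the recorder in _recorder as in the examples on this page:

```objective-c
// Sketch: a lower preset is far cheaper for the iPhone 4 to
// preview and encode than AVCaptureSessionPresetHigh.
_recorder.sessionPreset = AVCaptureSessionPreset640x480;
```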

Issue when saving videos

Hey, thanks for the great library. We are using it in something really cool, and I can't wait to show you!

So, to the problem at hand.
When I use the exact same code as the example on iOS, I often encounter a writing error on the AVAssetWriter in SCAudioVideoRecorder.m:

*** Terminating app due to uncaught exception 
'NSInternalInconsistencyException', reason: 
'*** -[AVAssetWriter finishWritingWithCompletionHandler:] 
Cannot call method when status is 1'

The error occurs on line 266:

- (void) finishWriter:(NSURL*)fileUrl {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    [self.assetWriter finishWritingWithCompletionHandler:^ { // <-- Here
        dispatch_async(self.dispatch_queue, ^{
            [self stopInternal];
        });
    }];

Hope that you can help with this!
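
A defensive guard (a sketch, not a confirmed root-cause fix) would at least avoid the exception by checking the writer's status before finishing, e.g. when finish might be reached twice:

```objective-c
- (void)finishWriter:(NSURL *)fileUrl {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    // -finishWritingWithCompletionHandler: throws unless the writer
    // is in a state that allows finishing, so check first.
    if (self.assetWriter.status == AVAssetWriterStatusWriting) {
        [self.assetWriter finishWritingWithCompletionHandler:^{
            dispatch_async(self.dispatch_queue, ^{
                [self stopInternal];
            });
        }];
    }
#endif
}
```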

Preview View not Auto-Rotating

img_4827

Attached is an image of what's happening.

When I rotate the phone, the rest of the views in the XIB rotate and scale properly, but the preview layer basically stays the same. When recording, the video does come out properly, it's just the preview layer.

#import "RecordChallengeViewController.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import "SCAudioTools.h"
#import "SCRecorderFocusView.h"
#import "SCRecorder.h"
#import "SCRecordSessionManager.h"
#import "SCTouchDetector.h"
#import "SCAssetExportSession.h"

#import "GridView.h"
#import "SBJson.h"

#define kVideoPreset AVCaptureSessionPresetHigh

@interface RecordChallengeViewController () {
    SCRecorder *_recorder;
    UIImage *_photo;
    SCRecordSession *_recordSession;
}

@property (strong, nonatomic) SCRecorderFocusView *focusView;

@end

@implementation RecordChallengeViewController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}

- (void)viewDidLoad
{
    [super viewDidLoad];


    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(didRotate:) name:UIDeviceOrientationDidChangeNotification object:nil];


    _secondDelay = 0;

    aws = [[SportsyAWS alloc] init];
    model = [SportsyModel sharedModel];
    analytics = [SportsyAnalytics sharedModel];

    _recorder = [SCRecorder recorder];
    _recorder.sessionPreset = kVideoPreset;
    _recorder.audioEnabled = YES;
    _recorder.delegate = self;
    _recorder.autoSetVideoOrientation = YES;


    UIView *previewView = self.previewView;
    previewView.backgroundColor = [UIColor redColor];
    _recorder.previewView = previewView;

    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(closeOpenSlides:)];
    tapRecognizer.numberOfTapsRequired = 1;
    [self.view addGestureRecognizer:tapRecognizer];

    [self.retakeButton addTarget:self action:@selector(handleRetakeButtonTapped:) forControlEvents:UIControlEventTouchUpInside];
    [self.reverseCamera addTarget:self action:@selector(handleReverseCameraTapped:) forControlEvents:UIControlEventTouchUpInside];

    self.focusView = [[SCRecorderFocusView alloc] initWithFrame:previewView.bounds];
    self.focusView.recorder = _recorder;
    [previewView addSubview:self.focusView];

    self.focusView.outsideFocusTargetImage = [UIImage imageNamed:@"record-focus"];
    self.focusView.insideFocusTargetImage = [UIImage imageNamed:@"record-focus"];

    [_recorder openSession:^(NSError *sessionError, NSError *audioError, NSError *videoError, NSError *photoError) {
        NSLog(@"==== Opened session ====");
        NSLog(@"Session error: %@", sessionError.description);
        NSLog(@"Audio error : %@", audioError.description);
        NSLog(@"Video error: %@", videoError.description);
        NSLog(@"Photo error: %@", photoError.description);
        NSLog(@"=======================");
        [self prepareCamera];
    }];

    GridView *gView = [[GridView alloc] initWithFrame:self.view.frame];
    gView.backgroundColor = [UIColor clearColor];
    [self.gridContainerView addSubview:gView];

    // Do any additional setup after loading the view from its nib.
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];

    [_recorder startRunningSession];

}
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];

    [_recorder endRunningSession];
}

- (void) prepareCamera {
    if (_recorder.recordSession == nil) {

        SCRecordSession *session = [SCRecordSession recordSession];
        //session.suggestedMaxRecordDuration = CMTimeMakeWithSeconds(5, 10000);

        _recorder.recordSession = session;
    }
}

- (void)showError:(NSError*)error {
    [[[UIAlertView alloc] initWithTitle:@"Something went wrong" message:error.localizedDescription delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
}



- (void)recorder:(SCRecorder *)recorder didCompleteRecordSession:(SCRecordSession *)recordSession {

    [self finishSession:recordSession];

}

- (void)recorder:(SCRecorder *)recorder didInitializeAudioInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized audio in record session");
    } else {
        NSLog(@"Failed to initialize audio in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didInitializeVideoInRecordSession:(SCRecordSession *)recordSession error:(NSError *)error {
    if (error == nil) {
        NSLog(@"Initialized video in record session");
    } else {
        NSLog(@"Failed to initialize video in record session: %@", error.localizedDescription);
    }
}

- (void)recorder:(SCRecorder *)recorder didBeginRecordSegment:(SCRecordSession *)recordSession error:(NSError *)error {
    NSLog(@"Began record segment: %@", error);
}

- (void)recorder:(SCRecorder *)recorder didEndRecordSegment:(SCRecordSession *)recordSession segmentIndex:(NSInteger)segmentIndex error:(NSError *)error {
    DLog(@"Ended record segment: %@", error);
}

#pragma mark - Focus
- (void)recorderDidStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}

- (void)recorderDidEndFocus:(SCRecorder *)recorder {
    [self.focusView hideFocusAnimation];
}

- (void)recorderWillStartFocus:(SCRecorder *)recorder {
    [self.focusView showFocusAnimation];
}

#pragma mark - Recorder Actions
- (void) handleStopButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;

    if (recordSession != nil) {
        [self finishSession:recordSession];
    }
}

- (void)finishSession:(SCRecordSession *)recordSession {
    [recordSession endRecordSegment:^(NSInteger segmentIndex, NSError *error) {
        [[SCRecordSessionManager sharedInstance] saveRecordSession:recordSession];

        _recordSession = recordSession;
        [self prepareCamera];
    }];
}


-(IBAction)delayClick:(id)sender {

    CGPoint currentPositionDescription = self.descriptionScrollView.contentOffset;

    if(currentPositionDescription.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPositionDescription.y - self.descriptionView.frame.origin.y) animated:YES];
    }

    CGPoint currentPosition = self.delayScrollView.contentOffset;

    if(currentPosition.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPosition.y - self.delayView.frame.origin.y) animated:YES];
    } else {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, self.delayView.frame.origin.y) animated:YES];
    }

    if(_secondDelay > 0) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        _secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
    } else {
        _secondLabel.text = @"";
    }

}
-(IBAction)micClick:(id)sender {
    if(_recorder.audioEnabled) {
        [_micBtn setImage:[UIImage imageNamed:@"record-mic"] forState:UIControlStateNormal];
        _recorder.audioEnabled = NO;
    } else {
        [_micBtn setImage:[UIImage imageNamed:@"record-mic-on"] forState:UIControlStateNormal];
        _recorder.audioEnabled = YES;
    }
    [self prepareCamera];
}
-(IBAction)gridClick:(id)sender {
    if(self.gridContainerView.hidden) {
        [_gridBtn setImage:[UIImage imageNamed:@"record-grid-on"] forState:UIControlStateNormal];
        self.gridContainerView.hidden = NO;
    } else {
        [_gridBtn setImage:[UIImage imageNamed:@"record-grid"] forState:UIControlStateNormal];
        self.gridContainerView.hidden = YES;
    }
}
-(IBAction)descriptionClick:(id)sender {

    CGPoint currentPositionDelay = self.delayScrollView.contentOffset;

    if(currentPositionDelay.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPositionDelay.y - self.delayView.frame.origin.y) animated:YES];
    }

    CGPoint currentPosition = self.descriptionScrollView.contentOffset;

    if(currentPosition.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPosition.y - self.descriptionView.frame.origin.y) animated:YES];
    } else {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes-on"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, self.descriptionView.frame.origin.y) animated:YES];
    }

}

-(IBAction)delayItemClick:(id)sender {
    UIButton *instanceButton = (UIButton *)sender;
    int tag = instanceButton.tag;
    _secondDelay = tag;
    if(_secondDelay > 0) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay-on"] forState:UIControlStateNormal];
        _secondLabel.text = [NSString stringWithFormat:@"%i secs", _secondDelay];
    } else {
        _secondLabel.text = @"";
    }

    //Disable all selected buttons
    [_fifteenSecondBtn setSelected:NO];
    [_tenSecondBtn setSelected:NO];
    [_fiveSecondBtn setSelected:NO];
    [_noDelayBtn setSelected:NO];

    [instanceButton setSelected:YES];

}

- (void) handleRetakeButtonTapped:(id)sender {
    SCRecordSession *recordSession = _recorder.recordSession;

    if (recordSession != nil) {
        _recorder.recordSession = nil;

        // If the recordSession was saved, we don't want to completely destroy it
        if ([[SCRecordSessionManager sharedInstance] isSaved:recordSession]) {
            [recordSession endRecordSegment:nil];
        } else {
            [recordSession cancelSession:nil];
        }
    }

    [self prepareCamera];
}

- (void) handleReverseCameraTapped:(id)sender {
    [_recorder switchCaptureDevices];
}

-(IBAction)recordPauseClick:(id)sender {
    if([_recorder isRecording]) {
        [_recordBtn setImage:[UIImage imageNamed:@"record-btn"] forState:UIControlStateNormal];
        DLog(@"TRYING TO END SESSION");
        SCRecordSession *recordSession = _recorder.recordSession;
        _recorder.recordSession = nil;

        [recordSession endSession:^(NSError *error) {
            DLog(@"END SESSION CALL");
            if (error == nil) {
                NSURL *fileUrl = recordSession.outputUrl;
                NSString *videoFileName = [NSString stringWithFormat:@"%@-%@.mp4", [model uid], [model getMongoIdFromDictionary:_challengeDict]];
                [aws uploadWithBackgroundThread:fileUrl withFileName:videoFileName];
                [self dismissViewControllerAnimated:YES completion:nil];

            } else {
                DLog(@"%@", error);
            }
        }];

    } else {
        if(_secondDelay > 0) {
            _delayTimerLabel.text = [NSString stringWithFormat:@"%i", _secondDelay];
            _delayTimerView.hidden = NO;
            _timer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(timerFired) userInfo:nil repeats:YES];
        } else {
            [_recorder record];
        }

        [_recordBtn setImage:[UIImage imageNamed:@"record-btn-stop"] forState:UIControlStateNormal];
    }



}

-(void)timerFired {
    if(_secondDelay > 0){
        _secondDelay-=1;
        if(_secondDelay>-1) {
            _delayTimerLabel.text = [NSString stringWithFormat:@"%i", _secondDelay];
        }
    } else {
        _delayTimerView.hidden = YES;
        [_recorder record];
        [_timer invalidate];
    }
}

-(void)closeOpenSlides:(UITapGestureRecognizer *)recognizer {
    CGPoint currentPositionDelay = self.delayScrollView.contentOffset;

    if(currentPositionDelay.y >= self.delayView.frame.origin.y) {
        [_delayBtn setImage:[UIImage imageNamed:@"record-delay"] forState:UIControlStateNormal];
        [self.delayScrollView setContentOffset:CGPointMake(self.delayView.frame.origin.x, currentPositionDelay.y - self.delayView.frame.origin.y) animated:YES];
    }

    CGPoint currentPositionDescription = self.descriptionScrollView.contentOffset;

    if(currentPositionDescription.y >= self.descriptionView.frame.origin.y) {
        [_descriptionBtn setImage:[UIImage imageNamed:@"record-notes"] forState:UIControlStateNormal];
        [self.descriptionScrollView setContentOffset:CGPointMake(self.descriptionView.frame.origin.x, currentPositionDescription.y - self.descriptionView.frame.origin.y) animated:YES];
    }

}


/*
- (void)handleTouchDetected:(SCTouchDetector*)touchDetector {
    if (touchDetector.state == UIGestureRecognizerStateBegan) {
        [_recorder record];
    } else if (touchDetector.state == UIGestureRecognizerStateEnded) {
        [_recorder pause];
    }
}
*/

- (void) didRotate:(NSNotification *)notification
{
    UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];

    if (orientation == UIDeviceOrientationLandscapeLeft || orientation == UIDeviceOrientationLandscapeRight)
    {
        _recorder.previewView.frame = CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.height, [[UIScreen mainScreen] bounds].size.width);
    } else {
        _recorder.previewView.frame = CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height);
    }
}
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration {
    //[self prepareCamera];
}

- (BOOL)automaticallyForwardAppearanceAndRotationMethodsToChildViewControllers {
    return YES;
}

- (BOOL)shouldAutomaticallyForwardRotationMethods {
    return YES;
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end

Recording square video

I'm having trouble configuring the recorder to capture and export video cropped to a square aspect ratio. Without expertise in AVFoundation I am having a hard time understanding whether this is currently possible to do. Is this currently a supported feature for iOS?

I can see that Issue #1 relates to this question but it was closed 7 months ago and the codebase has changed since that resolution. Specifically, it seems that the functionality associated with useInputFormatTypeAsOutputType has been rewritten.

I have tried to configure the recorder using the following methods without success:

  • camera.videoEncoder.outputVideoSize
  • camera.sessionPreset
  • [camera setActiveFormatThatSupportsFrameRate:width:andHeight:error:]

Any information on this matter would be greatly appreciated.

Can't resume capture

Hi rFlex,

I want to resume capturing video. I followed your guide:
// Get a dictionary representation of the record session
// And save it somewhere, so you can use it later!
NSDictionary *dictionaryRepresentation = [recordSession dictionaryRepresentation];
[[NSUserDefaults standardUserDefaults] setObject:dictionaryRepresentation forKey:@"RecordSession"];

// Restore a record session from a saved dictionary representation
NSDictionary *dictionaryRepresentation = [[NSUserDefaults standardUserDefaults] objectForKey:@"RecordSession"];
SCRecordSession *recordSession = [SCRecordSession recordSession:dictionaryRepresentation];

but I can't resume capturing video.

Do you have a different way?

Thanks!
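
After restoring the dictionary, the session still has to be handed back to the recorder before recording can resume. A sketch extending the snippet above, using the recordSession assignment that appears in other examples on this page:

```objective-c
// Restore the saved session and reattach it to the recorder.
NSDictionary *saved = [[NSUserDefaults standardUserDefaults] objectForKey:@"RecordSession"];
if (saved != nil) {
    SCRecordSession *recordSession = [SCRecordSession recordSession:saved];
    _recorder.recordSession = recordSession;
}
[_recorder record]; // new buffers are appended after the existing segments
```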

Big bug

@rFlex
When I finish recording audio, go back to the home page, and then enter video recording again, it crashes. The reason was:
SCAudioVideoRecorderExample[4306:60b] *** -[SCCamera release]: message sent to deallocated instance 0x16de65a0

Orientation setting isn't being respected

When trying to set the video orientation like so:

self.camera.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;

or

[self.camera setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];

No matter what, it always records in portrait.

See code below based upon your example:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];

    [self.camera openSession:^(NSError * audioError, NSError * videoError) {
        [self prepareCamera];
        double delayInSeconds = 3.0;
        dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
        dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
            [self.camera startRunningSession];
        });

    }];
}
- (void)viewDidLoad
{
    [super viewDidLoad];

    counter = 20;
    isRecording = NO;

    self.navigationController.navigationBar.hidden = YES;

    self.camera = [[SCCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh];
    self.camera.delegate = self;
    self.camera.enableSound = YES;
    self.camera.previewVideoGravity = SCVideoGravityResizeAspectFill;

    UIView *previewView = cameraContainer;

    self.camera.previewView = previewView;
    self.camera.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    self.camera.recordingDurationLimit = CMTimeMakeWithSeconds(counter, 1);


    self.focusView = [[SCCameraFocusView alloc] initWithFrame:previewView.bounds];
    self.focusView.camera = self.camera;

    [previewView addSubview:self.focusView];

    self.focusView.outsideFocusTargetImage = [UIImage imageNamed:@"camera-icon"];
    self.focusView.insideFocusTargetImage = [UIImage imageNamed:@"camera-icon"];

}
- (void) prepareCamera {
    if (![self.camera isPrepared]) {
        NSError * error;
        [self.camera prepareRecordingOnTempDir:&error];


        if (error != nil) {
            DLog(@"%@", error);
        } else {
            [self.camera record];

            NSTimer *countdownTimer = [NSTimer scheduledTimerWithTimeInterval:1
                                                                       target:self
                                                                     selector:@selector(advanceTimer:)
                                                                     userInfo:nil
                                                                      repeats:YES];
            isRecording = YES;

        }
    }
}

Bug

Hi! rFlex
Why, when I set a maximum recording time and keep holding down the record button, do I get an error? If I record intermittently, no error appears. (Meaning: holding down, releasing, holding down, releasing.)

Add loading view feature

Hi. First of all, thanks to @rFlex for providing us this awesome framework! I love using this framework and modifying it.
I was working on my project and came up with an idea to improve it. I think we need a more sophisticated loading view. In my project I use a horizontal collection view to play multiple videos with SCPlayer and SCVideoPlayerView. The videos are loaded from a server and cached before they are played, but while the videos are downloading, the loading view is not very responsive or beautiful.

I thought it would be helpful to the whole community if we could add a download-progress feedback feature that uses the first frame of the video as the loading view and animates the view's alpha value, like Mindie does.

Thanks!

Should we add a sessionPreset property to manage video/photo quality?

NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
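
If such a property is added, it should probably validate the value, since not every device supports every preset. A sketch against a plain AVCaptureSession:

```objective-c
// Fall back gracefully when the requested preset is unsupported.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
} else {
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;
}
```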

Can't save video to camera roll

Hi rFlex,

I use SCRecorder and it works well, but I want to save the video file to the camera roll after recording, and I can't.

Please help me!

Thanks!
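
A sketch using AssetsLibrary (the camera-roll API of this iOS era); it assumes the session's outputUrl points at the finished, exported file:

```objective-c
#import <AssetsLibrary/AssetsLibrary.h>

// Save the exported video to the camera roll.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
NSURL *fileUrl = recordSession.outputUrl;

if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:fileUrl]) {
    [library writeVideoAtPathToSavedPhotosAlbum:fileUrl
                                completionBlock:^(NSURL *assetURL, NSError *error) {
        NSLog(@"Saved to camera roll: %@ (error: %@)", assetURL, error);
    }];
}
```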

AVAudioSessionMediaServicesWereResetNotification

Hey there!

There is a problem when running the camera for the first time. I wasn't able to get any error messages from any of the delegate methods, so I tried listening to as many notifications as possible to find out what's going on. For some reason, AVAudioSessionMediaServicesWereResetNotification is posted. I worked around it by adding some code to close and reopen the session, which gets it working. However, I still don't know what causes this.

Observer-issue with SCPlayer

 *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'CoreStreamBase<>(0x15e29718 e:ORWEDM p:orwedm s:O): An -observeValueForKeyPath:ofObject:change:context: message was received but not handled.
    Key path: loadedTimeRanges
    Observed object: <AVPlayerItem: 0x15f30490, asset = <AVURLAsset: 0x15efa980, URL = file:///private/var/mobile/Applications/A3DDAC5A-C04F-4AC0-AA08-E0A931E7EB8E/tmp/4ef635f1-0240-4414-bd13-f60e68e96349-video.mp4>>
    Change: {
        kind = 1;
        new =     (
            "CMTimeRange: {{0/1 = 0.000}, {222234/44100 = 5.039}}"
        );
    }
    Context: 0x0'

Hey, we got a crash on this. Is this something you are aware of? We could certainly take a look at it ourselves if you don't have any quick ideas about what it is. It seems to be called after the item has removed the observer, but I'm not that familiar with KVO.

Where is the SCFilter.h?

Hi, thank you for your project. It helps me a lot. I want to add a filter to the video, but I can't find SCFilter.h. Where is it?

Problem with merge

When merging record segments, or even just getting the assetRepresentation, some segments seem to be missing. recordSegments.count looks correct, and the end-record-segment log looks correct, but the video/asset output is incorrect. Say the count is 4 segments: the final video will show only 3 of them, leaving out an appended segment. I notice this happening when I delete and append segments. Any ideas? Or a sample project showing the correct way to implement delete and append, since I may be doing something wrong.

I'm unable to pinpoint the exact location of the problem or to recreate the issue consistently. It seems almost random.

Recording in landscape

It should support recording in either landscape or portrait, either by setting it in code or automatically from the accelerometer info.

If that's difficult to do, then I would appreciate some sample code on how to rotate the video.
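
The recorder already exposes orientation hooks (autoSetVideoOrientation and videoOrientation both appear in code elsewhere on this page); a sketch of the two options:

```objective-c
// Option 1: follow the device orientation automatically.
_recorder.autoSetVideoOrientation = YES;

// Option 2: pin a fixed orientation before recording.
_recorder.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
```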

Can I use this to only capture Audio?

Hi,

I would like to use this library, but my app only needs to capture audio, not video. Can I still use this library?

Thanks,
Varun
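
The recorder has independent audio and video switches (both appear in code elsewhere on this page), so audio-only capture should amount to disabling video. A sketch:

```objective-c
// Audio-only recording: keep the audio pipeline, drop video.
SCRecorder *recorder = [SCRecorder recorder];
recorder.audioEnabled = YES;
recorder.videoEnabled = NO;
```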

Video, audio not getting recorded during interruption like Skype call.

Receive a call on the device through Skype, then open the application (SCRecorderExamples) and begin recording. It looks like the first segment records successfully, but it doesn't, and you cannot record again. Navigate to the preview: no video is shown. The delegate methods recorder:didAppendVideoSampleBuffer: and recorder:didAppendAudioSampleBuffer: are not called.

If a Skype call arrives while recording, we can continue recording, but there is no sound in the video when playing it on the preview view.

While stepping through with breakpoints here and there, I got these two errors:

ERROR: [0x103534000] AVAudioSessionPortImpl.mm:50: ValidateRequiredFields: Unknown selected data source for Port iPhone Microphone (type: MicrophoneBuiltIn)

End record segment -1, error : Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" UserInfo=0x178271240 {NSUnderlyingError=0x17824b1f0 "The operation couldn’t be completed. (OSStatus error 560226676.)", NSLocalizedFailureReason=The media data could not be decoded. It may be damaged., NSLocalizedDescription=Cannot Decode}

The problem is on iOS 7. When the audioEnabled property of SCRecorder is set to NO, everything works perfectly, but without sound.

Please look into this issue.
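Independently of the library, an app can at least detect such interruptions through AVAudioSession's interruption notification and stop/restart capture around them. A sketch using standard AVFoundation API; where exactly to pause and resume the SCRecorder flow is left as a comment, since that part is app-specific:

```objective-c
#import <AVFoundation/AVFoundation.h>

// Observe audio session interruptions (e.g. an incoming Skype/VoIP call)
// so the app can end the current segment and resume capture cleanly.
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVAudioSessionInterruptionNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
    NSUInteger type = [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // Pause/end the current record segment here.
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        // Reactivate the session, then restart the capture flow here.
        [[AVAudioSession sharedInstance] setActive:YES error:NULL];
    }
}];
```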

fileExistsAtPath returning NO

Hello,

  1. THANKS for all the work :)

  2. [[NSFileManager defaultManager] fileExistsAtPath:[_recordSession.outputUrl path]] is returning NO in "finishSession".

Is that on purpose?

Thanks in advance.

Sometimes the camera can't open

Hey,

Thank for your source code!

I have an issue: sometimes the camera can't open.

I'm looking forward to your reply!

Thanks and best regards!

Source files missing in cocoapods

Hello :)

First of all, thanks for the nice library. I'm using these sources in my project,
and they save me time!

Until now, I've been using the sources by directly importing them into my project
from a zip file downloaded from GitHub.

I want to use them through CocoaPods instead, but after updating my Podfile and
looking at the source files in the pod project, several sources are missing.

None of the filter-related sources are there. I think they are missing from the podspec.
Please check this issue. Thanks.

Focus mode read only

The auto focus goes crazy sometimes, especially at the beginning of the video, and there's no apparent way to change it. The focusMode property on an SCRecorder instance is read-only, so I can't change it there either.

SCAssetExportSession _reader startReading Crash

Hi rFlex,
You were able to fix my last problem really quickly, and I'm hoping you can do it again. I record a video using your library, and when it's done I set the asset on your SCPlayer and let the user apply a filter on top of it. However, when I call SCAssetExportSession to get the newly filtered video, I run into a crash. If audio is enabled on the recorded video, exporting hits an exception breakpoint in exportAsynchronouslyWithCompletionHandler:(void (^)())completionHandler at the line if (![_reader startReading]) with a bad access. I can step over the exception breakpoint and it will export, but in a production build it crashes the app. However, it works if I either run with _recorder.audioEnabled = NO, or change
_audioOutput = [self addReader:[audioTracks objectAtIndex:0] withSettings:@{ AVFormatIDKey : [NSNumber numberWithUnsignedInt:kAudioFormatType] }]; to _audioOutput = nil in SCAssetExportSession, because then the audio is not processed. Do you have any ideas what may be causing this issue?

Thanks
