
fastttcamera's People

Contributors

barrettj, davelyon, jhersh, lauraskelton, mdelmaestro, ssathy2, tony-yan-yu

fastttcamera's Issues

Return both the original and the filtered image when capturing

Hi there! First of all, amazing job with this camera. Really straightforward and fastttt! I wanted to see if there was a way to call takePhoto on the FastttFilterCamera and have it return both the original, unfiltered version of the image, as well as the filtered image. My best guess would be to capture the image without a filter, and then process the image with the self.fastttfilter.filter attribute to generate a second image. Not sure how much that would slow down the image capture process though.

I was going to fork and implement this on my own, but I figured I'd ask to see if there was a simpler way to do this before I embarked on that journey.

Thanks

EDIT: Just to clarify, my goal is to be able to show the user a live preview of the filters before capturing the photo, but to also allow the user to change the filter after the photo was captured without having to take a new photo.
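A rough sketch of the "capture unfiltered, then filter" idea using GPUImage directly. This assumes the filter image is a standard lookup image and uses GPUImageLookupFilter; FastttFilterCamera's internal filter may apply it differently, and how much this adds to capture time is exactly the open question above:

#import <GPUImage/GPUImage.h>

// Hypothetical helper: re-apply the same lookup image to the unfiltered capture.
- (UIImage *)filteredCopyOfImage:(UIImage *)unfilteredImage withLookupImage:(UIImage *)lookupImage
{
    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:unfilteredImage];
    GPUImagePicture *lookup = [[GPUImagePicture alloc] initWithImage:lookupImage];
    GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];

    [source addTarget:lookupFilter atTextureLocation:0];
    [lookup addTarget:lookupFilter atTextureLocation:1];

    [lookupFilter useNextFrameForImageCapture];
    [source processImage];
    [lookup processImage];

    return [lookupFilter imageFromCurrentFramebufferWithOrientation:unfilteredImage.imageOrientation];
}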

Filter image example does not work

The sample code does not give a preview as expected:

- (void)viewDidLoad {
    [super viewDidLoad];

    _filterCamera = [FastttFilterCamera cameraWithFilterImage:[UIImage imageNamed:@"MonochomeHighContrast"]];
    self.filterCamera.delegate = self;
    [self fastttAddChildViewController:self.filterCamera];
    self.filterCamera.view.frame = self.view.frame;
}

However, setting the filterImage once more afterwards does work:

- (void)viewDidLoad {
    [super viewDidLoad];

    _filterCamera = [FastttFilterCamera cameraWithFilterImage:[UIImage imageNamed:@"MonochomeHighContrast"]];
    self.filterCamera.delegate = self;
    [self fastttAddChildViewController:self.filterCamera];
    self.filterCamera.view.frame = self.view.frame;
    self.filterCamera.filterImage = [UIImage imageNamed:@"MonochomeHighContrast"];
}

Why?

Tested on iPhone 6 Plus, installed via Pod.

Can't Lock Orientation

I love this library, thanks for all the hard work!

I'm having a problem though. I'm unable to lock the camera in landscape. My entire project supports LandscapeLeft/Right only. On my fastttCamera instance, I've set interfaceRotatesWithOrientation = NO.

If I start the camera holding the device in Landscape mode, everything works great. But if I hold the device in portrait mode, and start the camera, the viewport is rotated 90 degrees. I can now rotate the phone in any direction, and the viewport will stay offset 90 degrees.

Here's a dandy image, taken after starting in landscape mode and staying in landscape mode.

photo 1

Here's a not-so-dandy one, taken after starting with the phone held in portrait and then rotating back to landscape.

photo 2

Note that these are screenshots, not images saved using the delegate callbacks.

Here's a gist of my CameraViewController.m >> https://gist.github.com/jonstoked/43cb9674183f984c1006.

Any ideas?

Thanks,
Jon
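For comparison, a minimal sketch of a landscape-locked setup (self.fastCamera is an illustrative property name, and fixedInterfaceOrientation is an assumption about the library's interface that may not exist in every version):

- (void)viewDidLoad {
    [super viewDidLoad];

    _fastCamera = [FastttCamera new];
    self.fastCamera.delegate = self;
    self.fastCamera.interfaceRotatesWithOrientation = NO;
    self.fastCamera.fixedInterfaceOrientation = UIDeviceOrientationLandscapeLeft;   // assumption: property exists in this version

    [self fastttAddChildViewController:self.fastCamera];
    self.fastCamera.view.frame = self.view.frame;
}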

CocoaPods install doesn't work

pod install
Analyzing dependencies
Fetching podspec for FastttCamera from ../FastttCamera.podspec
[!] Unable to find a specification for Masonry (= 0.6.1)

Orienting and/or scaling a landscape image in a portrait-only app

As the title suggests, my app is portrait only; however, there are instances where I would like to tell the library it is about to take landscape pictures, for example for business cards. In those cases I make the camera fullscreen and encourage the user to take a landscape picture. Then I would like the picture that is returned to be rotated to portrait. Reading some of the variable names it looks as if this is possible, but I can't seem to get it to work.

Any help is appreciated.
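If the library itself can't be told, a generic UIKit sketch of rotating the returned image after capture (landscapeImage is a placeholder for whatever UIImage the delegate hands back; this is not FastttCamera API):

// Reinterpret a landscape capture as portrait by reassigning its orientation;
// no pixel data is copied, so this is cheap.
UIImage *portraitImage = [UIImage imageWithCGImage:landscapeImage.CGImage
                                             scale:landscapeImage.scale
                                       orientation:UIImageOrientationRight];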

add a framework header file

I use it in Swift, so I have to import all the headers one by one into the bridging header.
Why not add a header such as FasttCameraKit.h which imports all the public header files?
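A minimal sketch of what such an umbrella header could look like (the exact set of public headers is an assumption and would need to match the pod's actual headers):

// FasttCameraKit.h (hypothetical umbrella header)
#import <UIKit/UIKit.h>

#import "FastttCamera.h"
#import "FastttFilterCamera.h"
#import "FastttCapturedImage.h"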

CGBitmapContextCreate returns NULL

CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef) & kCGBitmapAlphaInfoMask;

// Since iOS 8 it's no longer allowed to create contexts with unmultiplied alpha info
if (bitmapInfo == kCGImageAlphaLast) {
    bitmapInfo = kCGImageAlphaPremultipliedLast;
}
if (bitmapInfo == kCGImageAlphaFirst) {
    bitmapInfo = kCGImageAlphaPremultipliedFirst;
}
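For context, a minimal sketch of where the sanitized bitmapInfo would then be used (assuming an 8-bit RGB context; this is the generic Core Graphics pattern, not the library's exact code):

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL,
                                             CGImageGetWidth(imageRef),
                                             CGImageGetHeight(imageRef),
                                             8,      // bits per component
                                             0,      // let Core Graphics pick bytes per row
                                             colorSpace,
                                             bitmapInfo);
CGColorSpaceRelease(colorSpace);

if (!context) {
    // Still NULL: the alpha / byte-order combination isn't supported for this pixel format.
}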

Pinch To Zoom

Hi,

Is there any way to add pinch to zoom support to FastttCamera?

Love this btw :)
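If built-in zoom handling isn't available in the version you're on, a generic AVFoundation sketch of pinch-to-zoom (this drives the AVCaptureDevice directly and is not FastttCamera API; the gesture recognizer is attached to the preview view by the caller):

- (void)handlePinch:(UIPinchGestureRecognizer *)pinch onDevice:(AVCaptureDevice *)device
{
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        CGFloat maxZoom = MIN(device.activeFormat.videoMaxZoomFactor, 4.f);
        CGFloat newZoom = device.videoZoomFactor * pinch.scale;
        device.videoZoomFactor = MAX(1.f, MIN(newZoom, maxZoom));
        [device unlockForConfiguration];
    }
    pinch.scale = 1.f;   // keep the gesture incremental
}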

Main Thread Checker: UI API called on a background thread

It seems like FastttCamera is using a background thread to access UIKit API. I enabled the Main Thread Checker in Xcode 9 (see the Apple doc).

This happens in FastttCamera.m at line 610.

Here's the backtrace after pausing execution.

=================================================================
Main Thread Checker: UI API called on a background thread: -[UIView bounds]
PID: 2754, TID: 1011164, Thread name: (none), Queue name: com.apple.root.default-qos, QoS: 21
Backtrace:
4   FastttCamera                        0x00000001026d7d98 __107-[FastttCamera _processImage:withCropRect:maxDimension:fromCamera:needsPreviewRotation:previewOrientation:]_block_invoke + 708
5   libdispatch.dylib                   0x0000000109cc12cc _dispatch_call_block_and_release + 24
6   libdispatch.dylib                   0x0000000109cc128c _dispatch_client_callout + 16
7   libdispatch.dylib                   0x0000000109ccd3dc _dispatch_queue_override_invoke + 984
8   libdispatch.dylib                   0x0000000109cd29d0 _dispatch_root_queue_drain + 624
9   libdispatch.dylib                   0x0000000109cd26f4 _dispatch_worker_thread3 + 136
10  libsystem_pthread.dylib             0x0000000185beb06c _pthread_wqthread + 1268
11  libsystem_pthread.dylib             0x0000000185beab6c start_wqthread + 4
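A minimal sketch of the kind of fix the checker is asking for: read the UIKit state on the main thread and capture the value before hopping to a background queue (previewView and the processing method are illustrative names, not the library's actual code):

CGRect previewBounds = self.previewView.bounds;   // UIKit access stays on the main thread

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Heavy image processing happens here, using the captured value
    // instead of touching any UIView from the background queue.
    [self processCapturedImageWithPreviewBounds:previewBounds];
});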

Pinch to zoom doesn't appear to be working :/

I just built this awesome library into my project and everything seems to be working great, except zooming. When I use a pinch gesture, nothing happens at all. Tap to focus works, but literally nothing occurs when I pinch. I haven't messed with any of the setup, but I checked to ensure that zooming is enabled, and it is. Any thoughts on why this feature isn't working for me? iPhone 5s running iOS 9.

Swift not recognizing enum FastttCameraFlashMode

Hi - I'm trying to check the cameraFlashMode while using Swift 2.0. For some reason, I get the error
"Enum case 'On' is not a member of type 'FastttCameraFlashMode?'" with the following:

@IBAction func switchFlash(sender: AnyObject) {
        var flashMode: FastttCameraFlashMode
        var flashTitle: String
        switch self.camera?.cameraFlashMode {
        case FastttCameraFlashMode.On:
            flashMode = FastttCameraFlashMode.Off
            flashTitle = "Flash Off"
        default:
            flashMode = FastttCameraFlashMode.On
            flashTitle = "Flash On"
        }

        if let camera = camera {
            if camera.isFlashAvailableForCurrentDevice() {
                camera.cameraFlashMode = flashMode
                flashButton.setTitle(flashTitle, forState: UIControlState.Normal)
            }
        }
    }

The odd thing is that I can set the flashMode perfectly fine, but the switch is not working. The same happens for Off and Auto. I'm not sure if this is an iOS 9 issue or a Fasttt issue. If it's an Apple issue, then I can file a Radar.

Thanks,
Ahan

Support for non-default capture session presets

Hello,

The session preset is currently hardcoded to:

_session.sessionPreset = AVCaptureSessionPresetPhoto;

while AVFoundation offers a range of presets:

AVCaptureSessionPresetPhoto
AVCaptureSessionPresetHigh
AVCaptureSessionPresetMedium
AVCaptureSessionPresetLow
AVCaptureSessionPreset352x288
AVCaptureSessionPreset640x480
...
AVCaptureSessionPresetInputPriority

The ability to change this parameter is really needed. Please expose it through a separate property or method.
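A minimal sketch of what exposing this could look like (a hypothetical method on top of the session, not current FastttCamera API; _session is the library's AVCaptureSession):

- (void)setSessionPreset:(NSString *)preset
{
    if ([_session canSetSessionPreset:preset]) {
        [_session beginConfiguration];
        _session.sessionPreset = preset;
        [_session commitConfiguration];
    }
}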

lag on preview

I noticed a lag on the preview: it's slow, not fluid, and a little bit laggy on my iPhone 6. With a homemade framework similar to this one, I didn't have a bug like that.

Captured image has wrong size / aspect ratio

Hi! Great library!
I have a problem with the output image. I am trying to get a square photo but the output is slightly off.
These are the logs from the various delegate methods:

fastCamera.view.frame: (0.0, 80.0, 320.0, 320.0)
RAW JPEG Data size: Optional((2448.0, 3264.0))
capturedImage size: (2448.0, 2452.0)
Normalized full capturedImage size: (2448.0, 2452.0)
Normalized scaled capturedImage size: (320.0, 321.0)

Any idea why the image is slightly wider? Thanks!
I am testing on an iPhone 5S running iOS 9.1

Race condition on older phones

Sometimes _checkDeviceAuthorizationWithCompletion takes too long before invoking the completion handler, resulting in _session being nil at the point startRunning is called.
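A minimal defensive sketch of the kind of guard that would avoid this (assuming the call site can simply bail out when the session isn't ready yet; this is illustrative, not the library's actual fix):

if (!_session) {
    return;   // authorization completion hasn't finished setting up the session yet
}
if (!_session.isRunning) {
    [_session startRunning];
}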

Camera not fully initializing in Swift project

Strange one here, hoping you can shed some light...

This block of code in FastttCamera.m doesn't execute in my Swift project:

if (self.isViewLoaded && self.view.window) {
    [self startRunning];
    [self _insertPreviewLayer];
    [self _setPreviewVideoOrientation];
    [self _resetZoom];
}

I've verified that self.isViewLoaded is true. However, when I po self.view.window, the response is "property 'window' not found on object of type 'UIView *'".

OK, fine, that would explain why the block doesn't run. If I remove the self.view.window check, everything seems to function normally again. However, within the example app provided, self.view.window is also not found on view when I po it in lldb (same error as in my app), BUT the block still executes.

To test this, set a breakpoint here, run the example app, and then po self.view.window.

The example app does work, and my app mostly works without this although zooming is broken because FastttZoom.maxScale is not initialized. Any ideas what would cause this or how to fix?

Filter image causes high memory usage

When I use FastttCamera's filter image, I found that it makes my app's memory usage very high, and this increased memory can't be released.

I tested the demo and it shows the same problem. Please help me solve this, thanks so much.

Support for asynchronous processing of images

I'm working on an app that is required to have rapid-fire photo-taking abilities. This isn't possible with the current pod, given that the image needs to be fully processed before isCapturingImage is set back to NO.

Why not make that processing asynchronous and have a queue handle the photos that come in from the camera? I am probably going to try to implement this myself with your library, but I'm just wondering why you decided against it. Or maybe it is possible and I'm just missing something.
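A minimal sketch of the queue idea, layered on top of the delegate callback rather than inside the pod (the processing body is left as a comment; names are illustrative):

// Somewhere in setup: a serial queue so photos are processed one at a time.
NSOperationQueue *photoQueue = [NSOperationQueue new];
photoQueue.maxConcurrentOperationCount = 1;

// In the capture callback: hand the raw image off and return immediately,
// so the camera is free to take the next shot.
[photoQueue addOperationWithBlock:^{
    // ...crop, scale, and normalize the captured image here...
}];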

Add support for Torch instead of Flash?

This is actually not a huge change and can of course just be bolted on by a user, but I figure it might be worth it to include an option to turn on the torch before taking a picture rather than just flashing it. (I do it this way in my apps because I find it lets me compose my shot better on the first try, since the picture has time to refocus nicely and you can adjust your angle to minimize glare.) Any thoughts?
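For reference, the generic AVFoundation way to turn the torch on and off (this drives the device directly; whether and how the pod should expose it is the question here):

- (void)setTorchEnabled:(BOOL)enabled onDevice:(AVCaptureDevice *)device
{
    if (!device.hasTorch) {
        return;   // e.g. the front camera has no torch
    }
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.torchMode = enabled ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
        [device unlockForConfiguration];
    }
}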

On first startup of the camera, it is not possible to zoom

After returning from the background, zooming works.

Could the problem be the following? In the _setupCaptureSession method of FastttCamera.m, this block is not reached at the time of the first startup:

            if (self.isViewLoaded && self.view.window) {
                [self startRunning];
                [self _insertPreviewLayer];
                [self _setPreviewVideoOrientation];
                [self _resetZoom];
            }

In the Xcode 7 console:
(lldb) po [(UIView *)self.view window]
nil

Photo capture causes a crash on iPhone 6 Plus

The delegate method returns a capturedImage, and its size is 2448 * 2448 on an iPhone 6 Plus device, which causes a memory warning and then a crash. Could you help me or give some advice?

- (void)cameraController:(id)cameraController didFinishCapturingImage:(FastttCapturedImage *)capturedImage
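A minimal sketch of keeping the returned images small enough to avoid the memory spike, assuming the installed version exposes scalesImage, maxScaledDimension, and the didFinishScalingCapturedImage: delegate callback (check FastttCamera.h for your version; self.fastCamera is an illustrative property name):

// Configuration:
self.fastCamera.scalesImage = YES;
self.fastCamera.maxScaledDimension = 600.f;

// Then work with the scaled image instead of the full 2448 * 2448 one:
- (void)cameraController:(id)cameraController didFinishScalingCapturedImage:(FastttCapturedImage *)capturedImage
{
    UIImage *smallImage = capturedImage.scaledImage;
    // ...display or upload smallImage; avoid holding on to the full-size image...
}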

Square preview image

How do I automatically crop the camera to a square UIView, as described in the README?

Preferably in Swift.
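A minimal sketch of the usual approach, shown in Objective-C but mapping directly to Swift: give the camera's view a square frame and leave crop-to-visible-bounds on (cropsImageToVisibleAspectRatio is an assumption about the property name; check the header of the version you're using):

[self fastttAddChildViewController:self.fastCamera];

// Square viewport: the captured photo is then cropped to what was visible.
CGFloat side = CGRectGetWidth(self.view.bounds);
self.fastCamera.view.frame = CGRectMake(0.f, 80.f, side, side);
self.fastCamera.cropsImageToVisibleAspectRatio = YES;   // assumption: exists and defaults to YES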

'Expected a Type' error since Xcode 7

After opening the project in Xcode 7 and updating pods, I received an 'Expected a type' error on this line:

- (CGRect)fastttCropRectFromPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer;

I thought CocoaPods might be to blame, so I reverted the pod to the version I was using before (0.2.9), to no effect. I've removed all pods from the project, cleaned, and then re-added them, but no joy. I'm at a bit of a loss!

Image below is what I see - and I can provide more detail if required. Any thoughts would be much appreciated. Thanks.

image
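For what it's worth, 'Expected a type' on a declaration like this usually means the compiler can't see the AVCaptureVideoPreviewLayer type at that point. A minimal sketch of the check worth making, assuming a missing AVFoundation import in the header that declares this method is the cause:

@import AVFoundation;   // or: #import <AVFoundation/AVFoundation.h> if modules are off

- (CGRect)fastttCropRectFromPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer;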

`AVCaptureSession` `startRunning` and `stopRunning` performed on Main queue

The Apple docs state:

On startRunning: "The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for an implementation example."

On stopRunning: "This method is synchronous and blocks until the receiver has completely stopped running."

I think this project is great and can be improved even more by moving some of the session setup into a dedicated NSOperationQueue for better performance. Keep it up! :)
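A minimal sketch of the suggestion: a dedicated serial queue for session start/stop (names are illustrative, not the pod's actual implementation):

// A dedicated serial queue so startRunning/stopRunning never block the main thread.
dispatch_queue_t sessionQueue = dispatch_queue_create("com.example.fastttcamera.session", DISPATCH_QUEUE_SERIAL);

dispatch_async(sessionQueue, ^{
    [_session startRunning];   // blocking, but now off the main queue
});

dispatch_async(sessionQueue, ^{
    [_session stopRunning];    // synchronous, also kept off the main queue
});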

Video feed & photo taken are wrong orientation when iPad is faceUp

Laura, thanks for your awesome work! This is a great and useful piece of code you have written.

Fixed by pull request #63

iPads with deviceOrientation of faceUp are often in landscape orientation. The code currently assumes portrait orientation. This pull request solves the issue by using statusBarOrientation for faceUp and faceDown deviceOrientation (only).
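A minimal sketch of the idea behind that fix: fall back to the status bar orientation only when the device orientation is ambiguous (simplified; see pull request #63 for the real change). Note the landscape values are mirrored between UIDeviceOrientation and UIInterfaceOrientation:

UIDeviceOrientation deviceOrientation = [UIDevice currentDevice].orientation;

if (deviceOrientation == UIDeviceOrientationFaceUp || deviceOrientation == UIDeviceOrientationFaceDown) {
    // Flat on a table: the device orientation tells us nothing useful,
    // so use the interface (status bar) orientation instead.
    switch ([UIApplication sharedApplication].statusBarOrientation) {
        case UIInterfaceOrientationLandscapeLeft:
            deviceOrientation = UIDeviceOrientationLandscapeRight;
            break;
        case UIInterfaceOrientationLandscapeRight:
            deviceOrientation = UIDeviceOrientationLandscapeLeft;
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            deviceOrientation = UIDeviceOrientationPortraitUpsideDown;
            break;
        default:
            deviceOrientation = UIDeviceOrientationPortrait;
            break;
    }
}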

Crash on [AVCaptureSession addInput:]

There are no checks for errors when calling AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil], so if an error occurs, deviceInput will be nil and the following call to [_session addInput:deviceInput] will crash.
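A minimal sketch of the kind of check being asked for (illustrative, not the library's actual code):

NSError *error = nil;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

if (!deviceInput) {
    NSLog(@"FastttCamera: could not create capture device input: %@", error);
    return;
}

if ([_session canAddInput:deviceInput]) {
    [_session addInput:deviceInput];
}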

We cannot reproduce it in development, but we see several crashes from our production app in Crashlytics:

Fatal Exception: NSInvalidArgumentException
0  CoreFoundation                 0x184aa51b8 __exceptionPreprocess
1  libobjc.A.dylib                0x1834dc55c objc_exception_throw
2  AVFoundation                   0x18c287c70 -[AVCaptureSession addInput:]
3  FastttCamera                   0x100f42170 __36-[FastttCamera _setupCaptureSession]_block_invoke.150 (FastttCamera.m:436)
4  libdispatch.dylib              0x18392e1fc _dispatch_call_block_and_release
5  libdispatch.dylib              0x18392e1bc _dispatch_client_callout
6  libdispatch.dylib              0x183932d68 _dispatch_main_queue_callback_4CF
7  CoreFoundation                 0x184a52810 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
8  CoreFoundation                 0x184a503fc __CFRunLoopRun
9  CoreFoundation                 0x18497e2b8 CFRunLoopRunSpecific
10 GraphicsServices               0x186432198 GSEventRunModal
11 UIKit                          0x18a9c57fc -[UIApplication _run]
12 UIKit                          0x18a9c0534 UIApplicationMain

Is there any reason why there are no such checks?

Thanks in advance

"use of @import when modules are disabled error"

I've just updated the pods for my project and now I'm getting a "use of @import when modules are disabled" error because of FastttCamera.h:

@import UIKit;

BUT, modules are switched ON for my project and pods:

screenshot 2015-03-29 at 11:30:44

screenshot 2015-03-29 at 11:32:18

Any idea how to fix this?

Swift implementation

Hi!

I'm trying to implement the camera in Swift. So far I've been able to instantiate it, take a picture, and manipulate the resulting pic.

Where I'm running into trouble is with the functions in AVCaptureDevice+FastttCamera.m. I can't seem to get the syntax right to call these functions in Swift.

var myCam = FastttCamera()
if FastttCamera.isFlashAvailableForCameraDevice(myCam.cameraDevice){
    myCam.setCameraFlashMode(FastttCameraFlashMode.Off)
}

'FastttCamera' does not have a member named 'setCameraFlashMode'

I know I'm probably approaching this incorrectly but figured I'd ask for help since I'm stuck. If someone could take a quick look I'd really appreciate it.
