ifttt / fastttcamera
Fasttt and easy camera framework for iOS with customizable filters
Home Page: http://ifttt.github.io
License: MIT License
Hi there! First of all, amazing job with this camera. Really straightforward and fastttt! I wanted to see if there was a way to call takePhoto on the FastttFilterCamera and have it return both the original, unfiltered version of the image, as well as the filtered image. My best guess would be to capture the image without a filter, and then process the image with the self.fastttfilter.filter attribute to generate a second image. Not sure how much that would slow down the image capture process though.
I was going to fork and implement this on my own, but I figured I'd ask to see if there was a simpler way to do this before I embarked on that journey.
Thanks
EDIT: Just to clarify, my goal is to be able to show the user a live preview of the filters before capturing the photo, but to also allow the user to change the filter after the photo was captured without having to take a new photo.
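A minimal sketch of that second-pass idea, assuming the filter is a GPUImage lookup filter (FastttFilterCamera is built on GPUImage); the method name below is hypothetical, not FastttCamera API:
#import <GPUImage/GPUImage.h>

// Apply the same lookup image you'd pass to FastttFilterCamera's filterImage
// to an already-captured, unfiltered photo.
- (UIImage *)filteredVersionOfImage:(UIImage *)original withLookupImage:(UIImage *)lookupImage
{
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:original];
    GPUImagePicture *lookupImageSource = [[GPUImagePicture alloc] initWithImage:lookupImage];
    GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];

    [stillImageSource addTarget:lookupFilter];  // input 0: the photo
    [lookupImageSource addTarget:lookupFilter]; // input 1: the lookup table

    [lookupFilter useNextFrameForImageCapture];
    [stillImageSource processImage];
    [lookupImageSource processImage];

    return [lookupFilter imageFromCurrentFramebufferWithOrientation:original.imageOrientation];
}
This way the live preview keeps the GPU filter path, and you can re-filter the one unfiltered capture as often as the user changes their mind.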
After updating from Xcode 6.4 to 7.0, I get the build warning below.
Pods/FastttCamera/FastttCamera/FastttCamera.m:164:1: Conflicting return type in implementation of 'supportedInterfaceOrientations': 'UIInterfaceOrientationMask' (aka 'enum UIInterfaceOrientationMask') vs 'NSUInteger' (aka 'unsigned long')
I think it is the same problem as in this link:
https://forums.developer.apple.com/thread/6165
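If so, the usual fix is to adopt the iOS 9 SDK's return type for the override; a sketch (the mask FastttCamera actually returns is an assumption here):
// Pre-iOS 9 SDK signature, which triggers the warning under Xcode 7:
- (NSUInteger)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskAll;
}

// iOS 9 SDK signature:
- (UIInterfaceOrientationMask)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskAll;
}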
The sample code does not give a preview as expected:
- (void)viewDidLoad {
    [super viewDidLoad];
    _filterCamera = [FastttFilterCamera cameraWithFilterImage:[UIImage imageNamed:@"MonochomeHighContrast"]];
    self.filterCamera.delegate = self;
    [self fastttAddChildViewController:self.filterCamera];
    self.filterCamera.view.frame = self.view.frame;
}
Instead, setting the filterImage once more works:
- (void)viewDidLoad {
    [super viewDidLoad];
    _filterCamera = [FastttFilterCamera cameraWithFilterImage:[UIImage imageNamed:@"MonochomeHighContrast"]];
    self.filterCamera.delegate = self;
    [self fastttAddChildViewController:self.filterCamera];
    self.filterCamera.view.frame = self.view.frame;
    self.filterCamera.filterImage = [UIImage imageNamed:@"MonochomeHighContrast"];
}
Why?
Tested on iPhone 6 Plus, installed via Pod.
I love this library, thanks for all the hard work!
I'm having a problem though: I'm unable to lock the camera in landscape. My entire project supports LandscapeLeft/Right only. On my fastttCamera instance, I've set interfaceRotatesWithOrientation = NO.
If I start the camera holding the device in landscape, everything works great. But if I hold the device in portrait and start the camera, the viewport is rotated 90 degrees. I can then rotate the phone in any direction, and the viewport will stay offset by 90 degrees.
Here's a dandy image, taken after starting in landscape mode and staying in landscape mode.
Here's a not-so-dandy one, taken after starting while holding the phone in portrait and then rotating back to landscape.
Note that these are screenshots, not images saved using the delegate callbacks.
Here's a gist of my CameraViewController.m >> https://gist.github.com/jonstoked/43cb9674183f984c1006.
Any ideas?
Thanks,
Jon
pod install
Analyzing dependencies
Fetching podspec for FastttCamera from ../FastttCamera.podspec
[!] Unable to find a specification for Masonry (= 0.6.1)
So, as the title suggests, my app is portrait-only; however, there are instances where I would like to tell the library it is about to take landscape pictures, for example for business cards. In this case I make the camera fullscreen and encourage the user to take a landscape picture. Then I would like the picture that is returned to be rotated to portrait. Reading some of the variable names, it looks as if this is possible, but I can't seem to get it to work.
Any help is appreciated.
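For reference, a sketch of the knobs that look relevant here; these are real FastttCamera properties, but whether this combination produces the rotation described above is exactly the open question:
// Values below are a guess at the business-card scenario described above.
self.fastCamera.interfaceRotatesWithOrientation = NO;
self.fastCamera.fixedInterfaceOrientation = UIDeviceOrientationLandscapeLeft;
self.fastCamera.normalizesImageOrientations = YES; // bake the rotation into the pixels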
I use it in Swift, and I have to import all the files one by one into the bridging header.
Why not add a header such as FasttCameraKit.h which imports all public header files?
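A sketch of what such an umbrella header could contain (this header list is a guess at the pod's public headers, not a verified inventory):
// FasttCameraKit.h (hypothetical umbrella header)
#import "FastttCamera.h"
#import "FastttFilterCamera.h"
#import "FastttCameraInterface.h"
#import "UIImage+FastttCamera.h"
With that in place, a Swift project would only need this one #import in its bridging header.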
CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef) & kCGBitmapAlphaInfoMask;

// Since iOS 8 it's no longer allowed to create contexts with unpremultiplied alpha info
if (bitmapInfo == kCGImageAlphaLast) {
    bitmapInfo = kCGImageAlphaPremultipliedLast;
}
if (bitmapInfo == kCGImageAlphaFirst) {
    bitmapInfo = kCGImageAlphaPremultipliedFirst;
}
Hi,
Is there any way to add pinch to zoom support to FastttCamera?
Love this btw :)
Has anyone else tried doing this?
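In case it helps anyone, here's a minimal pinch-to-zoom sketch that goes straight to AVFoundation rather than through FastttCamera's API (self.device is assumed to be the active AVCaptureDevice):
- (void)handlePinch:(UIPinchGestureRecognizer *)pinch
{
    NSError *error = nil;
    if ([self.device lockForConfiguration:&error]) {
        // Scale the current zoom factor by the pinch, clamped to the device's range.
        CGFloat zoom = self.device.videoZoomFactor * pinch.scale;
        zoom = MAX(1.0, MIN(zoom, self.device.activeFormat.videoMaxZoomFactor));
        self.device.videoZoomFactor = zoom;
        [self.device unlockForConfiguration];
    }
    pinch.scale = 1.0; // reset so the next callback reports an incremental scale
}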
It seems like FastttCamera is using a background thread to access UI API. I enabled the Main Thread Checker in Xcode 9 (see the Apple doc).
It happens in FastttCamera.m at line 610.
Here's the backtrace after pausing execution.
=================================================================
Main Thread Checker: UI API called on a background thread: -[UIView bounds]
PID: 2754, TID: 1011164, Thread name: (none), Queue name: com.apple.root.default-qos, QoS: 21
Backtrace:
4 FastttCamera 0x00000001026d7d98 __107-[FastttCamera _processImage:withCropRect:maxDimension:fromCamera:needsPreviewRotation:previewOrientation:]_block_invoke + 708
5 libdispatch.dylib 0x0000000109cc12cc _dispatch_call_block_and_release + 24
6 libdispatch.dylib 0x0000000109cc128c _dispatch_client_callout + 16
7 libdispatch.dylib 0x0000000109ccd3dc _dispatch_queue_override_invoke + 984
8 libdispatch.dylib 0x0000000109cd29d0 _dispatch_root_queue_drain + 624
9 libdispatch.dylib 0x0000000109cd26f4 _dispatch_worker_thread3 + 136
10 libsystem_pthread.dylib 0x0000000185beb06c _pthread_wqthread + 1268
11 libsystem_pthread.dylib 0x0000000185beab6c start_wqthread + 4
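The standard remedy for this class of warning is to read UIKit state on the main thread and hand plain values to the background work; a sketch (the exact property being read at FastttCamera.m:610 is an assumption based on the -[UIView bounds] message above):
// CGRect is a value type, so reading it on the main thread and capturing it in
// the block is thread-safe; only the UIView access itself must stay on main.
CGRect previewBounds = self.view.bounds; // read on the main thread
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // ...crop/scale the captured image using previewBounds, never touching UIKit...
});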
I just built this awesome library into my project and everything seems to be working great, except that zooming does not seem to be working. When I use a pinch gesture, nothing happens at all. Tap to focus works, but literally nothing occurs when I pinch. I have not messed with any of the setup, but I checked to ensure that zooming was enabled, and it is. Any thoughts on why this feature isn't working for me? iPhone 5S running iOS 9.
Hi - I'm trying to check the cameraFlashMode while using Swift 2.0. For some reason, I get the error
Enum case 'On' is not a member of type 'FastttCameraFlashMode?'
with the following:
@IBAction func switchFlash(sender: AnyObject) {
    var flashMode: FastttCameraFlashMode
    var flashTitle: String
    switch self.camera?.cameraFlashMode {
    case FastttCameraFlashMode.On:
        flashMode = FastttCameraFlashMode.Off
        flashTitle = "Flash Off"
    default:
        flashMode = FastttCameraFlashMode.On
        flashTitle = "Flash On"
    }
    if let camera = camera {
        if camera.isFlashAvailableForCurrentDevice() {
            camera.cameraFlashMode = flashMode
            flashButton.setTitle(flashTitle, forState: UIControlState.Normal)
        }
    }
}
The odd thing is that I can set the flashMode perfectly fine, but the switch is not working. The same happens for Off and Auto. I'm not sure if this is an iOS 9 issue or a Fasttt issue. If it's an Apple issue, then I can file a Radar.
Thanks,
Ahan
It would be great if it supported burst mode.
Hello
_session.sessionPreset = AVCaptureSessionPresetPhoto;
AVCaptureSessionPresetPhoto
AVCaptureSessionPresetHigh
AVCaptureSessionPresetMedium
AVCaptureSessionPresetLow
AVCaptureSessionPreset352x288
AVCaptureSessionPreset640x480
...
AVCaptureSessionPresetInputPriority
Being able to change this parameter is really necessary. Please expose it via a separate method.
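A sketch of the kind of accessor being requested (this is not part of FastttCamera's current API):
// Hypothetical property on FastttCamera:
@property (nonatomic, copy) NSString *sessionPreset;

- (void)setSessionPreset:(NSString *)sessionPreset
{
    // Only apply presets the session actually supports.
    if ([_session canSetSessionPreset:sessionPreset]) {
        _session.sessionPreset = sessionPreset;
        _sessionPreset = [sessionPreset copy];
    }
}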
I noticed a lag on the preview. It's slow, not fluid, and a little bit laggy on my iPhone 6. With a homemade framework like this one, I didn't have a bug like that.
Hi! Great library!
I have a problem with the output image. I am trying to get a square photo but the output is slightly off.
These are the logs from the various delegate methods:
fastCamera.view.frame: (0.0, 80.0, 320.0, 320.0)
RAW JPEG Data size: Optional((2448.0, 3264.0))
capturedImage size: (2448.0, 2452.0)
Normalized full capturedImage size: (2448.0, 2452.0)
Normalized scaled capturedImage size: (320.0, 321.0)
Any idea why the image is slightly wider? Thanks!
I am testing on an iPhone 5S running iOS 9.1
The live image is in portrait when the camera is in landscape.
The shutter button is also enlarged and sits slightly below the screen when in landscape.
iOS 9.3.1
iPhone 6+
Sometimes _checkDeviceAuthorizationWithCompletion takes too long before invoking the completion handler, resulting in _session being nil at the point startRunning is called.
Check this fork: https://github.com/Snupps/FastttCamera
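For illustration, a minimal defensive guard against that race might look like this (an assumption about FastttCamera's internals, not necessarily what the fork does):
- (void)startRunning
{
    if (!_session) {
        return; // the authorization completion hasn't created the session yet
    }
    if (![_session isRunning]) {
        [_session startRunning];
    }
}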
Strange one here, hoping you can shed some light...
This block of code in FastttCamera.m doesn't execute in my Swift project:
if (self.isViewLoaded && self.view.window) {
    [self startRunning];
    [self _insertPreviewLayer];
    [self _setPreviewVideoOrientation];
    [self _resetZoom];
}
I've verified that self.isViewLoaded is true. However, when I po self.view.window, the response is: property 'window' not found on object of type 'UIView *'.
OK, fine, that would explain why the block doesn't run. If I remove the self.view.window check, everything seems to function normally again. However, within the example app provided, self.view.window is also not found on view when I po it in lldb (same error as in my app), BUT the block still executes.
To test this, set a breakpoint here, run the example app, and then po self.view.window.
The example app does work, and my app mostly works without this, although zooming is broken because FastttZoom.maxScale is not initialized. Any ideas what would cause this or how to fix it?
When I use FastttCamera's filter image, I found that it makes my app's memory usage very high, and this increased memory can't be released.
I tested the demo and it shows the same problem. Please help me solve it, thanks so much.
Working on an app that needs rapid-fire photo-taking abilities. This is not possible with the current pod, given that the image needs to be processed before isCapturingImage is set to NO.
Why not make that function asynchronous and just have a queue to handle the photos that come in from the camera? I am probably going to try to implement this myself with your library, but I'm just wondering why you decided against it. Or maybe it is possible and I'm just missing something.
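A sketch of the queued approach suggested above, using a plain GCD serial queue (the names are illustrative, not FastttCamera API):
// Created once, e.g. alongside the capture session:
dispatch_queue_t photoQueue = dispatch_queue_create("com.example.photo-processing", DISPATCH_QUEUE_SERIAL);

// In the capture callback: grab the raw frame, clear isCapturingImage
// immediately so the next shot can fire, and process in order on the queue.
dispatch_async(photoQueue, ^{
    // ...crop, scale and normalize the raw image here, then call back on main...
});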
It doesn't release memory! Memory is never freed!!
This is actually not a huge change and can of course just be bolted on by a user, but I figure it might be worth including an option to turn on the torch before taking a picture rather than just flashing it. (I do it this way in my apps because I find it lets me compose my shot better on the first try, since the picture has time to refocus nicely and you can adjust your angle to minimize glare.) Any thoughts?
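For anyone who wants to bolt this on today, a sketch of switching the torch on directly through AVFoundation:
#import <AVFoundation/AVFoundation.h>

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
if (device.hasTorch && [device lockForConfiguration:&error]) {
    device.torchMode = AVCaptureTorchModeOn; // stays lit until set back to Off
    [device unlockForConfiguration];
}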
It works when returning from the background. Do you have a problem with the following?
In the _setupCaptureSession method of FastttCamera.m, the following block is not executed at first startup:
if (self.isViewLoaded && self.view.window) {
    [self startRunning];
    [self _insertPreviewLayer];
    [self _setPreviewVideoOrientation];
    [self _resetZoom];
}
In the Xcode 7 console:
(lldb) po [(UIView *)self.view window]
nil
The delegate method returns a capturedImage whose size is 2448 × 2448 on an iPhone 6 Plus device, which causes a memory warning and then a crash. Could you help me or offer some advice?
How do I automatically crop the camera to a square UIView, as it says in the README?
Preferably in Swift.
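Here's a sketch in Objective-C (the Swift translation is mechanical), based on the README's approach as I read it: give the camera's view a square frame and leave cropsImageToVisibleAspectRatio at its default of YES:
_fastCamera = [FastttCamera new];
_fastCamera.delegate = self;
_fastCamera.cropsImageToVisibleAspectRatio = YES; // the default; crops to the preview's bounds

[self fastttAddChildViewController:_fastCamera];

// A square preview should yield a square capturedImage.
CGFloat side = CGRectGetWidth(self.view.bounds);
_fastCamera.view.frame = CGRectMake(0.f, 80.f, side, side);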
After opening the project in Xcode 7 and updating pods, I received an 'Expected a Type' error on this line:
- (CGRect)fastttCropRectFromPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer;
I thought CocoaPods might be to blame, so I reverted the pod to the version I was using before (0.2.9), to no effect. I've removed all pods from the project, cleaned, and then re-added them, but no joy. I'm at a bit of a loss!
The image below is what I see; I can provide more detail if required. Any thoughts would be much appreciated. Thanks.
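One common cause of "Expected a type" on an AVFoundation class is a missing framework import in the header that declares the method; a sketch of that fix (whether it applies to this project is an assumption):
// At the top of the header that declares fastttCropRectFromPreviewLayer:
@import AVFoundation;
// or, if modules are disabled:
#import <AVFoundation/AVFoundation.h>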
Both the front camera and the back camera seem to be more zoomed in than the standard Camera app. How do I fully zoom out?
The Apple docs state:
startRunning: The startRunning method is a blocking call which can take some time; therefore, you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for an implementation example.
stopRunning: This method is synchronous and blocks until the receiver has completely stopped running.
I think this project is great and can be improved even more by moving some of the session setup into a dedicated NSOperationQueue for better performance. Keep it up! :)
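For reference, a sketch of the pattern Apple's AVCam sample uses, here with a GCD serial queue (an NSOperationQueue with maxConcurrentOperationCount = 1 would serve the same purpose):
// Created once, alongside the session:
dispatch_queue_t sessionQueue = dispatch_queue_create("com.example.session-queue", DISPATCH_QUEUE_SERIAL);

// startRunning blocks, so keep it off the main queue:
dispatch_async(sessionQueue, ^{
    [self.session startRunning];
});

// stopRunning is synchronous too:
dispatch_async(sessionQueue, ^{
    [self.session stopRunning];
});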
Laura thanks for your awesome work! This is a great & useful piece of code you have written.
Fixed by pull request #63
iPads with deviceOrientation of faceUp are often in landscape orientation. The code currently assumes portrait orientation. This pull request solves the issue by using statusBarOrientation for faceUp and faceDown deviceOrientation (only).
There are no error checks when calling AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil], so if an error occurs, deviceInput will be nil and the following call to [_session addInput:deviceInput] will crash.
We cannot reproduce it in development, but we see several crashes in our production app on Crashlytics:
Fatal Exception: NSInvalidArgumentException
0 CoreFoundation 0x184aa51b8 __exceptionPreprocess
1 libobjc.A.dylib 0x1834dc55c objc_exception_throw
2 AVFoundation 0x18c287c70 -[AVCaptureSession addInput:]
3 FastttCamera 0x100f42170 __36-[FastttCamera _setupCaptureSession]_block_invoke.150 (FastttCamera.m:436)
4 libdispatch.dylib 0x18392e1fc _dispatch_call_block_and_release
5 libdispatch.dylib 0x18392e1bc _dispatch_client_callout
6 libdispatch.dylib 0x183932d68 _dispatch_main_queue_callback_4CF
7 CoreFoundation 0x184a52810 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
8 CoreFoundation 0x184a503fc __CFRunLoopRun
9 CoreFoundation 0x18497e2b8 CFRunLoopRunSpecific
10 GraphicsServices 0x186432198 GSEventRunModal
11 UIKit 0x18a9c57fc -[UIApplication _run]
12 UIKit 0x18a9c0534 UIApplicationMain
Is there any reason why there are no such checks?
Thanks in advance
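For illustration, the guard being requested might look like this (the logging is just a placeholder):
NSError *error = nil;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (deviceInput && [_session canAddInput:deviceInput]) {
    [_session addInput:deviceInput];
} else {
    NSLog(@"FastttCamera: could not create or add device input: %@", error);
}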
I've just updated the pods for my project and now I'm getting a "use of @import when modules are disabled" error because of this line in FastttCamera.h:
@import UIKit;
BUT modules are switched ON for my project and pods.
Any idea how to fix this?
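The usual workaround, if the offending build setting can't be tracked down, is to make the import conditional; a sketch:
// In FastttCamera.h:
#if __has_feature(modules)
@import UIKit;
#else
#import <UIKit/UIKit.h>
#endif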
Hi!
I'm trying to implement the camera in Swift. So far I've been able to instantiate it, take a picture, and manipulate the resulting pic.
Where I'm running into trouble is with the functions in AVCaptureDevice+FastttCamera.m. I can't seem to get the syntax right to call these functions in Swift.
var myCam = FastttCamera()
if FastttCamera.isFlashAvailableForCameraDevice(myCam.cameraDevice) {
    myCam.setCameraFlashMode(FastttCameraFlashMode.Off)
}
'FastttCamera' does not have a member named 'setCameraFlashMode'
I know I'm probably approaching this incorrectly but figured I'd ask for help since I'm stuck. If someone could take a quick look I'd really appreciate it.
Hi,
How can I reduce the time delay between capturing a series of pictures? I want to reduce the time from one picture to the next.
Thank you
Very good library. Is it possible to integrate video capturing?