
QuickDraw - AirGesture


Introduction

Here is my Python source code for QuickDraw - an online game developed by Google - combined with AirGesture, a simple hand-gesture recognition application. With my code, you can:

  • Run an app in which you draw in front of a camera with your hand (if you use a laptop, its webcam is used by default)
  • Run an app in which you draw on a canvas

Camera app

In order to use this application, you only need to draw with your hand in front of a camera/webcam. The middle point of your hand is detected and highlighted by a red dot. When you are ready to draw, press the space bar to start drawing; when you want to stop drawing, press the space bar again. Below is the demo produced by running the script camera_app.py:


Camera app demo
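
The interaction boils down to a standard OpenCV capture loop in which the space bar toggles a drawing flag. Below is a minimal sketch of that loop, assuming OpenCV is installed; the hand detector and QuickDraw classifier that camera_app.py relies on are omitted, and the detect_hand call in the comments is hypothetical.

import cv2

cap = cv2.VideoCapture(0)          # default webcam
drawing = False                    # toggled with the space bar
points = []                        # trajectory of the hand while drawing

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.flip(frame, 1)     # mirror the image so drawing feels natural
    # hand_center = detect_hand(frame)   # hypothetical detector, not shown here
    # if drawing and hand_center is not None:
    #     points.append(hand_center)
    for point in points:
        cv2.circle(frame, point, 5, (0, 0, 255), -1)   # red dot along the stroke
    cv2.imshow("QuickDraw - AirGesture", frame)
    key = cv2.waitKey(1) & 0xFF
    if key == ord(" "):            # space bar starts/stops drawing
        drawing = not drawing
    elif key == ord("q"):          # q quits
        break

cap.release()
cv2.destroyAllWindows()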

Drawing app

The script and demo will be released soon.

Categories:

The table below shows the 18 categories my model uses:

apple        book        bowtie    candle
cloud        cup         door      envelope
eyeglasses   hammer      hat       ice cream
leaf         scissors    star      t-shirt
pants        tree
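
For illustration, these 18 labels can be kept in a single Python list so that the index of the model's strongest output maps directly to a category name; the variable name CLASSES below is an arbitrary choice, not necessarily the one used in the source:

# Hypothetical label list; index i corresponds to the model's i-th output.
CLASSES = ["apple", "book", "bowtie", "candle", "cloud", "cup", "door",
           "envelope", "eyeglasses", "hammer", "hat", "ice cream", "leaf",
           "scissors", "star", "t-shirt", "pants", "tree"]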

Trained models

You can find my trained models at data/trained_models/.
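
Below is a sketch of how such a checkpoint could be loaded and queried with Keras; the file name, the Keras-loadable format, and the 28x28 input shape are all assumptions, so check data/trained_models/ and the source for the actual values:

import numpy as np
import tensorflow as tf

# Assumption: the checkpoint is in a Keras-loadable format; the path below is
# a placeholder, so look inside data/trained_models/ for the real file name.
model = tf.keras.models.load_model("data/trained_models/quickdraw_model")

# Classify one 28x28 grayscale sketch (input shape is an assumption).
sketch = np.zeros((1, 28, 28, 1), dtype=np.float32)
probs = model.predict(sketch)
print("predicted class index:", int(np.argmax(probs)))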

Docker

For convenience, I provide a Dockerfile which can be used for running the training phase as well as launching the application.

Assume that the Docker image's name is qd_ag, and that you have already cloned this repository and cd'ed into it.

Build:

sudo docker build --network=host -t qd_ag .

Run:

If you want to launch the application, you first need to run xhost + to turn off access control (if you only want to run training, you can skip this step). Then run:

sudo docker run --gpus all -it --rm --volume="path/to/your/data:/workspace/code/data" -e DISPLAY=$DISPLAY --env="QT_X11_NO_MITSHM=1" -v /tmp/.X11-unix:/tmp/.X11-unix --device=/dev/video0:/dev/video0 qd_ag

Inside the Docker container, you can run the train.py or camera_app.py scripts for training or launching the app, respectively. By default, the camera_app.py script automatically generates a video capturing what you did during the session, at data/output.mp4.
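
The session recording itself amounts to an OpenCV video writer wired into the capture loop. Below is a rough sketch of how such a recording could be produced, not necessarily what camera_app.py actually does; the codec and frame rate are placeholder choices:

import cv2

cap = cv2.VideoCapture(0)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
fourcc = cv2.VideoWriter_fourcc(*"mp4v")              # codec choice is an assumption
out = cv2.VideoWriter("data/output.mp4", fourcc, 20.0, (width, height))

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    out.write(frame)                                  # record every captured frame
    cv2.imshow("recording", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
out.release()
cv2.destroyAllWindows()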

Experiments:

For each class, I split the data into training and test sets with a ratio of 8:2. The training/test loss and accuracy curves for the experiment are shown below:
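
Such an 8:2 per-class split can be reproduced with scikit-learn; the function below is a hypothetical sketch on dummy data, not the code used in the repository:

import numpy as np
from sklearn.model_selection import train_test_split

def split_class(samples, test_ratio=0.2, seed=42):
    # Split the drawings of one class into training and test subsets (8:2).
    return train_test_split(samples, test_size=test_ratio, random_state=seed)

# Example with 1000 dummy 28x28 sketches for a single class.
dummy = np.random.rand(1000, 28, 28)
train_set, test_set = split_class(dummy)
print(len(train_set), len(test_set))   # 800 200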
