
Smart Tracking Camera using YOLOv5

Data scientist | Anass MAJJI


🧐 Description

  • In this project, we built a smart human tracking camera using an Arduino board and the YOLOv5 model.

🚀 Repository Structure

The repository contains the following files & directories:

  • .github/workflows : contains the .yml file that defines the instructions for our automated tests and deployment process.

  • images : this folder contains all the images used in the README file.

  • src : this folder contains :

    • app : code of the FastAPI webapp.
    • test : the different unit tests.
    • yolov5 : the YOLOv5 deep learning model used for human detection.
  • requirements.txt: all the packages used in this project.

📈 Demonstration

In this section, we walk through building and deploying a real-time human detection and tracking system using the YOLOv5 model and an Arduino UNO board. We can split this project into two parts :

1. Software section :

1.1 FastAPI webapp :

Before deploying the model on the Arduino board, we built a FastAPI webapp using HTML, CSS and JS. For the client/server connection, we used the WebSocket protocol to send the real-time YOLOv5 output as a streaming video. Below is the home page of the webapp. As we can see, there are two main options :
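A rough sketch of the shape this FastAPI application can take is shown below: a home page route plus a WebSocket endpoint that pushes annotated frames to the browser. The route names, the template folder and the annotated_frames generator (sketched further down, under the second option) are illustrative assumptions, not the exact code in src/app.

```python
# Minimal FastAPI skeleton (illustrative names; the real code lives in src/app).
from fastapi import FastAPI, Request, WebSocket, WebSocketDisconnect
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates

app = FastAPI()
templates = Jinja2Templates(directory="templates")  # assumed location of the HTML/CSS/JS

@app.get("/", response_class=HTMLResponse)
async def home(request: Request):
    # Serve the home page with the two options described below.
    return templates.TemplateResponse("index.html", {"request": request})

@app.websocket("/ws/stream")
async def stream(websocket: WebSocket):
    # The browser opens a WebSocket; the server pushes YOLOv5-annotated frames
    # back as JPEG bytes until the client disconnects.
    await websocket.accept()
    try:
        for frame in annotated_frames(camera_index=0):  # hypothetical generator, sketched later
            await websocket.send_bytes(frame)
    except WebSocketDisconnect:
        pass
```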

First option :

It consists of detecting humans in images. The user can upload an image ("Click to upload" button) and then click on "Analyze" to get the output of the YOLOv5 model. Once the output image is generated, the user can download it by clicking on "Download".

Below is an example of an input image and the corresponding image generated by the YOLOv5 model.
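A minimal sketch of what the "Analyze" step can look like, assuming YOLOv5 is loaded through torch.hub; the endpoint and file names are illustrative, not necessarily those used in src/app.

```python
# Illustrative "Analyze" endpoint: run YOLOv5 on an uploaded image and return the result.
import cv2
import numpy as np
import torch
from fastapi import FastAPI, File, UploadFile
from fastapi.responses import FileResponse

app = FastAPI()
model = torch.hub.load("ultralytics/yolov5", "yolov5s")  # pretrained model; the repo ships its own copy in src/yolov5

@app.post("/analyze")
async def analyze(file: UploadFile = File(...)):
    data = np.frombuffer(await file.read(), dtype=np.uint8)
    image = cv2.imdecode(data, cv2.IMREAD_COLOR)              # decode the upload to a BGR image
    results = model(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))   # YOLOv5 expects RGB input
    annotated = results.render()[0]                           # image with boxes drawn on it (RGB)
    cv2.imwrite("output.jpg", cv2.cvtColor(annotated, cv2.COLOR_RGB2BGR))
    return FileResponse("output.jpg", media_type="image/jpeg")  # served back for the "Download" button
```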

Second option :

With the second option, we use the YOLOv5 model to detect and track humans with a camera. The video streaming starts after clicking on the "Start" button. Here we have two choices: we can use either the built-in webcam or an external USB camera.

The video streaming is stopped by clicking on the "Stop" button, or on the "Exit WebCam" button to shut down the WebSocket connection.
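The sketch below illustrates a possible frame source behind the "Start" button: it opens either the built-in webcam (index 0) or an external USB camera (index 1), runs YOLOv5 on every frame and yields JPEG bytes that a WebSocket endpoint like the one above can forward. The function name and details are assumptions, not the repository's exact code.

```python
# Illustrative frame generator feeding the WebSocket stream.
import cv2
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s")  # pretrained YOLOv5 model

def annotated_frames(camera_index: int = 0):
    capture = cv2.VideoCapture(camera_index)  # 0 = built-in webcam, 1 = external USB camera
    try:
        while True:
            ok, frame = capture.read()
            if not ok:                         # camera unplugged or stream stopped
                break
            results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            annotated = cv2.cvtColor(results.render()[0], cv2.COLOR_RGB2BGR)
            ok, jpeg = cv2.imencode(".jpg", annotated)
            if ok:
                yield jpeg.tobytes()           # one encoded frame per iteration
    finally:
        capture.release()                      # free the camera when streaming stops
```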

1.2 Deployment using CI/CD :

  • CI/CD : finally, to deploy the project we use CI/CD, one of the best practices in software development, as it automates the repetitive cycle of unit testing and deploying the software. For that, the src/test_app.py script tests each function of the FastAPI webapp (a minimal example is sketched below). All of these processes are run with GitHub Actions: to create the CI/CD workflow, we add a .yml file in .github/workflows containing the instructions of our automated tests and deployment process.
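A small sketch of the kind of test src/test_app.py can contain, using FastAPI's TestClient; the import path and route are assumptions.

```python
# Illustrative unit test run by the CI workflow.
from fastapi.testclient import TestClient

from app import app  # assumed import path of the FastAPI application

client = TestClient(app)

def test_home_page():
    # The home page should load successfully and serve HTML.
    response = client.get("/")
    assert response.status_code == 200
    assert "text/html" in response.headers["content-type"]
```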

2. Hardware section :

2.1 Build a human tracking camera from scratch :

We deploy the YOLOv5 model with an Arduino UNO board. For that, we need :

  • 2 servo motors : used for vertical and horizontal rotation, with a 120-degree rotation range.

  • Arduino UNO board : a microcontroller board mainly used to interact with and control electronic components through an Arduino sketch.

  • 1080p USB camera : running at 30 FPS (frames per second).

  • Connecting cables.

  • Camera shell

  • To control the two servo motors with the Arduino board, we first need to download and install the Arduino IDE and then upload the Arduino code to the Arduino UNO board (you can find my code in src/arduino/arduino_2_motors.ino).

Once done, we set up the configuration (shown below) to connect all the components mentioned above to the laptop; a rough sketch of how the laptop can then drive the servos follows.
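The sketch assumes (this is not taken from the repository) that the Arduino code reads "pan,tilt" angle pairs over the serial link; the port name and the message format are assumptions.

```python
# Illustrative serial link between the laptop and the Arduino UNO (pyserial).
import serial

arduino = serial.Serial(port="/dev/ttyUSB0", baudrate=9600, timeout=1)  # assumed port and baud rate

def send_angles(pan: int, tilt: int) -> None:
    # Clamp both angles to the servos' 120-degree range before sending them.
    pan = max(0, min(120, pan))
    tilt = max(0, min(120, tilt))
    arduino.write(f"{pan},{tilt}\n".encode())  # assumed "pan,tilt" message format
```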

2.2 Vertical & horizontal rotations :

In order to track a human with our camera, we need to define some rules based on the position of the person so that the camera can turn vertically and horizontally. Let's assume that the person is to the left of the camera and that the box around them (in red below) has coordinates ((X1, Y1), (X2, Y2)). We call it the "main_box" :

We then compute a centred version of the box, called the "centred_box", with coordinates ((X1_center, Y1_center), (X2_center, Y2_center)), as shown in the figure below :

So for each vertical and horizontal rotation, we have 3 cases depending on the position of the "main_box" relative to its "centred_box" :

As mentioned in the formulas, we defined a 10% tolerance along the X and Y axes to better stabilize the camera; an illustrative version of these rules is sketched below.
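An illustrative Python version of these three cases, assuming both boxes are given as corner coordinates and using the 10% tolerance mentioned above (names are assumptions):

```python
# Decide how the camera should rotate from the position of "main_box" relative
# to "centred_box", with a 10% tolerance on each axis.
def rotation_commands(main_box, centred_box, tolerance=0.10):
    (x1, y1), (x2, y2) = main_box
    (cx1, cy1), (cx2, cy2) = centred_box
    dx = tolerance * (cx2 - cx1)  # 10% tolerance along the X axis
    dy = tolerance * (cy2 - cy1)  # 10% tolerance along the Y axis

    # Horizontal: 3 cases -> rotate left, rotate right, or stay still.
    if x1 < cx1 - dx:
        horizontal = "left"
    elif x2 > cx2 + dx:
        horizontal = "right"
    else:
        horizontal = "stay"

    # Vertical: 3 cases -> rotate up, rotate down, or stay still.
    if y1 < cy1 - dy:
        vertical = "up"
    elif y2 > cy2 + dy:
        vertical = "down"
    else:
        vertical = "stay"

    return horizontal, vertical
```

For example, with the person on the left as above, the main_box crosses the left edge of the centred_box by more than the tolerance, so the function returns "left" for the horizontal command and the pan servo steps in that direction.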

📈 Performance & results


📪 Contact

For any information, feedback or questions, please contact me.
