
jetson-trashformers's Introduction

What is this project about?

Our project uses neural networks to enable the Robotis BioloidGP to detect trash and throw it away in trash cans, effectively keeping the office environment clean. The current model detects white cups (as trash) against a variety of backgrounds, as well as trashcan symbols. The purpose of this project is to provide a use-case example for developers who wish to build on the Jetson™ platform.

The code in this repository has only been tested on the NVIDIA Jetson TX2.

How can I run this project?

git clone https://github.com/NVIDIA-Jetson/jetson-trashformers.git
cd jetson-trashformers
make
sh runDetect.sh

The first three commands clone and compile the program. The last command runs a script which starts the program. This program can only be run on the Jetson TX2.

When runDetect.sh is run, the robot's webcam activates and begins searching for a cup. If no cup is visible, the robot turns and scans until it finds one. Once it has, it walks toward the cup, determines the cup's orientation, and picks it up once within range. The robot then scans for a trashcan symbol and drops the cup into the trashcan (one sized to suit the robot).
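The behavior above amounts to a small perception-driven state machine. Below is a minimal C++ sketch of that loop; the state names and helper functions (detectCup(), walkToTrashcan(), and so on) are hypothetical stand-ins for the repository's actual camera and servo calls, not its real API.

// Hypothetical sketch of the detect -> approach -> grasp -> dispose loop
// described above. The helpers below are placeholder stubs, not functions
// from this repository.
#include <cstdio>

static bool detectCup()            { return true; }   // stub: cup in view?
static bool cupWithinRange()       { return true; }   // stub: cup in reach?
static bool detectTrashcanSymbol() { return true; }   // stub: symbol in view?
static void turnInPlace()          {}                 // stub: rotate to scan
static void stepForward()          {}                 // stub: walk a step
static void orientToCup()          {}                 // stub: align to cup
static void pickUpCup()            {}                 // stub: close grippers
static void walkToTrashcan()       {}                 // stub: approach can
static void releaseCup()           {}                 // stub: open grippers

enum class State { ScanForCup, ApproachCup, GraspCup, ScanForTrashcan, DropCup, Done };

int main()
{
    State state = State::ScanForCup;

    while (state != State::Done)
    {
        switch (state)
        {
        case State::ScanForCup:        // turn until the webcam sees a cup
            if (detectCup()) state = State::ApproachCup;
            else             turnInPlace();
            break;
        case State::ApproachCup:       // walk until the cup is within reach
            if (cupWithinRange()) state = State::GraspCup;
            else                  stepForward();
            break;
        case State::GraspCup:          // find the cup's orientation, grab it
            orientToCup();
            pickUpCup();
            state = State::ScanForTrashcan;
            break;
        case State::ScanForTrashcan:   // rotate until the symbol is seen
            if (detectTrashcanSymbol()) state = State::DropCup;
            else                        turnInPlace();
            break;
        case State::DropCup:           // approach the trashcan and let go
            walkToTrashcan();
            releaseCup();
            state = State::Done;
            break;
        default:
            state = State::Done;
        }
    }
    std::printf("Cup disposed of.\n");
    return 0;
}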

Watch a demo of the runDetect.sh command

What is TrashNet?

TrashNet is the neural network we created to detect cups and trashcans. First, a network named CupNet was trained on images of cups and common false positives to teach the network what a cup looks like. Once a well-defined model was achieved, we added images of trashcans so the network could detect those as well, building on its previous knowledge of cups. The network was created and trained on NVIDIA DIGITS using the Caffe framework. We followed Dustin Franklin's jetson-inference tutorial to learn about DIGITS and how to create our own network. To learn more about single- and multi-class detection and creating networks with custom data, visit our wiki.
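At runtime, detection goes through the jetson-inference detectNet class that detectnet-camera.cpp wraps. The following is a minimal sketch of loading a DIGITS-trained Caffe model with that API; the file names are placeholders, and the exact Create() parameter list varies across jetson-inference versions, so treat this as illustrative rather than the project's exact code.

// Sketch of loading a DIGITS-trained DetectNet model with jetson-inference.
// File names are assumptions; Create() arguments differ between versions.
#include "detectNet.h"

int main()
{
    // deploy.prototxt / trashnet.caffemodel: assumed names for the network
    // description and trained weights exported from DIGITS.
    detectNet* net = detectNet::Create("deploy.prototxt",
                                       "trashnet.caffemodel",
                                       NULL,     // mean binary (none)
                                       0.5f);    // detection threshold
    if (!net)
        return 1;

    // A real program would now grab camera frames into a CUDA-mapped RGBA
    // buffer and pass them to net->Detect(), as detectnet-camera.cpp does,
    // drawing a bounding box for each cup or trashcan hit.

    delete net;
    return 0;
}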

This graph shows the multi-class model's statistics during the training period.

The model learns to draw bounding boxes around cups through training.

What hardware is required?

  • Robotis BioloidGP
  • Connect Tech Orbitty Carrier Board
  • Jetson™ TX2
  • Logitech C270 Webcam
  • USB2Dynamixel
  • Zig2Serial

Find a hardware setup guide here.

Find a Linux 4 Tegra® with Connect Tech Carrier Board setup guide here.
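For a sense of how the TX2 drives the BioloidGP's Dynamixel servos through the USB2Dynamixel adapter, here is a minimal sketch using the ROBOTIS DynamixelSDK C++ API (the same SDK the lib folder's libdxl_sbc_cpp.so is built from). The port name, servo ID, baud rate, and goal-position address are assumptions for illustration.

// Minimal DynamixelSDK sketch: move one AX-series servo to a goal position.
// Port name, servo ID, and baud rate below are assumptions.
#include <cstdio>
#include <cstdint>
#include "dynamixel_sdk.h"

int main()
{
    dynamixel::PortHandler*   port   = dynamixel::PortHandler::getPortHandler("/dev/ttyUSB0");
    dynamixel::PacketHandler* packet = dynamixel::PacketHandler::getPacketHandler(1.0);  // protocol 1.0

    if (!port->openPort() || !port->setBaudRate(1000000))
    {
        std::printf("Failed to open the USB2Dynamixel port\n");
        return 1;
    }

    uint8_t error = 0;
    // Address 30 is Goal Position (two bytes) on AX-series servos;
    // 512 is roughly the center of the servo's range.
    packet->write2ByteTxRx(port, /*id=*/1, /*address=*/30, /*value=*/512, &error);

    port->closePort();
    return 0;
}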

Libraries Used

See 'lib' folder for the specific files.

  • libdetectnet-camera.so
    • A shared object library containing an edited and compiled version of detectnet-camera.cpp from Dustin Franklin's GitHub repository.
  • libdxl_sbc_cpp.so
    • The ROBOTIS DynamixelSDK C++ shared library, used to command the robot's Dynamixel servos through the USB2Dynamixel adapter.
  • libjetson-inference.so
    • NVIDIA's jetson-inference library, which provides the camera capture and DNN inference that TrashNet runs on.
  • libzgb.a
    • A static library used to control robot commands via ZigBee. (This library is downloaded as a dependency from the ROBOTIS website; a minimal usage sketch follows this list.)
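
As a rough illustration of how libzgb.a is used, the sketch below sends a single command over ZigBee via the ROBOTIS Zigbee SDK's C interface. The device index and command code are assumptions; in practice the value would be an RC-100-style code that the robot's firmware maps to a motion.

// Sketch of sending one command over ZigBee with the ROBOTIS Zigbee SDK.
// The device index and command code are assumptions for illustration.
#include <cstdio>
#include "zgb.h"   // from the ROBOTIS Zigbee SDK (Zig2Serial)

int main()
{
    const int DEV_INDEX = 0;   // serial device index (assumption)
    const int COMMAND   = 1;   // RC-100-style button code (assumption)

    if (zgb_initialize(DEV_INDEX) == 0)
    {
        std::printf("Failed to open the Zig2Serial device\n");
        return 1;
    }

    zgb_tx_data(COMMAND);   // transmit the command to the robot
    zgb_terminate();
    return 0;
}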

Authors

(Left to Right from picture above)

  • Ishan Mitra
  • Shruthi Jaganathan
  • Mark Theis
  • Michael Chacko

Licenses Used

  • ROBOTIS:
SDK obtained from https://github.com/ROBOTIS-GIT/DynamixelSDK on June 29, 2017.

Copyright (c) 2016, ROBOTIS CO., LTD. All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  • Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

  • Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

  • Neither the name of ROBOTIS nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

  • ZigBee
Source code available at http://support.robotis.com/en/software/zigbee_sdk/zig2serial/linux.htm

No license was found as of June 29, 2017.

jetson-trashformers's People

Contributors

ishan22, metheis, michaelchacko, shruthij28


jetson-trashformers's Issues

About the battery option for the TX2

Hello, all

OS: Ubuntu 16.04
Platform: NVIDIA Jetson TX2 with JetPack 3.2.1

I am trying to power the NVIDIA TX2 from a battery. The official documentation gives an input voltage range of 5.5–19.6 V.

I have tried a 3S LiPo (12 V), an SLA battery (12 V), and a car battery (12 V). With all of these, only LED CR5 lights up, and the board does not turn on when I push the power button (LED CR6 flashes faintly).

However, I can power the TX2 and turn it on with a DC power supply set to 12 V.
I can't understand why!

Does anyone know the reason and have a solution?

Thank you
Weixin

Questions About the Wiki Page

Hey, everyone. I have a few questions about the Wiki page that tells us how to flash JetPack and the Orbitty BSP.

  1. Why should we finish flashing JetPack before proceeding to flash the Orbitty BSP? The Wiki page tells us to install JetPack on the host machine and then flash the Jetson, remotely I assume.

  2. How do we flash the Orbitty BSP? The Wiki page is not totally clear about what connections we should have, and neither is the guide from Connect Tech. Should the power be plugged in, for instance?

If anyone can answer those two questions for me, it'd be great. Thanks!

Wenchao
