
hopkira / k9-chess-angular


Angular JS K9 robot controller, designed primarily for use on an iPad or Android touch device as a virtual joystick. Also visualizes the robot's surroundings using camera and sensor data.

Home Page: http://k9-build.blogspot.co.uk/

License: The Unlicense

Languages: HTML 11.40%, JavaScript 28.63%, Python 59.69%, CSS 0.01%, Shell 0.27%
Topics: robotics, robot, k9, doctorwho, doctor-who, espruino, cognitive-robotics, ibm, ibm-watson, ibm-watson-speech

k9-chess-angular's Introduction

K9 Controller

Code and configuration files for a chess playing enhancement to a remote presence robot!

The root directory contains some frequently requested design documents and plans as PDFs, plus index.html, the root page of the end user app.

Directory Structure

chess

Chess profiler by the amazing Maris Van Sprang!

conversation

JSON configuration files for IBM Watson Assistant workspaces

css

CSS files for the end user interface, largely generated and maintained by Ionic

espruino

Embedded JavaScript routines that run on Espruino Picos to offload the overhead of working with low-level sensors from the Pi. This includes a neural net implementation for combining five ultrasonic sensor readings into a position vector. (An illustrative sketch of the moving-average idea follows the table below.)

| File | Description |
| --- | --- |
| moving_avg_net.js | Generated neural net that runs on the Espruino; generates a moving average of readings |
| back_panel.js | Controller for the back panel IR sensors, switches and touch sensors |
| sensors_LIDAR_ears.js | Controller for the LIDAR ears used to see in front of the robot. The program moves the ears, reads the servo potentiometer to work out which direction they are facing, and reads the distance measurements from the LIDAR itself |
| ears only_test.js | Simple controller for the ear servo motors only (no LIDAR) |
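The real moving_avg_net.js is generated Espruino JavaScript, but the underlying smoothing idea is easy to sketch. Below is a minimal, illustrative Python version of a moving-average filter over noisy ultrasonic readings; the window size is an assumption for the example, not taken from the generated net.

```python
# Illustrative sketch only: smooth noisy ultrasonic readings with a simple
# moving average. The real moving_avg_net.js is generated Espruino JavaScript;
# the window size here is an assumption for the example.
from collections import deque

class MovingAverage:
    def __init__(self, window=5):
        self.readings = deque(maxlen=window)

    def update(self, distance_mm):
        """Add one raw reading and return the current smoothed value."""
        self.readings.append(distance_mm)
        return sum(self.readings) / float(len(self.readings))

# One filter per ultrasonic sensor
front_left = MovingAverage()
print(front_left.update(412))   # 412.0
print(front_left.update(398))   # 405.0
```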

img

Visuals for the end user interface, including the default camera image and the SVG for the Sensors tab. Some reference images of the hero prop are also included.

js

The AngularJS JavaScript that provides the bulk of the end user application's functionality

| File | Description |
| --- | --- |
| app.js | Basic structure of the application modules plus some low level functions |
| controllers.js | Each tab has its own controller in this file that responds to user events and manipulates the model. This separation of event handling and model manipulation makes maintenance and problem diagnosis easier. |
| directives.js | There are custom directives for the locked/unlocked icon on each tab (which shows whether communications between browser and dog are working in both directions) and for the joystick on the Motors tab. These directives make the HTML much easier to understand and maintain. |
| services.js | The shared services maintain a model of the state of the dog in the front end app; they also support the creation of sockets between the app and dog and the standardisation of messages flowing over that connection. The translation between sensor readings and the SVG world is also calculated here for display on the Sensors page. |
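As a language-agnostic illustration of the sensor-to-SVG translation that services.js performs (the real code is AngularJS, not Python), the sketch below converts a bearing-and-distance reading into coordinates in a hypothetical SVG viewport; the viewport size and scale are assumptions.

```python
# Illustrative sketch (Python, not the AngularJS service): map a sensor
# reading given as bearing + distance onto SVG viewport coordinates.
# The viewport size and scale are assumptions for the example.
import math

SVG_WIDTH, SVG_HEIGHT = 400, 400   # hypothetical viewBox, in pixels
MM_PER_PIXEL = 10.0                # hypothetical scale

def reading_to_svg(bearing_deg, distance_mm):
    """Robot at the centre of the viewport; 0 degrees points up the page."""
    angle = math.radians(bearing_deg)
    x = SVG_WIDTH / 2 + (distance_mm / MM_PER_PIXEL) * math.sin(angle)
    y = SVG_HEIGHT / 2 - (distance_mm / MM_PER_PIXEL) * math.cos(angle)
    return x, y

print(reading_to_svg(0, 1000))   # directly ahead -> (200.0, 100.0)
print(reading_to_svg(90, 500))   # to the right  -> roughly (250.0, 200.0)
```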

models

This directory contains the high level descriptions, models and schematics for K9. It also provides 3D models in SketchUp and TinkerCAD/3D printing formats to enable the recreation of components.

node-RED

This directory contains the flows that control K9. They route information between the various elements of the dog and co-ordinate movement and speech. The directory also contains the definition of the dashboard shown on K9's screen.

python

This directory contains the Python programs that use the Adafruit PWM Servo Driver and the RoboClaw PID motor controller to make K9 move. A harness is included to generate sensor data to simulate collisions. There are also some simple scripts to interface to Watson Conversation and STT (and to K9's espeak TTS).

| Program | Description |
| --- | --- |
| K9PythonController.py | RoboClaw based motor controller |
| ear_controller.py | Controls K9's ears to collect forward facing LIDAR information |
| logo.py | Translates simple Logo paths into a movement plan for the RoboClaw |
| memory.py | Provides access to K9's short term memory, which stores state and sensor readings |
| status.py | Sends K9's current state to node-RED (and the browser) as a JSON string every 200 ms |
| K9_roboclaw_init.py | Stores PID and motor settings in RoboClaw NVRAM |
| node_RED_harness_ultrasonic.py | Creates simulated LIDAR, IR and ultrasonic sensor readings |
| ttsrobot.py | Uses Snowboy, Watson STT, Conversation and TTS - a bit Alexa like :+) |
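To give a flavour of what a program like status.py might do, here is a minimal sketch of a loop that publishes a JSON state message every 200 ms. The field names and the transport are hypothetical; the real message format is whatever the node-RED flows expect.

```python
# Minimal sketch, not the repo's status.py: publish a JSON status message
# every 200 ms. The field names and the send() transport are hypothetical.
import json
import time

def read_state():
    # Placeholder: on the real robot this would come from the motor
    # controller and the short term memory store.
    return {"speed": 0.0, "heading": 0.0, "battery_v": 12.4}

def status_loop(send, period_s=0.2):
    while True:
        send(json.dumps(read_state()))
        time.sleep(period_s)

# Example: print the messages instead of sending them over a socket
# status_loop(print)
```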

script

Simple deployment scripts to move code into the right place on the Pi

snap

Standard JavaScript library used to integrate SVG and AngularJS on the Sensors page

templates

This directory contains the HTML for each of the tabs of the user interface (including the definition of the tabs themselves!). Keeping the HTML for each tab separate simplifies testing and maintenance.

tessel

Tessel is not currently used on the K9 robot

k9-chess-angular's People

Contributors

amitmangalvedkar, hopkira, marisvs


k9-chess-angular's Issues

Twisted error Reactor not Restartable

So Everything works :) 🥇 Also Happy Easter :)
I run the program (whether it be the loop or the program itself). I say "K9" and Snowboy detects it; then I say "Hello" and he says "Hello, I am K9". Snowboy then waits to detect again. I say "K9" and Snowboy detects it, then it says "lights on". I say "What time is it", but the transcription doesn't show up; it says "lights off" and then this error appears. I know the error is about the Speech to Text service, because I copied the code below and pasted it after TTS is called and it gives the same error. The error happens after it detects "K9" the second time and begins to transcribe, or in the case where I pasted it, it happens immediately upon running the code below. The error is further down.
with open('my_voice.wav') as f:
    speech_to_text.recognize_with_websocket(audio=f, content_type='audio/l16; rate=44100', recognize_callback=mycallback)
while not finished:
    time.sleep(0.1)

Traceback (most recent call last):
  File "/home/pi/K9/k9-chess-angular-master/python/working.py", line 10, in <module>
    import os, sys, subprocess, threading, time, json, re, signal, snowboydecoder, working
  File "/home/pi/K9/k9-chess-angular-master/python/working.py", line 139, in <module>
    speech_to_text.recognize_with_websocket(audio=f,content_type='audio/l16; rate=44100', recognize_callback=mycallback)
  File "/usr/local/lib/python2.7/dist-packages/watson_developer_cloud/speech_to_text_v1.py", line 279, in recognize_with_websocket
    RecognizeListener(audio, options, recognize_callback, url, headers)
  File "/usr/local/lib/python2.7/dist-packages/watson_developer_cloud/websocket/speech_to_text_websocket_listener.py", line 47, in __init__
    reactor.run() # pylint: disable=E1101
  File "/home/pi/.local/lib/python2.7/site-packages/twisted/internet/base.py", line 1242, in run
    self.startRunning(installSignalHandlers=installSignalHandlers)
  File "/home/pi/.local/lib/python2.7/site-packages/twisted/internet/base.py", line 1222, in startRunning
    ReactorBase.startRunning(self)
  File "/home/pi/.local/lib/python2.7/site-packages/twisted/internet/base.py", line 730, in startRunning
    raise error.ReactorNotRestartable()
ReactorNotRestartable
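For context on the error: Twisted's reactor can only be started once per process, and recognize_with_websocket() starts it internally, so calling it a second time in the same process raises ReactorNotRestartable. One hedged workaround (not code from this repo) is to run each recognition in a short-lived child process so it gets a fresh reactor; getting the transcript back to the parent (for example via a multiprocessing.Queue) is omitted here for brevity.

```python
# Hedged workaround sketch, not the repo's code: Twisted's reactor can only
# run once per process and recognize_with_websocket() calls reactor.run()
# internally, so give each recognition its own short-lived process.
# speech_to_text and mycallback are the objects from the snippet above.
from multiprocessing import Process

def transcribe(path):
    with open(path, 'rb') as f:   # audio is best opened in binary mode
        speech_to_text.recognize_with_websocket(
            audio=f,
            content_type='audio/l16; rate=44100',
            recognize_callback=mycallback)

p = Process(target=transcribe, args=('my_voice.wav',))
p.start()
p.join()   # results must come back from mycallback, e.g. via a Queue
```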

Create the first iteration of the Opponent Profiler

The Opponent Profiler should aspire to consist of:

  • A data model for all the things we might want to know about an opponent, and a data store for active players in that model. These should include both chess-specific facts and profiled data, plus (as appropriate) wider human information: as much as we consider potentially useful in gaining an edge over them. A sketch of one possible model follows this list.
  • Interfaces which allow any elements of the player profile to be queried and pulled by other components of the system

To achieve this:

  • It is envisaged OP will have access to a main player database (need to define source of this - it might be problematic to nick Chessbase's!) for the retrieval of player static data (age, nationality, title, current rating perhaps ...)
    (NOT intended for first iteration, but these are the later goals)
  • Chronological data on the player - rating history, events played, achievements (titles, prizes, etc.)
  • Profiled data - mined from game data and elsewhere - worst opponents, best opponents; stats by opening, usage of time, inference of style e.g. - aggressive, positional, strong in endgames, etc.
  • Profiled data can be added to when K-9 plays the opponent because he gets direct, detailed information.
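A minimal sketch of what the data model bullet might look like as a Python 3 dataclass; every field name here is an illustrative assumption, not a committed design.

```python
# Sketch only: one possible shape for an opponent profile record.
# All field names are illustrative assumptions, not a committed design.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class OpponentProfile:
    name: str
    rating: int = 0                    # current rating
    title: str = ""                    # e.g. "FM", "IM"
    nationality: str = ""
    rating_history: List[int] = field(default_factory=list)
    results_by_opening: Dict[str, List[float]] = field(default_factory=dict)
    style_tags: List[str] = field(default_factory=list)   # e.g. ["aggressive"]

    def add_result(self, eco_code, score):
        """Record a game result (1.0 win, 0.5 draw, 0.0 loss) under its ECO code."""
        self.results_by_opening.setdefault(eco_code, []).append(score)
```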

Create a collision detection algorithm for the robot

Minimum definition of done:

  • Works with sensor data provided as per test harness
  • 99.9% accuracy based on simulated data set
  • Robot modeled as 1125mm x 445mm rectangle (see the sketch after these lists)
  • Collision processing on Raspberry Pi
  • Collision detection based on 27 readings at low speeds and 13 readings at high speed

Desirable definition of done:

  • Robot modeled as 8 sided convex polygon or better
  • Collision processing offloaded to Espruino
  • Collision detection based on >27 readings at low speed
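A minimal sketch (not the repo's algorithm) of the kind of footprint check the definitions above imply, assuming obstacle points are already expressed in millimetres in the robot's own frame:

```python
# Illustrative sketch only: flag a collision risk when any sensor-detected
# point falls inside the robot's rectangular footprint swept forward by a
# crude stopping envelope. Dimensions come from the issue (1125 mm x 445 mm);
# the stopping-time figure is an assumption.
ROBOT_LENGTH = 1125.0   # mm, along x (forward)
ROBOT_WIDTH = 445.0     # mm, along y

def collision_risk(points, speed_mm_s, stop_time_s=1.0):
    """points: iterable of (x, y) obstacle positions in mm, robot frame,
    origin at the robot's centre, x positive forwards."""
    stop_distance = max(speed_mm_s, 0.0) * stop_time_s
    x_max = ROBOT_LENGTH / 2 + stop_distance
    y_max = ROBOT_WIDTH / 2
    return any(-ROBOT_LENGTH / 2 <= x <= x_max and abs(y) <= y_max
               for x, y in points)

print(collision_risk([(800.0, 100.0)], speed_mm_s=500.0))   # True
print(collision_risk([(800.0, 400.0)], speed_mm_s=500.0))   # False (off to the side)
```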

Create a safe-driving robot aware of its surroundings

Enhance the existing robot with the ability to sense its environment and make automated changes to its direction and speed based on input from sensors. This should include the ability to stop and make minor corrections to avoid obstacles. It should also include the ability to stop the robot from entering into dangerous manoeuvres: spinning or turning at high speed, spinning when there is not enough room, or reversing at high speed.
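One way to express these rules is as a command filter that sits between the joystick/autopilot and the motor controller. The sketch below is illustrative only; all thresholds are assumptions, not values from the repo.

```python
# Illustrative command filter, not the repo's code: vet a requested
# (speed, turn_rate) command against simple safety rules before it reaches
# the motor controller. All thresholds are assumptions.
def vet_command(speed, turn_rate, min_clearance_mm):
    """speed in m/s (negative = reverse), turn_rate in rad/s."""
    if min_clearance_mm < 300 and abs(turn_rate) > 0.5:
        turn_rate = 0.0                               # no spinning in tight spaces
    if speed < -0.3:
        speed = -0.3                                  # no reversing at high speed
    if abs(speed) > 0.5 and abs(turn_rate) > 0.5:
        turn_rate = 0.5 if turn_rate > 0 else -0.5    # no spin-turns at high speed
    return speed, turn_rate

print(vet_command(-1.0, 1.2, 250))   # -> (-0.3, 0.0)
```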

Create Event Control web app

This is envisaged as an "administrative" web UI for the setup and control of a chess event (a match, or in future other event types, e.g. training). Whilst other modes of interaction with K-9's chess functions will later be made available (e.g. voice), the Control App is intended to be the first and foremost means of controlling match setup and managing matches as they progress.

Errors Return :(

I have no idea what happened. I've been keeping up with your commits, adding them in when you commit them, but this error has returned now:
Traceback (most recent call last):
  File "/home/pi/K9/k9-chess-angular-master/python/lastone.py", line 126, in <module>
    answer = results.group(1)
AttributeError: 'NoneType' object has no attribute 'group'
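The traceback means the regular expression search on line 126 returned None (no match) and .group(1) was then called on it. The usual guard is to check the match object first; the sketch below is generic and the pattern shown is hypothetical, not the one in lastone.py.

```python
# Sketch of the usual guard for this error: re.search() returns None when the
# pattern does not match, so test the result before calling .group().
# The pattern here is hypothetical, not the one used in lastone.py.
import re

def extract_answer(text, pattern=r'"text":\s*"([^"]*)"'):
    results = re.search(pattern, text)
    if results is None:
        return None   # or log the raw text and fall back to a default reply
    return results.group(1)

print(extract_answer('{"text": "Affirmative, master"}'))   # Affirmative, master
print(extract_answer('unexpected response'))               # None
```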

Create a follow-me robot that responds automatically to an ultrasonic transducer

Replace the manual guidance of the robot with the ability to follow a hand-held ultrasonic transducer (transmitter). This will require multiple ultrasonic receivers around the body of the robot to enable the direction of the sound to be quickly deduced.
The resulting activity should be constrained by the same safe driving algorithms as the manual control.
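A hedged sketch of one way the direction could be deduced, assuming each receiver reports a signal strength and is mounted at a known angle around the body; a time-of-arrival approach would be an alternative and is not shown.

```python
# Illustrative sketch only: estimate the transmitter bearing from the signal
# strength seen by receivers mounted at known angles around the robot's body.
# A time-of-arrival approach is an alternative; this shows the simplest idea.
import math

def estimate_bearing(readings):
    """readings: list of (mount_angle_deg, signal_strength) tuples."""
    x = sum(s * math.cos(math.radians(a)) for a, s in readings)
    y = sum(s * math.sin(math.radians(a)) for a, s in readings)
    return math.degrees(math.atan2(y, x)) % 360

# Four receivers at 0/90/180/270 degrees; strongest signal ahead and to one side
print(estimate_bearing([(0, 0.9), (90, 0.6), (180, 0.1), (270, 0.2)]))   # ~26.6 degrees
```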

Create motherboard mk2

The motherboard holds all of K9's key processing and interface components on a single removable board so that he can easily be separated from the shell and the motor board

Create 'Follow Me' Page

Create a follow me page on the Angular 2 framework.

Page should include:
  • K9 eye camera
  • visualization of proximity sensor feeds
  • visualization of ultrasonic follow me signals
  • basic minimum controls (heel/stay, emergency stop, get out of the way)

Getting more done in GitHub with ZenHub

Hola! @hopkira has created a ZenHub account for the hopkira organization. ZenHub is the only project management tool integrated natively in GitHub – created specifically for fast-moving, software-driven teams.


How do I use ZenHub?

To get set up with ZenHub, all you have to do is download the browser extension and log in with your GitHub account. Once you do, you’ll get access to ZenHub’s complete feature-set immediately.

What can ZenHub do?

ZenHub adds a series of enhancements directly inside the GitHub UI:

  • Real-time, customizable task boards for GitHub issues;
  • Multi-Repository burndown charts, estimates, and velocity tracking based on GitHub Milestones;
  • Personal to-do lists and task prioritization;
  • Time-saving shortcuts – like a quick repo switcher, a “Move issue” button, and much more.

Add ZenHub to GitHub

Still curious? See more ZenHub features or read user reviews. This issue was written by your friendly ZenHub bot, posted by request from @hopkira.

ZenHub Board
