
spark-protocol's Introduction

spark-protocol

Node.js module for hosting direct encrypted CoAP socket connections! Check out the local spark-server.

                          __      __        __              __
   _________  ____ ______/ /__   / /___  __/ /_  ___  _____/ /
  / ___/ __ \/ __ `/ ___/ //_/  / __/ / / / __ \/ _ \/ ___/ / 
 (__  ) /_/ / /_/ / /  / , |   / /_/ /_/ / /_/ /  __(__  )_/  
/____/ .___/\__,_/_/  /_/|_|   \__/\__,_/_.___/\___/____(_)   
    /_/                                                       

What do I need to know?

This module knows how to talk encrypted CoAP. It's really good at talking with Spark Cores and any other hardware that uses this protocol. You'll need a server key pair, and the public half must be loaded onto your devices. You'll also need to grab the public keys for your connected devices and store them somewhere this module can find them. The server public key stored on a device can also embed an IP address or DNS name for your server, so make sure you include that address when copying the key to your device. If you don't have a server key when the server starts up, it will generate a default one for you.
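
If you'd rather create a server key yourself than rely on the generated default, here's a minimal sketch using Node's built-in crypto module (Node 10+ for generateKeyPairSync). The key size and encodings are illustrative assumptions, not necessarily what this module generates; default_key.pem matches the filename the server reports loading at startup.

var crypto = require("crypto");
var fs = require("fs");

// Generate an RSA key pair for the server. ASSUMPTION: 2048-bit RSA with
// PEM encodings; the module's own default-key generation may differ.
var pair = crypto.generateKeyPairSync("rsa", {
    modulusLength: 2048,
    publicKeyEncoding: { type: "spki", format: "pem" },
    privateKeyEncoding: { type: "pkcs8", format: "pem" }
});

// "default_key.pem" is the filename the server logs when it loads its key.
fs.writeFileSync("default_key.pem", pair.privateKey);
fs.writeFileSync("default_key.pub.pem", pair.publicKey);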

What code modules should I start with?

There's lots of fun stuff here, but in particular you should know about the "SparkCore" (https://github.com/spark/spark-protocol/blob/master/js/clients/SparkCore.js) and "DeviceServer" (https://github.com/spark/spark-protocol/blob/master/js/server/DeviceServer.js) modules. The "DeviceServer" module runs a server that creates "SparkCore" objects, each of which represents a connected device.

How do I start a server in code?

// Pull in the DeviceServer constructor from spark-protocol.
var DeviceServer = require("spark-protocol").DeviceServer;

// Point the server at the directory holding your devices' public keys.
var server = new DeviceServer({
    coreKeysDir: "/path/to/your/public_device_keys"
});

// Keep a global reference so other modules can reach the running server.
global.server = server;
server.start();
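
Once the server is running, it creates a "SparkCore" object for each device that connects. Here's a minimal sketch of looking one up, assuming the server exposes a getCore(coreID) lookup (this appears to be how the companion spark-server project fetches devices; verify against DeviceServer.js in your checkout before relying on it):

// Hedged sketch: fetch the SparkCore object for a connected device.
// ASSUMPTION: the server exposes getCore(coreID); check DeviceServer.js first.
var coreID = "theDeviceIDYouAreLookingFor"; // hypothetical placeholder
var core = global.server.getCore(coreID);
if (core) {
    console.log("found a connected core:", coreID);
}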

How do I get my key / ip address on my core?

1.) Figure out your IP address. For now, let's say it's 192.168.1.10

2.) Make sure you have the Spark-CLI (https://github.com/spark/spark-cli) installed

3.) Connect your Spark Core to your computer and put it in DFU mode (http://docs.spark.io/connect/#appendix-dfu-mode-device-firmware-upgrade)

4.) Load the server key, including your IP address / DNS address:

spark keys server server_public_key.der your_ip_address

For example:

spark keys server server_public_key.der 192.168.1.10

5.) That's it!

Where's the API / webserver stuff? Isn't this just a TCP server?

Oh, you want the Spark-Server module here: https://github.com/spark/spark-server :)

spark-protocol's People

Contributors

brycekahle, dmiddlecamp


spark-protocol's Issues

Comments about size and initial settings for counter incorrect

In trying to understand the protocol I noticed the following comments around https://github.com/spark/spark-protocol/blob/master/js/lib/Handshake.js#L62

 * Core creates a protobufs Hello with counter set to the uint32 represented by the most significant 4 bytes of the IV, encrypts the protobufs Hello with AES, and sends the ciphertext to Server.
 * Server reads protobufs Hello from socket, taking note of counter.  Each subsequent message received from Core must have the counter incremented by 1. After the max uint32, the next message should set the counter to zero.
 * Server creates protobufs Hello with counter set to a random uint32, encrypts the protobufs Hello with AES, and sends the ciphertext to Core.
 * Core reads protobufs Hello from socket, taking note of counter.  Each subsequent message received from Server must have the counter incremented by 1. After the max uint32, the next message should set the counter to zero.

However, as far as I can tell, the counter is a uint16, and the firmware grabs the two most significant bytes of the SALT, not the top four bytes of the IV; there's a quick sketch of the difference after the links below. It would be great if these comments could be updated, since they could throw someone else off otherwise. (If I'm wrong, please let me know, of course.)

What led me to believe this:
https://github.com/spark/firmware/blob/release/0.4.3/communication/src/spark_protocol.cpp#L1636
https://github.com/spark/spark-protocol/blob/master/js/settings.js#L39
https://github.com/spark/spark-protocol/blob/master/js/clients/SparkCore.js#L373
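
A minimal sketch of the difference, with made-up buffer values; the exact layout of the handshake material is an assumption based on the links above, not something verified against the firmware:

// Illustrative sketch only; the IV and salt values here are hypothetical.
var iv = Buffer.from("000102030405060708090a0b0c0d0e0f", "hex"); // 16 bytes
var salt = Buffer.from("a1b2c3d4e5f60718", "hex");               // 8 bytes

// What the comments in Handshake.js describe:
var counterPerComments = iv.readUInt32BE(0);   // uint32 from top 4 bytes of IV

// What the firmware appears to actually do:
var counterPerFirmware = salt.readUInt16BE(0); // uint16 from top 2 bytes of salt

// Either way, each subsequent message increments the counter, wrapping at the max.
counterPerFirmware = (counterPerFirmware + 1) & 0xffff;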

node-coap vs h5.coap

Hi there, I'm the author of node-coap, a CoAP implementation for Node that resembles the stock HTTP module API. h5.coap is not published on npm and generally isn't really maintained. Moreover, node-coap has a server counterpart, which h5.coap lacked (morkai/h5.coap#2; it was easier for me to implement my own CoAP library than to contribute there). I'd be really interested to hear why you preferred h5.coap to node-coap :).
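
For anyone comparing the two, here's roughly what node-coap's HTTP-like API looks like, adapted from its README at the time; check the package's current docs, since the API may have moved on:

var coap = require("coap");

// A tiny CoAP server that echoes part of the request URL back to the client.
var server = coap.createServer();
server.on("request", function (req, res) {
    res.end("Hello " + req.url.split("/")[1] + "\n");
});

// CoAP's default UDP port is 5683, the same port this project listens on.
server.listen(function () {
    var req = coap.request("coap://localhost/World");
    req.on("response", function (res) {
        res.pipe(process.stdout);
    });
    req.end();
});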

Can't connect a Core

Running the example code, my Core will not stay connected. The Core is using the latest firmware from the master branch. All keys are set up. Output below:

server started { host: 'localhost', port: 5683 }
true
Connection from: ::ffff:192.168.1.23, connId: 1
CryptoStream transform error TypeError: Cannot read property 'length' of null
CryptoStream transform error TypeError: Cannot read property 'length' of null
on ready { coreID: 'id#',
  ip: '::ffff:192.168.1.23',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: '_0' }
Core online!
CryptoStream transform error Error: error:06065064:digital envelope routines:EVP_DecryptFinal_ex:bad decrypt

and it just continues this loop:

onSocketData called, but no data sent.
1: Core disconnected: socket close false { coreID: '55ff6c066678505537381667',
  cache_key: '_0',
  duration: 25.035 }
Session ended for _0
Connection from: ::ffff:192.168.1.23, connId: 2
CryptoStream transform error TypeError: Cannot read property 'length' of null
CryptoStream transform error TypeError: Cannot read property 'length' of null
on ready { coreID: 'id#',
  ip: '::ffff:192.168.1.23',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: '_1' }
Core online!
CryptoStream transform error Error: error:06065064:digital envelope routines:EVP_DecryptFinal_ex:bad decrypt

Any help would be very appreciated.

Issue with photon (0.5) after publish

I hit this issue immediately after a publish:

Connection from: ::ffff:192.168.2.13, connId: 2
on ready { coreID: '',
ip: '::ffff:192.168.2.13',
product_id: 6,
firmware_version: 65535,
cache_key: '_1' }
Core online!
onSocketData called, but no data sent.
routeMessage got a NULL coap message { coreID: '' }
1: Core disconnected: socket close false { coreID: '',
cache_key: '_1',
duration: 0.722 }
Session ended for _1
Connection from: ::ffff:192.168.2.13, connId: 3
on ready { coreID: '',
ip: '::ffff:192.168.2.13',
product_id: 6,
firmware_version: 65535,
cache_key: '_2' }
Core online!
onSocketData called, but no data sent.
routeMessage got a NULL coap message { coreID: '' }

Error: Handshake failed: read_coreid timed out { ip: '10.0.1.115', cache_key: '_2', coreID: null }

I've been trying to connect my spark to the raw protocol server. Here are the steps I've been following.

  1. Use the spark-cli command: spark keys new to create a new set of keys.
  2. Place spark into DFU mode: hold both buttons and release RST until the LED flashes yellow
  3. Use the spark-cli command: spark keys server core.der $MY_IP

Result of Key upload

checking file  data/core10_0_1_3.der
spawning dfu-util -d 1d50:607f -a 1 -i 0 -s 0x00001000 -D data/core10_0_1_3.der
dfu-util 0.7

Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
Copyright 2010-2012 Tormod Volden and Stefan Schmidt
This program is Free Software and has ABSOLUTELY NO WARRANTY
Please report bugs to [email protected]

Filter on vendor = 0x1d50 product = 0x607f
Opening DFU capable USB device... ID 1d50:607f
Run-time device DFU version 011a
Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=1, name="@SPI Flash : SST25x/0x00000000/512*04Kg"
Claiming USB DFU Interface...
Setting Alternate Setting #1 ...
Determining device status: state = dfuERROR, status = 10
dfuERROR, clearing status
Determining device status: state = dfuIDLE, status = 0
dfuIDLE, continuing
DFU mode device DFU version 011a
Device returned transfer size 1024
No valid DFU suffix signature
Warning: File has no DFU suffix
DfuSe interface name: "SPI Flash : SST25x"
Downloading to address = 0x00001000, size = 1024
.
File downloaded successfully
Okay!  New keys in place, your core will not restart.
  4. Next, I use the sample code contained in the repo as the base of my server. (All keys live in the main directory.)
var DeviceServer = require('spark-protocol').DeviceServer;
var server = new DeviceServer({
  coreKeysDir: '.'
});
global.server = server;
server.start();
  5. I run the script with node index.js

Here is the resulting output.

static class init!
found default_key
found default_key
Loading server key from default_key.pem
set server key
server public key is:  -----BEGIN PUBLIC KEY-----
OMITTED
-----END PUBLIC KEY-----

server started { host: 'localhost', port: 5683 }
Connection from: 10.0.1.115, connId: 1
Handshaking
Handshake decryption error:  [Error: error:0407106B:rsa routines:RSA_padding_check_PKCS1_type_2:block type is not 02]
1: Core disconnected: decryption failed { coreID: 'unknown', cache_key: '_0' }
Session ended for _0
Handshake failed:  decryption failed { ip: '10.0.1.115', cache_key: '_0', coreID: null }
Connection from: 10.0.1.115, connId: 2
Handshaking
Handshake decryption error:  [Error: error:0407106B:rsa routines:RSA_padding_check_PKCS1_type_2:block type is not 02]
1: Core disconnected: decryption failed { coreID: 'unknown', cache_key: '_1' }
Session ended for _1
Handshake failed:  decryption failed { ip: '10.0.1.115', cache_key: '_1', coreID: null }
Connection from: 10.0.1.115, connId: 3
Handshaking
1: Core disconnected: read_coreid timed out { coreID: 'unknown', cache_key: '_2' }
Session ended for _2
Handshake failed:  read_coreid timed out { ip: '10.0.1.115', cache_key: '_2', coreID: null }

Two questions:

  1. How can I troubleshoot the handshake failing?
  2. How can I get access to instances of core devices created by the server?

Thanks for your help!

-Matt

Licensing Issues

Hello!

I'm very eager to try out spark-protocol in a project that I'm working on, but I've found the following problem. In your LICENSE, you state the license for the project is LGPLv3. However, the header on individual files says the license is GPLv3. Here is the header in question.

/*
*   Copyright (C) 2013-2014 Spark Labs, Inc. All rights reserved. -  https://www.spark.io/
*
*   This file is part of the Spark-protocol module
*
*   This program is free software: you can redistribute it and/or modify
*   it under the terms of the GNU General Public License version 3
*   as published by the Free Software Foundation.
*
*   Spark-protocol is distributed in the hope that it will be useful,
*   but WITHOUT ANY WARRANTY; without even the implied warranty of
*   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
*   GNU General Public License for more details.
*
*   You should have received a copy of the GNU General Public License
*   along with Spark-protocol.  If not, see <http://www.gnu.org/licenses/>.
*
*   You can download the source here: https://github.com/spark/spark-protocol
*/

Is this project dual licensed, or is it licensed under LGPLv3 or GPLv3? Thank you.
