
nengo-examples's Introduction


Nengo: Large-scale brain modelling in Python

An illustration of the three principles of the NEF

Nengo is a Python library for building and simulating large-scale neural models. Nengo can create sophisticated spiking and non-spiking neural simulations with sensible defaults in a few lines of code. Yet, Nengo is highly extensible and flexible. You can define your own neuron types and learning rules, get input directly from hardware, build and run deep neural networks, drive robots, and even simulate your model on a completely different neural simulator or neuromorphic hardware.

Installation

Nengo depends on NumPy, and we recommend that you install NumPy before installing Nengo. If you're not sure how to do this, we recommend using Anaconda.

To install Nengo:

pip install nengo

If you have difficulty installing Nengo or NumPy, please read the more detailed Nengo installation instructions first.

If you'd like to install Nengo from source, please read the developer installation instructions.

Nengo is tested to work on Python 3.6 and above. Python 2.7 and Python 3.4 were supported up to and including Nengo 2.8.0. Python 3.5 was supported up to and including Nengo 3.1.

Examples

Here are six of many examples showing how Nengo enables the creation and simulation of large-scale neural models in a few lines of code; a minimal sketch of the first example follows the list.

  1. 100 LIF neurons representing a sine wave
  2. Computing the square across a neural connection
  3. Controlled oscillatory dynamics with a recurrent connection
  4. Learning a communication channel with the PES rule
  5. Simple question answering with the Semantic Pointer Architecture
  6. A summary of the principles underlying all of these examples
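
For a taste of what the first example looks like, here is a minimal sketch of 100 LIF neurons representing a sine wave (the input frequency and probe filter are illustrative choices, not taken from the example notebook):

import numpy as np
import nengo

with nengo.Network() as model:
    # Node supplying the sine wave input
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))
    # 100 leaky integrate-and-fire neurons (the default neuron type)
    # representing the one-dimensional signal
    ens = nengo.Ensemble(n_neurons=100, dimensions=1)
    nengo.Connection(stim, ens)
    # Probe the decoded value with a small synaptic filter
    probe = nengo.Probe(ens, synapse=0.01)

with nengo.Simulator(model) as sim:
    sim.run(1.0)  # simulate one second

print(sim.data[probe].shape)  # (1000, 1): decoded estimates of sin(2*pi*t)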

Documentation

Usage and API documentation can be found at https://www.nengo.ai/nengo/.

To build the documentation yourself, run the following command:

python setup.py build_sphinx

This requires Pandoc to be installed, as well as some additional Python packages. For more details, see the Developer Guide.

Development

Information for current or prospective developers can be found at https://www.nengo.ai/contributing/.

Getting Help

Questions relating to Nengo, whether about its use or its development, should be asked on the Nengo forum at https://forum.nengo.ai.

nengo-examples's People

Contributors

clvcooke, drasmuss, s72sue, seanny123, tbekolay, xchoo


nengo-examples's Issues

finish() got an unexpected keyword argument 'dirty'

Hi,
I think this is a version issue with the progressbar2 package being used. I am trying to run the speech example where you train the model.
This is the error output:

TypeError                                 Traceback (most recent call last)
<ipython-input-10-16285c31d209> in <module>
      1 # create a Nengo DL simulator and set the minibatch size
----> 2 with nengo_dl.Simulator(net, minibatch_size=100) as sim:
      3 
      4     # define an optimizer
      5     optimizer = tf.train.RMSPropOptimizer(0.001)

~/anaconda3/envs/speech-examples/lib/python3.6/site-packages/nengo_dl/simulator.py in __init__(self, network, dt, seed, model, dtype, device, unroll_simulation, minibatch_size, tensorboard, progress_bar)
    153                 self.minibatch_size,
    154                 device,
--> 155                 progress,
    156             )
    157 

~/anaconda3/envs/speech-examples/lib/python3.6/site-packages/nengo_dl/tensor_graph.py in __init__(self, model, dt, unroll_simulation, dtype, minibatch_size, device, progress)
    140                 old_operators = operators
    141                 for simp in simplifications:
--> 142                     operators = simp(operators)
    143 
    144         # group mergeable operators

~/anaconda3/envs/speech-examples/lib/python3.6/site-packages/progressbar/bar.py in __exit__(self, exc_type, exc_value, traceback)

TypeError: finish() got an unexpected keyword argument 'dirty'

Something is wrong with the versions used in your instructions. Also, you should consider mentioning that Nengo 2.8.0 is required.
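
For reference, pinning that release is a one-line change (whether progressbar2 or nengo-dl also need specific versions, I am not sure):

pip install nengo==2.8.0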

Thanks for your help,
Julian

The importance of subdimensions in SPA memories

As @xchoo explained it, integrators work by finding fixed points in the representation space. Neural noise means it's possible to escape these fixed points. This escape/drift is easier in higher dimensions. Consequently, if you want a non-drifty memory in high dimensions, you should use one ensemble per dimension:

    mem = spa.State(vocab, subdimensions=1,
        represent_identity=False, feedback=1, label="mem")

Example of how to detect if a SPA memory is empty

import nengo_spa as spa
import nengo
import numpy as np

D = 32

# Build a small vocabulary of semantic pointers
sym_keys = {'ONE', 'TWO', 'THREE', 'FOUR'}
vocab = spa.Vocabulary(D)
vocab.populate(";".join(sym_keys))

with spa.Network() as model:

    in_state = spa.State(vocab, represent_identity=False, label="input")
    # Memory built from one ensemble per dimension (subdimensions=1),
    # as discussed above, with feedback=1 so it acts as an integrator
    mem = spa.State(vocab, subdimensions=1,
                    represent_identity=False, feedback=1, label="mem")

    in_state >> mem

    # Thresholding ensemble acting as the "memory is non-empty" detector
    with nengo.presets.ThresholdingEnsembles(0.09):
        ens = nengo.Ensemble(300, 1, radius=1)

    # Sum the absolute value of the memory's dimensions, normalized by D/2
    def sum_dims(x):
        return np.sum(np.abs(x)) / (D / 2)

    # Feed every one-dimensional memory ensemble into the detector
    for mem_ens in mem.all_ensembles:
        nengo.Connection(mem_ens, ens, function=sum_dims)
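
To actually read the detector out during a simulation, a probe could be added inside the same spa.Network() block; this is a hypothetical addition, not part of the original snippet:

    # Hypothetical: decoded values near zero indicate an (almost) empty memory,
    # while values above the 0.09 threshold indicate the memory holds content.
    non_empty = nengo.Probe(ens, synapse=0.03)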

Make this repo pip-installable

This requires organizing the files under a good root with an __init__.py, and writing a setup.py.

A few things that need to be hashed out before that can be done:

  • How do we deal with extra dependencies some examples might have?
  • How do we deal with different filetypes? .ipynb I'm looking at you.

Some thoughts to kick off the discussion...

For the first point, we should definitely have nengo and nengo_gui as requirements in setup.py, but probably only those two. For examples that use other packages (nengo_spinnaker, etc), we should have some helper functions in this repo to check if certain things are installed and give helpful messages if they're not.
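
A minimal sketch of such a helper (the function name and wording are illustrative, not something that exists in this repo yet):

import importlib

def require(package, hint=None):
    """Import an optional example dependency, or fail with a helpful message."""
    try:
        return importlib.import_module(package)
    except ImportError as err:
        msg = "This example requires '{}', which is not installed.".format(package)
        if hint:
            msg += " Try: {}".format(hint)
        raise ImportError(msg) from err

# e.g. nengo_spinnaker = require("nengo_spinnaker", hint="pip install nengo_spinnaker")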

For the second point, I think for the time being we just make sure to include them in MANIFEST.in so that they show up when the repo is pip installed. In the future, it would be nice to include some helper functions or something that can run .ipynb files or that can load up the GUI with the given .py file etc.
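
As a concrete starting point, setup.py could be as small as the sketch below; the package name, version, and use of find_packages are placeholders for whatever root layout we settle on, with only nengo and nengo_gui as hard requirements as suggested above:

from setuptools import setup, find_packages

setup(
    name="nengo-examples",
    version="0.1.0.dev0",
    packages=find_packages(),
    include_package_data=True,  # ship the .ipynb files listed in MANIFEST.in
    install_requires=["nengo", "nengo_gui"],
)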

Which versions to use?

Hi,
I just started out with Nengo, and when installing all the packages it is not clear which versions should be used.
This is how you can reproduce my issue:
Execute pip install for nengo and nengo_dl and use some environment where you have matplotlib and numpy.
For me, this installed nengo 3.0, nengo-dl 3.2, and a TensorFlow version >= 2.0.

Then, try executing this code:

(This is a notebook, but you can also treat it as a normal .py file I guess)

#!/usr/bin/env python
# coding: utf-8

# In[10]:


# - Imports
import warnings
warnings.filterwarnings('ignore')
import nengo
import nengo_dl
import numpy as np
import matplotlib.pyplot as plt


# In[11]:


def weight_init(shape):
    '''Convenience function for randomly initializing weights'''
    weights = np.random.uniform(-0.05, 0.05, size=shape)
    return weights

def generate_xor_sample(total_duration, dt, amplitude=1, use_smooth=True, plot=False):
    """
    Generates a temporal XOR signal
    """
    input_duration = 2/3*total_duration
    # Create a time base
    t = np.linspace(0,total_duration, int(total_duration/dt)+1)
    
    first_duration = np.random.uniform(low=input_duration/10, high=input_duration/4 )
    second_duration = np.random.uniform(low=input_duration/10, high=input_duration/4 )

    end_first = np.random.uniform(low=first_duration, high=2/3*input_duration-second_duration)
    start_first = end_first - first_duration

    start_second = np.random.uniform(low=end_first + 0.1, high=2/3*input_duration-second_duration) # At least 200 ms break
    end_second = start_second+second_duration

    data = np.zeros(int(total_duration/dt)+1)

    i1 = np.random.rand() > 0.5
    i2 = np.random.rand() > 0.5
    response = (((not i1) and i2) or (i1 and (not i2)))
    if(i1):
        a1 = 1
    else:
        a1 = -1
    if(i2):
        a2 = 1
    else:
        a2 = -1

    input_label = 0
    if(a1==1 and a2==1):
        input_label = 0
    elif(a1==1 and a2==-1):
        input_label = 1
    elif(a1==-1 and a2==1):
        input_label = 2
    else:
        input_label = 3

    data[(start_first <= t) & (t < end_first)] = a1
    data[(start_second <= t) & (t < end_second)] = a2

    if(use_smooth):
        sigma = 10
        w = (1/(sigma*np.sqrt(2*np.pi)))* np.exp(-((np.linspace(1,1000,int(1/dt))-500)**2)/(2*sigma**2))
        w = w / np.sum(w)
        data = amplitude*np.convolve(data, w, "same")
    else:
        data *= amplitude

    target = np.zeros(int(total_duration/dt)+1)
    if(response):
        ar = 1.0
    else:
        ar = -1.0
    
    target[int(1/dt*(end_second+0.05)):int(1/dt*(end_second))+int(1/dt*0.3)] = ar
    sigma = 20
    w = (1/(sigma*np.sqrt(2*np.pi)))* np.exp(-((np.linspace(1,1000,int(1/dt))-500)**2)/(2*sigma**2))
    w = w / np.sum(w)
    target = np.convolve(target, w, "same")
    target /= np.max(np.abs(target))

    if(plot):
        eps = 0.05
        fig = plt.figure(figsize=(10,4))
        plt.subplot(121)
        plt.plot(t, data)
        plt.ylim([-amplitude-eps, amplitude+eps])
        plt.subplot(122)
        plt.plot(t, target)
        plt.show()

    return (data[:int(total_duration/dt)], target[:int(total_duration/dt)], input_label)


# In[12]:


# - Check the input/target
_,_,_ = generate_xor_sample(1.0,dt=0.001,plot=True)


# In[13]:


inp_dim = 1
out_dim = 1

n_neurons = 384
max_rate = 250
amplitude = 1 / max_rate

mem_tau = 0.05 # s
syn_tau = 0.07 # s

duration = 1.0 # s

# - Tau_rc is the membrane TC, tau_ref is the refractory period
lifs = nengo.LIF(tau_rc=mem_tau, tau_ref=0.00, amplitude=amplitude)


# In[14]:


# - Network connectivity, only one recurrently connected ensemble
with nengo.Network() as net:
    net.config[nengo.Connection].synapse = nengo.synapses.Lowpass(tau=syn_tau)
    net.config[nengo.Ensemble].max_rates = nengo.dists.Choice([max_rate])
    net.config[nengo.Ensemble].intercepts = nengo.dists.Choice([0])

    inp = nengo.Node(np.zeros(inp_dim))
    ens = nengo.Ensemble(n_neurons=n_neurons, dimensions=1, neuron_type=lifs)
    out = nengo.Node(size_in=out_dim)

    conn_a = nengo.Connection(
        inp, ens.neurons, transform=weight_init(shape=(n_neurons, inp_dim)))

    conn_b = nengo.Connection(
        ens.neurons, out, transform=weight_init(shape=(out_dim, n_neurons)))
    
    probe_out = nengo.Probe(out, synapse=0.01)
    probe_spikes = nengo.Probe(ens.neurons)


# In[15]:


# - Pass a sample data point through the random network and plot the outputs
with nengo_dl.Simulator(net) as sim:
    # Run it for 1 second
    sample_xor_input, _, _ = generate_xor_sample(1.0, dt=0.001)
    sim.run(len(sample_xor_input), data={inp: sample_xor_input}, progress_bar=False)

Then, when trying to run a simulation using the untrained network, this error occurs (I am not including the full trace):

~/anaconda3/envs/nengo/lib/python3.6/site-packages/nengo_dl/simulator.py in _generate_inputs(self, data, n_steps)
   1776             data_batch = data_steps = None
   1777         else:
-> 1778             data_batch, data_steps = next(iter(data.values())).shape[:2]
   1779 
   1780         batch_size = self.minibatch_size if data_batch is None else data_batch

ValueError: not enough values to unpack (expected 2, got 1)
