yabata / pyrenn
A Recurrent Neural Network Toolbox for Python and Matlab
License: GNU General Public License v3.0
I was reading through the documentation and found a few things I believe to be mistakes.
In this image, the middle weight going from layer 3 to layer 2 should be LW 2,3 [d] rather than LW 1,2 [d]:
https://raw.githubusercontent.com/yabata/pyrenn/master/doc/img/recurrent_nn.png
Here layer 1 is missing its bias values
https://github.com/yabata/pyrenn/blob/master/doc/img/MLP2221_detailed.png
And while the example in the doc says LW 3,2 has 4 weights, there are only 2 in the corresponding example image (2 neurons in the M-1th hidden layer connect to a single neuron in the output layer). Does this mean it is still stored in a 2x2 matrix, with two of those weights actually set to zero?
Is there a way to terminate the training of a network if the error rate has converged to a certain significant figure?
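As far as I can tell, train_LM only exposes k_max and E_stop as stopping criteria, but a convergence-based stop can be approximated by training in short chunks and checking how much the error still changes between chunks. The converged helper below is a hypothetical sketch (the commented loop assumes pyrenn's train_LM and NNOut, and that repeated calls continue from the current weights in net):

```python
import numpy as np

def converged(errors, tol=1e-6):
    """True once the last two recorded errors differ by less than tol."""
    return len(errors) >= 2 and abs(errors[-1] - errors[-2]) < tol

# Sketch of usage with pyrenn (assumption: train_LM resumes from net['w']):
#   errors = []
#   for _ in range(100):
#       net = prn.train_LM(P, Y, net, verbose=False, k_max=10, E_stop=0.0)
#       errors.append(np.mean((prn.NNOut(P, net) - Y) ** 2))
#       if converged(errors):
#           break
```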
Contrary to what the documentation says, both saveNN and loadNN import pandas as pd, although only loadNN seems to use it:
w = pd.read_csv(...)
Could you replace the pandas dependency and use only the standard csv reader/writer?
It would be nice to include something like this
$ diff pyrenn.py pyrenn-original.py
3c3
< def CreateNN(nn,dIn=[0],dIntern=[],dOut=[], activation='tanh'):
---
> def CreateNN(nn,dIn=[0],dIntern=[],dOut=[]):
43,48d42
<     if activation == "tanh":
<         net['activation'] = tanh
<     elif activation == "softplus":
<         net['activation'] = softplus
<     elif activation == "relu":
<         net['activation'] = relu
51,68d44
< def tanh(n, a=None, deriv=False):
<     if deriv:
<         return 1 - a**2
<     else:
<         return np.tanh(n)
<
< def softplus(n, a=None, deriv=False):
<     if deriv:
<         return 1 / (1 + np.exp(-n))
<     else:
<         return np.log(1 + np.exp(n))
<
< def relu(n, a=None, deriv=False):
<     if deriv:
<         return np.ones(n.shape) * (n > 0)
<     else:
<         return n * (n > 0)
<
288d263
<     activation = net['activation']
321c296
<             a[q,m] = activation(n[q,m])
---
>             a[q,m] = np.tanh(n[q,m])
388d362
<     activation = net['activation']
442c416
<                 + np.dot(np.dot(S[q,u,l],LW[l,m,0]),np.diag(activation(n[q,m],a[q,m],deriv=True)))
---
>                 + np.dot(np.dot(S[q,u,l],LW[l,m,0]),np.diag(1-(np.tanh(n[q,m]))**2))
454c428
<             S[q,m,m] = np.diag(activation(n[q,m],a[q,m],deriv=True)) #Sensitivity Matrix S[m,m]
---
>             S[q,m,m] = np.diag(1-(np.tanh(n[q,m]))**2) #Sensitivity Matrix S[m,m]
612c586
<                 + np.dot(np.dot(S[q,u,l],LW[l,m,0]),np.diag(activation(n[q,m],a[q,m],deriv=True)))
---
>                 + np.dot(np.dot(S[q,u,l],LW[l,m,0]),np.diag(1-(np.tanh(n[q,m]))**2))
622c596
<             S[q,m,m] = np.diag(activation(n[q,m],a[q,m],deriv=True)) #Sensitivity Matrix S[m,m]
---
>             S[q,m,m] = np.diag(1-(np.tanh(n[q,m]))**2) #Sensitivity Matrix S[m,m]
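As a quick sanity check, the proposed softplus and relu (copied from the diff above into standalone form) behave as expected: relu zeroes negative inputs, and the softplus derivative is the logistic sigmoid.

```python
import numpy as np

def softplus(n, a=None, deriv=False):
    # d/dn log(1 + e^n) is the logistic sigmoid
    return 1 / (1 + np.exp(-n)) if deriv else np.log(1 + np.exp(n))

def relu(n, a=None, deriv=False):
    return np.ones(np.shape(n)) * (n > 0) if deriv else n * (n > 0)

x = np.array([-1.0, 0.0, 2.0])
print(relu(x))                  # zero for negative inputs, identity for positive
print(relu(x, deriv=True))      # step function: 0 for n <= 0, 1 for n > 0
print(softplus(x, deriv=True))  # sigmoid values in (0, 1)
```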
It is not possible to save an untrained network because of the missing Pnorm and Ynorm parameters.
This could be fixed by defining them during creation of the NN and setting them to e.g. None or zero; "prepare_data" would then need to be changed as well.
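A minimal sketch of that workaround, assuming the net dict uses the key names Pnorm/Ynorm mentioned above (the real pyrenn key names may differ), with prepare_data then treating None as "not computed yet":

```python
def ensure_norm_keys(net):
    """Add placeholder normalization entries to an untrained net so that
    saveNN has something to write (sketch; key names are assumptions)."""
    net.setdefault('Pnorm', None)
    net.setdefault('Ynorm', None)
    return net
```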
Hi,
Thank you for your great work!!
I would like to understand more about how the RTRL Jacobian matrix is calculated. Is that also based on the paper "A Learning Algorithm for Continually Running Fully Recurrent Neural Networks"? Could you elaborate a bit on the procedure, or point me to the references you used for calculating the Jacobian matrix? Thank you.
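For what it's worth, the sensitivity recursion that builds the Jacobian can be read off pyrenn's source for the default tanh layers (the diag(1 - tanh²(n)) terms quoted in the diff earlier in this thread); roughly:

```latex
% Sensitivity recursion as implemented in the code, where \dot{F}^m is the
% diagonal matrix of activation derivatives of layer m:
S^{q,m,m} = \dot{F}^m\big(n^{q,m}\big) = \operatorname{diag}\big(1 - \tanh^2(n^{q,m})\big),
\qquad
S^{q,u,m} \leftarrow S^{q,u,m} + S^{q,u,l}\, LW^{l,m}\, \dot{F}^m\big(n^{q,m}\big)
```

The notation (IW/LW matrices, delay sets dIn/dIntern/dOut) matches the dynamic-network backpropagation framework of De Jesús and Hagan ("Backpropagation Algorithms for a Broad Class of Dynamic Networks"), which may be a more direct reference than the Williams-Zipser paper, but the authors would have to confirm.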
Hi!
First of all, great work! It's amazing, and the documentation is great!
I'm trying to load a NN, previously saved using: saveNN(netBFGS,filename)
...
But when trying to load it to perform more experiments, I do: net = loadNN('Nets/netLM');
and I'm getting these errors:
Error using xlsread (line 128)
XLSREAD unable to open file 'Nets/netLM'.
File 'Nets/netLM' not found.
Error in loadNN (line 10)
[~,~,rawData] = xlsread(filename);
After checking that my file exists (which it does...), I tried to import it the way loadNN.m does, using: [~,~,Data] = xlsread('Nets/netLM')
and I'm getting:
Error using xlsread (line 247)
Unable to read XLS file /Nets/netLM. File is not in recognized format.
Might there be a problem with saveNN? Maybe the file is not being saved in the proper format, or maybe in MATLAB R2015a the xlsread function changed and is no longer compatible with the CSV format?
I'm using the same code as in the example:
# IMPORT DATASET -----------------------------------
import pandas as pd
import pyrenn as prn

X = pd.read_feather('Input.file')
Y = pd.read_feather('Target.file')

# TRANSFORM DATASET INTO ARRAYS --------------------
X = X.values.T
Y = Y.values.T
print(X.shape, Y.shape)

# NETWORK ------------------------------------------
net = prn.CreateNN([3,6,8])
net = prn.train_LM(X,Y,net,verbose=True,k_max=1000,E_stop=1e-5)
In addition: why, with the commands
W0 = net['w0']; print(W0.shape)
W1 = net['w']; print(W1.shape)