gitabcworld / skiprnn_pytorch
A PyTorch implementation of the paper "Skip RNN: Learning to Skip State Updates in Recurrent Neural Networks".
License: MIT License
Could you please tell me which version of PyTorch you used to implement this technique at the time? I am having some trouble with version conflicts. Many thanks!
Thanks for your excellent work! Could you tell us which PyTorch version you used? Some API changes have been made since then, and I cannot run your code with version 0.3.x because the RNN cell class has changed.
When I run your code, I get this error:
Traceback (most recent call last):
File "01_adding_task.py", line 19, in
from util.graph_definition import *
File "/mnt/d/skiprnn_pytorch/util/graph_definition.py", line 14, in
from rnn_cells.custom_cells import CBasicLSTMCell, CBasicGRUCell,
File "/mnt/d/skiprnn_pytorch/rnn_cells/custom_cells.py", line 7, in
from basic_rnn_cells import BasicLSTMCell, BasicGRUCell
ModuleNotFoundError: No module named 'basic_rnn_cells'
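This looks like a Python 2-style implicit import: in Python 3, `from basic_rnn_cells import ...` inside a package no longer resolves against the package's own directory, so the import in `custom_cells.py` needs to be made explicitly relative (`from .basic_rnn_cells import ...`). The sketch below reproduces the repo's layout with a throwaway package (the module contents are stand-ins, not the real cells) and shows that the relative form imports cleanly under Python 3:

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway package mirroring the repo layout:
#   rnn_cells/__init__.py
#   rnn_cells/basic_rnn_cells.py
#   rnn_cells/custom_cells.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "rnn_cells")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "basic_rnn_cells.py"), "w") as f:
    # Stand-in values; the real module defines the cell classes.
    f.write("BasicLSTMCell = 'lstm'\nBasicGRUCell = 'gru'\n")
with open(os.path.join(pkg, "custom_cells.py"), "w") as f:
    # Explicit relative import: the Python 3 replacement for the
    # bare 'from basic_rnn_cells import ...' that raises
    # ModuleNotFoundError.
    f.write("from .basic_rnn_cells import BasicLSTMCell, BasicGRUCell\n")

sys.path.insert(0, root)
mod = importlib.import_module("rnn_cells.custom_cells")
print(mod.BasicLSTMCell)  # 'lstm'
```

In the actual repo, the one-line fix would be changing `from basic_rnn_cells import BasicLSTMCell, BasicGRUCell` in `rnn_cells/custom_cells.py` to `from .basic_rnn_cells import BasicLSTMCell, BasicGRUCell`.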
How can I customize this code to implement a bidirectional RNN?
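The repo does not ship a bidirectional variant, but the usual pattern is to run two independent cells, one over the sequence and one over its reversal, then combine the per-step states. A minimal, framework-free sketch (the `cell` argument is a stand-in for any step function such as the repo's `CSkipLSTMCell`; the toy running-sum "cell" below is purely illustrative):

```python
def run_direction(cell, xs, h0):
    """Unroll a step-function-style cell over a sequence."""
    h, outs = h0, []
    for x in xs:
        h = cell(x, h)
        outs.append(h)
    return outs

def bidirectional(cell_fwd, cell_bwd, xs, h0):
    """Run separate forward and backward cells, then pair states per step."""
    fwd = run_direction(cell_fwd, xs, h0)
    bwd = run_direction(cell_bwd, list(reversed(xs)), h0)
    bwd.reverse()  # re-align backward outputs with forward time steps
    return list(zip(fwd, bwd))

# Toy demo: a running sum stands in for an RNN cell.
step = lambda x, h: x + h
out = bidirectional(step, step, [1, 2, 3], 0)  # [(1, 6), (3, 5), (6, 3)]
```

With real Skip-RNN cells you would concatenate the two hidden states per step instead of pairing them in a tuple, and each direction keeps its own skipping decisions.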
Hi!
I am running a replication study comparing the TensorFlow and PyTorch implementations. On the adding task, the PyTorch version shows no convergence: the number of used validation samples does not go down at all, or only very slowly (it stays around 99%-97% even when running with CUDA overnight). By contrast, the TensorFlow code shows sample-skipping behavior after only about the first 300 iterations.
I could not spot any immediate differences between the two implementations (I am more used to PyTorch).
Thanks
Abhishek
When I try to run any experiment from the downloaded code, I immediately get the error:
AttributeError: 'CSkipLSTMCell' object has no attribute 'check_forward_input'
I looked into custom_cells.py and the other files (including the one with the base class) but can't find where it is defined or initialized.
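`check_forward_input` was a method inherited from PyTorch's RNN cell base class in some versions, so the cell code never defines it itself, and the attribute disappears when the installed torch version no longer provides it. One hedged workaround (an assumption about the cause, not a confirmed fix for this repo) is to guard the call so it only fires when the method still exists; the stand-in classes below just demonstrate the guard:

```python
def safe_check_forward_input(cell, x):
    """Call check_forward_input only if this version of the base class has it."""
    check = getattr(cell, "check_forward_input", None)
    if check is not None:
        check(x)

# Stand-ins for cells built against different base-class versions.
class OldCell:
    def __init__(self):
        self.checked = False
    def check_forward_input(self, x):  # older base classes defined this
        self.checked = True

class NewCell:  # newer base classes dropped the method
    pass

old, new = OldCell(), NewCell()
safe_check_forward_input(old, None)  # runs the check
safe_check_forward_input(new, None)  # silently skips; no AttributeError
```

Pinning torch to a version contemporary with the repo (the other issues suggest around 0.3.x-0.4.x) may be the simpler route.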
Hi,
Thanks for this good work!
I am using the Skip-LSTM in my experiments now, and it seems to work well.
However, I am wondering how this code can handle variable-length sequence inputs.
When using an RNN/LSTM in PyTorch, we can use torch.nn.utils.rnn.pack_padded_sequence
and torch.nn.utils.rnn.pad_packed_sequence
to keep the model from treating the padding vectors as input.
Is there an alternative way to achieve the same thing here? Thanks a lot!
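Since packed sequences assume a standard cuDNN-style RNN, a common alternative for custom cells like these is explicit masking: at padded time steps, the hidden state is simply carried over unchanged. A minimal single-sequence sketch (the `cell` argument is a stand-in for the repo's cells; the toy running-sum "cell" is illustrative only):

```python
def masked_rnn(cell, xs, mask, h0):
    """Unroll a cell, freezing the state wherever mask is False (padding)."""
    h, outs = h0, []
    for x, m in zip(xs, mask):
        h_new = cell(x, h)
        h = h_new if m else h  # padded step: keep the previous state
        outs.append(h)
    return outs

# Toy demo: length-2 sequence padded to length 4; the state stops
# changing once the mask turns False.
outs = masked_rnn(lambda x, h: h + x, [1, 2, 3, 4],
                  [True, True, False, False], 0)  # [1, 3, 3, 3]
```

In a batched PyTorch setting the same idea is usually written per element as `h = m * h_new + (1 - m) * h` with a float mask tensor, which keeps everything differentiable.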