Comments (10)
I used conda to install torchtext, and it recommended torchtext==0.6.0. With that version you don't need to change anything except the two import lines in util/data_loader (lines 6 and 7), and it works well!
Hello, my friend, could you share a copy of your requirements.txt? I would be very grateful.
python3.8
torch==1.7
torchvision==0.8.1
torchtext==0.8.0

Replace
from torchtext.legacy.data import Field, BucketIterator
from torchtext.legacy.datasets import Multi30k
with
from torchtext.data import Field, BucketIterator
from torchtext.datasets import Multi30k
If the data can't be downloaded automatically, you can fetch the archives from
https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/training.tar.gz
https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/validation.tar.gz
https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/mmt16_task1_test.tar.gz
and extract them into the .data/multi30k folder.
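For anyone who wants to script those steps, here is a minimal Python sketch. The URLs and the .data/multi30k destination are the ones given above; the file names inside the archives are an assumption, so adjust if yours unpack differently:

```python
import tarfile
import urllib.request
from pathlib import Path

# Archive names and mirror URL quoted from the comment above.
BASE = "https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/"
ARCHIVES = ["training.tar.gz", "validation.tar.gz", "mmt16_task1_test.tar.gz"]

def extract(archive: str, dest: str) -> None:
    """Unpack one .tar.gz into dest (should yield train.en, train.de, ...)."""
    Path(dest).mkdir(parents=True, exist_ok=True)
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)

def fetch_all(dest: str = "./.data/multi30k") -> None:
    """Download each archive if it is missing, then extract it in place."""
    Path(dest).mkdir(parents=True, exist_ok=True)
    for name in ARCHIVES:
        target = Path(dest) / name
        if not target.exists():
            urllib.request.urlretrieve(BASE + name, target)
        extract(str(target), dest)
```

Running `fetch_all()` once from the repo root should leave the loader's expected files next to the archives.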
from transformer.
Hey, could you please tell me how to download the datasets? Thanks in advance!
I'm new to transformers too. My torchtext==0.15, and Field and BucketIterator have been removed. What are the alternatives to torchtext's Field and BucketIterator?
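For context: the `torchtext.legacy` namespace (including `Field` and `BucketIterator`) was dropped in torchtext 0.12, and the migration tutorials point to `torchtext.vocab.build_vocab_from_iterator` plus a plain `torch.utils.data.DataLoader`. The bucketing idea itself (batching sentences of similar length to minimize padding) needs nothing from torchtext, though; a stdlib-only sketch with made-up data, not the repo's actual loader:

```python
from collections import Counter

def build_vocab(sentences, specials=("<unk>", "<pad>", "<sos>", "<eos>")):
    """Map each token to an integer id, specials first, then by frequency."""
    counts = Counter(tok for sent in sentences for tok in sent)
    itos = list(specials) + [tok for tok, _ in counts.most_common()]
    return {tok: i for i, tok in enumerate(itos)}

def bucket_batches(sentences, batch_size):
    """What BucketIterator did: sort by length before batching, so each
    batch holds sentences of similar length and padding stays minimal."""
    order = sorted(range(len(sentences)), key=lambda i: len(sentences[i]))
    return [order[i:i + batch_size] for i in range(0, len(order), batch_size)]

def pad_batch(batch, sentences, vocab):
    """Numericalize and right-pad every sentence to the batch maximum."""
    maxlen = max(len(sentences[i]) for i in batch)
    pad = vocab["<pad>"]
    return [[vocab.get(t, vocab["<unk>"]) for t in sentences[i]]
            + [pad] * (maxlen - len(sentences[i])) for i in batch]
```

Wrapping `pad_batch` as a `collate_fn` and feeding the bucketed indices to a `DataLoader` via a batch sampler reproduces the old behavior on modern torchtext.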
Okay, thank you, my friends!
Hello, I followed your method and downloaded the three files into the .data/multi30k folder, but it still complains that train.en is missing. Is there another way to download the data, or is there a download script?
You probably didn't extract them; unpack the archives and it should work.
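After extracting, a quick sanity check that the loader's expected files are present; the file names below are assumed from the Multi30k archives listed earlier, so adjust them if your archives unpack differently:

```python
from pathlib import Path

# Files the Multi30k archives are assumed to unpack to.
EXPECTED = ["train.en", "train.de", "val.en", "val.de", "test.en", "test.de"]

def missing_files(folder: str = ".data/multi30k") -> list:
    """Return the expected Multi30k files not yet present in folder."""
    root = Path(folder)
    return [name for name in EXPECTED if not (root / name).exists()]
```

If `missing_files()` returns a non-empty list, the archives were likely downloaded but never extracted.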
That solved my problem, thank you very much!!!
I have this problem, too. Did you manage to solve it?
Related Issues (19)
- Masked attention HOT 3
- Potential bug in the pad mask HOT 5
- how to get dataset HOT 4
- The experimental results have a large gap with the one in README HOT 2
- Question about implementation in the multi-head attention part HOT 3
- how to resolve the issue ”No module named 'torch._C'“
- About multi-head attention in attention is all you need, thanks. HOT 2
- src_pad_idx and trg_pad_idx
- About MultiHeadAttention's split method HOT 3
- shaollow copy HOT 1
- batch.trg[j] out of index. HOT 1
- Weight matrix sharing confusion
- About initial learing rate HOT 1
- Questions regarding the implementation HOT 9
- [Bug] LayerNorm should not contain learnable parameters HOT 4
- LayerNorm implement HOT 4
- [Bug] Dropout should comes before residual connection and layer norm HOT 1
- Reporting a bug during test time HOT 2