Download all the files from the hub. I think the following code should then work:
from easynmt import EasyNMT, models
model = EasyNMT(translator=models.AutoModel("path/to/model"))
@nreimers this works a treat, but I hit another issue with fasttext being called by the script:
print("=> detected language:", MODEL.language_detection(sentence), "\n")
File "C:\Program Files\Python38\lib\site-packages\easynmt\EasyNMT.py", line 413, in language_detection
raise Exception("No method for automatic language detection was found. Please install at least one of the following: fasttext (pip install fasttext), langid (pip install langid), or langdetect (pip install langdetect)")
Exception: No method for automatic language detection was found. Please install at least one of the following: fasttext (pip install fasttext), langid (pip install langid), or langdetect (pip install langdetect)
I actually have separate code in my script using fasttext successfully, and it is installed. I think the issue is that, since this is an offline system, the lid.176.ftz file isn't in the path where the EasyNMT.py script is looking for it.
Looking at EasyNMT.py, the relevant segment appears to be:
if self._fasttext_lang_id is None:
    import fasttext
    fasttext.FastText.eprint = lambda x: None  # Silence useless warning: https://github.com/facebookresearch/fastText/issues/1067
    model_path = os.path.join(self._cache_folder, 'lid.176.ftz')
So it would appear that _cache_folder is where the file should be stored, but I can't work out where this _cache_folder actually is. I did find the snippet below, but couldn't work out what it resolves to:
if cache_folder is None:
    if 'EASYNMT_CACHE' in os.environ:
        cache_folder = os.environ['EASYNMT_CACHE']
    else:
        cache_folder = os.path.join(torch.hub._get_torch_home(), 'easynmt_v2')
self._cache_folder = cache_folder
Any ideas @nreimers where I can manually store lid.176.ftz on a Windows system?
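For reference, that fallback can be reproduced without EasyNMT to see where the default folder lands. A dependency-free sketch with the same precedence as the snippet above (explicit argument > EASYNMT_CACHE > torch hub default; torch.hub._get_torch_home() normally resolves to ~/.cache/torch unless TORCH_HOME is set):

```python
import os

def resolve_cache_folder(cache_folder=None):
    """Sketch of EasyNMT's cache-folder resolution order."""
    if cache_folder is not None:
        return cache_folder
    if 'EASYNMT_CACHE' in os.environ:
        return os.environ['EASYNMT_CACHE']
    # EasyNMT actually calls torch.hub._get_torch_home(); with no TORCH_HOME
    # set, that is ~/.cache/torch, so the default ends up here:
    return os.path.join(os.path.expanduser('~'), '.cache', 'torch', 'easynmt_v2')

# Pointing EASYNMT_CACHE at a writable folder redirects where lid.176.ftz
# (and downloaded models) are looked up:
os.environ['EASYNMT_CACHE'] = '/tmp/easynmt_cache'
print(resolve_cache_folder())  # -> /tmp/easynmt_cache
```

On Windows the default should land under %USERPROFILE%\.cache\torch\easynmt_v2 unless TORCH_HOME or EASYNMT_CACHE overrides it.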
You can set the environment variable EASYNMT_CACHE to point to any folder you like.
Otherwise, if you call translate and pass source_lang, no automatic language detection happens. So you could wrap your translate function so that it does the language detection itself (if needed) and passes the result to model.translate.
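A minimal sketch of that wrapping pattern, with stand-in callables (the lambdas are placeholders, not EasyNMT's API; substitute model.translate and your own fasttext-based detector):

```python
def make_translate(translate_fn, detect_fn):
    """Wrap a translate callable so language detection happens on your side.

    translate_fn / detect_fn are hypothetical stand-ins for model.translate
    and your own (e.g. fasttext-based) detector.
    """
    def translate(text, target_lang, source_lang=None):
        if source_lang is None:
            source_lang = detect_fn(text)  # detect only when not supplied
        # With source_lang set, EasyNMT skips its internal language
        # detection, so lid.176.ftz is never needed.
        return translate_fn(text, target_lang=target_lang, source_lang=source_lang)
    return translate

# Demo with stub callables:
translate = make_translate(
    lambda text, target_lang, source_lang: f"{source_lang}->{target_lang}: {text}",
    lambda text: "de",
)
print(translate("Hallo Welt", "en"))  # de->en: Hallo Welt
```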
Further to the above, I managed to address this by using the cache_folder parameter:
MODEL = EasyNMT(translator=models.AutoModel("models/facebook/m2m100_418M"), cache_folder='cached')
Is there a way to download all models from the hub at once via a CLI, to a specific path?
@Tortoise17 You can use a standard git clone to get the model files:
git clone https://huggingface.co/facebook/m2m100_1.2B
will download all files from https://huggingface.co/facebook/m2m100_1.2B
Thank you. I hope it has no usage limitations (apart from the size of the text chunk per CLI input).
Sadly, I don't understand what you mean.
Is there any limit on using the model for translation when the transformer is used over the internet? By limit I mean characters per month or per day, or a per-day limit on CLI use?
No, there is no such limit.
Thank you. And how can I download all the Opus-MT models via the CLI?
You have to download each one individually.
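If it helps, the individual downloads can be scripted, since the Helsinki-NLP Opus-MT repos on the hub follow a predictable naming scheme. A sketch (the language pairs are examples; the actual clone is left commented out because it needs network access and git-lfs):

```python
import subprocess  # only needed if you uncomment the clone call

def opus_mt_url(src_lang, tgt_lang):
    # Helsinki-NLP Opus-MT model repos are named opus-mt-<src>-<tgt>
    return f"https://huggingface.co/Helsinki-NLP/opus-mt-{src_lang}-{tgt_lang}"

pairs = [("en", "de"), ("en", "fr"), ("de", "en")]  # whichever pairs you need
for src, tgt in pairs:
    url = opus_mt_url(src, tgt)
    print(url)
    # subprocess.run(["git", "clone", url, f"models/opus-mt-{src}-{tgt}"], check=True)
```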
I am trying to keep the model on local disk, as I have no space in the native cache.
I am trying it like this; can you guide me on how to save this model locally and then load it from there?
I cloned the repo (with git-lfs installed), then:
from easynmt import EasyNMT, models
model = EasyNMT(translator=models.AutoModel("/home/translate/m2m100_418M"), cache_folder='/home/translate/cached')
First it demanded the easynmt.json in the m2m100_418M folder.
Could you guide me once for the opus-mt and m2m100_418M models? I assume the way should be the same for both.
I am confused about where the mistake is.
Hi, I was getting the same issue. You can store the .ftz file in the same cache folder, after which there are two possible outcomes:
- your model works, or
- you get the same error.
For me it was completely random, so what I did was hardcode the path in model_path = os.path.join(self._cache_folder, 'lid.176.ftz'). That's obviously not the right way, but I was also on a restricted machine and couldn't find a workaround with my limited knowledge.
Hope this helps.