
iceychris / libreasr


:speech_balloon: An On-Premises, Streaming Speech Recognition System

Home Page: https://news.ycombinator.com/item?id=25099847

License: MIT License

Makefile 1.16% Python 76.14% CMake 0.14% C 5.32% HTML 1.12% CSS 0.40% JavaScript 6.07% Nix 0.18% Shell 0.36% Jupyter Notebook 9.10%
asr speech-recognition pytorch fastai rnn-transducer deep-learning esp32-lyrat python

libreasr's Introduction

libreasr's People

Contributors

iceychris


libreasr's Issues

Design a Logo

Design a logo for LibreASR and share it here.

To make an open source project cool, it should have a logo 😄

RuntimeError: Didn't find engine for operation quantized::linear_prepack NoQEngine when running the Docker container

Hello, I tried the Quickstart steps and I can't get through this error. Did I miss something?
Thanks!

# docker run -it -p 8080:8080 iceychris/libreasr:latest
make sde &
make sen &
make b
make[1]: Entering directory '/workspace'
python3 -u api-server.py de
make[1]: Entering directory '/workspace'
python3 -u api-server.py en
make[1]: Entering directory '/workspace'
python3 -u api-bridge.py
[api-bridge] running on :8080
LM: Failed to load.
LM: Failed to load.
[both the de and en api-server processes print the same traceback; shown once]
Traceback (most recent call last):
  File "api-server.py", line 155, in <module>
    serve(args.lang)
  File "api-server.py", line 140, in serve
    apg.add_ASRServicer_to_server(ASRServicer(lang), server)
  File "api-server.py", line 56, in __init__
    conf, lang, m, x_tfm, x_tfm_stream = load_stuff(lang)
  File "/workspace/lib/inference.py", line 20, in load_stuff
    conf, lang, m, tfms = parse_and_apply_config(inference=True, lang=lang)
  File "/workspace/lib/config.py", line 151, in parse_and_apply_config
    load_asr_model(m, lang_name, lang, conf["cuda"]["device"], lm=lm)
  File "/workspace/lib/model_utils.py", line 88, in load_asr_model
    model, {torch.nn.LSTM, torch.nn.Linear}, dtype=torch.qint8
  File "/usr/local/lib/python3.7/dist-packages/torch/quantization/quantize.py", line 285, in quantize_dynamic
    convert(model, mapping, inplace=True)
  File "/usr/local/lib/python3.7/dist-packages/torch/quantization/quantize.py", line 365, in convert
    convert(mod, mapping, inplace=True)
  File "/usr/local/lib/python3.7/dist-packages/torch/quantization/quantize.py", line 365, in convert
    convert(mod, mapping, inplace=True)
  File "/usr/local/lib/python3.7/dist-packages/torch/quantization/quantize.py", line 365, in convert
    convert(mod, mapping, inplace=True)
  File "/usr/local/lib/python3.7/dist-packages/torch/quantization/quantize.py", line 366, in convert
    reassign[name] = swap_module(mod, mapping)
  File "/usr/local/lib/python3.7/dist-packages/torch/quantization/quantize.py", line 395, in swap_module
    new_mod = mapping[type(mod)].from_float(mod)
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 421, in from_float
    return super(LSTM, cls).from_float(mod)
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 229, in from_float
    mod.bias, mod.batch_first, mod.dropout, mod.bidirectional, dtype)
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 335, in __init__
    super(LSTM, self).__init__('LSTM', *args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 85, in __init__
    torch.ops.quantized.linear_prepack(w_ih, b_ih)
RuntimeError: Didn't find engine for operation quantized::linear_prepack NoQEngine
make[1]: *** [Makefile:55: sen] Error 1
make[1]: Leaving directory '/workspace'
make[1]: *** [Makefile:57: sde] Error 1
make[1]: Leaving directory '/workspace'
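
For context, the failure comes from torch.quantization.quantize_dynamic: it suggests the PyTorch build inside the container has no quantization engine available (fbgemm on x86, qnnpack on ARM), so packing the LSTM/Linear weights ends in NoQEngine. A minimal diagnostic sketch, independent of LibreASR, to check which engines a given build supports (the toy Sequential model below is just an illustration, not LibreASR's model):

import torch

# List the quantization engines compiled into this PyTorch build.
# Seeing only 'none' here matches the NoQEngine error above.
print(torch.backends.quantized.supported_engines)
print(torch.backends.quantized.engine)

# If a real engine is present, selecting it explicitly before loading the
# model may help: 'fbgemm' targets x86, 'qnnpack' targets ARM.
if "fbgemm" in torch.backends.quantized.supported_engines:
    torch.backends.quantized.engine = "fbgemm"

# The same kind of dynamic quantization call that fails in load_asr_model
# above; on a build without any engine it reproduces the RuntimeError.
m = torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.LSTM(8, 8))
mq = torch.quantization.quantize_dynamic(
    m, {torch.nn.LSTM, torch.nn.Linear}, dtype=torch.qint8
)
print(mq)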

Arch Linux (AUR) package

Hello,

really nice project, I am thoroughly impressed.
However, in my humble opinion the project needs an AUR package in order to reach its full potential and address its target audience - smart nerds.

I will star the project once an AUR package is available.

:wq

Raspberry Pi Support

Make LibreASR work on Raspberry Pis.

There already is a Dockerfile which builds fine on my Pi 4.
Loading and running the PyTorch models also works.

But loading the youtokentome tokenizer model does not work:

chris@rpi4:~ $ docker run -it -v $(pwd):/workspace libreasr-armv7 /bin/bash
root@abf9b8be2d80:/workspace# python3
Python 3.7.3 (default, Jul 25 2020, 13:03:44)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import youtokentome
>>> youtokentome.BPE("tokenizer.yttm-model")
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)
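
One way to narrow this down might be to check whether youtokentome can train and reload a tiny BPE model of its own on armv7; if that also aborts with std::bad_alloc, the problem is youtokentome on this architecture in general rather than the shipped tokenizer.yttm-model file. A rough sketch (paths and corpus contents are made up):

import youtokentome as yttm

# Write a small throwaway corpus; the exact text does not matter.
text = (
    "the quick brown fox jumps over the lazy dog\n"
    "speech recognition turns audio into text\n"
    "libreasr runs on premises and streams transcriptions\n"
)
with open("/tmp/corpus.txt", "w") as f:
    f.write(text * 200)

# Train a tiny BPE model and load it back. If BPE() also dies with
# std::bad_alloc here, the issue is not specific to tokenizer.yttm-model.
yttm.BPE.train(data="/tmp/corpus.txt", model="/tmp/test.yttm", vocab_size=80)
bpe = yttm.BPE(model="/tmp/test.yttm")
print(bpe.encode(["speech recognition"], output_type=yttm.OutputType.SUBWORD))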
