
CeCC - Colorless echo Chinese converter

Until artificial intelligence can truly understand a text before converting between Traditional and Simplified Chinese, applying NLP word segmentation, part-of-speech tagging, and context detection before conversion should be more accurate than plain vocabulary matching. The dictionaries should be editable by everyone, like Wikipedia, with the rationale for each change recorded and added to the test suite.

Concepts

Chinese_converter first performs Chinese word segmentation (with word-sense and part-of-speech tagging), then converts between Traditional and Simplified Chinese with CeL.zh_conversion. This keeps the conversion dictionaries lightweight while producing more accurate conversions.

Drawbacks of plain front-to-back text replacement

CeL.zh_conversion uses the same technique as OpenCC and New Tong Wen Tang: the forward maximum matching algorithm. Scanning forward, at each character position it looks for the longest word that matches an entry in the dictionary files; once a match is found, it substitutes the replacement and jumps past the matched word.
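A minimal sketch of forward maximum matching (the pair map below holds only a few illustrative entries, not the real dictionary files):

```javascript
// Forward maximum matching: at each position, try the longest
// dictionary key first, substitute on a hit, then jump past it.
// The entries here are illustrative only.
const PAIRS = new Map([
  ['干', '幹'],
  ['饼干', '餅乾'],
  ['芒果', '芒果'],
]);
const MAX_KEY = Math.max(...[...PAIRS.keys()].map((k) => k.length));

function convertFMM(text) {
  let result = '';
  let i = 0;
  while (i < text.length) {
    let matched = false;
    for (let len = Math.min(MAX_KEY, text.length - i); len > 0; len--) {
      const word = text.slice(i, i + len);
      if (PAIRS.has(word)) {
        result += PAIRS.get(word);
        i += len; // jump to the position after the matched word
        matched = true;
        break;
      }
    }
    if (!matched) {
      result += text[i]; // no dictionary hit: copy the character as-is
      i += 1;
    }
  }
  return result;
}

console.log(convertFMM('饼干'));   // → 餅乾
console.log(convertFMM('芒果干')); // → 芒果幹
```

Because 「芒果」 matches before the scanner ever reaches 「干」, the per-character fallback then yields 「芒果幹」, which is exactly the failure mode described below.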

This approach can miss cases where a word must be handled together with the word before it. For example, 「干」 is converted to 「幹」 by default. (Conversion targets are usually the characters whose usage is messiest: frequent, riddled with exceptions, and hardest to capture with rules.) So when the dictionary contains 「芒果」 but not 「芒果乾」, 「芒果干」 may come out as 「芒果幹」.

We can patch such cases by adding every affected compound to the dictionary, e.g. mapping 「芒果干」 to 「芒果乾」; additional.to_TW.txt and CN_to_TW.LTP.PoS.txt contain many entries of this kind. But this makes the dictionary files large and complex, and in this example we clearly cannot enumerate every dried plant and animal (dried apple, dried rattlesnake, and so on).

Each new entry must also be checked against its possible contexts. Adding 「上千」 would break 「算得上千钧一发」: 「上千」 matches first, and since 「钧一发」 itself is not in the dictionary, the conversion goes wrong. Likewise, after adding 「下游」→「下游」 we must also add 「下游戏」→「下遊戲」, or 「玩一下游戏」 and 「停下游戏」 will be converted incorrectly.

Worse still, this approach cannot judge from context at all. Sentences like 「这颗梨子干你什么事」 and 「我拿水蜜桃干他朋友什么事」 are hard to convert correctly. And some words, such as 「排泄」/「排洩」 and 「自制」/「自製」, have two valid forms that can only be resolved from context.

Benefits of segmenting first

Segmenting the text first lets conversion follow the structure of sentences and phrases. This is more flexible and copes better with special cases. For example, zh_conversion specifies many conversions like 这只→這隻 and 一只→一隻, which often go wrong on sentences such as 「這隻是小鳥」 ("this bird...") versus 「這只是妄想」 ("this is merely a delusion"). If we can determine whether 「只」 is a classifier, many of these errors disappear.
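The 「只」 disambiguation can be sketched as follows, assuming hand-written segmentation results and LTP-style tag letters (q for classifier, d for adverb); the token format is a simplification, not Chinese_converter's actual data structure:

```javascript
// Convert 「只」 to 「隻」 only when it was tagged as a classifier.
function convertZhi(tokens) {
  return tokens
    .map(({ word, pos }) => (word === '只' && pos === 'q' ? '隻' : word))
    .join('');
}

// 「这只是小鸟」: 只 tagged as a classifier → converted.
convertZhi([
  { word: '这', pos: 'r' }, { word: '只', pos: 'q' },
  { word: '是', pos: 'v' }, { word: '小鸟', pos: 'n' },
]); // → '这隻是小鸟' (only 只 is handled in this sketch)

// 「这只是妄想」: 只 tagged as an adverb → left untouched.
convertZhi([
  { word: '这', pos: 'r' }, { word: '只', pos: 'd' },
  { word: '是', pos: 'v' }, { word: '妄想', pos: 'n' },
]); // → '这只是妄想'
```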

Although segmentation has these advantages, current Chinese word segmentation is unfortunately still not accurate enough. The dictionary rules depend heavily on the segmentation system, and zh_conversion runs much faster. Chinese_converter therefore uses the segmentation dictionaries only for special cases where segmentation rarely errs, or where expressing the rule in zh_conversion form would be too convoluted.

Test sentences are welcome, as are dictionary rule contributions.

Process

Conversion flow:

  1. Chinese word segmentation (delegated to an external segmenter).
  2. TODO: automatically detect, or manually specify, the context of each sentence or paragraph (to cooperate with Wikipedia's proper-noun conversion).
  3. Convert each word according to the corresponding dictionary.

Features

  1. With part-of-speech information, the right conversion is often easy to pick, e.g. one form when the word is used as a verb and another when it is used as an adjective.
  2. Built-in condition generation quickly produces candidate condition rules for the dictionaries.
  3. A built-in cache greatly reduces the time to convert a long text repeatedly (e.g. during testing, or when re-converting after changing dictionary conditions).
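The caching idea in feature 3 can be sketched as simple memoization; the conversion function below is a stand-in for the slow LTP round trip, not the real client:

```javascript
// Wrap a conversion function so repeated inputs are answered from a Map
// instead of re-running the expensive conversion.
function withCache(convert) {
  const cache = new Map();
  return (text) => {
    if (!cache.has(text)) cache.set(text, convert(text));
    return cache.get(text);
  };
}

let calls = 0;
const slowConvert = (text) => {
  calls += 1; // pretend this is an expensive LTP query
  return text.replace('干', '乾');
};

const cached = withCache(slowConvert);
cached('饼干');
cached('饼干'); // the second call is served from the cache; calls stays at 1
```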

Installation

Install LTP 4.1.5.post2 first.

In our tests, running the HIT LTP 4.1.5.post2 Base model as a server, together with the corresponding dictionaries, a Traditional→Simplified→Traditional round trip over the test files reproduces the original text.

1. Install the Chinese word segmenter: LTP

On Windows, install LTP:

  1. Install a Python version that PyTorch supports. (The latest Python often cannot run PyTorch yet.)
  2. Install LTP and the packages the LTP server depends on:
pip install --force-reinstall "ltp==4.1.5.post2"
pip install tornado
pip install fire

2. Install cecc

Install Node.js, and then install cecc:

npm install cecc

Usage

1. Prepare the server script

  1. Download the LTP server source code and change 'small' to 'base'. You can use the pre-modified server.base.py directly.
  2. Start the LTP server; it listens at http://localhost:5000/ by default. You may need 6 GB of memory to start the server. The first run downloads over 500 MiB of dictionary files, which usually takes more than 10 minutes.
    # If you use the modified server.base.py above, the LTP server can be started like this.
    python server.base.py serve

2. Run the code

  1. Try running the code with Node.js:
    // load module
    const CeCC = require('cecc');
    // chinese_converter
    const cecc = new CeCC({ LTP_URL : 'http://localhost:5000/' });
    cecc.to_TW('简体中文');
    cecc.to_CN('繁體中文');
  2. Full test:
    # Regenerate the .converted.* answer files.
    npm test regenerate_converted
    # Skip testing wikipedia pages.
    npm test nowiki
    # TODO: regenerate all part-of-speech query caches.
    npm test ignore_cache

Mechanism: text replacement

  1. Text that matches the PoS-tagged dictionary files is converted accordingly. Everything that does not match is handed over to CeL.extension.zh_conversion.
  2. zh_conversion basically uses OpenCC's dictionaries, and generate_additional_table.js merges the New Tong Wen Tang and ConvertZZ dictionaries into additional.to_TW.auto-generated.txt and additional.to_CN.auto-generated.txt. Following the dictionary order in Converter.options of CeL.extension.zh_conversion, each sequence is converted from the longest entries to the shortest. The actual text replacement happens in function convert_using_pair_Map_by_length(text) in CeL.data.Convert_Pairs.
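As a rough sketch of the longest-entries-first pass described above (the function name and data are illustrative, not CeL's actual implementation):

```javascript
// Apply replacement pairs grouped by key length, longest first,
// so a longer entry like 芒果干 wins over the single character 干.
function convertByLength(text, pairs) {
  const lengths = [...new Set([...pairs.keys()].map((k) => k.length))]
    .sort((a, b) => b - a); // longest keys first
  for (const len of lengths) {
    for (const [from, to] of pairs) {
      if (from.length === len) text = text.split(from).join(to);
    }
  }
  return text;
}

const pairs = new Map([['干', '幹'], ['芒果干', '芒果乾']]);
convertByLength('芒果干', pairs); // → '芒果乾'
convertByLength('树干', pairs);   // → '树幹'
```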

Dictionary revision workflow

Routine single-sentence revision workflow

  1. While reading converted text, spot a conversion error.
  2. Write the corrected sentence into the test files general.TW.txt and general.TW.answer.txt.
  3. Start the LTP server and run npm test.
  4. Review the candidate conditions auto-generated by the test tool, and add the suitable ones to the dictionary files CN_to_TW.LTP.PoS.txt and TW_to_CN.LTP.PoS.txt. If necessary, add new filter functions to CN_to_TW.LTP.filters.js.
  5. Run npm test to confirm there are no conflicts.
  6. Once the tests pass, push the new dictionary files.

Revising while reading a text

Sometimes a sentence picked out on its own parses into a different structure, so the dictionaries must be revised against a full conversion of the text: enable the Traditional/Simplified conversion feature in work_crawler and revise the dictionaries as you read. .cache_directory should be set first (work_crawler sets it automatically).

  1. While reading the converted text, spot a conversion error.
  2. Write the corrected sentence into the work's test file _test suite/articles/watch_target.<work title>.(TW|CN).txt (e.g., watch_target.第一序列.TW.txt); every conversion run then checks whether matching text appears.
  3. Keep revising the dictionary files until npm test nowiki passes.
  4. Regenerate the converted text, review the candidate conditions auto-generated by the test tool, and add the suitable ones to the dictionary files CN_to_TW.LTP.PoS.txt and TW_to_CN.LTP.PoS.txt.
  5. After reading and checking everything, move the sentences in the work's test file into the test files general.TW.txt and general.TW.answer.txt.

Defects

  • LTP conversion is too slow.
  • The dictionaries are still thin and flawed, and need strengthening.
  • Because current language parsers are still crude, parsing results are unstable; changing the LTP version requires large-scale dictionary changes.

See also

Chinese word segmentation

(long unmaintained)

Part-of-speech tagging

(long unmaintained)

Traditional/Simplified conversion

Conversion dictionaries that do not consider part of speech:

Contributors

kanasimi


Issues

Cannot install or run the LTP server

System: Windows 10, Python 3.12.3

The following packages are installed:

fire                      0.6.0
pip                       24.0
torch                     2.3.0
tornado                   6.4

LTP cannot be installed with the command given in the README:

D:\*****\test\python\ltp>pip install --force-reinstall "ltp==4.1.5.post2"
Collecting ltp==4.1.5.post2
  Using cached ltp-4.1.5.post2-py3-none-any.whl.metadata (4.8 kB)
Collecting torch>=1.2.0 (from ltp==4.1.5.post2)
  Using cached torch-2.3.0-cp312-cp312-win_amd64.whl.metadata (26 kB)
Collecting transformers<=4.7.0,>=4.0.0 (from ltp==4.1.5.post2)
  Using cached transformers-4.7.0-py3-none-any.whl.metadata (48 kB)
Collecting pygtrie<2.5,>=2.3.0 (from ltp==4.1.5.post2)
  Using cached pygtrie-2.4.2-py3-none-any.whl
Collecting packaging>=20.0 (from ltp==4.1.5.post2)
  Using cached packaging-24.0-py3-none-any.whl.metadata (3.2 kB)
Collecting filelock (from torch>=1.2.0->ltp==4.1.5.post2)
  Using cached filelock-3.14.0-py3-none-any.whl.metadata (2.8 kB)
Collecting typing-extensions>=4.8.0 (from torch>=1.2.0->ltp==4.1.5.post2)
  Using cached typing_extensions-4.12.0-py3-none-any.whl.metadata (3.0 kB)
Collecting sympy (from torch>=1.2.0->ltp==4.1.5.post2)
  Using cached sympy-1.12.1-py3-none-any.whl.metadata (12 kB)
Collecting networkx (from torch>=1.2.0->ltp==4.1.5.post2)
  Using cached networkx-3.3-py3-none-any.whl.metadata (5.1 kB)
Collecting jinja2 (from torch>=1.2.0->ltp==4.1.5.post2)
  Using cached jinja2-3.1.4-py3-none-any.whl.metadata (2.6 kB)
Collecting fsspec (from torch>=1.2.0->ltp==4.1.5.post2)
  Using cached fsspec-2024.5.0-py3-none-any.whl.metadata (11 kB)
Collecting mkl<=2021.4.0,>=2021.1.1 (from torch>=1.2.0->ltp==4.1.5.post2)
  Using cached mkl-2021.4.0-py2.py3-none-win_amd64.whl.metadata (1.4 kB)
Collecting huggingface-hub==0.0.8 (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached huggingface_hub-0.0.8-py3-none-any.whl.metadata (8.7 kB)
Collecting numpy>=1.17 (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached numpy-1.26.4-cp312-cp312-win_amd64.whl.metadata (61 kB)
Collecting pyyaml (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached PyYAML-6.0.1-cp312-cp312-win_amd64.whl.metadata (2.1 kB)
Collecting regex!=2019.12.17 (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached regex-2024.5.15-cp312-cp312-win_amd64.whl.metadata (41 kB)
Collecting requests (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached requests-2.32.3-py3-none-any.whl.metadata (4.6 kB)
Collecting sacremoses (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached sacremoses-0.1.1-py3-none-any.whl.metadata (8.3 kB)
Collecting tokenizers<0.11,>=0.10.1 (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached tokenizers-0.10.3.tar.gz (212 kB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting tqdm>=4.27 (from transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached tqdm-4.66.4-py3-none-any.whl.metadata (57 kB)
Collecting intel-openmp==2021.* (from mkl<=2021.4.0,>=2021.1.1->torch>=1.2.0->ltp==4.1.5.post2)
  Using cached intel_openmp-2021.4.0-py2.py3-none-win_amd64.whl.metadata (1.2 kB)
Collecting tbb==2021.* (from mkl<=2021.4.0,>=2021.1.1->torch>=1.2.0->ltp==4.1.5.post2)
  Using cached tbb-2021.12.0-py3-none-win_amd64.whl.metadata (1.1 kB)
Collecting colorama (from tqdm>=4.27->transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB)
Collecting MarkupSafe>=2.0 (from jinja2->torch>=1.2.0->ltp==4.1.5.post2)
  Using cached MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl.metadata (3.1 kB)
Collecting charset-normalizer<4,>=2 (from requests->transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl.metadata (34 kB)
Collecting idna<4,>=2.5 (from requests->transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached idna-3.7-py3-none-any.whl.metadata (9.9 kB)
Collecting urllib3<3,>=1.21.1 (from requests->transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached urllib3-2.2.1-py3-none-any.whl.metadata (6.4 kB)
Collecting certifi>=2017.4.17 (from requests->transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached certifi-2024.2.2-py3-none-any.whl.metadata (2.2 kB)
Collecting click (from sacremoses->transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Collecting joblib (from sacremoses->transformers<=4.7.0,>=4.0.0->ltp==4.1.5.post2)
  Using cached joblib-1.4.2-py3-none-any.whl.metadata (5.4 kB)
Collecting mpmath<1.4.0,>=1.1.0 (from sympy->torch>=1.2.0->ltp==4.1.5.post2)
  Using cached mpmath-1.3.0-py3-none-any.whl.metadata (8.6 kB)
Using cached ltp-4.1.5.post2-py3-none-any.whl (94 kB)
Using cached packaging-24.0-py3-none-any.whl (53 kB)
Using cached torch-2.3.0-cp312-cp312-win_amd64.whl (159.7 MB)
Using cached transformers-4.7.0-py3-none-any.whl (2.5 MB)
Using cached huggingface_hub-0.0.8-py3-none-any.whl (34 kB)
Using cached mkl-2021.4.0-py2.py3-none-win_amd64.whl (228.5 MB)
Using cached intel_openmp-2021.4.0-py2.py3-none-win_amd64.whl (3.5 MB)
Using cached tbb-2021.12.0-py3-none-win_amd64.whl (286 kB)
Using cached numpy-1.26.4-cp312-cp312-win_amd64.whl (15.5 MB)
Using cached regex-2024.5.15-cp312-cp312-win_amd64.whl (268 kB)
Using cached tqdm-4.66.4-py3-none-any.whl (78 kB)
Using cached typing_extensions-4.12.0-py3-none-any.whl (37 kB)
Using cached filelock-3.14.0-py3-none-any.whl (12 kB)
Using cached fsspec-2024.5.0-py3-none-any.whl (316 kB)
Using cached jinja2-3.1.4-py3-none-any.whl (133 kB)
Using cached networkx-3.3-py3-none-any.whl (1.7 MB)
Using cached PyYAML-6.0.1-cp312-cp312-win_amd64.whl (138 kB)
Using cached requests-2.32.3-py3-none-any.whl (64 kB)
Using cached sacremoses-0.1.1-py3-none-any.whl (897 kB)
Using cached sympy-1.12.1-py3-none-any.whl (5.7 MB)
Using cached certifi-2024.2.2-py3-none-any.whl (163 kB)
Using cached charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl (100 kB)
Using cached idna-3.7-py3-none-any.whl (66 kB)
Using cached MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl (17 kB)
Using cached mpmath-1.3.0-py3-none-any.whl (536 kB)
Using cached urllib3-2.2.1-py3-none-any.whl (121 kB)
Using cached click-8.1.7-py3-none-any.whl (97 kB)
Using cached colorama-0.4.6-py2.py3-none-any.whl (25 kB)
Using cached joblib-1.4.2-py3-none-any.whl (301 kB)
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml): started
  Building wheel for tokenizers (pyproject.toml): finished with status 'error'
  error: subprocess-exited-with-error
  
  Building wheel for tokenizers (pyproject.toml) did not run successfully.
  exit code: 1
  
  [51 lines of output]
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib.win-amd64-cpython-312
  creating build\lib.win-amd64-cpython-312\tokenizers
  copying py_src\tokenizers\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers
  creating build\lib.win-amd64-cpython-312\tokenizers\models
  copying py_src\tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\models
  creating build\lib.win-amd64-cpython-312\tokenizers\decoders
  copying py_src\tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\decoders
  creating build\lib.win-amd64-cpython-312\tokenizers\normalizers
  copying py_src\tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\normalizers
  creating build\lib.win-amd64-cpython-312\tokenizers\pre_tokenizers
  copying py_src\tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\pre_tokenizers
  creating build\lib.win-amd64-cpython-312\tokenizers\processors
  copying py_src\tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\processors
  creating build\lib.win-amd64-cpython-312\tokenizers\trainers
  copying py_src\tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\trainers
  creating build\lib.win-amd64-cpython-312\tokenizers\implementations
  copying py_src\tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-312\tokenizers\implementations
  copying py_src\tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-312\tokenizers\implementations
  copying py_src\tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-312\tokenizers\implementations
  copying py_src\tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-312\tokenizers\implementations
  copying py_src\tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-312\tokenizers\implementations
  copying py_src\tokenizers\implementations\sentencepiece_unigram.py -> build\lib.win-amd64-cpython-312\tokenizers\implementations
  copying py_src\tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\implementations
  creating build\lib.win-amd64-cpython-312\tokenizers\tools
  copying py_src\tokenizers\tools\visualizer.py -> build\lib.win-amd64-cpython-312\tokenizers\tools
  copying py_src\tokenizers\tools\__init__.py -> build\lib.win-amd64-cpython-312\tokenizers\tools
  copying py_src\tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-312\tokenizers
  copying py_src\tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-312\tokenizers\models
  copying py_src\tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-312\tokenizers\decoders
  copying py_src\tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-312\tokenizers\normalizers
  copying py_src\tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-312\tokenizers\pre_tokenizers
  copying py_src\tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-312\tokenizers\processors
  copying py_src\tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-312\tokenizers\trainers
  copying py_src\tokenizers\tools\visualizer-styles.css -> build\lib.win-amd64-cpython-312\tokenizers\tools
  running build_ext
  running build_rust
  error: can't find Rust compiler
  
  If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
  
  To update pip, run:
  
      pip install --upgrade pip
  
  and then retry package installation.
  
  If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
  [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

If the following packages are installed by other means instead:

ltp                       4.2.13
ltp-core                  0.1.4
ltp-extension             0.1.13

running the LTP server still fails:

D:\*****\test\python\ltp>python server.base.py serve
C:\Users\***\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
Traceback (most recent call last):
  File "D:\*****\test\python\ltp\server.base.py", line 154, in <module>
    Fire(Server)
  File "C:\Users\***\AppData\Local\Programs\Python\Python312\Lib\site-packages\fire\core.py", line 143, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\***\AppData\Local\Programs\Python\Python312\Lib\site-packages\fire\core.py", line 477, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
                                ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\***\AppData\Local\Programs\Python\Python312\Lib\site-packages\fire\core.py", line 693, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\*****\test\python\ltp\server.base.py", line 52, in __init__
    self.ltp = LTP(path=path, device=device)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\***\AppData\Local\Programs\Python\Python312\Lib\site-packages\ltp\interface.py", line 125, in LTP
    return LTP_neural._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\***\AppData\Local\Programs\Python\Python312\Lib\site-packages\ltp\nerual.py", line 534, in _from_pretrained
    ltp = cls(**model_kwargs).to(map_location)
          ^^^^^^^^^^^^^^^^^^^
TypeError: LTP.__init__() got an unexpected keyword argument 'path'
