novocaine / pretraining-bea2019
This project is a fork of grammatical/pretraining-bea2019.
Models, system configurations, and outputs of our winning GEC systems in the BEA 2019 shared task, described in R. Grundkiewicz, M. Junczys-Dowmunt, and K. Heafield, "Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data," BEA 2019.
License: MIT