grammatical / pretraining-bea2019
Models, system configurations, and outputs of our winning GEC systems in the BEA 2019 shared task, described in R. Grundkiewicz, M. Junczys-Dowmunt, K. Heafield: "Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data", BEA 2019.
License: MIT