
mt-reading-list's People

Contributors

alphadl, anoopkunchukuttan, dingyz12, emresatir, eurus-holmes, glaceon31, gpengzhi, hsing-wang, juliakreutzer, lvapeab, minicheshire, miradel51, shuo-git, thudcsly, thudcswd, xc-kiwiberry, zhanghuimeng, zhouyan19


mt-reading-list's Issues

Improving Attention Modeling with Implicit Distortion and Fertility for Machine Translation

@inproceedings{Feng2016Improving,
  author    = {Shi Feng and Shujie Liu and Nan Yang and Mu Li and Ming Zhou and Kenny Q. Zhu},
  title     = {Improving Attention Modeling with Implicit Distortion and Fertility for Machine Translation},
  booktitle = {{COLING} 2016, 26th International Conference on Computational Linguistics, Proceedings of the Conference: Technical Papers, December 11-16, 2016, Osaka, Japan},
  pages     = {3082--3092},
  year      = {2016}
}

Any papers relating to transliteration?

In particular, I'm looking for papers on incorporating domain glossaries and on improving the accuracy and consistency of number translations in neural machine translation.
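
Not a paper pointer, but for the number-consistency part of the question, one common trick in practice is placeholder substitution before and after decoding. The snippet below is only a simplified, hypothetical illustration (the regex, the placeholder format and the echoed "translation" are assumptions, not from any listed paper); glossary terms can be handled the same way.

import re

def mask_numbers(source):
    """Replace each number with an indexed placeholder so the MT system
    copies it verbatim instead of rewriting it."""
    numbers = []
    def _sub(match):
        numbers.append(match.group(0))
        return "<NUM{}>".format(len(numbers) - 1)
    return re.sub(r"\d[\d.,]*\d|\d", _sub, source), numbers

def unmask_numbers(translation, numbers):
    """Restore the original numbers in the MT output."""
    for i, value in enumerate(numbers):
        translation = translation.replace("<NUM{}>".format(i), value)
    return translation

if __name__ == "__main__":
    src = "The contract is worth 1,250,000 USD over 3 years."
    masked, nums = mask_numbers(src)
    # A real pipeline would call the NMT system on `masked` here; we simply
    # echo the masked input to keep the example self-contained.
    hypothetical_translation = masked
    print(unmask_numbers(hypothetical_translation, nums))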

[Document-level translation] Towards Making the Most of Context in Neural Machine Translation

Hi there,

Here is another paper about document-level translation (which can also deal with single-sentence translation):

Zaixiang Zheng, Xiang Yue, Shujian Huang, Jiajun Chen, Alexandra Birch. 2020. Towards Making the Most of Context in Neural Machine Translation. In Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence (IJCAI).

IJCAI version | arXiv version (code available)

Many thanks!
Zaixiang

Typos of Author Names

Yiming Wang, Fei Tian, Dongjian He, Tao Qin, ChengXiang Zhai, Tie-Yan Liu. 2019. Non-Autoregressive Machine Translation with Auxiliary Regularization. In Proceedings of AAAI 2019.

The first and third authors have wrong names.

----------------------------------->

Yiren Wang, Fei Tian, Di He, Tao Qin, ChengXiang Zhai, Tie-Yan Liu. 2019. Non-Autoregressive Machine Translation with Auxiliary Regularization. In Proceedings of AAAI 2019.

Neural Machine Translation in Linear Time

  • arXiv (cs.CL), 2016
  • PDF link: https://arxiv.org/pdf/1610.10099v1.pdf
  • MT & language modeling: this paper introduced ByteNet, a character-level encoder-decoder model built from dilated convolutions, which encouraged a line of follow-up research (e.g., the Transformer) and reported two inspiring and insightful results at the time (a rough sketch of the dilated-convolution idea follows this list):

    * The ByteNet decoder attains state-of-the-art performance on character-level language modelling and outperforms the previous best results obtained with recurrent neural networks.

    * The ByteNet also achieves performance on raw character-level machine translation that approaches that of the best neural translation models that run in quadratic time.
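
To make the "linear time" claim concrete, here is a minimal PyTorch sketch of a causal dilated 1-D convolution stack, the building block at the heart of ByteNet. This is not the authors' code: the module names, channel width and layer count are made up for illustration, and real ByteNet blocks add normalisation and masked residual units. The point is only that doubling the dilation at each layer grows the receptive field exponentially while the per-token cost stays constant.

# Minimal sketch (not the authors' implementation) of a causal dilated
# convolution stack: per-token cost is constant, so total cost is linear in
# sequence length, while the receptive field doubles with every layer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalDilatedConv1d(nn.Module):
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        # Pad on the left only, so position t never sees positions > t.
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):  # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.left_pad, 0)))

class DilatedStack(nn.Module):
    """Dilations 1, 2, 4, ... double at every layer (illustrative sizes)."""
    def __init__(self, channels=64, num_layers=5):
        super().__init__()
        self.layers = nn.ModuleList(
            [CausalDilatedConv1d(channels, dilation=2 ** i) for i in range(num_layers)]
        )

    def forward(self, x):
        for layer in self.layers:
            x = torch.relu(layer(x)) + x  # simple residual connection
        return x

if __name__ == "__main__":
    fake_char_embeddings = torch.randn(1, 64, 100)      # (batch, channels, time)
    print(DilatedStack()(fake_char_embeddings).shape)   # torch.Size([1, 64, 100])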

"Dual Inference for Machine Learning" and “Dual Supervised Learning”

@inproceedings{xia2017dualsupervised,
  title     = {Dual Supervised Learning},
  author    = {Xia, Yingce and Qin, Tao and Chen, Wei and Bian, Jiang and Yu, Nenghai and Liu, Tie-Yan},
  booktitle = {International Conference on Machine Learning},
  pages     = {3789--3798},
  year      = {2017}
}

@inproceedings{Xia2017DualInference,
  author    = {Yingce Xia and Jiang Bian and Tao Qin and Nenghai Yu and Tie{-}Yan Liu},
  title     = {Dual Inference for Machine Learning},
  booktitle = {Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, {IJCAI} 2017, Melbourne, Australia, August 19-25, 2017},
  pages     = {3112--3118},
  year      = {2017},
  crossref  = {DBLP:conf/ijcai/2017},
  url       = {https://doi.org/10.24963/ijcai.2017/434},
  doi       = {10.24963/ijcai.2017/434},
  timestamp = {Wed, 27 Jun 2018 12:24:11 +0200},
  biburl    = {https://dblp.org/rec/bib/conf/ijcai/XiaBQYL17},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
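
For readers skimming this issue: the idea shared by the two papers above is the probabilistic duality between the forward and backward translation models. Written out (a paraphrase of the papers' formulation, with the two directional models denoted by their parameters):

% Both directions must agree on the joint distribution. Dual Supervised Learning
% turns the gap between the two sides into a training regularizer; Dual Inference
% combines the two sides at test time without retraining.
\[
  P(x)\, P(y \mid x; \theta_{x \to y}) \;=\; P(x, y) \;=\; P(y)\, P(x \mid y; \theta_{y \to x})
\]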

A paper might be inappropriately categorized

Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, and Hang Li. 2017. Context Gates for Neural Machine Translation. Transactions of the Association for Computational Linguistics. (Citation: 36)

This paper is essentially about how to balance source-side and target-side context in sentence-level MT. The paper might be inappropriately categorized into "document-level translation".

I personally suggest it could be put into "Coverage Constraints".
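
For anyone deciding where the paper fits, the mechanism is easy to summarise: a learned gate interpolates between source-side and target-side information at each decoding step. The sketch below is a simplified, hypothetical rendering of that idea, not the authors' code (in the paper the gate is applied inside the GRU state update and comes in source/target/both variants).

# Simplified sketch of the context-gate idea (Tu et al., 2017); dimensions and
# the exact combination are illustrative, not the paper's precise equations.
import torch
import torch.nn as nn

class ContextGate(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # The gate looks at the previous target embedding, the previous decoder
        # state and the current source context vector.
        self.gate = nn.Linear(3 * dim, dim)

    def forward(self, prev_target_emb, prev_decoder_state, source_context):
        z = torch.sigmoid(
            self.gate(torch.cat([prev_target_emb, prev_decoder_state, source_context], dim=-1))
        )
        # z weights source-side context, (1 - z) weights target-side information.
        return z * source_context + (1.0 - z) * prev_decoder_state

if __name__ == "__main__":
    dim = 8
    gate = ContextGate(dim)
    mixed = gate(torch.randn(1, dim), torch.randn(1, dim), torch.randn(1, dim))
    print(mixed.shape)  # torch.Size([1, 8])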

Algorithms used in top-performing WMT systems

Thank you for your awesome MT-Reading-List. I suggest also listing the algorithms used in top-performing WMT systems, because some papers are not effective once data are abundant, while in practice an ensemble of Transformers with BPE and back-translation is a strong baseline. The algorithms employed in the WMT competitions would clarify which ideas actually work when data are abundant.
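
As a concrete reference point for the back-translation ingredient mentioned above, here is a schematic outline of the data-augmentation step (Sennrich et al., 2016). The translate_target_to_source stub is a placeholder for a trained reverse-direction system, so this is an illustration of the recipe rather than working training code.

# Schematic outline of back-translation: turn monolingual target-language text
# into synthetic parallel data using a reverse-direction translation system.
def translate_target_to_source(target_sentence):
    # Placeholder for a trained target->source NMT model.
    return "<synthetic source for: {}>".format(target_sentence)

def back_translate(monolingual_target_corpus):
    """Pair each monolingual target sentence with a synthetic source sentence."""
    return [
        (translate_target_to_source(sentence), sentence)
        for sentence in monolingual_target_corpus
    ]

if __name__ == "__main__":
    monolingual_german = ["Ein Beispielsatz.", "Noch ein Satz."]
    synthetic_pairs = back_translate(monolingual_german)
    # These pairs are then mixed with the real parallel corpus (often up- or
    # down-sampled) before training the source->target model.
    for pair in synthetic_pairs:
        print(pair)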

Neural machine translation with reconstruction

@inproceedings{tu2017neural,
  title     = {Neural Machine Translation with Reconstruction},
  author    = {Tu, Zhaopeng and Liu, Yang and Shang, Lifeng and Liu, Xiaohua and Li, Hang},
  booktitle = {Thirty-First AAAI Conference on Artificial Intelligence},
  year      = {2017}
}

Welcome! Here's the intro

  • We keep the list updated with machine translation papers from top conferences, including ICLR, NeurIPS, ICML, ACL, EMNLP, NAACL, COLING, EACL and so on, as well as from top journals such as CL and TACL.
  • Currently, we only add officially published papers, as well as arXiv papers that have triggered heated discussion (BERT, for example!).
  • For recently posted arXiv papers, or insightful papers that were unluckily rejected, we follow them by opening issues and close those issues once the corresponding papers are accepted. Discussion about them is also welcome~
  • Last but not least, feel free to recommend more papers!
