Abstract

The attentional mechanism has proven to be effective in improving end-to-end neural machine translation. However, due to the intricate structural divergence between natural languages, unidirectional attention-based models might only capture partial aspects of attentional regularities. We propose agreement-based joint training for bidirectional attention-based end-to-end neural machine translation. Instead of training source-to-target and target-to-source translation models independently, our approach encourages the two complementary models to agree on word alignment matrices on the same training data. Experiments on Chinese-English and English-French translation tasks show that agreement-based joint training significantly improves both alignment and translation quality over independent training.

Institute for Interdisciplinary Information Sciences, Tsinghua University, Beijing, China
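The abstract describes coupling the source-to-target and target-to-source models through their attention (soft alignment) matrices. The sketch below illustrates one way such an agreement term could enter a joint training objective; the squared-difference form of the penalty, the tensor shapes, and the weighting coefficient `lam` are illustrative assumptions, not the paper's exact formulation.

import torch

def disagreement(attn_s2t: torch.Tensor, attn_t2s: torch.Tensor) -> torch.Tensor:
    """Penalize mismatch between the two directional models' soft alignments.

    attn_s2t: (tgt_len, src_len) attention weights from the source-to-target model
    attn_t2s: (src_len, tgt_len) attention weights from the target-to-source model
    Transposing one matrix puts both alignments in the same orientation; the
    squared Frobenius distance then measures how strongly they disagree.
    """
    return ((attn_s2t - attn_t2s.transpose(0, 1)) ** 2).sum()

def joint_loss(nll_s2t, nll_t2s, attn_s2t, attn_t2s, lam=1.0):
    """Joint objective: both translation losses plus a weighted agreement
    term that ties the two complementary models together on the same data."""
    return nll_s2t + nll_t2s + lam * disagreement(attn_s2t, attn_t2s)

# Toy usage: random attention matrices for a 5-word source, 4-word target pair.
if __name__ == "__main__":
    src_len, tgt_len = 5, 4
    a_s2t = torch.softmax(torch.randn(tgt_len, src_len), dim=-1)  # rows sum to 1
    a_t2s = torch.softmax(torch.randn(src_len, tgt_len), dim=-1)
    nll_s2t, nll_t2s = torch.tensor(2.3), torch.tensor(2.7)       # placeholder translation losses
    print(joint_loss(nll_s2t, nll_t2s, a_s2t, a_t2s, lam=0.5))

In a sketch like this, minimizing the joint loss pushes both models' alignment matrices toward a shared word alignment while each model is still trained on its own translation objective.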