Effective Approaches to Attention-based Neural Machine Translation

This paper examines two effective classes of attention mechanism: global attention, which always attends to all source words, and local attention, which looks at only a subset of the source sentence at a time. Experiments show that both approaches are effective on English-German translation in both directions. The local attention model yields a gain of 5.0 BLEU points over a non-attentional system, and an ensemble of models with different attention architectures establishes a new state-of-the-art result of 25.9 BLEU points on the WMT'15 English-to-German translation task.


Abstract:

An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words and a local one that only looks at a subset of source words at a time. We demonstrate the effectiveness of both approaches over the WMT translation tasks between English and German in both directions. With local attention, we achieve a significant gain of 5.0 BLEU points over non-attentional systems which already incorporate known techniques such as dropout. Our ensemble model using different attention architectures has established a new state-of-the-art result in the WMT'15 English to German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.
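
To make the two variants concrete, here is a minimal NumPy sketch of global attention with the dot score and of local-p attention with a Gaussian-weighted window, as the abstract describes them. All dimensions, variable names, and the random toy inputs are illustrative assumptions for this sketch, not taken from the paper's code.

```python
# Toy sketch of global vs. local-p attention (assumed dimensions; not the authors' code).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

hidden_dim, src_len = 4, 6
rng = np.random.default_rng(0)
h_s = rng.normal(size=(src_len, hidden_dim))  # source hidden states
h_t = rng.normal(size=hidden_dim)             # current target hidden state

# Global attention: score every source position (dot score),
# then take the alignment-weighted average as the context vector.
scores = h_s @ h_t
a_global = softmax(scores)
c_global = a_global @ h_s

# Local-p attention: predict an aligned position p_t, score only the
# window [p_t - D, p_t + D], and reweight the alignments with a
# Gaussian centred on p_t (sigma = D / 2), as in the paper.
D = 2
v_p = rng.normal(size=hidden_dim)              # assumed toy parameters
W_p = rng.normal(size=(hidden_dim, hidden_dim))
p_t = src_len / (1.0 + np.exp(-(v_p @ np.tanh(W_p @ h_t))))  # in (0, src_len)
lo, hi = max(0, int(p_t) - D), min(src_len, int(p_t) + D + 1)
a_local = softmax(h_s[lo:hi] @ h_t)
positions = np.arange(lo, hi)
a_local = a_local * np.exp(-((positions - p_t) ** 2) / (2 * (D / 2) ** 2))
c_local = a_local @ h_s[lo:hi]

print("global context:", c_global)
print("local context: ", c_local)
```

The design difference is visible in the slicing: global attention scores all `src_len` positions, while local attention scores only the `2D + 1` positions inside the window, so its per-step cost does not grow with source length.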
