
Paper Notes: "Neural Machine Translation by Jointly Learning to Align and Translate"

Chinese title: 基于联合学习对齐和翻译的神经机器翻译

Contents

Abstract

Background: Neural Machine Translation

Task Definition

Encoder-Decoder Framework (Baseline)

Encoder (Baseline)

Decoder (Baseline)

Model Performance

Limitations

Learning to Align and Translate

RNNenc vs RNNsearch

The RNNsearch Encoder

The RNNsearch Decoder

The Idea of Attention

The Attention Mechanism

Computation Steps of the RNNsearch Decoder

The RNNsearch Model

Example

Experimental Setup and Results

Experimental Setup

Evaluation Metric: BLEU

Model Performance

Analysis of Results

Future Work

  • Abstract

  1. The task definition of neural machine translation
  2. The limitations of the encoder-decoder model used in conventional neural machine translation
  3. This paper proposes a neural machine translation model that automatically searches the parts of the source sentence relevant to predicting a target word
  4. The performance of the proposed model
  • Background: Neural Machine Translation

Task Definition

The model takes as input a source-language sentence of 1-of-K coded word vectors:

x = (x_1, …, x_{T_x}), x_i ∈ ℝ^{K_x}

and outputs a target-language sentence of 1-of-K coded word vectors:

y = (y_1, …, y_{T_y}), y_i ∈ ℝ^{K_y}

Task objective: find the translation that maximizes the conditional probability of y given x:

ŷ = arg max_y p(y | x)

Encoder-Decoder Framework (Baseline)

Model name: RNNenc

(Figure: the RNN encoder-decoder architecture, which encodes the source sentence into a fixed-length vector and decodes the translation from it.)

Encoder (Baseline)

x = (x_1, …, x_{T_x}): the sequence of words in an input sentence

h_t = f(x_t, h_{t-1}): the encoder hidden state at time t, where f is a nonlinear function

c = q({h_1, …, h_{T_x}}): the context vector generated from the sequence of hidden states

The encoder reads the input sequence x and produces a context vector c.
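A minimal NumPy sketch of this encoder, assuming f is a plain tanh RNN and q simply returns the last hidden state (the paper uses gated units; all dimensions and weight names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not the paper's): 5 words, embedding size 4, hidden size 3.
T_x, d_emb, d_h = 5, 4, 3
W = rng.normal(size=(d_h, d_emb))   # input-to-hidden weights
U = rng.normal(size=(d_h, d_h))     # hidden-to-hidden weights

def encode(x_seq):
    """h_t = f(x_t, h_{t-1}); c = q({h_1..h_T}) = h_T (the simplest choice of q)."""
    h = np.zeros(d_h)
    for x_t in x_seq:
        h = np.tanh(W @ x_t + U @ h)
    return h  # the fixed-length context vector c

x = rng.normal(size=(T_x, d_emb))   # a toy "sentence" of word vectors
c = encode(x)
print(c.shape)  # (3,)
```

Whatever the sentence length T_x, the output c always has the same fixed dimension — exactly the bottleneck the paper later criticizes.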

Decoder (Baseline)

y = (y_1, …, y_{T_y}): the sequence of generated words

s_t: the decoder hidden state at time t

The decoder predicts the next word y_t given the context vector c and all previously predicted words {y_1, …, y_{t-1}}. In other words, it defines a probability over the translation y by decomposing the joint probability into ordered conditional probabilities:

p(y) = ∏_{t=1}^{T} p(y_t | {y_1, …, y_{t-1}}, c)

With an RNN, each conditional probability is modeled as:

p(y_t | {y_1, …, y_{t-1}}, c) = g(y_{t-1}, s_t, c)

where g is a nonlinear, possibly multi-layered function that outputs the probability of y_t.
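The decomposition can be sketched as follows, with g reduced to a single tanh layer followed by a softmax over a toy vocabulary (a simplification of the paper's deep-output layer; all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
d_h, d_emb, V = 3, 4, 6                 # hidden size, embedding size, toy vocab size

W_s = rng.normal(size=(d_h, d_emb))     # previous-word input weights
U_s = rng.normal(size=(d_h, d_h))       # recurrent weights
C_s = rng.normal(size=(d_h, d_h))       # context-vector weights
W_o = rng.normal(size=(V, d_h))         # output projection

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def decode_step(y_prev, s_prev, c):
    """One step: update s_t from (y_{t-1}, s_{t-1}, c); p(y_t|...) = g(y_{t-1}, s_t, c)."""
    s_t = np.tanh(W_s @ y_prev + U_s @ s_prev + C_s @ c)
    p = softmax(W_o @ s_t)              # distribution over the toy vocabulary
    return s_t, p

c = rng.normal(size=d_h)                # fixed context vector from the encoder
s, y = np.zeros(d_h), np.zeros(d_emb)
log_p_y = 0.0                           # log p(y) = sum_t log p(y_t | y_<t, c)
for _ in range(4):
    s, p = decode_step(y, s, c)
    t = int(p.argmax())                 # greedy choice of the next word
    log_p_y += np.log(p[t])
    y = rng.normal(size=d_emb)          # stand-in embedding of the chosen word
print(np.isclose(p.sum(), 1.0))
```

Note that the same c is fed into every step — the whole source sentence is squeezed through this one vector.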

Model Performance

On English-to-French translation, the Seq2Seq model performs close to the state of the art and outperforms traditional bag-of-words models.

Limitations

  1. The encoder must memorize the semantic information of the entire input sentence
  2. Sentences are encoded into a fixed-dimensional vector regardless of their length, which limits the representation of long sentences during translation
  3. This differs from how humans translate: people do not attend to every word of the source sentence while generating the target translation
  • Learning to Align and Translate

The paper proposes a new neural machine translation model: RNNsearch

Encoder: a bidirectional RNN whose hidden states encode information from both the preceding and the following words

Decoder: introduces an attention mechanism that computes weights over the encoder hidden states

RNNenc vs RNNsearch

RNNenc:

  1. Encodes the entire input sentence into a single fixed-length vector
  2. Uses a unidirectional RNN

RNNsearch:

  1. Encodes the input sentence into a variable-length sequence of vectors
  2. Adaptively selects a subset of these vectors while decoding the translation
  3. Uses a bidirectional RNN

The RNNsearch Encoder

Forward RNN:

Input: the source sentence read in order, (x_1, …, x_{T_x})

Output: the forward hidden states (→h_1, …, →h_{T_x})

Backward RNN:

Input: the source sentence read in reverse order, (x_{T_x}, …, x_1)

Output: the backward hidden states (←h_1, …, ←h_{T_x})

Concatenation:

h_j = [→h_j ; ←h_j], so each annotation h_j summarizes both the words preceding and the words following x_j
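A sketch of the bidirectional encoder under the same simplifying assumptions as before (plain tanh RNNs instead of the paper's gated units; illustrative sizes):

```python
import numpy as np

rng = np.random.default_rng(2)
T_x, d_emb, d_h = 5, 4, 3
Wf, Uf = rng.normal(size=(d_h, d_emb)), rng.normal(size=(d_h, d_h))  # forward RNN
Wb, Ub = rng.normal(size=(d_h, d_emb)), rng.normal(size=(d_h, d_h))  # backward RNN

def run_rnn(W, U, xs):
    """Run a tanh RNN over xs and return the hidden state at every position."""
    h, out = np.zeros(d_h), []
    for x_t in xs:
        h = np.tanh(W @ x_t + U @ h)
        out.append(h)
    return out

x = list(rng.normal(size=(T_x, d_emb)))
h_fwd = run_rnn(Wf, Uf, x)                # reads x_1 .. x_T
h_bwd = run_rnn(Wb, Ub, x[::-1])[::-1]    # reads x_T .. x_1, then realigned to j
# Annotation h_j = [->h_j ; <-h_j] summarizes words before and after x_j.
H = np.stack([np.concatenate([f, b]) for f, b in zip(h_fwd, h_bwd)])
print(H.shape)  # (5, 6): one 2*d_h annotation per source word
```

Unlike RNNenc, the encoder output grows with the sentence: one annotation per source word instead of one vector for the whole sentence.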

The RNNsearch Decoder

The conditional probability of the target word y_i:

p(y_i | y_1, …, y_{i-1}, x) = g(y_{i-1}, s_i, c_i)

where s_i is the decoder hidden state at step i:

s_i = f(s_{i-1}, y_{i-1}, c_i)

Difference from RNNenc: the probability is conditioned on a distinct context vector c_i for each target word y_i, rather than on a single fixed vector c.

The Idea of Attention

Idea: when generating each target word, concentrate on the part of the source context that is relevant to it, rather than on the whole sentence.

The Attention Mechanism

The context vector c_i is computed as a weighted sum of the annotations h_j:

c_i = Σ_{j=1}^{T_x} α_ij h_j

The weight (attention score) α_ij of each annotation h_j:

α_ij = exp(e_ij) / Σ_{k=1}^{T_x} exp(e_ik)

Alignment model:

e_ij = a(s_{i-1}, h_j)

which scores how well the inputs around position j match the output at position i.

Computation Steps of the RNNsearch Decoder

  1. Compute the attention scores (alignment model): e_ij = a(s_{i-1}, h_j)
  2. Compute the context vector weighted by the attention scores: c_i = Σ_j α_ij h_j, with α_ij = exp(e_ij) / Σ_k exp(e_ik)
  3. Generate the new hidden state: s_i = f(s_{i-1}, y_{i-1}, c_i)
  4. Compute the distribution over the next target word: p(y_i | y_1, …, y_{i-1}, x) = g(y_{i-1}, s_i, c_i)
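These four steps can be sketched in NumPy. The alignment model a is the single-hidden-layer MLP form used in the paper, v^T tanh(W_a s + U_a h), while f and g are reduced to single tanh/softmax layers (all sizes and weights are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
T_x, d_ann, d_h, d_emb, V = 5, 6, 3, 4, 6  # annotation, state, embedding, vocab sizes

# Alignment MLP: a(s_{i-1}, h_j) = v_a^T tanh(W_a s + U_a h)
W_a = rng.normal(size=(d_h, d_h))
U_a = rng.normal(size=(d_h, d_ann))
v_a = rng.normal(size=d_h)
# Simplified decoder transition f and output g
W_s = rng.normal(size=(d_h, d_emb))
U_s = rng.normal(size=(d_h, d_h))
C_s = rng.normal(size=(d_h, d_ann))
W_o = rng.normal(size=(V, d_h))

def decoder_step(s_prev, y_prev, H):
    e = np.array([v_a @ np.tanh(W_a @ s_prev + U_a @ h_j) for h_j in H])  # 1. scores
    alpha = np.exp(e - e.max()); alpha /= alpha.sum()                     #    softmax
    c_i = alpha @ H                                                       # 2. context
    s_i = np.tanh(W_s @ y_prev + U_s @ s_prev + C_s @ c_i)                # 3. new state
    z = W_o @ s_i; p = np.exp(z - z.max()); p /= p.sum()                  # 4. word dist.
    return s_i, p, alpha

H = rng.normal(size=(T_x, d_ann))          # encoder annotations h_1 .. h_Tx
s, y = np.zeros(d_h), np.zeros(d_emb)
s, p, alpha = decoder_step(s, y, H)
print(alpha.sum(), p.sum())                # both are proper distributions
```

Because a new alpha (and hence a new c_i) is computed at every step from s_{i-1}, the decoder can "look at" different source words for each target word.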

The RNNsearch Model

(Figure: the full RNNsearch architecture, a bidirectional encoder with an attention-based decoder.)

Example

  • Experimental Setup and Results

Experimental Setup

Models: RNNsearch and RNNenc

Task: translation from English (source language) to French (target language)

Dataset: the WMT'14 dataset

Comparison: each model is trained twice, once on sentences of maximum length 30 and once on sentences of maximum length 50

Evaluation Metric: BLEU

BLEU is a text-evaluation algorithm that measures the correspondence between a machine translation and professional human translations:

BLEU = BP · exp(Σ_{n=1}^{N} w_n log p_n)

where p_n is the modified n-gram precision, w_n = 1/N, and the brevity penalty BP is 1 when the candidate is longer than the reference and exp(1 − r/c) otherwise.
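A minimal sketch of sentence-level BLEU, assuming a single reference and no smoothing (real evaluations use corpus-level BLEU; this toy version returns 0 whenever any n-gram precision is 0):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Geometric mean of clipped n-gram precisions times the brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())  # clipped counts
        total = max(sum(cand.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    c, r = len(candidate), len(reference)
    bp = 1.0 if c > r else math.exp(1 - r / c)   # brevity penalty
    return bp * geo_mean

ref = "the cat sat on the mat".split()
print(bleu(ref, ref))                      # identical sentences -> 1.0
print(bleu("the cat".split(), ref) < 1.0)  # a too-short candidate is penalized
```

The clipping (min with the reference count) prevents a candidate from scoring well by repeating one correct word, and the brevity penalty prevents gaming the metric with very short outputs.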

Model Performance

(Figure: BLEU score as a function of sentence length for the compared models.)

BLEU scores of the generated translations on the test set with respect to sentence length. The results are computed over the whole test set, including sentences that contain unknown words. The RNNsearch model performs particularly well on long sentences.

Analysis of Results

(Figure: attention-weight heatmaps for sample translations.)

The x-axis shows the words of the source language and the y-axis the words of the target language; each cell shows the attention score α_ij between the j-th source word and the i-th target word in grayscale (0: black, 1: white).

  • Future Work

  1. Different ways of computing the attention scores lead to different results
  2. A unidirectional LSTM combined with attention-score computation can achieve a comparable effect
  3. Other methods for computing attention scores could be proposed
NLP
