Joint training for neural machine translation
Record type: Bibliographic - electronic resource : Monograph/item
Title / Author: Joint training for neural machine translation / by Yong Cheng.
Author: Cheng, Yong.
Publisher: Singapore : Springer Singapore, 2019.
Description: xiii, 78 p. : ill., digital ; 24 cm.
Contained by: Springer Nature eBook
Subject: Machine translating.
Electronic resource: https://doi.org/10.1007/978-981-32-9748-7
ISBN: 9789813297487 (electronic bk.)
Cheng, Yong.
Joint training for neural machine translation [electronic resource] / by Yong Cheng. - Singapore : Springer Singapore : Imprint: Springer, 2019. - xiii, 78 p. : ill., digital ; 24 cm. - (Springer theses, ISSN 2190-5053).
1. Introduction -- 2. Neural Machine Translation -- 3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation -- 4. Semi-supervised Learning for Neural Machine Translation -- 5. Joint Training for Pivot-based Neural Machine Translation -- 6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning -- 7. Related Work -- 8. Conclusion.
This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach to help the two complementary models agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly, it describes an end-to-end bidirectional NMT model that connects the source-to-target and target-to-source translation models, allowing the interaction of parameters between these two directional models.
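The agreement idea in the summary above can be made concrete with a toy computation: given a source-to-target and a target-to-source attention (alignment) matrix for the same sentence pair, an agreement term penalizes the two models for placing attention mass on different links. The sketch below is a minimal illustration of that principle; the element-wise-product agreement used here is one plausible loss form, not necessarily the book's exact formulation.

```python
import numpy as np

def agreement_loss(a_s2t, a_t2s):
    """Penalize disagreement between two directional attention matrices.

    a_s2t: (src_len, tgt_len) source-to-target attention weights
    a_t2s: (tgt_len, src_len) target-to-source attention weights
    Returns -log of the summed element-wise agreement, so matching
    alignments yield a lower loss than conflicting ones.
    """
    overlap = np.sum(a_s2t * a_t2s.T)  # mass both models put on the same links
    return -np.log(overlap + 1e-9)

# Toy example: 2 source tokens, 2 target tokens
s2t = np.array([[0.9, 0.1],
                [0.1, 0.9]])
agree = agreement_loss(s2t, s2t.T)           # identical alignments
disagree = agreement_loss(s2t, 1.0 - s2t.T)  # opposite alignments
assert agree < disagree
```

In joint training this term would be added to the two models' likelihood objectives, nudging both attention matrices toward a shared word alignment.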
ISBN: 9789813297487 (electronic bk.)
Standard No.: 10.1007/978-981-32-9748-7 (doi)
Subjects--Topical Terms: Machine translating.
LC Class. No.: P308 .C44 2019
Dewey Class. No.: 418.020285
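Both ISBNs in this record (9789813297487 for the electronic edition, 9789813297470 for the paper edition) can be sanity-checked with the standard ISBN-13 rule: weight the first twelve digits alternately by 1 and 3, and the check digit is (10 - sum mod 10) mod 10. A small Python sketch:

```python
def isbn13_ok(isbn: str) -> bool:
    """Validate an ISBN-13 check digit (hyphens and spaces are ignored)."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits[:12]))
    return digits[12] == (10 - total % 10) % 10

print(isbn13_ok("9789813297487"))  # electronic edition -> True
print(isbn13_ok("9789813297470"))  # paper edition -> True
```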
MARC record:
LDR  02271nmm a2200337 a 4500
001  586735
003  DE-He213
005  20200704201045.0
006  m d
007  cr nn 008maaau
008  210326s2019 si s 0 eng d
020  $a 9789813297487 $q (electronic bk.)
020  $a 9789813297470 $q (paper)
024 7  $a 10.1007/978-981-32-9748-7 $2 doi
035  $a 978-981-32-9748-7
040  $a GP $c GP
041 0  $a eng
050 4  $a P308 $b .C44 2019
072 7  $a UYQL $2 bicssc
072 7  $a COM073000 $2 bisacsh
072 7  $a UYQL $2 thema
082 04 $a 418.020285 $2 23
090  $a P308 $b .C518 2019
100 1  $a Cheng, Yong. $3 878284
245 10 $a Joint training for neural machine translation $h [electronic resource] / $c by Yong Cheng.
260  $a Singapore : $b Springer Singapore : $b Imprint: Springer, $c 2019.
300  $a xiii, 78 p. : $b ill., digital ; $c 24 cm.
490 1  $a Springer theses, $x 2190-5053
505 0  $a 1. Introduction -- 2. Neural Machine Translation -- 3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation -- 4. Semi-supervised Learning for Neural Machine Translation -- 5. Joint Training for Pivot-based Neural Machine Translation -- 6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning -- 7. Related Work -- 8. Conclusion.
520  $a This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach to help the two complementary models agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly, it describes an end-to-end bidirectional NMT model that connects the source-to-target and target-to-source translation models, allowing the interaction of parameters between these two directional models.
650  0 $a Machine translating. $3 180536
650  0 $a Neural networks (Computer science) $3 181982
650 14 $a Natural Language Processing (NLP) $3 826373
650 24 $a Logic in AI. $3 836108
710 2  $a SpringerLink (Online service) $3 273601
773 0  $t Springer Nature eBook
830 0  $a Springer theses. $3 557607
856 40 $u https://doi.org/10.1007/978-981-32-9748-7
950  $a Computer Science (SpringerNature-11645)
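The MARC fields above follow a tag / indicators / '$'-delimited-subfield layout. A minimal sketch of parsing one such textual variable-field line into its parts, assuming the flat "TAG IND $a value $2 value" display shown here (not the binary ISO 2709 exchange format):

```python
import re

def parse_marc_line(line: str):
    """Split a textual MARC variable-field line into its tag, indicator
    string, and an ordered list of (subfield code, value) pairs."""
    head, _, rest = line.partition("$")
    parts = head.split()
    tag = parts[0]
    indicators = "".join(parts[1:])  # empty for fields like 020 with no indicators
    subfields = [(m.group(1), m.group(2).strip())
                 for m in re.finditer(r"\$(\w)\s*([^$]*)", "$" + rest)]
    return tag, indicators, subfields

tag, ind, subs = parse_marc_line("024 7 $a 10.1007/978-981-32-9748-7 $2 doi")
# tag == "024", ind == "7"
# subs == [("a", "10.1007/978-981-32-9748-7"), ("2", "doi")]
```

Control fields (LDR, 001-008) carry no subfields and would need separate fixed-position handling.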
Holdings:
Barcode: 000000190520
Location: Electronic holdings
Circulation category: Book
Material type: E-book
Call number: EB P308 .C518 2019 2019
Use type: Normal
Loan status: On shelf
Reservations: 0