Joint training for neural machine translation
Record Type:
Electronic resources : Monograph/item
Title/Author:
Joint training for neural machine translation / by Yong Cheng.
Author:
Cheng, Yong.
Published:
Singapore : Springer Singapore, 2019.
Description:
xiii, 78 p. : ill., digital ; 24 cm.
Contained By:
Springer Nature eBook
Subject:
Machine translating.
Online resource:
https://doi.org/10.1007/978-981-32-9748-7
ISBN:
9789813297487 (electronic bk.)
Cheng, Yong.
Joint training for neural machine translation [electronic resource] / by Yong Cheng. - Singapore : Springer Singapore, 2019. - xiii, 78 p. : ill., digital ; 24 cm. - (Springer theses, 2190-5053).
1. Introduction -- 2. Neural Machine Translation -- 3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation -- 4. Semi-supervised Learning for Neural Machine Translation -- 5. Joint Training for Pivot-based Neural Machine Translation -- 6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning -- 7. Related Work -- 8. Conclusion.
This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach to help the two complementary models agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly it describes an end-to-end bidirectional NMT model to connect the source-to-target and target-to-source translation models, allowing the interaction of parameters between these two directional models.
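The agreement-based approach in the summary can be illustrated with a compact objective. The sketch below is inferred only from the description above and is not the book's own notation: the disagreement measure Delta, the weight lambda, and the alignment matrices A are illustrative placeholders. Both directional models are trained on the same parallel corpus D, with a penalty that grows when their attention-derived word alignment matrices for a sentence pair (x, y) disagree:

% Illustrative sketch of an agreement-based joint training objective (not from the book).
% A_{x->y} and A_{y->x} are the word-alignment (attention) matrices produced by the two
% directional models for the same sentence pair; \Delta measures their disagreement.
J(\theta_{x \to y}, \theta_{y \to x}) =
  \sum_{(x,y) \in D} \Big[
      \log P(y \mid x; \theta_{x \to y})
    + \log P(x \mid y; \theta_{y \to x})
    - \lambda \, \Delta\big(A_{x \to y}(x,y),\, A_{y \to x}(x,y)\big)
  \Big]

Here Delta could be, for example, the squared difference between corresponding alignment weights; maximizing J rewards translation likelihood in both directions while keeping the two models' alignments consistent.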
ISBN: 9789813297487 (electronic bk.)
Standard No.: 10.1007/978-981-32-9748-7 (doi)
Subjects--Topical Terms: Machine translating.
LC Class. No.: P308 / .C44 2019
Dewey Class. No.: 418.020285
LDR   02271nmm a2200337 a 4500
001   586735
003   DE-He213
005   20200704201045.0
006   m d
007   cr nn 008maaau
008   210326s2019 si s 0 eng d
020   $a 9789813297487 $q (electronic bk.)
020   $a 9789813297470 $q (paper)
024 7 $a 10.1007/978-981-32-9748-7 $2 doi
035   $a 978-981-32-9748-7
040   $a GP $c GP
041 0 $a eng
050 4 $a P308 $b .C44 2019
072 7 $a UYQL $2 bicssc
072 7 $a COM073000 $2 bisacsh
072 7 $a UYQL $2 thema
082 0 4 $a 418.020285 $2 23
090   $a P308 $b .C518 2019
100 1 $a Cheng, Yong. $3 878284
245 1 0 $a Joint training for neural machine translation $h [electronic resource] / $c by Yong Cheng.
260   $a Singapore : $b Springer Singapore : $b Imprint: Springer, $c 2019.
300   $a xiii, 78 p. : $b ill., digital ; $c 24 cm.
490 1 $a Springer theses, $x 2190-5053
505 0 $a 1. Introduction -- 2. Neural Machine Translation -- 3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation -- 4. Semi-supervised Learning for Neural Machine Translation -- 5. Joint Training for Pivot-based Neural Machine Translation -- 6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning -- 7. Related Work -- 8. Conclusion.
520   $a This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, in order to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach to help the two complementary models agree on word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so as to incorporate these corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can be used to mitigate the data scarcity problem. Lastly it describes an end-to-end bidirectional NMT model to connect the source-to-target and target-to-source translation models, allowing the interaction of parameters between these two directional models.
650 0 $a Machine translating. $3 180536
650 0 $a Neural networks (Computer science) $3 181982
650 1 4 $a Natural Language Processing (NLP) $3 826373
650 2 4 $a Logic in AI. $3 836108
710 2 $a SpringerLink (Online service) $3 273601
773 0 $t Springer Nature eBook
830 0 $a Springer theses. $3 557607
856 4 0 $u https://doi.org/10.1007/978-981-32-9748-7
950   $a Computer Science (SpringerNature-11645)
Items
Inventory Number: 000000190520
Location Name: Electronic Collection (電子館藏)
Item Class: Book (1圖書)
Material type: e-book (電子書)
Call number: EB P308 .C518 2019 2019
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0
Opac note:
Attachments:
Multimedia
Multimedia file: https://doi.org/10.1007/978-981-32-9748-7