Hands-on question answering systems with BERT : applications in neural networks and natural language processing / by Navin Sabharwal, Amit Agrawal.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Hands-on question answering systems with BERT / by Navin Sabharwal, Amit Agrawal.
Remainder of title:
applications in neural networks and natural language processing
Author:
Sabharwal, Navin.
Other author:
Agrawal, Amit.
Published:
Berkeley, CA : Apress, 2021.
Description:
xv, 184 p. : ill., digital ; 24 cm.
Contained By:
Springer Nature eBook
Subject:
Neural networks (Computer science)
Online resource:
https://doi.org/10.1007/978-1-4842-6664-9
ISBN:
9781484266649 (electronic bk.)
Hands-on question answering systems with BERT [electronic resource] : applications in neural networks and natural language processing / by Navin Sabharwal, Amit Agrawal. - Berkeley, CA : Apress, 2021. - xv, 184 p. : ill., digital ; 24 cm.
Chapter 1: Introduction to Natural Language Processing -- Chapter 2: Introduction to Word Embeddings -- Chapter 3: BERT Algorithms Explained -- Chapter 4: BERT Model Applications - Question Answering System -- Chapter 5: BERT Model Applications - Other tasks -- Chapter 6: Future of BERT models.
Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, lemmatization, and bag of words. Next, you'll look at neural networks for NLP and their variants, such as recurrent neural networks, encoders and decoders, bi-directional encoders and decoders, and transformer models. Along the way, you'll cover word embeddings and their types along with the basics of BERT. After this solid foundation, you'll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You'll see different BERT variations, followed by a hands-on example of a question answering system. Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT. You will: examine the fundamentals of word embeddings; apply neural networks and BERT to various NLP tasks; develop a question-answering system from scratch; and train question-answering systems on your own data.
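The NLP basics the summary mentions (tokenization and bag of words) can be sketched in a few lines of plain Python. This is an illustrative example only, not code from the book, and it uses a deliberately naive regex tokenizer rather than the subword tokenization BERT itself relies on.

```python
import re
from collections import Counter

def tokenize(text):
    # Lowercase and split on non-alphanumeric characters
    # (a simple whitespace/punctuation tokenizer).
    return re.findall(r"[a-z0-9]+", text.lower())

def bag_of_words(docs):
    # Build a shared sorted vocabulary across all documents,
    # then represent each document as a vector of term counts.
    vocab = sorted({tok for doc in docs for tok in tokenize(doc)})
    vectors = []
    for doc in docs:
        counts = Counter(tokenize(doc))
        vectors.append([counts.get(word, 0) for word in vocab])
    return vocab, vectors

docs = ["BERT answers questions.", "Questions need answers."]
vocab, vectors = bag_of_words(docs)
```

Each document becomes a count vector over the shared vocabulary; models like the ones the book builds toward replace these sparse counts with dense learned embeddings.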
ISBN: 9781484266649 (electronic bk.)
Standard No.: 10.1007/978-1-4842-6664-9 (doi)
Subjects--Topical Terms: Neural networks (Computer science)
LC Class. No.: QA76.87 .S234 2021
Dewey Class. No.: 006.32
LDR    02747nmm a2200325 a 4500
001    597369
003    DE-He213
005    20210630164829.0
006    m     d
007    cr nn 008maaau
008    211019s2021    cau     s         0 eng d
020    $a 9781484266649 $q (electronic bk.)
020    $a 9781484266632 $q (paper)
024 7  $a 10.1007/978-1-4842-6664-9 $2 doi
035    $a 978-1-4842-6664-9
040    $a GP $c GP
041 0  $a eng
050 4  $a QA76.87 $b .S234 2021
072 7  $a UYQM $2 bicssc
072 7  $a COM004000 $2 bisacsh
072 7  $a UYQM $2 thema
082 04 $a 006.32 $2 23
090    $a QA76.87 $b .S115 2021
100 1  $a Sabharwal, Navin. $3 609575
245 10 $a Hands-on question answering systems with BERT $h [electronic resource] : $b applications in neural networks and natural language processing / $c by Navin Sabharwal, Amit Agrawal.
260    $a Berkeley, CA : $b Apress : $b Imprint: Apress, $c 2021.
300    $a xv, 184 p. : $b ill., digital ; $c 24 cm.
505 0  $a Chapter 1: Introduction to Natural Language Processing -- Chapter 2: Introduction to Word Embeddings -- Chapter 3: BERT Algorithms Explained -- Chapter 4: BERT Model Applications - Question Answering System -- Chapter 5: BERT Model Applications - Other tasks -- Chapter 6: Future of BERT models.
520    $a Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, lemmatization, and bag of words. Next, you'll look at neural networks for NLP and their variants, such as recurrent neural networks, encoders and decoders, bi-directional encoders and decoders, and transformer models. Along the way, you'll cover word embeddings and their types along with the basics of BERT. After this solid foundation, you'll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You'll see different BERT variations, followed by a hands-on example of a question answering system. Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT. You will: examine the fundamentals of word embeddings; apply neural networks and BERT to various NLP tasks; develop a question-answering system from scratch; and train question-answering systems on your own data.
650 0  $a Neural networks (Computer science) $3 181982
650 0  $a Machine learning. $3 188639
650 0  $a Natural language processing (Computer science) $3 200539
650 14 $a Machine Learning. $3 833608
650 24 $a Professional Computing. $3 763344
700 1  $a Agrawal, Amit. $3 860267
710 2  $a SpringerLink (Online service) $3 273601
773 0  $t Springer Nature eBook
856 40 $u https://doi.org/10.1007/978-1-4842-6664-9
950    $a Professional and Applied Computing (SpringerNature-12059)
Items (1 record):
Inventory Number: 000000196099
Location Name: Electronic collection (電子館藏)
Item Class: Book (圖書)
Material type: E-book (電子書)
Call number: EB QA76.87 .S115 2021 2021
Usage Class: Normal (一般使用)
Loan Status: On shelf (在架)
No. of reservations: 0
Multimedia:
Multimedia file: https://doi.org/10.1007/978-1-4842-6664-9