1.

Record No.

UNINA9910483388203321

Author

Sabharwal Navin

Title

Hands-on Question Answering Systems with BERT : Applications in Neural Networks and Natural Language Processing / by Navin Sabharwal, Amit Agrawal

Publication/distribution/printing

Berkeley, CA : Apress : Imprint : Apress, 2021

ISBN

1-5231-5076-9

1-4842-6664-1

Edition

[1st ed. 2021.]

Physical description

1 online resource (XV, 184 p. 80 illus.)

Discipline

006.32

Subjects

Machine learning

Cloud Computing

Programming languages (Electronic computers)

Machine Learning

Programming Language

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Includes index.

Contents note

Chapter 1: Introduction to Natural Language Processing -- Chapter 2: Introduction to Word Embeddings -- Chapter 3: BERT Algorithms Explained -- Chapter 4: BERT Model Applications - Question Answering System -- Chapter 5: BERT Model Applications - Other tasks -- Chapter 6: Future of BERT models.

Summary/abstract

Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, lemmatization, and bag of words. Next, you'll look at neural networks for NLP, starting with variants such as recurrent neural networks, encoders and decoders, bidirectional encoders and decoders, and transformer models. Along the way, you'll cover word embeddings and their types, along with the basics of BERT. After this solid foundation, you'll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You'll see different BERT variations, followed by a hands-on example of a question answering system. Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT. You will: examine the fundamentals of word embeddings; apply neural networks and BERT for various NLP tasks; develop a question answering system from scratch; and train question answering systems on your own data.
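
To make the extractive question answering workflow described in the summary concrete, here is a minimal sketch built on a BERT-family model. It uses the Hugging Face transformers library and the distilbert-base-cased-distilled-squad checkpoint; both the library and the model name are assumptions made for illustration, not code taken from the book itself.

from transformers import pipeline

# Load an extractive question-answering pipeline.
# The model name is an assumed, publicly available SQuAD-fine-tuned checkpoint.
qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

context = (
    "BERT is pre-trained with masked language modeling and next sentence "
    "prediction, and can be fine-tuned for extractive question answering."
)

# The pipeline extracts an answer span from the context and returns it
# together with a confidence score and character offsets.
result = qa(question="How is BERT pre-trained?", context=context)
print(result["answer"], result["score"])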

2.

Record No.

UNINA9910133012503321

Title

Acta informatica

Publication/distribution/printing

Heidelberg : Springer-Verlag Heidelberg

ISSN

1432-0525

Discipline

001.6405

Subjects

Electronic data processing

Information storage and retrieval systems

Machine theory

Informatics

Computer science

Information systems

Automata theory

Periodical

Periodicals.

Language of publication

English

Format

Printed material

Bibliographic level

Periodical

Note generali

Refereed/Peer-reviewed