Title: Neural Representations of Natural Language / by Lyndon White, Roberto Togneri, Wei Liu, Mohammed Bennamoun.
Edition: 1st ed. 2019.
Publication: Singapore : Springer Singapore : Imprint: Springer, 2019.
Description: 1 online resource (XIV, 122 p., 36 illus., 31 illus. in color).
Series: Studies in Computational Intelligence, ISSN 1860-949X ; 783.
Language: English.
ISBN: 981-13-0062-3 ; 981-13-0061-5.
DOI: 10.1007/978-981-13-0062-2
Identifiers: (CKB)4100000006098266 ; (DE-He213)978-981-13-0062-2 ; (MiAaPQ)EBC5921923 ; (PPN)229914977 ; (EXLCZ)994100000006098266
Note: Includes bibliographical references and index.

Contents: Introduction -- Machine Learning for Representations -- Current Challenges in Natural Language Processing -- Word Representations -- Word Sense Representations -- Phrase Representations -- Sentence Representations and Beyond -- Character-Based Representations -- Conclusion.

Summary: This book offers an introduction to modern natural language processing using machine learning, focusing on how neural networks create a machine-interpretable representation of the meaning of natural language. Language is crucially linked to ideas; as Webster's 1923 "English Composition and Literature" puts it, "A sentence is a group of words expressing a complete thought." The representation of sentences, and of the words that make them up, is therefore vital to advancing artificial intelligence and other "smart" systems currently being developed. Providing an overview of research in the area, from Bengio et al.'s seminal 2003 work on a "Neural Probabilistic Language Model" to the latest techniques, the book enables readers to understand how these techniques relate to one another and which is best suited to their purposes. As well as an introduction to neural networks in general, and recurrent neural networks in particular, it details the methods used for representing words, senses of words, and larger structures such as sentences or documents.
The book highlights practical implementations and discusses many aspects that are often overlooked or misunderstood. It includes thorough instruction on challenging areas such as hierarchical softmax and negative sampling, ensuring the reader fully and easily understands how the algorithms function. Combining practical aspects with a more traditional review of the literature, it is directly applicable to a broad readership: an invaluable introduction for early graduate students working in natural language processing; a trustworthy guide for industry developers wishing to make use of recent innovations; and a sturdy bridge for researchers already familiar with either linguistics or machine learning who wish to understand the other.

Subjects: Computational intelligence ; Signal processing ; Image processing ; Speech processing systems ; Pattern perception ; Computational linguistics.
Subject codes (Springer product-market codes):
Computational Intelligence: https://scigraph.springernature.com/ontologies/product-market-codes/T11014
Signal, Image and Speech Processing: https://scigraph.springernature.com/ontologies/product-market-codes/T24051
Pattern Recognition: https://scigraph.springernature.com/ontologies/product-market-codes/I2203X
Computational Linguistics: https://scigraph.springernature.com/ontologies/product-market-codes/N22000
Dewey classification: 006.3
Authors (relator code "aut", http://id.loc.gov/vocabulary/relators/aut): White, Lyndon ; Togneri, Roberto ; Liu, Wei ; Bennamoun, Mohammed.
Record source: MiAaPQ. Resource type: BOOK. Record ID: 9910483614903321 (UNINA).