LEADER 04349nam 22006375 450
001 996546836903316
005 20230823043640.0
010   $a981-9916-00-3
024 7 $a10.1007/978-981-99-1600-9
035   $a(CKB)5700000000428131
035   $a(MiAaPQ)EBC30718654
035   $a(Au-PeEL)EBL30718654
035   $a(DE-He213)978-981-99-1600-9
035   $a(PPN)272271926
035   $a(OCoLC)1395909338
035   $a(EXLCZ)995700000000428131
100   $a20230823d2023 u| 0
101 0 $aeng
135   $aurcnu||||||||
181   $ctxt$2rdacontent
182   $cc$2rdamedia
183   $acr$2rdacarrier
200 10$aRepresentation Learning for Natural Language Processing$b[electronic resource] /$fedited by Zhiyuan Liu, Yankai Lin, Maosong Sun
205   $a2nd ed. 2023.
210 1 $aSingapore :$cSpringer Nature Singapore :$cImprint: Springer,$d2023.
215   $a1 online resource (535 pages)
311   $a981-9915-99-6
327   $aChapter 1. Representation Learning and NLP -- Chapter 2. Word Representation -- Chapter 3. Compositional Semantics -- Chapter 4. Sentence Representation -- Chapter 5. Document Representation -- Chapter 6. Sememe Knowledge Representation -- Chapter 7. World Knowledge Representation -- Chapter 8. Network Representation -- Chapter 9. Cross-Modal Representation -- Chapter 10. Resources -- Chapter 11. Outlook.
330   $aThis book provides an overview of recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP), ranging from word embeddings to pre-trained language models. It is divided into four parts. Part I presents representation learning techniques for multiple language entries, including words, sentences, and documents, as well as pre-training techniques. Part II introduces representation techniques related to NLP, including graphs, cross-modal entries, and robustness. Part III introduces representation techniques for knowledge closely related to NLP, including entity-based world knowledge, sememe-based linguistic knowledge, legal domain knowledge, and biomedical domain knowledge. Lastly, Part IV discusses the remaining challenges and future research directions. The theories and algorithms of representation learning presented here can also benefit other related domains such as machine learning, social network analysis, the Semantic Web, information retrieval, data mining, and computational biology. This book is intended for advanced undergraduate and graduate students, post-doctoral fellows, researchers, lecturers, and industrial engineers, as well as anyone interested in representation learning and natural language processing. Compared to the first edition, the second edition (1) provides a more detailed introduction to representation learning in Chapter 1; (2) adds four new chapters introducing pre-trained language models, robust representation learning, legal knowledge representation learning, and biomedical knowledge representation learning; (3) updates recent advances in representation learning in all chapters; and (4) corrects some errors in the first edition. The new content amounts to approximately 50% more than the first edition. This is an open access book.
606   $aNatural language processing (Computer science)
606   $aComputational linguistics
606   $aArtificial intelligence
606   $aData mining
606   $aNatural Language Processing (NLP)
606   $aComputational Linguistics
606   $aArtificial Intelligence
606   $aData Mining and Knowledge Discovery
615 0 $aNatural language processing (Computer science).
615 0 $aComputational linguistics.
615 0 $aArtificial intelligence.
615 0 $aData mining.
615 14$aNatural Language Processing (NLP).
615 24$aComputational Linguistics.
615 24$aArtificial Intelligence.
615 24$aData Mining and Knowledge Discovery.
676   $a006.35
700   $aLiu$b Zhiyuan$0851460
701   $aLin$b Yankai$01423719
701   $aSun$b Maosong$01423720
801 0 $bMiAaPQ
801 1 $bMiAaPQ
801 2 $bMiAaPQ
906   $aBOOK
912   $a996546836903316
996   $aRepresentation Learning for Natural Language Processing$93552088
997   $aUNISA