LEADER 03697nam 2200505 450
001    996547955203316
005    20230528085709.0
010    $a981-19-8934-6
024 7  $a10.1007/978-981-19-8934-6
035    $a(MiAaPQ)EBC7211139
035    $a(Au-PeEL)EBL7211139
035    $a(CKB)26240746500041
035    $a(DE-He213)978-981-19-8934-6
035    $a(PPN)269094741
035    $a(EXLCZ)9926240746500041
100    $a20230528d2023 uy 0
101 0  $aeng
135    $aurcnu||||||||
181    $ctxt$2rdacontent
182    $cc$2rdamedia
183    $acr$2rdacarrier
200 10 $aDynamic network representation based on latent factorization of tensors /$fHao Wu, Xuke Wu, Xin Luo
205    $a1st ed. 2023.
210 1  $aSingapore :$cSpringer,$d[2023]
210 4  $d©2023
215    $a1 online resource (89 pages)
225 1  $aSpringerBriefs in computer science
311 08 $aPrint version: Wu, Hao Dynamic Network Representation Based on Latent Factorization of Tensors Singapore : Springer, c2023 9789811989339
320    $aIncludes bibliographical references and index.
327    $aChapter 1 Introduction -- Chapter 2 Multiple Biases-Incorporated Latent Factorization of Tensors -- Chapter 3 PID-Incorporated Latent Factorization of Tensors -- Chapter 4 Diverse Biases Nonnegative Latent Factorization of Tensors -- Chapter 5 ADMM-Based Nonnegative Latent Factorization of Tensors -- Chapter 6 Perspectives and Conclusion.
330    $aA dynamic network is frequently encountered in various real industrial applications, such as the Internet of Things. It is composed of numerous nodes and large-scale dynamic real-time interactions among them, where each node indicates a specified entity, each directed link indicates a real-time interaction, and the strength of an interaction can be quantified as the weight of a link. As the number of involved nodes increases drastically, it becomes impossible to observe their full interactions at each time slot, making the resultant dynamic network High Dimensional and Incomplete (HDI). An HDI dynamic network with directed and weighted links nevertheless contains rich knowledge regarding the involved nodes' various behavior patterns. Therefore, it is essential to study how to build efficient and effective representation learning models for acquiring useful knowledge. In this book, we first model a dynamic network as an HDI tensor and present the basic latent factorization of tensors (LFT) model. Then, we propose four representative LFT-based network representation methods. The first method integrates the short-time bias, long-time bias and preprocessing bias to precisely represent the volatility of network data. The second method utilizes a proportional-integral-derivative controller to construct an adjusted instance error and achieve a higher convergence rate. The third method addresses the non-negativity of fluctuating network data by constraining latent features to be non-negative and incorporating the extended linear bias. The fourth method adopts an alternating direction method of multipliers framework to build a learning model that represents dynamic networks with high precision and efficiency.
410 0  $aSpringerBriefs in computer science.
606    $aCalculus of tensors
615 0  $aCalculus of tensors.
676    $a515.63
700    $aWu$bHao$01062638
702    $aWu$bXuke
702    $aLuo$bXin
801 0  $bMiAaPQ
801 1  $bMiAaPQ
801 2  $bMiAaPQ
906    $aBOOK
912    $a996547955203316
996    $aDynamic Network Representation Based on Latent Factorization of Tensors$93071628
997    $aUNISA