LEADER 03991nam 22007575 450
001 9910299960303321
005 20200702232627.0
010 $a3-319-70609-8
024 7 $a10.1007/978-3-319-70609-2
035 $a(CKB)4340000000223546
035 $a(DE-He213)978-3-319-70609-2
035 $a(MiAaPQ)EBC6298184
035 $a(MiAaPQ)EBC5590649
035 $a(Au-PeEL)EBL5590649
035 $a(OCoLC)1066192745
035 $a(PPN)221251995
035 $a(EXLCZ)994340000000223546
100 $a20171104d2018 u| 0
101 0 $aeng
135 $aurnn|008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aFrom Content-based Music Emotion Recognition to Emotion Maps of Musical Pieces /$fby Jacek Grekow
205 $a1st ed. 2018.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2018.
215 $a1 online resource (XIV, 138 p. 71 illus., 22 illus. in color.)
225 1 $aStudies in Computational Intelligence,$x1860-949X ;$v747
311 $a3-319-70608-X
320 $aIncludes bibliographical references and index.
327 $aIntroduction -- Representations of Emotions -- Human Annotation -- MIDI Features -- Hierarchical Emotion Detection in MIDI Files.
330 $aThe problems addressed in this book include emotion representation, annotation of music excerpts, feature extraction, and machine learning. The book chiefly focuses on content-based analysis of music files, in which a system automatically analyzes the structure of a music file and annotates it with the perceived emotions. Further, it explores emotion detection in MIDI and audio files. In the experiments presented here, both the categorical and dimensional approaches were applied, and music files were annotated by drawing on the knowledge and expertise of music experts with a university music education. The automatic emotion detection systems constructed and described in the book make it possible to index and subsequently search through music databases according to emotion. In turn, the emotion maps of musical compositions provide valuable new insights into the distribution of emotions in music and can be used to compare that distribution across compositions, or to conduct emotional comparisons of different interpretations of the same composition.
410 0$aStudies in Computational Intelligence,$x1860-949X ;$v747
606 $aComputational intelligence
606 $aMusic
606 $aAcoustical engineering
606 $aEmotions
606 $aPattern recognition
606 $aAcoustics
606 $aComputational Intelligence$3https://scigraph.springernature.com/ontologies/product-market-codes/T11014
606 $aMusic$3https://scigraph.springernature.com/ontologies/product-market-codes/417000
606 $aEngineering Acoustics$3https://scigraph.springernature.com/ontologies/product-market-codes/T16000
606 $aEmotion$3https://scigraph.springernature.com/ontologies/product-market-codes/Y20140
606 $aPattern Recognition$3https://scigraph.springernature.com/ontologies/product-market-codes/I2203X
606 $aAcoustics$3https://scigraph.springernature.com/ontologies/product-market-codes/P21069
615 0$aComputational intelligence.
615 0$aMusic.
615 0$aAcoustical engineering.
615 0$aEmotions.
615 0$aPattern recognition.
615 0$aAcoustics.
615 14$aComputational Intelligence.
615 24$aMusic.
615 24$aEngineering Acoustics.
615 24$aEmotion.
615 24$aPattern Recognition.
615 24$aAcoustics.
676 $a780.285
700 $aGrekow$b Jacek$4aut$4http://id.loc.gov/vocabulary/relators/aut$01063953
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910299960303321
996 $aFrom Content-based Music Emotion Recognition to Emotion Maps of Musical Pieces$92535461
997 $aUNINA