LEADER 04466nam 22006495 450 
001 996465974103316
005 20200629221037.0
010 $a3-319-66435-2
024 7 $a10.1007/978-3-319-66435-4
035 $a(CKB)4100000000587139
035 $a(DE-He213)978-3-319-66435-4
035 $a(MiAaPQ)EBC5590940
035 $a(PPN)204533473
035 $a(EXLCZ)994100000000587139
100 $a20170927d2017 u| 0
101 0 $aeng
135 $aurnn|008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aEvaluation in the Crowd. Crowdsourcing and Human-Centered Experiments$b[electronic resource] $eDagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 – 27, 2015, Revised Contributions /$fedited by Daniel Archambault, Helen Purchase, Tobias Hoßfeld
205 $a1st ed. 2017.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2017.
215 $a1 online resource (VII, 191 p. 15 illus.)
225 1 $aInformation Systems and Applications, incl. Internet/Web, and HCI ;$v10264
311 $a3-319-66434-4
320 $aIncludes bibliographical references and index.
327 $aCrowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.
330 $aAs the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human-centered experiments. The focus of this Dagstuhl seminar, held at Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments that test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies; comparisons between crowdsourcing and lab experiments; the use of crowdsourcing for visualisation, psychology, QoE and HCI empirical studies; and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among people forming the crowdsourcing community.
410 0$aInformation Systems and Applications, incl. Internet/Web, and HCI ;$v10264
606 $aUser interfaces (Computer systems)
606 $aComputer communication systems
606 $aApplication software
606 $aEconomic theory
606 $aUser Interfaces and Human Computer Interaction$3https://scigraph.springernature.com/ontologies/product-market-codes/I18067
606 $aComputer Communication Networks$3https://scigraph.springernature.com/ontologies/product-market-codes/I13022
606 $aInformation Systems Applications (incl. Internet)$3https://scigraph.springernature.com/ontologies/product-market-codes/I18040
606 $aEconomic Theory/Quantitative Economics/Mathematical Methods$3https://scigraph.springernature.com/ontologies/product-market-codes/W29000
615 0$aUser interfaces (Computer systems).
615 0$aComputer communication systems.
615 0$aApplication software.
615 0$aEconomic theory.
615 14$aUser Interfaces and Human Computer Interaction.
615 24$aComputer Communication Networks.
615 24$aInformation Systems Applications (incl. Internet).
615 24$aEconomic Theory/Quantitative Economics/Mathematical Methods.
676 $a005.437
702 $aArchambault$b Daniel$4edt$4http://id.loc.gov/vocabulary/relators/edt
702 $aPurchase$b Helen$4edt$4http://id.loc.gov/vocabulary/relators/edt
702 $aHoßfeld$b Tobias$4edt$4http://id.loc.gov/vocabulary/relators/edt
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a996465974103316
996 $aEvaluation in the Crowd. Crowdsourcing and Human-Centered Experiments$92811279
997 $aUNISA