Title: Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22–27, 2015, Revised Contributions / edited by Daniel Archambault, Helen Purchase, Tobias Hoßfeld
Publication: Cham : Springer International Publishing : Imprint: Springer, 2017
Edition: 1st ed. 2017.
Physical description: 1 online resource (VII, 191 p., 15 illus.)
Discipline: 005.437
Topical subjects: User interfaces (Computer systems)
Computer communication systems
Application software
Economic theory
User Interfaces and Human Computer Interaction
Computer Communication Networks
Information Systems Applications (incl. Internet)
Economic Theory/Quantitative Economics/Mathematical Methods
Secondary responsibility: Archambault, Daniel
Purchase, Helen
Hoßfeld, Tobias
Bibliography note: Includes bibliographical references and index.
Contents note: Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.
Summary/abstract: As the outcome of Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human-centered experiments. The focus of this Dagstuhl seminar, held at Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments that test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies, comparisons between crowdsourcing and lab experiments, the use of crowdsourcing for visualisation, psychology, QoE, and HCI empirical studies, and finally the nature of crowdworkers and their work: their motivation and demographic background, as well as the relationships among the people who form the crowdsourcing community.
Authorized title: Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments
ISBN: 3-319-66435-2
Format: Printed material
Bibliographic level: Monograph
Language of publication: English
Record no.: 9910483571703321
Held at: Univ. Federico II
Series: Information Systems and Applications, incl. Internet/Web, and HCI ; 10264