
Automatic item generation [electronic resource] : theory and practice / edited by Mark J. Gierl and Thomas M. Haladyna
Published/distributed: New York : Routledge, 2012
Physical description: 1 online resource (257 p.)
Discipline: 371.26
371.260285
Other authors (persons): Gierl, Mark J.
Haladyna, Thomas M.
Topical subject: Educational psychology
Educational tests and measurements
Genre/form subject: Electronic books.
ISBN: 1-283-59015-8
9786613902603
0-203-80391-4
1-136-63689-7
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Contents note: Cover; Title Page; Copyright Page; Contents; List of Figures and Tables; Acknowledgments; PART I Initial Considerations for Automatic Item Generation; 1. Automatic Item Generation: An Introduction; 2. Automatic Item Generation: A Historical Perspective; 3. Using Weak and Strong Theory to Create Item Models for Automatic Item Generation: Some Practical Guidelines with Examples; 4. Item Generation: Implications for a Validity Argument; PART II Connecting Theory and Practice in Automatic Item Generation; 5. An Introduction to Assessment Engineering for Automatic Item Generation; 6. Generating Items Under the Assessment Engineering Framework; 7. Using Evidence-Centered Design Task Models in Automatic Item Generation; PART III Psychological Foundations for Automatic Item Generation; 8. Learning Sciences, Cognitive Models, and Automatic Item Generation; 9. Using Cognitive Psychology to Generate Items and Predict Item Characteristics; 10. Difficulty Modeling and Automatic Generation of Quantitative Items: Recent Advances and Possible Next Steps; PART IV Technical Developments in Automatic Item Generation; 11. Statistical Modeling of Automatically Generated Items; 12. Automatic Item Generation for Computerized Adaptive Testing; 13. IGOR: A Web-Based Automatic Item Generation Tool; 14. Obstacles for Automatic Item Generation; Author Index; Subject Index
Record no.: UNINA-9910465248903321
Held at: Univ. Federico II
Opac: check availability here
Automatic item generation [electronic resource] : theory and practice / edited by Mark J. Gierl and Thomas M. Haladyna
Published/distributed: New York : Routledge, 2012
Physical description: 1 online resource (257 p.)
Discipline: 371.26
371.260285
Other authors (persons): Gierl, Mark J.
Haladyna, Thomas M.
Topical subject: Educational psychology
Educational tests and measurements
ISBN: 1-283-59015-8
9786613902603
0-203-80391-4
1-136-63689-7
Classification: EDU030000; EDU000000
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Contents note: Cover; Title Page; Copyright Page; Contents; List of Figures and Tables; Acknowledgments; PART I Initial Considerations for Automatic Item Generation; 1. Automatic Item Generation: An Introduction; 2. Automatic Item Generation: A Historical Perspective; 3. Using Weak and Strong Theory to Create Item Models for Automatic Item Generation: Some Practical Guidelines with Examples; 4. Item Generation: Implications for a Validity Argument; PART II Connecting Theory and Practice in Automatic Item Generation; 5. An Introduction to Assessment Engineering for Automatic Item Generation; 6. Generating Items Under the Assessment Engineering Framework; 7. Using Evidence-Centered Design Task Models in Automatic Item Generation; PART III Psychological Foundations for Automatic Item Generation; 8. Learning Sciences, Cognitive Models, and Automatic Item Generation; 9. Using Cognitive Psychology to Generate Items and Predict Item Characteristics; 10. Difficulty Modeling and Automatic Generation of Quantitative Items: Recent Advances and Possible Next Steps; PART IV Technical Developments in Automatic Item Generation; 11. Statistical Modeling of Automatically Generated Items; 12. Automatic Item Generation for Computerized Adaptive Testing; 13. IGOR: A Web-Based Automatic Item Generation Tool; 14. Obstacles for Automatic Item Generation; Author Index; Subject Index
Record no.: UNINA-9910791902203321
Held at: Univ. Federico II
Opac: check availability here
Automatic item generation [electronic resource] : theory and practice / edited by Mark J. Gierl and Thomas M. Haladyna
Published/distributed: New York : Routledge, 2012
Physical description: 1 online resource (257 p.)
Discipline: 371.26
371.260285
Other authors (persons): Gierl, Mark J.
Haladyna, Thomas M.
Topical subject: Educational psychology
Educational tests and measurements
ISBN: 1-283-59015-8
9786613902603
0-203-80391-4
1-136-63689-7
Classification: EDU030000; EDU000000
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Contents note: Cover; Title Page; Copyright Page; Contents; List of Figures and Tables; Acknowledgments; PART I Initial Considerations for Automatic Item Generation; 1. Automatic Item Generation: An Introduction; 2. Automatic Item Generation: A Historical Perspective; 3. Using Weak and Strong Theory to Create Item Models for Automatic Item Generation: Some Practical Guidelines with Examples; 4. Item Generation: Implications for a Validity Argument; PART II Connecting Theory and Practice in Automatic Item Generation; 5. An Introduction to Assessment Engineering for Automatic Item Generation; 6. Generating Items Under the Assessment Engineering Framework; 7. Using Evidence-Centered Design Task Models in Automatic Item Generation; PART III Psychological Foundations for Automatic Item Generation; 8. Learning Sciences, Cognitive Models, and Automatic Item Generation; 9. Using Cognitive Psychology to Generate Items and Predict Item Characteristics; 10. Difficulty Modeling and Automatic Generation of Quantitative Items: Recent Advances and Possible Next Steps; PART IV Technical Developments in Automatic Item Generation; 11. Statistical Modeling of Automatically Generated Items; 12. Automatic Item Generation for Computerized Adaptive Testing; 13. IGOR: A Web-Based Automatic Item Generation Tool; 14. Obstacles for Automatic Item Generation; Author Index; Subject Index
Record no.: UNINA-9910810750803321
Held at: Univ. Federico II
Opac: check availability here
Computerized Adaptive and Multistage Testing with R [electronic resource] : Using Packages catR and mstR / by David Magis, Duanli Yan, Alina A. von Davier
Author: Magis, David
Edition: [1st ed. 2017.]
Published/distributed: Cham : Springer International Publishing : Imprint: Springer, 2017
Physical description: 1 online resource (XX, 171 p. 20 illus.)
Discipline: 371.260285
Series: Use R!
Topical subject: Statistics
Assessment
Psychometrics
Educational psychology
Education—Psychology
R (Computer program language)
Statistical Theory and Methods
Assessment, Testing and Evaluation
Statistics for Social Sciences, Humanities, Law
Statistics and Computing/Statistics Programs
Educational Psychology
ISBN: 3-319-69218-6
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Contents note: Foreword -- Preface -- Ch 1 Overview of Adaptive Testing -- Ch 2 An Overview of Item Response Theory -- Part 1 Item-Level Computerized Adaptive Testing -- Ch 3 An Overview of Computerized Adaptive Testing -- Ch 4 Simulations of Computerized Adaptive Tests -- Ch 5 Examples of Simulations using catR -- Part 2 Computerized Multistage Testing -- Ch 6 An Overview of Computerized Multistage Testing -- Ch 7 Simulations of Computerized Multistage Tests -- Ch 8 Examples of Simulations using mstR -- Index.
Record no.: UNINA-9910254311403321
Held at: Univ. Federico II
Opac: check availability here
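The record above describes simulations run with the catR package. As a language-neutral illustration of the core idea (not catR's actual API), here is a minimal Python sketch of one adaptive-testing step under the two-parameter logistic (2PL) IRT model: among the items not yet administered, select the one with maximum Fisher information at the current ability estimate. The item bank and ability value below are invented for the example.

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * p * (1 - p)."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, item_bank, administered):
    """Maximum-information item selection, the usual CAT criterion."""
    candidates = [i for i in range(len(item_bank)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *item_bank[i]))

# Hypothetical item bank: (discrimination a, difficulty b) pairs.
bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5)]
theta_hat = 0.4                                  # current ability estimate
chosen = next_item(theta_hat, bank, administered={2})
```

In a real session this step alternates with re-estimating theta after each response; catR additionally supports other selection rules and exposure control, which this sketch omits.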
Computerized adaptive testing in Kinanthropology : Monte Carlo simulations using the physical self-description questionnaire / Martin Komarc
Author: Komarc, Martin
Edition: [First edition.]
Published/distributed: [Place of publication not identified] : Karolinum Press, 2019
Physical description: 1 online resource (132 pages) : illustrations
Discipline: 371.260285
Topical subject: Computer adaptive testing
Anthropology
Item response theory
Genre/form subject: Electronic books.
ISBN: 80-246-3984-X
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Record no.: UNINA-9910479983103321
Held at: Univ. Federico II
Opac: check availability here
Computerized adaptive testing in Kinanthropology : Monte Carlo simulations using the physical self-description questionnaire / Martin Komarc
Author: Komarc, Martin
Edition: [First edition.]
Published/distributed: [Place of publication not identified] : Karolinum Press, 2019
Physical description: 1 online resource (132 pages) : illustrations
Discipline: 371.260285
Topical subject: Computer adaptive testing
Anthropology
Item response theory
ISBN: 80-246-3984-X
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Record no.: UNINA-9910793440903321
Held at: Univ. Federico II
Opac: check availability here
Computerized adaptive testing in Kinanthropology : Monte Carlo simulations using the physical self-description questionnaire / Martin Komarc
Author: Komarc, Martin
Edition: [First edition.]
Published/distributed: [Place of publication not identified] : Karolinum Press, 2019
Physical description: 1 online resource (132 pages) : illustrations
Discipline: 371.260285
Topical subject: Computer adaptive testing
Anthropology
Item response theory
ISBN: 80-246-3984-X
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Record no.: UNINA-9910825392303321
Held at: Univ. Federico II
Opac: check availability here
Student Assessment in Digital and Hybrid Learning Environments
Author: Hummel, Sandra
Edition: [1st ed.]
Published/distributed: Wiesbaden : Springer Fachmedien Wiesbaden GmbH, 2024
Physical description: 1 online resource (320 pages)
Discipline: 371.260285
Other authors (persons): Donner, Mana-Teresa
Series: Doing Higher Education Series
ISBN: 3-658-42253-X
Format: Print material
Bibliographic level: Monograph
Publication language: eng
Contents note: Intro -- Contents -- Editors and Contributors -- About the Editors -- Contributors -- Introduction -- References -- Part I General Approaches to Assessments in Digital and Hybrid Environments -- Experiences and Critical Reflection on Online-Assessment with Excel Case Studies - Review on a Successful Online-Assessment Practice as Well as the Adaptation to a Remote Setting Due to the COVID-19 Pandemic -- 1 Introduction -- 2 Didactic Concept and Literature -- 2.1 Constructive Alignment -- 2.2 Focus on Practical Skills -- 2.3 Fairness in Correction Through Automation -- 3 On-Site Online-Assessment Using Managed Devices -- 3.1 Online-Assessment Environments at the University of Zurich -- 3.2 Case Studies in Excel and Question Types -- 3.3 Consequential Errors -- 3.4 Automated Grading -- 4 Personalised Remote Online-Assessment (with BYOD) -- 4.1 Prevention of Cheating -- 4.2 Detection of Cheating -- 5 Reflection on the Student's Perspective -- 5.1 Student Feedback on On-Site Online-Assessment Using Managed Devices -- 5.2 Student Feedback on Personalised Remote Online-Assessments (with BYOD) -- 6 Critical Reflection and Discussion -- 7 Conclusion -- References -- E-Examinations@Home: Adapting Large-Scale Digital Assessments for Remote Environments -- 1 Introduction -- 2 E-Examinations@Home: A Concept in Practice -- 2.1 Assessment Preparation -- 2.2 Conducting the Assessment -- 2.3 After the Assessment -- 3 Transforming Digital Assessments for Remote Settings - Key Areas -- 3.1 Analytical Categories -- 3.2 Assessment Literacy -- 3.3 Assessment Technologies -- 3.4 Logistics and Organisation -- 3.5 Legal Reliability -- 4 Making a Virtue of Necessity? Remaining Challenges and Unresolved Issues -- References -- Student Feedback in Hybrid/Online Teaching: Relevance, Approaches and Practices -- 1 Introduction.
2 Evaluation, Feedback and the Shift from Teaching to Learning in Higher Education -- 2.1 What is Feedback? -- 2.2 What is Evaluation? -- 2.3 How is Evaluation Regulated? -- 2.4 What Are the Advantages of Feedback Over Evaluation? -- 3 Feedback in Hybrid/Online Teaching Scenarios -- 3.1 WHAT to Preconceive Before Initiating Student Feedback Processes? -- 3.2 WHY Initiate Student Feedback Processes? -- 3.3 HOW to Initiate Student Feedback Processes? -- 3.4 WHEN to Initiate Student Feedback Processes? -- 4 Possible Feedback Questions -- 5 Conclusion and Implications -- References -- Digitalisation of Examination Formats in Higher Education Corona-Related Changes -- 1 Introduction -- 2 Research Design -- 3 Results -- 3.1 Examination Format -- 3.2 Exam Questions -- 3.3 Examination Strategies -- 3.4 Communication with Students -- 3.5 Achievements in Online Examinations -- 4 Summary -- References -- Developing Questions for Digital Assessments - Approaches and Reflections from a Didactical Point of View -- 1 Introduction -- 2 Planning Digital Exams: Determining Factors -- 3 Digital Exams - Methods and Approaches -- 3.1 Type 1: Advisory Assessments -- 3.2 Type 2: Diagnostic Assessments -- 3.3 Type 3: Formative Assessments -- 3.4 Type 4: Summative Assessments -- 3.5 Type 5: Quality Assurance Assessments -- 3.6 Different Scenarios for Digital Exams -- 4 Didactical Considerations for Formulating Digital Exam Questions -- 4.1 The Biggs Model of Constructive Alignment -- 4.2 Find and Set Learning Objectives as the Basis for Formulating Examination Questions -- 4.3 Derive Exam Questions from Defined Learning Objectives -- 4.4 Develop Exam Questions for the Digital Context -- 5 Challenges and Open Questions -- 5.1 Rethink Exam Settings in Universities and Higher Education Institutions -- 5.2 Consider Solutions for Data Security Issues.
5.3 More Empirical Research on Digital Learning Methods and Digital Exam Settings -- 5.4 Foster Diversity in (Digital) Exam Settings -- References -- Trust and Cheating: From Assessment Literacy to a Literacy of Practice for Digital Remote Assessments -- 1 Introduction -- 1.1 From Assessment Literacy to a Literacy of Practice for Digital Assessments -- 1.2 Academic Integrity: Current Findings of Misconduct in Digital Remote Assessments During the Coronavirus Pandemic -- 2 Comparison and Implementation -- 2.1 Digital On-Campus and Digital Remote Assessments Conditions, Measures, and Procedures -- 2.2 Evaluating Measures for Quality Assurance Within Digital Remote Assessments -- 3 Conclusion and Outlook -- References -- Online Assessment from a Broader Perspective with Practical Applications -- 1 Introduction -- 2 Theoretical Approaches -- 3 Suggested Designs for Specific Fields -- 3.1 Case 1: Online Assessment Methods and Techniques that Can Be Used in the Field of Education -- 3.2 Case 2: Online Assessment Methods and Techniques that Can Be Used in the Field of Medical Education -- 3.3 Case 3: Online Assessment Methods and Techniques that Can Be Used in the Field of Legal Education -- 4 Conclusion and Implications -- References -- Support Measures for Students Before and During Written Online Distance Exams: The Case of Vienna University of Economics and Business (WU) -- 1 Introduction -- 2 Written Online Distance Exams at WU -- 2.1 Online Exam Environments -- 2.2 Online Exams -- 2.3 Administration of Written Online Distance Exams -- 3 Support Measures for Online Exams with a Focus on Students -- 3.1 Autonomous Self-preparation -- 3.2 Calming Measures -- 3.3 Compensation for Disadvantages -- 4 Conclusion -- References -- E-Assessment of Mathematics and Statistics in Biomedical Sciences -- 1 Introduction.
2 The Scenario for E-Assessment in the Statistics Module -- 3 The Scenario for E-Assessment in the Mathematics Module -- 4 Design of E-Assessment Based on TPACK -- 5 Discussion and Conclusion -- References -- Is Anonymous Aggregated Peer-Evaluation as a Learning Activity Feasible to Identify Differences in Dentistry Students' Clinical Reasoning Performance? -- 1 Introduction -- 1.1 Feasibility of Peer-Evaluation in Medical Training -- 1.2 Overcoming Barriers to Implement Peer-Evaluation -- 2 Aggregated Peer-Evaluation: Online Tools and Procedures -- 3 Feasibility of Online Peer-Evaluation in Clinical Reasoning Tasks -- 3.1 Piloting the Learning Scenario -- 3.2 Does Aggregated Peer-Evaluation Replicate Performance Differences? -- 4 Conclusions and Implications -- References -- Part II Exam Formats -- Peer Assessment in MOOC of Students Performance with Paragogy Framework: Evidence: Higher Education Institutions in Indonesia -- 1 Introduction -- 2 Literature Review -- 2.1 Theory Peer of Learning -- 2.2 Applying the Integrated Peer Assessment to MOOC -- 3 Method -- 3.1 Data Resources and Analysis -- 4 Results -- 5 Conclusions -- References -- Formative Assessment of E-Service Learning Using Learning Diaries and Group Reflections -- 1 Introduction -- 2 Theoretical Approach -- 2.1 Specificities of e-Service Learning -- 2.2 Assessing Service Learning Courses -- 2.3 The e-Service Learning Course -- 3 Methodology -- 4 Main Findings -- 4.1 Weekly Learning Diaries -- 4.2 Online Focus Group Discussions -- 4.3 Online Presentations -- 5 Conclusions and Implications -- References -- Validity and Fairness of an Admission Examination Using a Study-Related Learning Test -- 1 Introduction -- 1.1 Self-regulated Learning: A Core Factor for Learning Success -- 1.2 Admission Procedures at Universities -- 2 Method -- 2.1 Participants -- 2.2 Procedure and Material.
3 Results -- 3.1 Comparison of the Four Waves -- 3.2 Fairness -- 3.3 Validity -- 4 Conclusion and Implications -- References.
Record no.: UNINA-9910767583603321
Held at: Univ. Federico II
Opac: check availability here