Record No.

UNINA9910254277503321

Author

Mislevy, Robert J.

Title

Assessing Model-Based Reasoning using Evidence-Centered Design : A Suite of Research-Based Design Patterns / by Robert J. Mislevy, Geneva Haertel, Michelle Riconscente, Daisy Wise Rutstein, Cindy Ziker

Publication/distribution/printing

Cham : Springer International Publishing : Imprint: Springer, 2017

ISBN

3-319-52246-9

Edition

[1st ed. 2017.]

Physical description

1 online resource (XVII, 130 p. 23 illus., 9 illus. in color.)

Series

SpringerBriefs in Statistics, 2191-544X

Discipline

005.1

Subjects

Statistics 

Assessment

Educational technology

Teaching

Learning

Instruction

Statistics for Social Sciences, Humanities, Law

Assessment, Testing and Evaluation

Statistical Theory and Methods

Educational Technology

Teaching and Teacher Education

Learning & Instruction

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

Bibliography note

Includes bibliographical references and index.

Contents note

Preface -- Introduction -- Model-Based Reasoning -- Evidence-Centered Assessment Design -- Design Patterns for Model-Based Reasoning -- Model Formation -- Model Use -- Model Elaboration -- Model Articulation -- Model Evaluation -- Model Revision -- Model-based Inquiry -- Conclusion -- References -- Appendix: Summary Form of Design Patterns for Model-based Reasoning.

Summary/abstract

This Springer Brief provides theory, practical guidance, and support tools to help designers create complex, valid assessment tasks for hard-to-measure, yet crucial, science education standards.
Understanding, exploring, and interacting with the world through models characterizes science in all its branches and at all levels of education. Model-based reasoning is central to science education and thus science assessment. Current interest in developing and using models has increased with the release of the Next Generation Science Standards, which identified this as one of the eight practices of science and engineering. However, the interactive, complex, and often technology-based tasks that are needed to assess model-based reasoning in its fullest forms are difficult to develop. Building on research in assessment, science education, and learning science, this Brief describes a suite of design patterns that can help assessment designers, researchers, and teachers create tasks for assessing aspects of model-based reasoning: Model Formation, Model Use, Model Elaboration, Model Articulation, Model Evaluation, Model Revision, and Model-Based Inquiry. Each design pattern lays out considerations concerning targeted knowledge and ways of capturing and evaluating students’ work. These design patterns are available at http://design-drk.padi.sri.com/padi/do/NodeAction?state=listNodes&NODE_TYPE=PARADIGM_TYPE. The ideas are illustrated with examples from existing assessments and the research literature.