
Automated Scoring of Multicomponent Tasks

Author(s): William Lorié (Questar Assessment, Inc., USA)
Copyright: 2016
Pages: 32
Source title: Handbook of Research on Technology Tools for Real-World Skill Development
Source Author(s)/Editor(s): Yigal Rosen (Harvard University, USA), Steve Ferrara (Pearson, USA), and Maryam Mosharraf (Pearson, USA)
DOI: 10.4018/978-1-4666-9441-5.ch024


Abstract

Assessment of real-world skills increasingly requires efficient scoring of non-routine test items. This chapter addresses the scoring and psychometric treatment of a broad class of automatically scorable complex assessment tasks that allow a definite set of responses orderable by quality. These multicomponent tasks are described, and proposals are advanced for scoring them so that they capture gradations of performance quality. The resulting response evaluation functions are assessed empirically against alternatives using data from a pilot of technology-enhanced items (TEIs) administered to a sample of high school students in one U.S. state. Results support scoring frameworks that leverage the full potential of multicomponent tasks for providing evidence of partial knowledge, understanding, or skill.
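As an illustration of the contrast the abstract describes, the sketch below compares two hypothetical response evaluation functions for a multicomponent task: an all-or-nothing rule and a partial-credit rule that yields ordered score levels. This is not the chapter's published method; the component scoring, function names, and the four-component example are assumptions made purely for illustration.

```python
# Illustrative sketch only; the chapter does not publish code.
# Assumes a multicomponent item whose k components are each scored 0/1.
from typing import Sequence


def dichotomous_score(component_scores: Sequence[int]) -> int:
    """All-or-nothing: full credit only if every component is correct."""
    return int(all(s == 1 for s in component_scores))


def partial_credit_score(component_scores: Sequence[int]) -> int:
    """Partial credit: item score is the count of correct components,
    giving an ordered set of score levels 0..k that can capture
    gradations of performance quality."""
    return sum(component_scores)


if __name__ == "__main__":
    responses = [1, 0, 1, 1]  # hypothetical response to a 4-component TEI
    print(dichotomous_score(responses))     # 0 -> partial knowledge is lost
    print(partial_credit_score(responses))  # 3 -> partial knowledge is retained
```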

Related Content

Agah Tugrul Korucu, Handan Atun. © 2017. 18 pages.
Larisa Olesova, Jieun Lim. © 2017. 21 pages.
JoAnne Dalton Scott. © 2017. 20 pages.
Geraldine E Stirtz. © 2017. 25 pages.
Enilda Romero-Hall, Cristiane Rocha Vicentini. © 2017. 21 pages.
Beth Allred Oyarzun, Sheri Anderson Conklin, Daisyane Barreto. © 2017. 21 pages.
Nikolina Tsvetkova, Albena Antonova, Plama Hristova. © 2017. 24 pages.