Annual Meeting of the American Educational Research Association


This paper describes the process for developing and refining diagnostic score reports using a reporting framework (Roberts & Gierl, 2010) and incorporating feedback from both the test developer and teachers. We designed and evaluated diagnostic score reports for teachers in the context of a diagnostic mathematics assessment. The Diagnostic Mathematics Project is funded by the Learner Assessment Branch at Alberta Education in Edmonton, Alberta, Canada. Diagnostic Mathematics is a computer-based online assessment for students in Kindergarten to Grade 6 in both English and French. The goal of the project is to create tests that provide teachers with diagnostic information so students' cognitive mathematical knowledge and skills can be identified and evaluated. Score reports were created for assessments in the content area of Number for Grade 3 in Spring 2009 and 2010. The score reports developed and the results of a small-scale evaluation with Grade 3 teachers are presented.

Developing and Evaluating Score Reports for a Diagnostic Mathematics Assessment

The recent emphasis on understanding the psychology underlying test performance has led to developments in cognitive diagnostic assessment (CDA; Leighton & Gierl, 2007), which integrates cognitive psychology and educational measurement for the purposes of enhancing learning and instruction. The results of a CDA yield a profile of scores with specific information about a student's cognitive strengths and weaknesses. Score reporting serves a critical function as the interface between the test developer and a diverse audience of test users. Effective reporting of diagnostic assessment results is important because teachers can look to these results to help guide their instructional practice, parents often seek information on ways to help their children in identified areas of academic difficulty, and students seek feedback to validate their study and testing efforts. In short, cognitive diagnostic feedback may be used by instructors, parents, and students to guide and monitor their teaching and learning processes.

Research studies on score reporting have noted that communication between test developers and users of educational tests is weak and requires improvement. This weakness is evident when teachers receive student test results too late to influence instruction, typically many months after the test administration (Huff & Goodman, 2007). Further, information is often reported in ways that are difficult to read and understand (Ryan, 2003) and without adequate supporting material to promote clear test score interpretations (Trout & Hyde, 2006). Large variability also exists in how test scores are reported to the public on educational tests (Goodman & Hambleton, 2004; Knupp & Ainsley, 2008). As developments in CDA continue to progress, the need to address the score reporting issues of comprehensibility, interpretability, and timeliness becomes even more pressing. Diagnostic testing information, including skill descriptions and learning concepts, is fundamentally different in purpose from the information typically reported for traditional large-scale assessments, such as total number-correct scores or percentile ranks. Test developers must therefore report and present new kinds of information from these diagnostic tests.
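To make this contrast concrete, the following sketch (in Python; not drawn from the paper) juxtaposes the single summary numbers of a traditional report with the attribute-level mastery profile a CDA yields. The attribute labels, mastery probabilities, and the 0.7 cut point are hypothetical illustrations.

    # A minimal sketch contrasting traditional and diagnostic score information.
    # All attribute names, probabilities, and cut points are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class TraditionalReport:
        total_correct: int       # a single number-correct score
        percentile_rank: float   # normative standing, no skill detail

    @dataclass
    class DiagnosticReport:
        # attribute description -> estimated mastery probability (0.0 to 1.0)
        attribute_mastery: dict[str, float]

        def strengths(self, cutoff: float = 0.7) -> list[str]:
            return [a for a, p in self.attribute_mastery.items() if p >= cutoff]

        def weaknesses(self, cutoff: float = 0.7) -> list[str]:
            return [a for a, p in self.attribute_mastery.items() if p < cutoff]

    student = DiagnosticReport({
        "Identify a referent for a given quantity": 0.91,
        "Estimate a quantity using a referent": 0.42,
    })
    print(student.strengths())   # ['Identify a referent for a given quantity']
    print(student.weaknesses())  # ['Estimate a quantity using a referent']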
In short, the challenge of diagnostic score reporting lies in integrating the substantive and technical information needs of the educational community with the psychologically sophisticated information unique to CDA. Currently, there are few examples of cognitive diagnostic score reports. One operational example is the College Board's Score Report Plus for the PSAT/NMSQT, where cognitive diagnostic feedback is given as a description of the top three skills requiring improvement in each content area of Mathematics, Critical Reading, and Writing, along with recommended remedial activities. As developments continue to progress, current score reporting approaches need to be recast in light of the new kinds of information yielded by CDA and the context in which this information is to be used by its target audiences.

Purpose of the Study

There are few operational examples that illustrate the development and evaluation of student score reporting in the context of cognitive diagnostic assessment. Therefore, the purpose of this study is to describe the steps taken to create score reports for an operational cognitive diagnostic assessment in mathematics developed using the Attribute Hierarchy Method (AHM; Leighton, Gierl, & Hunka, 2004). The AHM is a cognitively based psychometric procedure used to classify examinees' test item responses into a set of attribute patterns associated with a cognitive model of task performance. Cognitive attributes in the AHM are described as the procedural or declarative knowledge needed to perform a task in a specific domain. The AHM is a two-stage procedure that employs principled test design: the first stage involves cognitive model specification and item development, and the second stage involves a psychometric analysis, in a confirmatory mode, of student responses to yield model-based diagnostic information about student mastery of cognitive skills (Gierl, 2007). For an operational application of the AHM, the reader is referred to Gierl, Alves, and Taylor-Majeau (2010).

This paper describes the early development and creation of the student score reports and the results of a small-scale evaluation with teachers using a questionnaire and focus group format. The aims of this study were to: 1) identify information that should appear on the score report that would help a teacher make a diagnostic decision about a student in his or her own class, and 2) identify the potential uses of, and issues with, the format and content of the student score report that may need to be addressed in the next revised version.

Context for the Study: The Cognitive Diagnostic Mathematics Assessment

The context for this study is the Cognitive Diagnostic Mathematics Assessment (CDMA) project funded by the Learner Assessment Branch at Alberta Education in Edmonton, Alberta, Canada. The CDMA is a curriculum-based set of assessments that can be used throughout the school year to measure students' thinking and learning in mathematics. The goal of the project is to create tests that provide teachers with diagnostic information so students' cognitive mathematical knowledge and skills can be identified and evaluated. The online computer-based administration system includes the assessments and score reports.
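As a rough illustration of the confirmatory classification performed in the AHM's second stage, the following sketch compares an observed item-response pattern against the expected response patterns implied by a small, invented attribute hierarchy and selects the closest match. This is a toy example over assumed data using simple pattern matching; the operational AHM relies on a model-based psychometric analysis (Gierl, 2007).

    # A toy illustration, in the spirit of the AHM's second stage, of classifying
    # an observed response pattern into an attribute profile. The hierarchy,
    # expected patterns, and responses below are invented for illustration.

    # Each entry pairs an attribute profile (1 = mastered) with the item
    # responses (1 = correct) an examinee holding that profile should produce
    # on a test built from the cognitive model. A linear three-attribute
    # hierarchy permits only these four profiles.
    EXPECTED = [
        ((0, 0, 0), (0, 0, 0, 0)),  # no attributes mastered
        ((1, 0, 0), (1, 0, 0, 0)),  # prerequisite attribute only
        ((1, 1, 0), (1, 1, 1, 0)),  # first two attributes in the hierarchy
        ((1, 1, 1), (1, 1, 1, 1)),  # all attributes mastered
    ]

    def classify(observed):
        """Return the attribute profile whose expected item-response pattern
        is closest to the observed responses (Hamming distance)."""
        def distance(expected):
            return sum(o != e for o, e in zip(observed, expected))
        profile, _ = min(EXPECTED, key=lambda pair: distance(pair[1]))
        return profile

    print(classify((1, 1, 1, 0)))  # exact match         -> (1, 1, 0)
    print(classify((1, 1, 0, 1)))  # one slip on item 3  -> (1, 1, 1)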
Principled test design procedures, in the context of the AHM as a form of cognitive diagnostic assessment, were used to create online diagnostic tests in four content areas: (a) Number, (b) Patterns and Relations, (c) Shape and Space, and (d) Statistics and Probability, at two grade levels, 3 and 6. More specifically, the assessments developed for Grade 3 in Number are used as the basis for developing and evaluating the score reports. Development of the CDMA in Grade 3 began in 2008. An operational CDMA in the strand of Number for Grade 3 is scheduled for implementation across the province in April 2011.

Application of the Diagnostic Reporting Framework for Creating Score Reports

The AHM yields diagnostic scores that must be communicated through score reports in a manner accessible to a diverse audience of students, parents, and instructors. An adapted reporting framework based on research by Jaeger (1998) and Ryan (2003) was created for reporting cognitive diagnostic scores. An example of the diagnostic reporting framework (Roberts & Gierl, 2010) applied to elements of an AHM analysis is provided in Table 1. Inspection of the framework shows that elements and outcomes of a diagnostic analysis can be systematically identified and presented in different ways and combinations. Test developers may choose to report some or all of the content outlined in the framework in various formats and modes; however, the final form will likely be influenced by the information needs of a particular audience and by policy considerations. Additionally, information design principles, including contrast, repetition, proximity, and alignment, should be applied when organizing and presenting numerical, graphical, or text-based information on a document. The reporting framework thus combines content and form considerations with design principles for presenting information, providing a principled approach to developing diagnostic score reports.

The purpose of the reports was to provide a summary of student performance across attributes in one skill category. This type of reporting allows the reader to compare mastery across attributes, providing a diagnostic profile of cognitive strengths and weaknesses. It was anticipated that a teacher could use the document as a starting point for discussions with the student or the parent on areas requiring further instruction or study. The following description of the development and evaluation of the reports is based on the strand of Number: Developing number sense, under the skill category "Estimate quantities less than 1000 using referents". The cognitive model for this reporting scheme is a five-attribute hierarchy, depicted in Figure 1. The reports presented in Figures 2 and 3 incorporate the AHM reporting elements of the cognitive model, attribute scores, and attribute descriptions.

Table 1. Alignment of AHM elements and outcomes to a general reporting framework.

    Reporting Characteristic        AHM Analysis Element or Outcome           Form of Reporting Results
    Scale                           Attribute probabilities, total correct
    Reference for interpretation    Cognitive model, criterion-referenced
    Assessment unit                 Attr
    Reporting unit
    Error of measurement
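To suggest how the reporting elements named above (the cognitive model, attribute scores, and attribute descriptions) could be assembled into a teacher-facing document, the following sketch renders a simple text profile for one skill category. The attribute codes, descriptions, cut points, and probability values are hypothetical placeholders, not the content of the operational CDMA reports or of the hierarchy in Figure 1.

    # A minimal sketch of assembling a text score report from three reporting
    # elements: hierarchy order (cognitive model), attribute descriptions, and
    # attribute probabilities. All labels and values are hypothetical.

    # Attributes listed in hierarchical order (prerequisites first).
    ATTRIBUTES = [
        ("A1", "Identify a referent for a given quantity"),
        ("A2", "Compare a quantity to a referent"),
        ("A3", "Estimate a quantity less than 100 using a referent"),
        ("A4", "Estimate a quantity less than 1000 using a referent"),
        ("A5", "Select an appropriate referent for an estimation task"),
    ]

    def mastery_label(p):
        """Map an attribute probability to a reporting category."""
        if p >= 0.7:
            return "Mastered"
        if p >= 0.4:
            return "Partial"
        return "Not yet mastered"

    def render_report(student, probabilities):
        lines = [
            f"Diagnostic profile for {student}",
            "Skill category: Estimate quantities less than 1000 using referents",
            "",
        ]
        for (code, desc), p in zip(ATTRIBUTES, probabilities):
            lines.append(f"{code}  {desc:<52} {p:4.2f}  {mastery_label(p)}")
        return "\n".join(lines)

    print(render_report("Student 01", [0.95, 0.88, 0.63, 0.35, 0.20]))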