Measures are the tools or instruments used to score students’ work. In the measures section of an assessment plan, programs are expected to identify the student artifact/assignment that will be scored (e.g., paper, capstone project, dissertation, thesis, exam), the course or other setting from which the artifact will be collected (e.g., PSY 101 or an external internship), and the measurement tool used to assess the artifact (e.g., rubric, survey, exam). Measures work in tandem with performance criteria. More specifically, programs should include both the artifact and the tool used to make a judgment concerning demonstrable student and graduate abilities.
Direct and Indirect Measures
Both direct and indirect data are important for evaluating program quality. Direct measures collect data on student learning that relate directly to knowledge and academic performance, while indirect measures provide information on attitudes, experiences, and perceptions from stakeholders that can help support and explain the findings of direct assessment data.
Direct data: For direct data, UOEEE recommends that programs use rubrics paired with a student artifact whenever possible. Rubrics or score cards can be paired with a range of student artifacts, including class assignments, research papers, capstone projects, performances, laboratory activities, and clinical examinations. Rubrics are preferred over grades (i.e., class and exam grades) because they can identify trends across different areas of knowledge.
Rubrics: When utilizing rubrics, UOEEE recommends rubrics with four levels (1-4), with faculty calibrating their rubrics so that a majority of students, or an “average” student, would achieve a rating of 3 out of 4, with 4 reserved for the exceptional student. Regardless of the number of levels chosen, UOEEE recommends including a “0” or N/A rating to represent the absence of material or the absence of relevant work submitted. See the sample rubric (Figure 18 in the UOEEE handbook) and the rubric module in Canvas for more instruction on how to develop rubrics.
Indirect data: Indirect data can be collected in many ways, including focus groups or interviews in which faculty provide feedback and insight into a program’s curriculum, reflective essays asking where and how students learned specific information, and alumni surveys in which alumni reflect on how their educational experiences shaped their current careers.
Each assessment plan must include a minimum of two measures for each outcome, and at least one of those measures must be direct. In addition, programs seeking establishment or going through APR will now be required to include at least one indirect measure per plan.
Formative and Summative Measures
As with direct and indirect measures, including both formative and summative measures within a program’s assessment plan can provide a richer, fuller view of student learning. While direct and indirect measures differ in the type of information they provide, formative and summative measures differ in when student learning is assessed.
Formative measures are assessments that occur during the learning process to monitor student progress and identify instructional areas where continuous improvement efforts can be focused. At ASU, bachelor’s programs can begin assessing students in their 200- and 300-level courses if it is important to measure learning gained while progressing through the program. Not every student in the program needs to be assessed; in large programs, all students should be eligible for sampling, and reliability should be checked to ensure accurate assessment results.
Summative measures are assessments that occur at the point of mastery, often as students graduate from the degree program. They provide insight into a program’s bottom line: have students achieved the learning outcomes? Data collected after graduation also provides summative evidence. This can include licensure exam scores and certification numbers, as well as indirect data such as employment numbers, graduate school admissions, and surveys asking graduates how well prepared they felt entering the workforce.
For further guidance on developing measures, the different types of measures, and best practices, please visit the Measures section of the handbook or the Concept, Competencies, Measure, and Performance Criteria section of the Canvas site. Completed sample plans can also be found on the UOEEE's Canvas site under the Sample Plans/Report module.