Whenever possible, do not create separate assessment instruments to measure program outcomes. Instead, use the processes and instruments already employed in the classroom. It is important, however, that some measures focus on identifying the skills and knowledge areas where students struggle most during the program, in addition to the program outcome measures administered to program graduates. Having this full scope of assessment information is crucial for focusing continuous improvement efforts on strengthening student outcomes, and it may also aid program retention. To collect data at the skill and knowledge set levels, a rubric with analytic scoring is strongly recommended in most circumstances.
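As a minimal illustration of why skill-level data matters, the Python sketch below averages analytic rubric scores per skill so the weakest areas stand out. The skill names, 4-point scale, and data structure are invented for the example; they are not a UOEEE requirement.

```python
# Hypothetical sketch: aggregating analytic rubric scores by skill to see
# where students struggle most. Skill names and the 4-point scale
# (1 = novice .. 4 = mastery) are illustrative only.
from statistics import mean

# Each record holds one student's analytic rubric scores.
scores = [
    {"written_communication": 3, "critical_thinking": 2, "quantitative_reasoning": 4},
    {"written_communication": 4, "critical_thinking": 2, "quantitative_reasoning": 3},
    {"written_communication": 3, "critical_thinking": 1, "quantitative_reasoning": 4},
]

# Average each skill separately -- detail a single holistic score cannot show.
by_skill = {skill: mean(record[skill] for record in scores) for skill in scores[0]}

for skill, avg in sorted(by_skill.items(), key=lambda item: item[1]):
    print(f"{skill}: {avg:.2f}")
# The lowest-scoring skill (here, critical_thinking) is where to focus
# improvement efforts.
```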
Academic program assessment should not be thought of as a periodic activity with a finite beginning and end. It is a continuous and ongoing process; each cycle provides information about the degree of success from the previous cycle and informs improvement decisions and activities in the subsequent cycle.
Courses often enroll students from a variety of programs, particularly lower-division courses. When this occurs, students must be separated by program so that assessment findings are not inappropriately muddled: report findings for your program's own population only. This may require help from your college delegate and their resources; if your college delegate is not able to help separate program students within a course, contact the UOEEE Assessment Team.
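When roster data is available in a structured form, the separation itself is straightforward. Below is a minimal Python sketch; the roster fields ("student_id", "program", "score") and the program names are illustrative assumptions, not an ASU data schema.

```python
# Hypothetical sketch: separating students by program within a shared course
# before computing assessment findings. Field names and program codes are
# illustrative only.
roster = [
    {"student_id": "S1", "program": "BS Biology",    "score": 3},
    {"student_id": "S2", "program": "BA Psychology", "score": 4},
    {"student_id": "S3", "program": "BS Biology",    "score": 2},
]

# Report findings for the program population only, not the whole course.
program_scores = [r["score"] for r in roster if r["program"] == "BS Biology"]
print(f"n = {len(program_scores)}, "
      f"mean = {sum(program_scores) / len(program_scores):.2f}")
```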
Numerous programs at ASU have assessment populations below 20 students, the preferred minimum for assessment findings to be considered reliable. These programs have two potential ways to increase their assessment populations. First, add measures that assess students during the program when current outcomes measure only program graduates, as is the case for many ASU programs. For example, a program may have 30 students enrolled but only 10 graduating: the graduating population is too small for an annual assessment, while the enrolled population easily meets the 20-observation minimum. This "formative" data can also be more informative for identifying areas for instructional improvement, as programs learn where to focus improvement efforts.
Second, when assessment populations are still under 20 observations, data from past years can be combined to draw findings. Using rolling three-year populations for assessment, many programs will be able to reach the observation minimum. Programs that continue to have fewer than 20 observations in their assessment data set should use caution in drawing conclusions; in that case, please contact the UOEEE Assessment Team for further guidance on processing program findings.
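A minimal sketch of the rolling three-year approach follows, assuming hypothetical yearly score lists; only the 20-observation minimum comes from the guidance above.

```python
# Hypothetical sketch: pool the most recent three years of observations and
# check them against the 20-observation minimum. Years and scores are
# illustrative only.
MIN_OBSERVATIONS = 20

observations_by_year = {
    2021: [3, 4, 2, 3, 4, 3],
    2022: [2, 3, 3, 4, 4, 3, 2],
    2023: [4, 3, 3, 2, 4, 3, 3, 4],
}

recent_years = sorted(observations_by_year)[-3:]
pooled = [obs for year in recent_years for obs in observations_by_year[year]]

if len(pooled) >= MIN_OBSERVATIONS:
    print(f"Report pooled findings (n = {len(pooled)}).")
else:
    # Still under the minimum: draw conclusions with caution and contact
    # the UOEEE Assessment Team for guidance.
    print(f"n = {len(pooled)} < {MIN_OBSERVATIONS}: interpret with caution.")
```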
Rubrics are very effective tools for measuring skill and knowledge attainment in many instances, provided they are developed carefully and analytic scoring is utilized. When designing a rubric, a few considerations must be made. First, is the work being assessed holistically (cumulative) or analytically (item focused)? A holistic rubric results in a single score, so the criteria being assessed consist of related properties that are judged together. An analytic rubric consists of criteria that are assessed and scored separately, producing scores for specific skills and knowledge that can be aggregated to the course, program, department, college, and university levels. The other element to consider is whether the rubric consists of checklists, ratings, or descriptions. A checklist rubric consists of checkboxes indicating whether each criterion was met.
A rating scale rubric determines the level to which a criterion is present and is preferred for most program assessments because it creates data that assist continuous improvement efforts. A descriptive rubric keeps the ratings but replaces the checkboxes with spaces where brief descriptions can be written to explain each rating. For programs that want to include outcomes that may seem ambiguous or difficult to measure, consider using AAC&U's Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics. These rubrics were developed as part of a large FIPSE-funded project... They can be downloaded free of charge at https://www.aacu.org/value-rubrics. Although the VALUE rubrics were developed for undergraduate education, they can also be used to measure graduate work. Numerous sample rubrics can also be found through the Association for the Assessment of Learning in Higher Education (AALHE Sample Rubrics).
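To make the holistic/analytic distinction concrete, the sketch below scores the same artifact both ways. The criteria, the 4-point scale, and the use of a rounded mean as the "holistic" score are illustrative simplifications; a real holistic rubric assigns its single score directly rather than averaging.

```python
# Hypothetical sketch contrasting holistic and analytic scoring of one
# student artifact. Criteria and the 4-point scale are illustrative only.
criterion_scores = {"thesis": 4, "evidence": 2, "organization": 3, "mechanics": 4}

# Holistic: a single overall score (here simplified as a rounded mean; an
# actual holistic rubric assigns the one score directly).
holistic = round(sum(criterion_scores.values()) / len(criterion_scores))
print(f"Holistic score: {holistic}")

# Analytic: each criterion keeps its own score, so weak areas (evidence, in
# this example) stay visible and can be rolled up to the course, program,
# or college level.
for criterion, score in criterion_scores.items():
    print(f"  {criterion}: {score}")
```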
ASU has become a national leader in the use of digital portfolios, which are an effective assessment tool that allows students to display a wide variety of learning and skills. Portfolios can show the value added by a student's education because they can demonstrate development across the program. Additionally, portfolios require students to reflect on their work when selecting it for inclusion, allowing each student to choose how to document achievement of the learning outcomes. This further involves students in the assessment process and allows a very holistic review of learning for students and faculty.
To meet new ABOR general education skill and habit expectations, all new programs, programs undergoing academic program review (APR), and programs choosing to upgrade their assessment plans to the new standards must now provide the following information on how general education-specific activities are to be assessed.
Table One: Areas of Knowledge
Table Two: General Education Skills and Intellectual Habits
In addition to the information that must be considered in Tables One and Two, the following instructions directly from ABOR Policy 2-210 must be kept in mind while developing undergraduate programs.
From ABOR Policy 2-210: https://public.azregents.edu/Policy%20Manual/2-210%20General%20Education.pdf
| Areas of Knowledge: Where in the program are these areas addressed? | General Studies Courses | Core Curriculum | Other: IA, Internships, Graduate School, Publications |
| --- | --- | --- | --- |
| Literature, Fine Arts & Humanities | | | |
| Mathematics/Quantitative Reasoning | | | |
| Social/Behavioral Sciences | | | |
| Natural Sciences | | | |
| American Institutions, Economics & History | | | |
| Composition, Communication & Rhetoric | | | |
| Ethics and Ethical Reasoning | | | |
| Civil Discourse/Civic Knowledge | | | |
| Global Awareness, Diversity & Inclusion | | | |

Options for each cell: Measure, Narrative, Proxy
| Skills and Intellectual Habits: Where in the program are these areas addressed? | General Studies Courses | Core Curriculum | Other: IA, Internships, Graduate School, Publications |
| --- | --- | --- | --- |
| Written Communication | | | |
| Verbal Communication | | | |
| Intercultural Competency | | | |
| Reasoning & Evidence | | | |
| Critical Thinking | | | |
| Ideas to Real-World Application | | | |
| Civic Engagement | | | |
| Civil Discourse | | | |
| Lifelong Learning | | | |

Options for each cell: Measure, Narrative, Proxy
Three-year reports at the program level are generated annually and posted to the UOEEE Assessment Analytics website. If you do not have permission to access these reports, please contact your assessment coordinator to request access to your department's report. Instructors and administrators should also have access to course evaluation data through the Course Evaluation interface. Additional analysis of both course evaluation and university survey data can be requested from the UOEEE.
The most commonly used assessment tools are exams, portfolios, rubrics, and university data (e.g., surveys, course evaluations).
Advantages of Using Digital Portfolios:
Arizona State University has a digital portfolio system with features, including artifact collection and rubric scoring, that can be adapted to the course and program level. Programs are encouraged to use the digital portfolio system to help students build their academic repertoires and to aid program assessment and continuous improvement. Incorporating rubrics into digital portfolios makes course expectations transparent, allowing students to better understand how levels of performance are determined for a course or program. Furthermore, rubrics used within ASU's digital portfolio system allow faculty, programs, departments, and colleges to create a history of assessment and continuous improvement efforts.
To assist units in the assessment planning process, we created a handbook: Program Assessment Handbook. Please refer to this handbook as you create your assessment plans and reports. To access this handbook, please authenticate using your ASURITE.
The following link will open the UOEEE Assessment Portal where all assessment plan development and reporting activities take place.