
Data Collection

Overview

Whenever possible, do not create separate assessment instruments to measure program outcomes.  Rather, use the processes and instruments already being used in the classroom for assessment.  It is important, however, that some measures focus on identifying the skill and knowledge areas where students struggle most during the program, in addition to program outcome measures administered to program graduates.  Having this full scope of assessment information is crucial to focusing continuous improvement efforts effectively on strengthening student outcomes and potentially aiding program retention.  In addition, to collect data at the skill and knowledge-set level, a rubric with analytic scoring is strongly recommended in most circumstances.

Assessment Plan Requirements

Academic program assessment should not be thought of as a periodic activity with a finite beginning and end. It is a continuous and ongoing process; each cycle provides information about the degree of success from the previous cycle and informs improvement decisions and activities in the subsequent cycle. 

  • Required Minimum Outcomes: Two outcomes with two measures each.
  • Maximum Outcomes: None, though no more than six are recommended.
  • Provide at least one direct measure (see the Assessment Plan Handbook for more direction).
  • Include and describe both program-specific and general education skills.
  • Include any specialized accreditation or certification requirements.

Program Exclusive Populations

Courses often enroll students from a variety of programs, particularly lower division courses.  When this occurs, students must be separated by program to ensure assessment findings are not inappropriately muddled.  While it may require help from your college delegate and their resources, it is imperative to report findings for the program's own population only.  If your college delegate is not able to help separate program students within a course, contact the UOEEE Assessment Team.

Small Population Solutions

Numerous programs at ASU have assessment populations below 20 students, the preferred minimum for assessment findings to be considered reliable.  For these programs, there are two potential changes that can increase assessment populations.  First, add measures that assess students during the program when current outcomes measure only program graduates, as is the case for many ASU programs.  For example, a program may have 30 students enrolled but only 10 graduating, making the graduating population too small for an annual assessment even though enough students are enrolled to meet the 20-observation minimum.  This “formative” data can also be more informative for identifying areas for instructional improvement, as programs learn where to focus improvement efforts.

When assessment populations remain under 20 observations, it is possible to combine data from past years to draw findings.  Using rolling three-year populations for assessment, many programs will be able to reach the observation minimum.  Programs that continue to have fewer than 20 observations in their assessment data set should use caution in drawing conclusions.  When 20 observations are still not available, please contact the UOEEE Assessment Team for further guidance on processing program findings.
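The rolling-population idea above can be sketched in a few lines of code; the year labels and yearly counts here are purely hypothetical and are not drawn from any ASU data system:

```python
# Illustrative sketch: pooling annual assessment observations into a
# rolling three-year window to reach the 20-observation minimum.
# Yearly counts below are invented for illustration.
MIN_OBSERVATIONS = 20

yearly_counts = {2021: 9, 2022: 7, 2023: 8}  # observations per assessment year

def pooled_count(counts, years):
    """Total observations across the given assessment years."""
    return sum(counts.get(y, 0) for y in years)

window = [2021, 2022, 2023]          # rolling three-year population
total = pooled_count(yearly_counts, window)
print(total, total >= MIN_OBSERVATIONS)  # 24 True
```

No single year meets the minimum here, but the pooled three-year population does, which is exactly the situation the rolling approach addresses.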

Rubrics and Digital Portfolios

Rubrics are very effective tools for measuring skill and knowledge attainment in many instances, provided they are developed carefully and analytic scoring is utilized. When designing a rubric, there are a few considerations to be made. First, will the work be assessed holistically or analytically? The difference is that a holistic rubric results in a single score, so the criteria being assessed consist of related properties judged as a whole. An analytic rubric consists of criteria that are assessed and scored separately, resulting in scores for specific skills and knowledge that can be aggregated to the course, program, department, college, and university levels. The other element to consider is whether the rubric consists of checklists, ratings, or descriptions. A checklist rubric consists of checkboxes that indicate whether a criterion was met.

A rating scale rubric determines the level to which a criterion is present and is preferred for most program assessments because it creates data that assist continuous improvement efforts. A descriptive rubric keeps the ratings but replaces the checkboxes with spaces where brief descriptions can be written in to explain each rating. For programs that want to include outcomes that may seem ambiguous or difficult to measure, consider using AAC&U’s Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics. The rubrics were developed as part of a large FIPSE-funded project; more about the project can be found at http://www.aacu.org/value/. The rubrics can be downloaded, free of charge, at https://www.aacu.org/value-rubrics. Although the rubrics were developed for undergraduate education, they can also be used to measure graduate work. Numerous examples of rubrics can also be found through the Association for the Assessment of Learning in Higher Education (AALHE Sample Rubrics).
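To make the holistic/analytic distinction concrete, here is a minimal sketch; the criteria names, 1–4 scale, and scores are invented for illustration and do not come from any ASU rubric:

```python
# Illustrative sketch contrasting analytic and holistic rubric scoring.
# All criteria and scores below are hypothetical.

# Analytic rubric: each criterion is scored separately (here on a 1-4
# scale), so findings can be reported per skill and aggregated upward.
analytic_scores = {
    "thesis": 3,
    "evidence": 2,
    "organization": 4,
    "mechanics": 3,
}

# Per-criterion results support continuous improvement: the lowest-scoring
# criterion is a natural focus for instructional changes.
weakest = min(analytic_scores, key=analytic_scores.get)

# Holistic rubric: one overall judgment of the same work -- a single score,
# with no breakdown by skill.
holistic_score = 3

print(weakest)                        # evidence
print(sum(analytic_scores.values()))  # 12
```

The analytic version yields both an overall total and a per-skill profile, which is why analytic scoring is recommended when data must inform improvement efforts.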

ASU has become a national leader in the use of digital portfolios, which are an effective assessment tool because they allow students to display a wide variety of learning and skills. Portfolios can show the value added by a student’s education, as they can demonstrate development across the program. Additionally, portfolios require students to reflect on their work before including it, allowing each student to choose how to document their achievement of learning outcomes. This process further involves the student in the assessment process and allows for a very holistic review of learning for students and faculty.

Identifying General Education Skills

Over the course of a college program, numerous general education skills are imparted to students both explicitly and implicitly.  For the purpose of assessment, ASU identifies the general education skills provided in each program based on descriptions in assessment plans.  The purpose of assessment is not to make general education skills the focus of any program.  Rather, program-specific skills and knowledge sets should be the focus of assessment, with general education skills supporting total academic efforts.  Currently, ASU groups general education skills into 12 categories adapted from the Association of American Colleges and Universities’ VALUE rubric categories and definitions.  Programs are not expected to provide all 12 general education skills, as not all skills are relevant to all academic disciplines.  The general education skills relevant to each program, however, need to be discussed in assessment plans and report descriptions so they can be aggregated with other data and used for reporting and program improvement efforts at ASU.

General Education Skills

    1. Critical Thinking
    2. Creative Thinking
    3. Language & Literature (Reading)
    4. Information Literacy
    5. Collaboration/Teamwork
    6. Written Communication
    7. Verbal Communication
    8. Quantitative Literacy
    9. Inquiry & Analysis
    10. Problem Solving
    11. Ethical Reasoning
    12. Global, Cultural, & Historical Awareness

Requesting University Data

Three-year reports at the program level are generated annually and loaded onto the UOEEE Assessment Analytics website. If you do not have permission to access these reports, please contact your assessment coordinator to request permission to receive your department’s report. Instructors and administrators should also have access to course evaluation data through the Course Evaluation interface. Additional analysis of both course evaluation and university survey data can be requested from the UOEEE Assessment Team.

Assessment Tools and Resources

The most commonly used assessment tools are exams, portfolios, rubrics, and university data (e.g., surveys, course evaluations).

  • Rubrics: For any subjective assessment (portfolios, papers, capstones, dissertations, etc.), rubrics are the most common method for determining student attainment of outcomes. When designing a rubric, there are a few considerations to be made. First, will the work be assessed holistically or analytically? The difference is that a holistic rubric results in a single score, so the criteria being assessed consist of related properties judged as a whole, while an analytic rubric consists of criteria that are assessed and scored separately, resulting in a composite score. The other element to consider is whether the rubric consists of checklists, ratings, or descriptions. A checklist rubric consists of checkboxes that indicate whether a criterion was met. A rating scale rubric determines the level to which a criterion is present in a work. A descriptive rubric keeps the ratings but replaces the checkboxes with spaces where brief descriptions can be written in to explain each rating. For programs that want to include outcomes that may seem ambiguous or difficult to measure, consider using AAC&U’s Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics. The rubrics were developed as part of a large FIPSE-funded project; more about the project can be found at http://www.aacu.org/value/. The rubrics can be downloaded, free of charge, at https://www.aacu.org/value-rubrics. Although the rubrics were developed for undergraduate education, they can also be used to measure graduate work. Numerous examples of rubrics can also be found through the Association for the Assessment of Learning in Higher Education (AALHE Sample Rubrics).
  • Exams: As either an objective or subjective assessment, exams can serve as outcome indicators at the completion of a course. When designing an exam, both for a course and for program assessment, it can be helpful to create a blueprint for the exam. This helps ensure all learning goals are represented and that a balance between conceptual understanding and thinking skills is struck. It also makes writing the exam questions easier, as it is clear what knowledge and which skills a student must demonstrate to meet each learning outcome. Additionally, the test blueprint makes it easier during the review process to map questions back to their appropriate outcomes, and it allows for an in-depth review of the skills demonstrated in each section of the test.
  • Portfolios: ASU has become a national leader in the use of digital portfolios, which are an effective assessment tool because they allow students to display a wide variety of learning and skills. Portfolios can show the value added by a student’s education, as they can demonstrate development across the program. Additionally, portfolios require students to reflect on their work before including it, allowing each student to choose how to document their achievement of learning outcomes. This process further involves the student in the assessment process and allows for a very holistic review of learning for students and faculty.
  • Advantages of Using Digital Portfolios:

    Arizona State University has a digital portfolio system with features, including artifact collection and rubric scoring, that can be adapted to the course and program levels. Programs are encouraged to utilize the digital portfolio system to help students build their academic repertoires as well as to aid program assessment and continuous improvement. Incorporating rubrics into digital portfolios makes course expectations transparent, allowing students to better understand how levels of performance are determined for a course or program. Furthermore, rubrics utilized within ASU’s digital portfolio system allow faculty, programs, departments, and colleges to create a history of assessment and continuous improvement efforts.
  • University Data: Though indirect, it is important to consider the attitudes, dispositions, and values students assign to their education and learning outcomes. The best method for collecting this information is through the graduating and alumni surveys or the course evaluations. These data reflect students’ views of their education as a whole, in addition to students’ behaviors after achieving the program’s learning outcomes. They can provide new insight into growing fields and expanding learning opportunities to be explored for current students.
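The exam blueprint described under Exams above can be sketched as a simple coverage check; the outcome names, cognitive levels, and question counts here are hypothetical and only illustrate the idea:

```python
# Illustrative sketch of a test blueprint: map each learning outcome to
# planned question counts by cognitive level, then check coverage and
# balance before writing items. All names and counts are invented.
blueprint = {
    # outcome: {"recall": n, "application": n, "analysis": n}
    "statistical reasoning": {"recall": 4, "application": 6, "analysis": 2},
    "research design":       {"recall": 3, "application": 4, "analysis": 3},
    "ethical practice":      {"recall": 2, "application": 2, "analysis": 2},
}

def uncovered(outcomes, plan):
    """Outcomes with no planned questions at any cognitive level."""
    return [o for o in outcomes if sum(plan.get(o, {}).values()) == 0]

program_outcomes = ["statistical reasoning", "research design",
                    "ethical practice", "written communication"]

# A quick check reveals any outcome the exam would leave unmeasured.
print(uncovered(program_outcomes, blueprint))  # ['written communication']

total_items = sum(sum(levels.values()) for levels in blueprint.values())
print(total_items)  # 28
```

A blueprint like this makes gaps visible before item writing begins and, during review, makes it straightforward to map each question back to its outcome.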

Assessment Handbook

To assist units in the assessment planning process, we created a handbook: Effective Assessment Planning, Reporting, and Decision Making. Please refer to this handbook as you create your assessment plans and reports. To access this handbook, please authenticate using your ASURITE.

Assessment Portal

The following link will open the UOEEE Assessment Portal, where all assessment plan development and reporting activities take place.