As opposed to full grades in a course or completion of an assignment, course, or program, rubrics created at the skill or knowledge level, when coupled with analytic scoring, allow programs to identify the instructional areas where students most need support. The first matrix below is a simple example of using a matrix to measure quantitative literacy skill levels among students in the hypothetical BS JPS program. These results can then be fed into the second matrix below, “Analytic Scoring of Course Level Rubrics,” to create a course-level portrait of students’ strengths and weaknesses. This information is crucial to guiding the continuous improvement efforts that can have the greatest positive impact on student outcomes.
In courses critical to students’ success in a program, important knowledge and skills can be evaluated using rubrics with analytic scoring. Through rubric analytic scoring at this detailed level, program faculty can separately evaluate students’ knowledge and skills as they relate to specific program outcomes. Such a rubric permits faculty to give feedback (and grades) for each of the separate components. The same approach can be used at the course level and easily aggregated to the program level. As we will see later, this approach can also yield rich assessment information that can be used to identify specific program strengths and weaknesses, guide continuous improvement efforts, and measure this development over time. (Note: 3.8 was chosen in the examples above because it is > 75% of a 1-5 scale.)
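As an illustrative sketch of the aggregation described above, analytic scores can be averaged by criterion across a course and compared against a 3.75 benchmark (75% of a 1-5 scale, which the 3.8 example clears). The student names, criteria, and scores below are hypothetical:

```python
# Hypothetical analytic rubric data: scores (1-5) per criterion per student.
from statistics import mean

course_scores = {
    "student_a": {"interpretation": 4, "calculation": 3, "communication": 5},
    "student_b": {"interpretation": 5, "calculation": 4, "communication": 4},
    "student_c": {"interpretation": 3, "calculation": 2, "communication": 4},
}

def criterion_averages(scores):
    """Average each rubric criterion across all students in the course."""
    criteria = next(iter(scores.values())).keys()
    return {c: mean(s[c] for s in scores.values()) for c in criteria}

averages = criterion_averages(course_scores)

# Flag criteria falling below the 75% benchmark on a 1-5 scale.
THRESHOLD = 3.75
weaknesses = [c for c, avg in averages.items() if avg < THRESHOLD]
print(averages)
print("Needs attention:", weaknesses)
```

Course-level averages computed this way can themselves be averaged across a program's critical courses to produce the program-wide portrait the matrices illustrate.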
Some specialized accreditors provide specific learning outcomes that institutions must measure. Although the language and format of those mandated outcomes may not adhere to our guidelines, you should use the specific language provided by the specialized accreditation agency. The only time you may need to restate an external standard is to focus it on the student, if the standard is focused more on resources or program operations. Additionally, when designing your assessment plan, please make it clear when an outcome comes directly from an accreditor.
Advantages of Using Digital Portfolios
Arizona State University has a digital portfolio system with features that include artifact collection and rubric scoring, which can be adapted to the course and program level. Programs are encouraged to utilize the digital portfolio system to help students build their academic repertoires, as well as to aid in program assessment and continuous improvement. Incorporating rubrics into digital portfolios makes course expectations transparent, allowing students to better understand how levels of performance are determined for a course or program. Furthermore, rubrics utilized within ASU’s digital portfolio system allow faculty, programs, departments, and colleges to create a history of assessment and continuous improvement efforts. The following subsection from the ASU University Technology Office (UTO) provides more detailed information.
For information and assistance with digital portfolios at Arizona State University, contact the Digital Portfolio Initiative team in the University Technology Office (UTO) at email@example.com.
Digication Digital Portfolios - Reporting Capabilities
Digication provides a variety of reporting options for viewing data collected through portfolio submissions as well as assessment data. Additionally, Digication provides options to export reports and submission data and provides administrators with tools to oversee system usage statistics.
Digication renders the process of data collection seamless due to its nested model of assessment. Data for student competencies and program effectiveness can be collected from any number and type of sources: assignment responses, practicum performance, course-level outcomes, specialization-specific outcomes (e.g., special education, early childhood education), whole portfolios or particular portfolio pages, and other types of student work.
Once the Digication system is adopted by students, faculty, and administrators, their individual modes of use will be integrated within the system as forms of learning evidence, assessment of learning outcomes, and management and analysis of collected data.
Portfolio Submission and Assessment Data
Digication takes a unique approach to program- and institution-level assessment needs. Traditionally, most tools have had to choose between supporting unstructured (but expressive) portfolios or structured (but limiting) portfolios. Neither approach has worked for institutions: unstructured portfolios are hard to assess, and structured portfolios do not encourage students to document authentic learning experiences. Digication supports both through a unique mapping process. Students can freely create portfolios, collecting and presenting work in a variety of formats (text, files, images, video, audio, hyperlinks, etc.), and submit only the relevant pages to assignments or projects in the courses they take, or within assessment groups that are cohort- or program-based. Faculty, programs, or the institution can create different contexts in which to collect student work, whether by learning outcomes or by projects and assignments. Each context can be mapped to one or more learning outcomes. During the submission process, Digication creates a permanent archive of the submission, so students remain free to further edit their “live” portfolios while the institution retains non-editable, permanent, time-stamped copies of any portfolio submission. Digication also supports letting programs randomly sample submissions and assign committee members (including internal and external reviewers) to perform assessments retrospectively using rubrics. The results can then be shown in reports for accreditation and for curriculum development.
Every portfolio and artifact uploaded to Digication, and every submission of user-created portfolios and artifacts, is associated with the user’s ASURITE unique ID and, if relevant, the course ID.
Within Digication, students can submit individual files (such as documents) or text, as well as a whole portfolio or selected pages of a portfolio. Each submission is archived within the system so that it may be accessed for assessment and reporting needs in the future. Archived portfolios (referred to as snapshots) are clickable versions of the portfolio that are no longer editable by the student. The student can make changes to their portfolio without impacting the assessment process or important institutional archives of student work. All submitted documents and portfolio snapshots are permanently archived. Any files, sections, or pages within the portfolio that are later amended have no impact on the archived submissions or the assessment data collected.
Snapshots are time-stamped and include other important metadata (the student; the course or assessment group; mapped outcomes; assessment collected via written feedback and/or rubric scores; and the faculty and reviewers involved).
These are permanent archives for the institution that may be used for course, program and large scale institutional assessment. Archived work is also referenced in assessment reports.
Archived work may be used to evaluate student growth during a course, may serve as a collection of work reviewed as part of the larger college experience, and may also be utilized in longitudinal research efforts.
Collecting snapshots does not require manual steps by University staff.
As portfolios are developed and submitted to courses and relevant assessment groups, specific programs can accumulate large amounts of data about student performance and assess the data in light of determined institutional objectives and program learning outcomes. This can be easily done by: initiating queries under any category deemed relevant (such a courses, specific portfolio assignments, specific outcomes, faculty, submission dates, assessment dates, etc.) sampling individual portfolios or program-specific portfolios, comparing sets of data, or measuring learning outcomes by reference to institution-specific, program-specific, AAC&U and other metrics of interest.
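The kinds of ad hoc queries and random sampling described above can be sketched as follows; the record fields, values, and sample size are hypothetical, not an actual Digication API:

```python
# Illustrative sketch of querying and sampling submission data.
import random

submissions = [
    {"student": "a", "course": "JPS301", "outcome": "quantitative-literacy"},
    {"student": "b", "course": "JPS301", "outcome": "quantitative-literacy"},
    {"student": "c", "course": "JPS410", "outcome": "written-communication"},
]

def query(records, **criteria):
    """Filter records on any combination of fields (course, outcome, etc.)."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

# Query under any category deemed relevant, e.g. a specific outcome.
quant = query(submissions, outcome="quantitative-literacy")

# Randomly sample portfolios for retrospective committee review.
sample = random.sample(quant, k=min(2, len(quant)))
```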
Digication’s reporting capabilities provide the University with relevant data regarding student progress toward learning outcomes and standards. The platform enables programs to specify their goals in detail and track progress toward them at the course level, rather than waiting until long after a student has been enrolled to assess their work. Instructors and programs have access to real-time assessment data linked to actual evidence of student work, and can therefore use valid measures to assess program performance. This real-time collection and assessment of student work ultimately provides instructors and programs with the information they need to make critical curriculum or course design changes that benefit student learning. It also provides programs with the data they need to develop and adopt new policies much more quickly if students are not developing skills at the intended rate, ensuring future program improvements.
The University will be able to export critical assessment data for sharing with accreditation teams regarding student progress and achievement and provide evidence of student learning through archived student work and portfolios.
The University will also be able to share with accrediting teams reports detailing course and program alignment to specific standards and learning outcomes and how data is driving decisions for curriculum development and course design.
Many predefined reports are available, with sophisticated filtering options that allow institutions to create ad hoc reports as needed. Digication can also customize reports based on your needs and add them to the list of predefined reports; custom reports are included at no additional charge. Below is a list of the predefined reports available and the filtering options for each report.
- Assessment By Standard: see evidence and assessment data, with options to filter by Standard(s), Specific Student(s), Major, Graduation, Specific Course(s), Specific Faculty, Evidence Format, Evidence Submission Date Range, Assessment Date Range, and Custom Filters.
- Course Assessment: course evidence and assessment data, with options to filter by Specific Student(s), Major, Graduation, Specific Course(s), Specific Faculty, Assignment Step Types (Evidence, Rubric, Reflection, Standards), and Custom Filters.
- Overview: Standards: see an overview of standards and their current usage, with options to filter by Standard(s), Specific Course(s), and Custom Filters.
Note: It is imperative that assessments include only students in the specific program. This is possible for all programs, but it may require seeking assistance. Courses often enroll students from more than one program, and results can be skewed if program populations are not separated for analysis. It may be necessary to involve a UOEEE college delegate or designee to help isolate student rosters for each program being assessed; because of its importance, all programs are required to work with whomever is necessary to report on program participants only.
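The roster isolation described in the note above amounts to a simple filter before any averages are computed. The student names, program codes, and scores below are hypothetical:

```python
# A course roster often mixes students from several programs; results must
# be restricted to the program being assessed or averages will be skewed.
roster = [
    {"student": "a", "program": "BS JPS", "score": 4},
    {"student": "b", "program": "BA SOC", "score": 2},
    {"student": "c", "program": "BS JPS", "score": 5},
]

def program_only(records, program):
    """Keep only students enrolled in the program under assessment."""
    return [r for r in records if r["program"] == program]

jps = program_only(roster, "BS JPS")
avg = sum(r["score"] for r in jps) / len(jps)  # unskewed by other programs
```

Without the filter, the out-of-program student would pull the course average down and distort the program's results.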