A Unified College Plan for Assessing Student Learning
The goal of this project is to assess student learning at multiple levels using a common reservoir of data and to increase faculty collection and submission of that data. To that end, we will develop a step-by-step online guide and tutorial to train faculty to track and report general education and course learning outcomes. The tutorial and online guide culminate in an electronic assessment reporting rubric. This e-rubric will permit data to be cross-referenced within and across courses and disciplines.
Reasons for Project
On review of our 2006 and 2010 Systems Portfolios, our new VPAA became aware that faculty needed to provide complete and regular documentation of their students' academic learning. The shortcoming was not with program reviews, which tend to focus on specific sets of specialized courses. Rather, a better process was needed for collecting and reporting general education and student learning outcomes. Our Student Learning Committee revised our five general education outcomes in 2011 and developed a process whereby faculty submitted artifacts (portfolios, papers, tests) for peer review. (This academic year we will move from assessing two general education outcomes per year to all five.) The return rate from faculty was low, however, which prevented a clear snapshot of student competency levels with regard to general education outcomes. Furthermore, faculty tracking of student learning outcomes seemed to depend on the program. If a program needed to report data to an accrediting body, as nursing does, there was ample data; if a program was not tied to such a body, data was reported inconsistently. Systems Portfolio reviewers wrote that there was too much anecdotal evidence and not enough hard data. As a result of that feedback and our own observations, we initiated this project.
Organizational Areas Affected
Full-time faculty will input student data into the e-rubric during the fall 2013 semester. Part-time faculty will join the process beginning in spring 2014. Division administrative assistants and deans will monitor submissions to ensure that all faculty submit assessment reports. The VPAA and the Student Learning Committee will evaluate the assessment process and make adjustments where appropriate.
Key Organizational Process(es)
This project will affect three organizational processes: assessment, accreditation, and state appropriations. Accordingly, our college will need a systematic process in place to accurately track the progress of our students. As part of our growing culture of quality, the assessment plan will also provide raw data that individual faculty, departments, and divisions can use for program and accreditation reports and for improving course design and delivery, all in support of student success and completion.
Project Time Frame Rationale
One academic year provides sufficient time for information/feedback sessions and beta testing. Moreover, this period allows for a three-phase sequential rollout: draft, pilot, and modification. The project began in June 2013 with meetings among deans, chairs, and program directors, most of whom now serve on our Student Learning Committee. We presented a draft of our project plan to faculty in August 2013 during our in-service week, holding two information/feedback sessions, one for full-time faculty and one for part-time faculty. The VPAA and the interim dean of liberal arts also made a presentation on assessment to ensure that faculty were familiar with its general terms and processes. Full-time faculty will be expected to report assessment results beginning this fall, and part-time faculty will begin the process in spring 2014. The fall 2013 pilot gives us time to evaluate the process before bringing all faculty on board. We will aggregate and analyze data at the end of the fall and spring semesters with input from faculty and our Student Learning Committee, making changes as needed.
Project Success Monitoring
- Number of faculty who submit data to e-rubric
- Part-time and new full-time faculty trained on the e-rubric in spring 2014
- Modifications made to the e-rubric/assessment process prior to spring 2014 based on faculty feedback
Project Outcome Measures
- 100% of faculty entered assessment data using e-rubric
- All five general education outcomes had a minimum of 10 course artifacts submitted by faculty
- Faculty provided feedback during fall and spring in-service and division meetings on assessment process
- Identified and tracked changes faculty made to course syllabi, course design, and course delivery based on assessment data; reported any best practices