By: Dr. Christina Dryden, Director of Assessment
Assessment is a vital part of the core APUS mission and work in higher education. As such, our Assessment and Accreditation Department is a multifaceted group that facilitates Higher Learning Commission accreditation activities and all programmatic/specialty accreditation endeavors, as well as oversees assessment processes including curricular mapping and program review. Our 2016 goals collectively included the following:
- Complete a quality assurance process for curricular mapping of institutional learning outcomes for all programs;
- Monitor and evaluate signature assignment data in program review;
- Implement program- and university-wide rubrics;
- Provide additional student learning data to students and stakeholders; and
- Enhance the assessment web presence for demonstrating data and institution-wide assessment processes.
Fundamentally, colleges and universities need to ask, "Are our students learning?" and "How do we know?" Assessment of student learning across the university promotes academic quality and continuous improvement, both of which are especially important to faculty members.
While faculty are involved with assessment at the course level, APUS is also rich with assessment at the program and institutional levels. From the Educational Testing Service (ETS) Proficiency Profile and National Survey of Student Engagement (NSSE) to employer surveys, direct and indirect measures are used to assess student learning and satisfaction at the course, program, and institutional levels. A major component of our assessment process is program review, conducted for every program every three years. Program review can result in creation or deletion of courses, changes in the program core, addition of specializations within the program, or other program adjustments.
The university assessment process is richer when faculty participation is threaded throughout and not just limited to the classroom. By working together to answer the question, “Are our students learning?” we can look for areas of improvement and ensure the quality of our degrees.
In 2013, the Lumina Foundation's Degree Qualifications Profile (DQP) framework, along with digital literacy, was adopted by APUS as its Institutional Learning Outcomes (ILOs). Lumina states that "The DQP outlines a set of reference points for what students should know and be able to do upon completion of associate, bachelor's, and master's degrees – in any field." These reference points include specialized knowledge, broad and integrative knowledge, intellectual skills, applied and collaborative learning, and civic and global learning.
Developed to address the knowledge needs of an ever-changing global and technological workplace, the DQP provides the lens for assessment efforts at APUS, where terminology has shifted from "DQP framework" to "ILOs." With the new ILOs in place, the next steps occurred over 2013-15 and included mapping of program objectives and identification of "signature assignments."
The process provided an opportunity to bring university community members together to discuss program curricula and define key assignments for assessing the ILOs. This curricular mapping process with signature assignments provided another opportunity to track student success by incorporating reporting within the university program review process. At APUS, continual improvement in student learning is aided by program review, which provides an opportunity for program directors, deans, and faculty members to evaluate their program in detail every three years. In fact, an article highlighting our efforts was published by the National Institute for Learning Outcomes Assessment (NILOA).
All associate, bachelor's, and master's programs launched prior to 2015 now have ILO maps with identified signature assignments. To provide feedback and strengthen the alignment between program objectives and signature assignments, the Assessment team completed a quality assurance review. This process looks at the assignment selected by the program director and rates the following:
- Is all information complete? (course number, course name, learning objective, assignment name, description, and rationale for selection)
- What is the overall alignment? (ILO, program objective, course objective, and assignment using keywords and concepts)
The key to rating the alignment is matching the ILO language to the assignment description. Quality assurance review findings are reported to school deans and program directors to be used in program reviews or at other times when changes to the program are considered.
During ILO mapping and the identification of signature assignments, we recognized a need for rubrics that aligned with the newly adopted ILO areas and could be used university-wide, regardless of the program or signature assignment. The availability of university-wide rubrics is nothing new at APUS: writing rubrics for lower- and upper-level undergraduate and graduate courses can be found on the Learning Outcomes Assessment website.
ILO rubrics were created and adopted by faculty for their programs. Each ILO rubric incorporates the ILO into the rubric's criteria and then builds on the ILO language to define a full range of mastery levels. For example, one of the bachelor's degree Specialized Knowledge ILOs is "defines and explains boundaries, divisions, styles, and practices of the field in a clear manner." This definition is set as the "Accomplished" level. By adding descriptions for "Exemplary," "Developing," and "Beginning," the criteria for the rubric are set.
Faculty can use the pre-built ILO rubric criteria and add them to any current or new rubric. By incorporating the ILO rubric criteria, both faculty and students will begin to learn more about the ILOs and understand how ILO mastery is defined.