


Assessing Student Learning: A Work in Progress

Richard Ekman and Stephen Pelletier, Change Magazine

December 02, 2009

Lessons Learned
Several key lessons have emerged thus far from the CIC/CLA Consortium:

1) Involve faculty members. First, as other people have noted since the beginning of the assessment movement, engaging faculty members in assessment is essential to improving student learning. Linda DeMeritt, dean of the college at Allegheny College, says, “I think that once you get faculty members to sit down and look at what the CLA is testing, they agree generally that this is a valuable test.” It’s not that faculty members are leery of assessment, she says, but rather that they are wary of any standardized test they fear they will end up “teaching to.” Allegheny’s experience, she says, has been that “when faculty members actually see a test like this, which is not your standard multiple choice test, they begin to see its value.” In fact, DeMeritt says that as the Allegheny community reviews syllabi and teaching effectiveness, it is asking whether it can “incorporate any of these types of performance-based tests into our own pedagogy.” In that sense, she says, the CLA is having an impact by prompting Allegheny’s faculty “to think more in terms of learning outcomes than teaching objectives.” The Cabrini experience also illustrates that once faculty members become comfortable with the idea of measuring outcomes, they become eager to find appropriate assessments and to link them to ongoing efforts to improve the teaching of their subjects.

Ursinus College in Pennsylvania attributes its interest in the CLA to the increased role of faculty members in assessment. The college’s Committee on Outcomes Assessment enables faculty members to draw insights from CLA results that can strengthen student learning. Committee members were especially interested in documenting the impact of the first-year liberal studies seminar, the Common Intellectual Experience (CIE), on students’ intellectual development. The seminar and its assessment results are being monitored for applicability to a planned “CIE for Seniors” seminar.

Faculty involvement doesn’t happen by itself. Both Grimes at Barton and McCormick at Cabrini spent a great deal of time meeting with faculty members to articulate the rationale for using the CLA. Perhaps equally important, from the beginning they had the support of a few faculty members who were willing to experiment with the instrument and who then became advocates for it.

2) Don’t rely on a single measure. Much of the resistance to any standardized test has targeted the danger of relying on a single measure of institutional outcomes (see Richard J. Shavelson, “Assessing Student Learning Responsibly,” Change, January/February 2007). But pairing standardized test results with other assessment measures, such as the NSSE or portfolio analyses, and linking test scores with other relevant assessment data provide more robust diagnostic information for efforts to improve teaching and student learning.

Cabrini College, for example, compared its NSSE “Level of Academic Challenge” scores with its CLA results to inform its revision of the general education curriculum. Stonehill College in Massachusetts had a similar experience: Initial results from both the CLA and NSSE—in combination with increasingly selective admissions practices—led the college to question whether it was adequately challenging its students and, eventually, to propose a modification of its course-credit model. At Alaska Pacific University, the CLA complemented a recent faculty effort to integrate rubrics detailing student-learning outcomes—and ways to assess them—into course syllabi.

Good performance-based models of testing such as the CLA can also inform the development and improvement of other assessments. At Seton Hill University in Pennsylvania, Mary Ann Gawelek, vice president for academic affairs and dean of the faculty, says faculty members and administrators are now asking themselves, “How do we … push ourselves to use more creative, more applied assessment techniques?”