Linda Suskie

  A Common Sense Approach to Assessment in Higher Education

Blog

How many samples of student work do you need to assess?

Posted on November 22, 2019 at 7:00 AM

It’s a question I get a lot! And—fair warning!—you probably won’t like my answers.


First, the learning goals we assess are promises we make to our students, their families, employers, and society: Students who successfully complete a course, program, gen ed curriculum, or other learning experience can do the things we promise in our learning goals. Those learning goals also are (or should be) the most important things we want students to learn. As a matter of integrity, we should therefore make sure, through assessment, that every student who completes a learning experience has indeed achieved its learning goals. So my first answer is that you should assess everyone’s work, not a sample.


Second, if you are looking at a sample rather than everyone’s work, you must look at a large enough sample (and a representative enough sample) to be able to generalize from that sample to all students. Political polls take this approach. A poll may say, for example, that 23% of registered voters prefer Candidate X with (in fine print at the bottom of the table) an error margin of plus or minus 5%. That means the pollster is reasonably confident that, if every registered voter could be surveyed, between 18% and 28% would prefer Candidate X.


Here’s the depressing part of this approach: An error margin of 5% (and I wouldn’t want an error margin bigger than that) requires looking at about 400 examples of student work. (This is why those political polls typically sample about 400 people.) Unless your institution or program is very large, once again you need to look at everyone’s work, not a sample. Even if your institution or program is very large, your accreditor may expect you to look separately at students at each location or otherwise break your students down into smaller groups for analysis, and those groups may well be under 400 students.
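If you’re curious where that 400 figure comes from, it falls out of the standard sample-size formula (Cochran’s formula) for a proportion at 95% confidence, with a finite-population correction for small programs. Here’s a short sketch; the function name and the 500-student program are my own illustrations, not anything prescribed by an accreditor:

```python
import math

def required_sample_size(margin=0.05, z=1.96, p=0.5, population=None):
    """Cochran's sample-size formula for estimating a proportion.

    margin      -- desired error margin (0.05 = plus or minus 5%)
    z           -- z-score for the confidence level (1.96 = 95% confidence)
    p           -- assumed proportion; 0.5 is the most conservative choice
    population  -- if given, apply the finite-population correction
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # 384.16 for a 5% margin
    if population is not None:
        # Small populations need proportionally fewer cases...
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

print(required_sample_size())                # 385 -- roughly "about 400"
print(required_sample_size(population=500))  # 218 -- a 500-student program
```

Note what the correction shows: even a 500-student program still needs well over 200 examples for a 5% error margin, which is why, for most programs, sampling saves less work than people hope.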


I can think of only three situations in which samples may make sense.


Expensive, supplemental assessments. Published or local surveys, interviews, and focus groups can be expensive in terms of time and/or dollars. These are supplemental assessments—indirect evidence of student learning—and it’s usually not essential to have all students participate in them.


Learning goals you don’t expect everyone to achieve. Some institutions and programs have statements that aren’t really learning goals but aspirations: things they hope some students will achieve but cannot realistically promise that every student will. Having a passion for lifelong learning or a commitment to civic engagement are two examples of such aspirations. It may be fine to assess aspirations by looking at samples to estimate how many students are indeed on the path to achieving them.


Making evidence of student learning part of a broader analysis. For many faculty, the burden of assessment is not assessing students in classes; they already do that through the grading process. The burden is the extra work of folding their assessments into a broader analysis of student learning across a program or gen ed requirement. Sometimes faculty submit rubric or test scores to an office or committee; sometimes they submit actual student work; sometimes a committee assesses student work directly. These additional steps can be laborious and time-consuming, especially if your institution doesn’t have a suitable assessment information management system. In these situations, samples of student work may save considerable time, provided the samples are sufficiently large and representative to yield useful, generalizable evidence of student learning, as discussed above.


For more information on sampling, see Chapter 12 of the third edition of Assessing Student Learning: A Common Sense Guide.


Categories: How to assess, Good assessment
