Assessments / by Tyler Wood

What does assessment look like in a backward designed class?

Grant Wiggins hasn't changed his overarching ideas about assessment since his 2002 article Toward Genuine Accountability: The Case for a New State Assessment System. He still promotes building evidence of learning through more frequent assessments rather than a few tests a year that "provide woefully sketchy and delayed feedback, on tasks that do not reflect real achievement" (Wiggins, 2002). As he puts it in his 2006 article Healthier Testing Made Easy: The Idea of Authentic Assessment, "more 'authentic' and comprehensive forms of assessment provide not only significant gains on conventional tests but also more useful feedback" (Wiggins, 2006). His point is summed up in Understanding by Design: "Effective assessment is more like a scrapbook of mementos and pictures than a single snapshot" (Wiggins & McTighe, 2006, p. 152). In other words, he remains consistent in favoring a diversity of assessments over high-stakes testing as the basis of grades, as many places still practice today.

What has changed?

In 2002, Wiggins' "student portfolio" was a smaller and less diverse thing. It included test data from state and district tests, state-approved writing prompts, and the vague "all relevant locally-designed assessments" (Wiggins, 2002). Later, he paints a much richer picture of the diversity of assessments, including "checks of understanding (such as oral questions, observations, dialogues); traditional quizzes, tests, and open-ended prompts; and performance tasks and projects. They vary in terms of scope (from simple to complex), time frame (from short- to long-term), setting (from decontextualized to authentic contexts), and structure (from highly directive to unstructured)" (Wiggins & McTighe, 2006, p. 152). Though the general ideas haven't changed, the specifics have become more detailed.

What should change?

In short - our attitudes. I believe we have the research to back up Wiggins' ideas here. It seems like common sense, and research supports it, that diversifying the way we assess students is a more effective way of determining how they are doing: how well they understand content, think critically about real-world problems, self-assess their learning, and so on. Yet we remain hooked on high-stakes testing, especially here in Korea. I'm not sure why we can't adapt to this information, but it remains the Achilles' heel of our education system. I remain tied to testing at my school, but I try to build up other methods of assessment to counteract the weight of the test. I want to fill the grading system with other types of assessment to get a better picture of the classroom. I use quick oral assessments in every class, journal writing prompts, quizzes, and technology-based critical thinking projects, among other things, to build a portfolio of learning. I also encourage each student to look at and evaluate each of these assessments and grow from their mistakes, instead of just taking them home and throwing them away.

South Korea does well on international testing. Does that prove that relying on high-stakes testing works?

I'm not well versed in the history of Korean education, but it wasn't strong until well after the war. The 'dictator' who led the country to prosperity (and whose daughter is the current president) also built up the education system in the 1970s (Dalporto, 2014). The system's success on international testing is a thorny subject. I've read many interpretations, but I think it's certainly not something that can be replicated well in other societies. It is very high-stakes oriented, and Korea's high student suicide rate shows just how high the stakes are. However, education remains a powerful force for self-improvement and is heavily promoted, so it remains a goal most students have been raised toward their whole lives, which helps create a motivated student body. Still, they mostly do well in subjects that can be memorized, like math and science, and, as Clark Sorenson notes in his article Success and Education in South Korea, "educated Koreans often respond to questions about South Korean students' mastery of math by noting that none of the world's famous mathematicians have been East Asian." In other words, high-stakes testing may produce high test scores, but those scores do not transfer to the critical thinking skills it takes to discover something new in math, and perhaps in science (though I think Korea has had more success in science than math).

Having said that, it seems clear to me that relying on these tests to tell me about my students is weak at best and completely inaccurate at worst. I don't teach something that can easily be memorized; I'm teaching a living language. Language (for fluency) cannot be memorized, because we need to be able to build our own sentences to communicate our own unique ideas. How can you create something new if you rely only on information someone gave you? The tests help me see what the kids remember about the content, and that's what I use them for, but to get a grasp of what they think, I use journal writing, class discussion (formal and informal), questions (from them, and from me to them), and other small assignments and tasks that relate to the particular skill or skills we are learning.



Dalporto, D. (2014). South Korea's School Success. We Are Teachers. Retrieved from

Sorenson, C. (1994, Feb.). Success and education in South Korea. University of Washington. Retrieved from

Wiggins, G. (2002, Jan.). Toward genuine accountability: The case for a new state assessment system. Edutopia. Retrieved from

Wiggins, G. (2006, Apr.). Healthier testing made easy: The idea of authentic assessment. Edutopia. Retrieved from

Wiggins, G., & McTighe, J. (2006). Understanding by design (2nd ed.). Upper Saddle River, NJ: Pearson.
