Thursday, March 10, 2011

No to Cloze Tests But Yes to Cloze Tasks

One of the people who replied to my post on readability formulas asked me what I thought of cloze testing. The timing of the question was perfect: I've recently been thinking and reading about cloze testing as an alternative to standard readability formulas, mainly because I recently FAILED a cloze test that Jakob Nielsen included in one of his Alertbox newsletters on web usability.

My answer to the question grew so long--a not unusual occurrence in my case--that I decided to make cloze testing the subject of a new post, because I think my experience with a cloze test illustrates some of its problems. (It has to have problems; otherwise, how could I do so badly?)

If I scored the passage according to the advice of the cloze procedure's inventor, W.L. Taylor--Taylor believed the reader had to fill in the blank with the exact word originally deleted--I got very few right. (And no, I am not revealing how many I got wrong or right.) If I went with one of the alternatives Nielsen suggests, i.e., accepting synonyms as correct answers, I got a little over 60% correct, which technically means the text was within my range of comprehension, as long as you accept that method of scoring. My husband also took the test, and he got a score of 100, again following Nielsen's suggestion that synonyms were fine.

My humiliating experience on Nielsen's cloze test highlights just two of the problems with using it as a test of comprehension: (1) there's a good deal of disagreement about what counts as a correct answer (some people get around the debate by making the tests multiple choice and supplying several possible words or phrases to fill in the blanks, which I think is a good idea), and (2) the reader's background knowledge, or lack of it, plays a big role in performance. The passage was about Facebook privacy policies. My husband is on Facebook. I am not (which is clearly the reason he scored higher than I did--at least, this is what I tell myself).

But there are other problems, or issues, that are much debated in the literature on cloze. For instance, should the test-maker eliminate every 5th word? (I think that's the most common choice.) Or should she eliminate every 6th, 7th, or 9th word in the "mutilated" passage? (That's how some researchers describe passages with deletions. I mention it here because I find the word choice just hilarious.)

Another question: should the deletions be random, or should they target particular parts of speech, nouns versus verbs? Some researchers think nouns are the big meaning carriers, so eliminating them makes the test too hard to be useful. I could go on here, but I think my point is clear: using cloze testing to determine either ease of reading or rate of comprehension raises an awful lot of complex questions about both test creation and scoring.
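For the curious, the mechanical part of the procedure is simple enough to sketch in a few lines of Python. This is only an illustration of the every-nth-word deletion idea; the function name and the blank marker are my own inventions, not part of Taylor's formula:

```python
def make_cloze(text, n=5, blank="_____"):
    """Create a cloze passage by deleting every nth word.

    Returns the 'mutilated' passage and the list of deleted words,
    which can serve as the answer key for exact-word scoring.
    """
    words = text.split()
    deleted = []   # answer key
    out = []       # passage with blanks
    for i, word in enumerate(words, start=1):
        if i % n == 0:
            # Keep trailing punctuation attached to the blank.
            core = word.rstrip(".,;:!?")
            deleted.append(core)
            out.append(blank + word[len(core):])
        else:
            out.append(word)
    return " ".join(out), deleted

passage = "The quick brown fox jumps over the lazy dog while the cat sleeps nearby."
cloze, key = make_cloze(passage, n=5)
# Every 5th word ("jumps", "while") is replaced with a blank.
```

As the post notes, everything beyond this mechanical step--the deletion rate, which words to target, how to score the answers--is exactly where the debates begin.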

All that being said about why not to use cloze tests to determine comprehension or readability, I still think cloze tasks, or exercises, are a great way to sharpen students' sense of how a text builds meaning step by step or sentence by sentence. I also think cloze-based exercises make readers focus on context cues, not just for vocabulary but for overall meaning.

For those reasons, I would definitely suggest including cloze tasks in the classroom. They are a terrific way to get a sense of, for instance, how well or poorly students make use of linguistic cues that tell readers about relationships between ideas. I'm working, for example, on a cloze exercise where all explicit connectives have been eliminated, i.e., transitions, conjunctions, and opening adverbial clauses.

Used in this way, I think cloze testing, or more precisely, tasking, becomes an excellent and very specifically focused diagnostic tool. Of course, it also becomes, technically, more a fill-in-the-blank exercise than a formal cloze exercise, where the deletions are usually decided on by a formula, or at least they were when Taylor introduced the cloze procedure in 1953. But Taylor's formula has been fooled with so much since he first published it in Journalism Quarterly more than half a century ago that I don't feel compelled to follow it too abjectly--which is pretty much my attitude toward all formulas, now that I think about it.

3 comments:

Anonymous said...

Thanks for the detailed response. Next time, I'll try to make my comment follow the right post.

retProf said...

I was interested in what you said about cloze tests or tasks. Do you know of any particular books that do a lot with cloze materials as opposed to ordinary fill in the blank exercises, with which I am familiar?

Laraine Flemming said...

@retProf

No, I don't offhand, but I will do some research and get back to you on that. For diagnostic and practice purposes, I'm a real fan of cloze tasks, so I need to know this for myself as well as to answer you.

About this blog: For years now, whenever I wanted to test out a new exercise or figure out how I'd like to address a new topic, I've been sending out an SOS to teachers I've worked with or met at conferences and online, asking them what they thought of my approach or whether they had another way of addressing, say, improving students' ability to stay focused while reading on the Web.

It has now occurred to me, probably later than it should have, that a blog might be a good way to bring others into these online discussions, which, for me anyway, have been incredibly valuable. So every week or so, I'm going to post my thoughts on a topic that I consider really central to the teaching of reading and writing. In every post, I'll include practical strategies for addressing the topic discussed.

My hope is that other instructors will respond with their thoughts and, over time, we can come up with a repository of teaching methods geared to specific objectives, like teaching coherence in writing or using linguistic cues in reading, and a host of others.