REEP Writing Rubric



The following Guest Discussion took place on the NIFL-Assessment Listserv between February 14 and February 18, 2005. The topics are the REEP Writing Process and Rubric, and voice in writing.


Read Discussion Summary: REEP Writing Rubric


Announcement

Good morning, afternoon, evening, and Happy Valentine's Day to you all!

Please join us this week for our discussion of the REEP Writing Process and Rubric with Suzanne Grant and Pat Thurston of the Arlington Education and Employment Program (REEP) in Virginia. The full information and suggested preview resources for the discussion are listed for you below. Also, a reminder that Suzanne and Pat will be joined by several trainers-in-training in the REEP process.

Welcome colleagues from REEP!!

February 14 - 18
Topic: Assessing Writing, Developing Rubrics, and Developing Effective Writing Tasks
Guests: Suzanne Grant and Pat Thurston, REEP Master Trainers

Recommended preparations for this discussion:

"The REEP Writing Story" [1] [2] which discusses the development of their writing process and the accompanying rubric.

"Making Sense of the REEP" [3] which discusses one program's experience with and reflections on using the REEP process.

marie cora
Moderator, NIFL Assessment Discussion List, and
Coordinator/Developer LINCS Assessment Special Collection [4]


Discussion Thread

Good Morning: I wish to congratulate the REEP creators! The Brooklyn Public Library Literacy Program moved to writing in the early '90s. In an effort to codify students' gains, and with a grant from the then Lila Wallace Foundation, we created a writing rubric for non-reading adults, up to about a fifth-grade reading level. We have been using it successfully for years. But because it was not normed, we couldn't use it to show gain in an NRS environment.

I have downloaded your article and handed it off to the folks at the New York State Education Dept. I found this all very exciting.

Susan K. O'Connor
Brooklyn Public Library
Literacy Program Manager


Folks,

Check out the articles in the first volume of this online journal.

I downloaded three of them. Some good writing here.

It's also a good place for those writing in the field to publish their work.

The Online Journal of Adult & Workforce Development [5]

George Demetrion


Greetings from Suzanne and Pat at REEP!

We would like to thank Susan O'Connor for her message, congratulations, and for getting this writing assessment discussion started.

The development of our writing assessment was also supported by the Lila Wallace Foundation. From 1997 to 2002, the REEP Program was one of 12 adult education programs nationwide funded to study what works in assessment. The project, the What Works Literacy Partnership (WWLP), was funded by the Lila Wallace Foundation, with Literacy Partners of New York as the lead agency. We had developed the rubric earlier, but through WWLP we developed pre- and post-prompts and carried out studies to determine the effectiveness of using the rubric to measure progress.

Susan and all on this list, we would be interested in hearing what writing traits you feel are important to include in writing assessment rubrics. In our case, the engagement of the writer with the topic was a factor in how we were assessing the writing, and we, therefore, felt we needed to include voice as a trait in the REEP Writing Rubric.

Other writing assessment questions and topics are welcome.

Suzanne Grant and Pat Thurston
REEP Writing Assessment Master Trainers
Arlington Education and Employment Program (REEP)
Arlington Public Schools
Arlington, Virginia


Hi everyone,

A couple of observations:

First, please do note that this assessment is a fine example of a performance-based assessment that has been standardized. So if anyone still thinks that standardized assessments all look like TABE, consider your myth debunked.

I think that capturing voice in writing is quite important, and I'm glad that the REEP rubric includes this area. Without voice, the rest of the examination of the writing is based on the 'academics' of the writing, and I feel that leaves out the writer's (emerging) personality. In looking around a little, I note that not many other writing assessments take voice into account (the GED, for example, does not). I also think that because voice is a dimension of the rubric, students will pay more attention to that area and view it as equally important as the other dimensions. (A bit of "what counts gets counted" there.)

What do others think about voice and the other dimensions?

marie cora


Hello:

Since reading the rubric and noting the inclusion of voice, I have spent an extraordinary amount of time pondering this particular assessment area. It is a difficult area to assess. Writing can have strong or weak elements of voice; however, it would be difficult to assess someone's writing voice as right or wrong, unlike, say, grammatical errors. Voice is the culmination of many things, and whether it should be assessed outside of accelerated or gifted high school programs or college English classes is an interesting question. Teaching writing students about voice is as necessary as teaching other elements of writing, but because many of these students lack basic writing experience, I am not "sold" on the benefits of using it as an assessment area.

Thanks,
Shannon Purcell
Adult and Community Education
Leon County Florida


Marie:

I think I would have to see the two pieces to provide valuable input; however, with what you have provided, I will take a "shot" at it. It appears that what we are witnessing here is a difference in structure. The first student has chosen a time-ordered piece, a narrative (in time), as the structuring element. The second student provides text written in list form, wherein the order of the items is not anchored to a time-sequential format. Keeping these two structures in mind, I would say that it is more a difference in structure than in voice. But keep in mind, I do believe that voice is a culmination of many elements, including but not limited to structure. It is because of this that I am not "sold" on the idea of using voice as an assessment area.

Shannon Purcell
Adult and Community Education
Leon County Public School District
Leon County Florida


Shannon and all,

Voice can indeed be difficult to assess, certainly more difficult than other assessment areas, such as organization, content, mechanics, and structure. Voice is harder to "quantify", is more a question of degree of engagement than of correct voice or incorrect voice, and is more subjective.

Nonetheless, as we read and scored hundreds of student essays using various rubrics that did not include voice, we felt there was something missing: these other rubrics did not capture our students' writing abilities. Our purpose in developing our rubric was to describe what we found, what our students could do.

We saw voice in the writing of even our beginning level adult English language learners. As the students' writing developed in other areas, the voice developed as well.

We also learned (the hard way) that not all writing topics lend themselves to bringing out a writer's voice. We found that the key to generating voice in our adult student responses was an engaging topic. Our students are engaged by topics that provide them with an opportunity to validate their life experiences.

We'd like to hear others weigh in on this.

Suzanne Grant and Pat Thurston
REEP


Hello All,

I, too, have been intrigued by the idea of "voice" in the rubric, and while I intuitively "know" what it means, I'm interested, as an emerging writing specialist, in what elements would constitute voice beyond more traditional "academic" ways of "measuring" it. I think of the clarity or persuasiveness of a point of view supported with meaningful examples, the personal voice of a narrator struggling with complex questions, forthright emotion strikingly articulated with imagery or other means, an attempt at critical thinking, or "learning to learn," self-reflectiveness... I'd be interested in hearing from others.

Another point, from when I was involved with Connecticut's work with the CASAS writing assessments: the rubric was not meant to distinguish between ABE and ESL students. As an evaluator, I, as an ESL specialist, was at a disadvantage: having attained a certain level of skill in "translating" English learners' language into meaningful utterances, I automatically brought that to my evaluation. It was extremely difficult to adhere to the rubric controls and anchors and not want to credit an ESL learner, attempting with limited language ability to voice something difficult to articulate in another language, with having communicated more than in fact they did.

Best,
Bonnie Odiorne, Ph.D.
Writing Center, English Language Institute
Post University, Waterbury, CT


Hi Bonnie, thanks for this.

Yes, I think it would have been really tricky for me to have a rubric that didn't distinguish between ESOL and ABE students, unless perhaps they are transitioning from ESOL to ABE. It's tricky enough, as you note, to adhere to rubric anchors and so forth; working with different populations on the same assessment would add a layer that I would also find difficult.

CASAS folks: can you tell us why the writing rubric is not separate? What's the rationale there? It seems like the needs, especially at the lower levels, would be very different.

REEP folks: what do you think about that? Perhaps that was never a consideration for you though, since REEP serves the ESOL population (is that right?).

Thanks,
marie cora


I, too, am interested in the question of separate rubrics for ESOL and ABE/ASE learners. At my program we don't differentiate, and the simplified reason is that we don't hold learners to different standards. Our instructors see "good writing" as "good writing" whoever is doing the writing.

Of course, at the lower levels of the rubric, one can usually distinguish the native born speakers and writers from the ESOL speakers and writers by the type of errors and issues in the writing. But, does that mean there should be a different rubric, or qualifiers, or descriptors for ESOL learners?

Is separate inherently unequal, or is it appropriate and necessary to facilitate learning?

Howard L. Dooley, Jr.
Director of Accountability, Project RIRAL


Hi Howard, how are you? Thanks for your reply.

Can you show us some of the descriptors in your rubric? Do you find that ESOL learners have the same types of challenges as ABE learners in writing then? Do ESOL and ABE people attend writing classes together in that case? Or are they in separate classes? And if so, how do your classes align with your rubric?

I guess I'm having a hard time envisioning your rubric (I feel like it has to be enormous to cover everything from a beginning ESOL level through an advanced ABE level).

Thanks,
marie cora


Marie,

Howard has articulated the main reason that the CASAS rubric is for both ABE and ESL learners. He said, "We don't hold learners to different standards. Our instructors see 'good writing' as 'good writing' whoever is doing the writing."

We would add that employers and others on the receiving end of our students' writing don't have different standards, either.

We would recommend placing ESL and ABE students in different classes since instruction and the kinds of strengths and errors will be very different for the two groups, but the general characteristics of writing for both groups can be described within a single rubric. We have been working with this for nearly ten years and have become very comfortable with scoring both types of learners on the same rubric, though it is often necessary to be careful not to over-reward ESL learners for "trying" when they haven't quite succeeded in writing at a certain level.

In answer to your earlier questions about writing prompts, I can respond with respect to the CASAS Functional Writing Assessment Picture Task, which is currently being used for accountability reporting in Kansas, Iowa, Connecticut, Oregon, Indiana, Vermont and New York Even Start. Prompts for this task are line drawings showing a scene with a central critical incident as well as a number of other things happening in the picture. This type of prompt can be answered by students from beginning to advanced levels in ABE, ASE and ESL programs.

It takes a long time to develop a viable prompt, with many rounds of revisions based on field-testing input from teachers and students and back and forth work with an artist. They are written by a small team of test developers who have extensive experience as adult ed. teachers. Topics for the prompts come from needs assessments from adult ed. programs and workplace surveys. We currently have seven prompts - four that are on general life skills topics (a car accident scene, a grocery store check-out scene, a park scene, and a department store scene). There are three more prompts that have a workplace focus - a restaurant kitchen scene, a hotel scene and a warehouse scene.

Like the REEP prompts, these are scored with an analytic rubric, but with slightly different categories: Content; Organization; Word Choice; Grammar and Sentence Structure; and Spelling, Capitalization and Punctuation. The categories are weighted, with more importance given to the first three to emphasize the communication of ideas in writing. We have recently completed a study to convert the rubric scores to a common IRT scale, which provides a more accurate means of reporting results across prompts. We have also just completed a cut-score study to refine the relationship of the CASAS Picture Task writing scores to the NRS levels.
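
To make the weighting concrete, here is a minimal sketch in Python of how weighted analytic-rubric scoring can combine category ratings into a single score. The numeric weights and the sample ratings are assumptions for illustration; CASAS's actual weights and score ranges are not given in this discussion.

 # Hypothetical weights: the first three categories count double, reflecting
 # the emphasis on communication of ideas described above.
 CATEGORY_WEIGHTS = {
     "Content": 2,
     "Organization": 2,
     "Word Choice": 2,
     "Grammar and Sentence Structure": 1,
     "Spelling, Capitalization and Punctuation": 1,
 }

 def weighted_total(ratings):
     """Combine per-category ratings into one weighted score."""
     return sum(CATEGORY_WEIGHTS[name] * score for name, score in ratings.items())

 sample = {"Content": 4, "Organization": 3, "Word Choice": 4,
           "Grammar and Sentence Structure": 2,
           "Spelling, Capitalization and Punctuation": 3}
 print(weighted_total(sample))  # -> 27 with these assumed weights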

With all of the work that goes into developing and standardizing a test prompt, it is not made available for classroom practice. However, we have found several published materials that contain similar types of pictures that can be used for classroom practice.

We encourage programs to share the rubric with students for instruction, in addition to using it to communicate test results to teachers and learners. Many teachers tell us that completing the training for the writing assessment, which focuses on the scoring rubric, has given them a better understanding of how to approach the teaching of writing. The analytic rubric provides clear diagnostic information about students' strengths and weaknesses in the different rubric categories.

I am very pleased that some states are choosing to include writing in the mix of assessments that can be reported for accountability purposes. It is more work to include performance assessment in a state's accountability system, due to the additional training and scoring demands, but the states that are doing it have found it to be worth the extra effort.

Linda Taylor, CASAS
(800) 255-1036, ext. 186


Well, I think Linda answered for me, and better than I could. Thanks, Linda! We are using the CASAS writing assessments and, in the GED preparation classes, the GED rubric and its analyses for writing.

In the lower-level ESL classes we do not assess writing with a standardized instrument. We use informal assessments and measures, though we provide the rubric to every teacher so they can see what will be expected of learners who choose to continue class work at the higher ESL levels or who transfer into our ABE and ASE classes.

At the lower ESL levels, we don't need the information a standardized writing assessment would provide for our program decisions, the instructors don't need it for instructional decisions, and the state and feds don't require it (we use reading and/or listening for our federal reports).

Howard Dooley


Howard, you need to revisit the NRS requirements (federal reporting). ABE students as well as ESL students need to be assessed in writing. ABE assessment is reading, writing, and math; ESL assessment is reading, writing, and listening. This is what is required to determine the Educational Functioning Level (EFL), and it must be done with a normed, standardized instrument, not an informal assessment.

Your statement: "instructors don't need it for instructional decisions, and the state and feds don't require it (we use reading and/or listening for our federal reports)."

Shauna South


Shauna --

The NRS guidelines state: "The functional level descriptors describe what a learner entering that level can do in the areas of reading and writing, numeracy, speaking and listening and/or functional or workplace skills. The local program need not assess the learner in all areas, but the assessment should be in the areas in which instruction will be focused. If the learner is functioning at different levels in the areas, the lowest functioning level should be the basis for initial placement."

My statement was that writing is not an area in which instruction is focused at that level, and hence we are not required to assess it with a uniform, standardized assessment.

Your email seemed to state that a learner must be assessed with a standardized instrument in all instructional areas to determine the entering functional level. This is not true. For another example, at the High Intermediate ABE and Low Secondary ASE Levels, my program provides separate Mathematics classes. In these classes, we only assess using the CASAS Life Skills Mathematics assessment. Any other assessments would not only be irrelevant, but disruptive to the instructional process -- as well as damn annoying to the teachers and learners!
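
For those who want the placement logic spelled out, here is a minimal sketch in Python of the rule quoted above: when a learner functions at different levels across assessed areas, the lowest level is the basis for initial placement. The level names and their ordering below are illustrative, not official NRS definitions.

 # The rule: the lowest functioning level among the assessed areas
 # is the basis for initial placement.
 LEVELS = ["Beginning Literacy", "Beginning Basic", "Low Intermediate",
           "High Intermediate", "Low Adult Secondary", "High Adult Secondary"]

 def initial_placement(levels_by_area):
     """Return the lowest functioning level among the assessed areas."""
     return min(levels_by_area.values(), key=LEVELS.index)

 # A learner stronger in reading than in writing is placed at the lower level.
 print(initial_placement({"reading": "High Intermediate",
                          "writing": "Low Intermediate"}))  # -> Low Intermediate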

Howard Dooley


Kansas has used the CASAS Functional Writing Assessment (FWA) for almost 10 years. While it requires an enormous commitment of time and energy to ensure that the scoring of a performance-based assessment is standardized, Kansas adult educators have responded positively to the lengthy process of being "certified" to use the FWA and to maintaining certification. They report that the process has helped them become much better teachers of writing.

Dianne S. Glass
Director of Adult Education
Kansas Board of Regents
1000 SW Jackson Street, Suite 520
Topeka, KS 66612-1368
Phone: 785.296.7159
FAX: 785.296.0983
dglass@ksbor.org


Hello Linda, Marie, Bonnie, and Howard and all on the list,

Rubrics for ABE and ESL learners:

We developed the REEP Writing Rubric to describe what our adult ESOL learners could do, so our rubric was not designed for native speakers. But is it appropriate for them? In a study comparing the REEP Writing Rubric and the CASAS rubric, REEP readers scored CASAS essays with the REEP Writing Rubric while CASAS readers scored REEP essays with the CASAS rubric. The CASAS essays were a mixture of ESOL and native-speaker writers. REEP readers were able to score the CASAS essays effectively with the REEP Writing Rubric. The only real difference that we noted was that the native speakers consistently achieved higher scores for structure than the ESOL learners. For those interested, this study can be found on the CASAS website [6].

Suzanne and Pat
REEP
Arlington, Virginia


The CASAS writing assessment is valuable in assessing independent writing skills. I would question its value in evaluating "voice" in that the writing prompts are highly selective in asking students to respond to one of several descriptive scenarios.

In measuring accuracy of response based on the 4-5 rubric categories, it's not particularly supportive of process approaches to writing, which oftentimes provide the idiosyncratic format wherein "voice" might flourish.

This is not to take away from what CASAS does measure--accuracy and fullness of response to a specific prompt--and there is much merit to that kind of measurement. Voice, in my view, requires a different sort of measurement. For example, one might get at that by evaluating a collection of student writing in a given program according to the literary quality of the expression.

I'm not sure a rubric would be the best form of measurement for that, though I would not rule that out. Also, on the CASAS writing assessment, the resulting essay might be viewed as a manifestation of authorial voice, but that's not what I would be primarily looking for in such an "artificially" constructed essay.

While there may be (and ideally should be) convergences in underlying pedagogical assumptions undergirding the type of writing fostered by the CASAS writing prompts and a more free flowing "existential" narrative fostered in process writing schools of thought, the differences may be even more critically important.

That said, I believe a worthy discussion could ensue here on the multiple purposes of a writing program in adult literacy education below the GED level, a discussion that could be stimulated by reflecting on the differences in the types of writing that CASAS prompts and process-writing orientations stimulate.

What also would be of interest are the ways in which the REEP rubric relates to the two types of writing.

George Demetrion


Hello all,

Thanks to those who have shared ideas on voice in writing. Marie gave an example of two essays that were very similar in all aspects except voice.

It is fairly safe to assume that all of us at some point in our careers have read essays on the topic "A Holiday in My Country." Think about one of those essays now. Did the topic generate a rich variety of responses and engage the learners? In our experience, it did not. But add a strong element of voice to an otherwise predictable response (when the holiday is celebrated, how it is celebrated, what special activities are involved), and here's what you get:

       "My country Bosnia has always been on the border of west and east, and Christian and Islamic
worlds. That way there are lots of different holidays, and everybody likes to celebrate many of
them. New Year is celebrated three days, and after that there is a whole month off for school
children. Because of that most people spend the holiday in the mountains surrounding Sarajevo.
There are a lot of skiing centers, hotels, and small private houses which are only used for
weekends and holidays. In fact, the whole town moves to the mountains.
There are Christmas trees and presents like for Christmas in the U.S.A. For New Year's
dinner there are usually turkey, Russian salad, chocolate torte, fruit salad, and lots of
different meats and cakes. There are different kinds of drinks.
At 12:00 midnight, everybody goes to ski, except those who took too much alcohol.
People meet each other and say Happy New Year and kiss each other. Most of them take a vacation
week after and stay long with the children. That way the New Year is much longer than three
days. January is beautiful and sunny in the mountains and foggy in town.
Now all the mountains, beautiful hotels, and weekend houses are occupied by the Serbian
Army (Chetniks). They use those beautiful places to kill civilians in Sarajevo, and there is no
more happy new year in my country."

This writer addresses all parts of the task (paragraphs 2-4) but makes the topic her own with her introduction and powerful conclusion. In this essay, we see the elements of voice that Bonnie described in her earlier posting, with the narrator struggling with complex questions and articulating her emotions.

Hope this sheds some more light on the topic of voice in writing.

Suzanne Grant and Pat Thurston
REEP
Arlington, Virginia


I'd like to add some personal experience and insight from using the REEP Writing Assessment for the first time in our 2005-06 school year. My adult education ESOL program is straining under rapid increases in the number of students seeking adult ESOL instruction. Keeping up with the demand has many dimensions in program planning, but hiring qualified teachers and then offering in-service training specific to our parameters is certainly critical.

Having gone through the RWA training, as well as sending staff to training, I find that teachers have come to see the RWA as more than just a standardized assessment (which was our primary goal for implementing it). It has become multi-dimensional in its use as an instructional tool as well. The rubric is leveled and provides bulleted descriptions of the writing characteristics students demonstrate at each level. Teachers are able to use the rubric as a diagnostic tool for students' writing and alter instruction according to learner needs.

As teachers have become more aware of learners' needs in the language skill of writing, they have become more aware of integrating all four language skills in lesson planning. They have also self-diagnosed areas for professional development as they have seen a need for improvement in teaching writing. This has become increasingly important for our particular population, many of whom are looking to transition to academic programs and need to have writing skills in place for that transition.

Debby Cargill

Debra H. Cargill
Lead ESOL and Program Developer
Prince William County Public Schools
Adult Education
P.O. Box 389
Manassas, VA 20108
work 703-791-8387
fax 703-791-8889
cargildh@pwcs.edu
www.pwcs.edu/curriculum/adulted/services.htm


Hi everyone,

I was wondering about the writing-prompt end of things with the REEP (and other writing assessment tools as well). Do you have dozens of prompts that people can select from? Are they available for teachers and students to see and practice with beforehand? Can anyone develop a prompt? How does that all work?

How do folks PRACTICE their writing before they get to the test part also? Can they use the rubric in class as well, not just as an assessment?

Sorry, that might be 2 questions in there!

Thanks,
marie cora
Moderator, NIFL Assessment Discussion List, and
Coordinator/Developer LINCS Assessment Special Collection [7]


Hello all,

In response to Marie's question, the REEP Writing Assessment currently has 4 available prompts (two developed by staff at REEP and two developed by the Center for Educational Assessment at UMass). Both UMass and REEP are working on additional test prompts. So, are there dozens and dozens of prompts that people can select from? No.

In an earlier posting, Linda noted that it takes a long time to develop viable prompts, and we would add, particularly for accountability purposes. However, not all prompts need to be put through the rigor that the CASAS and REEP prompts have undergone. There are certainly dozens and dozens of good prompts that can be used effectively in a classroom setting to develop and assess writing. These are some of the questions we ask and the characteristics we look for in a writing prompt, both for testing and classroom purposes:

Questions to answer before prompt development: Who will take the tests? What levels of students?

Characteristics of Effective Prompts
The topic:

- Has a controlling idea that assists with organization and development.
- Generates a variety of responses.
- Adjusts to students' abilities and life experiences. (Everyone can write something about the topic.)
- Has a universal subject; does not require knowledge about a specific subject.
- Generates a variety of tenses and structures.
- Provides an "out": students can choose whether or not to take an emotional risk.
- Is one you'd like to write about.

So, can anyone develop a prompt for the classroom? Why not?

In terms of how students can practice their writing before the test: at REEP, students develop their writing skills by writing about a variety of topics. In the case of the life-skills topics in our curriculum, students can write about their jobs or job goals in the work unit, write to elected officials about challenges immigrants face in the community unit, or write family histories in the parenting unit. Not all writing topics and practice need to relate to life skills; important people in the students' lives and life experiences are also engaging topics. Feel free to peruse REEP's on-line curriculum for more writing ideas: [8].

In the classroom as in the REEP Writing Assessment, pre-writing activities are an essential step in the writing process. For those using the REEP Writing Assessment, we also stress the importance of practicing the types of pre-writing activities that will be found on the test (group brainstorming and pair conversation activities) since students should not be introduced to new types of activities during a test.

With respect to sharing the rubric with students: by all means! It is important for students to know the standards against which they are being assessed. Also, after the students' writing has been assessed with the rubric, the next step in the students' writing development is articulated in the next level of the rubric. For example, a student whose structures can best be described as "restricted to basic patterns" can see where he needs to go next - "compound, complex sentences with more control of present and past tense."
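
As a small illustration of that "next step" idea, here is a minimal sketch in Python. The two structure descriptors are quoted from the rubric as described above; the level numbers and the dictionary itself are hypothetical.

 # Leveled descriptors for the "structure" trait; level numbers are assumed.
 STRUCTURE = {
     3: "restricted to basic patterns",
     4: "compound, complex sentences with more control of present and past tense",
 }

 def next_step(current_level):
     """Return the descriptor a student should work toward next."""
     return STRUCTURE.get(current_level + 1, "top of the rubric")

 print(next_step(3))  # -> compound, complex sentences with more control ...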

Suzanne Grant and Pat Thurston
REEP
Arlington, Virginia
