[NIFL-ASSESSMENT:490] Re: Training in good assessment practice

From: French, Allan (afrench@sccd.ctc.edu)
Date: Thu Apr 08 2004 - 18:22:41 EDT



Marie:

I will try to respond to your questions in order.

I believe that understanding such technical concepts as validity and reliability would help test administrators understand why certain rules set by the makers of a standardized test are necessary.  Even with training in basic test administration, consistency in the actual administration of such tests is a problem in practice.  In addition, each program makes specific decisions about room, time, number to be tested, etc., so it would help those who make such local decisions understand the technical standards they should strive for.  On the other hand, we are all well aware that physical, budgetary, temporal, and personnel constraints often impinge on all of this more than test makers and government agencies would like to acknowledge.

Coming from the other direction, I also think that being trained to conduct standardized tests can help with designing other types of assessment, including performance-based assessment.  Even the best ideas for the latter must face the hard reality of execution, especially the need to make these valid, reliable, fair, and useful.  I don't think it is too much of a stretch to argue that a competent understanding of good test administration could be one important factor in determining whether a newly designed assessment instrument turns out to be successful or problematic.

Washington state was mandated by the feds to use a standardized test for ESL and ABE/GED.  The state organization of program directors chose the CASAS over the BEST for ESL, and the CASAS over the TABE for ABE/GED.  These were our only choices as they were the only instruments on the feds' approved list.  The program directors' choices became the rule for the entire state.  It was intended that instructors could, and I know that many in fact did, influence the votes of their respective directors.  But once the decision was made, that was it.  For us, we all understand that CASAS is not going to go away.

With respect to your final question, our problem out here is that we are spending so much energy on complying with federal mandates that we don't get around to discussing, as a state-wide field, instruction.  Hopefully, once we become more efficient with CASAS administration, we can look at improved integration of assessment and instruction.  From my point of view, you need to first consider your desired educational goals/standards/outcomes, second look at what the mandated tests fail to assess in those, and finally proceed very gradually (small steps) at developing not just effective assessment tools, but also ways to integrate these into instruction that are both practical and useful.  I can't be more specific than that.  Again, take your time, do it in manageable stages, and be continuous (i.e., don't be gung ho one year, put it aside the second year, then try to revive the process the third year when you have forgotten what was accomplished the first year).  In all honesty, I have to say that this recommendation comes more from my critique of previous assessment movements and trainings than from a tried-and-true model.

Allan French
South Seattle Community College

-----Original Message-----
From: Marie Cora [mailto:mariecora@hotmail.com]
Sent: Thursday, April 08, 2004 7:39 AM
To: Multiple recipients of list
Subject: [NIFL-ASSESSMENT:481] Re: Training in good assessment practice


Hi Allan, thanks for your reply.

I had a couple questions for everyone regarding your post.

Do you think that a lack of good understanding of validity and reliability 
(and other technical test notions) can affect the more straightforward 
"train them to administer said test" situation?

Does understanding basic tenets of test ADMINISTRATION (not test 
development) help at all in selecting or developing classroom assessments?

If you have knowledge and training beyond 'how to administer said test', 
would that enable you to apply those test results in your classroom, as well 
as use the results for accountability (high stakes) purposes?

Allan - you said that Washington State adopted CASAS.  I have 2 questions, 
one is retrospective:  how did that process unfold?  who was involved in 
making the decision?  I ask because it is different for every state, but in 
some, there was/is a process involving practitioners and others from the 
field (not just policy makers, for example) in determining which are the 
most appropriate commercial tests to select.  For you Allan, this has 
passed, but for others, perhaps you can find out how this is working and 
join in that decision-making.

My second question about CASAS - although first I MUST acknowledge your note 
Allan, about lack of time(!!!) - is what should teachers know in order to 
develop classroom assessments that might enhance, fill holes, coach, 
broaden, etc. the results that they get from the CASAS assessments?

Ok, let's see what folks have to say.  And I also agree with Allan that one 
of the toughest things about all this is the actual implementation of the 
good ideas!  It's pretty easy to dream, but how to make stuff a reality?

marie cora
NIFL Assessment List Moderator


>From: "French, Allan" <afrench@sccd.ctc.edu>
>Reply-To: nifl-assessment@nifl.gov
>To: Multiple recipients of list <nifl-assessment@literacy.nifl.gov>
>Subject: [NIFL-ASSESSMENT:478] Re: Training in good assessment practice
>Date: Wed, 7 Apr 2004 17:05:16 -0400 (EDT)
>
>I believe that someone else has already cautioned us to stay aware of the 
>difference between training to administer a required standardized test and 
>learning how to make and do assessment in general.  On the one hand, it is 
>good for professionals like us to have as much knowledge of assessment as 
>possible even when we are merely administering a standardized test.  On the 
>other hand, such knowledge will, in most cases, not help us to choose which 
>test to use because such a decision has already been made at the state 
>level, and limited by the feds' list of approved tests.  Here in Washington 
>state, many squawk about the problems with the CASAS, but we are in no 
>position to change the federal mandate.
>
>As far as training for other types of assessment (appraisal, formative, 
>summative, etc.), as opposed to the high-stakes assessment for reporting 
>purposes that the feds want, my caution is to go slowly and to make it 
>practical for the real-world classroom.  I have attended many assessment 
>meetings over the last dozen years, of different types and in varying 
>formats, and have often come away with one of two internal reactions.
>
>First, it was informative but so abstract or so detailed that I would have 
>liked more time to practice the ideas so that I could better understand how 
>such things as validity, reliability, fairness, actually work.  Such 
>concepts need to be presented in a series of workshops with ample time for 
>discussion, practice and questions (sometimes we looked at a single writing 
>task to understand validity!?), and should be gone over gradually.  Too 
>often programs and state agencies want to do something on the quick or on 
>the cheap.  A great idea takes a long time to germinate in even the best 
>professional, and even more so with a large and varied group.
>
>Second, I would often leave thinking, "great idea," but I have no way of 
>knowing how to implement the process in my classrooms.  I have read 
>academics and listened to well-intentioned practitioners go into great 
>detail on a complicated design that would assure validity, relevance, 
>student participation, etc., yet not consider that many of my classes are 
>multi-level and with irregular attendance patterns, nor that my preparation 
>time is limited (imagine!).
>
>So, assessment is vitally important and the training for it should be 
>increased, but it must be designed to be as accessible and as practical as 
>possible.  We should not try to hit the bull's eye the first year, but take 
>little steps on a long path towards some educational goal.
>
>Allan French
>ESL Instructor
>Testing and Placement Coordinator
>General Studies Division
>South Seattle Community College
>e-mail:  afrench@sccd.ctc.edu
>
>-----Original Message-----
>From: Marie Cora [mailto:mariecora@hotmail.com]
>Sent: Wednesday, April 07, 2004 12:49 PM
>To: Multiple recipients of list
>Subject: [NIFL-ASSESSMENT:476] Re: Training in good assessment practice
>
>
>Hi everyone,
>
>I didn't want us to lose sight of Kerry's question, which I derailed some
>myself.
>
>Given some of the discussion about what good assessment training actually
>is, can anyone give us a sense of what they do in their location, and if
>they think it's successful or not?
>
>marie cora
>NIFL Assessment List Moderator
>
>




This archive was generated by hypermail 2b30 : Thu Dec 23 2004 - 09:46:14 EST