From djrosen at comcast.net Sun Jan 1 18:32:33 2006 From: djrosen at comcast.net (David Rosen) Date: Sun, 1 Jan 2006 18:32:33 -0500 Subject: [Assessment] Literacy List Updated Message-ID: <65423178-F949-450B-92AE-F5626C4E9B91@comcast.net> Colleagues, For several years, as a volunteer service, I have published the Literacy List, a large online collection of free Adult Basic Education and English language (ELL/ESL/ESOL) Web sites, electronic discussion lists ("listservs"), and other Internet resources for adult basic skills learners, teachers and tutors. I have just updated it, removing a few outdated links and adding new ones. Please have a look. If you know of a good free Web site resource which you think should be added, please let me know. The Literacy List gets better as a result of teachers sharing their favorite online resources. You will find the Literacy List now in two locations: http://alri.org/literacylist.html or http://newsomeassociates.com (Select Publications at the Bottom of the Page) All the best in 2006. David J. Rosen djrosen at comcast.net From PHCSJean.2163953 at bloglines.com Tue Jan 3 12:47:20 2006 From: PHCSJean.2163953 at bloglines.com (PHCSJean.2163953 at bloglines.com) Date: 3 Jan 2006 17:47:20 -0000 Subject: [Assessment] NAAL Question Message-ID: <1136310440.3140336464.21128.sendItem@bloglines.com> Hi all. I've been off-list for a couple of months and just getting caught up. I'm not sure if Mark is still answering NAAL questions. The preliminary report was quite limited by comparison to all that was released in the 1992 survey. I'm looking for more detail on the incarcerated literacy rate at the below basic level, which was documented extensively in the earlier reports. Is there any timeframe for that release, or anything that mentions those breakdowns? I'm doing my dissertation work on basic literacy levels and my subject population is half in prison and half out. I'd like to use the recent statistics rather than the 1992 info. Thanks! Jean Marrapodi Providence Assembly of God Learning Center Providence, RI From PHCSJean.2163953 at bloglines.com Tue Jan 3 12:50:10 2006 From: PHCSJean.2163953 at bloglines.com (PHCSJean.2163953 at bloglines.com) Date: 3 Jan 2006 17:50:10 -0000 Subject: [Assessment] Best in Class Assessments? Message-ID: <1136310610.4049970681.619.sendItem@bloglines.com> Hi all. I'm taking a reading assessment class for K-12 teachers and have been asked to bring a list of the adult reading assessments, formal and informal. What are folks using out there these days? I know the TABE is most common, but I'm sure there are some great standbys I've not seen out there that I'd love to know about. Thanks! Jean Marrapodi Providence Assembly of God Learning Center From marie.cora at hotspurpartners.com Tue Jan 3 13:46:32 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 3 Jan 2006 13:46:32 -0500 Subject: [Assessment] FW: [AAACE-NLA] NAAL/Proliteracy Webcast on January 6 Message-ID: <012401c61096$04745730$0402a8c0@frodo> Forwarded from the Library Literacy Electronic List: This is a reminder about an important webcast scheduled for Friday, January 6, from 10:30 - noon. The results of the 2003 National Assessment of Adult Literacy have finally been released -- but what do the figures really tell us and how do we tell our communities and the news media what the numbers mean in terms of adult learners and adult basic education?
Join ProLiteracy Worldwide's Marsha Tait, senior vice president of public affairs and Rochelle Cassella, director of marketing and corporate communications for a presentation and question and answer session on analyzing and interpreting the 2003 NAAL. This webcast is sponsored by California Library Literacy Services, in cooperation with ProLiteracy Worldwide. Friday, January 6, 10:30 am - Noon - FREE To participate live, go to http://rurallibraries.org/webcasts/01-06-06/ and take these three easy steps: 1. Test your computer (using the "Wizard" link at the bottom of the page) 2. Download and print Speaker Slides and Handouts 3. Register for webcast If you can't join us live, come view the archive afterwards at http://rurallibraries.org/webcasts/01-06-06/ The agenda will include: * An analysis of the NAAL report * Suggestions about how to talk to the media about it * Some ideas about next steps we can all take to ensure good public relations and public awareness about literacy issues About the Technology: A webcast is where the presenters are in a studio broadcasting live over the Internet, so you don't have to travel - you can participate at your desktop computer. Some libraries or literacy programs may want to project the webcast on a screen in a conference room, so that several participants can watch together. If you have any questions, please contact Dan Theobald at 415-431-0329 or dtheobald at i2icom.com Hi, Maryland requires the CASAS reading assessment . I have used the QRI-3 by Leslie and Caldwell with a few adults. It has been useful in identifying automatic word recognition, fluency, and comprehension needs - as well as approximate reading level. Some teachers have used the Be a Better Reader diagnostics from Globe Fearon to assist upper level readers. >>> PHCSJean.2163953 at bloglines.com 01/03/2006 12:50:10 PM >>> Hi all. I'm taking a reading assessment class for K-12 teachers and have been asked to bring a list of the adult reading assessments, formal and informal. What are folks using out there these days? I know the TABE is most common, but I'm sure there are some great standbys I've not seen out there I'd love to know about. Thanks! Jean Marrapodi Providence Assembly of God Learning Center ------------------------------- National Insitute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment Jennifer Reed, Instructional Specialist Carroll Adult Learning Connection 224 N. Center Street, Room 103 Westminster, MD 21157 410-751-3682 Ext. 226 FAX 410-751-3686 Peace on earth will come to stay, When we live Christmas every day. - Helen Steiner Rice From MKutner at air.org Tue Jan 3 14:51:04 2006 From: MKutner at air.org (Kutner, Mark) Date: Tue, 3 Jan 2006 14:51:04 -0500 Subject: [Assessment] NAAL Question Message-ID: As in 1992 there will be a report on the literacy of incarcerated adults; that report will include comparisons between 1992 and 2003 using the new literacy levels. The report is scheduled to be released in late Spring. Mark -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of PHCSJean.2163953 at bloglines.com Sent: Tuesday, January 03, 2006 12:47 PM To: assessment at nifl.gov Subject: [Assessment] NAAL Question Hi all. I've been off-list for a couple of months and jsut getting caught up. I'm not sure if Mark is still answering NAAL questions. 
The preliminary report was quite limited by comparison to all that was released in the 1992 survey. I'm looking for more detail on the incarcerated literacy rate at the below basic level, which was documented extensively in the earlier reports. Is there any timeframe for that release, or anything that mentions those breakdowns? I'm doing my dissertation work on basic literacy levels and my subject population is half in prison and half out. I'd like to use the recent statistics rather than the 1992 info. Thanks! Jean Marrapodi Providence Assembly of God Learning Center Providence, RI ------------------------------- National Insitute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Wed Jan 4 13:41:46 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 4 Jan 2006 13:41:46 -0500 Subject: [Assessment] NAAL/Proliteracy Webcast on January 6 Message-ID: <018c01c6115e$8497b290$0402a8c0@frodo> This is a reminder about an important webcast scheduled for Friday, January 6, from 10:30 - noon. The results of the 2003 National Assessment of Adult Literacy have finally been released -- but what do the figures really tell us and how do we tell our communities and the news media what the numbers mean in terms of adult learners and adult basic education? Join ProLiteracy Worldwide's Marsha Tait, senior vice president of public affairs and Rochelle Cassella, director of marketing and corporate communications for a presentation and question and answer session on analyzing and interpreting the 2003 NAAL. This webcast is sponsored by California Library Literacy Services, in cooperation with ProLiteracy Worldwide. Friday, January 6, 10:30 am - Noon - FREE To participate live, go to http://rurallibraries.org/webcasts/01-06-06/ and take these three easy steps: 1. Test your computer (using the "Wizard" link at the bottom of the page) 2. Download and print Speaker Slides and Handouts 3. Register for webcast If you can't join us live, come view the archive afterwards at http://rurallibraries.org/webcasts/01-06-06/ The agenda will include: * An analysis of the NAAL report * Suggestions about how to talk to the media about it * Some ideas about next steps we can all take to ensure good public relations and public awareness about literacy issues About the Technology: A webcast is where the presenters are in a studio broadcasting live over the Internet, so you don't have to travel - you can participate at your desktop computer. Some libraries or literacy programs may want to project the webcast on a screen in a conference room, so that several participants can watch together. If you have any questions, please contact Dan Theobald at 415-431-0329 or dtheobald at i2icom.com Hi everyone, Regarding the webcast tomorrow - I have just learned that it will be in PACIFIC TIME - so this will obviously affect your schedules. Sorry this info was not included in the original announcement. marie -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Wednesday, January 04, 2006 1:42 PM To: Assessment Discussion List Subject: [Assessment] NAAL/Proliteracy Webcast on January 6 This is a reminder about an important webcast scheduled for Friday, January 6, from 10:30 - noon. 
The results of the 2003 National Assessment of Adult Literacy have finally been released -- but what do the figures really tell us and how do we tell our communities and the news media what the numbers mean in terms of adult learners and adult basic education? Join ProLiteracy Worldwide's Marsha Tait, senior vice president of public affairs and Rochelle Cassella, director of marketing and corporate communications for a presentation and question and answer session on analyzing and interpreting the 2003 NAAL. This webcast is sponsored by California Library Literacy Services, in cooperation with ProLiteracy Worldwide. Friday, January 6, 10:30 am - Noon - FREE To participate live, go to http://rurallibraries.org/webcasts/01-06-06/ and take these three easy steps: 1. Test your computer (using the "Wizard" link at the bottom of the page) 2. Download and print Speaker Slides and Handouts 3. Register for webcast If you can't join us live, come view the archive afterwards at http://rurallibraries.org/webcasts/01-06-06/ The agenda will include: * An analysis of the NAAL report * Suggestions about how to talk to the media about it * Some ideas about next steps we can all take to ensure good public relations and public awareness about literacy issues About the Technology: A webcast is where the presenters are in a studio broadcasting live over the Internet, so you don't have to travel - you can participate at your desktop computer. Some libraries or literacy programs may want to project the webcast on a screen in a conference room, so that several participants can watch together. If you have any questions, please contact Dan Theobald at 415-431-0329 or dtheobald at i2icom.com Hello again everyone, I've received another update on this webcast, and it is open only to California libraries. However, it will be archived for public access. I apologize in advance for any inconveniences I've caused by sending out confusing emails. Either the ProLiteracy folks or I will let you know when the archive becomes available. Thanks for your patience, marie cora Moderator, The National Institute for Literacy Assessment Discussion List, and Coordinator/Developer LINCS Assessment Special Collection at http://literacy.kent.edu/Midwest/assessment/ marie.cora at hotspurpartners.com From marie.cora at hotspurpartners.com Thu Jan 5 12:33:08 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 5 Jan 2006 12:33:08 -0500 Subject: [Assessment] EFF Discussion Next Week! Message-ID: <000501c6121e$182a6bf0$0402a8c0@frodo> Announcement Please join the National Institute for Literacy Assessment Discussion List for a discussion on Teaching and Assessing with EFF Date: January 9 through 13, 2006 Guests: Aaron Kohring, Peggy McGuire, Regie Stites, and EFF Center Staff and Consultants. Aaron Kohring is a Research Associate at the Center for Literacy Studies, University of Tennessee, and is Coordinator of the Equipped for the Future Websites and Moderator of the NIFL Content Standards Discussion List. Peggy McGuire, M.A., is a Senior Research Associate and Equipped for the Future National Consultant at the Center for Literacy Studies, The University of Tennessee. Regie Stites, Ph.D., is Program Manager of the Literacy and Lifelong Learning, Center for Education Policy, SRI International in Menlo Park, CA. Dr. Stites assists EFF in planning assessment development and validation processes. Our discussion will focus on the EFF Assessment Framework, and how EFF assessments are developed for classroom use. 
Suggested preparations for this discussion: Go to the homepage of the EFF Collection at http://eff.cls.utk.edu/default.htm Ask yourself: what do I know about EFF? Click on Standards (left-hand toolbar) (http://eff.cls.utk.edu/assessment/standards.htm) Ask yourself: what are standards? what's the difference between EFF standards and competencies? Click on Guides (left-hand toolbar) http://eff.cls.utk.edu/assessment/guides.htm Ask yourself: what are EFF Performance Continua and how can I use this in my classroom? Click on Assessment Resource Collection (square red button) (http://eff.cls.utk.edu/assessment/default.htm) Ask yourself: what makes EFF assessments different from other types of assessment? Click on Assessment Tools (left-hand toolbar) (http://eff.cls.utk.edu/assessment/assessment_tools.htm) Ask yourself: is the Read With Understanding Assessment appropriate for my students' needs? Thought-provokers: 1. Pick any EFF standard, read its definition, and imagine what it would look like if you were actually assessing the application of the integrated skill process described in the standard's definition. 2. How often do you feel a need to look for evidence that learning has happened? How does the nature of the evidence you are looking for change as you look for learning within the space of one class session, one week, one month, one course, one year, and so on. Please join us! marie cora Moderator, NIFL Assessment Discussion List, and Coordinator/Developer LINCS Assessment Special Collection at http://literacy.kent.edu/Midwest/assessment marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060105/5ee51676/attachment.html From marie.cora at hotspurpartners.com Mon Jan 9 10:09:34 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 9 Jan 2006 10:09:34 -0500 Subject: [Assessment] EFF Discussion Begins Today! Message-ID: <009901c6152e$b4008650$0402a8c0@frodo> Good morning, afternoon, and evening to you all. I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our discussion. I've been thinking about this over the weekend, and I have a couple of questions to start us off: For our guests: -The EFF Standards are complex in terms of what they try to capture in a performance. Is this what makes them different from competencies? Or perhaps even different from other standards? For subscribers: I found the "thought-provokers" really helped me to focus on a piece of this big picture so I could get a handle on it. Did anyone try #1 below? Or perhaps if there are EFF users on the List, you might comment on this activity. As for #2 below - I found this question helpful because it did make me consider how often and in what ways I would look for achievement over time, and it also made me think that I would necessarily look for such incremental gains via classroom assessment rather than with a high stakes test. 1. Pick any EFF standard, read its definition, and imagine what it would look like if you were actually assessing the application of the integrated skill process described in the standard's definition. 2. How often do you feel a need to look for evidence that learning has happened? How does the nature of the evidence you are looking for change as you look for learning within the space of one class session, one week, one month, one course, one year, and so on. Anyway, that's what I was thinking about. How about you? Please post your questions and comments!
Thanks, marie Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060109/c2b5bd9c/attachment.html From samuel.mcgraw at seattlegoodwill.org Mon Jan 9 11:39:33 2006 From: samuel.mcgraw at seattlegoodwill.org (Samuel McGraw III) Date: Mon, 9 Jan 2006 08:39:33 -0800 Subject: [Assessment] EFF Discussion Begins Today! Message-ID: <802F2B4590320142A57872DC43A2BFD20218ADCC@seamail.seagoodwill.org> Marie et al., I have a simple (yet possibly complex to answer) question. Has anyone cross referenced EFF and CASAS standards? And if so, what was the outcome? Sam Seattle Goodwill Learning Center -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora Sent: Monday, January 09, 2006 7:10 AM To: Assessment Discussion List Subject: [Assessment] EFF Discussion Begins Today! Good morning, afternoon, and evening to you all. I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our discussion. I've been thinking about this over the weekend, and I have a couple of questions to start us off: For our guests: -The EFF Standards are complex in terms of what they try to capture in a performance. Is this what makes them different from competencies? Or perhaps even different from other standards? For subscribers: I found the "thought-provokers" really helped me to focus on a piece of this big picture so I could get a handle on it. Did anyone try #1 below? Or perhaps if there are EFF users on the List, you might comment on this activity. As for #2 below - I found this question helpful because it did make me consider how often and in what ways I would look for achievement over time, and it also made me think that I would necessarily look for such incremental gains via classroom assessment rather than with a high stakes test. 1. Pick any EFF standard, read its definition, and imagine what it would look like if you were actually assessing the application of the integrated skill process described in the standard's definition. 2. How often do you feel a need to look for evidence that learning has happened? How does the nature of the evidence you are looking for change as you look for learning within the space of one class session, one week, one month, one course, one year, and so on. Anyway, that's what I was thinking about. How about you? Please post your questions and comments! Thanks, marie Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060109/5d48bbc6/attachment.html From marie.cora at hotspurpartners.com Mon Jan 9 13:27:08 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 9 Jan 2006 13:27:08 -0500 Subject: [Assessment] FW: NAAL Web cast archive Message-ID: <00e901c6154a$4e3cf9e0$0402a8c0@frodo> Hi everyone, The NAAL Web cast that was held with California libraries is now archived and can be accessed at http://rurallibraries.org/webcasts/01-06-06/ Thanks, marie From alantoops at cs.com Mon Jan 9 17:05:39 2006 From: alantoops at cs.com (alan toops) Date: Mon, 9 Jan 2006 17:05:39 -0500 Subject: [Assessment] EFF Discussion Begins Today! In-Reply-To: <802F2B4590320142A57872DC43A2BFD20218ADCC@seamail.seagoodwill.org> References: <802F2B4590320142A57872DC43A2BFD20218ADCC@seamail.seagoodwill.org> Message-ID: Sam, CASAS and EFF have held joint discussions in the past regarding a cross walk.
In my last job before retirement, we did cross walk the EFF roles to our curriculum which is cross walked to CASAS life and employability skills, the fit was generally very good for our purposes. It was not scientific but we found that outside of some overlap (there are a lot of CASAS competencies) the two fit reasonable well. Alan Toops On Jan 9, 2006, at 11:39 AM, Samuel McGraw III wrote: > Marie et. al., > > I have a simple (yet possible complex answer) question. > > Has anyone cross referenced EFF and CASAS standards? And if so. > What the outcome. > > Sam > Seattle Goodwill Learning Center > -----Original Message----- > From: assessment-bounces at nifl.gov [mailto:assessment- > bounces at nifl.gov]On Behalf Of Marie Cora > Sent: Monday, January 09, 2006 7:10 AM > To: Assessment Discussion List > Subject: [Assessment] EFF Discussion Begins Today! > > Good morning, afternoon, and evening to you all. > > > > I?m pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to > our discussion. I?ve been thinking about this over the weekend, > and I have a couple of questions to start us off: > > > > For our guests: > > > > -The EFF Standards are complex in terms of what they try to capture > in a performance. Is this was makes them different from > competencies? Or perhaps even different from other standards? > > > > For subscribers: I found the ?thought-provokers? really helped me > to focus on a piece of this big picture so I could get a handle on > it. Did anyone try #1 below? Or perhaps if there are EFF users on > the List, you might comment on this activity. As for #2 below ? I > found this question helpful because it did make me consider how > often and in what ways I would look for achievement over time, and > it also made me think that I would necessarily look for such > incremental gains via classroom assessment rather than with a high > stakes test. > > > > 1. Pick any EFF standard, read its definition, and imagine what it > would look like if you were actually assessing the application of > the integrated skill process described in the standard?s definition. > > > > 2. How often do you feel a need to look for evidence that learning > has happened? How does the nature of the evidence you are looking > for change as you look for learning within the space of one class > session, one week, one month, one course, one year, and so on. > > > > Anyway, that?s what I was thinking about. How about you? Please > post your questions and comments! > > Thanks, > > marie > > Assessment Discussion List Moderator > > > > > ------------------------------- > National Insitute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060109/dcb962a6/attachment.html From regie.stites at sri.com Mon Jan 9 18:50:15 2006 From: regie.stites at sri.com (Regie Stites) Date: Mon, 09 Jan 2006 15:50:15 -0800 Subject: [Assessment] EFF Discussion Begins Today! In-Reply-To: <009901c6152e$b4008650$0402a8c0@frodo> References: <009901c6152e$b4008650$0402a8c0@frodo> Message-ID: <43C2F6B7.6080208@sri.com> Marie and all, Thanks for invitation to participate in this discussion. I have some initial thoughts in response to your question about the complexity of EFF in comparison to competencies. 
I want to ponder a bit more before responding to the second part of your question about how EFF is different from other standards. (Thanks to my EFF colleagues Aaron Kohring and Peggy McGuire for comments and suggestions on an earlier draft of this. I'm sure they will have more of their own thoughts to add as the discussion continues). The EFF Standards are grounded in different conceptualizations of adult performance and adult learning than competency-based education (CBE). EFF is based on an understanding of expertise (high-level human performance) that comes out of cognitive science research and theory developed in the late 1970s and elaborated in the 1980s and 1990s. CBE is based on a somewhat different (and earlier) model of human performance that stems from cognitive psychology and industrial/organizational psychology research and theory from the 1960s. The CBE model is fairly simple. It assumes that human performance can be understood as the ability to accomplish tasks. It is basically focused on the question "What should people be able to do?" Researchers studied human performance in various contexts and analyzed the tasks that people performed in those contexts. Through large-scale surveys (such as the Adult Performance Level study - APL) tasks were identified and through task analysis tasks were placed in a hierarchy from simple to complex. This is the basis for the scaled lists of CASAS competencies that are the foundation for CASAS tests. Items on CASAS tests are designed to simulate, as closely as possible, the tasks that people perform in work and life. Through careful design of test items and analysis of test results (using Item Response Theory - IRT), CASAS has been able to provide a clear picture of the relative difficulty of each item (test question) used in the CASAS tests. EFF's model of human performance goes several steps beyond this analysis of the relative difficulty of tasks. EFF focuses on the question "What should people know and be able to do?" To address this question EFF researchers developed descriptions of the underlying knowledge, skills, and strategies, as well as levels of fluency (ease) and independence that adults use as they apply each EFF Standard (each Standard defined as a purposeful application of an integrated skill process) in performing increasingly more challenging tasks. Looking at more of the cognitive complexity involved in using skills like Reading With Understanding and Conveying Ideas in Writing is what makes the EFF model appear more complicated than the CBE and CASAS competencies model. This complexity has the advantage of providing more detailed guidance for learning, instruction, and assessment. In a competency-based approach, the question of how someone is able to accomplish a task is left open. The manner in which knowledge, skills, and abilities are applied to accomplishing a task is not addressed directly. By contrast, cognitive science approaches (such as that guiding EFF) let us lift the lid of the black box of human performance to better understand (and teach and assess) the knowledge, skills, and strategies that adult learners need to be successful in performing a wide range of tasks in a wide range of contexts. Regie Stites SRI International Marie Cora wrote: >Good morning, afternoon, and evening to you all. > >I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our >discussion.
I've been thinking about this over the weekend, and I have >a couple of questions to start us off: > >For our guests: > >-The EFF Standards are complex in terms of what they try to capture in a >performance. Is this what makes them different from competencies? Or >perhaps even different from other standards? > >For subscribers: I found the "thought-provokers" really helped me to >focus on a piece of this big picture so I could get a handle on it. Did >anyone try #1 below? Or perhaps if there are EFF users on the List, you >might comment on this activity. As for #2 below - I found this question >helpful because it did make me consider how often and in what ways I >would look for achievement over time, and it also made me think that I >would necessarily look for such incremental gains via classroom >assessment rather than with a high stakes test. > >1. Pick any EFF standard, read its definition, and imagine what it >would look like if you were actually assessing the application of the >integrated skill process described in the standard's definition. > >2. How often do you feel a need to look for evidence that learning has >happened? How does the nature of the evidence you are looking for >change as you look for learning within the space of one class session, >one week, one month, one course, one year, and so on. > >Anyway, that's what I was thinking about. How about you? Please post >your questions and comments! >Thanks, >marie >Assessment Discussion List Moderator > > > > >------------------------------------------------------------------------ > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060109/39f56726/attachment.html From regie.stites at sri.com Mon Jan 9 19:12:45 2006 From: regie.stites at sri.com (Regie Stites) Date: Mon, 09 Jan 2006 16:12:45 -0800 Subject: [Assessment] EFF Discussion Begins Today! In-Reply-To: <802F2B4590320142A57872DC43A2BFD20218ADCC@seamail.seagoodwill.org> References: <802F2B4590320142A57872DC43A2BFD20218ADCC@seamail.seagoodwill.org> Message-ID: <43C2FBFD.3010308@sri.com> Hi Sam, You are right. This is a simple question with a complicated answer. Crosswalks between CASAS competencies and EFF Standards and EFF Performance Levels have been done in various ways for various purposes. I don't know if anyone has yet done a crosswalk between CASAS content standards and EFF, but that certainly is possible. But rather than provide you with links to these crosswalks, let me first insert a note of caution. Comparing CASAS and EFF is like comparing apples and oranges. It's okay if the difference between apples and oranges doesn't matter to you. For example, we crosswalked EFF performance levels to NRS Educational Functioning Levels which, in turn, are referenced to CASAS scale scores. So for some EFF standards you could make a connection between EFF levels and CASAS scores that way. But this really is an apples and oranges comparison, because CASAS "reading" is not the same thing as EFF "reading" (see my post about how EFF standards are different from CASAS competencies). As long as you are happy defining "reading" or "math" etc.
at the "fruit" level (where the differences between apples, oranges, and grapes don't matter much - that's what NRS does) you can cross-reference CASAS and EFF. But in most cases, the difference between apples and oranges matters (substituting 6 oranges for 6 apples in an apple pie recipe is not a good idea). So, my question is how exactly do you plan to make use of a cross-reference between CASAS and EFF? Regie Stites SRI International Samuel McGraw III wrote: >Marie et. al., > >I have a simple (yet possible complex answer) question. > >Has anyone cross referenced EFF and CASAS standards? And if so. What the outcome. > >Sam >Seattle Goodwill Learning Center >-----Original Message----- >From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora >Sent: Monday, January 09, 2006 7:10 AM >To: Assessment Discussion List >Subject: [Assessment] EFF Discussion Begins Today! > > >Good morning, afternoon, and evening to you all. > >I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our discussion. I've been thinking about this over the weekend, and I have a couple of questions to start us off: > >For our guests: > >-The EFF Standards are complex in terms of what they try to capture in a performance. Is this was makes them different from competencies? Or perhaps even different from other standards? > >For subscribers: I found the "thought-provokers" really helped me to focus on a piece of this big picture so I could get a handle on it. Did anyone try #1 below? Or perhaps if there are EFF users on the List, you might comment on this activity. As for #2 below - I found this question helpful because it did make me consider how often and in what ways I would look for achievement over time, and it also made me think that I would necessarily look for such incremental gains via classroom assessment rather than with a high stakes test. > >1. Pick any EFF standard, read its definition, and imagine what it would look like if you were actually assessing the application of the integrated skill process described in the standard's definition. > >2. How often do you feel a need to look for evidence that learning has happened? How does the nature of the evidence you are looking for change as you look for learning within the space of one class session, one week, one month, one course, one year, and so on. > >Anyway, that's what I was thinking about. How about you? Please post your questions and comments! >Thanks, >marie >Assessment Discussion List Moderator > > > > >------------------------------------------------------------------------ > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060109/06814bed/attachment.html From marie.cora at hotspurpartners.com Tue Jan 10 15:01:01 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 10 Jan 2006 15:01:01 -0500 Subject: [Assessment] EFF in the classroom Message-ID: <019401c61620$9528a030$0402a8c0@frodo> Hi everyone, I'm wondering if one of our guests (or anyone who works with EFF!) can describe how to use EFF to develop an assessment for the classroom. 
Maybe this is a lot to ask, but if you can walk us through an example of assessing Read With Understanding or Use Math to Solve Problems for example, that would be great (but an example using any standard will do!). Also, how do you assess some of the Lifelong Learning and Interpersonal Skills? Some of the Broad Areas of Responsibility and Key Activities in the Role Maps (http://eff.cls.utk.edu/fundamentals/eff_roles.htm) are pretty abstract and the concepts are large - can someone comment on this? Thanks, marie Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060110/c619b0b1/attachment.html From akohring at utk.edu Tue Jan 10 15:11:36 2006 From: akohring at utk.edu (Aaron Kohring) Date: Tue, 10 Jan 2006 15:11:36 -0500 Subject: [Assessment] EFF in the classroom In-Reply-To: <019401c61620$9528a030$0402a8c0@frodo> Message-ID: <5.1.0.14.2.20060110150458.03c7f308@pop.utk.edu> Marie, I'll tackle your first question with an example from the EFF Toolkit: http://eff.cls.utk.edu/toolkit/example_math_risk_ratios.htm You can read through this example and see how the teacher came up with an assessment checklist with students for a particular lesson. Can you see how the assessment connects back to the definition for the Math Standard? The example also shows in a table at the bottom how the integrated skill process for the Standard is addressed within the lesson- what it looks like for that lesson. Aaron At 03:01 PM 1/10/2006 -0500, you wrote: >Hi everyone, > > > >I m wondering if one of our guests (or anyone who works with EFF!) can >describe how to use EFF to develop an assessment for the classroom. Maybe >this is a lot to ask, but if you can walk us through an example of >assessing Read With Understanding or Use Math to Solve Problems for >example, that would be great (but an example using any standard will do!). > > > >Also, how do you assess some of the Lifelong Learning and Interpersonal >Skills? Some of the Broad Areas of Responsibility and Key Activities in >the Role Maps (http://eff.cls.utk.edu/fundamentals/eff_roles.htm) are >pretty abstract and the concepts are large can someone comment on this? > > > >Thanks, > >marie > >Assessment Discussion List Moderator > > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to >http://www.nifl.gov/mailman/listinfo/assessment Aaron Kohring Coordinator, LINCS Literacy & Learning Disabilities Special Collection (http://ldlink.coe.utk.edu/) Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) Center for Literacy Studies, University of Tennessee EFF Center for Training and Technical Assistance Phone:(865) 974-4109 main (865) 974-4258 direct Fax: (865) 974-3857 e-mail: akohring at utk.edu From akohring at utk.edu Tue Jan 10 16:13:06 2006 From: akohring at utk.edu (Aaron Kohring) Date: Tue, 10 Jan 2006 16:13:06 -0500 Subject: [Assessment] EFF in the classroom & Role Maps In-Reply-To: <019401c61620$9528a030$0402a8c0@frodo> Message-ID: <5.1.0.14.2.20060110152509.03c21928@pop.utk.edu> Marie, Another of your questions refers to the Role Maps. For me, I'd try to keep it simple by narrowing the focus. 
For example, say you have a classroom where the students' primary focus is to find a job. As a class, we have been looking at the Worker Role Map (http://eff.cls.utk.edu/fundamentals/role_map_worker.htm). We've already talked about the knowledge and skills they have and what they need to work on. Maybe we've decided that one important area to address is the Broad Area of Responsibility: Work With Others and a Key Activity: Communicate with Others Inside and Outside the Organization. Students have identified Communication Skills as an area to work on. For this lesson, we choose the Listen Actively standard as our focus. These students have done role plays. They brainstorm some workplace scenarios. You choose a scenario as a class. Then looking at the Listen Actively standard and the scenario, we want to develop a simple rubric check-off to assess our role play. How well did we Listen Actively during the role play: Well Done, Sometimes, Need to Work on. What evidence will we look for: -You focused your attention on the person speaking - You asked questions when you didn't understand - You did not interrupt the speaker - You looked directly at the person speaking - Your facial expression and posture showed you were paying attention - You thought about what the speaker said and compared it to what you already know Then ask, does our rubric adequately address the standard we are focusing on for this role play. The students could break into groups to write dialog for the role play. Each group could act out their role play while the other groups assess their performance. Often there is great dialogue that comes out of this interaction, too. Aaron At 03:01 PM 1/10/2006 -0500, you wrote: >Hi everyone, > > > >I m wondering if one of our guests (or anyone who works with EFF!) can >describe how to use EFF to develop an assessment for the classroom. Maybe >this is a lot to ask, but if you can walk us through an example of >assessing Read With Understanding or Use Math to Solve Problems for >example, that would be great (but an example using any standard will do!). > > > >Also, how do you assess some of the Lifelong Learning and Interpersonal >Skills? Some of the Broad Areas of Responsibility and Key Activities in >the Role Maps (http://eff.cls.utk.edu/fundamentals/eff_roles.htm) are >pretty abstract and the concepts are large can someone comment on this? > > > >Thanks, > >marie > >Assessment Discussion List Moderator > > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to >http://www.nifl.gov/mailman/listinfo/assessment Aaron Kohring Coordinator, LINCS Literacy & Learning Disabilities Special Collection (http://ldlink.coe.utk.edu/) Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) Center for Literacy Studies, University of Tennessee EFF Center for Training and Technical Assistance Phone:(865) 974-4109 main (865) 974-4258 direct Fax: (865) 974-3857 e-mail: akohring at utk.edu From ltaylor at casas.org Tue Jan 10 17:47:20 2006 From: ltaylor at casas.org (Linda Taylor) Date: Tue, 10 Jan 2006 17:47:20 -0500 Subject: [Assessment] EFF Discussion Begins Today! In-Reply-To: <43C2F6B7.6080208@sri.com> Message-ID: <0ISW00EL5F751NC8@vms044.mailsrvcs.net> I and my colleagues at CASAS find this discussion quite interesting. 
We want to take the opportunity to respond to clarify and comment on some of the statements that have been made. CASAS began in the 1980s as a competency-based assessment system assessing reading comprehension, listening comprehension and applied mathematics in functional adult contexts related to the family, the community and the workplace. Over a 25-year period, the CASAS system has evolved to include and reflect recent cognitive science research and theory. This research is reflected in the development of CASAS performance assessments in writing, speaking and developmental skills which use rubrics to evaluate both the underlying knowledge, skills and abilities and competencies. It is also reflected in an additional component in the CASAS system, the CASAS Content Standards. CASAS assessments, like EFF, measure both what someone knows and is able to do. The underlying knowledge, skills and strategies are embedded in each performance task and test item, and are spelled out in the CASAS Content Standards. These new Content Standards directly address the manner in which knowledge, skills and abilities are applied to accomplishing a task. They provide a framework to understand the cognitive complexity within each performance task and test item, and they allow teachers and students to gain a fuller understanding of the underlying basic skills. We would further suggest that, like EFF, CASAS assessments provide a "purposeful application of an integrated skill process in performing increasingly more challenging tasks" through the use of competencies and content standards, as well as rubrics for performance assessments. The competencies and content standards can be assessed over a broad spectrum of instructional levels so programs can teach and measure progress from beginning literacy through high school completion. In fact, the same competency can be targeted to one or more instructional levels. In addition, the range of contexts for CASAS standardized assessments is very wide, with separate test series focusing on life skills, employability, and workplace settings. The new CASAS Life and Work Reading series was developed based on both competencies and content standards, and each item is coded in both ways. CASAS Content Standards are currently available for Reading and Listening, and will soon be available in all skill areas. We would also like to refer readers to the CA Dept. of Education sponsored EL Civics website developed by CASAS, a rich resource that exemplifies an approach to integrating underlying language and literacy objectives with competency and performance objectives in the area of EL Civics instruction. This El Civics website also includes performance assessment plans for classroom-based assessment, and information about it can be found in an article on the CASAS website at http://www.casas.org/Online_Quarterlies/Index_fall04.cfm?selected_id=1400 &wtarget=body#2. Linda Taylor, Director of Assessment Development, CASAS _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Regie Stites Sent: Monday, January 09, 2006 6:50 PM To: The Assessment Discussion List Subject: Re: [Assessment] EFF Discussion Begins Today! Marie and all, Thanks for invitation to participate in this discussion. I have some initial thoughts in response to your question about the complexity of EFF in comparison to competencies. I want to ponder a bit more before responding to the second part your question about how EFF is different from other standards? 
(Thanks to my EFF colleagues Aaron Kohring and Peggy McGuire for comments and suggestions on an earlier draft of this. I'm sure they will have more of their own thoughts to add as the discussion continues). The EFF Standards are grounded in different conceptualizations of adult performance and adult learning than competency-based education (CBE). EFF is based on an understanding of expertise (high-level human performance) that comes out of cognitive science research and theory developed in the late 1970s and elaborated in the 1980s and 1990s. CBE is based on a somewhat different (and earlier) model of human performance that stems from cognitive psychology and industrial/organizational psychology research and theory from the 1960s. The CBE model is fairly simple. It assumes that human performance can be understood as the ability to accomplish tasks. It is basically focused on the question "What should people be able to do?" Researchers studied human performance in various contexts and analyzed the tasks that people performed in those contexts. Through large-scale surveys (such as the Adult Performance Level study - APL) tasks were identified and through task analysis tasks were placed in a hierarchy from simple to complex. This is the basis for the scaled lists of CASAS competencies that are the foundation for CASAS tests. Items on CASAS tests are designed to simulate as closely as possible, the tasks that people perform in work and life. Through careful design of test items and analysis of test results (using Item Response Theory - IRT), CASAS has been able to provide a clear picture of the relative difficulty of each item (test question) used in the CASAS tests. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060110/5d0e2e56/attachment.html From marie.cora at hotspurpartners.com Tue Jan 10 19:24:12 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 10 Jan 2006 19:24:12 -0500 Subject: [Assessment] EFF in the classroom Message-ID: <01e301c61645$59210ad0$0402a8c0@frodo> The following is posted for Peggy McGuire. Hey Marie and all! There is an entire chapter of the publication "Improving Performance, Reporting Results" devoted to using the EFF Read wth Understanding standard and performance continuum to develop instructional (classroom) assessments. You can find the PDF version of this publication in the Assessment Resource Collection by clicking on "assessment tools" and then clicking on the title. Many thanks to Aaron for pointing out the resource from the Teaching/Learning Toolkit -- there are other examples there as well that are worth exploring! Peggy McGuire, M.A. Senior Research Associate and Equipped for the Future National Consultant Center for Literacy Studies The University of Tennessee 111 5th Street, PO Box 16 Mt. Gretna, PA 17064 717-964-1341 (p/f) 215-888-6507 (cell) mcguirep555 at aol.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060110/91a6e377/attachment.html From lmullins89 at yahoo.com Tue Jan 10 21:11:45 2006 From: lmullins89 at yahoo.com (Lisa Mullins) Date: Tue, 10 Jan 2006 18:11:45 -0800 (PST) Subject: [Assessment] EFF in the classroom In-Reply-To: <019401c61620$9528a030$0402a8c0@frodo> Message-ID: <20060111021145.41740.qmail@web30207.mail.mud.yahoo.com> Hello everyone, I have been using EFF for a few years. I am an ABE/GED/ESOL teacher in Rogersville, Tennesee. 
My favorite part of EFF is the Teaching/Learning cycle steps. In using the T/L cycle the instructor and learners must think about what they will produce or do in order to meet the components of the Standard. Everyone involved must think about what they are trying to do and how it will look as a finished product. I often do several types of informal assessment with each EFF lesson. Those informal assessments include pre-lesson surveys, checklists, rubrics, ranking scales, and post-lesson surveys. In addition, I connect the task at hand to the more formal assessment instruments such as the TABE or GED. Sometimes this is accomplished by having the students create their own questions in GED format. At other times, on in addition, I connect to a goal the students have in common or to a real-life situation. A lesson I have tried several times is one on graphs using the Use Math to Solve Problems and Communicate Standard. First, I ask the students to try to answer some questions. What are graphs? What is data? What are the types of graphs? Why would you use a graph? What are the parts of graphs? The answers provide me with information about prior knowledge. Next, we examine several graphs. We discuss the data source and use of each graph. Then, we make a list of the parts all graphs must have to be effective. This list becomes our checklist. Now, the students decide what they would like to graph and why. They each decide which type of graph would work best for their purpose. We take a look at the components of the Use Math Standard to plan how the learners will accomplish each one. At this point, we make a description of a presentation the learners must give in order to communicate about their graph. This is our rubric. Finally, the students create a graph of their own. When the graphs are complete, the student must give a brief presentation to the class explaining the graph. Each of us uses the checklist to determine if the graphs have the parts. Also, we use the rubric to determine the quality of the presentation. Then, we take a look at the plan we made for the Standard components and determine if we accomplished our goals and to what degree. Now, I connect the lesson on graphs to the GED. We discuss where graphs will be found on the test. We talk about the types of questions that will be asked. Then, I have the students create their own GED type question from the information found in their graphs. In the last steps, I give the pre-lesson survey again. I also ask the learners to describe what they learned and how it can be used in daily life. I like to use the EFF Standards Use Math, Read with Understanding, and Convey ideas in Writing since these are things that all adults need to be able to do. This is just one example of this lesson. I've used it a dozen times and each time it turns out different. Thank you, Lisa Mullins --- Marie Cora wrote: > Hi everyone, > > I'm wondering if one of our guests (or anyone who > works with EFF!) can > describe how to use EFF to develop an assessment for > the classroom. > Maybe this is a lot to ask, but if you can walk us > through an example of > assessing Read With Understanding or Use Math to > Solve Problems for > example, that would be great (but an example using > any standard will > do!). > > Also, how do you assess some of the Lifelong > Learning and Interpersonal > Skills? 
Some of the Broad Areas of Responsibility > and Key Activities in > the Role Maps > (http://eff.cls.utk.edu/fundamentals/eff_roles.htm) > are > pretty abstract and the concepts are large - can > someone comment on > this? > > Thanks, > marie > Assessment Discussion List Moderator > > > > ------------------------------- > National Insitute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, > please go to > http://www.nifl.gov/mailman/listinfo/assessment > __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com
From marilyn.gillespie at sri.com Wed Jan 11 10:47:11 2006 From: marilyn.gillespie at sri.com (Marilyn Gillespie) Date: Wed, 11 Jan 2006 10:47:11 -0500 Subject: [Assessment] EFF Discussion Begins Today! In-Reply-To: <009901c6152e$b4008650$0402a8c0@frodo> References: <009901c6152e$b4008650$0402a8c0@frodo> Message-ID: <43C5287F.2090403@sri.com> Marie, I wanted to let list members know about the latest issue of Education Week. As some of you who have contact with K-12 education may know, Education Week is a weekly newspaper that reports federal and state news. This week the entire issue includes a review of the success of the standards-based education effort of the past decade. They call the results at once "heartening and sobering". They're heartening in that student achievement in some areas, particularly in math, has improved. In addition, there have been real gains for black, Hispanic and low income students, especially in math. However, after all the "effort" that has been put into reading, overall reading scores have barely budged from 1992 to 2005 (although the scores of black, Hispanic and low income children increased at nearly triple the national average). There is also a discussion of high school drop out and graduation rates (still as low as ever for those same groups.) There are lots of reflective articles on the benefits, trade-offs and negative aspects of standards-based education that we, as adult educators, could learn from. Several states are profiled in depth, including Delaware, New York, Texas, Massachusetts, Iowa and Nevada. The articles contain a lot of food for thought and would be a great resource for discussion groups or study circles. Marilyn Gillespie SRI International Marie Cora wrote: > Good morning, afternoon, and evening to you all. > > > > I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to > our discussion.
I've been thinking about this over the weekend, and I > have a couple of questions to start us off: > > > > For our guests: > > > > -The EFF Standards are complex in terms of what they try to capture in > a performance. Is this was makes them different from competencies? > Or perhaps even different from other standards? > > > > For subscribers: I found the "thought-provokers" really helped me to > focus on a piece of this big picture so I could get a handle on it. > Did anyone try #1 below? Or perhaps if there are EFF users on the > List, you might comment on this activity. As for #2 below - I found > this question helpful because it did make me consider how often and in > what ways I would look for achievement over time, and it also made me > think that I would necessarily look for such incremental gains via > classroom assessment rather than with a high stakes test. > > > > 1. Pick any EFF standard, read its definition, and imagine what it > would look like if you were actually assessing the application of the > integrated skill process described in the standard's definition. > > > > 2. How often do you feel a need to look for evidence that learning > has happened? How does the nature of the evidence you are looking for > change as you look for learning within the space of one class session, > one week, one month, one course, one year, and so on. > > > > Anyway, that's what I was thinking about. How about you? Please post > your questions and comments! > > Thanks, > > marie > > Assessment Discussion List Moderator > > > >------------------------------------------------------------------------ > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > -- Marilyn Gillespie, Ed.D. Educational Researcher SRI International 1100 Wilson Blvd., Suite 2800 Arlington, VA 22209-2268 Phone: (703) 247-8510 Fax: (703) 247-8493 marilyn.gillespie at sri.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060111/1ebbe4d4/attachment.html From MWPotts2001 at aol.com Wed Jan 11 13:37:32 2006 From: MWPotts2001 at aol.com (MWPotts2001 at aol.com) Date: Wed, 11 Jan 2006 13:37:32 EST Subject: [Assessment] EFF and CASAS Assessment Message-ID: <1ff.104fa25f.30f6aa6c@aol.com> Regie, Thank you for your explanations re: the differences between EFF and CASAS and the cautions about mixing apples and oranges. I have carried around your former posts on this subject because I work with groups who are mandated to use both EFF and CASAS, and it is not easy trying to help them with assessment problems. It has become more than a problem; it is now an issue. In light of Linda's posting, I am wondering how the two of you might help us find ways to work more satisfactorily so that we are not at odds, rather more in sync as we struggle to include the best of both? Is it possible, after all, that the fruit salad might be a variety of apples, say Macintosh and Jonathan, rather than apples and oranges? I confess ignorance and plead for enlightenment. 
In the absence of a twofold assessment design that gratifies the needs of the programs and states that are mandated to use both EFF and CASAS, I have been trying to help people use portfolio assessment, in which they demonstrate mastery of the EFF reading purposes (or Math purposes) and the CASAS competencies, which seem (and I emphasize seem) to be located at the same level of expertise. This has worked for us, and students have accomplished their goals of moving (slowly) along the EFF continuum. We are at the novice stage, however, with this kind of assessment. Teachers still struggle with questions, such as *How many entries are good enough to demonstrate mastery?* And *What can we include that will demonstrate the range of fluency, independence, and the ability to perform in a variety of settings?* And more. So help us, please. Thanks and All the Best, Meta Potts FOCUS on Literacy Glen Allen, VA -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060111/2bfbf88f/attachment.html From hdooley at riral.org Wed Jan 11 14:24:33 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Wed, 11 Jan 2006 14:24:33 -0500 Subject: [Assessment] EFF and CASAS Assessment In-Reply-To: <1ff.104fa25f.30f6aa6c@aol.com> References: <1ff.104fa25f.30f6aa6c@aol.com> Message-ID: <43C55B71.5080308@riral.org> Meta -- Thank you for your post. I am always heartened when I hear and see that others are fighting the same good fight, and have come to similar conclusions as our own. At my program we have been investigating portfolios (we call them folders) as a way of doing (or organizing, or collecting) non-NRS assessments, and one that helps instructors and learners align curriculum with instruction with assessment. "Alignment" to me means making sure that assessment, instruction and curriculum are all relevant and meaningful to all participants. I think the salad analogy holds for any program or class where the standards used don't come with ready-made assessments. States that have developed content standards have programs using the TABE, the CASAS, the BEST, even the GED or EDP, and, voila!, alignment issues arise. I truly believe this issue, that of alignment, is the more important issue (more important than which set of standards or which assessments we decide / are mandated to use), and the issue that most of us will confront in our programs, and continually confront since staff comes and goes so regularly. As I read your last paragraph, I appreciate your struggle. We struggle, too. It is important for me to remember that the portfolios are not for the funder (as often the CASAS or TABE or BEST are); they are for the learner, and the questions that arise with them need answers that are learner-determined. The learners should decide how many is enough, how good is good enough, what demonstrations are appropriate. I don't believe these decisions need to hold across classes, or sites, or programs. Consistency of form and process is good, of course; it will help your administrators to understand what is going on, and help your teachers to share thoughts and ideas. Over the semesters, your teachers may find that the rubrics and checklists used each time are pretty darn similar -- or not. And of course, if you find that your portfolio assessment indicates better reading but your CASAS score doesn't, then you'll need some reflection and re-alignment. 
But, by and large, I think you'll find you won't, because by and large you have sincere and experienced instructors, and you can trust their science and their art. I hope to hear more about how the struggles go, and what good ideas we can share with you as we move ahead. Howard D. Project RIRAL MWPotts2001 at aol.com wrote: > Regie, > > Thank you for your explanations re: the differences between EFF and > CASAS and the cautions about mixing apples and oranges. I have > carried around your former posts on this subject because I work with > groups who are mandated to use both EFF and CASAS, and it is not easy > trying to help them with assessment problems. It has become more than > a problem; it is now an issue. > > In light of Linda's posting, I am wondering how the two of you might > help us find ways to work more satisfactorily so that we are not at > odds, rather more in sync as we struggle to include the best of both? > Is it possible, after all, that the fruit salad might be a variety of > apples, say Macintosh and Jonathan, rather than apples and oranges? I > confess ignorance and plead for enlightenment. > > In the absence of a twofold assessment design that gratifies the needs > of the programs and states who are mandated to use both EFF and CASAS, > I have been trying to help people use portfolio assessment, in which > they demonstrate mastery of the EFF reading purposes (or Math > purposes) and the CASAS competencies, which seem (and I emphasize > seem) to be located at the same level of expertise. This has worked > for us, and students have accomplished their goals of moving (slowly) > along the EFF continuum. We are at the novice stage, however, with > this kind of assessment. > > Teachers still struggle with questions, such as *How many entries are > good enough to demonstrate mastery?* And *What can we include that > will demonstrate the range of fluency, independence, and the ability > to perform in a variety of settings?* And more. > > So help us, please. > > Thanks and All the Best, > > Meta Potts > FOCUS on Literacy > Glen Allen, VA > > > > > >------------------------------------------------------------------------ > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > From akohring at utk.edu Wed Jan 11 15:54:11 2006 From: akohring at utk.edu (Aaron Kohring) Date: Wed, 11 Jan 2006 15:54:11 -0500 Subject: [Assessment] Education Week: Quality Counts report In-Reply-To: <43C5287F.2090403@sri.com> References: <009901c6152e$b4008650$0402a8c0@frodo> <009901c6152e$b4008650$0402a8c0@frodo> Message-ID: <5.1.0.14.2.20060111151028.02f6fc80@pop.utk.edu> Hi all, The link to the report that Marilyn mentioned below is: http://www.edweek.org/qc06 Aaron At 10:47 AM 1/11/2006 -0500, you wrote: >Marie, > >I wanted to let list members know about the latest issue of Education >Week. As some of you who have contact with K-12 education may know, >Education Week is a weekly newspaper that reports federal and state news. >This week the entire issue includes a review of the success of the >standards-based education effort of the past decade. They call the results >at once "heartening and sobering". They're heartening in that student >achievement in some areas, particularly in math have improved. In >addition, there have been real gains for black, Hispanic and low income >students, especially in math. 
However, after all the "effort" that has >been put into reading, overall reading scores have barely budged form 1992 >to 2005 (although the scores of black, Hispanic and low income children >increased at nearly triple the national average). There is also a >discussion of high school drop out and graduation rates (still as low as >ever for those same groups.) There are lots of reflective articles on the >benefits, trade-offs and negative aspects of standards-based education >that we, as adult educators, could learn from Several states are profiled >in-depth including Delaware, New York, Texas, Massachusetts, Iowa and >Nevada. The articles contain a lot of food for thought and would be a >great resource for discussion groups or study circles. > >Marilyn Gillespie >SRI International > >Marie Cora wrote: >> >>Good morning, afternoon, and evening to you all. >> >> >> >>I m pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our >>discussion. I ve been thinking about this over the weekend, and I have a >>couple of questions to start us off: >> >> >> >>For our guests: >> >> >> >>-The EFF Standards are complex in terms of what they try to capture in a >>performance. Is this was makes them different from competencies? Or >>perhaps even different from other standards? >> >> >> >>For subscribers: I found the thought-provokers really helped me to focus >>on a piece of this big picture so I could get a handle on it. Did anyone >>try #1 below? Or perhaps if there are EFF users on the List, you might >>comment on this activity. As for #2 below I found this question helpful >>because it did make me consider how often and in what ways I would look >>for achievement over time, and it also made me think that I would >>necessarily look for such incremental gains via classroom assessment >>rather than with a high stakes test. >> >> >> >>1. Pick any EFF standard, read its definition, and imagine what it would >>look like if you were actually assessing the application of the >>integrated skill process described in the standard s definition. >> >> >> >>2. How often do you feel a need to look for evidence that learning has >>happened? How does the nature of the evidence you are looking for change >>as you look for learning within the space of one class session, one week, >>one month, one course, one year, and so on. >> >> >> >>Anyway, that s what I was thinking about. How about you? Please post >>your questions and comments! >> >>Thanks, >> >>marie >> >>Assessment Discussion List Moderator >> >> >> >> >> >> >> >>------------------------------- >>National Insitute for Literacy >>Assessment mailing list >>Assessment at nifl.gov >>To unsubscribe or change your subscription settings, please go to >>http://www.nifl.gov/mailman/listinfo/assessment >> > > >-- >Marilyn Gillespie, Ed.D. 
>Educational Researcher >SRI International >1100 Wilson Blvd., Suite 2800 >Arlington, VA 22209-2268 >Phone: (703) 247-8510 >Fax: (703) 247-8493 >marilyn.gillespie at sri.com >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to >http://www.nifl.gov/mailman/listinfo/assessment Aaron Kohring Coordinator, LINCS Literacy & Learning Disabilities Special Collection (http://ldlink.coe.utk.edu/) Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) Center for Literacy Studies, University of Tennessee EFF Center for Training and Technical Assistance Phone:(865) 974-4109 main (865) 974-4258 direct Fax: (865) 974-3857 e-mail: akohring at utk.edu From marilyn.gillespie at sri.com Thu Jan 12 10:01:13 2006 From: marilyn.gillespie at sri.com (Marilyn Gillespie) Date: Thu, 12 Jan 2006 10:01:13 -0500 Subject: [Assessment] EFF and CASAS Assessment In-Reply-To: <43C55B71.5080308@riral.org> References: <1ff.104fa25f.30f6aa6c@aol.com> <43C55B71.5080308@riral.org> Message-ID: <43C66F39.9020908@sri.com> Howard, I thought your comments were very interesting and insightful. They make another case for why teachers working in standards-based adult education need plenty of time for program-embedded professional development. Teachers need time to work as a team to analyze student work for evidence of progress, particularly because of the lack of alignment of instruction with assessment. Ideally, programs would be able to collect and share these additional indicators of learner progress so that better more aligned assessments can eventually be developed. Cris Smith and I are working on a paper on the implications of K-12 professional development research findings for adult educators. Your comments have been useful in developing our discussions of the implications of standards based teaching and learning for teacher professional development in adult education, so thanks. Marilyn Howard L. Dooley, Jr. wrote: >Meta -- Thank you for your post. I am always heartened when I hear and >see that others are fighting the same good fight, and have come to >similar conclusions as our own. At my program we have been >investigating portfolios (we call them folders) as a way of doing (or >organizing, or collecting) non-NRS assessments, and one that helps >instructors and learners align curriculum with instruction with >assessment. "Alignment" to me means making sure that assessment, >instruction and curriculum are all relevant and meaningful to all >participants. I think the salad analogy holds for any program or class >where the standards used don't come with ready-made assessments. States >that have developed content standards, have programs using the TABE, the >CASAS, the BEST, even the GED or EDP, and, voila!, alignment issues >arise. I truly believe this issue, that of alignment, is the more >important issue (more important than which set of standards or which >assessements we decide / are mandated to use), and the issue that most >of us will confront in our programs, and continually confront since >staff comes and goes so regularly. > >As I read your last paragraph, I appreciate your struggle. We struggle, >too. 
It is important for me to remember that the portfolios are not for >the funder (as often the CASAS or TABE or BEST are); they are for >learner, and the questions that arise with them need answers that are >learner determined. The learners should decide how many is enough, how >good is good enough, what demonstrations are appropriate. I don't >believe these decisions need to hold across classes, or sites, or >programs. Consistency of form and process is good, of course, it will >help your administrators to understand what is going on, and help your >teachers to share thoughts and ideas. Over the semesters, your teachers >may find that the rubrics and checklists used each time are pretty darn >similar -- or not. And of course, if you find that your portfolio >assessment indicates better reading but your CASAS score doesn't, then >you'll need some reflection and re-alignment. But, by and large, I >think you'll find you won't, because by and large you have sincere and >experienced instructors, and you can trust their science and their art. > >I hope to hear more about how the struggles go, and what good ideas we >can share with you as we move ahead. > >Howard D. >Project RIRAL > > >MWPotts2001 at aol.com wrote: > > > >>Regie, >> >>Thank you for your explanations re: the differences between EFF and >>CASAS and the cautions about mixing apples and oranges. I have >>carried around your former posts on this subject because I work with >>groups who are mandated to use both EFF and CASAS, and it is not easy >>trying to help them with assessment problems. It has become more than >>a problem; it is now an issue. >> >>In light of Linda's posting, I am wondering how the two of you might >>help us find ways to work more satisfactorily so that we are not at >>odds, rather more in sync as we struggle to include the best of both? >> Is it possible, after all, that the fruit salad might be a variety of >>apples, say Macintosh and Jonathan, rather than apples and oranges? I >>confess ignorance and plead for enlightenment. >> >>In the absence of a twofold assessment design that gratifies the needs >>of the programs and states who are mandated to use both EFF and CASAS, >>I have been trying to help people use portfolio assessment, in which >>they demonstrate mastery of the EFF reading purposes (or Math >>purposes) and the CASAS competencies, which seem (and I emphasize >>seem) to be located at the same level of expertise. This has worked >>for us, and students have accomplished their goals of moving (slowly) >>along the EFF continuum. We are at the novice stage, however, with >>this kind of assessment. >> >>Teachers still struggle with questions, such as *How many entries are >>good enough to demonstrate mastery?* And *What can we include that >>will demonstrate the range of fluency, independence, and the ability >>to perform in a variety of settings?* And more. >> >>So help us, please. 
>> >>Thanks and All the Best, >> >>Meta Potts >>FOCUS on Literacy >>Glen Allen, VA >> >> >> >> >> >>------------------------------------------------------------------------ >> >> >>------------------------------- >>National Insitute for Literacy >>Assessment mailing list >>Assessment at nifl.gov >>To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment >> >> >> >> > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > -- Marilyn Gillespie, Ed.D. Educational Researcher SRI International 1100 Wilson Blvd., Suite 2800 Arlington, VA 22209-2268 Phone: (703) 247-8510 Fax: (703) 247-8493 marilyn.gillespie at sri.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060112/029fe38f/attachment.html From marie.cora at hotspurpartners.com Thu Jan 12 13:08:22 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 12 Jan 2006 13:08:22 -0500 Subject: [Assessment] EFF standards-based practice and accountability testing Message-ID: <003e01c617a3$2da36970$0402a8c0@frodo> The following post is from Peggy McGuire. Hi everyone. I've been doing a lot of thinking since collaborating with Regie on his initial posts this week,and then reading the recent responses from Linda, Meta and Howard (thanks, folks!). Feels like there's so much to talk about, but given that this medium doesn't truly allow us to have the deep conversations I'd love to be having with y'all (over coffee/tea?), I'm going to put out just a few brief thoughts to add to all the thoughtful offerings. Meta, I have no doubt that it's possible to align teaching, learning and instructional/classroom assessment around the EFF standards while still being mandated to use an accountability assessment that is not aligned to the standards. And as difficult and unsatisfactory as it often feels to have to do this, many of the learners who engage with us in this process are prospering -- they are pursuing lifelong learning that focuses on their real needs and goals. The experiences you and Howard relate, and my own several years' experience of working with wonderful teachers and their students all over the country during the field development of the EFF Assessment Framework and Read with Understanding Assessment Prototype, strengthen my belief. I want to join in saying well done for fighting the good fight! And Howard, I want to say that I absolutely agree that "alignment" of assessment, instruction and curriculum around what is relevant and meaningful to learners is important. That's one reason why, after all these years, I still feel pretty passionate about EFF -- because I believe that the EFF standards, by the inclusive and iterative way they were developed, really do represent what is relevant and meaningful to learners, what they and many other key stakeholders in their success told us was important to know and be able to do in order to meet their goals in their primary adult roles. And it sure doesn't hurt that they also have a solid research base in cognitive science! In my mind, then, implementing the EFF standards in particular makes it possible to align what gets taught, learned and assessed around what's really relevant and meaningful to learners. 
I applaud the efforts that you and Meta are making to document EFF standards-based learning and to use tools like portfolios to engage students in reflection on their own learning. Well done! But I also have to admit to feeling a little sad at the thought that such evidence and activities are only for learners, and not for reporting. I'm sad because I want a world where true accountability to learners happens system-wide, not just in the classroom. I want a world where the "science and art" of good teachers is respected beyond their individual classrooms or programs. For these (among other) reasons, I think "which assessment" is incredibly important. What gets reported as a result of the accountability assessments we use is linked to what gets funded and supported and acknowledged as important in the adult basic/literacy/ESOL education world. And I want this system of ours to be fully focusing its resources on what is really important to the adult learners we serve. I'm convinced that won't happen as long as we live with the lack of alignment between instruction (what gets taught and learned) and accountability assessment (what gets measured, and therefore, what actually counts as important when policy and dollars are at issue). What feels especially frustrating is that I'm sure that we could achieve the alignment if we were willing to commit the time and resources needed to do so! I'll put my own professional bias (which of course I consider to be well-reasoned and insightful!) right out here -- I think the EFF standards and framework give us the tools we need to do the job. I could go on and on... but enough for now. Thanks to Marie and to all you contributors for giving me this opportunity to reflect on some issues near to my heart. All the best! Peggy McGuire, M.A. Senior Research Associate and Equipped for the Future National Consultant Center for Literacy Studies The University of Tennessee 111 5th Street, PO Box 16 Mt. Gretna, PA 17064 717-964-1341 (p/f) 215-888-6507 (cell) mcguirep555 at aol.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060112/066dfdb0/attachment.html From jcrawford at nifl.gov Thu Jan 12 16:09:22 2006 From: jcrawford at nifl.gov (Crawford, June) Date: Thu, 12 Jan 2006 16:09:22 -0500 Subject: [Assessment] Best in Class Assessments? Message-ID: <9B35BF1886881547B5DFF88364AF31A30573B6CF@wdcrobe2m03.ed.gov> In addition to individual tests, please check out the assessment website that we provide free-of-charge for teachers of reading. It has a short lesson on how to use the site and then you can enter test scores and get information about how best to teach students since we all know that they have different needs! I hope you find this useful: www.nifl.gov/readingprofiles and will contact us if you need further information. June Crawford NIFL -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of PHCSJean.2163953 at bloglines.com Sent: Tuesday, January 03, 2006 12:50 PM To: assessment at nifl.gov Subject: [Assessment] Best in Class Assessments? Hi all. I'm taking a reading assessment class for K-12 teachers and have been asked to bring a list of the adult reading assessments, formal and informal. What are folks using out there these days? I know the TABE is most common, but I'm sure there are some great standbys I've not seen out there I'd love to know about. Thanks! 
Jean Marrapodi Providence Assembly of God Learning Center ------------------------------- National Insitute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From samuel.mcgraw at seattlegoodwill.org Thu Jan 12 17:25:09 2006 From: samuel.mcgraw at seattlegoodwill.org (Samuel McGraw III) Date: Thu, 12 Jan 2006 14:25:09 -0800 Subject: [Assessment] EFF Discussion Begins Today! Message-ID: <802F2B4590320142A57872DC43A2BFD20218ADE8@seamail.seagoodwill.org> Regie and others, We are in the position of setting new standards and developing new curriculum. Because of the kinds of students we serve and our program structure, we would like to use the portfolio method of assessment; however, we would like to provide our students with the tools to get to community (etc.) --- we are a private non-profit -- my organization (public schools) use CASAS standards...we would like to be able to tell our students where they are based on CASAS....a large number of our students are more suited to EFF (life skills) as that is where they are and most likely will remain....portfolio (shows progression and what the student is currently capable of) CASAS (shows students what level they are at...and thereby where they could/should be)...EFF (is a goal for functional skills (for life))..... I would like to be able to offer students...all three.... Sam -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of Regie Stites Sent: Monday, January 09, 2006 4:13 PM To: The Assessment Discussion List Subject: Re: [Assessment] EFF Discussion Begins Today! Hi Sam, You are right. This is a simple question with a complicated answer. Crosswalks between CASAS competencies and EFF Standards and EFF Performance Levels have been done in various ways for various purposes. I don't know if anyone has yet done a crosswalk between CASAS content standards and EFF, but that certainly is possible. But rather than provide you with links to these crosswalks, let me first insert a note of caution. Comparing CASAS and EFF is like comparing apples and oranges. It's okay if the difference between apples and oranges doesn't matter to you. For example, we crosswalked EFF performance levels to NRS Educational Functioning Levels which, in turn, are referenced to CASAS scale scores. So for some EFF standards you could make a connection between EFF levels and CASAS scores that way. But this really is an apples and oranges comparison, because CASAS "reading" is not the same thing as EFF "reading" (see my post about how EFF standards are different from CASAS competencies). As long as you are happy defining "reading" or "math" etc. at the "fruit" level (where the differences between apples, oranges, and grapes don't matter much - that's what NRS does) you can cross-reference CASAS and EFF. But in most cases, the difference between apples and oranges matters (substituting 6 oranges for 6 apples in an apple pie recipe is not a good idea). So, my question is how exactly do you plan to make use of a cross-reference between CASAS and EFF? Regie Stites SRI International Samuel McGraw III wrote: Marie et al., I have a simple (yet possibly complex to answer) question. Has anyone cross-referenced EFF and CASAS standards? And if so, what was the outcome? 
Sam Seattle Goodwill Learning Center -----Original Message----- From: assessment-bounces at nifl.gov [ mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora Sent: Monday, January 09, 2006 7:10 AM To: Assessment Discussion List Subject: [Assessment] EFF Discussion Begins Today! Good morning, afternoon, and evening to you all. I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our discussion. I've been thinking about this over the weekend, and I have a couple of questions to start us off: For our guests: -The EFF Standards are complex in terms of what they try to capture in a performance. Is this was makes them different from competencies? Or perhaps even different from other standards? For subscribers: I found the "thought-provokers" really helped me to focus on a piece of this big picture so I could get a handle on it. Did anyone try #1 below? Or perhaps if there are EFF users on the List, you might comment on this activity. As for #2 below - I found this question helpful because it did make me consider how often and in what ways I would look for achievement over time, and it also made me think that I would necessarily look for such incremental gains via classroom assessment rather than with a high stakes test. 1. Pick any EFF standard, read its definition, and imagine what it would look like if you were actually assessing the application of the integrated skill process described in the standard's definition. 2. How often do you feel a need to look for evidence that learning has happened? How does the nature of the evidence you are looking for change as you look for learning within the space of one class session, one week, one month, one course, one year, and so on. Anyway, that's what I was thinking about. How about you? Please post your questions and comments! Thanks, marie Assessment Discussion List Moderator _____ ------------------------------- National Insitute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060112/0ebc0a5f/attachment.html From marie.cora at hotspurpartners.com Fri Jan 13 06:01:59 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 13 Jan 2006 06:01:59 -0500 Subject: [Assessment] Seeking evidence on CBE or EFF Message-ID: <008601c61830$c6e34dc0$0402a8c0@frodo> The following post is from Tom Sticht. January 12, 2006 Competency- or Standards-Based Education for Adult Literacy Education: Faith-Based or Evidence-Based? Tom Sticht International Consultant in Adult Education In the K-12 system standards-based education has been around now for the last decade, and has been reinforced by President Bush's No Child Left Behind program. Unfortunately, data from the National Center for Education Statistics released this year indicate that from 1971 up to 2004, a graph of average scores on the NAEP for 9, 13, or 17 year olds for the thirty year period from 1971 to 2004, on a scale ranging from 200 to around 320 scale scores, shows that 9 year olds increased from 208 in 1971 to 215 in 1980, then fell to 209 in 1990 and then rose again to 219 in 2004. This is only 4 scale score points higher than in 1980. This is evidence of ups and downs over a thirty year period but no real improvement. 
There is a more pronounced lack of evidence of any average improvement in reading for 13 and 17 year olds over this period. The lack of evidence for gains by 9 year olds is made even more apparent, and disappointing, when the data for 9 year olds at differing percentiles of achievement are examined. In 1971 students at the 90th percentile scored 260, then rose gradually to 266 in 1990 and then fell to 264 in 2004. Nine year olds at the 50th percentile scored as indicated above. Really poorly reading students, those at the 10th percentile, scored 152 in 1971, then rose to 165 in 1980 and then rose again to 169 in 2004, though the latter was not statistically greater than 25 years ago in 1980. Thirteen year olds at the 10th percentile scored 208 in 1971, rose to 213 in 1988, and then fell to 210 in 2004. The least able 17 year old readers, those at the 10th percentile, scored 225 in 1971, rose to 241 in 1988, and then fell to 227 in 2004. Though there were some improvements in the scores for 9 year old African-Americans and Hispanics from 1988, scores for 13 year olds were flat and they actually dropped for 17 year olds. Hence there is little evidence for the practical impact of standards-based education on the reading skills of various ethnic groups over the last decade and a half. The data for the three decades from 1971 to 2004 do not show substantial increases in reading achievement for 9, 13, or 17 year olds at various percentile ranks, even for the decade after the start of standards-based education. The NCES data do show that as children go up through primary, elementary, and secondary school, they do get better at reading across the percentile spectrum. But in 2004 the bottom ten percent of 17 year olds scored below the median for 13 year olds, and were just 6 scale score points above the median for 9 year olds. These poorly scoring students will no doubt be those who will later discover the real life importance of literacy and will enter into adult basic education to try to gain skills needed to support themselves and their families. Mathematics: Regarding mathematics, there were gains for 9 and 13 year olds across the 30 year period starting in 1971, but no evidence that the implementation of standards-based education in the decade of the 1990s up to the present made any acceleration in the rate of improvement, which started before the standards-based education movement. And for 17 year old African-Americans there were declines in mathematics from 1990 to 2004 and declines for Hispanics from 1992 to 2004. Overall, the NCES long-term trend data for reading and mathematics do not support the claim that standards-based education over the last decade has had a positive effect on student achievement in these curricular areas. Efforts to implement either competency-based or standards-based education in adult literacy education over the last quarter century have also produced no evidence to support these reforms. There has been no evaluation of the Equipped for the Future (EFF) effort, and the Comprehensive Adult Student Assessment System (CASAS), with its competency-based education (CBE) approach, has produced no evidence that programs implementing CBE are more effective than programs that do not implement CBE. At the present time, then, the movement to implement either CBE or EFF content standards education in adult literacy education is progressing as a faith-based rather than an evidence-based movement. Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd. 
El Cajon, CA 92019-2059 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net From marie.cora at hotspurpartners.com Fri Jan 13 09:26:13 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 13 Jan 2006 09:26:13 -0500 Subject: [Assessment] Your thoughts and questions Message-ID: <009401c6184d$4f160770$0402a8c0@frodo> Hi everyone, I wanted to remind folks that this is the last day of our discussion with EFF Guests. There has been some great conversation this week - I encourage you to take the opportunity to post your thoughts and questions today. Thanks! marie Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060113/e75066c4/attachment.html From marie.cora at hotspurpartners.com Fri Jan 13 09:31:04 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 13 Jan 2006 09:31:04 -0500 Subject: [Assessment] Join or Renew Membership with AALPD for 2006! Message-ID: <001401c6184d$fc3e0830$0402a8c0@frodo> Dear Colleague: Are you interested in getting more involved with adult literacy professional development? If so, then I hope you will join or renew membership with the Association of Adult Literacy Professional Developers (AALPD) for 2006: http://www.aalpd.org/membership_form.cfm While you are joining or renewing membership this month, you will also have the opportunity to: 1) Vote on this year's slate of officers (by January 31) 2) Vote on the top 6 priorities for AALPD (by January 31) Membership with AALPD is *free*. If you'd like more information about joining AALPD, then please see below. Thanks! Jackie Taylor, List Moderator, Adult Literacy Professional Development, jataylor at utk.edu ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Why should you join or renew your membership? Renewing Membership: As we grow and expand, we need updated information about our members in order to advocate effectively for professional development and provide members with the best possible services. So, we are asking everyone to renew their memberships by registering as an AALPD member in January of each year. ***Please take a moment to update your membership information and *vote* for this year's slate of officers and the top 6 AALPD priorities by visiting: http://www.aalpd.org/membership_form.cfm New Membership/Prospective Members: If you are not yet a member, are you interested in: - getting more involved with adult literacy professional development? - contributing your voice along with other advocates of adult literacy PD? - taking part in establishing (in the eyes of policy makers) the legitimacy of a national association of practitioners committed to adult literacy professional development? We invite you to become a formal member of the Association of Adult Literacy Professional Developers (AALPD). Membership in AALPD is *free* and open to adult educators interested or working in professional development in adult literacy. Individuals join AALPD by completing and submitting the Membership Form: http://www.aalpd.org/membership_form.cfm ============================================================= Vote While You Join or Renew Membership 1) **Members Vote for Slate of Officers by January 31st** On the membership page, you can also VOTE for the current slate of nominated AALPD officers (Chair, Vice-Chair and Secretary-Treasurer). 
http://www.aalpd.org/membership_form.cfm 2) **Vote on the Top 6 Priorities for AALPD** This year, the AALPD Executive Board is identifying top priorities for AALPD. Ideas for AALPD activities were gathered both from individuals in the field and by the board. Update your membership at: http://www.aalpd.org/membership_form.cfm and scroll to the bottom of the page to vote on the top 6 AALPD priorities. ============================================================= Why should you become a member of AALPD? * It's free! * You can vote for AALPD officers and on special issues that arise (Only AALPD members will be eligible to vote). * We will send you the latest information about upcoming trainings, events and resources. * You can have input into the design of next year's COABE pre-conference session. * You can contribute your voice to our advocacy efforts. * You can help to establish AALPD's legitimacy in the eyes of policy makers by demonstrating a strong membership of concerned practitioners committed to professional development. Thank you for joining or renewing your membership with AALPD. We're glad to have you on board! On behalf of the AALPD Executive Board, Jackie Taylor Adult Literacy Professional Development List Moderator jataylor at utk.edu ===================================================================== The Association of Adult Literacy Professional Developers (AALPD) is a national group for professional developers in adult literacy. As a special interest group within COABE (Commission on Adult Basic Education), AALPD meets at COABE Conferences and other professional development events. AALPD is a member of the National Coalition for Literacy. ===================================================================== From akohring at utk.edu Fri Jan 13 09:55:10 2006 From: akohring at utk.edu (Aaron Kohring) Date: Fri, 13 Jan 2006 09:55:10 -0500 Subject: [Assessment] EFF Discussion Begins Today! In-Reply-To: <802F2B4590320142A57872DC43A2BFD20218ADE8@seamail.seagoodwi ll.org> Message-ID: <5.1.0.14.2.20060113094028.03ccb008@pop.utk.edu> Sam, I can certainly see value in using all three means of assessment that you mention. Each one gives you a snapshot in time of what a learner knows and is able to do and gives you information to inform instruction based on what that particular assessment is designed to assess. A variety of assessments gives you a bigger picture of what a learner knows and can do. As instructors, we also use some very informal assessments in addition to these you have mentioned to add to that picture of where learners are and where they need to get to- such as, probing question and answer during a lesson. Assessments over time give us that picture of progress. The challenge will be to make sure that all parts of your system are in alignment: the assessment, instruction, and curriculum. But that does not mean a variety of assessments cannot be used- only that we understand what they are based upon and how we are using them. Any other suggestions? Aaron At 02:25 PM 1/12/2006 -0800, you wrote: >content-class: urn:content-classes:message >Content-Type: multipart/alternative; > boundary="----_=_NextPart_001_01C617C7.0BCF833C" > >Regie and others, > > >We are in the position of setting new standards and developing new >curriculum. Because the kinds of students we serve and our program >structure - we would like to use the portfolio method of assessment, >however, we would like to provide our students with the tools to get to >community (etc.) 
--- we are private non-profit -- my organization (public >schools) use CASAS standards...we would like to be able to tell our >students where they are based on CASAS....a large number of our students >are more suited to EFF (life skills) as that is where they are and >most-likely will remain....portfolio (shows progression and what the >student is currently capable of) CASAS (shows students what level they are >at...and thereby where they could/should be)...EFF (is a goal for >functional skills (for life))..... > >I would like to be able to offer students...all three.... > >Sam > > >-----Original Message----- >From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On >Behalf Of Regie Stites >Sent: Monday, January 09, 2006 4:13 PM >To: The Assessment Discussion List >Subject: Re: [Assessment] EFF Discussion Begins Today! > >Hi Sam, >You are right. This is simple question with a complicated >answer. Crosswalks between CASAS competencies and EFF Standards and EFF >Performance Levels have been done in various ways for various purposes. I >don't know if anyone has yet done a crosswalk between CASAS content >standards and EFF, but that certainly is possible. But rather than >provide you with links to these crosswalks, let me first insert a note of >caution. Comparing CASAS and EFF is like comparing apples and >oranges. It's okay if the difference between apples and oranges doesn't >matter to you. For example, we crosswalked EFF performance levels to NRS >Educationing Function Levels which, in turn, are referenced to CASAS scale >scores. So for some EFF standards you could make a connection between EFF >levels and CASAS scores that way. But this really is an apples and >oranges comparison, because CASAS "reading" is not the same thing as EFF >"reading" (see my post about how EFF standards are different from CASAS >competencies). As long you are happy defining "reading" or "math" etc. >at the "fruit" level (where the differences between apples, oranges, and >grapes don't matter much - that's what NRS does) you can cross-reference >CASAS and EFF. But in most cases, the difference between apples and >oranges matters (substituting 6 oranges for 6 apples in an apple pie >recipe is not a good idea). So, my question is how exactly do you plan >to make use of a cross-reference between CASAS and EFF? > >Regie Stites >SRI International > >Samuel McGraw III wrote: >> >>Marie et. al., >> >> >> >>I have a simple (yet possible complex answer) question. >> >> >> >>Has anyone cross referenced EFF and CASAS standards? And if so. What the >>outcome. >> >> >> >>Sam >> >>Seattle Goodwill Learning Center >> >>-----Original Message----- >> >>From: assessment-bounces at nifl.gov >>[mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora >> >>Sent: Monday, January 09, 2006 7:10 AM >> >>To: Assessment Discussion List >> >>Subject: [Assessment] EFF Discussion Begins Today! >> >> >> >>Good morning, afternoon, and evening to you all. >> >> >> >>I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our >>discussion. I've been thinking about this over the weekend, and I have a >>couple of questions to start us off: >> >> >> >>For our guests: >> >> >> >>-The EFF Standards are complex in terms of what they try to capture in a >>performance. Is this was makes them different from competencies? Or >>perhaps even different from other standards? >> >> >> >>For subscribers: I found the "thought-provokers" really helped me to >>focus on a piece of this big picture so I could get a handle on it. 
Did >>anyone try #1 below? Or perhaps if there are EFF users on the List, you >>might comment on this activity. As for #2 below - I found this question >>helpful because it did make me consider how often and in what ways I >>would look for achievement over time, and it also made me think that I >>would necessarily look for such incremental gains via classroom >>assessment rather than with a high stakes test. >> >> >> >>1. Pick any EFF standard, read its definition, and imagine what it would >>look like if you were actually assessing the application of the >>integrated skill process described in the standard's definition. >> >> >> >>2. How often do you feel a need to look for evidence that learning has >>happened? How does the nature of the evidence you are looking for change >>as you look for learning within the space of one class session, one week, >>one month, one course, one year, and so on. >> >> >> >>Anyway, that's what I was thinking about. How about you? Please post >>your questions and comments! >> >>Thanks, >> >>marie >> >>Assessment Discussion List Moderator >> >> >> >> >> >> >> >>------------------------------- >> >>National Insitute for Literacy >> >>Assessment mailing list >> >>Assessment at nifl.gov >> >>To unsubscribe or change your subscription settings, please go to >>http://www.nifl.gov/mailman/listinfo/assessment >> >> > > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to >http://www.nifl.gov/mailman/listinfo/assessment Aaron Kohring Coordinator, LINCS Literacy & Learning Disabilities Special Collection (http://ldlink.coe.utk.edu/) Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) Center for Literacy Studies, University of Tennessee EFF Center for Training and Technical Assistance Phone:(865) 974-4109 main (865) 974-4258 direct Fax: (865) 974-3857 e-mail: akohring at utk.edu From atrawick at charter.net Fri Jan 13 11:58:26 2006 From: atrawick at charter.net (Amy R. Trawick) Date: Fri, 13 Jan 2006 11:58:26 -0500 Subject: [Assessment] EFF Discussion References: <009401c6184d$4f160770$0402a8c0@frodo> Message-ID: <00f701c61862$928fd840$3002a8c0@ben2ut66kkx7o3> Hello, all! I would like to share another informal assessment technique that we developed as part of the EFF Reading Project. We call it a "Listening-In Assessment" because it requires you to "listen in", so to speak, on what a reader is thinking as s/he is reading. It goes something like this: 1) The student is engaged in reading for a life-based task that, in EFF lingo, relates to a shared priority. For instance, students may be researching certain careers, or reading about parent/teacher conferences, or exploring housing options. 2) The teacher approaches the student and has several brief discussions, using the components of the Read With Understanding standard (http://eff.cls.utk.edu/fundamentals/standard_read_with_understanding.htm) as a guide. The teacher asks such things as: -- Why did you choose this piece to read? Why? What are you hoping to find? -- How are you going about reading for this purpose? Is that working for you? 
Questions like these help to understand students' decision-making processes in choosing material, whether or not they are able to choose material that they can make sense of, and ways in which they approach certain texts and tasks (e.g., do they start at the beginning or do they use a table of contents/index to pinpoint a starting place). 3) After a brief discussion and any impromptu instruction that the teacher feels the need to provide based on student responses, the teacher leaves the student to continue with the task. 4) Later, the teacher returns to the student and continues with the conversation, again using the RWU components as a guide. She might ask things like, --How's it going? What have you found out? --Is the strategy I taught you working for you? (a strategy lesson might have been provided prior to students working on the task or in an impromptu lesson). --Has anything been difficult for you to understand? What did you do about it? --How does what you read compare with what you already know? to what another author said? --What are you finding to be the author's main point? --What are you going to do with this information? Questions like these are aimed at getting at the student's ability to monitor their comprehension, adjust strategies, analyze information, and integrate what they are reading with what they already know. 5) The teacher might then ask the student to read aloud a section that was particularly interesting to them for some reason--novel ideas, funny, something they agreed/disagreed with. The point here is to create an authentic reason to read aloud, i.e., to share information. As the student reads aloud, the teacher notes word recognition issues and fluency issues. After the student reads, the pair engage in brief conversation about what the student read. The teacher may then ask if the meaning of any words gave the student any problems--and what they did about it. She might also pinpoint a word or two to ask the student about. 6) At a later point, the teacher and student might discuss the teacher notes and student perspectives on those. Teacher notes about the "listening-in" experience are recorded on a form, which basically consists of 2 pages. On the first page is a 2-column table, with the components of the standard in the left column and blank space for teacher observation about strategies used and instruction provided and/or needed in the right column. On the second page are sections related to alphabetics, fluency, and vocabulary. Notes are usually made away from the student, so the student doesn't experience the conversation as an "assessment event", but we do encourage teachers to be transparent with students about how they will regularly talk with them about their reading and make notes to share with them. Teachers conduct these assessments regularly with each student (though they define "regularly" differently) in order to collect multiple examples of reading performance over a range of texts and tasks. The principal purpose is to identify what students already know, what strategies/skills they are using independently and fluently, and what the instructional implications might be. 
Combined with similar student-written assessments of their reading processes, either through journaling or use of the Read With Understanding Diary (http://eff.cls.utk.edu/eff_docs/toolkit_docs/tools_read_understand.doc), and with rubrics, artifacts, and reading/book logs collected in a portfolio, students and their teachers can also monitor growth in reading performance over time, as Meta suggested. Teachers with whom we have shared this process relate that they learn so much about students' reading strategies when they conduct assessments like these, though at first, some students have trouble talking about their own reading. However, when discussions around reading occur regularly as part of the classroom culture, not just in Listening-In Assessments but in large group and small group activities as well, students become better at recognizing what they do, naming what they do, and identifying trouble spots. And teachers gather valuable data that can inform instruction. Thank you all for such interesting ideas and posts this week. I'm eager to hear what others are doing as well! Amy Amy R. Trawick, M.S. Ed. Equipped for the Future National Consultant Center for Literacy Studies The University of Tennessee-Knoxville atrawick at charter.net ----- Original Message ----- From: Marie Cora To: Assessment Discussion List Sent: Friday, January 13, 2006 9:26 AM Subject: [Assessment] Your thoughts and questions Hi everyone, I wanted to remind folks that this is the last day of our discussion with EFF Guests. There has been some great conversation this week - I encourage you to take the opportunity to post your thoughts and questions today. Thanks! marie Assessment Discussion List Moderator ------------------------------------------------------------------------------ ------------------------------- National Insitute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060113/39e9fea6/attachment.html From ltaylor at casas.org Fri Jan 13 13:59:15 2006 From: ltaylor at casas.org (Linda Taylor) Date: Fri, 13 Jan 2006 13:59:15 -0500 Subject: [Assessment] Seeking evidence on CBE or EFF In-Reply-To: <008601c61830$c6e34dc0$0402a8c0@frodo> Message-ID: <0IT1001QROMYEIHM@vms048.mailsrvcs.net> For a summary of CASAS research, please see the CASAS website at www.casas.org and go to "Research and Reports." Linda Taylor, CASAS -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Friday, January 13, 2006 6:02 AM To: Assessment Discussion List Subject: [Assessment] Seeking evidence on CBE or EFF The following post is from Tom Sticht. January 12, 2006 Competency- or Standards-Based Education for Adult Literacy Education: Faith-Based or Evidence-Based? Tom Sticht International Consultant in Adult Education In the K-12 system standards-based education has been around now for the last decade, and has been reinforced by President Bush's No Child Left Behind program. 
Unfortunately, data from the National Center for Education Statistics released this year indicate that this has produced little improvement. A graph of average scores on the NAEP for 9, 13, and 17 year olds over the period from 1971 to 2004, on a scale ranging from 200 to around 320 scale scores, shows that 9 year olds increased from 208 in 1971 to 215 in 1980, then fell to 209 in 1990, and then rose again to 219 in 2004. This is only 4 scale score points higher than in 1980. This is evidence of ups and downs over the period but no real improvement. There is a more pronounced lack of evidence of any average improvement in reading for 13 and 17 year olds over this period. The lack of evidence for gains by 9 year olds is made even more apparent, and disappointing, when the data for 9 year olds at differing percentiles of achievement are examined. In 1971 students at the 90th percentile scored 260, then rose gradually to 266 in 1990 and then fell to 264 in 2004. Nine year olds at the 50th percentile scored as indicated above. Really poor readers, those at the 10th percentile, scored 152 in 1971, then rose to 165 in 1980 and then rose again to 169 in 2004, though the latter was not statistically greater than the 1980 score. Thirteen year olds at the 10th percentile scored 208 in 1971, rose to 213 in 1988, and then fell to 210 in 2004. The least able 17 year old readers, those at the 10th percentile, scored 225 in 1971, rose to 241 in 1988, and then fell to 227 in 2004. Though there were some improvements in the scores for 9 year old African-Americans and Hispanics from 1988, scores for 13 year olds were flat and they actually dropped for 17 year olds. Hence there is little evidence for the practical impact of standards-based education on the reading skills of various ethnic groups over the last decade and a half. The data for the three decades from 1971 to 2004 do not show substantial increases in reading achievement for 9, 13, or 17 year olds at various percentile ranks, even for the decade after the start of standards-based education. The NCES data do show that as children go up through primary, elementary, and secondary school, they do get better at reading across the percentile spectrum. But in 2004 the bottom ten percent of 17 year olds scored below the median for 13 year olds, and were just 6 scale score points above the median for 9 year olds. These poorly scoring students will no doubt be those who will later discover the real life importance of literacy and will enter into adult basic education to try to gain skills needed to support themselves and their families.
Mathematics
Regarding mathematics, there were gains for 9 and 13 year olds across the 30 year period starting in 1971, but no evidence that the implementation of standards-based education in the decade of the 1990s up to the present produced any acceleration in the rate of improvement, which started before the standards-based education movement. And for 17 year old African-Americans there were declines in mathematics from 1990 to 2004, and declines for Hispanics from 1992 to 2004. Overall, the NCES long-term trend data for reading and mathematics do not support the claim that standards-based education over the last decade has had a positive effect on student achievement in these curriculum areas. Efforts to implement either competency-based or standards-based education in adult literacy education over the last quarter century have also produced no evidence to support these reforms.
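Purely as an illustration of the arithmetic behind the reading figures cited above, the short sketch below tabulates the 9 year old scale scores reported in this post and recomputes the net changes. The numbers are simply the ones quoted above, not re-verified against NCES, and the code is an illustrative aid rather than any official trend tool.

```python
# NAEP long-term trend reading scale scores for 9 year olds, as cited in the post above.
scores_age9_reading = {
    "90th percentile": {1971: 260, 1990: 266, 2004: 264},
    "50th percentile": {1971: 208, 1980: 215, 1990: 209, 2004: 219},
    "10th percentile": {1971: 152, 1980: 165, 2004: 169},
}

for group, by_year in scores_age9_reading.items():
    years = sorted(by_year)
    first, last = years[0], years[-1]
    change = by_year[last] - by_year[first]
    print(f"{group}: {by_year[first]} in {first} -> {by_year[last]} in {last} ({change:+d})")

# The point made above about the median: 219 in 2004 is only 4 scale score
# points above the 215 already reached in 1980.
median = scores_age9_reading["50th percentile"]
print(median[2004] - median[1980])
```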
There has been no evaluation of the Equipped for the Future (EFF) effort, and the Comprehensive Adult Student Assessment System (CASAS), with its competency-based education (CBE) approach, has produced no evidence that programs implementing CBE are more effective than programs that do not implement CBE. At the present time, then, the movement to implement either CBE or EFF content standards education in adult literacy education is progressing as a faith-based rather than an evidence-based movement. Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd. El Cajon, CA 92019-2059 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From regie.stites at sri.com Fri Jan 13 16:26:29 2006 From: regie.stites at sri.com (Regie Stites) Date: Fri, 13 Jan 2006 13:26:29 -0800 Subject: [Assessment] EFF Discussion Begins Today! In-Reply-To: <5.1.0.14.2.20060113094028.03ccb008@pop.utk.edu> References: <5.1.0.14.2.20060113094028.03ccb008@pop.utk.edu> Message-ID: <43C81B05.5030501@sri.com> I would also like to heartily applaud the wisdom and creativity of instructors in finding a way to make whatever tools they have work for their learners. I think researchers (and I know because I am one) are often far too narrow and dismissive in their attitudes. Teachers experiment and collect evidence of what works on a daily basis and that experience and the craft knowledge it builds should never be totally discounted. There are a lot of measures by which an instructional program can be evaluated. Gains on reading and math tests are important, but so are teachers' opinions about the fit between the teaching content and methods and learners' goals. In a post earlier this week, I said I would get back to the question of what makes EFF standards different from other standards (then the flu slowed down my response time considerably). The first obvious difference is that EFF focuses on a broader range of skills than most standards. EFF is not solely focused on improving scores on standardized reading and math tests. Improving scores on reading and math tests is important (and tied to funding now of course), but EFF standards (more than most state content standards) promote a broader focus on giving adults opportunities to learn what they need to know and be able to do to fulfill key adult roles as workers, parents/family members, community members/citizens. For that reason, EFF includes standards for decision making, lifelong learning, and interpersonal skills as well as communication skills. Second, EFF supports and encourages learner centered instruction. EFF standards define purposeful applications of 16 skills, but don't dictate the specific techniques or context in which those skills should be taught or learned. This allows for relatively more flexibility (and requires more creativity) in applying the EFF standards to guide curriculum and instruction than is true of state content standards (where a stronger link is generally made to curriculum content/frameworks). Flexibility (and lack of specificity) is not always a good thing, but it is definitely an advantage for programs that adopt a learner centered approach.
Third (and related to the first two points), EFF is more "customer driven" than most other standards in the sense that the standards were developed in direct response to what adult learners said they wanted to learn. This customer driven orientation also extends to the EFF approach to assessment which gives priority (particularly in instructionally embedded assessment) to clear communications of learning goals and evidence of learning progress between learners and instructors. Portfolio assessment is an excellent tool for this purpose because it creates opportunities for learners and instructors to talk about concrete evidence (and the criteria to evaluate that evidence) of progress. Finally, EFF is a multi-component system (standards, performance continua, teaching and learning toolkit, etc.) that can be adopted and/or adapted in many ways. Though I agree with Tom Sticht that the implementation of EFF (and other standards-based education in adult education) has not been sufficiently studied, we should be clear about what kinds of applications of EFF we want to evaluate and by what measures. Let's evaluate EFF (and I hope someone with deep pockets likes this idea) using a multi-methods design and a broad range of outcome measures befitting the scope of the EFF approach (and the variety of the ways it can be implemented). There has been a lot of research to support various pieces of what EFF Standards can include when implemented. For example, the Read With Understanding performance level descriptions incorporate evidence-based practices in the teaching of alphabetics, fluency, vocabulary, and reading comprehension strategies. There is no reason to restart the reading wars and make everyone choose one of two sides. The research supporting these practices is sound and that is why they are included in the EFF approach to reading instruction. But EFF is not solely an approach to reading instruction. So if we have the opportunity for a large-scale evaluation of EFF, let's include measures of growth in teachers' knowledge of content area instruction and job satisfaction and retention. Let's include longitudinal measures of goal achievement for learners (such as jobs, personal growth, community involvement, etc.). While we are at it, let's also get good pre- and post-test data on reading, math, writing, English language learning (all 4 skills), lifelong learning, decision making, and/or interpersonal skills (as appropriate to the content of the instructional program). Regie Stites SRI International Aaron Kohring wrote: >Sam, > >I can certainly see value in using all three means of assessment that you >mention. Each one gives you a snapshot in time of what a learner knows and >is able to do and gives you information to inform instruction based on what >that particular assessment is designed to assess. A variety of assessments >gives you a bigger picture of what a learner knows and can do. As >instructors, we also use some very informal assessments in addition to >these you have mentioned to add to that picture of where learners are and >where they need to get to- such as, probing question and answer during a >lesson. Assessments over time give us that picture of progress. > >The challenge will be to make sure that all parts of your system are in >alignment: the assessment, instruction, and curriculum. But that does not >mean a variety of assessments cannot be used- only that we understand what >they are based upon and how we are using them. > >Any other suggestions?
>Aaron > >At 02:25 PM 1/12/2006 -0800, you wrote: > > >>content-class: urn:content-classes:message >>Content-Type: multipart/alternative; >> boundary="----_=_NextPart_001_01C617C7.0BCF833C" >> >>Regie and others, >> >> >>We are in the position of setting new standards and developing new >>curriculum. Because the kinds of students we serve and our program >>structure - we would like to use the portfolio method of assessment, >>however, we would like to provide our students with the tools to get to >>community (etc.) --- we are private non-profit -- my organization (public >>schools) use CASAS standards...we would like to be able to tell our >>students where they are based on CASAS....a large number of our students >>are more suited to EFF (life skills) as that is where they are and >>most-likely will remain....portfolio (shows progression and what the >>student is currently capable of) CASAS (shows students what level they are >>at...and thereby where they could/should be)...EFF (is a goal for >>functional skills (for life))..... >> >>I would like to be able to offer students...all three.... >> >>Sam >> >> >>-----Original Message----- >>From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On >>Behalf Of Regie Stites >>Sent: Monday, January 09, 2006 4:13 PM >>To: The Assessment Discussion List >>Subject: Re: [Assessment] EFF Discussion Begins Today! >> >>Hi Sam, >>You are right. This is simple question with a complicated >>answer. Crosswalks between CASAS competencies and EFF Standards and EFF >>Performance Levels have been done in various ways for various purposes. I >>don't know if anyone has yet done a crosswalk between CASAS content >>standards and EFF, but that certainly is possible. But rather than >>provide you with links to these crosswalks, let me first insert a note of >>caution. Comparing CASAS and EFF is like comparing apples and >>oranges. It's okay if the difference between apples and oranges doesn't >>matter to you. For example, we crosswalked EFF performance levels to NRS >>Educationing Function Levels which, in turn, are referenced to CASAS scale >>scores. So for some EFF standards you could make a connection between EFF >>levels and CASAS scores that way. But this really is an apples and >>oranges comparison, because CASAS "reading" is not the same thing as EFF >>"reading" (see my post about how EFF standards are different from CASAS >>competencies). As long you are happy defining "reading" or "math" etc. >>at the "fruit" level (where the differences between apples, oranges, and >>grapes don't matter much - that's what NRS does) you can cross-reference >>CASAS and EFF. But in most cases, the difference between apples and >>oranges matters (substituting 6 oranges for 6 apples in an apple pie >>recipe is not a good idea). So, my question is how exactly do you plan >>to make use of a cross-reference between CASAS and EFF? >> >>Regie Stites >>SRI International >> >>Samuel McGraw III wrote: >> >> >>>Marie et. al., >>> >>> >>> >>>I have a simple (yet possible complex answer) question. >>> >>> >>> >>>Has anyone cross referenced EFF and CASAS standards? And if so. What the >>>outcome. >>> >>> >>> >>>Sam >>> >>>Seattle Goodwill Learning Center >>> >>>-----Original Message----- >>> >>>From: assessment-bounces at nifl.gov >>>[mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora >>> >>>Sent: Monday, January 09, 2006 7:10 AM >>> >>>To: Assessment Discussion List >>> >>>Subject: [Assessment] EFF Discussion Begins Today! 
>>> >>> >>> >>>Good morning, afternoon, and evening to you all. >>> >>> >>> >>>I'm pleased to welcome Peggy, Aaron, Regie, and EFF Center Staff to our >>>discussion. I've been thinking about this over the weekend, and I have a >>>couple of questions to start us off: >>> >>> >>> >>>For our guests: >>> >>> >>> >>>-The EFF Standards are complex in terms of what they try to capture in a >>>performance. Is this was makes them different from competencies? Or >>>perhaps even different from other standards? >>> >>> >>> >>>For subscribers: I found the "thought-provokers" really helped me to >>>focus on a piece of this big picture so I could get a handle on it. Did >>>anyone try #1 below? Or perhaps if there are EFF users on the List, you >>>might comment on this activity. As for #2 below - I found this question >>>helpful because it did make me consider how often and in what ways I >>>would look for achievement over time, and it also made me think that I >>>would necessarily look for such incremental gains via classroom >>>assessment rather than with a high stakes test. >>> >>> >>> >>>1. Pick any EFF standard, read its definition, and imagine what it would >>>look like if you were actually assessing the application of the >>>integrated skill process described in the standard's definition. >>> >>> >>> >>>2. How often do you feel a need to look for evidence that learning has >>>happened? How does the nature of the evidence you are looking for change >>>as you look for learning within the space of one class session, one week, >>>one month, one course, one year, and so on. >>> >>> >>> >>>Anyway, that's what I was thinking about. How about you? Please post >>>your questions and comments! >>> >>>Thanks, >>> >>>marie >>> >>>Assessment Discussion List Moderator >>> >>> >>> >>> >>> >>> >>> >>>------------------------------- >>> >>>National Insitute for Literacy >>> >>>Assessment mailing list >>> >>>Assessment at nifl.gov >>> >>>To unsubscribe or change your subscription settings, please go to >>>http://www.nifl.gov/mailman/listinfo/assessment >>> >>> >>> >>> >>------------------------------- >>National Insitute for Literacy >>Assessment mailing list >>Assessment at nifl.gov >>To unsubscribe or change your subscription settings, please go to >>http://www.nifl.gov/mailman/listinfo/assessment >> >> > >Aaron Kohring >Coordinator, LINCS Literacy & Learning Disabilities Special Collection >(http://ldlink.coe.utk.edu/) >Moderator, National Institute for Literacy's Content Standards Discussion >List (http://www.nifl.gov/mailman/listinfo/Contentstandards) >Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) > >Center for Literacy Studies, University of Tennessee >EFF Center for Training and Technical Assistance >Phone:(865) 974-4109 main > (865) 974-4258 direct >Fax: (865) 974-3857 >e-mail: akohring at utk.edu > > >------------------------------- >National Insitute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > -- Regie Stites, Ph.D. Program Manager, Literacy and Lifelong Learning Center for Education Policy SRI International 333 Ravenswood Avenue Menlo Park, CA 94025-3493 Direct: 650.859.3768 Fax: 650.859.3375 regie.stites at sri.com www.sri.com/policy/cep -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060113/33ed260c/attachment.html From sreid at workbase.org.nz Fri Jan 13 20:16:57 2006 From: sreid at workbase.org.nz (Susan Reid) Date: Sat, 14 Jan 2006 14:16:57 +1300 Subject: [Assessment] Your thoughts and questions Message-ID: <14794889A1E3AF419042F64CC5425A1E051EF8@server1.wbeductrust.local> Hi Marie, I hope I am not too late to ask this question. I was interested in Regie's post about the difference between EFF and CBE. Here in New Zealand we have a very well established competency based assessment framework called the National Qualifications Framework www.nzqa.govt.nz In lieu of any literacy standards, a number of practitioners use unit standards from the NQF to assess their learners - e.g. Read texts for practical purposes, Fill in a form, Write an incident report, Solve problems using whole numbers, Solve problems using numbers in different forms. In lieu of anything else, workforce (in particular) practitioners use the outcome statements of these standards as curriculum despite the New Zealand Qualifications Authority saying that is not their purpose. This use of unit standards is widespread in technical areas as well as in literacy and numeracy areas. The New Zealand Ministry of Education is about to introduce key competencies into our tertiary education system based on the novice to expert continuum. One of the key competencies is "use tools interactively", which includes standards around literacy and numeracy. http://www.nzliteracyportal.org.nz/download/20050419100458key_competencies.pdf The approach is based very much on the OECD's DeSeCo approach. http://www.portal-stat.admin.ch/deseco/ccp-bac1.pdf Five descriptive standards have been developed - reading, writing, speaking, listening and maths - which are very loosely based on the EFF Standards (except they have taken the knowledge component out). http://www.nzliteracyportal.org.nz/download/20050419110405stds_doc.pdf The Descriptive Standards are in draft form, but a project is currently underway to write progressions for the 5 draft standards (again loosely based on the EFF approach). However, what is different is that there has not been the level of consultation with learners and to some extent practitioners that underpinned the EFF approach. We expect that these progressions will become a reporting system for all literacy and numeracy programmes. The reason the Ministry is keen to introduce Key Competencies into the tertiary system is that they are part of the primary and secondary (K-12) system and the idea is to have a seamless system. What advice would you have about the introduction of these standards, bearing in mind they will be very different from the current well established CBE system? How can we make sure practitioners understand the differences rather than just use them as another CBE system? Thank you Susan Reid Manager, Consultancy Services Workbase the New Zealand Centre for Workforce Literacy @ Vermont St, Ponsonby, Auckland, New Zealand www.workbase.org.nz www.nzliteracyportal.org.nz ________________________________ From: assessment-bounces at nifl.gov on behalf of Marie Cora Sent: Sat 14/01/2006 3:26 a.m. To: Assessment Discussion List Subject: [Assessment] Your thoughts and questions Hi everyone, I wanted to remind folks that this is the last day of our discussion with EFF Guests. There has been some great conversation this week - I encourage you to take the opportunity to post your thoughts and questions today. Thanks!
marie Assessment Discussion List Moderator -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 7469 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060114/69b76b12/attachment.bin From kabeall at comcast.net Mon Jan 16 18:48:02 2006 From: kabeall at comcast.net (Kaye Beall) Date: Mon, 16 Jan 2006 18:48:02 -0500 Subject: [Assessment] New from NCSALL--Review of Adult Learning and Literacy, Volume 6 Message-ID: <005001c61af7$49f7f8f0$0202a8c0@your4105e587b6> The newest volume of the Review of Adult Learning and Literacy: Connecting Research, Policy, and Practice (Vol. 6, 2006) is now available from NCSALL. For more information, please visit the NCSALL Web site at http://www.ncsall.net. It includes chapters on: * demographic change and low-literacy Americans * the role of vocabulary in adult basic education (ABE) * implications of research on spelling for ABE * issues in teaching speaking skills to adult ESOL learners * the preparation and stability of the ABE teaching workforce * the adult literacy system in Ireland * broad-based organizing as a vehicle for promoting adult literacy To order the Review of Adult Learning and Literacy, Volume 6, visit Erlbaum's Web site (https://www.erlbaum.com/shop/tek9.asp?pg=search&mode=regular). To order Volume 6 at a 30% discount from NCSALL, go to our Order Form (http://www.ncsall.net/?id=1002); limited quantities available. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060116/f9220613/attachment.html From marie.cora at hotspurpartners.com Tue Jan 17 10:05:35 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 17 Jan 2006 10:05:35 -0500 Subject: [Assessment] Discussion thanks! Message-ID: <003501c61b77$785d33b0$0402a8c0@frodo> Good morning, afternoon, and evening to you all. I would like to thank our guests from last week's discussion, Regie Stites, Aaron Kohring, and Peggy McGuire, for participating in this forum on EFF. Many thanks as well to List members for making the conversation rich and interesting. I will prepare the discussion in user-friendly format and have it posted at the NIFL Discussion List Website as well as the ALEWiki. I will send out email letting you know when this will be available. If you still have questions or comments regarding last week's discussion, please don't hesitate to post these to the Assessment Discussion List. Conversations needn't come to a halt when a Guest Discussion is completed - in fact such Guest Discussions could provide a catalyst for further exploration of a topic. So don't be shy! Thanks so much again to everyone, marie marie cora Moderator, NIFL Assessment Discussion List, and Coordinator/Developer LINCS Assessment Special Collection at http://literacy.kent.edu/Midwest/assessment marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060117/e2210dc1/attachment.html From marie.cora at hotspurpartners.com Tue Jan 17 10:21:11 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 17 Jan 2006 10:21:11 -0500 Subject: [Assessment] FW: [ContentStandards] Live Chat: Rethinking National Standards Message-ID: <005201c61b79$a642db70$0402a8c0@frodo> Hello Folks - the following is cross-posted from the Content Standards (formerly the EFF) Discussion List. marie ********** Greetings all, Education Week is hosting an online chat regarding the Quality Counts report that I referenced last week. Aaron Join us Wednesday, Jan. 18, from 3 p.m. to 4 p.m., Eastern time, for an online chat with guest Diane Ravitch on national standards, curricula, and tests. In a commentary piece published in EDUCATION WEEK's recently released QUALITY COUNTS 2006 report, "Quality Counts at 10: A Decade of Standards-Based Education," Ravitch contends that standards-based education reform has been compromised because each of the 50 states sets its own standards and monitors its own progress, creating mixed messages about what students should know and be able to do and incentives for the states to lower existing standards so as to demonstrate "progress." Ravitch argues that adopting national standards is the best way to solve the problem of inconsistent standards and to prevent states from lowering passing scores on state exams to show progress. Read Diane Ravitch's Commentary: http://www.edweek.org/ew/articles/2006/01/05/17ravitch.h25.html This chat will focus on the impact that adopting national standards could have on standards-based reform, and address how the National Assessment of Educational Progress could serve as a blueprint for national standards. Please join us for this special live Web chat. http://www.edweek-chat.org Submit advance questions here: http://www.edweek-chat.org/question.php3 No special equipment is needed to participate in this text-based chat. A complete transcript will be posted shortly after the chat is completed. Aaron Kohring Coordinator, LINCS Literacy & Learning Disabilities Special Collection (http://ldlink.coe.utk.edu/) Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) Center for Literacy Studies, University of Tennessee EFF Center for Training and Technical Assistance Phone:(865) 974-4109 main (865) 974-4258 direct Fax: (865) 974-3857 e-mail: akohring at utk.edu ---------------------------------------------------- National Insitute for Literacy Adult Education Content Standards mailing list ContentStandards at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/contentstandards From marie.cora at hotspurpartners.com Tue Jan 17 16:04:26 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 17 Jan 2006 16:04:26 -0500 Subject: [Assessment] FW: [AAACE-NLA] Need Help with Grade Equivalency Categories Message-ID: <00c501c61ba9$99fa52a0$0402a8c0@frodo> Hi everyone, I'm forwarding this query for your responses from the NLA List. marie ***** Hello all, I am needing to communicate "grade level equivalency" categories with adult literacy volunteer tutors and would like to gather opinions on how to best do this. 
My first thought was to use the already established NRS categories, which are: Beginning Literacy (0-1.9 grade level) Beginning ABE (2-3.9) Low Intermediate ABE (4-5.9) High Intermediate ABE (6-8.9) Low Advanced ASE (9-10.9) High Advanced ASE (11-12.9) What do you think? Any better ideas out there??? Thanks in advance for sharing your thoughts. Val Harris Director of Adult Education Lewis & Clark Community College 5800 Godfrey Road Godfrey, IL 62035 (618) 468-4100 _______________________________________________ AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org http://lists.literacytent.org/mailman/listinfo/aaace-nla LiteracyTent: web hosting, news, community and goodies for literacy http://literacytent.org From marie.cora at hotspurpartners.com Wed Jan 18 09:52:58 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 18 Jan 2006 09:52:58 -0500 Subject: [Assessment] Help with Grade Level Equivalency Categories Message-ID: <002501c61c3e$dfde5d80$0402a8c0@frodo> The following response if from Peggy McGuire. Hi Val and all. A couple thoughts: If you and your volunteer tutors are required to communicate adult learner achievement in terms of "grade level equivalency", then I am not aware of any good reason not to use the categories that the NRS describes (if anyone is aware of one, please chime in!). The good news about this approach would be that at least some attempt has been made to draw relatonships between these categories and information related to measures designed to describe adult achievement (EFF performance levels, TABE scores, CASAS scores, BEST scores). And that brings me to my other thought: in this context it feels very important to help tutors of adults to understand where the "grade level equivalencies" come from, so that they have the opportunity to reflect on what information these numbers can offer us about the adults with whom we work. Specifically I would want to be clear that they were designed to describe the mastery of elementary/middle/high school academic curricula as articulated in the achievement tests taken by students at particular grades during their K-12 careers. Personally, I think it would be a wonderful thing to then have a conversation with the tutors about what the categories tell us about what the adult learners they are/will be working with need to know and be able to do in order to meet their goals in seeking tutoring -- and about other opportunities to assess learners' needs and gather evidence of their achievement. A pretty cool "teachable moment", maybe? All the best! Peggy McGuire, M.A. Senior Research Associate and Equipped for the Future National Consultant Center for Literacy Studies The University of Tennessee 111 5th Street, PO Box 16 Mt. Gretna, PA 17064 717-964-1341 (p/f) 215-888-6507 (cell) mcguirep555 at aol.com Hello all, I am needing to communicate "grade level equivalency" categories with adult literacy volunteer tutors and would like to gather opinions on how to best do this. My first thought was to use the already established NRS categories, which are: Beginning Literacy (0-1.9 grade level) Beginning ABE (2-3.9) Low Intermediate ABE (4-5.9) High Intermediate ABE (6-8.9) Low Advanced ASE (9-10.9) High Advanced ASE (11-12.9) What do you think? Any better ideas out there??? Thanks in advance for sharing your thoughts. Val Harris Director of Adult Education Lewis & Clark Community College 5800 Godfrey Road Godfrey, IL 62035 (618) 468-4100 -------------- next part -------------- An HTML attachment was scrubbed... 
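As a side note on the NRS categories Val lists above, here is a minimal lookup sketch built from exactly those grade-level ranges and labels. The cut points and category names come straight from her message; the function name and the handling of out-of-range values are assumptions for illustration only, not an official NRS tool.

```python
# Illustrative only: NRS educational functioning levels and grade-level ranges
# as listed in Val Harris's message above.
NRS_LEVELS = [
    (0.0, 1.9, "Beginning Literacy"),
    (2.0, 3.9, "Beginning ABE"),
    (4.0, 5.9, "Low Intermediate ABE"),
    (6.0, 8.9, "High Intermediate ABE"),
    (9.0, 10.9, "Low Advanced ASE"),
    (11.0, 12.9, "High Advanced ASE"),
]

def nrs_category(grade_level_equivalent):
    """Return the NRS category label for a grade-level equivalent (0.0-12.9)."""
    for low, high, label in NRS_LEVELS:
        if low <= grade_level_equivalent <= high:
            return label
    raise ValueError("Grade level equivalent is outside the 0-12.9 range")

print(nrs_category(5.2))   # Low Intermediate ABE
print(nrs_category(10.0))  # Low Advanced ASE
```

As the replies further down the thread caution, a grade-level equivalent reported this way does not necessarily mean the learner is reading at that K-12 grade level, so the label is best treated as a reporting category rather than a diagnosis.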
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060118/ab34e7ad/attachment.html From Shauna.South at schools.utah.gov Wed Jan 18 15:53:53 2006 From: Shauna.South at schools.utah.gov (South, Shauna) Date: Wed, 18 Jan 2006 13:53:53 -0700 Subject: [Assessment] FW: [AAACE-NLA] Need Help with Grade Equivalency Categories Message-ID: <3B603C8007FB954BB1A4FE9B16CC4A1701A5B3DF@edell.usoe.k12.ut.us> This appears appropriate on the surface for volunteer tutors. Depending on how they are going to use the information, it might be a good idea to clarify that GLE used this way does not necessarily mean that a student is reading on grade level. Each state seems to interpret what Grade level means in academic areas. What these NRS levels mean to me is that the student being assessed with the adult ed assessment has been correlated to the National Reporting System. Actual cognitive skill levels for that person may not necessarily be at those grade levels. I believe the DOL likes to see these GLE's. I think it depends on how you are going to have the volunteers use this information. -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, January 17, 2006 2:04 PM To: Assessment Discussion List Subject: [Assessment] FW: [AAACE-NLA] Need Help with Grade EquivalencyCategories Hi everyone, I'm forwarding this query for your responses from the NLA List. marie ***** Hello all, I am needing to communicate "grade level equivalency" categories with adult literacy volunteer tutors and would like to gather opinions on how to best do this. My first thought was to use the already established NRS categories, which are: Beginning Literacy (0-1.9 grade level) Beginning ABE (2-3.9) Low Intermediate ABE (4-5.9) High Intermediate ABE (6-8.9) Low Advanced ASE (9-10.9) High Advanced ASE (11-12.9) What do you think? Any better ideas out there??? Thanks in advance for sharing your thoughts. Val Harris Director of Adult Education Lewis & Clark Community College 5800 Godfrey Road Godfrey, IL 62035 (618) 468-4100 _______________________________________________ AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org http://lists.literacytent.org/mailman/listinfo/aaace-nla LiteracyTent: web hosting, news, community and goodies for literacy http://literacytent.org ------------------------------- National Insitute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From jgore at readingconnections.org Thu Jan 19 11:17:59 2006 From: jgore at readingconnections.org (Jenny Gore) Date: Thu, 19 Jan 2006 11:17:59 -0500 Subject: [Assessment] FW: [AAACE-NLA] Need Help with Grade EquivalencyCategories References: <3B603C8007FB954BB1A4FE9B16CC4A1701A5B3DF@edell.usoe.k12.ut.us> Message-ID: <12f301c61d13$eacbb280$0901a8c0@RCLAB8> Do we have a way to relate the NRS levels to NALS Levels 1-5? Jennifer B. Gore Executive Director Reading Connections, Inc. 122 N. Elm Street, Suite 520 Greensboro, NC 27401 336-230-2223 Treat people as if they were what they ought to be, and help them become what they are capable of being. Goethe ----- Original Message ----- From: "South, Shauna" To: "The Assessment Discussion List" Sent: Wednesday, January 18, 2006 3:53 PM Subject: Re: [Assessment] FW: [AAACE-NLA] Need Help with Grade EquivalencyCategories > This appears appropriate on the surface for volunteer tutors. 
Depending > on how they are going to use the information, it might be a good idea to > clarify that GLE used this way does not necessarily mean that a student > is reading on grade level. Each state seems to interpret what Grade > level means in academic areas. What these NRS levels mean to me is that > the student being assessed with the adult ed assessment has been > correlated to the National Reporting System. Actual cognitive skill > levels for that person may not necessarily be at those grade levels. I > believe the DOL likes to see these GLE's. I think it depends on how you > are going to have the volunteers use this information. > > -----Original Message----- > From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] > On Behalf Of Marie Cora > Sent: Tuesday, January 17, 2006 2:04 PM > To: Assessment Discussion List > Subject: [Assessment] FW: [AAACE-NLA] Need Help with Grade > EquivalencyCategories > > Hi everyone, > > I'm forwarding this query for your responses from the NLA List. > marie > ***** > > > > Hello all, > > I am needing to communicate "grade level equivalency" categories with > adult > literacy volunteer tutors and would like to gather opinions on how to > best > do this. My first thought was to use the already established NRS > categories, which are: > > Beginning Literacy (0-1.9 grade level) > Beginning ABE (2-3.9) > Low Intermediate ABE (4-5.9) > High Intermediate ABE (6-8.9) > Low Advanced ASE (9-10.9) > High Advanced ASE (11-12.9) > > What do you think? Any better ideas out there??? Thanks in advance for > sharing your thoughts. > > Val Harris > Director of Adult Education > Lewis & Clark Community College > 5800 Godfrey Road > Godfrey, IL 62035 > (618) 468-4100 > > _______________________________________________ > AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org > http://lists.literacytent.org/mailman/listinfo/aaace-nla > LiteracyTent: web hosting, news, community and goodies for literacy > http://literacytent.org > > > ------------------------------- > National Insitute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- > National Insitute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Tue Jan 24 09:40:49 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 24 Jan 2006 09:40:49 -0500 Subject: [Assessment] Career Opportunities at the National Institute for Literacy Message-ID: <025001c620f4$2b80d3d0$0402a8c0@frodo> The National Institute for Literacy is launching projects in new areas and seeks additional staff members, including those with expertise in early literacy, English language acquisition, and workforce and basic skills development. Other positions include: Associate Directors for Communication and Programs, Contract Specialists, Human Resources Officer, Budget and Policy Analyst. For more information on career opportunities with the National Institute for Literacy and how to apply please visit: http://www.nifl.gov and click on Career Opportunities. Please review instruction on How to Apply. Incomplete applications will not be accepted. Questions regarding these positions should be submitted to staff_search at nifl.gov Note: Applications will be accepted until 3:00 p.m. 
February 10, 2006. Shelly Coles National Institute for Literacy From marie.cora at hotspurpartners.com Tue Jan 24 10:18:39 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 24 Jan 2006 10:18:39 -0500 Subject: [Assessment] Discussion of State Professional Development Systems Message-ID: <025b01c620f9$74ee5600$0402a8c0@frodo> The following post is from Jackie Taylor. *************** Colleagues: The Adult Literacy Professional Development Discussion List is hosting a disscussion of "State Professional Development Systems," featuring professional development offered both regionally in New England and in the following states: California, Florida, Massachusetts, New Mexico, New York, Ohio, and Rhode Island. Colleagues from all states are invited to participate and share their work or experiences with state PD! To participate, subscribe by visiting: http://www.nifl.gov/mailman/listinfo/Professionaldevelopment See below for the list of guests participating. I hope you will be able to join us! Jackie Taylor, Adult Literacy Professional Development List Moderator, jataylor at utk.edu ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Topic: State Professional Development Systems Discussion Dates: January 30 - February 10 Guest Facilitator: Cassie Drennon Bryant, President, Cassandra Drennon & Associates, Inc. To participate: Subscribe by visiting: http://www.nifl.gov/mailman/listinfo/Professionaldevelopment General Overview: Join our guests to discuss a broad range of topics on how state professional development (PD) systems work, including (but not limited to): funding, leadership, structure, provision of PD, policy, state initiatives, assessment and evaluation, continuous improvement, and other related issues. The discussion is open to anyone who would like to share their work or experiences with state PD. Guests from the following seven states and one region will be joining us in discussion and participating on behalf of their professional development entities/organizations: ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ GUESTS New England Silja Kallenbach, Coordinator, New England Literacy Resource Center (NELRC)/World Education California Mary Ann Corley, Ph.D., Principal Research Analyst and California Adult Literacy Professional Development Project (CALPRO) Director, American Institutes for Research Erik Jacobson, Research Analyst, American Institutes for Research/CALPRO Wendi Maxwell, Education Programs Consultant, California Department of Education Florida Teresa G. 
Bestor, State Director of Adult Education and Compliance Monitoring, Division of Community Colleges and Workforce Education, Florida Department of Education Debra Hargrove, Coordinator, Florida TechNet Massachusetts Mina Reddy, Director, System for Adult Basic Education Support (SABES) Central Resource Center, World Education Steve Reuys, Director, Adult Literacy Resource Institute/Greater Boston SABES Regional Support Center George Kohout, Director, SABES Western Regional Support Center and has worked for five years as Technology Coordinator New Mexico Nick Evangelista, Executive Director, New Mexico Adult Education Association New York Ira Yankwitt, Director of the New York City Regional Adult Education Network (NYC RAEN), Literacy Assistance Center Ohio Jeff Fantine, Director of the Central/Southeast ABLE Resource Center at Ohio University, participating on behalf of the Ohio ABLE Resource Center Network Rhode Island Janet Isserlis Project Director, Literacy Resources/RI ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ To participate, subscribe by visiting: http://www.nifl.gov/mailman/listinfo/Professionaldevelopment See you on the list! Best, Jackie Taylor _______________________________________________ National Insitute for Literacy Moderators mailing list Moderators at nifl.gov http://www.nifl.gov/mailman/listinfo/moderators From marie.cora at hotspurpartners.com Thu Jan 26 08:59:46 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 26 Jan 2006 08:59:46 -0500 Subject: [Assessment] A response to Susan Reid's post to the assessment list Message-ID: <030e01c62280$c44c70a0$0402a8c0@frodo> Hi everyone, The following post if from Peggy McGuire, who apologizes for the lateness of her post - to which I say: "let no post remain unread!". marie cora Assessment Discussion List Moderator ************* Hello Susan. I have spent some (certainly not exhaustive) time looking at the NZ documents and thinking about the questions that Susan has raised with regard to standards-based vs. competency-based approaches to teaching. I have a few thoughts: First, I'm not surprised that practitioners, in the absence of widely-accepted literacy standards, have used the outcomes statements from the National Qualifications Framework unit standards to guide curriculum and assessment despite the Qualification Authority's caution that they were not intended for that purpose. A lot of folks here in the U.S. have been using the National Reporting System's Educational Functioning Levels in much the same way, despite the same kinds of cautions (Practitioners need and want guidance about what's most important to teach in order to best serve adult learners. They also know that they have to report student outcomes in terms that are acceptable to external audiences, particularly the NRS). That's one major reason why there has been a push over the last couple years for states to develop content standards - and why there is increasing interest in adopting or adapting the Equipped for the Future standards and developing aligned curriculum frameworks at the state level. Next, I completely agree with Regie that it is equally important for practitioners to be exposed to both the conceptual and operational differences between competency-based and standards-based approaches. I think Regie has done a nice job of describing the conceptual differences. 
And as I read the introduction in the document discussing NZ's draft Descriptive Standards, it occurred to me that another key to understanding the concept might be to elaborate on the goals for introducing descriptive standards articulated in that introduction - to help teachers promote students' fluency, independence and ability to use key skills in a range of authentic contexts for a range of real-life purposes. This seems a wonderful opportunity to discriminate between teaching a student how to competently perform a particular task, and teaching knowledge, skills and strategies in authentic contexts of use so that a student can competently perform a particular task as well as a range of tasks as needed. The approach to planning, teaching and assessment suggested by the latter is all about developing expertise in skills through purposeful, contextual and constructivist learning and practice. This in turn promotes transfer, and the fluency and independence in the use of the skills that the authors of NZ's Descriptive Standards advocate. It is all about preparing students to be lifelong learners. As for helping teachers and students to a functional understanding of the differences between this standards-based model and a competency-based model, it seems to me that there are a couple of key tools needed to support the Descriptive Standards before that can be done. According to Susan's post, one of those tools is currently in development - the "progressions" for the five draft standards that, I assume, will articulate in greater detail the knowledge, skills and strategies that need to be taught, learned and assessed at each developmental level (from novice toward expert) of each standard. This is a critical step in guidance for teachers, as it will give them the basis for assessing particular learning needs and selecting appropriate targets for direct instruction, depending on the purpose for and context in which the skill(s) will be used. But my experience with EFF suggests that giving teachers tools to figure out what to teach/assess at a particular level in a standards-based system is not enough. Teachers also need and value a comprehensive and common-sense model of what it looks like in a standards-based system to 1) surface students' purposes and goals for learning, 2) identify the knowledge and skills that they will need to meet those purposes/goals, 3) develop instructional activities that allow students to learn and practice needed skills in meaningful contexts related to their purposes/goals, and 4) monitor and gather evidence of learning, both for reporting purposes and to inform further instructional planning. Further, teachers need quality, ongoing professional development to help them understand and utilize such a model. In the case of the EFF Standards and Performance Continua (our version of "progressions"), we have tried to meet this need by developing, with close collaboration of teachers, an 8-step "Teaching and Learning Cycle" along with its companion "Teaching and Learning Toolkit", both of which utilize the EFF standards and Continua to align teaching and assessment. And then we offer implementation training and technical assistance to states and other organizations who are trying to "put this all together", through the EFF Center for Training and Implementation at the Center for Literacy Studies, University of Tennessee. The EFF Center is also providing expertise and support to those states who are ready to develop curriculum frameworks based on Content Standards. 
It occurs to me that, whatever models and training protocols get developed to support the NZ Descriptive Standards, this might be an excellent opportunity to increase the level of teacher (and perhaps even student) consultation in the overall process of moving toward a more standards-based system. And as I believe Regie rightly pointed out, the teachers who become engaged in the development process may well be your best, most informed translators to other teachers of the differences between standards-based and competency-based practice. Thanks so much for the opportunity to learn about the news from New Zealand and to think about the questions that your experience is raising. I wish you all the best in your efforts! Peggy McGuire, M.A. Senior Research Associate and Equipped for the Future National Consultant Center for Literacy Studies The University of Tennessee 111 5th Street, PO Box 16 Mt. Gretna, PA 17064 717-964-1341 (p/f) 215-888-6507 (cell) mcguirep555 at aol.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060126/ef25a8d8/attachment.html From sreid at workbase.org.nz Thu Jan 26 14:03:55 2006 From: sreid at workbase.org.nz (Susan Reid) Date: Fri, 27 Jan 2006 08:03:55 +1300 Subject: [Assessment] A response to Susan Reid's post to the assessment list Message-ID: <14794889A1E3AF419042F64CC5425A1E0C51BB@server1.wbeductrust.local> Hi Peggy Thank you so much for your thoughtful and comprehensive response It has certainly provided me with a way forward when talking to officials and others in New Zealand about approaches for the work that is currently happening here You and Regie have given me a way of describing the clear differences between the 2 types of stds which I realise I haven't been doing really adequately up to now and I am most grateful for your input I am sure I met you at the Rutgers Conference in 2003 and it is great to connect electronically 2 years later Thank you or as we say in New Zealand Kia ora Susan -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Friday, 27 January 2006 3:00 a.m. To: Assessment Discussion List Subject: [Assessment] A response to Susan Reid's post to the assessment list Hi everyone, The following post if from Peggy McGuire, who apologizes for the lateness of her post - to which I say: "let no post remain unread!". marie cora Assessment Discussion List Moderator ************* Hello Susan. I have spent some (certainly not exhaustive) time looking at the NZ documents and thinking about the questions that Susan has raised with regard to standards-based vs. competency-based approaches to teaching. I have a few thoughts: First, I'm not surprised that practitioners, in the absence of widely-accepted literacy standards, have used the outcomes statements from the National Qualifications Framework unit standards to guide curriculum and assessment despite the Qualification Authority's caution that they were not intended for that purpose. A lot of folks here in the U.S. have been using the National Reporting System's Educational Functioning Levels in much the same way, despite the same kinds of cautions (Practitioners need and want guidance about what's most important to teach in order to best serve adult learners. They also know that they have to report student outcomes in terms that are acceptable to external audiences, particularly the NRS). 
That's one major reason why there has been a push over the last couple years for states to develop content standards - and why there is increasing interest in adopting or adapting the Equipped for the Future standards and developing aligned curriculum frameworks at the state level. Next, I completely agree with Regie that it is equally important for practitioners to be exposed to both the conceptual and operational differences between competency-based and standards-based approaches. I think Regie has done a nice job of describing the conceptual differences. And as I read the introduction in the document discussing NZ's draft Descriptive Standards, it occurred to me that another key to understanding the concept might be to elaborate on the goals for introducing descriptive standards articulated in that introduction - to help teachers promote students' fluency, independence and ability to use key skills in a range of authentic contexts for a range of real-life purposes. This seems a wonderful opportunity to discriminate between teaching a student how to competently perform a particular task, and teaching knowledge, skills and strategies in authentic contexts of use so that a student can competently perform a particular task as well as a range of tasks as needed. The approach to planning, teaching and assessment suggested by the latter is all about developing expertise in skills through purposeful, contextual and constructivist learning and practice. This in turn promotes transfer, and the fluency and independence in the use of the skills that the authors of NZ's Descriptive Standards advocate. It is all about preparing students to be lifelong learners. As for helping teachers and students to a functional understanding of the differences between this standards-based model and a competency-based model, it seems to me that there are a couple of key tools needed to support the Descriptive Standards before that can be done. According to Susan's post, one of those tools is currently in development - the "progressions" for the five draft standards that, I assume, will articulate in greater detail the knowledge, skills and strategies that need to be taught, learned and assessed at each developmental level (from novice toward expert) of each standard. This is a critical step in guidance for teachers, as it will give them the basis for assessing particular learning needs and selecting appropriate targets for direct instruction, depending on the purpose for and context in which the skill(s) will be used. But my experience with EFF suggests that giving teachers tools to figure out what to teach/assess at a particular level in a standards-based system is not enough. Teachers also need and value a comprehensive and common-sense model of what it looks like in a standards-based system to 1) surface students' purposes and goals for learning, 2) identify the knowledge and skills that they will need to meet those purposes/goals, 3) develop instructional activities that allow students to learn and practice needed skills in meaningful contexts related to their purposes/goals, and 4) monitor and gather evidence of learning, both for reporting purposes and to inform further instructional planning. Further, teachers need quality, ongoing professional development to help them understand and utilize such a model. 
In the case of the EFF Standards and Performance Continua (our version of "progressions"), we have tried to meet this need by developing, with close collaboration of teachers, an 8-step "Teaching and Learning Cycle" along with its companion "Teaching and Learning Toolkit", both of which utilize the EFF standards and Continua to align teaching and assessment. And then we offer implementation training and technical assistance to states and other organizations who are trying to "put this all together", through the EFF Center for Training and Implementation at the Center for Literacy Studies, University of Tennessee. The EFF Center is also providing expertise and support to those states who are ready to develop curriculum frameworks based on Content Standards. It occurs to me that, whatever models and training protocols get developed to support the NZ Descriptive Standards, this might be an excellent opportunity to increase the level of teacher (and perhaps even student) consultation in the overall process of moving toward a more standards-based system. And as I believe Regie rightly pointed out, the teachers who become engaged in the development process may well be your best, most informed translators to other teachers of the differences between standards-based and competency-based practice. Thanks so much for the opportunity to learn about the news from New Zealand and to think about the questions that your experience is raising. I wish you all the best in your efforts! Peggy McGuire, M.A. Senior Research Associate and Equipped for the Future National Consultant Center for Literacy Studies The University of Tennessee 111 5th Street, PO Box 16 Mt. Gretna, PA 17064 717-964-1341 (p/f) 215-888-6507 (cell) mcguirep555 at aol.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060127/45aa2b30/attachment.html From dbaycich at literacy.kent.edu Mon Jan 30 13:55:42 2006 From: dbaycich at literacy.kent.edu (Dianna Baycich) Date: Mon, 30 Jan 2006 13:55:42 -0500 Subject: [Assessment] TABE question Message-ID: <00aa01c625ce$cd46ae90$f5607b83@Hebe> Hello, Is the TABE appropriate to administer to youth as young as 13 or 14 years old? Thank you, Dianna Baycich Ohio Literacy Resource Center Research I Bld. 1100 Summit St. PO Box 5190 Kent State University Kent, OH 44242 330.672.7841 330.672.4841 (fax) "I have never been lost, but I will admit to being confused for several weeks." -Daniel Boone (frontier explorer) -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060130/f8b4418d/attachment.html From pcampbell at interbaun.com Mon Jan 30 14:17:20 2006 From: pcampbell at interbaun.com (Pat Campbell) Date: Mon, 30 Jan 2006 12:17:20 -0700 Subject: [Assessment] TABE question In-Reply-To: <00aa01c625ce$cd46ae90$f5607b83@Hebe> Message-ID: HI Dianna, There are many versions of the TABE. I would suggest you contact the publisher and find out the norming population for the version you want to use. Sincerely, Pat On 1/30/06 11:55 AM, "Dianna Baycich" wrote: > Hello, > Is the TABE appropriate to administer to youth as young as 13 or 14 years old? > Thank you, > Dianna Baycich > Ohio Literacy Resource Center > Research I Bld. > 1100 Summit St. > PO Box 5190 > Kent State University > Kent, OH 44242 > 330.672.7841 330.672.4841 (fax) > "I have never been lost, but I will admit to being confused for several > weeks." 
-Daniel Boone (frontier explorer) > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > -- Dr. Pat Campbell President, Grass Roots Press Mailing Address: 6520 - 82 Avenue, Main Floor Edmonton, AB, T6B 0E7 Phone: (780) 448-7323 (READ) Fax: (780) 413-6582 Web site: www.literacyservices.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060130/f1e878a7/attachment.html From bcarmel at rocketmail.com Tue Jan 31 11:28:23 2006 From: bcarmel at rocketmail.com (Bruce Carmel) Date: Tue, 31 Jan 2006 08:28:23 -0800 (PST) Subject: [Assessment] TABE question In-Reply-To: Message-ID: <20060131162823.6355.qmail@web30206.mail.mud.yahoo.com> Hi Dianna and Fellow List Servers, My two cents, somewhat related to your question.... We are required to use the TABE here in New York State. So it's "appropriate" that we administer it. In my opinion, the TABE does not test reading ability, it tests the ability to take a standardized test. And it's not even a very good standardized test: tricky, culturally biased, etc. There is some connection between taking a standardized multiple-choice reading test and being able to read--I will admit it--but not enough for me. Maybe you've heard this sort of thing before, but just checking. I would not use the TABE test if I were not required to do so. From Bruce Carmel Turning Point Pat Campbell wrote: HI Dianna, There are many versions of the TABE. I would suggest you contact the publisher and find out the norming population for the version you want to use. Sincerely, Pat On 1/30/06 11:55 AM, "Dianna Baycich" wrote: Hello, Is the TABE appropriate to administer to youth as young as 13 or 14 years old? Thank you, Dianna Baycich Ohio Literacy Resource Center Research I Bld. 1100 Summit St. PO Box 5190 Kent State University Kent, OH 44242 330.672.7841 330.672.4841 (fax) "I have never been lost, but I will admit to being confused for several weeks." -Daniel Boone (frontier explorer) --------------------------------- ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -- Dr. Pat Campbell President, Grass Roots Press Mailing Address: 6520 - 82 Avenue, Main Floor Edmonton, AB, T6B 0E7 Phone: (780) 448-7323 (READ) Fax: (780) 413-6582 Web site: www.literacyservices.com ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/ebddf7c9/attachment.html From AWilder106 at aol.com Tue Jan 31 11:39:33 2006 From: AWilder106 at aol.com (AWilder106 at aol.com) Date: Tue, 31 Jan 2006 11:39:33 EST Subject: [Assessment] TABE question Message-ID: <218.12444816.3110ecc5@aol.com> Bruce, What would you use? That would satisfy both you and funders? Thanks for ANY enlightenment on this problem.
Andrea -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/94a73084/attachment.html From marie.cora at hotspurpartners.com Tue Jan 31 12:03:48 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 31 Jan 2006 12:03:48 -0500 Subject: [Assessment] TABE questions Message-ID: <04c101c62688$4e6c83a0$0402a8c0@frodo> Hi all, I also just wanted to note that the TABE was not normed on people the age that Dianna inquires after - 13/14 year olds. So wouldn't this make the TABE invalid for use with young teens? For what purpose would the TABE be used with people so young? marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/b6dbecea/attachment.html From dbaycich at literacy.kent.edu Tue Jan 31 12:01:20 2006 From: dbaycich at literacy.kent.edu (Dianna Baycich) Date: Tue, 31 Jan 2006 12:01:20 -0500 Subject: [Assessment] TABE questions In-Reply-To: <04c101c62688$4e6c83a0$0402a8c0@frodo> Message-ID: <002d01c62687$f8816d70$f5607b83@Hebe> I can give one example of when it was used with folks age 13 to 16. In 1991 and 1992 I worked at a Private Industry Council Summer Youth Employment and Training Program. We gave the TABE to all the kids we accepted into the program. We used the results to determine if they would be placed in tutoring as well as a job or just a job. The reason I asked the question this time is for a collegue of mine who needs an instrument to determine reading level for a study she is doing. She wanted to use the instrument that was commonly used in Ohio. She was originally going to work with adults but had to change her study to use youth. So... a bit more explanation for why I was asking. Thank you to everyone who has responded so far. I hope to hear more. Dianna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, January 31, 2006 12:04 PM To: Assessment Discussion List Subject: [Assessment] TABE questions Hi all, I also just wanted to note that the TABE was not normed on people the age that Dianna inquires after - 13/14 year olds. So wouldn't this make the TABE invalid for use with young teens? For what purpose would the TABE be used with people so young? marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/408d19e3/attachment.html From bcarmel at rocketmail.com Tue Jan 31 12:30:10 2006 From: bcarmel at rocketmail.com (Bruce Carmel) Date: Tue, 31 Jan 2006 09:30:10 -0800 (PST) Subject: [Assessment] TABE question In-Reply-To: <218.12444816.3110ecc5@aol.com> Message-ID: <20060131173010.55129.qmail@web30213.mail.mud.yahoo.com> Hi Andrea, What would I use instead of the TABE if I didn't have to use the TABE? I would use different tools for different purposes: placement, measuring individual progress, and reporting program impact. --For beginning readers, I would use something like the Literacy Volunteers READ Test for all of those purposes. In that test, you ask people to read passages of different levels of difficulty and see whether or not they can actually READ them. --For people who are closer to GED, I would have them take the GED a predictor test. 
Yes it's still a standardized test, but it's the one our students need to pass to achieve their goals. The TABE is not a very good predictor of GED success. --For everyone's individual progress, I would use portfolios. Portfolios are great for noting individual progress, but I don't think they work for program impact. Accomplishments in portfolios are too all-over-the-place to be aggregated. --Another way I have placed beginner-intermediate people is by having them read a range of texts. For example, after an orientation, students did a one-on-one assessment. They were shown a supermarket circular, a subway map, a job application, a telephone book, and a newspaper. We asked them which ones they could read. Then they were asked to DO things such as tell us how they got to school using the map, try to find their best friend's name in the phone book, and explain how to fill out the application. This took a lot of staff time, but it showed us a lot about students' reading levels. Assessment is a great challenge in our field. I have never found a tool I like that is easily administered. The only ones I like take lots of time, and they are not perfect. And even if we devote the time to a good tool, we still have to use standardized tests as well to comply with funders. It's a great frustration that there is no useful, valid, easily administered assessment tool for all stakeholders: funders, students, staff and others. From Bruce Carmel AWilder106 at aol.com wrote: Bruce, What would you use? That would satisfy both you and funders? Thanks for ANY enlightenment on this problem. Andrea ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/43b4dd12/attachment.html From sfallsliteracy at yahoo.com Tue Jan 31 12:46:34 2006 From: sfallsliteracy at yahoo.com (Nancy Hansen) Date: Tue, 31 Jan 2006 09:46:34 -0800 (PST) Subject: [Assessment] TABE questions In-Reply-To: <002d01c62687$f8816d70$f5607b83@Hebe> Message-ID: <20060131174634.3427.qmail@web34706.mail.mud.yahoo.com> Dianna - Isn't there a testing instrument used in the Ohio public school system for (and with) this age group that would be more appropriate than the Teaching Adults Beginning Education tool (TABE)? If reading level is being measured, how do the schools gain that data? In my opinion, these kids are far from adults - even if they are entering the job market. I agree with Maria that it should be questioned whether or not the TABE would provide valid scores when it's designed to be used with adults. If your colleague's study is going to be worth anything at all, the tool used to determine reading levels should be one designed to work with that age group. With kids this age (particularly if they are having difficulties in school), I even wonder whether or not timed testing is the way to *get* accurate data. Does the public school system have a different type of evaluation that determines grade level than TABE for special education students?
Nancy Hansen adult literacy administrator Sioux Falls Area Literacy Council sfallsliteracy at yahoo.com Sioux Falls, SD Dianna Baycich wrote: I can give one example of when it was used with folks age 13 to 16. In 1991 and 1992 I worked at a Private Industry Council Summer Youth Employment and Training Program. We gave the TABE to all the kids we accepted into the program. We used the results to determine if they would be placed in tutoring as well as a job or just a job. The reason I asked the question this time is for a collegue of mine who needs an instrument to determine reading level for a study she is doing. She wanted to use the instrument that was commonly used in Ohio. She was originally going to work with adults but had to change her study to use youth. So... a bit more explanation for why I was asking. Thank you to everyone who has responded so far. I hope to hear more. Dianna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, January 31, 2006 12:04 PM To: Assessment Discussion List Subject: [Assessment] TABE questions Hi all, I also just wanted to note that the TABE was not normed on people the age that Dianna inquires after - 13/14 year olds. So wouldn't this make the TABE invalid for use with young teens? For what purpose would the TABE be used with people so young? marie cora Assessment Discussion List Moderator ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/608781b1/attachment.html From marie.cora at hotspurpartners.com Tue Jan 31 13:07:58 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 31 Jan 2006 13:07:58 -0500 Subject: [Assessment] Standardize: the ritual harp Message-ID: <050701c62691$45240620$0402a8c0@frodo> Hi all, Great conversation going here..looking forward to more. Bruce, thanks so much for your posts. I'm not picking on you, I promise: I'm picking on EVERYONE!!! Bruce, you said: "And even if we devote the time to a good tool, we still have to use standardized tests as well to comply with funders." Many of you are moaning now, thinking: "here she goes yet again." - but it's my job: good tools and bad tools are standardized; portfolios are standardized; quizzes can be standardized; TABE is standardized but so is the writing rubric REEP; in theory, any assessment can be standardized. What everyone rails against is the particular selection of assessments that are available to us today - and while those are standardized, that's not what makes them good, bad, or ugly. Standardized tests should be viewed as positive things because their soul mission in life is to attempt a level playing field. As a field, we object to the paucity of choices, not that those choices are standardized. We also object to the use of materials that are out-dated or do not reflect today's needs. We object to the mis-use of a particular assessment. We should object to incorrect use of data and test results. We should object to mis-alignment between curriculum and assessment - which directly speaks to Bruce's (and many many folks') wish that there be an assessment that can serve the purposes of the classroom and program as well as it can serve the purposes of high stakes issues (like funding or career advancement). What we really want are in fact standardized assessments, that's not the issue. The issue is that we don't yet have a thorough selection of tools that meet our complex needs - whether those tools are standardized or not. marie-harp-on-cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/ee426ecb/attachment.html From prwhite at MadisonCounty.NET Tue Jan 31 13:09:41 2006 From: prwhite at MadisonCounty.NET (Patti White) Date: Tue, 31 Jan 2006 12:09:41 -0600 Subject: [Assessment] TABE question References: <20060131173010.55129.qmail@web30213.mail.mud.yahoo.com> Message-ID: <052d01c62691$81e352f0$6501a8c0@PattiAALRC> Bruce, Let me applaud - loudly and with gusto - everything you wrote in this post. The practice test is indeed a much better tool for predicting success on the GED, and the portfolio assessment approach is really the most fair way to assess the diverse population of ABE/GED learners. I think there is surely a way to use portfolios to somehow demonstrate a program's success as well as an individual student's success, but not with the current NRS. Thank you so much for sharing your views. It is my continued hope that someday the funders will understand the complexities of "valid" assessment tools. In the meantime, I'm very happy for the students in your program who have so many choices regarding how they may demonstrate their learning. Patti White, M.Ed. 
Disabilities Project Manager Arkansas Adult Learning Resource Center prwhite at madisoncounty.net ----- Original Message ----- From: Bruce Carmel To: The Assessment Discussion List Sent: Tuesday, January 31, 2006 11:30 AM Subject: Re: [Assessment] TABE question Hi Andrea, What would I use instead of the TABE if I didn't have to use the TABE? I would use different tools for different purposes: placement, measuring individual progress, and reporting program impact. --For beginning readers, I would use something like the Literacy Volunteers READ Test for all of those purposes. In that test, you ask people to read passages of different levels of difficulty and see whether or not they can actually READ them. --For people who are closer to GED, I would have them take the GED a predictor test. Yes it's still a standardized test, but it's the one our students need to pass to achieve their goals. The TABE is not a very good predictor of GED success. --For everyone's individual progress, I would use portfolios. Portfolios are great for noting individual progress, but I don't think they work for program impact. Accomplishments in portfolios are too all-over-the-place to be aggregated. --Another way I have placed beginner-intermediate people is by having them read a range of texts. For example, after an orientation, students did a one-on-one assessment. They were shown a supermarket circular, a subway map, a job application, a telephone book, and a newspaper. We asked them which ones they could read. Then they were asked to DO things such as tell us how they got to school using the map,try to find their best friend's name in the phone book, and explain how to fill out the application. This took a lot of staff time, but it showed us a lot about students reading levels. Assessment is a great challenge in our field. I have never found a tool I like that is easily administered. The only ones I like take lots of time, and they are not perfect. And even if we devote the time to a good tool, we still have to use stan dardized tests as well to comply with funders. It's a great frustration that there is no useful, valid, easily adminstered assessment tool for all stakeholders: funders, students, staff and others. >From Bruce Carmel AWilder106 at aol.com wrote: Bruce, What would you use? That would satisy both you and funders? Thanks for ANY enlightenment on this problem. Andrea ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment Yahoo! Autos. Looking for a sweet ride? Get pricing, reviews, & more on new and used cars. ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From bcarmel at rocketmail.com Tue Jan 31 13:13:17 2006 From: bcarmel at rocketmail.com (Bruce Carmel) Date: Tue, 31 Jan 2006 10:13:17 -0800 (PST) Subject: [Assessment] Standardize: the ritual harp In-Reply-To: <050701c62691$45240620$0402a8c0@frodo> Message-ID: <20060131181317.29409.qmail@web30202.mail.mud.yahoo.com> Hi Marie (and everyone) I hear your point about standardizing assessment tools. Thanks for the posting. When I hear people say "standardized" in this context, it is always "standardized tests." 
(Except in the posting you just wrote:)) I don't think they mean something such as a standardized system for scoring portfolios. I think "standardized" means TABE, CASAS, etc. in our field. Marie Cora wrote: st1\:*{behavior:url(#default#ieooui) } Hi all, Great conversation going here .looking forward to more. Bruce, thanks so much for your posts. I?m not picking on you, I promise: I?m picking on EVERYONE!!! Bruce, you said: ?And even if we devote the time to a good tool, we still have to use standardized tests as well to comply with funders.? Many of you are moaning now, thinking: ?here she goes yet again ? ? but it?s my job: good tools and bad tools are standardized; portfolios are standardized; quizzes can be standardized; TABE is standardized but so is the writing rubric REEP; in theory, any assessment can be standardized. What everyone rails against is the particular selection of assessments that are available to us today ? and while those are standardized, that?s not what makes them good, bad, or ugly. Standardized tests should be viewed as positive things because their soul mission in life is to attempt a level playing field. As a field, we object to the paucity of choices, not that those choices are standardized. We also object to the use of materials that are out-dated or do not reflect today?s needs. We object to the mis-use of a particular assessment. We should object to incorrect use of data and test results. We should object to mis-alignment between curriculum and assessment ? which directly speaks to Bruce?s (and many many folks?) wish that there be an assessment that can serve the purposes of the classroom and program as well as it can serve the purposes of high stakes issues (like funding or career advancement). What we really want are in fact standardized assessments, that?s not the issue. The issue is that we don?t yet have a thorough selection of tools that meet our complex needs ? whether those tools are standardized or not. marie-harp-on-cora Assessment Discussion List Moderator ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment --------------------------------- Yahoo! Autos. Looking for a sweet ride? Get pricing, reviews, & more on new and used cars. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/279ae6e0/attachment.html From tarv at chemeketa.edu Tue Jan 31 13:20:32 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Tue, 31 Jan 2006 10:20:32 -0800 Subject: [Assessment] TABE question Message-ID: <89DA2100D59D7341BEA4D938F9FB2A93024B52D2@cccmail2.chemeketa.network> I'd have to agree with Patti White. TABE is useful for staff who are used to grade levels, but many people who work with adults prefer CASAS, a test normed on adults and much less stressful than a TABE test. When it comes to students CASAS and portfolios are more user friendly. CASAS has a comparison study on GED test success; the higher the CASAS score the higher the probability of passing the GED test. CASAS and Official Practice Tests are better predictors than a grade level score, in my opinion. va Virginia Tardaewether Chemeketa Community College 4000 Lancaster Drive NE Salem, OR 97305 503-399-6147 When new life appears, just as a budding plant rises from the ground, the heart aspires to new growth and is filled with hope. 
Use life to renew oneself. Voices of the Heart -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of Patti White Sent: Tuesday, January 31, 2006 10:10 AM To: The Assessment Discussion List Subject: Re: [Assessment] TABE question Bruce, Let me applaud - loudly and with gusto - everything you wrote in this post. The practice test is indeed a much better tool for predicting success on the GED, and the portfolio assessment approach is really the most fair way to assess the diverse population of ABE/GED learners. I think there is surely a way to use portfolios to somehow demonstrate a program's success as well as an individual student's success, but not with the current NRS. Thank you so much for sharing your views. It is my continued hope that someday the funders will understand the complexities of "valid" assessment tools. In the meantime, I'm very happy for the students in your program who have so many choices regarding how they may demonstrate their learning. Patti White, M.Ed. Disabilities Project Manager Arkansas Adult Learning Resource Center prwhite at madisoncounty.net ----- Original Message ----- From: Bruce Carmel To: The Assessment Discussion List Sent: Tuesday, January 31, 2006 11:30 AM Subject: Re: [Assessment] TABE question Hi Andrea, What would I use instead of the TABE if I didn't have to use the TABE? I would use different tools for different purposes: placement, measuring individual progress, and reporting program impact. --For beginning readers, I would use something like the Literacy Volunteers READ Test for all of those purposes. In that test, you ask people to read passages of different levels of difficulty and see whether or not they can actually READ them. --For people who are closer to GED, I would have them take the GED a predictor test. Yes it's still a standardized test, but it's the one our students need to pass to achieve their goals. The TABE is not a very good predictor of GED success. --For everyone's individual progress, I would use portfolios. Portfolios are great for noting individual progress, but I don't think they work for program impact. Accomplishments in portfolios are too all-over-the-place to be aggregated. --Another way I have placed beginner-intermediate people is by having them read a range of texts. For example, after an orientation, students did a one-on-one assessment. They were shown a supermarket circular, a subway map, a job application, a telephone book, and a newspaper. We asked them which ones they could read. Then they were asked to DO things such as tell us how they got to school using the map,try to find their best friend's name in the phone book, and explain how to fill out the application. This took a lot of staff time, but it showed us a lot about students reading levels. Assessment is a great challenge in our field. I have never found a tool I like that is easily administered. The only ones I like take lots of time, and they are not perfect. And even if we devote the time to a good tool, we still have to use stan dardized tests as well to comply with funders. It's a great frustration that there is no useful, valid, easily adminstered assessment tool for all stakeholders: funders, students, staff and others. >From Bruce Carmel AWilder106 at aol.com wrote: Bruce, What would you use? That would satisy both you and funders? Thanks for ANY enlightenment on this problem. 
Andrea ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment Yahoo! Autos. Looking for a sweet ride? Get pricing, reviews, & more on new and used cars. ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Tue Jan 31 13:48:53 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 31 Jan 2006 13:48:53 -0500 Subject: [Assessment] Standardize: the ritual harp In-Reply-To: <20060131181317.29409.qmail@web30202.mail.mud.yahoo.com> Message-ID: <053401c62696$fca353f0$0402a8c0@frodo> Hi Bruce, Thanks for this - and I completely agree with you: for whatever reasons, 'standardized' does seem to equal TABE, CASAS, and BEST within our field. I guess you could say it's a piece of my mission - to convince our field not to think this way or use the term this way any longer. marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Bruce Carmel Sent: Tuesday, January 31, 2006 1:13 PM To: The Assessment Discussion List Subject: Re: [Assessment] Standardize: the ritual harp Hi Marie (and everyone) I hear your point about standardizing assessment tools. Thanks for the posting. When I hear people say "standardized" in this context, it is always "standardized tests." (Except in the posting you just wrote:)) I don't think they mean something such as a standardized system for scoring portfolios. I think "standardized" means TABE, CASAS, etc. in our field. Marie Cora wrote: Hi all, Great conversation going here..looking forward to more. Bruce, thanks so much for your posts. I'm not picking on you, I promise: I'm picking on EVERYONE!!! Bruce, you said: "And even if we devote the time to a good tool, we still have to use standardized tests as well to comply with funders." Many of you are moaning now, thinking: "here she goes yet again." - but it's my job: good tools and bad tools are standardized; portfolios are standardized; quizzes can be standardized; TABE is standardized but so is the writing rubric REEP; in theory, any assessment can be standardized. What everyone rails against is the particular selection of assessments that are available to us today - and while those are standardized, that's not what makes them good, bad, or ugly. Standardized tests should be viewed as positive things because their soul mission in life is to attempt a level playing field. As a field, we object to the paucity of choices, not that those choices are standardized. We also object to the use of materials that are out-dated or do not reflect today's needs. We object to the mis-use of a particular assessment. We should object to incorrect use of data and test results. 
We should object to mis-alignment between curriculum and assessment - which directly speaks to Bruce's (and many many folks') wish that there be an assessment that can serve the purposes of the classroom and program as well as it can serve the purposes of high stakes issues (like funding or career advancement). What we really want are in fact standardized assessments, that's not the issue. The issue is that we don't yet have a thorough selection of tools that meet our complex needs - whether those tools are standardized or not. marie-harp-on-cora Assessment Discussion List Moderator ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/864238d7/attachment.html From bcarmel at rocketmail.com Tue Jan 31 14:00:05 2006 From: bcarmel at rocketmail.com (Bruce Carmel) Date: Tue, 31 Jan 2006 11:00:05 -0800 (PST) Subject: [Assessment] Standardize: the ritual harp In-Reply-To: <053401c62696$fca353f0$0402a8c0@frodo> Message-ID: <20060131190005.61639.qmail@web30209.mail.mud.yahoo.com> Hi List (and Marie) I think Marie's point is very important. It didn't work to talk about assessment tools that weren't TABE-like as "alternative." It didn't work to criticize standardized tests but have no alternative to offer. I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go. Bruce -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/2d882fbf/attachment.html From Shauna.South at schools.utah.gov Tue Jan 31 18:52:41 2006 From: Shauna.South at schools.utah.gov (South, Shauna) Date: Tue, 31 Jan 2006 16:52:41 -0700 Subject: [Assessment] TABE questions Message-ID: <3B603C8007FB954BB1A4FE9B16CC4A1701B18FA9@edell.usoe.k12.ut.us> TABE is not Teaching Adults Beginning Education. It is Test of Adult Basic Education. ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Nancy Hansen Sent: Tuesday, January 31, 2006 10:47 AM To: The Assessment Discussion List Subject: Re: [Assessment] TABE questions Dianna - Isn't there a testing instrument used in the Ohio public school system for (and with) this age group that would be more appropriate than the Teaching Adults Beginning Education tool (TABE)? If reading level is being measured, how do the schools gain that data? In my opinion, these kids are far from adults - even if they are entering the job market. I agree with Maria that it should be questioned whether or not the TABE would provide valid scores when it's designed to be used with adults. If your colleague's study is going to be worth anything at all, the tool used to determine reading levels should be one designed to work with that age group.
With kids this age (particularly if they are having difficulties in school), I even wonder whet her or not timed testing is the way to *get* accurate data. Does the public school system have a different type of evaluation that determines grade level than TABE for special education students? Nancy Hansen adult literacy administrator Sioux Falls Area Literacy Council sfallsliteracy at yahoo.com Sioux Falls, SD Dianna Baycich wrote: I can give one example of when it was used with folks age 13 to 16. In 1991 and 1992 I worked at a Private Industry Council Summer Youth Employment and Training Program. We gave the TABE to all the kids we accepted into the program. We use d the results to determine if they would be placed in tutoring as well as a job or just a job. The reason I asked the question this time is for a collegue of mine who needs an instrument to determine reading level for a study she is doing. She wanted to use the instrument that was commonly used in Ohio. She was originally going to work with adults but had to change her study to use youth. So... a bit more explanation for why I was asking. Thank you to everyone who has responded so far. I hope to hear more. Dianna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, January 31, 2006 12:04 PM To: Assessment Discussion List Subject: [Assessment] TABE questions Hi all, I also just wanted to note that the TABE was not normed on people the age that Dianna inquires after - 13/14 year olds. So wouldn't this make the TABE invalid for use with young teens? For what purpose would the TABE be used with people so young? marie cora Assessment Discussion List Moderator ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ________________________________ What are the most popular cars? Find out at Yahoo! Autos -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060131/1b7e69f4/attachment.html From marie.cora at hotspurpartners.com Wed Feb 1 10:46:37 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 1 Feb 2006 10:46:37 -0500 Subject: [Assessment] Adult Education Reading Resources Message-ID: <05d401c62746$afee1e90$0402a8c0@frodo> Dear List Members: The National Institute for Literacy Partnership for Reading Publications Web page has a couple of resources that are of interest to adult education instructors. One is entitled Applying Research in Reading Instruction for Adults: First Steps for Teachers. The author says the book "aims first to build background knowledge about reading and scientifically based reading instruction." The book is full of student and classroom illustrations and sample instructional activities. The second publication is entitled Teaching Adults to Read. This booklet "describes strategies proven to work by the most rigorous scientific research available on the teaching of reading." It summarizes the trends and principles identified in the 2002 publication Research-Based Principles for Adult Basic Education Reading Instruction. Both publications may be viewed and may be downloaded from: http://www.nifl.gov/partnershipforreading/publications/adult.html Subscribers can also order copies. 
(Note: Applying Research in Reading Instruction for Adults is currently only available online.) National Institute for Literacy at EdPubs PO Box 1398 Jessup, MD 20794-1398 800-228-8813 fax 301-470-1244 edpuborders at edpubs.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060201/79b31bd8/attachment.html From marie.cora at hotspurpartners.com Wed Feb 1 11:09:01 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 1 Feb 2006 11:09:01 -0500 Subject: [Assessment] Correct email address for List - Reminder Message-ID: <05ea01c62749$d198dd70$0402a8c0@frodo> Hi everyone, Now that Mailman has been up and running for a few months we want to make sure everyone is sending emails to the correct address: assessment at nifl.gov Below is the link to the Mailman Help Page where you can access general information regarding all the Discussion Lists and Mailman functions. http://www.nifl.gov/lincs/discussions/help/help_mailman.html Here is the link to the Assessment List information page, where you can change your settings if you want to: http://www.nifl.gov/mailman/listinfo/assessment If you have any questions or are having difficulties receiving or sending posts, please do not hesitate to contact me. Thanks! marie cora Assessment Discussion List Moderator National Institute for Literacy marie.cora at hotspurpartners.com From marie.cora at hotspurpartners.com Thu Feb 2 11:51:38 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 2 Feb 2006 11:51:38 -0500 Subject: [Assessment] Legitimacy of alternative tools Message-ID: <064901c62818$f074a420$0402a8c0@frodo> Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060202/39d342b7/attachment.html From koconnor at framingham.k12.ma.us Thu Feb 2 13:08:50 2006 From: koconnor at framingham.k12.ma.us (Kevin O'Connor) Date: Thu, 2 Feb 2006 13:08:50 -0500 Subject: [Assessment] Legitimacy of alternative tools Message-ID: Hi Marie, Bruce and All, These kinds of constructed response assessments are easier to build that selected-response, but MUCH harder to score. The REEP is one Performance Assessment with which many of us are familiar. 
It is a standardized, constructed-response tool, and I think we can look at its statewide implementation as a bellwether of using more authentic, alternative assessments. In Massachusetts, a lot of time and effort goes into standardizing scorers, initially and continually, in order to ensure that the tool is being used according to its design. Despite the institutional commitment of the DOE, it is a great struggle, perhaps even an act of faith, to ensure that all scorers are aligned. We all know of cases where two scorers, reading the same essay and using the same rubric, show a startling disparity in points awarded. This is not to say that authentic assessment is invalid or undesirable; I feel that they are MORE authentic, valid and desirable... but we need to keep an eye on their reliability. As we put forward the strengths of these tools, we must be ready to acknowledge and pro-actively address their limitations by diligently and thoroughly preparing these tools. This is not as hard as it might sound: we must be sure that the tools we select are designed to actually measure the domain for which we aim and we must make sure that we use them reliably, i.e., with some standardization (which is NOT a four-letter word). We can't just take them off the shelf and expect one size to fit all - that's what gave "standardized testing" its bad name in the first place. Every teacher designs assessments for their own class - I have a great presentation rating form, but it only works for the specific curriculum. I'm sure that others have great things as well and I'd like to get ideas from them; what's the best way to get these out in the field, and discuss where they are appropriate? -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 11:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060202/8031989b/attachment.html From pmcnaughton at language.ca Thu Feb 2 14:30:00 2006 From: pmcnaughton at language.ca (Pauline Mcnaughton) Date: Thu, 2 Feb 2006 14:30:00 -0500 Subject: [Assessment] Legitimacy of alternative tools In-Reply-To: <064901c62818$f074a420$0402a8c0@frodo> Message-ID: <006001c6282f$0fc72080$9a01a8c0@language.ca> I think that alternative assessment tools (portfolios, outcome checklists etc.) are an excellent way to ensure that assessment - relates to what is being taught in the classroom, - focuses on tasks that relate to learner goals and objectives - is supported by teacher/learner conferencing The challenge is to ensure that teachers have good standards-based tools such as (e.g. rubrics, outcome checklists etc.) to inform their assessments, and adequate professional development training and support - so that they are confident in their use of these tools to inform their assessments. I am a particular fan of portfolio assessment such as the European Language Portfolio - which provides a flexible but structured approach based on a common, shared standard. To quote from a report on the European Language Portfolio - portfolios provide "an important interface between language learning, teaching and assessment" and achieve these "invisible learning outcomes ... : - commitment to and ownership of one's language learning: - tolerance of ambiguity and uncertainty in communicative situations and learning - willingness to take risks in order to cope with communicative tasks - learning skills and strategies necessary for continuous, independent language learning reflective basic orientation to language learning, with abilities for self-assessment of language competence[1] ---------------------------------------------------------------------------- ---- [1] Page 13, A European Language Portfolio From piloting to implementation (2001-2004): Consolidated report - Final Version, Rolf Scharer, General Rapporteur, Language Policy Division, Strasbourg Pauline McNaughton Executive Director / Directrice executive Centre for Canadian Language Benchmarks/Centre des niveaux de competence linguistique canadiens 200 Elgin Street, Suite 803 / 200 rue Elgin, piece 803 Ottawa, ON K2P 1L5 T (613) 230-7729 F (613) 230-9305 pmcnaughton at language.ca This communication is intended for the use of the recipient to which it is addressed, and may contain confidential, personal, and or privileged information. Please contact us immediately if you are not the intended recipient of this communication, and do not copy, distribute, or take action relying on it. Any communication received in error, or subsequent reply, should be deleted or destroyed. Le present message n'est destine qu'a la personne ou l'organisme auquel il est adresse et peut contenir de l'information confidentielle, personnelle ou privilegiee. Si vous n'etes pas le destinataire de ce message, informez-nous immediatement. Il est interdit de copier, diffuser ou engager des poursuites fondees sur son contenu. Si vous avez recu ce communique par erreur, ou une reponse subsequente, veuillez le supprimer ou le detruire. 
-----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora Sent: February 2, 2006 11:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060202/7a5b3311/attachment.html From bcarmel at rocketmail.com Thu Feb 2 14:50:51 2006 From: bcarmel at rocketmail.com (Bruce Carmel) Date: Thu, 2 Feb 2006 11:50:51 -0800 (PST) Subject: [Assessment] Legitimacy of AUTHENTIC tools In-Reply-To: Message-ID: <20060202195051.57916.qmail@web30202.mail.mud.yahoo.com> Hello List, Can we call them AUTHENTIC instead of ALTERNATIVE, I know it's semantics, but let's have the semantics work in our favor. Anyway... This is how we do assessment at Turning Point: We use TABE and BEST to report progress to our FUNDERS, and a whole set of assessment tools (including those tests) to report progress to the STUDENTS, the TEACHERS and the PROGRAM including: --Writing samples --Portfolios --Attendance and participation --GED predictor for higher levels --Teachers' assessment of the student skill level. I know that last one is tricky. This is what it means: If a student is breezing through the work in Basic Education 2, but bombs out on the TABE--the teacher can promote him or her to BE 3. There is no Education Gain reported to our funder, but the student moves to the next level class, something she cares about more than her TABE score (usually). I know it would be great if we could use Portfolios or other authentic tools to report programmatic gain, and maybe this discussion will push me to do more on that. But even if I do, it's not going to be recognized by our major (government) funders. 
From Bruce Carmel Kevin O'Connor wrote: Hi Marie, Bruce and All, These kinds of constructed response assessments are easier to build than selected-response, but MUCH harder to score. The REEP is one Performance Assessment with which many of us are familiar. It is a standardized, constructed-response tool, and I think we can look at its statewide implementation as a bellwether of using more authentic, alternative assessments. In Massachusetts, a lot of time and effort goes into standardizing scorers, initially and continually, in order to ensure that the tool is being used according to its design. Despite the institutional commitment of the DOE, it is a great struggle, perhaps even an act of faith, to ensure that all scorers are aligned. We all know of cases where two scorers, reading the same essay and using the same rubric, show a startling disparity in points awarded. This is not to say that authentic assessment is invalid or undesirable; I feel that they are MORE authentic, valid and desirable... but we need to keep an eye on their reliability. As we put forward the strengths of these tools, we must be ready to acknowledge and pro-actively address their limitations by diligently and thoroughly preparing these tools. This is not as hard as it might sound: we must be sure that the tools we select are designed to actually measure the domain for which we aim and we must make sure that we use them reliably, i.e., with some standardization (which is NOT a four-letter word). We can't just take them off the shelf and expect one size to fit all - that's what gave "standardized testing" its bad name in the first place. Every teacher designs assessments for their own class - I have a great presentation rating form, but it only works for the specific curriculum. I'm sure that others have great things as well and I'd like to get ideas from them; what's the best way to get these out in the field, and discuss where they are appropriate?
-----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 11:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment --------------------------------- Do you Yahoo!? With a free 1 GB, there's more in store with Yahoo! Mail. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060202/0c3c4a29/attachment.html From cook.sandra at northlandscollege.sk.ca Thu Feb 2 15:00:17 2006 From: cook.sandra at northlandscollege.sk.ca (Cook.Sandra) Date: Thu, 2 Feb 2006 14:00:17 -0600 Subject: [Assessment] Legitimacy of alternative tools In-Reply-To: <064901c62818$f074a420$0402a8c0@frodo> Message-ID: <004f01c62833$4a5ed590$786dbd0a@nlcadmin.ca> Hi, Well my opinion is that assessment should pertain to the task at hand and be outlined as such, whether you are using a rubric or a checklist. To standardize is to say that all students are learning at the same rate/pace. If your assessment is based on things like content, effort, use of certain language (depending on where your students are), then you will be assessing each individual student on what they are capable of. That is what makes a portfolio such an effective tool in evaluating individual students. Thanks, Sandra Cook Northlands College Technology Enhanced Literacy _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 10:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths.
It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060202/f6390b5e/attachment.html From marie.cora at hotspurpartners.com Fri Feb 3 08:44:44 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 3 Feb 2006 08:44:44 -0500 Subject: [Assessment] Legitimacy of AUTHENTIC tools In-Reply-To: <20060202195051.57916.qmail@web30202.mail.mud.yahoo.com> Message-ID: <070101c628c7$fe3559b0$0402a8c0@frodo> Hi Bruce and all, Actually, a discussion of semantics would be quite welcomed by me. I do believe that part of the difficulty of navigating an already hugely complex system (lack of system?) is that we don't really have a common language together - just look at how often we (I) discuss the term 'standardize'. Some of our terms are clear, but many are not well-defined, or their definitions have shifted over time in response to either politics or research or educational trends. Still other terms have multiple meanings, and folks can interpret those terms within their own contexts - which might be different from the contexts of practitioners in another place. What do folks think about this? What do folks think about Bruce's suggestion that we use 'authentic' for this discussion instead of 'alternative'? How do you understand these two terms? Do you think this matters? Also Bruce, thanks for the outline of your assessment structure at Turning Point. Others - please also let us know how you mix and match Commercial assessments with other types of assessments at your programs. Here's a couple of resources: There is a pretty good-sized Assessment Glossary that can be accessed from either the LINCS Special Collection in Assessment (http://literacy.kent.edu/Midwest/assessment/) or the ALEWiki Assessment area at http://wiki.literacytent.org/index.php/Assessment_Information As a professional on-line community, we could build our own set of definitions that speak directly to the issues that we experience. You can add your own definitions or revise ones that are there at the Wiki right now - at this point, "alternative", "authentic" and "performance" assessment all share the same definition there. Do you agree with this? Here's a good resource that discusses Authentic Assessment in the context of workplace education -it discusses a distinction between alternative and authentic. Using Authentic Assessment in Vocational Education by Rodney Custer et al. ERIC doc - see the first chapter of the book. 
http://www.eric.ed.gov/ERICWebPortal/Home.portal?_nfpb=true&ERICExtSearch_SearchValue_0=Using+Authentic+Assessment+in+Vocational+Education&ERICExtSearch_SearchType_0=kw&_pageLabel=RecordDetails&objectId=0900000b80091a0c Do folks have other resources to share? Thoughts, ideas? Let's hear them! marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Bruce Carmel Sent: Thursday, February 02, 2006 2:51 PM To: The Assessment Discussion List Subject: Re: [Assessment] Legitimacy of AUTHENTIC tools Hello List, Can we call them AUTHENTIC instead of ALTERNATIVE? I know it's semantics, but let's have the semantics work in our favor. Anyway... This is how we do assessment at Turning Point: We use TABE and BEST to report progress to our FUNDERS, and a whole set of assessment tools (including those tests) to report progress to the STUDENTS, the TEACHERS and the PROGRAM, including: --Writing samples --Portfolios --Attendance and participation --GED predictor for higher levels --Teachers' assessment of the student skill level. I know that last one is tricky. This is what it means: If a student is breezing through the work in Basic Education 2, but bombs out on the TABE--the teacher can promote him or her to BE 3. There is no Education Gain reported to our funder, but the student moves to the next level class, something she cares about more than her TABE score (usually). I know it would be great if we could use Portfolios or other authentic tools to report programmatic gain, and maybe this discussion will push me to do more on that. But even if I do, it's not going to be recognized by our major (government) funders. From Bruce Carmel. Kevin O'Connor wrote: Hi Marie, Bruce and All, These kinds of constructed response assessments are easier to build than selected-response, but MUCH harder to score. The REEP is one Performance Assessment with which many of us are familiar. It is a standardized, constructed-response tool, and I think we can look at its statewide implementation as a bellwether of using more authentic, alternative assessments. In Massachusetts, a lot of time and effort goes into standardizing scorers, initially and continually, in order to ensure that the tool is being used according to its design. Despite the institutional commitment of the DOE, it is a great struggle, perhaps even an act of faith, to ensure that all scorers are aligned. We all know of cases where two scorers, reading the same essay and using the same rubric, show a startling disparity in points awarded. This is not to say that authentic assessment is invalid or undesirable; I feel that they are MORE authentic, valid and desirable... but we need to keep an eye on their reliability. As we put forward the strengths of these tools, we must be ready to acknowledge and pro-actively address their limitations by diligently and thoroughly preparing these tools. This is not as hard as it might sound: we must be sure that the tools we select are designed to actually measure the domain for which we aim, and we must make sure that we use them reliably, i.e., with some standardization (which is NOT a four-letter word). We can't just take them off the shelf and expect one size to fit all - that's what gave "standardized testing" its bad name in the first place. Every teacher designs assessments for their own class - I have a great presentation rating form, but it only works for the specific curriculum.
I'm sure that others have great things as well and I'd like to get ideas from them; what's the best way to get these out in the field, and discuss where they are appropriate? -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 11:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment _____ Do you Yahoo!? With a free 1 GB, there's more in store with Yahoo! Mail. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060203/edbd24b0/attachment.html From marie.cora at hotspurpartners.com Fri Feb 3 08:56:26 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 3 Feb 2006 08:56:26 -0500 Subject: [Assessment] Legitimacy of alternative tools In-Reply-To: <004f01c62833$4a5ed590$786dbd0a@nlcadmin.ca> Message-ID: <070601c628c9$a0e6b6d0$0402a8c0@frodo> Hi Sandra, thanks so much for your post. You said: "To standardize is to say that all students are learning at the same rate/pace." This is not correct. To standardize does not speak to the outcomes of the students' learning. It speaks to the inputs of developing a test that tries to be fair to all students. A standardized test precisely will NOT take into consideration differing rates or pace or anything else - because if it did, then you would start introducing bias. A correct statement would be: "To standardize is to say that all students are provided an equal opportunity to demonstrate their knowledge, skill, or performance." marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Cook.Sandra Sent: Thursday, February 02, 2006 3:00 PM To: 'The Assessment Discussion List' Subject: Re: [Assessment] Legitimacy of alternative tools Hi, Well my opinion is that assessment should pertain to the task at hand and be outlined as such, whether you are using a rubric or a checklist.
To standardize is to say that all students are learning at the same rate/pace. If your assessment is based on things like content, effort, use of certain language (depending where your students are, then you will be assessing each individual student on what they are capable of. That is what makes a portfolio such an effective tool in evaluating individual students. Thanks, Sandra Cook Northlands College Technology Enhanced Literacy _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 10:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060203/c8407b03/attachment.html From r.millar at uwinnipeg.ca Fri Feb 3 09:24:51 2006 From: r.millar at uwinnipeg.ca (Robin Millar) Date: Fri, 03 Feb 2006 08:24:51 -0600 Subject: [Assessment] Legitimacy of alternative tools Message-ID: In Manitoba we use a guided portfolio for people to demonstrate progress and skill development in reading text, document use, writing and oral communications. We have three separate levels of portfolio... the highest is transferable to the adult high school diploma. So students with those kinds of goals have something to "work for." The first two levels give more basic literacy students a certificate at the end which is also good for students wanting some demonstration of success when they might take years to get their GED. The website for more information is: http://www.edu.gov.mb.ca/aet/all/publications/Stages/stages.htm Robin Millar >>> marie.cora at hotspurpartners.com 02/02/06 10:51 AM >>> Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. 
What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator Dr. Robin Millar Executive Director Centre for Education and Work 515 Portage Avenue Winnipeg, MB R3B 2E9 204-786-9395 From AWilder106 at aol.com Fri Feb 3 09:39:18 2006 From: AWilder106 at aol.com (AWilder106 at aol.com) Date: Fri, 3 Feb 2006 09:39:18 EST Subject: [Assessment] Legitimacy of AUTHENTIC tools Message-ID: <2c5.3048ef2.3114c516@aol.com> marie-- I would go for "assessment" to cover all assessments. Sub-headings could be used for different types of assessments, e.g., standardized tests, portfolios, attendance. Then code each assessment as to who wants it and who gets it. Put in cost, too, if that is a critical variable. andrea -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060203/5a1cc8b1/attachment.html From ezl109 at psu.edu Fri Feb 3 09:47:18 2006 From: ezl109 at psu.edu (Eugenio Longoria Sáenz) Date: Fri, 3 Feb 2006 09:47:18 -0500 Subject: [Assessment] Legitimacy of alternative tools In-Reply-To: <070601c628c9$a0e6b6d0$0402a8c0@frodo> Message-ID: <000f01c628d0$bb28ce00$e72d7680@cham231> I do not really agree with either statement. I agree that the first statement is not addressing outcomes and that standardization is about input. However, I disagree with the words "equal opportunity" in the second statement. Educational inequality and the achievement gap are very real things. Regardless of standardization of input, there is still the issue of equal opportunity, which translates to access to the same resources, qualified teachers, adequate learning environments, supportive social structures (family, friends, work, etc.). It is not about the standard; it is about all the other stuff in society that we have not taken care of. Not too long ago all people were given the right to vote, but the trick was that they had to prove they could read and write, and not too long before that they had to be property owners (I hope you know where I am going with this). Well, we have not gotten rid of the vote because this is fundamentally important in a democratic society, but we have fought to equalize and in some cases eliminate some barriers to the right to vote completely. Standards are not the problem; we should not have to get rid of them. It is the inequality and the prejudices motivated by race, economics, and social position that continue to be a problem. I guess what I am saying is that our fight against the standard is misdirected. We should be fighting to eliminate those things that are keeping many from meeting the standards. I hope I made some sense; I tend not to many times.
Eu- _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Friday, February 03, 2006 8:56 AM To: cook.sandra at northlandscollege.sk.ca; 'The Assessment Discussion List' Subject: Re: [Assessment] Legitimacy of alternative tools Hi Sandra, thanks so much for your post. You said: "To standardize is to say that all students are learning at the same rate/pace." This is not correct. To standardize does not speak to the outcomes of the students' learning. It speaks to the inputs of developing a test that tries to be fair to all students. A standardized test precisely will NOT take into consideration differing rates or pace or anything else - because if it did, then you would start introducing bias. A correct statement would be: "To standardize is to say that all students are provided an equal opportunity to demonstrate their knowledge, skill, or performance." marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Cook.Sandra Sent: Thursday, February 02, 2006 3:00 PM To: 'The Assessment Discussion List' Subject: Re: [Assessment] Legitimacy of alternative tools Hi, Well my opinion is that assessment should pertain to the task at hand and be outlined as such, whether you are using a rubric or a checklist. To standardize is to say that all students are learning at the same rate/pace. If your assessment is based on things like content, effort, use of certain language (depending on where your students are), then you will be assessing each individual student on what they are capable of. That is what makes a portfolio such an effective tool in evaluating individual students. Thanks, Sandra Cook Northlands College Technology Enhanced Literacy _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 10:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060203/b12c2c78/attachment.html From marie.cora at hotspurpartners.com Fri Feb 3 17:52:04 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 3 Feb 2006 17:52:04 -0500 Subject: [Assessment] Legitimacy of alternative tools In-Reply-To: <000f01c628d0$bb28ce00$e72d7680@cham231> Message-ID: <000701c62914$7465ef40$0402a8c0@frodo> Hi Eugenio, Thanks so much for your post. You are talking about the standard Opportunity to Learn (OTL), and it is a very real and important standard. Perhaps the most important one, but unfortunately in ABE, the one that commands the least resources. It is true that even the mechanisms that strive to be the most fair are always going to be limited by their environment. What do others have to say about this piece of the equation? For a good, succinct reading on standards-based reform including a discussion of OTL, see: A User's Guide to Standards-Based Educational Reform: From Theory to Practice by Regie Stites http://www.ncsall.net/?id=352 marie -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Eugenio Longoria Sáenz Sent: Friday, February 03, 2006 9:47 AM To: 'The Assessment Discussion List' Subject: Re: [Assessment] Legitimacy of alternative tools I do not really agree with either statement. I agree that the first statement is not addressing outcomes and that standardization is about input. However, I disagree with the words "equal opportunity" in the second statement. Educational inequality and the achievement gap are very real things. Regardless of standardization of input, there is still the issue of equal opportunity, which translates to access to the same resources, qualified teachers, adequate learning environments, supportive social structures (family, friends, work, etc.). It is not about the standard; it is about all the other stuff in society that we have not taken care of. Not too long ago all people were given the right to vote, but the trick was that they had to prove they could read and write, and not too long before that they had to be property owners (I hope you know where I am going with this). Well, we have not gotten rid of the vote because this is fundamentally important in a democratic society, but we have fought to equalize and in some cases eliminate some barriers to the right to vote completely. Standards are not the problem; we should not have to get rid of them. It is the inequality and the prejudices motivated by race, economics, and social position that continue to be a problem. I guess what I am saying is that our fight against the standard is misdirected. We should be fighting to eliminate those things that are keeping many from meeting the standards. I hope I made some sense; I tend not to many times. Eu- _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Friday, February 03, 2006 8:56 AM To: cook.sandra at northlandscollege.sk.ca; 'The Assessment Discussion List' Subject: Re: [Assessment] Legitimacy of alternative tools Hi Sandra, thanks so much for your post. You said: "To standardize is to say that all students are learning at the same rate/pace." This is not correct. To standardize does not speak to the outcomes of the students' learning. It speaks to the inputs of developing a test that tries to be fair to all students. A standardized test precisely will NOT take into consideration differing rates or pace or anything else -
because if it did, then you would start introducing bias. A correct statement would be: "To standardize is to say that all students are provided an equal opportunity to demonstrate their knowledge, skill, or performance." marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Cook.Sandra Sent: Thursday, February 02, 2006 3:00 PM To: 'The Assessment Discussion List' Subject: Re: [Assessment] Legitimacy of alternative tools Hi, Well my opinion is that assessment should pertain to the task at hand and be outlined as such, whether you are using a rubric or a checklist. To standardize is to say that all students are learning at the same rate/pace. If your assessment is based on things like content, effort, use of certain language (depending on where your students are), then you will be assessing each individual student on what they are capable of. That is what makes a portfolio such an effective tool in evaluating individual students. Thanks, Sandra Cook Northlands College Technology Enhanced Literacy _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 10:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060203/c6c80dba/attachment.html From Ajit.Gopalakrishnan at po.state.ct.us Sat Feb 4 12:42:19 2006 From: Ajit.Gopalakrishnan at po.state.ct.us (Gopalakrishnan, Ajit) Date: Sat, 4 Feb 2006 12:42:19 -0500 Subject: [Assessment] Legitimacy of alternative tools Message-ID: <6D87A5CD4E209448BBF7CCFB41D57C0E026013D1@DOIT-EX302.exec.ds.state.ct.us> Marie, et al, By "alternative", I presume you mean that these assessment options are an alternative to multiple-choice assessments. Is that a fair inference? I sometimes refer to alternative assessments as non-multiple choice assessments, just to make clear what I am talking about. From my perspective, referring to them as authentic seems to muddy this discussion.
Webster provides the following two definitions for authentic which may help to illustrate my thinking: a) worthy of acceptance or belief as conforming to or based on fact b) true to one's own personality, spirit, or character So for example, a student's CASAS scale score in math (say 212) from a multiple choice test may be worthy of acceptance of a person's math ability. An analysis of the test item responses may even provide greater information about a person's strengths and weaknesses. However, they cannot say much about how the student perceives the relation of "math" to his/her own personality and life. Two students at entry might both achieve a score of 207 in math for very different reasons. One student might have liked math, viewed herself as being capable of learning math but just not used it for many years. The other student might have never liked math, generally seen herself as having other strengths, but been forced to use math as part of her job. To ascertain this type of information, the teacher might have to talk to the student and find out the student's past experiences with math, the student's perceptions of its importance in his/her life, etc. Then, a custom assessment/project can be designed that is meaningful and authentic to that particular student. From my perspective, all standardization (whether multiple-choice or non-multiple choice assessments) will to some extent reduce the authenticity for the student. The CASAS system attempts to address this by providing assessments that are relevant to adults and based in various contexts (life skills, employability skills, workforce learning, citizenship, etc.) so that the student can be assessed in contexts that are somewhat authentic to their experiences and goals. Therefore, I prefer the term alternative assessments because then we can focus our discussion on the differences between multiple choice assessments and non-multiple choice assessments. There is no question that non-multiple choice assessments can be legitimate and have many strengths. For example, Connecticut is currently piloting a CASAS workplace speaking assessment. This is a standardized assessment designed for ESL learners who are currently working to demonstrate their listening and speaking abilities in a workplace context. Compared to the CASAS listening multiple-choice assessments which we have used over the years, the speaking assessment has the potential for the instructor to gain a greater understanding of a student's strengths and weaknesses. Students also seem to enjoy taking the assessment. However, it needs to be administered one-on-one, unlike the listening, which can be group administered. The speaking assessment also places a greater training and certification burden on the test administrator and scorer. We have experienced many of these challenges with our statewide implementation of the CASAS Functional Writing Assessment over the past few years. Kevin alluded to some of those challenges such as maintaining scorer certification and interrater reliability. The scoring rubrics used in both the writing and the speaking assessments can be valuable tools for classroom instruction. In my opinion, at least some non-multiple choice assessments should be standardized so that they can be used to broaden the array of assessments available for state-level reporting/accountability. Thanks.
Ajit Ajit Gopalakrishnan Education Consultant Connecticut Department of Education 25 Industrial Park Road Middletown, CT 06457 Tel: (860) 807-2125 Fax: (860) 807-2062 ajit.gopalakrishnan at po.state.ct.us ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 11:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -- No virus found in this incoming message. Checked by AVG Free Edition. Version: 7.1.375 / Virus Database: 267.15.0/248 - Release Date: 2/1/2006 From hdooley at riral.org Sat Feb 4 14:58:33 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Sat, 04 Feb 2006 14:58:33 -0500 Subject: [Assessment] Standardized assessment of lowest literacy level In-Reply-To: <000701c62914$7465ef40$0402a8c0@frodo> References: <000701c62914$7465ef40$0402a8c0@frodo> Message-ID: <43E50769.9@riral.org> A recent point has come up at discussions of assessments that I would like to delve into. In our local discussion of standardized assessments, the statement has been made that there are no standardized assessments that are appropriate for learners at or near the lowest levels of literacy. As examples, this includes the CASAS, the TABE, and the BEST, which you might agree are reliable and valid for intermediate to advanced learners. This is for either ABE or ESOL learners. I'm simply curious to ask and hear from others: (1) Do you agree with this statement, and why or why not? (2) What do you think an appropriate standardized assessment for that level should include or look like? and, (3) Do you think we could come up with a specific enough list for a program, publisher or vendor to work through creating a standardized assessment that would meet this need? I appreciate your assistance, so that I can think this through sincerely and deeply. Howard. 
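Ajit's earlier point about analyzing test item responses (that the pattern of right and wrong answers behind a single scale score can say something about a student's strengths and weaknesses) can be made concrete with a small sketch. Everything in it is hypothetical: the item names, skill groupings, and scores are invented for illustration and are not drawn from CASAS or any other instrument. It simply groups scored responses by the skill each item targets and reports percent correct per skill.

# Hypothetical scored item responses (1 = correct, 0 = incorrect) grouped by
# the skill each item targets, so one total score can be unpacked into
# relative strengths and weaknesses. Item and skill names are invented.
responses = {
    "measurement_01": 1, "measurement_02": 0, "measurement_03": 0,
    "fractions_01": 1, "fractions_02": 1, "fractions_03": 1,
    "word_problems_01": 0, "word_problems_02": 1,
}

by_skill = {}
for item, score in responses.items():
    skill = item.rsplit("_", 1)[0]   # "measurement_01" -> "measurement"
    by_skill.setdefault(skill, []).append(score)

print(f"total: {sum(responses.values())}/{len(responses)} items correct")
for skill, scores in sorted(by_skill.items()):
    pct = 100 * sum(scores) / len(scores)
    print(f"  {skill:<15} {sum(scores)}/{len(scores)} correct ({pct:.0f}%)")

Two students with the same total could still show very different profiles here, which is Ajit's point: the item-level view informs instruction even when the reported score alone does not.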
From marie.cora at hotspurpartners.com Sun Feb 5 08:30:23 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 5 Feb 2006 08:30:23 -0500 Subject: [Assessment] Legitimacy of alternative tools In-Reply-To: <6D87A5CD4E209448BBF7CCFB41D57C0E026013D1@DOIT-EX302.exec.ds.state.ct.us> Message-ID: <002b01c62a58$5214a090$0402a8c0@frodo> Hi Ajit and everyone, Yeah, that's a good question you posed to me Ajit: I guess you are right in saying that I do think about 'alternative' as referring to any assessment that is not multiple choice. Actually, the terms I use in my head to separate this stuff out are selected-response and constructed response. Selected response describes that situation: the person must choose (select) from a set of answers (responses) which one they think is the right one. That's pretty tightly wrapped up in terms of what that means: you get the list of answers, you look at the choices, you determine which item on the list you think is right. Constructed response also describes the situation precisely: the person must recall info or build for themselves (construct) the answer to a particular question. No choices are given for the person to consider - they are not selecting anything. The other thing that is hugely useful about using this term is that it is not prescriptive in how big or small a response must be constructed. So for example, many people think that a 'performance assessment' (which is a constructed response: because you are demonstrating your performance) must necessarily entail something big, lengthy, intense, etc. But in fact, a constructed response might entail just one word (as long as you are not selecting that word from a list). Here's a great example: you know what a 'cloze' exercise is? Those fill-in-the-blank worksheets that can test you on vocab or grammar? Well, that is a performance assessment, even though you are only filling in one word here and there. I like to think about these notions this way because they are devoid of other distractors - for example, there is no mention of standardization with selected or constructed response, that is a whole other step in the process. And if you continue to think about selected response as 'multiple choice' then I bet you a dime you just fall back on equating multiple choice with TABE - and that is just not correct at all. While the TABE is an EXAMPLE of a multiple choice test - one does not equal the other. A couple of questions back to you Ajit and to all the subscribers: - Ajit, you made some really thoughtful comments in your arguments against using authentic assessment - what do others think of Ajit's point of view? - Ajit, you said: "In my opinion, at least some non-multiple choice assessments should be standardized so that they can be used to broaden the array of assessments available for state-level reporting/accountability." Folks - can anyone give us any examples of what Ajit describes above? Let's see if we can develop a growing list of the assessments being used that are different - I'll start by adding the REEP Writing Rubric to the list - it is standardized, it is a constructed response test, and at least Massachusetts uses it for reporting writing gains to the feds. Also, Andrea Wilder (post on 2/3) suggested that we use Assessment for all types of 'tests' but that we divide that into sub-headings that list the various types, and include information on who wants the data from said test and who gets that data. 
We do have some amount of info listed on types of tests and costs, but we don't have a whole lot of info on who actually gets the test data and what it gets used for. What do folks think about this?...I'm intrigued.... Robin Millar (post on 2/3) describes a guided portfolio in use in Manitoba that sounds interesting: it has several levels to it. Robin - are parts of the portfolio standardized? The whole thing? Does the portfolio include both selected response and constructed response types of assessments and info? Ok, enough chatter from me for a Sunday morning. Hope everyone is having a lovely weekend, and see you again tomorrow, Marie cora Assessment Discussion List Moderator For definitions see: http://wiki.literacytent.org/index.php/Assessment_Information#Assessment_Glossary For a bunch of details and info on Commercial Assessments (which do not discuss the uses of data, but should!), go to: http://wiki.literacytent.org/index.php/Commercially_Available_Assessment_Tools To help me develop the Wiki section on Alternative Assessment, go to: http://wiki.literacytent.org/index.php/Alternative_Assessment To make informed choices about test selection, go to: http://wiki.literacytent.org/index.php/Selecting_Assessment_Tools -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gopalakrishnan, Ajit Sent: Saturday, February 04, 2006 12:42 PM To: The Assessment Discussion List Subject: Re: [Assessment] Legitimacy of alternative tools Marie, et al, By "alternative", I presume you mean that these assessment options are an alternative to multiple-choice assessments. Is that a fair inference? I sometimes refer to alternative assessments as non-multiple choice assessments, just to make clear what I am talking about. From my perspective, referring to them as authentic seems to muddy this discussion. Webster provides the following two definitions for authentic which may help to illustrate my thinking: a) worthy of acceptance or belief as conforming to or based on fact b) true to one's own personality, spirit, or character So for example, a student's CASAS scale score in math (say 212) from a multiple choice test may be worthy of acceptance of a person's math ability. An analysis of the test item responses may even provide greater information about a person's strengths and weaknesses. However, they cannot say much about how the student perceives the relation of "math" to his/her own personality and life. Two students at entry might both achieve a score of 207 in math for very different reasons. One student might have liked math, viewed herself as being capable of learning math but just not used it for many years. The other student might have never liked math, generally seen herself as having other strengths, but been forced to use math as part of her job. To ascertain this type of information, the teacher might have to talk to the student and find out the student's past experiences with math, the student's perceptions of its importance in his/her life, etc. Then, a custom assessment/project can be designed that is meaningful and authentic to that particular student. From my perspective, all standardization (whether multiple-choice or non-multiple choice assessments) will to some extent reduce the authenticity for the student. The CASAS system attempts to address this by providing assessments that are relevant to adults and based in various contexts (life skills, employability skills, workforce learning, citizenship, etc.)
so that the student can be assessed in contexts that are somewhat authentic to their experiences and goals. Therefore, I prefer the term alternative assessments because then we can focus our discussion on the differences between multiple choice assessments and non-multiple choice assessments. There is no question that non-multiple choice assessments can be legitimate and have many strengths. For example, Connecticut is currently piloting a CASAS workplace speaking assessment. This is a standardized assessment designed for ESL learners who are currently working to demonstrate their listening and speaking abilities in a workplace context. Compared to the CASAS listening multiple-choice assessments which we have used over the years, the speaking assessment has the potential for the instructor to gain a greater understanding of a student's strengths and weaknesses. Students also seem to enjoy taking the assessment. However, it needs to be administered one-on-one, unlike the listening, which can be group administered. The speaking assessment also places a greater training and certification burden on the test administrator and scorer. We have experienced many of these challenges with our statewide implementation of the CASAS Functional Writing Assessment over the past few years. Kevin alluded to some of those challenges such as maintaining scorer certification and interrater reliability. The scoring rubrics used in both the writing and the speaking assessments can be valuable tools for classroom instruction. In my opinion, at least some non-multiple choice assessments should be standardized so that they can be used to broaden the array of assessments available for state-level reporting/accountability. Thanks. Ajit Ajit Gopalakrishnan Education Consultant Connecticut Department of Education 25 Industrial Park Road Middletown, CT 06457 Tel: (860) 807-2125 Fax: (860) 807-2062 ajit.gopalakrishnan at po.state.ct.us ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 11:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you. What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -- No virus found in this incoming message. Checked by AVG Free Edition.
Version: 7.1.375 / Virus Database: 267.15.0/248 - Release Date: 2/1/2006 ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Sun Feb 5 09:01:05 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 5 Feb 2006 09:01:05 -0500 Subject: [Assessment] Standardized assessment of lowest literacy level In-Reply-To: <43E50769.9@riral.org> Message-ID: <002c01c62a5c$9c0ea0c0$0402a8c0@frodo> Hi Howard, how are you? Things moving along in Rhode Island, eh? Great question you have posed for us here Howard. Massachusetts is developing its own low level ABE reading test (it's in pilot right now) because we do agree with the statement that the available commercial (as opposed to standardized :0) tests do no justice at measuring gain at lower levels. We're also in the midst of developing a math test too. (Let me go get some links for you to check out - I'm having a hard time finding what I want to send you all at this moment....must be a Sunday morning thing...) As for your question regarding what such a test would look like, the one in pilot was developed from the Mass. curriculum frameworks (aligning curriculum, instruction, and assessment). The math test will also be aligned with the state curriculum frameworks of course. Finally, you said: "(3) Do you think we could come up with a specific enough list for a program, publisher or vendor to work through creating a standardized assessment that would meet this need?" Ok, ONLY my opinion here so let's hear from everyone, but I do not believe so. In order for something like this to happen, we first need to have national standards, which we do not have. States are developing their own standards, and so an appropriate test can be developed for use in-state (like in Mass.). But Mass. tests would be worthless for a state that doesn't use our curriculum frameworks. So I'm not holding my breath. However! There are interesting things afoot! I recently attended the EFF (Equipped for the Future) conference in December and learned that ETS (Educational Testing Service) is right now running a project to develop 3 assessments based on both the EFF framework and the ARCS Reading Profile from Harvard (http://www.nifl.gov/readingprofiles/). So this could answer part of your question Howard regarding #3 above maybe. The 3 tests will be focused on the areas of "reads with understanding", "reading components measure", and "uses math to solve problems". They actually are in phase II of this project, in which they need to recruit 10 states at least to sign on for developing the tests themselves. So this is actually a direct attempt to develop assessments based on the curricula and framework of EFF and ARCS. Hold onto your hats for this last bit folks! I'm organizing a discussion and Q&A as we speak with the folks at ETS on this project. I'll send out some info on the project soon, and let you know when ETS folks will be available to answer your questions and hold discussion. Thanks everyone and I hope that you are enjoying your weekend, marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Howard L. Dooley, Jr. 
Sent: Saturday, February 04, 2006 2:59 PM To: The Assessment Discussion List Subject: [Assessment] Standardized assessment of lowest literacy level A recent point has come up at discussions of assessments that I would like to delve into. In our local discussion of standardized assessments, the statement has been made that there are no standardized assessments that are appropriate for learners at or near the lowest levels of literacy. As examples, this includes the CASAS, the TABE, and the BEST, which you might agree are reliable and valid for intermediate to advanced learners. This is for either ABE or ESOL learners. I'm simply curious to ask and hear from others: (1) Do you agree with this statement, and why or why not? (2) What do you think an appropriate standardized assessment for that level should include or look like? and, (3) Do you think we could come up with a specific enough list for a program, publisher or vendor to work through creating a standardized assessment that would meet this need? I appreciate your assistance, so that I can think this through sincerely and deeply. Howard. ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Sun Feb 5 10:34:32 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 5 Feb 2006 10:34:32 -0500 Subject: [Assessment] Links to Mass. test development info Message-ID: <003701c62a69$a999f610$0402a8c0@frodo> Hi all, found some links for you regarding the tests being developed here in Massachusetts: Curriculum and Assessment Updates from Adult and Community Learning Services (ACLS) See the May 2005 mailing: http://www.doe.mass.edu/acls/mailings/2005/0513/assess.html This mailing includes a timeline for the reading and math test development. The Nov 2005 mailing: http://www.doe.mass.edu/acls/mailings/2005/1111/assess.html This mailing briefly outlines some revised info based on the Fall pilot. marie -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060205/4dff098a/attachment.html From Tina_Luffman at yc.edu Sun Feb 5 10:54:42 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Sun, 5 Feb 2006 08:54:42 -0700 Subject: [Assessment] Standardized assessment of lowest literacy level Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060205/5283e5f4/attachment.html From Ajit.Gopalakrishnan at po.state.ct.us Sun Feb 5 12:03:51 2006 From: Ajit.Gopalakrishnan at po.state.ct.us (Gopalakrishnan, Ajit) Date: Sun, 5 Feb 2006 12:03:51 -0500 Subject: [Assessment] Legitimacy of alternative tools Message-ID: <281DD0D97E3EC94FB83030B1379CE426D05C55@DOIT-EX302.exec.ds.state.ct.us> Hi Marie, Thanks for your response. I like your labels of selected response and constructed response. I guess by alternative, you were really referring to constructed response. I agree that there are selected response options that are not just multiple choice - I presume you might have been referring to matching, true false, etc. I wonder what the NAAL used. 
With respect to your second question, I thought that I had mentioned in my earlier email two examples of standardized constructed response assessment options that we are using in Connecticut: (i) the CASAS Functional Writing Assessment and (ii) the CASAS Workplace Speaking Assessment. The former is currently reportable to the NRS while the latter is in the midst of being correlated to the NRS levels and will be reportable shortly. I would love to hear what other states are using/considering. Ajit ________________________________ From: assessment-bounces at nifl.gov on behalf of Marie Cora Sent: Sun 2/5/2006 8:30 AM To: 'The Assessment Discussion List' Subject: Re: [Assessment] Legitimacy of alternative tools Hi Ajit and everyone, Yeah, that's a good question you posed to me Ajit: I guess you are right in saying that I do think about 'alternative' as referring to any assessment that is not multiple choice. Actually, the terms I use in my head to separate this stuff out are selected-response and constructed response. Selected response describes that situation: the person must choose (select) from a set of answers (responses) which one they think is the right one. That's pretty tightly wrapped up in terms of what that means: you get the list of answers, you look at the choices, you determine which item on the list you think is right. Constructed response also describes the situation precisely: the person must recall info or build for themselves (construct) the answer to a particular question. No choices are given for the person to consider - they are not selecting anything. The other thing that is hugely useful about using this term is that it is not prescriptive in how big or small a response must be constructed. So for example, many people think that a 'performance assessment' (which is a constructed response: because you are demonstrating your performance) must necessarily entail something big, lengthy, intense, etc. But in fact, a constructed response might entail just one word (as long as you are not selecting that word from a list). Here's a great example: you know what a 'cloze' exercise is? Those fill-in-the-blank worksheets that can test you on vocab or grammar? Well, that is a performance assessment, even though you are only filling in one word here and there. I like to think about these notions this way because they are devoid of other distractors - for example, there is no mention of standardization with selected or constructed response, that is a whole other step in the process. And if you continue to think about selected response as 'multiple choice' then I bet you a dime you just fall back on equating multiple choice with TABE - and that is just not correct at all. While the TABE is an EXAMPLE of a multiple choice test - one does not equal the other. A couple of questions back to you Ajit and to all the subscribers: - Ajit, you made some really thoughtful comments in your arguments against using authentic assessment - what do others think of Ajit's point of view? - Ajit, you said: "In my opinion, at least some non-multiple choice assessments should be standardized so that they can be used to broaden the array of assessments available for state-level reporting/accountability." Folks - can anyone give us any examples of what Ajit describes above? 
Let's see if we can develop a growing list of the assessments being used that are different - I'll start by adding the REEP Writing Rubric to the list - it is standardized, it is a constructed response test, and at least Massachusetts uses it for reporting writing gains to the feds. Also, Andrea Wilder (post on 2/3) suggested that we use Assessment for all types of 'tests' but that we divide that into sub-headings that list the various types, and include information on who wants the data from said test and who gets that data. We do have some amount of info listed on types of tests and costs, but we don't have a whole lot of info on who actually gets the test data and what it gets used for. What do folks think about this?...I'm intrigued.... Robin Millar (post on 2/3) describes a guided portfolio in use in Manitoba that sounds interesting: it has several levels to it. Robin - are parts of the portfolio standardized? The whole thing? Does the portfolio include both selected response and constructed response types of assessments and info? Ok, enough chatter from me for a Sunday morning. Hope everyone is having a lovely weekend, and see you again tomorrow, Marie cora Assessment Discussion List Moderator For definitions see: http://wiki.literacytent.org/index.php/Assessment_Information#Assessment_Glossary For details and info on Commercial Assessments (which do not yet discuss the uses of data, but should!), go to: http://wiki.literacytent.org/index.php/Commercially_Available_Assessment_Tools To help me develop the Wiki section on Alternative Assessment, go to: http://wiki.literacytent.org/index.php/Alternative_Assessment To make informed choices about test selection, go to: http://wiki.literacytent.org/index.php/Selecting_Assessment_Tools -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gopalakrishnan, Ajit Sent: Saturday, February 04, 2006 12:42 PM To: The Assessment Discussion List Subject: Re: [Assessment] Legitimacy of alternative tools Marie, et al, By "alternative", I presume you mean that these assessment options are an alternative to multiple-choice assessments. Is that a fair inference? I sometimes refer to alternative assessments as non-multiple choice assessments, just to make clear what I am talking about. From my perspective, referring to them as authentic seems to muddy this discussion. Webster provides the following two definitions for authentic which may help to illustrate my thinking: a) worthy of acceptance or belief as conforming to or based on fact b) true to one's own personality, spirit, or character So for example, a student's CASAS scale score in math (say 212) from a multiple choice test may be worthy of acceptance of a person's math ability. An analysis of the test item responses may even provide greater information about a person's strengths and weaknesses. However, they cannot say much about how the student perceives the relation of "math" to his/her own personality and life. Two students at entry might both achieve a score of 207 in math for very different reasons. One student might have liked math, viewed herself as being capable of learning math but just not used it for many years. The other student might have never liked math, generally seen herself as having other strengths, but been forced to use math as part of her job.
To ascertain this type of information, the teacher might have to talk to the student and find out the student's past experiences with math, the student's perceptions of its importance in his/her life, etc. Then, a custom assessment/project can be designed that is meaningful and authentic to that particular student. From my perspective, all standardization (whether multiple-choice or non-multiple choice assessments) will to some extent reduce the authenticity for the student. The CASAS system attempts to address this by providing assessments that are relevant to adults and based in various contexts (life skills, employability skills, workforce learning, citizenship, etc.) so that the student can be assessed in contexts that are somewhat authentic to their experiences and goals. Therefore, I prefer the term alternative assessments because then we can focus our discussion on the differences between multiple choice assessments and non-multiple choice assessments. There is no question that non-multiple choice assessments can be legitimate and have many strengths. For example, Connecticut is currently piloting a CASAS workplace speaking assessment. This is a standardized assessment designed for ESL learners who are currently working, to demonstrate their listening and speaking abilities in a workplace context. Compared to the CASAS listening multiple-choice assessments which we have used over the years, the speaking assessment has the potential for the instructor to gain a greater understanding of a student's strengths and weaknesses. Students also seem to enjoy taking the assessment. However, it needs to be administered one-on-one unlike the listening which can be group administered. The speaking assessment also places a greater training and certification burden on the test administrator and scorer. We have experienced many of these challenges with our statewide implementation of the CASAS Functional Writing Assessment over the past few years. Kevin alluded to some of those challenges such as maintaining scorer certification and interrater reliability. The scoring rubrics used in both the writing and the speaking assessments can be valuable tools for classroom instruction. In my opinion, at least some non-multiple choice assessments should be standardized so that they can be used to broaden the array of assessments available for state-level reporting/accountability. Thanks. Ajit Ajit Gopalakrishnan Education Consultant Connecticut Department of Education 25 Industrial Park Road Middletown, CT 06457 Tel: (860) 807-2125 Fax: (860) 807-2062 ajit.gopalakrishnan at po.state.ct.us ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, February 02, 2006 11:52 AM To: Assessment Discussion List Subject: [Assessment] Legitimacy of alternative tools Hi Bruce and everyone, Bruce, you said: "I think putting forth the strengths and legitimacy of tools such as portfolios, outcome checklists, holistically scored writing samples, etc is a good way to go." This sounds like a very good path to go down to me. I think people would have a lot to say and share about alternative tools, their uses, and their strengths. It would be a great exercise to list them all out and discuss the strengths, uses, and limitations of each one. What questions do folks have about alternative assessments?: using them, seeking them out, developing them, whatever area most intrigues you.
What can folks share with the rest of us in terms of "the strengths and legitimacy" of alternative tools such as portfolios, checklists, analytic/holistic scoring, rubric use, writing samples, in-take/placement processes? Are any of the tools you use standardized? Not standardized? Do you think that this is important? Why or why not? Are any of the tools used for both classroom and program purposes? I have other questions for you, but let's leave it at that for right now. Let us hear what your thoughts are. We're looking forward to it. Thanks, marie cora Assessment Discussion List Moderator -- No virus found in this incoming message. Checked by AVG Free Edition. Version: 7.1.375 / Virus Database: 267.15.0/248 - Release Date: 2/1/2006 ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 13289 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060205/db79395a/attachment.bin From marie.cora at hotspurpartners.com Mon Feb 6 10:27:34 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 6 Feb 2006 10:27:34 -0500 Subject: [Assessment] Adolescents in the Adult ESOL Classroom Message-ID: <002301c62b31$dad127d0$0402a8c0@frodo> Hello All, The following post is submitted on behalf of Lynda Terrill *********** The Adult English Language Learners discussion list is planning an online discussion on adolescent learners in adult ESL/ESOL classes from February 8-14, 2006. To join the English language list, please go to http://www.nifl.gov/lincs/discussions/discussions.html and follow the directions for subscribing. ********* Adolescent English language learners (ELLs) are a growing population in secondary schools and a steady presence in postsecondary (adult) education programs. Many of you have experienced the unique characteristics and needs that adolescent ELLs present in the adult ESL classroom. Like their adult counterparts, some of these adolescents may be undocumented or may not have high literacy or education levels in their native languages. They may be trying to juggle work, education, community, and family responsibilities both here and in their native countries. Some may be struggling with cross-generational reunification issues. Others may have been born and raised in the U.S. but failed to succeed in traditional K-12 schooling. Despite their varied educational, social, and cultural backgrounds, these adolescents have one thing in common - their developmental stage and related needs may set them apart from the adult students in your classes. As high school exit criteria grow more demanding in the United States, students with limited or interrupted schooling are finding it difficult to graduate within the timeframes traditionally allocated for high school study. As a result, these students are turning to adult education to earn high school diplomas, increase their job skills, and improve their English language proficiency.
On February 8-14 Sarah Young, author of Adolescent Learners in Adult ESL Classes, http://www.cal.org/caela/esl_resources/briefs/adolescent.html will lead a discussion and respond to questions about this topic. Sarah is an instructor at the Arlington Education and Employment Program (REEP) in Arlington, Virginia. She is also an adult ESL content specialist at the Center for Applied Linguistics where she works on several projects related to adolescent and adult English language learners. On February 8, Sarah will summarize some of the issues related to adolescents studying in adult ESL/ESOL classrooms (e.g., who these learners are and why they are in adult ESL/ESOL classes, what instructional strategies may work well with this population, what types of educational opportunities may be available). To review the topic before the discussion, please read the brief (above), which includes an extensive bibliography. We hope you will share your own experiences, advice, and comments, before, during, and after the days that Sarah leads the discussion and fields questions. If questions or comments are raised before next Tuesday, I will forward them to Sarah. You may also send comments or questions to me off the list at lterrill at cal.org Lynda Terrill English Language Discussion List Center for Adult English Language Acquisition Center for Applied Linguistics 4646 40th St, NW Washington, DC lterrill at cal.org tel 202-362-0700 fax 202-363-7204 http://www.cal.org/caela From EJacobson at air.org Mon Feb 6 17:52:16 2006 From: EJacobson at air.org (Jacobson, Erik) Date: Mon, 6 Feb 2006 14:52:16 -0800 Subject: [Assessment] National Reading Conference - J. Michael Parker Award Message-ID: National Reading Conference - J. Michael Parker Award The National Reading Conference (NRC) - 56th Annual Meeting will take place in Los Angeles, CA, from November 29 to December 2, 2006. The conference covers a wide range of literacy-related topics, including sessions on adult literacy. Information about the annual meeting is available at http://www.nrconline.org/. I encourage adult literacy researchers to join the dialogue at the meeting and to consider submitting proposals. In addition, to encourage research on adult literacy, NRC has established the J. Michael Parker Award. This award is given to graduate students and untenured professors who present research on adult learning or education at the annual meeting. More information and submission guidelines are available in the meeting's Call for Proposals - http://www.nrconline.org/pdf/2006callforproposals.pdf Erik Jacobson Chair, J. Michael Parker Award Committee National Reading Conference -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060206/2d0a6ce2/attachment.html From marie.cora at hotspurpartners.com Tue Feb 7 12:45:06 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 7 Feb 2006 12:45:06 -0500 Subject: [Assessment] Review of Adult Literacy Education Message-ID: <000001c62c0e$3bdad800$0402a8c0@frodo> Dear List Members, The following post is from Tom Sticht. ---------------- Colleagues: I have been asked to prepare a chapter for a Handbook on Literacy that will be published by Cambridge University Press. Following is a brief title and outline that I am currently using to think about the work.
Some questions I have for you follow below after the outline of topics: "Adult Literacy Education in Industrialized Nations Thomas Sticht In several industrialized nations activities are underway to extend the right to basic literacy education to adults. For many decades these nations have provided a variety of programs, many arising from charitable work by religious groups and others; today activities are underway to transform these many local, independently acting programs into systems of state-supported, free education for adults across the life span. This paper discusses activities in three industrialized nations under five categories: 1. Scale of Need: determining how many adults are in need of adult literacy education. 2. Access to Provision: determining how many adults are aware of, have access to, and enroll in adult literacy education provision. 3. Nature of Provision: determining the nature of the delivery system for meeting the needs of adult literacy provision, including the use of information and communication technology (ICT). 4. Quality of Provision: determining the nature of and need for improved instructional quality, including teacher qualifications and establishing content and outcome standards for programs. 5. Accountability of Provision: improving methods for determining achievements of programs in terms of student learning outcomes and broader impacts for the adult, family, workplace and community. The paper will acquaint readers with issues, challenges, and accomplishments arising from this movement to transform local adult literacy education programs into national systems of adult education in industrialized nations." Questions: I want to review the best work I can to flesh out the chapter so I am asking for any references you think I should read in pursuit of this work. What are two or three of the most important books, papers, research studies, policy papers, etc. that you think have contributed to your thinking and/or practice in adult literacy education in your nation? What are the two or three most important trends to have emerged in adult literacy education in your nation in the last quarter century? What direction do you see adult literacy education taking in your nation in the next ten years or so? What is the most important research in adult literacy education that you have come across that has influenced educational practice in your nation? Thanks for any responses you may have to these questions or any other directions that you think I should consider going in the development of this chapter. You can respond on the list or directly to me at tsticht at aznet.net.
Thanks, Tom Sticht marie cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com ---------------------------------------------------- National Institute for Literacy Assessment Discussion List assessment at nifl.gov To unsubscribe or change your subscription settings please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Fri Feb 10 08:41:25 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 10 Feb 2006 08:41:25 -0500 Subject: [Assessment] FW: [AAACE-NLA] Message for Black History Month (longer) Message-ID: <006601c62e47$b0aaff30$0402a8c0@frodo> Celebrating Black History Month February 2006 Three Black Ladies of Adult Literacy Education In the Struggle for Social Justice in the United States Tom Sticht International Consultant in Adult Education In Black History Month we celebrate the history of African-Americans in the United States. In this history, nothing is more important than the struggle of slaves, freedmen, and oppressed African-Americans to learn to read and write and to use these literacy skills to obtain their civil rights. In this history, three great African-American ladies stand out from thousands of others because of the remarkable circumstances under which they labored to help African-Americans gain the dignity and confidence they needed to stand up for their rights. This is a brief summary of some of the contributions of these three African-American ladies of literacy and liberty. Susie (Baker) King Taylor (1848-unknown) Susie (Baker) King Taylor was born a slave in Savannah, Georgia in 1848. She was raised by her grandmother, who sent her and one of her brothers to the home of a free woman to learn to read and write, even though it was against the law for slaves to learn to read and write. As she explained in her 1902 book, "We went every day with our books wrapped in paper to prevent the police or white persons from seeing them." (Taylor in Lerner, 1972) During the Civil War the Union Army initiated the practice of enlisting freed African-Americans. But it was soon apparent that there were problems in using these men as soldiers. Among other problems, it was difficult for officers to communicate with illiterate former slaves. So promotion and advancement in the army was difficult for the African-American soldiers. Many of them blamed this situation on their lack of education. In response to these needs, many officers initiated programs of education for the former slaves. One officer, Colonel Thomas W. Higginson of the 33rd U. S. Colored Troops, appointed the chaplain as the regimental teacher. Higginson reportedly saw men at night gathered around a campfire, "spelling slow monosyllables out of a primer, a feat which always commands all ears," and he observed that, "Their love of the spelling book is perfectly inexhaustible, -they stumbling on by themselves, or the blind leading the blind, with the same pathetic patience which they carry into everything. The chaplain is getting up a schoolhouse, where he will soon teach them as regularly as he can. But the alphabet must always be a very incidental business in a camp." (Cornish, 1952). One of the people whom the chaplain engaged in teaching soldiers of the 33rd to read and write was Susie King Taylor (Blassingame, 1965). She went with the regiment to Florida where she reported that "I learned to handle a musket very well while in the regiment and could shoot straight and often hit the target.
I assisted in cleaning the guns and used to fire them off, to see if the cartridges were dry, before cleaning and re-loading, each day. I thought this was great fun." (Taylor in Lerner, 1972, p. 101). Describing something of the conditions under which she worked, Taylor said, "Outside of the Fort were many skulls lying about; I have often moved them one side out of the path. The comrades and I would have wondered a bit as to which side of the war the men fought on, some said they were the skulls of our boys; some said they were the enemies; but as there was no definite way to know, it was never decided which could lay claim to them. They were a gruesome sight, those fleshless heads and grinning jaws, but by this time I had become used to worse things and did not feel as I would have earlier in my camp life." (Taylor in Lerner, 1972) According to Taylor, "I taught a great many of the comrades in Company E to read and write when they were off duty, nearly all were anxious to learn. My husband taught some also when it was convenient for him. I was very happy to know my efforts were successful in camp also very grateful for the appreciation of my services. I gave my services willingly for four years and three months without receiving a dollar." (Taylor in Lerner, 1972) Throughout the Civil War, thousands of teachers, some modestly paid and many volunteers, worked often under very arduous conditions, such as described above by Susie King Taylor, to educate the newly freed slaves who came to fight for the preservation of the United States of America. In just the Union Army's Department of the Gulf (Louisiana, Mississippi, Alabama, Texas) by 1864 there were 95 schools with 9,571 children and 2,000 adults being taught by 162 teachers. By the war's end it was estimated some 20,000 African-American troops had been taught to read "intelligently" (Blassingame, 1965). Harriet A. Jacobs (1813-1897) Harriet A. Jacobs was born a slave. But even though it was unlawful to teach slaves to read, Jacobs's mistress, the daughter of her owners, taught her to read and write. As she reached puberty, Jacobs's master began pressuring her for sexual favors and subjected her to other abuses. So she ran away and hid at her grandmother's house. She hid in a garret between the ceiling and roof that was about seven feet wide, nine feet in length and only three feet high at the highest point. She hid there for seven years! In 1861, Jacobs wrote a book entitled "Incidents in the Life of a Slave Girl: Written by Herself." In it she tells the story of her work to help an older black man, a slave like her, learn to read so he could reach for a greater reward for himself at the end of his life. In Jacobs's own words: "I knew an old black man, whose piety and childlike trust in God were beautiful to witness. At fifty-three years old he joined the Baptist church. He had a most earnest desire to learn to read. He thought he should know how to serve God better if he could only read the Bible. He came to me, and begged me to teach him. He said he could not pay me, for he had no money; but he would bring me nice fruit when the season for it came. I asked him if he didn't know it was contrary to law; and that slaves were whipped and imprisoned for teaching each other to read. This brought the tears into his eyes. "Don't be troubled, Uncle Fred," said I. "I have no thoughts of refusing to teach you. I only told you of the law, that you might know the danger, and be on your guard."
He thought he could plan to come three times a week without its being suspected. I selected a quiet nook, where no intruder was likely to penetrate, and there I taught him his A, B, C. Considering his age, his progress was astonishing. As soon as he could spell in two syllables he wanted to spell out words in the Bible. The happy smile that illuminated his face put joy into my heart. After spelling out a few words he paused, and said, "Honey, it 'pears when I can read dis good book I shall be nearer to God. White man is got all de sense. He can larn easy. It ain't easy for ole black man like me. I only want to read dis book, dat I may know how to live; den I hab no fear 'bout dying." I tried to encourage him by speaking of the rapid progress he had made. "Hab patience, child," he replied. "I larns slow." At the end of six months he had read through the New Testament, and could find any text in it." The Freedmen's Schools. Later in her life, after achieving her freedom, Jacobs taught school for former slaves in what were called the Freedmen's Schools. These schools were set up after the Civil War when the U. S. Congress created the Bureau of Refugees, Freedmen, and Abandoned Lands as the primary agency for reconstruction. This agency was placed under the jurisdiction of the War Department and was popularly known as the Freedmen's Bureau. The Freedmen's Bureau provided education for freed former slaves, engaging teachers who were primarily from voluntary organizations such as the American Missionary Association. Collectively these organizations became known as Freedmen's Aid Societies. Between 1862 and 1872, fifty-one anti-slavery societies, involving some 2,500 teachers and over 2,000 schools, were conducting education for freedmen. The Freedmen's Bureau was disbanded in 1872 for lack of political support (Morris, 1981). Septima Poinsette Clark (1898-1987) Septima Poinsette Clark has been called the "Queen Mother" of the Civil Rights Movement in the United States. Clark taught black soldiers at Fort Jackson in South Carolina to read and write in the 1930s. Later she conducted workshops at the Highlander Folk School in Tennessee where one of her students was Rosa Parks. Later Clark started citizenship schools with Dr. Martin Luther King at the Southern Christian Leadership Conference. Septima Clark was an innovator in teaching adult reading and writing within the functional context of the civil rights movement to free African-Americans from the oppression of those wanting to deny them full citizenship. Her methods included using "real life" materials for teaching adults to read (Clark, 1986). On January 7, 1957, Clark and her teachers started the first Citizenship School serving adult African-Americans on Johns Island in South Carolina. Clark (1962) recalled that when the teachers asked the students what they wanted to learn, the answer was that, "First, they wanted to learn how to write their names. That was a matter of pride as well as practical need" (p. 147). In teaching students to write their names Clark instructed teachers to carve students' names into cardboard. Then, according to Clark (1962), "What the student does is trace with his pencil over and over his signature until he gets the feel of writing his name. I suppose his fingers memorize it by doing it over and over; he gets into the habit by repeating the tracing time after time." (p.148).
She went on to say, "And perhaps the single greatest thing it accomplishes is the enabling of a man to raise his head a little higher; knowing how to sign their names, many of those men and women told me after they had learned, made them FEEL different. Suddenly they had become a part of the community; they were on their way toward first-class citizenship." (p. 149). Speaking of a cleaning woman who asked to be taught to read and write in the Citizenship School on Johns Island, South Carolina, so that she might prepare herself to vote, Septima Poinsette Clark wrote: "This woman is but one of those persons whose stories I could tell. One will never be able, I maintain, to measure or even to approximate the good that this work among the adult illiterates on this one island has accomplished. How can anybody estimate the worth of pride achieved, hope accomplished, faith affirmed, citizenship won? These are intangible things but real nevertheless, solid and of inestimable value." Working with Dr. Martin Luther King at the Southern Christian Leadership Conference, Clark took the simple adult literacy educator's method for teaching adults to write their names and eventually trained 10,000 teachers to teach literacy so that African-Americans could gain the vote. Altogether, the Citizenship Schools got nearly 700,000 African-American adults registered to vote in the South, providing political muscle to the Civil Rights Movement of the 1960s! Black History Month owes a lot of its existence to the work of these three great Black ladies, and of course many other African-American educators not noted here, who labored under conditions of duress to help slaves, freedmen, and those African-Americans living under oppression in the middle of the 20th century to acquire literacy. Armed with literacy, African-Americans throughout the United States won the struggle for civil rights. But the struggle goes on. The National Assessment of Adult Literacy (NAAL) of 2003 showed that 67 percent of African-American adults scored at the Basic or Below Basic literacy levels for prose tasks. But in fiscal year 2003, African-Americans made up only 20 percent of adults enrolled in the Adult Education and Literacy System (AELS) of the United States. Clearly then, at the outset of the 21st century there is a continuing need for political action to support African-American and other adult literacy educators in their efforts to bring literacy and social justice for all. The work goes on; and We SHALL overcome! References Blassingame, J. W. (1965). The Union Army as an educational institution for Negroes, 1862-1865. Journal of Negro Education, 34, 152-159. Clark, S. P. (1962). Echo in my soul. New York: E. P. Dutton & Co. Cornish, D. T. (1952). The Union Army as a school for Negroes. Journal of Negro History, 37, 368-382. Jacobs, H. A. (1987). Incidents in the Life of a Slave Girl: Written by Herself. Cambridge, MA: Harvard University Press. (Original work published in 1861). Lerner, G. (Ed.) (1972). Black women in white America: A documentary history. New York: Pantheon Books/Random House. Morris, R. C. (1981). Reading, 'Riting, and Reconstruction: The Education of Freedmen in the South, 1861-1870. Chicago: The University of Chicago Press. Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd.
El Cajon, CA 92019-2059 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net _______________________________________________ AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org http://lists.literacytent.org/mailman/listinfo/aaace-nla LiteracyTent: web hosting, news, community and goodies for literacy http://literacytent.org From djrosen at comcast.net Sun Feb 12 10:57:15 2006 From: djrosen at comcast.net (David Rosen) Date: Sun, 12 Feb 2006 10:57:15 -0500 Subject: [Assessment] Questions about TABE 9/10 -- and what should Test Publishers Provide on their Web Sites? References: <3.0.6.32.20060207081549.00a0d420@research.haifa.ac.il> Message-ID: <96EBEDDF-5BF9-42AF-BA89-0C4E995E5D96@comcast.net> Assessment Colleagues, The post below (from Iddo Gal, on the Numeracy discussion list) is pertinent to the Assessment list. If you would like to be sure that Iddo gets your reply, in addition to posting it here, also please email it to him. His email address is below. Iddo also raises a larger question which we might explore on the assessment list, and which someone might develop into a summary of points to send to test publishers. Do you agree that test publishers should consider making technical manuals (and independent research or evaluations of their tests) available on their Web sites? What other information should they routinely provide? David J. Rosen djrosen at comcast.net Begin forwarded message: > From: "Iddo Gal" > Date: February 7, 2006 8:15:49 AM EST > To: numeracy at europe.std.com, > Subject: Questions about TABE 9/10 > Reply-To: numeracy at europe.std.com > > > Hi all: > > I am looking for good info regarding the content and background of > the math > subtests of the new TABE 9/10, and what are the differences between > TABE > 7&8 and 9&10. > > I have been frustrated by the lack of detailed info on the website > of the > publishers. Contemporary's website has just commercial info about the > different booklets and products, and McGraw Hill, the parent > company, has > more info but nothing of value if one wants to undestand to what > extent a > different logic was used in TABE 9&10 - it says somewhere that the new > versions were created based on NAEP framework (I guess unlike 7 & > 8), but > there's no way to verify that, and no link to a Technical Manual which > would have the background professional details and supporting > research. > > Further, I found some conflicting info on the websites of different > State > Resource Centers, with a couple saying the 9 & 10 are essentially > the same, > but others claiming there are substantial differences. Hmm.... > > Overall I am quite surprised that so little systematic info has been > published for the community. Seems adult educators and managers > have to > make test-selection and interpretive decisions with very patchy > info to go > on. > > Can anybody direct me to any resource which has deeper info on TABE > 9 & 10, > or do you know of a program that has conducted its own analysis of the > differences between 9 & 10 and 7 & 10? > > Much obliged, > > --Iddo > > > > ---------------------------------------------------------------------- > - > -To unsubscribe from the Numeracy mail list send e-mail to > -majordomo at world.std.com. 
> -In the body of the message type "unsubscribe numeracy your_address" > > -If you have any questions e-mail edl at world.std.com > From djrosen at comcast.net Sun Feb 12 15:24:02 2006 From: djrosen at comcast.net (David Rosen) Date: Sun, 12 Feb 2006 15:24:02 -0500 Subject: [Assessment] "Alternative" assessment In-Reply-To: <6D87A5CD4E209448BBF7CCFB41D57C0E026013D1@DOIT-EX302.exec.ds.state.ct.us> References: <6D87A5CD4E209448BBF7CCFB41D57C0E026013D1@DOIT-EX302.exec.ds.state.ct.us> Message-ID: <9DBD8679-226A-49C6-B48F-D325692A1779@comcast.net> Marie, Ajit and others, Two terms evaluators often use which could be introduced in this discussion are "direct" and "indirect" measures. A "direct" measure is one which measures an actual performance of a specified task. An "indirect" measure (multiple choice paper and pencil tests are the best-known example) is one which "stands for" the performance. When the GED testing service, several years ago changed the GED writing test from a multiple choice assessment _about_ writing to a performance test of essay writing it moved from an indirect ("mediated") assessment to a direct assessment (of essay writing). Most evaluators would agree that given a world of limitless money and time, direct assessments are better -- that is, they more accurately measure the specified and desired performance of a task -- than indirect ones. "Authentic" and "performance-based" assessments are synonymous (for me) with "direct" assessments. Paper and pencil multiple choice tests (whether standardized or not) are the best-known and most widely used example of "indirect" or "mediated" assessments. Two other terms I find useful when categorizing assessments, measures or observations are "obtrusive' and "unobtrusive". The assessments we talk about on this list are most often or always "obtrusive," that is, the student knows s/he is being tested. This can be good ("positively obtrusive") where the assessment itself causes some additional positive learning, or bad ("negatively obtrusive"), where the assessment interferes with or prevents learning. One of the biggest complaints of the obtrusive assessments mandated by NCLB is that they are negatively obtrusive, that they, and some would argue that preparation for them, takes away a lot of valuable learning time. There are other ways in which assessments can be negatively obtrusive, too, producing false results for some for whom the testing situation creates fear that impedes normal or ordinary performance. I find "unobtrusive" assessments the most interesting, where the learner is assessed but doesn't know it, and just sees it as part of the learning, indistinguishable from the rest. Portfolio assessment can be unobtrusive, as can journal writing, or a weekly set of problems or exercises which the student regards as regular classwork or homework. Teachers ask students questions all the time many of which are used as unobtrusive individual or group assessment. And some teachers do this systematically. In theory, unobtrusive assessments could be standardized, although I know of no example of this.( I wonder if large college lecture classes where students sometimes have assessment consoles that enable them to respond immediately and have their responses immediately tabulated and graphed for the instructor might by now have evolved some standard procedures which make them valid and reliable. Anyone know?) 
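As a rough sketch of the kind of immediate tabulation described just above, here is a minimal example; the question labels, answer key, and responses are invented for illustration, and this is not meant to represent any particular vendor's response-console system.

# Minimal sketch: tallying classroom response-console ("clicker") answers and
# reporting them per item. All question IDs, keys, and responses are invented.
from collections import Counter

answer_key = {"Q1": "b", "Q2": "d"}
responses = {
    "Q1": ["b", "b", "a", "b", "c"],
    "Q2": ["d", "a", "d", "d", "d"],
}

for item, picks in responses.items():
    tally = Counter(picks)                       # distribution of chosen options
    pct = 100 * tally[answer_key[item]] / len(picks)
    print(item, dict(tally), f"{pct:.0f}% chose the keyed answer")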
I think "alternative" is a vague word which doesn't help us to think differently about kinds and purposes of assessment, whereas some of these words raise some important and interesting differences in kinds of assessment. David J. Rosen djrosen at comcast.net On Feb 4, 2006, at 12:42 PM, Gopalakrishnan, Ajit wrote: > Marie, et al, > > By "alternative", I presume you mean that these assessment options > are an alternative to multiple-choice assessments. Is that a fair > inference? I sometimes refer to alternative assessments as non- > multiple choice assessments, just to make clear what I am talking > about. > >> From my perspective, referring to them as authentic seems to muddy >> this discussion. Webster provides two of the following definitions >> for authentic which may help to illustrate my thinking: > a) worthy of acceptance or belief as conforming to or based on fact > > b) true to one's own personality, spirit, or character > > > So for example, a student's CASAS scale score in math (say 212) > from a multiple choice test may be worthy of acceptance of a > person's math ability. An analysis of the test item responses may > even provide greater information about a person's strengths and > weaknesses. However, they cannot say much about how the student > perceives the relation of "math" to his/her own personality and > life. Two students at entry might both achieve a score of 207 in > math for very different reasons. One student might have liked math, > viewed herself as being capable of learning math but just not used > it for many years. The other student might have never liked math, > generally seen herself as having other strengths, but been forced > to use math as part of her job. To ascertain this type of > information, the teacher might have to talk to the student and find > out the student's past experiences with math, the student's > perceptions of its importance in his/her life, etc. Then, a custom > assessment/project can > be designed that is meaningful and authentic to that particular > student. > >> From my perspective, all standardization (whether multiple-choice >> or non-multiple choice assessments) will to some extent reduce the >> authenticity for the student. The CASAS system attempts to address >> this by providing assessments that are relevant to adults and >> based in various contexts (life skills, employability skills, >> workforce learning, citizenship, etc.) so that the student can be >> assessed in contexts that are somewhat authentic to their >> experiences and goals. > > Therefore, I prefer the term alternative assessments because then > we can focus our discussion on the differences between multiple > choice assessments and non-multiple choice assessments. > > There is no question that non-multiple choice assessments can be > legitimate and have many strengths. > For example, Connecticut is currently piloting a CASAS workplace > speaking assessment. This is a standardized assessment designed for > ESL learners who are currently working to demonstrate their > listening and speaking abilities in a workplace context. Compared > to the CASAS listening multiple-choice assessments which we have > used over the years, the speaking assessment has the potential for > the instructor to gain a greater understanding of a student's > strengths and weaknesses. Students also seem to enjoy taking the > assessment. However, it needs to be administered one-on-one unlike > the listening which can be group administered. 
The speaking > assessment also places a greater training and certification burden > on the test administrator and scorer. We have experienced many of > these challenges with our statewide implementation of the CASAS > Functional Writing Assessment over the past few years. Kevin > alluded to some of those challenges such as maintaining scorer > certification and interr > ater reliability. The scoring rubric used in both the writing and > the speaking assessments can be valuable tools for classroom > instruction. > > In my opinion, at least some non-multiple choice assessments should > be standardized so that they can be used to broaden the array of > assessments available for state-level reporting/accountability. > > Thanks. > Ajit > > Ajit Gopalakrishnan > Education Consultant > Connecticut Department of Education > 25 Industrial Park Road > Middletown, CT 06457 > Tel: (860) 807-2125 > Fax: (860) 807-2062 > ajit.gopalakrishnan at po.state.ct.us > > ________________________________ > > From: assessment-bounces at nifl.gov [mailto:assessment- > bounces at nifl.gov] On Behalf Of Marie Cora > Sent: Thursday, February 02, 2006 11:52 AM > To: Assessment Discussion List > Subject: [Assessment] Legitimacy of alternative tools > > Hi Bruce and everyone, > > Bruce, you said: > > "I think putting forth the strengths and legitimacy of tools such > as portfolios, outcome checklists, holistically scored writing > samples, etc is a good way to go." > > This sounds like a very good path to go down to me. I think people > would have a lot to say and share about alternative tools, their > uses, and their strengths. It would be a great exercise to list > them all out and discuss the strengths, uses, and limitations of > each one. > > What questions do folks have about alternative assessments?: using > them, seeking them out, developing them, whatever area most > intrigues you. > > What can folks share with the rest of us in terms of "the strengths > and legitimacy" of alternative tools such as portfolios, > checklists, analytic/holistic scoring, rubric use, writing samples, > in-take/placement processes? > > Are any of the tools you use standardized? Not standardized? Do > you think that this is important? Why or why not? > > Are any of the tools used for both classroom and program purposes? > > I have other questions for you, but let's leave it at that for > right now. Let us hear what your thoughts are. We're looking > forward to it. > > Thanks, > > marie cora > Assessment Discussion List Moderator > > > > > -- > No virus found in this incoming message. > Checked by AVG Free Edition. > Version: 7.1.375 / Virus Database: 267.15.0/248 - Release Date: > 2/1/2006 > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From khinson at future-gate.com Mon Feb 13 10:36:59 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Mon, 13 Feb 2006 16:36:59 +0100 Subject: [Assessment] Questions about TABE 9/10 -- and what should Test Publishers Provide on their Web Sites? Message-ID: <43F0B5AB020000A000001FA1@fghamn01.ham.de.future-gate.com> My program just switched to TABE 9 and 10 in the Fall of 2005 after using 7/8 for years. What we found/encountered when we first started using it was that it was far more time intenstive than 7/8 and we likewise noticed a dramatic decline in student's scores. 
We know that this can be attributed to many things, not the least of which is the fact that we have a new test administrator as of the Fall of 2005. My colleagues and I have also tried to contact TABE/Contemporary's by phone - we went beyond the website. We wanted further information on the grade level equivalencies. We wanted to know what the difference was between a 12.9 on, say, the D level and a 12.9 on the A level. We were not happy with the answer we received. Basically, what we were told is that there is a +/- 2 grade level margin of error and that in that regard the tests overlap. We had the question in the first place because one of our program guidelines is that IF a student scores at the GED High/Low on the TABE test, then he or she can take the GED Practice Test for that area. (In other words, if they get a 9.5 on Reading, the student can in theory take the practice test on Reading, Science, and Social Studies. The practice test scores then become a guideline for assessment if he/she doesn't score well enough to take the official exam.) We wanted to know what the difference was between the D and A, or even M, D, and A, to decide if perhaps students should be getting GED Low/High placement on a particular test - like the A level vs. D or M. To date, we've still not gotten enough information to really make an adequate decision. I think test makers and suppliers need to ensure that anyone using their products has all the information to ensure that testing is valid and accurate. If a test is updated, changed or modified, the changes and updates need to be clearly delineated. Was some information removed or added, and if so why? What makes test material outdated? I definitely think it's difficult to get answers out of the test makers/suppliers when questions do arise. Regards Katrina L Hinson >>> djrosen at comcast.net >>> Assessment Colleagues, The post below (from Iddo Gal, on the Numeracy discussion list) is pertinent to the Assessment list. If you would like to be sure that Iddo gets your reply, in addition to posting it here, also please email it to him. His email address is below. Iddo also raises a larger question which we might explore on the assessment list, and which someone might develop into a summary of points to send to test publishers. Do you agree that test publishers should consider making technical manuals (and independent research or evaluations of their tests) available on their Web sites? What other information should they routinely provide? David J. Rosen djrosen at comcast.net Begin forwarded message: > From: "Iddo Gal" > Date: February 7, 2006 8:15:49 AM EST > To: numeracy at europe.std.com, > Subject: Questions about TABE 9/10 > Reply-To: numeracy at europe.std.com > > > Hi all: > > I am looking for good info regarding the content and background of > the math > subtests of the new TABE 9/10, and what are the differences between > TABE > 7&8 and 9&10. > > I have been frustrated by the lack of detailed info on the website > of the > publishers. Contemporary's website has just commercial info about the > different booklets and products, and McGraw Hill, the parent > company, has > more info but nothing of value if one wants to undestand to what > extent a > different logic was used in TABE 9&10 - it says somewhere that the new > versions were created based on NAEP framework (I guess unlike 7 & > 8), but > there's no way to verify that, and no link to a Technical Manual which > would have the background professional details and supporting > research.
> > Further, I found some conflicting info on the websites of different > State > Resource Centers, with a couple saying the 9 & 10 are essentially > the same, > but others claiming there are substantial differences. Hmm.... > > Overall I am quite surprised that so little systematic info has been > published for the community. Seems adult educators and managers > have to > make test-selection and interpretive decisions with very patchy > info to go > on. > > Can anybody direct me to any resource which has deeper info on TABE > 9 & 10, > or do you know of a program that has conducted its own analysis of the > differences between 9 & 10 and 7 & 10? > > Much obliged, > > --Iddo > > > > ---------------------------------------------------------------------- > - > -To unsubscribe from the Numeracy mail list send e-mail to > -majordomo at world.std.com. > -In the body of the message type "unsubscribe numeracy your_address" > > -If you have any questions e-mail edl at world.std.com > ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From dsenders at lovetoread.org Mon Feb 13 13:31:59 2006 From: dsenders at lovetoread.org (Diane Senders) Date: Mon, 13 Feb 2006 11:31:59 -0700 Subject: [Assessment] Questions about TABE 9/10 -- and what should TestPublishers Provide on their Web Sites? In-Reply-To: <96EBEDDF-5BF9-42AF-BA89-0C4E995E5D96@comcast.net> Message-ID: <005301c630cb$c7bb19e0$6401a8c0@lvt10> We use TABE 7/8 in state funded programs in Arizona. As part of the training to give the TABE, we were provided with a set of explanations from the publisher. One book is called 'Forms 7&8 Technical Report'. This explains all of the information about how the TABE 7&8 were developed. It includes info about equating TABE 7 with TABE 5. (TABE 5&6 were predecessors to TABE 7&8.) It also gives a chart in the 'Validity' section which shows the overlapping grade equivalent range of the different levels of the TABE. We were told by our State Division of Adult Education that when a student scores in either the low or high scoring range of a level, that score was not a valid one and the student scoring in that range needed to take the lower or higher level to get a valid score. For example, if a student scored very high - 12.9 GE on the level D, that is not a valid score and the student should take the level A to get a valid score. I checked the McGraw-Hill/Contemporary site and there is a Technical Report available for the TABE 9&10. It may give you some answers. However I don't know whether you can look at it without purchasing it. Good luck! And please let us know what you find out. Thanks, Diane Diane Senders, Program Administrator Literacy Volunteers of Tucson 1948 E Allen Rd Tucson, AZ 85719 phone: (520) 882-8006 fax: (520) 882-4986 Literacy Volunteers of Tucson - Strengthening Individuals, Families and Communities Through Literacy Tutoring. Literacy Volunteers of Tucson is an accredited affiliate of ProLiteracy America -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of David Rosen Sent: Sunday, February 12, 2006 8:57 AM To: The Discussion List Assessment Cc: iddo at research.haifa.ac.il Subject: [Assessment] Questions about TABE 9/10 -- and what should TestPublishers Provide on their Web Sites? 
Assessment Colleagues, The post below (from Iddo Gal, on the Numeracy discussion list) is pertinent to the Assessment list. If you would like to be sure that Iddo gets your reply, in addition to posting it here, also please email it to him. His email address is below. Iddo also raises a larger question which we might explore on the assessment list, and which someone might develop into a summary of points to send to test publishers. Do you agree that test publishers should consider making technical manuals (and independent research or evaluations of their tests) available on their Web sites? What other information should they routinely provide? David J. Rosen djrosen at comcast.net Begin forwarded message: > From: "Iddo Gal" > Date: February 7, 2006 8:15:49 AM EST > To: numeracy at europe.std.com, > Subject: Questions about TABE 9/10 > Reply-To: numeracy at europe.std.com > > > Hi all: > > I am looking for good info regarding the content and background of > the math > subtests of the new TABE 9/10, and what are the differences between > TABE > 7&8 and 9&10. > > I have been frustrated by the lack of detailed info on the website > of the > publishers. Contemporary's website has just commercial info about the > different booklets and products, and McGraw Hill, the parent > company, has > more info but nothing of value if one wants to undestand to what > extent a > different logic was used in TABE 9&10 - it says somewhere that the new > versions were created based on NAEP framework (I guess unlike 7 & > 8), but > there's no way to verify that, and no link to a Technical Manual which > would have the background professional details and supporting > research. > > Further, I found some conflicting info on the websites of different > State > Resource Centers, with a couple saying the 9 & 10 are essentially > the same, > but others claiming there are substantial differences. Hmm.... > > Overall I am quite surprised that so little systematic info has been > published for the community. Seems adult educators and managers > have to > make test-selection and interpretive decisions with very patchy > info to go > on. > > Can anybody direct me to any resource which has deeper info on TABE > 9 & 10, > or do you know of a program that has conducted its own analysis of the > differences between 9 & 10 and 7 & 10? > > Much obliged, > > --Iddo > > > > ---------------------------------------------------------------------- > - > -To unsubscribe from the Numeracy mail list send e-mail to > -majordomo at world.std.com. > -In the body of the message type "unsubscribe numeracy your_address" > > -If you have any questions e-mail edl at world.std.com > ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From jcrawford at nifl.gov Mon Feb 13 12:20:49 2006 From: jcrawford at nifl.gov (Crawford, June) Date: Mon, 13 Feb 2006 12:20:49 -0500 Subject: [Assessment] Questions about TABE 9/10 -- and what should TestPublishers Provide on their Web Sites? Message-ID: <9B35BF1886881547B5DFF88364AF31A3081E8AA1@wdcrobe2m03.ed.gov> Prior to my employment with the government, I ran a testing program for a university and spent much of my life reviewing tests, test results, long-term follow-ups to test results and instruction, etc. I am very much in favor of having technical manuals made available online by testing companies. 
I have had difficulty getting answers to questions about tests, even from the statisticians who work for the testing companies, and if there is to be "truth in testing" and students, teachers and administrators are to be judged by the results of standardized tests, then all of the available information needs to be accessible. When you begin to see Standard Errors of Measurement reported as spanning multiple grade levels, for instance, there is reason for concern. If we are to really measure what a student learns in a classroom, and make education a more rigorously measured activity, we need to be able to read the technical manuals and know what the tests are really measuring...and how accurately. And we need much more information, in my opinion, about the "N" used for the testing. Not every test is right for every population, and we need to know if the test we are using is appropriate for our population. It isn't just price; it isn't just new products; it certainly shouldn't be who has the glossiest sales pitch. Put the technical manuals online. Just as patients can request all the stats and the information about the good points...and the pitfalls...of taking a new drug, teachers and students should know what they are getting when they buy a new test. June Crawford -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Katrina Hinson Sent: Monday, February 13, 2006 10:37 AM To: djrosen at comcast.net; assessment at dev.nifl.gov; assessment at nifl.gov Cc: iddo at research.haifa.ac.il Subject: Re: [Assessment] Questions about TABE 9/10 -- and what should Test Publishers Provide on their Web Sites? My program just switched to TABE 9 and 10 in the Fall of 2005 after using 7/8 for years. What we found/encountered when we first started using it was that it was far more time intenstive than 7/8 and we likewise noticed a dramatic decline in student's scores. We know that this can be attributed to many things not the least of which is the fact that we have a new test administrator as of the Fall of 2005. My colleagues and I have also tried to contact TABE/Contemporary's as well by phone - we went beyond the website. We wanted further information on the Grade Level equivalency information. We wanted to know what the difference was between a 12.9 on say the D level and a 12.9 on the A level. We were not happy with the answer we received. Basically, what we were told is there is a +/- 2 Grade Level margin of error and that in that regard the tests overlap. We had the question in the first place because one of our program guidelines is that IF a student scores at the GED High/Low on the TABE test, then he or she can take the GED Practice Test for that area. (In other words if they get a 9.5 on Reading, the student can in theory take the practice test on Reading, SCI and Social Studies. The practice test scores then become a guideline for assessment if he/she doesn't score well enough to take the offical exam.) We wanted to know what the difference was b/w the D and A or even M, D and A to decide if perhaps students should be getting GED Low/High placement on a particular test - like the A level vs D or M. To date, we've still not gotten enough information to really make an adequate decision. I think test makers and suppliers need to ensure that anyone using their products have all the information to ensure that testing is valid and accurate. If a test is updated, changed ormodified, the changes and updates need to be clearly delineated.
Was some information removed or added and if so why? What makes test material outdated? I definitely think it's difficult to get answers out of the test makers/ suppliers when questions do arise. Regards Katrina L Hinson
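A rough way to see why June's point about standard errors reported in whole grade levels, and the +/- 2 grade-equivalent answer Katrina describes, matter for placement decisions is to treat a reported grade-equivalent score as a point estimate with an error band and ask whether two scores can even be told apart. The sketch below only illustrates that arithmetic; the +/- 2 figure is the one quoted in this thread, not a value from the TABE technical manual, and a real analysis should use the SEM the publisher reports.

# Illustrative only: treats a grade-equivalent (GE) score as a point estimate
# with a symmetric error band. The 2.0 GE margin is the figure quoted to
# Katrina, not an official specification; substitute the SEM from the
# technical manual if you have it.

def ge_band(score, margin=2.0):
    """Return the (low, high) band around a reported grade-equivalent score."""
    return (score - margin, score + margin)

def indistinguishable(score_a, score_b, margin=2.0):
    """True if the two error bands overlap, i.e. the scores cannot be told apart."""
    low_a, high_a = ge_band(score_a, margin)
    low_b, high_b = ge_band(score_b, margin)
    return low_a <= high_b and low_b <= high_a

if __name__ == "__main__":
    print(ge_band(9.5))                              # (7.5, 11.5)
    print(indistinguishable(9.5, 12.9))              # True with a +/- 2 band
    print(indistinguishable(9.5, 12.9, margin=1.0))  # False with a tighter band

If a 9.5 and a 12.9 cannot be told apart under the band the publisher quoted, then a single cut score such as 9.5 for GED practice-test eligibility is carrying more certainty than the instrument supports, which is why the technical manual's actual SEM matters.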
------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From Tina_Luffman at yc.edu Mon Feb 13 14:29:27 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Mon, 13 Feb 2006 12:29:27 -0700 Subject: [Assessment] Questions about TABE 9/10 -- and what should TestPublishers Provide on their Web Sites? Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060213/3de414a8/attachment.html From rosbrandt at state.pa.us Mon Feb 13 17:49:46 2006 From: rosbrandt at state.pa.us (Brandt, Rose) Date: Mon, 13 Feb 2006 17:49:46 -0500 Subject: [Assessment] "Alternative" assessment References: <6D87A5CD4E209448BBF7CCFB41D57C0E026013D1@DOIT-EX302.exec.ds.state.ct.us> <9DBD8679-226A-49C6-B48F-D325692A1779@comcast.net> Message-ID: <50DFE17FDB132A4182A1D30FDB48CBC303CF8C63@enhbgpri10.backup> A couple of thoughts- First, using terms that are based on comparison is not usually a good idea, since, as the adage says, anything suffers by comparison. So, "alternative" is not the best choice of terms since it suggests different, which leads to the question, "Different than what?" It needs a term that allows it to stand on its own merits. And I want to comment on your statement, David, that, in "'unobtrusive' assessments... the learner is assessed but doesn't know it, and just sees it as part of the learning, indistinguishable from the rest." I would make a fine, but I think important, distinction here. I believe the student should know that he or she is being assessed. But in unobtrusive assessment, the assessment is part of the learning process, consistent with it, grows out of it, and suggests future direction for it. Rose -----Original Message----- From: assessment-bounces at nifl.gov on behalf of David Rosen Sent: Sun 2/12/2006 3:24 PM To: The Assessment Discussion List Cc: Subject: Re: [Assessment] "Alternative" assessment Marie, Ajit and others, Two terms evaluators often use which could be introduced in this discussion are "direct" and "indirect" measures. A "direct" measure is one which measures an actual performance of a specified task. An "indirect" measure (multiple choice paper and pencil tests are the best-known example) is one which "stands for" the performance. When the GED testing service, several years ago, changed the GED writing test from a multiple choice assessment _about_ writing to a performance test of essay writing, it moved from an indirect ("mediated") assessment to a direct assessment (of essay writing). Most evaluators would agree that given a world of limitless money and time, direct assessments are better -- that is, they more accurately measure the specified and desired performance of a task -- than indirect ones. "Authentic" and "performance-based" assessments are synonymous (for me) with "direct" assessments. Paper and pencil multiple choice tests (whether standardized or not) are the best-known and most widely used example of "indirect" or "mediated" assessments.
Two other terms I find useful when categorizing assessments, measures or observations are "obtrusive' and "unobtrusive". The assessments we talk about on this list are most often or always "obtrusive," that is, the student knows s/he is being tested. This can be good ("positively obtrusive") where the assessment itself causes some additional positive learning, or bad ("negatively obtrusive"), where the assessment interferes with or prevents learning. One of the biggest complaints of the obtrusive assessments mandated by NCLB is that they are negatively obtrusive, that they, and some would argue that preparation for them, takes away a lot of valuable learning time. There are other ways in which assessments can be negatively obtrusive, too, producing false results for some for whom the testing situation creates fear that impedes normal or ordinary performance. I find "unobtrusive" assessments the most interesting, where the learner is assessed but doesn't know it, and just sees it as part of the learning, indistinguishable from the rest. Portfolio assessment can be unobtrusive, as can journal writing, or a weekly set of problems or exercises which the student regards as regular classwork or homework. Teachers ask students questions all the time many of which are used as unobtrusive individual or group assessment. And some teachers do this systematically. In theory, unobtrusive assessments could be standardized, although I know of no example of this.( I wonder if large college lecture classes where students sometimes have assessment consoles that enable them to respond immediately and have their responses immediately tabulated and graphed for the instructor might by now have evolved some standard procedures which make them valid and reliable. Anyone know?) I think "alternative" is a vague word which doesn't help us to think differently about kinds and purposes of assessment, whereas some of these words raise some important and interesting differences in kinds of assessment. David J. Rosen djrosen at comcast.net On Feb 4, 2006, at 12:42 PM, Gopalakrishnan, Ajit wrote: > Marie, et al, > > By "alternative", I presume you mean that these assessment options > are an alternative to multiple-choice assessments. Is that a fair > inference? I sometimes refer to alternative assessments as non- > multiple choice assessments, just to make clear what I am talking > about. > >> From my perspective, referring to them as authentic seems to muddy >> this discussion. Webster provides two of the following definitions >> for authentic which may help to illustrate my thinking: > a) worthy of acceptance or belief as conforming to or based on fact > > b) true to one's own personality, spirit, or character > > > So for example, a student's CASAS scale score in math (say 212) > from a multiple choice test may be worthy of acceptance of a > person's math ability. An analysis of the test item responses may > even provide greater information about a person's strengths and > weaknesses. However, they cannot say much about how the student > perceives the relation of "math" to his/her own personality and > life. Two students at entry might both achieve a score of 207 in > math for very different reasons. One student might have liked math, > viewed herself as being capable of learning math but just not used > it for many years. The other student might have never liked math, > generally seen herself as having other strengths, but been forced > to use math as part of her job. 
To ascertain this type of > information, the teacher might have to talk to the student and find > out the student's past experiences with math, the student's > perceptions of its importance in his/her life, etc. Then, a custom > assessment/project can > be designed that is meaningful and authentic to that particular > student. > >> From my perspective, all standardization (whether multiple-choice >> or non-multiple choice assessments) will to some extent reduce the >> authenticity for the student. The CASAS system attempts to address >> this by providing assessments that are relevant to adults and >> based in various contexts (life skills, employability skills, >> workforce learning, citizenship, etc.) so that the student can be >> assessed in contexts that are somewhat authentic to their >> experiences and goals. > > Therefore, I prefer the term alternative assessments because then > we can focus our discussion on the differences between multiple > choice assessments and non-multiple choice assessments. > > There is no question that non-multiple choice assessments can be > legitimate and have many strengths. > For example, Connecticut is currently piloting a CASAS workplace > speaking assessment. This is a standardized assessment designed for > ESL learners who are currently working to demonstrate their > listening and speaking abilities in a workplace context. Compared > to the CASAS listening multiple-choice assessments which we have > used over the years, the speaking assessment has the potential for > the instructor to gain a greater understanding of a student's > strengths and weaknesses. Students also seem to enjoy taking the > assessment. However, it needs to be administered one-on-one unlike > the listening which can be group administered. The speaking > assessment also places a greater training and certification burden > on the test administrator and scorer. We have experienced many of > these challenges with our statewide implementation of the CASAS > Functional Writing Assessment over the past few years. Kevin > alluded to some of those challenges such as maintaining scorer > certification and interr > ater reliability. The scoring rubric used in both the writing and > the speaking assessments can be valuable tools for classroom > instruction. > > In my opinion, at least some non-multiple choice assessments should > be standardized so that they can be used to broaden the array of > assessments available for state-level reporting/accountability. > > Thanks. > Ajit > > Ajit Gopalakrishnan > Education Consultant > Connecticut Department of Education > 25 Industrial Park Road > Middletown, CT 06457 > Tel: (860) 807-2125 > Fax: (860) 807-2062 > ajit.gopalakrishnan at po.state.ct.us > > ________________________________ > > From: assessment-bounces at nifl.gov [mailto:assessment- > bounces at nifl.gov] On Behalf Of Marie Cora > Sent: Thursday, February 02, 2006 11:52 AM > To: Assessment Discussion List > Subject: [Assessment] Legitimacy of alternative tools > > Hi Bruce and everyone, > > Bruce, you said: > > "I think putting forth the strengths and legitimacy of tools such > as portfolios, outcome checklists, holistically scored writing > samples, etc is a good way to go." > > This sounds like a very good path to go down to me. I think people > would have a lot to say and share about alternative tools, their > uses, and their strengths. It would be a great exercise to list > them all out and discuss the strengths, uses, and limitations of > each one. 
> > What questions do folks have about alternative assessments?: using > them, seeking them out, developing them, whatever area most > intrigues you. > > What can folks share with the rest of us in terms of "the strengths > and legitimacy" of alternative tools such as portfolios, > checklists, analytic/holistic scoring, rubric use, writing samples, > in-take/placement processes? > > Are any of the tools you use standardized? Not standardized? Do > you think that this is important? Why or why not? > > Are any of the tools used for both classroom and program purposes? > > I have other questions for you, but let's leave it at that for > right now. Let us hear what your thoughts are. We're looking > forward to it. > > Thanks, > > marie cora > Assessment Discussion List Moderator > > > > > -- > No virus found in this incoming message. > Checked by AVG Free Edition. > Version: 7.1.375 / Virus Database: 267.15.0/248 - Release Date: > 2/1/2006 > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 17882 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060213/eb43582f/attachment.bin From PHCSJean.2163953 at bloglines.com Tue Feb 14 17:49:44 2006 From: PHCSJean.2163953 at bloglines.com (PHCSJean.2163953 at bloglines.com) Date: 14 Feb 2006 22:49:44 -0000 Subject: [Assessment] Assessment Standards Message-ID: <1139957384.909507752.29559.sendItem@bloglines.com> It's been interesting reading this discussion of assessments. In 1994 the International Reading Association (1) published Standards for the Assessment of Reading and Writing which say: 1. A student's interests are paramount in assessment. 2. The purpose of assessment is to improve teaching and learning. 3. Assessment must reflect and allow for critical inquiry into curriculum and instruction. 4. Assessments must recognize and reflect the intellectually and socially complex nature of reading and writing and the important roles of school, home and society in literacy development. 5. Assessment must be fair and equitable. 6. The consequences of an assessment procedure is the first and most important consideration in establishing the validity of an assessment. As the discussion of assessment goes in our world, we generally want to figure out where the student is to create an appropriate placement and program for him/her. Assessment goes beyond that as we enter into the program to measure the gains of the students in the class. We can use a variety of things for that. One thing we haven't talked about (or if we have I missed it in attempting to catch up with all of the recent postings, sorry!) is the role of assessment to determine if we need to change anything in what we're doing in our teaching. We've also spoken about how hard it is to assess the lowest literacy learners using some of the pre-created tests. Some of these standards aren't even considered in the attempts to meet their needs. 
I just thought they were worth tossing out there, since it's rare that our world intersects with what's going on in the reading training of children, and this comes from that world, and seems to have applicability. Jean Marrapodi (1)International Reading Association & National Council of Teachers of English. (1994). Standards for the assessment of reading and writing. Newark , DE : International Reading Association. Urbana , Illinois : National Council of Teachers of English. From kabeall at comcast.net Wed Feb 15 12:25:13 2006 From: kabeall at comcast.net (Kaye Beall) Date: Wed, 15 Feb 2006 12:25:13 -0500 Subject: [Assessment] New from NCSALL--Program Administrators' Sourcebook Message-ID: <007501c63254$c8353650$0202a8c0@your4105e587b6> If you administer an adult education program, you face a wide variety of challenges: * How can you help students make "level" gains? * How can you help students gain the skills they need to reach their goals? * How can you help students stay in programs long enough to meet their goals? * How can you prepare and retain good teachers? * How can you document the successes of your program? The National Center for the Study of Adult Learning and Literacy (NCSALL) conducted research relevant to these questions. The Program Administrators? Sourcebook (December 2005) is designed to give you, as a program administrator, direct access to research that may help you address the challenges you face in your job. Written by Jackie Taylor, Cristine Smith, and Beth Bingman in collaboration with five local program administrators, this sourcebook presents NCSALL?s research findings in short sections related to key challenges that program administrators face in their work as managers of adult education programs. It also presents the implications of these research findings for program structure and services, as well as some strategies for implementing change based on these implications. To download the Program Administrators' Sourcebook, visit NCSALL's Web site: http://www.ncsall.net/?id=1035 To order the Program Administrators' Sourcebook at $10.00/copy, go to the NCSALL Order Form (http://www.ncsall.net/?id=674); limited quantities available. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060215/b67d4017/attachment.html From marie.cora at hotspurpartners.com Thu Feb 16 10:07:00 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 16 Feb 2006 10:07:00 -0500 Subject: [Assessment] Struggling ESOL Learners Message-ID: <02b001c6330a$a40a0f10$0402a8c0@frodo> Hello Colleagues, Below is an announcemmennt from Julie McKinney, moderator of the NIFL Focus on basics discussion list. Note that this discussion began yesterday (2/15) - I apologize for the delay in getting this announcement to you. marie **************************** For those who are interested: the discussion of Struggling ESOL Learners Starts February 15 on the Focus on Basics Discussion List. Please join Robin Schwarz and the rest of us to discuss her article in Focus on Basics, Vol. 8A. 
To read the article: "Taking a Closer Look at Struggling ESOL Learners" go to: http://www.ncsall.net/index.php?id=994 If you are not subscribed to the FOB list, you can subscribe at: http://www.nifl.gov/mailman/listinfo/focusonbasics Below are the questions we posted last week to get us thinking about the article and how it relates to the work we do. Discussion Questions 1. Stories: Have you had struggling learners in your program? How common do you think this problem is? Do you want to share a story of a learner you have worked with, and tell us how you were able to find out the issue, and what you did to help? 2. Physical Disabilities: How do we screen for them and what specific accommodations can we make in the class or program for them? 3. Intake/Counseling Procedures: What does your center or program do for a routine intake? What is the procedure to address a learner who is not progressing? How well do you get at factors such as physical and health problems, living situations, amount and nature of literacy skills, nature of the primary language and cultural communication style? 4. Responding: Once there is a reason discovered for a learner's struggles, how well-equipped are you to respond to the problem? How do you learn how to accommodate a hearing or visual problem? What do you do for the learner with anxiety or depression? Do you have access to a consulting teacher, or someone knowledgeable in the complexities of a given culture's communication style (as in the example of the Sudanese men in the article)? 5. Staff Training/Professional Development: What kind of training do we all need in order to ensure that our intake procedures are complete and appropriate? What kind of training will help us to respond an effective way? 6. Did This Article Change Something You Do? Share with us anything that you changed, did, started, or stopped as a result of reading this article. Why? What result did you get? 7. What Connections Did You Make With This Article? Even if you did not change anything, did it ring a bell or hit home to you in some way? We'll see you for the discussion! Julie Julie McKinney Discussion List Moderator World Education/NCSALL jmckinney at worlded.org From ktashjian at yahoo.com Tue Feb 21 15:04:47 2006 From: ktashjian at yahoo.com (karisa tashjian) Date: Tue, 21 Feb 2006 12:04:47 -0800 (PST) Subject: [Assessment] Request for rubrics Message-ID: <20060221200447.28790.qmail@web52706.mail.yahoo.com> I have been searching for rubrics that would be appropriate for adult ESL students. I haven't had much luck. Could anyone direct me to a source on the Internet? Thank you in advance for your help. You can email me off-list if you prefer at ktashjian at yahoo.com. Thank you, Karisa Tashjian Rhode Island Family Literacy Initiative Providence, RI --------------------------------- Relax. Yahoo! Mail virus scanning helps detect nasty viruses! -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060221/0e83de35/attachment.html From marie.cora at hotspurpartners.com Wed Feb 22 11:11:36 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 22 Feb 2006 11:11:36 -0500 Subject: [Assessment] Request for rubrics In-Reply-To: <20060221200447.28790.qmail@web52706.mail.yahoo.com> Message-ID: <007b01c637ca$a83d29e0$0402a8c0@frodo> Hi Karisa, Can you tell us more specifically what you want to measure? 
I know for writing, the REEP Writing Rubric and Process is excellent and was developed by ESOL teachers for their adult students. Here's the page from the ALEWiki that discusses the REEP - it's excerpts from a guest discussion that was on the List a year ago. There are links to REEP in the text. http://wiki.literacytent.org/index.php/REEP_Writing_Rubric But as for other content rubrics..not sure what you need...anyone? marie Assessment List Discussion Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of karisa tashjian Sent: Tuesday, February 21, 2006 3:05 PM To: assessment at nifl.gov Subject: [Assessment] Request for rubrics I have been searching for rubrics that would be appropriate for adult ESL students. I haven't had much luck. Could anyone direct me to a source on the Internet? Thank you in advance for your help. You can email me off-list if you prefer at ktashjian at yahoo.com. Thank you, Karisa Tashjian Rhode Island Family Literacy Initiative Providence, RI _____ Relax. Yahoo! Mail virus scanning helps detect nasty viruses! -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060222/b5a0124d/attachment.html From marie.cora at hotspurpartners.com Wed Feb 22 11:20:34 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 22 Feb 2006 11:20:34 -0500 Subject: [Assessment] Tech list discussion announcement Message-ID: <008101c637cb$e8ee7bf0$0402a8c0@frodo> Dear colleagues, The following announcement is from Mariann Fedele, Moderator of the Technology and Literacy Discussion List. To subscribe to this discussion, go to: http://www.nifl.gov/mailman/listinfo/Technology Thanks, marie Assessment List Discussion Moderator **************************** Hello NIFL discussion list colleagues, Audio and video resources have been used in adult education instruction for many years, but the introduction of CDs, DVDs, the Internet, and other electronic technologies has greatly expanded their availability and raises many questions about implementation, support and training. I'm pleased to announce that David Collings, Technology Coordinator for the Adult and Community Education Network in Delaware, and Alex Quinn, Executive Director of the Adult Literacy Media Alliance (ALMA), will join the NIFL Technology list as guests to lead a discussion on current and emerging uses of media in adult ed. instruction in the classroom and at a distance. Their discussion will take place next week from Tuesday, February 28th through Friday, March 3rd. Some of the areas they will cover on this topic include: teacher training, technical support for teachers and learners, the challenges of their use, media distribution, and emerging uses and tools for delivery of media. You are encouraged to participate, ask questions, and share your experience and knowledge. To join the NIFL Technology and Literacy discussion list, please subscribe by going to: http://www.nifl.gov/mailman/listinfo/Technology Please don't hesitate to get in touch if you have questions.
Regards, Mariann Mariann Fedele Coordinator of Professional Development, Literacy Assistance Center Moderator, NIFL Technology and Literacy Discussion List 32 Broadway 10th Floor New York, New York 10004 212-803-3325 mariannf at lacnyc.org www.lacnyc.org _______________________________________________ National Institute for Literacy Moderators mailing list: Moderators at nifl.gov http://www.nifl.gov/mailman/listinfo/moderators Moderator's Resource Page: http://www.nifl.gov/lincs_dlms/contents.html From ktashjian at yahoo.com Wed Feb 22 13:19:17 2006 From: ktashjian at yahoo.com (karisa tashjian) Date: Wed, 22 Feb 2006 10:19:17 -0800 (PST) Subject: [Assessment] Request for rubrics In-Reply-To: <007b01c637ca$a83d29e0$0402a8c0@frodo> Message-ID: <20060222181917.49288.qmail@web52711.mail.yahoo.com> Hi Marie, Thank you for your reply. Yes, I have looked at those rubrics as well as the WIKI. I am searching for rubrics that are tailored for specific classroom assignments in the adult ESL classroom. For instance, a teacher assigns students to give an oral presentation. I would like an example of a rubric that a teacher has designed to go along with the presentation. Hopefully, it will assess their pronunciation and/or grammar and/or organization of material. My overall intention is to share these in our upcoming teacher newsletter. Instead of just writing about the usefulness of rubrics in the adult ESL classroom, I would like to give examples. I hope I'm not on the wrong track with this........but I have created rubrics for K-12 assignments (or had the students help create them) with great success. My challenge is finding ones that would be adaptable for our classrooms. One of the strengths I find with rubrics is that students truly internalize the criteria for success and can self-evaluate. I welcome any advice. Thank you, Karisa --------------------------------- Yahoo! Mail Use Photomail to share photos without annoying attachments. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060222/f6218cf6/attachment.html From AWilder106 at aol.com Sat Feb 25 12:33:22 2006 From: AWilder106 at aol.com (AWilder106 at aol.com) Date: Sat, 25 Feb 2006 12:33:22 EST Subject: [Assessment] "Alternative" assessment Message-ID: <1c4.3a7d187b.3131eee2@aol.com> Sorry for the lateness of this message. "Unobtrusive measures" can be just that--a teacher stows them away as signs that the learner feels better about being in class, progress, etc.. Unobtrusive measures could be ---a student shows up early to class, looks better dressed, does not sit in the back of the room, jokes with other students... Andrea -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060225/6de8cc01/attachment.html From marie.cora at hotspurpartners.com Mon Feb 27 14:45:24 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 27 Feb 2006 14:45:24 -0500 Subject: [Assessment] "Alternative" terminology In-Reply-To: <1c4.3a7d187b.3131eee2@aol.com> Message-ID: <015401c63bd6$5a6fbbf0$0402a8c0@frodo> Hi Andrea, David, Rose, and everyone, I'd also like to pick up on this great discussion of terminology (sorry to be late - last week was school vacation week!). David, you raised some really good points in your post, and added greatly to our list of terms that can possibly be more useful to us in this work. 
I have to side with Rose in her point regarding whether or not a student knows they are being assessed. I also feel it's extremely important for them to know how, when, why, etc they will be assessed, but if the test is unobtrusive as David suggests, then this variable should actually be a positive one. It would provide them with information on what the expectations are for their performance. What do others think about this? David you also brought up standardization of unobtrusive tests, and Andrea noted some good examples of things teachers could observe of their students. Although I agree that it would be a tough thing to formally standardize such behaviors/acts/skills/interactions/etc, a teacher could take some measures to be as uniform in the process as possible. For example, they could use a standard form for checking on the things that Andrea notes, and provide themself with parameters for using the form (like use same time every day, always ask same questions in same order, never ask more than 2 students the same question, etc.). There is also what's known as triangulation - which is used in the classroom: it is a procedure that includes 3 ways of corroborating information (http://wiki.literacytent.org/index.php/Triangulation). This is also a possible way of making the collection of some of this info more valid. This is making me think of Exhibitions, which is a Coalition of Essential Schools initiative. It's different because it's high school, but there are many places where this discussion and that type of assessment overlap. Exhibitions, at the very least, would certainly fall under David's "Direct" measure category. (http://www.essentialschools.org/pub/ces_docs/resources/cp/assess/assess .html) So! Here's a tally of some of the terms that have been put out there during this discussion. Let's see if we can add to this list. Advocate for your preferred term as well, and let us know why you feel that way. Add to, or revise the definitions below as well. Constructed response - An exercise for which examinees must create their own responses or products (performance assessment ) rather than choose a response from an enumerated set (multiple choice). Selected response - An exercise for which examinees must choose a response from an enumerated set (multiple choice) rather than create their own responses or products (performance assessment ). A "direct" measure is one which measures an actual performance of a specified task. An "indirect" or "mediated" measure (multiple choice paper and pencil tests are the best-known example) is one which "stands for" the performance. "Obtrusive" assessment - that is, the student knows s/he is being tested. "Unobtrusive" assessment - the learner is assessed but doesn't know it, and just sees it as part of the learning, indistinguishable from the rest. "Positively obtrusive" - the assessment itself causes some additional positive learning. "Negatively obtrusive" - the assessment interferes with or prevents learning. Questions for you all: Performance assessment - What do you think of this term? Too vague as well? Has it taken on too much to be meaningful to us now? Authentic assessment - What do you think of this term? Too vague? Encompasses too much so tells us little? Participatory assessment - What do you think of this term? Is this in the same category as the ones above? Why or why not? Is this term useful for us? 
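One way to picture the standard form and triangulation described above: a fixed checklist filled in the same way each session, with a gain recorded only when three independent kinds of evidence agree. The sketch below is only an illustration; the fields and the all-three-sources rule are invented for the example and are not a validated instrument.

# Illustrative sketch only: a fixed observation checklist plus a simple
# triangulation rule. Field names and the "all three sources agree" rule are
# invented for the example, not a standardized instrument.

from dataclasses import dataclass

@dataclass
class Observation:
    student: str
    date: str
    arrived_early: bool      # the kinds of unobtrusive signs Andrea listed
    sat_near_front: bool
    joked_with_peers: bool

    def looks_engaged(self):
        """A very crude roll-up of the checklist into a single yes/no sign."""
        return sum([self.arrived_early, self.sat_near_front, self.joked_with_peers]) >= 2

def triangulated_gain(observation, portfolio_shows_gain, classwork_shows_gain):
    """Record a gain only when observation, portfolio, and classwork all agree."""
    return observation.looks_engaged() and portfolio_shows_gain and classwork_shows_gain

if __name__ == "__main__":
    obs = Observation("Student A", "2006-02-27", True, True, False)
    print(triangulated_gain(obs, portfolio_shows_gain=True, classwork_shows_gain=True))  # True

Even something this simple would make the observations more uniform across days and students without making the assessment feel obtrusive to the learner.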
Thanks for this rich discussion, marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of AWilder106 at aol.com Sent: Saturday, February 25, 2006 12:33 PM To: assessment at nifl.gov Subject: Re: [Assessment] "Alternative" assessment Sorry for the lateness of this message. "Unobtrusive measures" can be just that--a teacher stows them away as signs that the learner feels better about being in class, progress, etc.. Unobtrusive measures could be ---a student shows up early to class, looks better dressed, does not sit in the back of the room, jokes with other students... Andrea -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060227/ae733974/attachment.html From marie.cora at hotspurpartners.com Mon Feb 27 20:12:06 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 27 Feb 2006 20:12:06 -0500 Subject: [Assessment] FW: More ESOL Discussions Message-ID: <018501c63c03$fe4c4ea0$0402a8c0@frodo> Hello everyone, Please take a look at the following opportunity to discuss issues in ESOL on the FOB Listserv. marie ================================================= Hi All, Another guest discussion on the Focus on Basics List: Tues. Feb 28th - Mon. March 6th. A team of researchers/authors from the Lab School will discuss their articles from the recent FOB issue on ESOL Research (Vol. 8A). This discussion will include information about the Lab School itself, and several research projects that were done there involving student-to-student language and interaction, and a modified reading technique in beginning level adult ESOL classes. Many of the topics overlap, so it will be great to have these researchers lead the discussion as a team. The articles and authors include: The "Lab School" By Steve Reder http://www.ncsall.net/index.php?id=987 Same Activity, Different Focus [Pair Activities] By Kathryn Harris http://www.ncsall.net/index.php?id=988 A Conversation with FOB: Modified Sustained Silent Reading By Sandra Banke and Reuel Kurzet http://www.ncsall.net/index.php?id=990 Rewarding Conversations By Betsy Kraft http://www.ncsall.net/index.php?id=991 Spontaneous Conversations: A Window into Language Learners' Autonomy By Dominique Brillanceau http://www.ncsall.net/index.php?id=992 Turn Taking and Opening Interactions By John Hellermann http://www.ncsall.net/index.php?id=993 Please take a look at the articles and join us for this discussion! (If you are not subscribed to the FOB list, you can subscribe at: www.nifl.gov/mailman/listinfo/focusonbasics ) Sorry for the short notice, but we had a rare window of availability open! Julie Julie McKinney Discussion List Moderator World Education/NCSALL jmckinney at worlded.org From kabeall at comcast.net Tue Feb 28 12:53:25 2006 From: kabeall at comcast.net (Kaye Beall) Date: Tue, 28 Feb 2006 12:53:25 -0500 Subject: [Assessment] New from NCSALL--Practitioner Research, Practitioner Knowledge Message-ID: <002401c63c8f$dff086b0$0202a8c0@your4105e587b6> Visit the new Practitioner Research, Practitioner Knowledge section of NCSALL's Web site at http://www.ncsall.net/?id=967. Find out how practitioners learn about new research and then inquire about how this research might be used in their own practice. 
Teachers in the Northwest Practitioner Knowledge Institute learned about ESL research, made a change in their own practice, documented what happened when they made the change, and shared this knowledge in final reports. They developed and documented "practitioner knowledge" developed from learning about others' research. Teachers in the Minnesota Practitioner Research in Reading Project and the Practitioner Dissemination and Research Network learned about others' research and also conducted research of their own. After learning about new research findings in reading or learner persistence, these teachers developed a research question on one of these topics, planned an intervention or change in their own practice, collected data on what happened as a result, analyzed these data and reported their findings. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060228/6b02529a/attachment.html From marie.cora at hotspurpartners.com Thu Mar 2 12:23:12 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 2 Mar 2006 12:23:12 -0500 Subject: [Assessment] NAS Report now available Message-ID: <03e901c63e1d$fc869550$0402a8c0@frodo> Dear Colleagues, The report Measuring Literacy: Performance Levels for Adults authored by the Committee on Performance Levels for Adult Literacy at the National Research Council is now available at: http://fermat.nap.edu/catalog/11267.html The site includes instructions for purchasing the report as well as for reading it free on-line. marie cora Assessment Discussion Listserv Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060302/4064f856/attachment.html From marie.cora at hotspurpartners.com Mon Mar 6 12:30:53 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 6 Mar 2006 12:30:53 -0500 Subject: [Assessment] Conference opportunity Message-ID: <010101c64143$b8a6c7f0$0402a8c0@frodo> Dear List Members: The following conference announcement/call for presenters is from Jane Swing: Assessment is a concern and interest for members of the Virginia Association of Adult and Continuing Education. We are seeking presenters to provide sessions focused on utilizing assessment, as well as coordinating assessment and curricula. Please see the information provided below for inquiries. The Virginia Association of Adult and Continuing Education invites you to Sail into Spring The Virginia Association of Adult and Continuing Education (VAACE) invites you to attend its annual conference to be held May 3-5 at the Virginia Beach Resort and Conference Center in Virginia Beach, Virginia. The conference will provide a wonderful opportunity for you to network and learn about the latest trends and issues facing adult education. Vendors will be on hand to show off their newest publications along with tried and true ones. Awards will be presented to some of Virginia's best adult education practitioners. The conference planning team is in search of presenters to offer interesting and relevant sessions for our participants. They know there are many people doing great work in adult education. Now is the time to share what you are doing with others. 
Encourage your peers to share their work and knowledge through a workshop presentation. The workshop proposal form is attached to this message. Don't miss out on this opportunity to learn, renew, and relax in a wonderful setting with colleagues who are facing the same issues, challenges and rewards. More information about VAACE and this year's conference can be found at www.vaace.org . Jane C. Swing, Director Office of Adult Education and Literacy Projects Radford University PO Box 7015 A 136 Peters Hall Radford, VA 24142 540-831-6207 FAX 540-831-5779 jswing at radford.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060306/82a94856/attachment.html From marie.cora at hotspurpartners.com Wed Mar 8 11:51:40 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 8 Mar 2006 11:51:40 -0500 Subject: [Assessment] Discussion summary posted at Wiki Message-ID: <01ec01c642d0$934fa920$0402a8c0@frodo> Hi everyone, I hope this email finds you well. I have archived the discussion we had a couple of weeks ago regarding terminology and alternative assessment at the ALEWiki Assessment page: http://wiki.literacytent.org/index.php/Assessment_Information - click on Discussions. I titled it: " Alternative Assessment as an Appropriate Tool/The Value of Terminology" for lack of a better way to describe the discussion. I hope you find this useful. ***If you use this, or any of the resources at the Assessment Wiki, please let me know how you used them and for what purposes. This is very important information for me to learn in order for the Assessment Wiki to be a useful tool for you all. And if you would like to see other types of info put up at Wiki, let me know - I will help you do it :-) marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060308/5839556f/attachment.html From marie.cora at hotspurpartners.com Wed Mar 8 12:08:31 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 8 Mar 2006 12:08:31 -0500 Subject: [Assessment] To fudge or not to fudge Message-ID: <01f101c642d2$ed9506d0$0402a8c0@frodo> That is the question... Hello all! Not too long ago, I received an email question regarding submitting accurate data to the states and the Feds. It appeared that the person was being pressured to make up data (assessment scores) so that the outcomes of the program looked better. I bet this story is not new to you - either because you have heard about it, or perhaps because it has happened to you. So I have some questions now: If programs fudge their data, won't that come back to haunt us all? Won't that skew standards and either force programs to under-perform or not allow them to reach their performance levels because they are too steep? Why would you want to fudge your data? At some point, most-likely the fudge will be revealed don't you think? We don't have nationwide standards - so if programs within states are reporting data in any which way, we won't be able to compare ourselves across states, will we? Since states have all different standards (and some don't have any), states can report in ways in which it makes them appear to be out-doing other states, when perhaps they are not at all? 
I'm probably mushing 2 different and important things together here: the accurate data part, and the standards part ("on what do we base our data") - but this is how it's playing out in my mind. Not only do we sometimes struggle with providing accurate data (for a variety of reasons: it's complex, it's messy, we feel pressure, sometimes things are unclear, etc.), but we do not have institutionalized standards across all states for all to be working in parallel fashion. What are your thoughts on this? Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060308/bf2df3b8/attachment.html From ADNEILS at k12.carr.org Wed Mar 8 14:12:26 2006 From: ADNEILS at k12.carr.org (Andrea Neilson) Date: Wed, 08 Mar 2006 14:12:26 -0500 Subject: [Assessment] To fudge or not to fudge Message-ID: Marie, You've hit the mark with me on all the points you've raised regarding data. Having worked as the LWIS coordinator for the past 6 years, I've seen data collection requirements become more detailed and certainly more complex. When we went from scanning to on-line data entry, suddenly we had more restrictions, instant error notifications, and a series of "if this, then that" rules to institute. I don't consider myself a data-geek, but I really appreciate the more clearly defined and precise directives. I value our program's integrity and feel secure in our reporting as it's in line with our state's requirements. However, I still very often feel we're comparing apples to oranges when we look at our performance across fiscal years, against other providers, and against the state averages. I often wonder what practices other providers are using in regard to data management, intake and assessment and data quality (not to mention instruction and professional development). In the bigger picture, it would make sense to me if these practices were also either more clearly directed and defined or at the very least, shared and discussed among all providers so as to learn "what works." I'm a first time responder to this list and perhaps using this platform, we might share legitimate program practices that result in these anticipated outcomes? Thanks, Andrea Neilson Intake Assessment Specialist Carroll Adult Learning Connection Carroll County Public Schools 410-751-3680 ext. 221
From janeaddeo at comcast.net Wed Mar 8 14:39:10 2006 From: janeaddeo at comcast.net (janeaddeo at comcast.net) Date: Wed, 08 Mar 2006 19:39:10 +0000 Subject: [Assessment] To fudge or not to fudge Message-ID: <030820061939.2931.440F32DE0006C91F00000B732200750330010A0B0B0E0A020E06@comcast.net> Marie, Re: fudging - "Honesty is the best policy." Standards - the state and pertinent state standards, or lack thereof, should be indicated in the submitted data. Nationwide standards - should be the goal :>) -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060308/4acc3f32/attachment.html
From: "Marie Cora" Subject: [Assessment] To fudge or not to fudge Date: Wed, 8 Mar 2006 16:57:26 +0000 Size: 731 Url: http://www.nifl.gov/pipermail/assessment/attachments/20060308/4acc3f32/attachment.mht From Susan.Erno at ccs.k12.va.us Wed Mar 8 16:02:47 2006 From: Susan.Erno at ccs.k12.va.us (Susan Erno) Date: Wed, 08 Mar 2006 16:02:47 -0500 Subject: [Assessment] To fudge or not to fudge Message-ID: Marie Such a brave question to post. And a complicated one. and as you say, a messy one. I can give a perspective as a program administrator for a mid-size Adult Education program in Virginia. As in most programs, data collection and accountability are priorities. Our income sources include federal, state, local school, grant, private contracts, and tuition payments. The federal funding is actually just a small % of the total budget. Here are our options: 1. Count all students in the NRS data regardless of funding. our stats would like great but we would be giving a false picture of the actual cost per student. I know of many programs that report all students using the rationale, that the higher numbers make their program look better. The federal government actually rewards what you call "fudging" (others call it "being creative") by giving additional money as incentives to states who meet or exceed their goals. No one asks how the goals were met. I have seen reports showing 100% level gains and 12 hours attendance per student. 2. Count only students in classes which use federal/state funding. Numbers would be lower but more accurate. Goals might not look as good. Even here, this is not a true picture because of extra local funding. A full-time administrator and longer duration of classes give our program an edge. 3 Creative counting of students-including extra students only when it is advantageous to our performance. In this option, we would count everyone in our federal/state funded classes and we would only count successful students from classes with other funding sources. For example, if the jail class (which is fully paid for by the jail) has 11 students who pass the GED test, we count them. The rationale is that some federal/state dollars were spent on this class because the admin, admin assistant, and GED Examiner all spent time. Did they spend a significant amount of time? Maybe-the admin negotiated the contract, hired and trained staff; the ass't did the paperwork and tracking; and the GED examiner administered the tests. We wrestle with these issues and more. We have a teacher-led assessment team whose charge is to continue to improve the assessment process so that it is meaningful to all parties-students, teachers, program, all funders. The NRS is an imperfect system. As you know, each state creates its own benchmarks- so comparisons are impossible. Why should a state with low benchmarks be rewarded for meeting them? In Virginia, the benchmarks for level gains have become increasingly difficult to meet -given the short amount of time we have to teach. I can't condone "fudging" or "creative accounting" but I understand why it is done. We need to look at the NRS and its reward system as a starting point for reform. The system needs to change for the results to change. Susan Erno Adult Learning Center Charlottesville City Schools 1000 Preston Ave, Suite D Charlottesville, VA 22903 434 245-2817 f: 434 245-2601 >>> marie.cora at hotspurpartners.com 03/08/06 12:08 PM >>> That is the question*.. Hello all! 
From varshna at grandecom.net Thu Mar 9 07:30:50 2006 From: varshna at grandecom.net (Varshna Narumanchi-Jackson) Date: Thu, 09 Mar 2006 06:30:50 -0600 Subject: [Assessment] To fudge or not to fudge In-Reply-To: <01f101c642d2$ed9506d0$0402a8c0@frodo> Message-ID: Marie: I know the DOL requires states to undergo a data validation process in order to ensure the quality of the data they are receiving - does NRS or US Dept of Education? I've read several threads on this, and I'm not sure the problem is with the NRS or states' expectations. The problem will come when programs get to choose which students are included in program performance. The US Dept of Education has been slow to move on the Common Measures adopted across federal agencies funding employment and training programs. If you're counting everyone who receives a qualifying service, then no one is creaming the population to enhance performance and we're all using the same definitions for performance and outcomes, regardless of how the states define their benchmarks. Thanks, Varshna.
Won't > that skew standards and either force programs to under-perform or not allow > them to reach their performance levels because they are too steep? Why would > you want to fudge your data? At some point, most likely the fudge will be > revealed, don't you think? > > We don't have nationwide standards - so if programs within states are > reporting data in any which way, we won't be able to compare ourselves across > states, will we? > > Since states have all different standards (and some don't have any), states > can report in ways in which it makes them appear to be out-doing other states, > when perhaps they are not at all? > > I'm probably mushing 2 different and important things together here: the > accurate data part, and the standards part ("on what do we base our data") - > but this is how it's playing out in my mind. Not only do we sometimes struggle > with providing accurate data (for a variety of reasons: it's complex, it's > messy, we feel pressure, sometimes things are unclear, etc.), but we do not > have institutionalized standards across all states for all to be working in > parallel fashion. > > What are your thoughts on this? > > Thanks, > marie cora > Assessment Discussion List Moderator > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060309/0684babe/attachment.html From khinson at future-gate.com Fri Mar 10 07:50:28 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Fri, 10 Mar 2006 13:50:28 +0100 Subject: [Assessment] To fudge or not to fudge Message-ID: <44118424020000A0000021CD@fghamn01.ham.de.future-gate.com> That's a big question. I had to think about this one some before I could even begin to think about a response. I agree that it's not a new story and that it probably happens more than people realize. My personal opinion is that any time you tie funding primarily to performance, there are bound to be issues regarding data collection, and I'm not sure there is a "neat" solution that will address the problem. Also, I think there are gaps in the data itself. I teach in addition to other duties I have at my institution, and so often when I do outcomes, I don't feel they're a true indicator of a student's performance - a student may have made progress but not enough to move up a level, not enough to meet a goal or performance indicator - and gains like that aren't accounted for. Another problem we've encountered with the data itself is the fact that if a student marks "find a job" as a goal and ends up joining the military, the goal isn't counted as met because of the way the data is cross-referenced with ESC agencies, yet I think most people would agree that joining the military definitely qualifies as getting a job. I think that there are issues with the data collection instruments: while they may have been validated at some point, they perhaps need to be reviewed to see if they are capturing the right data needed to show a program's real performance, or if more needs to be taken into account when determining if a program is doing well. I don't think raw data alone can ever truly show a program's strengths and weaknesses. I'm still digesting this topic. It definitely warrants thought.
Regards Katrina Hinson >>> marie.cora at hotspurpartners.com >>> That is the question... Hello all! Not too long ago, I received an email question regarding submitting accurate data to the states and the Feds. It appeared that the person was being pressured to make up data (assessment scores) so that the outcomes of the program looked better. I bet this story is not new to you - either because you have heard about it, or perhaps because it has happened to you. So I have some questions now: If programs fudge their data, won't that come back to haunt us all? Won't that skew standards and either force programs to under-perform or not allow them to reach their performance levels because they are too steep? Why would you want to fudge your data? At some point, most-likely the fudge will be revealed don't you think? We don't have nationwide standards - so if programs within states are reporting data in any which way, we won't be able to compare ourselves across states, will we? Since states have all different standards (and some don't have any), states can report in ways in which it makes them appear to be out-doing other states, when perhaps they are not at all? I'm probably mushing 2 different and important things together here: the accurate data part, and the standards part ("on what do we base our data") - but this is how it's playing out in my mind. Not only do we sometimes struggle with providing accurate data (for a variety of reasons: it's complex, it's messy, we feel pressure, sometimes things are unclear, etc.), but we do not have institutionalized standards across all states for all to be working in parallel fashion. What are your thoughts on this? Thanks, marie cora Assessment Discussion List Moderator From varshna at grandecom.net Fri Mar 10 08:12:53 2006 From: varshna at grandecom.net (Varshna Narumanchi-Jackson) Date: Fri, 10 Mar 2006 07:12:53 -0600 Subject: [Assessment] To fudge or not to fudge In-Reply-To: <44118424020000A0000021CD@fghamn01.ham.de.future-gate.com> Message-ID: Katrina: Just thought I'd address one of the questions below: whether employment with the military is 'entered employment' for the purposes of WIA. It is my understanding that federal military wage records are part of the data set -- the other primary data set being UI wage records -- that states can utilize when evaluating employment outcomes. The question then becomes one of sharing data between Title I and Title II agencies for the purposes of evaluating program outcomes. It is not always clear to me that, when states have separate Title I and II agencies, this happens or that it happens with the kind of coordination expected at the federal level. The issue -- and I'm stating my opinion now -- is the attempt to make a distinction between education and training that (again, my opinion) has no practical relevance to the adults who seek ABE services and, maybe, has no relevance in the discussion of measuring the efficacy of federally-funded programs, especially when education appears to be considered a training service on the basis of its inclusion in federally-defined employment and training programs. Maybe that's a discussion for another list... I don't know if anyone has read the www.expectmore.gov page on Adult Education: http://www.whitehouse.gov/omb/expectmore/summary.10000180.2005.html It's interesting... Program Results and Accountability rates a ZERO. Thanks, Varshna on 3/10/06 6:50 AM, Katrina Hinson at khinson at future-gate.com wrote: > That's a big question. 
I had to think about this one some before I could even > begin to think about a response. I agree that it's not a new story and that > it probably happens more than people realize. My personal opinion is that any > time you tie funding primarily to performance, there are bound to be issues > regarding data collection and I'm not sure there is a "neat" solution that > will adress the problem. Also I think there are gaps in the data itself. I > teach in addition to other duties I have at my institution and so often when I > do outcomes, I don't feel they're a true indicator of a student's performance > - a student may have made progress but not enough to move up a level - not > enough to meet a goal or performance indicator - goals like that are accounted > for. Another problem we've encountered with the data itself is the fact that > if a students marks "find a job" as a goal and ends up joining the military, > the goal isn't met either because of the way the data is cross re > ferences with ESC agencies, yet I think most people would agree that joining > the military definitely qualifies as getting a job. > > I think that there are issues with the data collection instruments, that while > they may have been validated at some point, they perhaps need to be reviewed > to see if they are capturing the right data needed to show a program's real > performance or if more needs to be taken into account when determining if a > program is doing well. I don't think raw data alone can ever truly show a > programs strenghts and weaknesses. > > I'm still digesting this topic. It definitely warrants thought. > > Regards > Katrina Hinson > >>>> marie.cora at hotspurpartners.com >>> > That is the question... > > Hello all! Not too long ago, I received an email question regarding > submitting accurate data to the states and the Feds. It appeared that > the person was being pressured to make up data (assessment scores) so > that the outcomes of the program looked better. > > I bet this story is not new to you - either because you have heard about > it, or perhaps because it has happened to you. > > So I have some questions now: > > If programs fudge their data, won't that come back to haunt us all? > Won't that skew standards and either force programs to under-perform or > not allow them to reach their performance levels because they are too > steep? Why would you want to fudge your data? At some point, > most-likely the fudge will be revealed don't you think? > > We don't have nationwide standards - so if programs within states are > reporting data in any which way, we won't be able to compare ourselves > across states, will we? > > Since states have all different standards (and some don't have any), > states can report in ways in which it makes them appear to be out-doing > other states, when perhaps they are not at all? > > I'm probably mushing 2 different and important things together here: > the accurate data part, and the standards part ("on what do we base our > data") - but this is how it's playing out in my mind. Not only do we > sometimes struggle with providing accurate data (for a variety of > reasons: it's complex, it's messy, we feel pressure, sometimes things > are unclear, etc.), but we do not have institutionalized standards > across all states for all to be working in parallel fashion. > > What are your thoughts on this? 
> > Thanks, > marie cora > Assessment Discussion List Moderator > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > From marie.cora at hotspurpartners.com Fri Mar 10 12:52:21 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 10 Mar 2006 12:52:21 -0500 Subject: [Assessment] TABE frustration: a tempting fudge? Message-ID: <03be01c6446b$623f41a0$0402a8c0@frodo> Dear List members: the following post is from Tina Luffman. Hi NIFL listserv members, I use the TABE for state and national reporting purposes. Recently I have noticed a lot of students testing out of range and getting stuck taking three TABES, having to move to the "8" series to get a valid score. So far I have been able to get a valid score through this method. I can see where a program could be tempted to "fudge" a score on a student when the student is frustrated over being asked to take their third Reading TABE, for example. Is there any way that the developers of the TABE can make the gap lesser between their book levels or widen the white area of valid scores so we don't end up in this situation? Perhaps I am asking the wrong audience. Thanks, Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060310/8cbeff8d/attachment.html From varshna.jackson at twc.state.tx.us Fri Mar 10 12:54:10 2006 From: varshna.jackson at twc.state.tx.us (Jackson, Varshna) Date: Fri, 10 Mar 2006 11:54:10 -0600 Subject: [Assessment] TABE frustration: a tempting fudge? Message-ID: do you use the locator first? -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Friday, March 10, 2006 11:52 AM To: assessment at nifl.gov Subject: Re: [Assessment] TABE frustration: a tempting fudge? Dear List members: the following post is from Tina Luffman. Hi NIFL listserv members, I use the TABE for state and national reporting purposes. Recently I have noticed a lot of students testing out of range and getting stuck taking three TABES, having to move to the "8" series to get a valid score. So far I have been able to get a valid score through this method. I can see where a program could be tempted to "fudge" a score on a student when the student is frustrated over being asked to take their third Reading TABE, for example. Is there any way that the developers of the TABE can make the gap lesser between their book levels or widen the white area of valid scores so we don't end up in this situation? Perhaps I am asking the wrong audience. Thanks, Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060310/b9d67700/attachment.html From Tina_Luffman at yc.edu Fri Mar 10 16:00:04 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Fri, 10 Mar 2006 14:00:04 -0700 Subject: [Assessment] TABE frustration: a tempting fudge? Message-ID: An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060310/59f23619/attachment.html From kabeall at comcast.net Mon Mar 13 13:28:55 2006 From: kabeall at comcast.net (Kaye Beall) Date: Mon, 13 Mar 2006 13:28:55 -0500 Subject: [Assessment 233] New from NCSALL--An Evaluation of Focus on Basics Message-ID: <004501c646cb$fce85db0$0302a8c0@your4105e587b6> "It's not an expensive journal, but has high-quality articles with current research and techniques... it helps me stay connected with the profession," says one reader of Focus on Basics. The results of a survey on the impact of Focus on Basics on its readers are available on the NCSALL Web site at http://www.ncsall.net/?id=29#27. To order a printed version ($10), go to http://www.ncsall.net/?id=681. (Printed copies will be available by 3/17/06.) SNEAK PREVIEW: The findings were overwhelmingly upbeat. The 292 readers who completed the survey report that Focus on Basics has had a positive impact in the following ways: * It has influenced their beliefs about adult basic education. * It has helped them feel connected to the larger education community as professionals. * It has contributed to the development of communities of practice. * It has enabled them to make a connection between research and practice. * It has provided them with concrete ideas they have used to change their programs and practice. Four in-depth interviews with professional development providers are included as well. Read the report to find out more about how the publication is and can be used as a professional development tool. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060313/5329be6f/attachment.html From marie.cora at hotspurpartners.com Tue Mar 14 11:13:55 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 14 Mar 2006 11:13:55 -0500 Subject: [Assessment 234] Thoughts on apples and oranges Message-ID: <002d01c64782$4b43c150$0402a8c0@frodo> Hi everyone, I'm going to pick up where the discussion left off last week - we were exploring some of the frustrations with standards, reported data, and goals. Several of you noted that because of the lack of national standards, it's tough or next to impossible to compare performance across programs or states. Yet part of what the federal system does is compare states to one another in terms of identifying recipients of things like incentive grants and so forth. States are required to report on how they are able to show gain via pre- and post-test scores - but as Andrea and Susan pointed out in their posts, there is no standardized method for showing this gain - each state creates its own benchmarks. What can we do about this? We need a national set of standards. But before that? Jane noted that state standards should be indicated within the submitted data - do any states do this? (probably not because they are not required). Would this help? Let's think this possibility through a little... Susan described several scenarios for us in which one aspect necessarily must suffer in order for another aspect to be recognized (feel familiar to you?).
I hear this lament constantly: 'so as a program director, do I make sure my numbers work so I can continue to get funded to run my program, or do I not compromise the integrity of the teaching/learning process but run the risk of not showing good data?' (and then my program loses its funding, so integrity becomes a moot point). However, we must have an accountability system; I really don't believe anyone wants to throw around money without real proof that it's not being wasted. One of you noted that reform then, must happen at the root - at the NRS - what would that look like? Varshna - you asked if the NRS/DOE requirements included data validation as does DOLs requirements - not to my knowledge - but can anyone speak to this question? It's a good one. Varshna - do you believe that such data validation helps with the "apples/oranges" issue? How so? Finally, Katrina - you brought up the 'gaps in data' issue and cited the "unanticipated" goals situation as an example. This is also something we need to address: if a student changes a goal or achieves a goal that was not specifically set at the outset of the learning process - this happens all the time actually and is normal behavior: shifting and changing your goals based on your experience and progress can logically happen during a learning process. But often, these goals can get lost or don't get counted or cannot be counted because our system does not give us a way to show increments for example. What do we need? National standards? Is that the most important thing that will help combat these issues? A different way to capture learning? What would that look like? Remember that the needs of the funder and public are quite different than the needs of the teacher and student - and both are legitimate needs. What are your thoughts on these issues? Thanks, marie cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060314/b1278562/attachment.html From marie.cora at hotspurpartners.com Tue Mar 14 11:33:42 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 14 Mar 2006 11:33:42 -0500 Subject: [Assessment 235] Re: [Assessment] Tying data to performance and performance to data In-Reply-To: Message-ID: <004001c64785$0f46f9d0$0402a8c0@frodo> Hi Varshna and all, So are you saying that you believe the federal system separates employment and training for purposes of data collection only? And that it is not to distinguish between opportunities? Clarify this for us a bit if you can ok? I did go to the ExpectMore url you suggested below and I was intrigued by the Improvement Plan they propose: it focuses on developing uniform or standardized mechanisms for collecting data - they clearly believe that this will make an impact on performance. There is also this item: 'Measuring how program participation impacts an individual's earnings.' - What do folks think of this? Do people feel that we can effectively tie earnings to program participation (and vice versa)? 
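Just to make the mechanics of that concrete - and this is an illustration only, with invented field names, quarters, and dollar figures rather than any state's actual match layout - the kind of wage-record match Varshna described might look roughly like this in code:

# Illustrative sketch only: invented field names and toy data, not a real
# NRS or DOL matching specification.

def match_employment_outcomes(exiters, wage_records):
    """exiters: list of dicts with 'id', 'exit_quarter', and 'pre_earnings'.
    wage_records: dict mapping (id, quarter) -> reported quarterly earnings."""
    outcomes = []
    for student in exiters:
        follow_up_quarter = student["exit_quarter"] + 1
        earnings = wage_records.get((student["id"], follow_up_quarter))
        outcomes.append({
            "id": student["id"],
            "entered_employment": earnings is not None and earnings > 0,
            "earnings_change": (earnings - student["pre_earnings"])
                               if earnings is not None else None,
        })
    return outcomes

# Toy example: one exiter shows up in the UI wage file, one does not
# (say, someone who joined the military and appears only in a file that
# never gets matched).
exiters = [{"id": "A1", "exit_quarter": 2, "pre_earnings": 0},
           {"id": "B2", "exit_quarter": 2, "pre_earnings": 1500}]
ui_wages = {("A1", 3): 4200}
print(match_employment_outcomes(exiters, ui_wages))

Even in this toy version you can see the gap Katrina raised: someone whose wages land only in a file the match never touches simply drops out of the outcome, no matter what actually happened to them.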
Thanks, marie Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Varshna Narumanchi-Jackson Sent: Friday, March 10, 2006 8:13 AM To: The Assessment Discussion List Subject: Re: [Assessment] To fudge or not to fudge Katrina: Just thought I'd address one of the questions below: whether employment with the military is 'entered employment' for the purposes of WIA. It is my understanding that federal military wage records are part of the data set -- the other primary data set being UI wage records -- that states can utilize when evaluating employment outcomes. The question then becomes one of sharing data between Title I and Title II agencies for the purposes of evaluating program outcomes. It is not always clear to me that, when states have separate Title I and II agencies, this happens or that it happens with the kind of coordination expected at the federal level. The issue -- and I'm stating my opinion now -- is the attempt to make a distinction between education and training that (again, my opinion) has no practical relevance to the adults who seek ABE services and, maybe, has no relevance in the discussion of measuring the efficacy of federally-funded programs, especially when education appears to be considered a training service on the basis of its inclusion in federally-defined employment and training programs. Maybe that's a discussion for another list... I don't know if anyone has read the www.expectmore.gov page on Adult Education: http://www.whitehouse.gov/omb/expectmore/summary.10000180.2005.html It's interesting... Program Results and Accountability rates a ZERO. Thanks, Varshna on 3/10/06 6:50 AM, Katrina Hinson at khinson at future-gate.com wrote: > That's a big question. I had to think about this one some before I could even > begin to think about a response. I agree that it's not a new story and that > it probably happens more than people realize. My personal opinion is that any > time you tie funding primarily to performance, there are bound to be issues > regarding data collection and I'm not sure there is a "neat" solution that > will adress the problem. Also I think there are gaps in the data itself. I > teach in addition to other duties I have at my institution and so often when I > do outcomes, I don't feel they're a true indicator of a student's performance > - a student may have made progress but not enough to move up a level - not > enough to meet a goal or performance indicator - goals like that are accounted > for. Another problem we've encountered with the data itself is the fact that > if a students marks "find a job" as a goal and ends up joining the military, > the goal isn't met either because of the way the data is cross re > ferences with ESC agencies, yet I think most people would agree that joining > the military definitely qualifies as getting a job. > > I think that there are issues with the data collection instruments, that while > they may have been validated at some point, they perhaps need to be reviewed > to see if they are capturing the right data needed to show a program's real > performance or if more needs to be taken into account when determining if a > program is doing well. I don't think raw data alone can ever truly show a > programs strenghts and weaknesses. > > I'm still digesting this topic. It definitely warrants thought. > > Regards > Katrina Hinson > >>>> marie.cora at hotspurpartners.com >>> > That is the question... > > Hello all! 
Not too long ago, I received an email question regarding > submitting accurate data to the states and the Feds. It appeared that > the person was being pressured to make up data (assessment scores) so > that the outcomes of the program looked better. > > I bet this story is not new to you - either because you have heard about > it, or perhaps because it has happened to you. > > So I have some questions now: > > If programs fudge their data, won't that come back to haunt us all? > Won't that skew standards and either force programs to under-perform or > not allow them to reach their performance levels because they are too > steep? Why would you want to fudge your data? At some point, > most likely the fudge will be revealed, don't you think? > > We don't have nationwide standards - so if programs within states are > reporting data in any which way, we won't be able to compare ourselves > across states, will we? > > Since states have all different standards (and some don't have any), > states can report in ways in which it makes them appear to be out-doing > other states, when perhaps they are not at all? > > I'm probably mushing 2 different and important things together here: > the accurate data part, and the standards part ("on what do we base our > data") - but this is how it's playing out in my mind. Not only do we > sometimes struggle with providing accurate data (for a variety of > reasons: it's complex, it's messy, we feel pressure, sometimes things > are unclear, etc.), but we do not have institutionalized standards > across all states for all to be working in parallel fashion. > > What are your thoughts on this? > > Thanks, > marie cora > Assessment Discussion List Moderator > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From djrosen at comcast.net Tue Mar 14 23:05:24 2006 From: djrosen at comcast.net (David Rosen) Date: Tue, 14 Mar 2006 23:05:24 -0500 Subject: [Assessment 236] Re: : A National System of Adult Education and Literacy In-Reply-To: <002d01c64782$4b43c150$0402a8c0@frodo> References: <002d01c64782$4b43c150$0402a8c0@frodo> Message-ID: Assessment Colleagues, Marie wrote: > What do we need? National standards? Is that the most important > thing that will help combat these issues? > > A different way to capture learning? What would that look like? > Remember that the needs of the funder and public are quite > different than the needs of the teacher and student - and both are > legitimate needs. > > What are your thoughts on these issues? Ignore for the moment the current political realities, and consider just the merits and faults, not the practicalities, of what I propose, a national System of Adult Education and Literacy which has three aligned components: National Curriculum Standards, (Free) National Curricula, and Standardized Assessments. Such a system could have other components, but for now, I suggest we look at these three. 1.
Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, b) ABE (including adult basic education), c) ASE (adult secondary education/GED/EDP/ADP), and d) Transition to College programs, developed through a process which is widely respected by the field. (Some would argue that we already have that in Equipped for the Future.) 2. National curricula developed based on those standards and available for states to adopt (or adapt) as they choose. The curricula need to be comprehensive, modularized, available in generic as well as work-contextualized units, in English but also bilingual in Spanish and possibly other languages. They need to be available free online in units that teachers could download and use in their classrooms, that tutors could use with their one-on-one or small group instruction, and in self-instructional formats that adult learners could use directly online. (Yes, I know how big a task all this is.) 3. Standardized assessments developed against the national curriculum standards (tests, but also performance-based, direct assessments) which have a high degree of validity for measuring the national standards. Some might think that what I propose is too top-down. I would argue that it could be very bottom-up if the field -- and adult learner leaders -- are/have been/will be well-represented in setting the standards, and if the modules can be selected to meet specific learner goals and contexts as well as to the standards. A national curriculum could be made up of a database of thousands of units of instruction (modules, learning objects) which could be very easily found and in minutes organized/reorganized to fit learners' goals and contexts. An adult learner or a group who need to improve their reading skills and who are interested in the context of parenting could easily access standards-based modules on parenting issues with reading materials at the right level(s). A teacher whose students worked in health care and who needed to improve their math skills could quickly find and download materials/lessons for using numeracy in health care settings. A student who wanted to learn online and who wanted a job in environmental cleanup work could access standards-based basic skills/occupational education lessons in this area, accompanied by an online career coach and an online tutor. These examples just hint at the complexity and sophistication of what I propose, and will have some shaking their heads at the cost. But consider that if this is a national curriculum, the costs of developing such modules have the benefits of scale, and those curricula could be widely used -- and freely available. (Sorry, publishers, this could eat into your profits.) There is more, but I'll stop with this. Okay, let the questions and brickbats fly. David J. Rosen djrosen at comcast.net From phandy at wcboe.org Wed Mar 15 09:07:50 2006 From: phandy at wcboe.org (PATRICIA HANDY) Date: Wed, 15 Mar 2006 09:07:50 -0500 Subject: [Assessment 237] Re: : A National System of Adult Education and Literacy Message-ID: David and All, As a practitioner for 27 years, now responsible for training new staff, I applaud your suggestions. I would not be applauding if you had proposed a rigid "this curriculum fits all" plan, but as to providing standardized resources from which each teacher or learner could customize a learning plan, YES! YES!
Pat Handy 410-749-3217 Coordinator, Wicomico County Adult Learning Center Philmore Commons, Salisbury Confidentiality Note: This message may contain confidential information intended only for the use of the person named above and may contain communication protected by law. If you have received this message in error, you are hereby notified that any dissemination, distribution, copying or other use of this message is prohibited and you are requested to notify the sender immediately at his/her electronic mail. >>> djrosen at comcast.net 03/14/06 11:05 PM >>> Assessment Colleagues, Marie wrote: > What do we need? National standards? Is that the most important > thing that will help combat these issues? > > A different way to capture learning? What would that look like? > Remember that the needs of the funder and public are quite > different than the needs of the teacher and student * and both are > legitimate needs. > > What are your thoughts on these issues? Ignore for the moment the current political political realities, and consider just the merits and faults, not the practicalities, of what I propose, a national System of Adult Education and Literacy which has three aligned components: National Curriculum Standards, (Free) National Curricula, and Standardized Assessments. Such a system could have other components, but for now, I suggest we look at these three. 1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, b) ABE (including adult basic education) c) ASE (adult secondary education/GED/EDP/ADP) and d) Transition to College programs , developed through a process which is widely respected by the field. (Some would argue that we already have that in Equipped for the Future.) 2. National curricula developed based on those standards and available for states to adopt (or adapt) as they choose. The curricula need to be comprehensive, modularized, available in generic as well as work-contextualized units, in English but also bilingual in Spanish and possibly other languages. It needs to be available free online in units that teachers could download and use in their classrooms, that tutors could use with their one-one-one or small group instruction, and in self-instructional formats that adult learners could use directly online. (Yes I know how big a task all this is.) 3. Standardized assessments developed against the national curriculum standards (tests, but also performance-based, direct assessments) which have a high degree of validity for measuring the national standards. Some might think that what I propose is too top-down. I would argue that it could be very bottom-up if the field -- and adult learner leaders -- are/have been/will be well-represented in setting the standards, and if the modules can be be selected to meet specific learner goals and contexts as well as to the standards. A national curriculum could be made up of a database of thousands of units of instruction (modules, learning objects) which could be very easily found and in minutes organized/reorganized to fit learners' goals and contexts. An adult learner or a group who need to improve their reading skills and who are interested in the context of parenting could easily access standards-based modules on parenting issues with reading materials at the right level(s). A teacher whose students worked in health care and who needed to improve their math skills could quickly find and download materials/lessons for using numeracy in health care settings. 
A student who wanted to learn online and who wanted a job in environmental cleanup work could access standards- based basic skills/occupational education lessons in this area, accompanied by an online career coach and and online tutor. These examples just hint at the complexity and sophistication of what I propose, and will have some shaking their heads at the cost. But, consider that if this is a national curriculum, the costs of developing such modules have the benefits of scale, that those curricula could be widely used -- and freely available. (Sorry publishers, this could eat into your profits.) There is more, but I'll stop with this. Okay, let the questions and brickbats fly. David J. Rosen djrosen at comcast.net ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From Ajit.Gopalakrishnan at ct.gov Wed Mar 15 09:48:11 2006 From: Ajit.Gopalakrishnan at ct.gov (Gopalakrishnan, Ajit) Date: Wed, 15 Mar 2006 09:48:11 -0500 Subject: [Assessment 238] Re: Thoughts on apples and oranges Message-ID: <281DD0D97E3EC94FB83030B1379CE42601D9EC46@DOIT-EX302.exec.ds.state.ct.us> My comments on some of the issues are below. Thanks. Ajit Ajit Gopalakrishnan Connecticut Department of Education 25 Industrial Park Road Middletown, CT 06457 Phone: (860) 807-2125 Fax: (860) 807-2062 Email: ajit.gopalakrishnan at ct.gov ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, March 14, 2006 11:14 AM To: assessment at nifl.gov Subject: [Assessment 234] Thoughts on apples and oranges Hi everyone, I'm going to pick up where the discussion left off last week - we were exploring some of the frustrations with standards, reported data, and goals. Several of you noted that because of the lack of national standards, it's tough or next to impossible to compare performance across programs or states. Actually, I think that the NRS accomplishes just that by outlining a nationally agreed upon set of skill levels and level descriptors. The NRS does not prescribe what standards a state has to use. It simply tries to compare the performance outcomes resulting from whatever standards the state is using. Yet part of what the federal system does is compare states to one another in terms of identifying recipients of things like incentive grants and so forth. I believe the federal system does not compare states to each other to determine incentive awards. That determination is based on a state's performance relative to its own performance targets. There is some level of state-to-state comparison in the process of establishing performance targets. States are required to report on how they are able to show gain via pre and post test scores - but as Andrea and Susan pointed out in their posts, there is no standardized method for showing this gain - each state creates its own benchmarks. I don't believe this statement is correct. The NRS benchmarks don't vary by state. The standardized method for showing gain is the NRS which requires the use of approved standardized assessments. What can we do about this? We need a national set of standards. But before that? Jane noted that state standards should be indicated within the submitted data - do any states do this? (probably not because they are not required). Would this help? Let's think this possibility through a little..... 
Susan described several scenarios for us in which one aspect necessarily must suffer in order for another aspect to be recognized (feel familiar to you?). I hear this lament constantly: 'so as a program director, do I make sure my numbers work so I can continue to get funded to run my program, or do I not compromise the integrity of the teaching/learning process but run the risk of not showing good data?' (and then my program loses its funding, so integrity becomes a moot point). However, we must have an accountability system; I really don't believe anyone wants to throw around money without real proof that it's not being wasted. One of you noted that reform then, must happen at the root - at the NRS - what would that look like? I have not read the original scenarios but have heard similar feelings expressed by some practitioners. This tends to be a no-win situation for those practitioners. My feeling is that state and local administrators need to be completely committed to maximizing the value of standardized assessments, and using data for decision making (not just data about test scores but also data relative to recruitment, retention, attendance, goal setting, etc. and data that might sometimes be very unflattering). Varshna - you asked if the NRS/DOE requirements included data validation as does DOLs requirements - not to my knowledge - but can anyone speak to this question? It's a good one. Varshna - do you believe that such data validation helps with the "apples/oranges" issue? How so? The NRS does require an extensive data validation checklist that looks at data structures, data systems, policy requirements, edit checks, and professional development. States submit this checklist along with their end-of-year data submission. Finally, Katrina - you brought up the 'gaps in data' issue and cited the "unanticipated" goals situation as an example. This is also something we need to address: if a student changes a goal or achieves a goal that was not specifically set at the outset of the learning process - this happens all the time actually and is normal behavior: shifting and changing your goals based on your experience and progress can logically happen during a learning process. But often, these goals can get lost or don't get counted or cannot be counted because our system does not give us a way to show increments for example. Yes, currently only goal-based outcomes relative to employment, postsecondary and diploma are of the greatest importance for NRS reporting. States can establish accountability systems that reward (monetary or otherwise) other outcomes if they so choose. What do we need? National standards? Is that the most important thing that will help combat these issues? A different way to capture learning? What would that look like? Remember that the needs of the funder and public are quite different than the needs of the teacher and student - and both are legitimate needs. What are your thoughts on these issues? Thanks, marie cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060315/2eb6725b/attachment.html From hdooley at riral.org Wed Mar 15 10:44:53 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) 
Date: Wed, 15 Mar 2006 10:44:53 -0500 Subject: [Assessment 239] Re: : A National System of Adult Education and Literacy In-Reply-To: References: <002d01c64782$4b43c150$0402a8c0@frodo> Message-ID: <44183675.8040303@riral.org> I think David has a good, basic overall plan here. I wouldn't say the plan is top-down either. I think it recognizes that there is a push today to be able to look at success across states and throughout the country, and for that we need a way to connect our local efforts into a national system. Think globally; act locally -- as always the best politics and the best basis for a system of adult ed. But my sense is that, right now, funders are in favor of such a national system, but most practitioners are not. Because, really, the benefits of such a system are largely for the funders, policy makers, and big-picture people; for the instructor and learner in the classroom, what is the impact of it? How does it matter that what I need to learn and am mastering to get a job in RI is also what someone needs to learn and master to enter a community college in AL? It may be interesting, but what does it matter? I also think that many of the standards, curriculum and assessment pieces already exist. If one has the time -- and right now it takes time, believe me -- to peruse and ferret the web, you can find a wealth of excellent curricula that is the start of a "comprehensive, modularized [curriculum], available in generic as well as work-contextualized units, in English". Much of it is "available free online in units that teachers could download and use in their classrooms, that tutors could use with their one-on-one or small group instruction". We use several items for our EL Civics, ESL listening and ABE math curricula that are from the web. The weakest link for us is "material in self-instructional formats that adult learners can use directly online." There's a lot of print stuff that's been transferred to the web, but it's not exciting or constructivist enough to engage self-directed learners, unless they are high-level readers and highly self-motivated. So, I think we could get there more quickly than we might think, but only if most of us really want to get there at all. From a sincere, big-picture kind-of-guy, Howard D. David Rosen wrote: >Assessment Colleagues, > >Marie wrote: > > >>What do we need? National standards? Is that the most important >>thing that will help combat these issues? >> >>A different way to capture learning? What would that look like? >>Remember that the needs of the funder and public are quite >>different than the needs of the teacher and student - and both are >>legitimate needs. >> >>What are your thoughts on these issues? >> >> > >Ignore for the moment the current political realities, and >consider just the merits and faults, not the practicalities, of what >I propose, a national System of Adult Education and Literacy which >has three aligned components: National Curriculum Standards, (Free) >National Curricula, and Standardized Assessments. Such a system >could have other components, but for now, I suggest we look at these >three. > >1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, >b) ABE (including adult basic education) c) ASE (adult secondary >education/GED/EDP/ADP) and d) Transition to College programs, >developed through a process which is widely respected by the field. >(Some would argue that we already have that in Equipped for the Future.) > >2.
National curricula developed based on those standards and >available for states to adopt (or adapt) as they choose. The >curricula need to be comprehensive, modularized, available in generic >as well as work-contextualized units, in English but also bilingual >in Spanish and possibly other languages. It needs to be available >free online in units that teachers could download and use in their >classrooms, that tutors could use with their one-one-one or small >group instruction, and in self-instructional formats that adult >learners could use directly online. (Yes I know how big a task all >this is.) > >3. Standardized assessments developed against the national curriculum >standards (tests, but also performance-based, direct assessments) >which have a high degree of validity for measuring the national >standards. > >Some might think that what I propose is too top-down. I would argue >that it could be very bottom-up if the field -- and adult learner >leaders -- are/have been/will be well-represented in setting the >standards, and if the modules can be be selected to meet specific >learner goals and contexts as well as to the standards. A national >curriculum could be made up of a database of thousands of units of >instruction (modules, learning objects) which could be very easily >found and in minutes organized/reorganized to fit learners' goals and >contexts. An adult learner or a group who need to improve their >reading skills and who are interested in the context of parenting >could easily access standards-based modules on parenting issues with >reading materials at the right level(s). A teacher whose students >worked in health care and who needed to improve their math skills >could quickly find and download materials/lessons for using numeracy >in health care settings. A student who wanted to learn online and who >wanted a job in environmental cleanup work could access standards- >based basic skills/occupational education lessons in this area, >accompanied by an online career coach and and online tutor. These >examples just hint at the complexity and sophistication of what I >propose, and will have some shaking their heads at the cost. But, >consider that if this is a national curriculum, the costs of >developing such modules have the benefits of scale, that those >curricula could be widely used -- and freely available. (Sorry >publishers, this could eat into your profits.) > >There is more, but I'll stop with this. > >Okay, let the questions and brickbats fly. > >David J. Rosen >djrosen at comcast.net > >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > > From djrosen at comcast.net Wed Mar 15 13:09:06 2006 From: djrosen at comcast.net (David Rosen) Date: Wed, 15 Mar 2006 13:09:06 -0500 Subject: [Assessment 239] Re: : A National System of Adult Education and Literacy In-Reply-To: <44183675.8040303@riral.org> References: <002d01c64782$4b43c150$0402a8c0@frodo> <44183675.8040303@riral.org> Message-ID: <09D7944F-651C-4AC2-B1EA-7F622BA1ABEA@comcast.net> Howard, Thanks for your thoughtful comments. See my replies below. I hope others will join in this discussion, too, from the Assessment list and from the Content Standards list. David J. Rosen djrosen at comcast.net On Mar 15, 2006, at 10:44 AM, Howard L. Dooley, Jr. wrote: > I think David has a good, basic overall plan here. 
I wouldn't say the > plan is top-down either. I think it recognizes that there is push > today > to be able to look at success across states and throughout the > country, > and for that we need a way to connect our local efforts into a > national > system. Think globally; act locally -- as always the best politics > and > the best basis for a system of adult ed. But my sense is that, right > now, funders are in favor of such a national system, but most > practitioners are not. I wish that funders _were_ in favor of this. The largest adult education funder, the U.S. Department of Education, is reluctant to establish a set of national curriculum standards. I am not sure why, but guess that it is because a long-standing tradition that curriculum standards in American Education are in the control of local school committees and state boards of education. The closest the USDOE has come to this is funding the development of a "warehouse" of state curriculum standards, [ http://www.adultedcontentstandards.org/Source/GetStandard.asp ] and (through the National Institute for Literacy) supporting the development -- but not the endorsement as national curriculum standards - of Equipped for the Future. (I am not sure I have that exactly right so if someone has better information, please let us know.) The problem, as many people have said, is not that we lack standards in the U.S., but that we have too many competing sets of standards. We lack a set of national standards that everyone uses. > Because, really, the benefits of such a system > are largely for the funders, policy makers, and big-picture people; > for > the instructor and learner in the classroom, what is the impact of it? > How does it matter that what I need to learn and am mastering to get a > job in RI is also what someone needs to learn and master to enter a > community college in AL? It may be interesting, but what does it > matter? I agree that a system such as I propose would benefit funders. However, it would also benefit teachers and learners. A lot of curriculum -- often very good curriculum -- is developed in programs and states across the country. But much of it is not published, and if it is, is not easily accessed. It is possible to find some good curriculum through NIFL LINCS, and in other places on the Web, for example, but this takes time, a lot of time. Teachers don't have much time to search for curriculum. It would be of great interest to most teachers if high quality curriculum --ready to download and use -- and adapt to local needs -- in class tomorrow could _easily be found_. Let me give you an example. As I understand it (folks from Arizona correct me if I got this wrong) Arizona has a set of state ESOL standards that are widely used, and respected by ESOL teachers there. A couple of ESOL teachers at Pima County Community College decided that they were useful as far as they went, but they wanted to have good web-based instruction linked to those standards. So they spent hours and hours finding -- and linking -- instruction on a Web page that they call The Splendid ESOL Web [ http://cc.pima.edu/ ~slundquist/index.htm ] When I was doing workshops in Arizona a couple of years ago, ESOL teachers popped up from across the state to tell me about The Splendid ESOL Web and how useful it is to them. This is instructional for us all: a set of standards developed by and respected by teachers, a set of online instructional resources found and organized/linked by ESOL Teachers, and widely used by other ESOL teachers. 
This sounds like a model to emulate in national curriculum development. Take this a step further. Suppose we had an agreed-upon format for developing instructional resources, nothing fancy, one that most teachers found easy to understand, easy to use, and that was linked to national standards. Suppose further that the format referenced national curriculum standards, that every lesson or module or learning object built by a teacher referenced a national curriculum standard. Then suppose the modules teachers developed were peer- reviewed and those that were approved were stored in an easily- accessible Web-based instructional lesson/module/learning objects database where other teachers could access them by standard, topic, level, etc. Some of the elements of what I have described are in place. For example, the Lesson Plan Builder, developed by OTAN in California [ http://www.lessonplanbuilder.org/lessons/ ], has a practical format for creating lesson plans online, and links them to California (and nationally used) standards. OTAN plans to store these lessons in an accessible database. When that's done, the teacher's chore of finding good lesson plans will be easier. Also, I very much like that these are lesson plans created "bottom up" by teachers (or perhaps even by teachers and their students together.) > I also think that many of the standards, curriculum and assessment > pieces already exist. If one has the time -- and right now it takes > time, believe me -- to peruse and ferret the web, you can find a > wealth > of excellent curricula that is the start of a "comprehensive, > modularized [curriculum], available in generic as well as > work-contextualized units, in English". Yes, much of it is there -- and it's hard to find. Some of it is not there, however. Try to find work-contextualized online lessons which students can access directly (not teacher lesson plans but student lessons online.) I have been searching high and low for these -- in health care work -- but haven't found much. Yet, given the good jobs going begging in health care in New England -- and elsewhere -- wouldn't it be useful if health care workers could do some of their basic skills learning online and if the instruction were contextualized or embedded in health care work? > Much of it "available in free > online in units that teachers could download and use in their > classrooms, that tutors could use with their one-one-one or smallgroup > instruction". We use several items for our EL Civics, ESL > listening and > ABE math curriculua that are from the web. The weakest link for us is > "material in self-instructional formats that adult learners can use > directly online." Yes, that is the weakest link. > There's a lot of print stuff that's been transferred > to the web, put it's not exciting or constructivist enough to engage > self-directed learners, unless they are high level readers and highly > self-motivated. Right you are. > So, I think we could get there more quickly than we might think, but > only if most of us really want to get there at all. Your state, Rhode Island, the first wireless Internet access state, border-to-border, would be a perfect "testbed" for a system such as I am proposing. I think if teachers and tutors understood how useful this could be they would clamor for it. Maybe you could get teachers in Rhode Island to think about this. > From a sincere, big-picture kind-of-guy, > Howard D. > > > > > > > > David Rosen wrote: > >> Assessment Colleagues, >> >> Marie wrote: >> >> >>> What do we need? 
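To make the database idea a bit more concrete, here is a very rough sketch - the field names, tags, records, and URLs are invented purely for illustration, not a proposed design - of how units of instruction might be tagged and then pulled up by standard, topic, and level:

# Rough illustration only: invented records, field names, and URLs.

modules = [
    {"title": "Reading a medication label", "standard": "Read With Understanding",
     "topic": "health care", "level": "ABE intermediate",
     "url": "http://example.org/med-label"},
    {"title": "Measuring doses: ratio and proportion", "standard": "Use Math to Solve Problems",
     "topic": "health care", "level": "ABE intermediate",
     "url": "http://example.org/doses"},
    {"title": "Talking with your child's teacher", "standard": "Speak So Others Can Understand",
     "topic": "parenting", "level": "ESL high beginning",
     "url": "http://example.org/teacher-talk"},
]

def find_modules(standard=None, topic=None, level=None):
    """Return every module that matches all of the filters supplied."""
    return [m for m in modules
            if (standard is None or m["standard"] == standard)
            and (topic is None or m["topic"] == topic)
            and (level is None or m["level"] == level)]

# A teacher of health care workers hunting for numeracy material:
for m in find_modules(topic="health care"):
    print(m["level"], "-", m["title"], "-", m["url"])

The point is only that once every unit carries that kind of tagging against the standards, the health care numeracy search described above takes seconds instead of an evening of ferreting the web.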
National standards? Is that the most important >>> thing that will help combat these issues? >>> >>> A different way to capture learning? What would that look like? >>> Remember that the needs of the funder and public are quite >>> different than the needs of the teacher and student ? and both are >>> legitimate needs. >>> >>> What are your thoughts on these issues? >>> >>> >> >> Ignore for the moment the current political political realities, and >> consider just the merits and faults, not the practicalities, of what >> I propose, a national System of Adult Education and Literacy which >> has three aligned components: National Curriculum Standards, (Free) >> National Curricula, and Standardized Assessments. Such a system >> could have other components, but for now, I suggest we look at these >> three. >> >> 1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, >> b) ABE (including adult basic education) c) ASE (adult secondary >> education/GED/EDP/ADP) and d) Transition to College programs , >> developed through a process which is widely respected by the field. >> (Some would argue that we already have that in Equipped for the >> Future.) >> >> 2. National curricula developed based on those standards and >> available for states to adopt (or adapt) as they choose. The >> curricula need to be comprehensive, modularized, available in generic >> as well as work-contextualized units, in English but also bilingual >> in Spanish and possibly other languages. It needs to be available >> free online in units that teachers could download and use in their >> classrooms, that tutors could use with their one-one-one or small >> group instruction, and in self-instructional formats that adult >> learners could use directly online. (Yes I know how big a task all >> this is.) >> >> 3. Standardized assessments developed against the national curriculum >> standards (tests, but also performance-based, direct assessments) >> which have a high degree of validity for measuring the national >> standards. >> >> Some might think that what I propose is too top-down. I would argue >> that it could be very bottom-up if the field -- and adult learner >> leaders -- are/have been/will be well-represented in setting the >> standards, and if the modules can be be selected to meet specific >> learner goals and contexts as well as to the standards. A national >> curriculum could be made up of a database of thousands of units of >> instruction (modules, learning objects) which could be very easily >> found and in minutes organized/reorganized to fit learners' goals and >> contexts. An adult learner or a group who need to improve their >> reading skills and who are interested in the context of parenting >> could easily access standards-based modules on parenting issues with >> reading materials at the right level(s). A teacher whose students >> worked in health care and who needed to improve their math skills >> could quickly find and download materials/lessons for using numeracy >> in health care settings. A student who wanted to learn online and who >> wanted a job in environmental cleanup work could access standards- >> based basic skills/occupational education lessons in this area, >> accompanied by an online career coach and and online tutor. These >> examples just hint at the complexity and sophistication of what I >> propose, and will have some shaking their heads at the cost. 
But, >> consider that if this is a national curriculum, the costs of >> developing such modules have the benefits of scale, that those >> curricula could be widely used -- and freely available. (Sorry >> publishers, this could eat into your profits.) >> >> There is more, but I'll stop with this. >> >> Okay, let the questions and brickbats fly. >> >> David J. Rosen >> djrosen at comcast.net >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment >> >> >> > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From andreawilder at comcast.net Wed Mar 15 13:26:26 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Wed, 15 Mar 2006 13:26:26 -0500 Subject: [Assessment 240] Re: : A National System of Adult Education and Literacy In-Reply-To: References: <002d01c64782$4b43c150$0402a8c0@frodo> Message-ID: <7d676cf83497ae87b4b850a96da2ef9d@comcast.net> David, I use the web all the time for research and information; I like the proposal. EFF has a couple of obvious strengths: a common vocabulary and tasks/performances which go along with different learning topics. Do you have any proposals for teacher training? You need to do a pilot study in a couple of places. Andrea On Mar 14, 2006, at 11:05 PM, David Rosen wrote: > Assessment Colleagues, > > Marie wrote: >> What do we need? National standards? Is that the most important >> thing that will help combat these issues? >> >> A different way to capture learning? What would that look like? >> Remember that the needs of the funder and public are quite >> different than the needs of the teacher and student ? and both are >> legitimate needs. >> >> What are your thoughts on these issues? > > Ignore for the moment the current political political realities, and > consider just the merits and faults, not the practicalities, of what > I propose, a national System of Adult Education and Literacy which > has three aligned components: National Curriculum Standards, (Free) > National Curricula, and Standardized Assessments. Such a system > could have other components, but for now, I suggest we look at these > three. > > 1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, > b) ABE (including adult basic education) c) ASE (adult secondary > education/GED/EDP/ADP) and d) Transition to College programs , > developed through a process which is widely respected by the field. > (Some would argue that we already have that in Equipped for the > Future.) > > 2. National curricula developed based on those standards and > available for states to adopt (or adapt) as they choose. The > curricula need to be comprehensive, modularized, available in generic > as well as work-contextualized units, in English but also bilingual > in Spanish and possibly other languages. It needs to be available > free online in units that teachers could download and use in their > classrooms, that tutors could use with their one-one-one or small > group instruction, and in self-instructional formats that adult > learners could use directly online. (Yes I know how big a task all > this is.) > > 3. 
Standardized assessments developed against the national curriculum > standards (tests, but also performance-based, direct assessments) > which have a high degree of validity for measuring the national > standards. > > Some might think that what I propose is too top-down. I would argue > that it could be very bottom-up if the field -- and adult learner > leaders -- are/have been/will be well-represented in setting the > standards, and if the modules can be be selected to meet specific > learner goals and contexts as well as to the standards. A national > curriculum could be made up of a database of thousands of units of > instruction (modules, learning objects) which could be very easily > found and in minutes organized/reorganized to fit learners' goals and > contexts. An adult learner or a group who need to improve their > reading skills and who are interested in the context of parenting > could easily access standards-based modules on parenting issues with > reading materials at the right level(s). A teacher whose students > worked in health care and who needed to improve their math skills > could quickly find and download materials/lessons for using numeracy > in health care settings. A student who wanted to learn online and who > wanted a job in environmental cleanup work could access standards- > based basic skills/occupational education lessons in this area, > accompanied by an online career coach and and online tutor. These > examples just hint at the complexity and sophistication of what I > propose, and will have some shaking their heads at the cost. But, > consider that if this is a national curriculum, the costs of > developing such modules have the benefits of scale, that those > curricula could be widely used -- and freely available. (Sorry > publishers, this could eat into your profits.) > > There is more, but I'll stop with this. > > Okay, let the questions and brickbats fly. > > David J. Rosen > djrosen at comcast.net > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From marie.cora at hotspurpartners.com Wed Mar 15 14:15:45 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 15 Mar 2006 14:15:45 -0500 Subject: [Assessment 241] Re: Thoughts on apples and oranges In-Reply-To: <281DD0D97E3EC94FB83030B1379CE42601D9EC46@DOIT-EX302.exec.ds.state.ct.us> Message-ID: <00ee01c64864$dd141c50$0402a8c0@frodo> Hi Ajit, Thanks for your comments and the clarifications, this is helpful. You and I discussed this bit: Several of you noted that because of the lack of national standards, it's tough or next to impossible to compare performance across programs or states. Actually, I think that the NRS accomplishes just that by outlining a nationally agreed upon set of skill levels and level descriptors. The NRS does not prescribe what standards a state has to use. It simply tries to compare the performance outcomes resulting from whatever standards the state is using. I guess I still don't feel that states can truly compare their performance. Just because each state has the endpoint of adhering to the NRS guidelines, does not mean that they are all doing the same thing to arrive there. For me, that is really important. One state's standards could be more or less rigorous than another's. And put that together with each state having a different way of collecting and interpreting data. 
I know that we are all focused on reaching the goals (of the NRS), but I think the goal is always affected by the process leading up to it. What do you think? marie -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gopalakrishnan, Ajit Sent: Wednesday, March 15, 2006 9:48 AM To: The Assessment Discussion List Subject: [Assessment 238] Re: Thoughts on apples and oranges My comments on some of the issues are below. Thanks. Ajit Ajit Gopalakrishnan Connecticut Department of Education 25 Industrial Park Road Middletown, CT 06457 Phone: (860) 807-2125 Fax: (860) 807-2062 Email: ajit.gopalakrishnan at ct.gov _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, March 14, 2006 11:14 AM To: assessment at nifl.gov Subject: [Assessment 234] Thoughts on apples and oranges Hi everyone, I'm going to pick up where the discussion left off last week - we were exploring some of the frustrations with standards, reported data, and goals. Several of you noted that because of the lack of national standards, it's tough or next to impossible to compare performance across programs or states. Actually, I think that the NRS accomplishes just that by outlining a nationally agreed upon set of skill levels and level descriptors. The NRS does not prescribe what standards a state has to use. It simply tries to compare the performance outcomes resulting from whatever standards the state is using. Yet part of what the federal system does is compare states to one another in terms of identifying recipients of things like incentive grants and so forth. I believe the federal system does not compare states to each other to determine incentive awards. That determination is based on a state's performance relative to its own performance targets. There is some level of state-to-state comparison in the process of establishing performance targets. States are required to report on how they are able to show gain via pre and post test scores - but as Andrea and Susan pointed out in their posts, there is no standardized method for showing this gain - each state creates its own benchmarks. I don't believe this statement is correct. The NRS benchmarks don't vary by state. The standardized method for showing gain is the NRS which requires the use of approved standardized assessments. What can we do about this? We need a national set of standards. But before that? Jane noted that state standards should be indicated within the submitted data - do any states do this? (probably not because they are not required). Would this help? Let's think this possibility through a little... Susan described several scenarios for us in which one aspect necessarily must suffer in order for another aspect to be recognized (feel familiar to you?). I hear this lament constantly: 'so as a program director, do I make sure my numbers work so I can continue to get funded to run my program, or do I not compromise the integrity of the teaching/learning process but run the risk of not showing good data?' (and then my program loses its funding, so integrity becomes a moot point). However, we must have an accountability system; I really don't believe anyone wants to throw around money without real proof that it's not being wasted. One of you noted that reform then, must happen at the root - at the NRS - what would that look like? I have not read the original scenarios but have heard similar feelings expressed by some practitioners. 
This tends to be a no-win situation for those practitioners. My feeling is that state and local administrators need to be completely committed to maximizing the value of standardized assessments, and using data for decision making (not just data about test scores but also data relative to recruitment, retention, attendance, goal setting, etc. and data that might sometimes be very unflattering). Varshna - you asked if the NRS/DOE requirements included data validation as does DOLs requirements - not to my knowledge - but can anyone speak to this question? It's a good one. Varshna - do you believe that such data validation helps with the "apples/oranges" issue? How so? The NRS does require an extensive data validation checklist that looks at data structures, data systems, policy requirements, edit checks, and professional development. States submit this checklist along with their end-of-year data submission. Finally, Katrina - you brought up the 'gaps in data' issue and cited the "unanticipated" goals situation as an example. This is also something we need to address: if a student changes a goal or achieves a goal that was not specifically set at the outset of the learning process - this happens all the time actually and is normal behavior: shifting and changing your goals based on your experience and progress can logically happen during a learning process. But often, these goals can get lost or don't get counted or cannot be counted because our system does not give us a way to show increments for example. Yes, currently only goal-based outcomes relative to employment, postsecondary and diploma are of the greatest importance for NRS reporting. States can establish accountability systems that reward (monetary or otherwise) other outcomes if they so choose. What do we need? National standards? Is that the most important thing that will help combat these issues? A different way to capture learning? What would that look like? Remember that the needs of the funder and public are quite different than the needs of the teacher and student - and both are legitimate needs. What are your thoughts on these issues? Thanks, marie cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060315/34000089/attachment.html From marie.cora at hotspurpartners.com Wed Mar 15 13:26:46 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 15 Mar 2006 13:26:46 -0500 Subject: [Assessment 242] FW: [AAACE-NLA] Question: Does learning to take CASAS tests preparestudents to take the GED? Message-ID: <00d401c6485e$054b6270$0402a8c0@frodo> Dear List Colleagues, I thought perhaps some of you might have thoughts on this post from the NLA. Thanks, marie cora Assessment Discussion List Moderator ----- Hello all, I have been asked by a colleague if exposing students to the CASAS tests can be considered preparation for teaching them to take the GED, because learning to take CASAS tests involves learning to deal with multiple choice questions and how to use standardized answer sheets. I don't really feel that the CASAS provides much preparation for the GED because the kinds of questions that are required for the GED are far more complex and require more academic knowledge than the CASAS, even the advanced CASAS tests. This is an ongoing discussion. So, I would be interested in what others think, including detailed explanations of why you think what you do. 
Best, Sylvie Kashdan, M.A. Instructor/Curriculum Coordinator KAIZEN PROGRAM for New English Learners with Visual Limitations 810-A Hiawatha Place South Seattle, WA 98144, U.S.A. phone: (206) 784-5619 email: kaizen at literacyworks.org web: http://www.nwlincs.org/kaizen/ _______________________________________________ AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org http://lists.literacytent.org/mailman/listinfo/aaace-nla LiteracyTent: web hosting, news, community and goodies for literacy http://literacytent.org

From andreawilder at comcast.net Wed Mar 15 16:17:46 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Wed, 15 Mar 2006 16:17:46 -0500 Subject: [Assessment 243] Re: : A National System of Adult Education and Literacy In-Reply-To: <09D7944F-651C-4AC2-B1EA-7F622BA1ABEA@comcast.net> References: <002d01c64782$4b43c150$0402a8c0@frodo> <44183675.8040303@riral.org> <09D7944F-651C-4AC2-B1EA-7F622BA1ABEA@comcast.net> Message-ID: <7b751a8711a77147bf40fa2eb0647810@comcast.net> David, Another thought: the text and module writing--how about getting some really well-known people to do some lessons? With web help provided? In looking at texts, it is often the person who writes the text who interests me.... Andrea

From bguthrie at tamdistrict.org Wed Mar 15 18:21:10 2006 From: bguthrie at tamdistrict.org (Guthrie, Burr) Date: Wed, 15 Mar 2006 15:21:10 -0800 Subject: [Assessment 243] Re: FW: [AAACE-NLA] Question: Does learning to takeCASAS tests preparestudents to take the GED? Message-ID: <49CB3434FAB899479558F6AF0F8D9318037F96E6@tammail.tuhsd.edu> I think that merely exposing students to the CASAS with the hopes of improving GED results is not worth it. CASAS is designed as a needs assessment (pre-test) and the results are correlated to CASAS competencies and materials to teach those competencies. Gains in understanding the competencies are then revealed on a CASAS post-test. There is a research brief on the CASAS website (www.casas.org) that suggests as students move up from CASAS level C to D, data shows a significant increase in GED pass rates. However, it is likely that those students were exposed to materials based on their test results, and had significant instruction time before they tackled the GED. Burr Guthrie
From khinson at future-gate.com Wed Mar 15 18:21:04 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Thu, 16 Mar 2006 00:21:04 +0100 Subject: [Assessment 244] Re: FW: [AAACE-NLA] Question: Does learning to take CASAS tests preparestudents to take In-Reply-To: <00d401c6485e$054b6270$0402a8c0@frodo> References: <00d401c6485e$054b6270$0402a8c0@frodo> Message-ID: <4418AF70020000A000002248@fghamn01.ham.de.future-gate.com> I'd have to agree with Sylvie, and have to say the concern she expresses is equally valid for both the TABE and CASAS. Neither test in and of itself is an adequate measure of being properly prepared to take the GED. Neither test has the same type of complex questions that a student may find on the GED - and the math area definitely highlights this. I am a GED instructor, among the many hats that I wear. Every day I see students come in and take the TABE test, score what would be considered well on the 9/10 A level test, but still demonstrate weaknesses in terms of being prepared for the GED. Our state policy is that if a student scores anywhere in the high school range, then he/she can take the official practice test for that area - in other words, if a student scores 9.5 on Reading on the TABE, then he or she is allowed to practice test on Reading, Science, and Social Studies. However, the GED tests at a 12.9+ level, and often the students don't get the minimum score on the practice tests that they need in order to take that test on the GED. For me it comes down to apples and oranges. The TABE and CASAS are placement tests but not always adequate or efficient measures of a person's ability to pass the GED test. I think they can be good guidelines for a teacher to use to best help a student achieve the goal of attaining his or her GED. I like to do teacher-made assessments to get a better feel for what my students really know...how they think and how they process information. I want to see how they do when it comes to evaluating, analyzing, and synthesizing data...which are higher-order test questions that are on the GED and NOT on the TABE or CASAS. Students can sometimes answer a question that is fairly cut and dried/black and white/concrete - but the moment that question becomes abstract in any way, even if the skills being tested are the same as ones they might have just done, the student often struggles. I think students need to be exposed to many different kinds of questions, and I don't think the TABE or CASAS alone should be what is used to measure a person's readiness. A more complete picture is needed that sometimes involves other assessment measures, whether that's teacher-made materials, portfolios, etc. Regards Katrina Hinson
------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment

From varshna at grandecom.net Thu Mar 16 08:29:55 2006 From: varshna at grandecom.net (Varshna Narumanchi-Jackson) Date: Thu, 16 Mar 2006 07:29:55 -0600 Subject: [Assessment 245] FW: National standards In-Reply-To: <00da01c64862$c58488b0$0402a8c0@frodo> Message-ID: Just continuing the thread that has been going on...we may have evolved beyond this point... Personally, I think there is way too much emphasis on standards development when very good ones exist. It's interesting that the same battles we tend to see in K12 over local control of education permeate adult education circles. Each state has to reinvent the wheel. It's a waste of public funds and lacks the kind of policy direction that ED needs to exert. And this is, in my humble opinion, an area which ED will tackle once it has dealt with higher education. So, ABE can put itself ahead of the curve or watch its funding continue to be challenged. This leads me to some questions: what is the value we add to adults' lives as providers of adult basic education? Do people earn more money? Do they get to keep their jobs because they meet a requirement in the workplace? Do they move on to higher education? There seems to be a real disconnect between what adults need to accomplish in their lives and how we approach our jobs. I highly recommend that people look at the attempts of Title I (Department of Labor) programs to ensure uniformity of data, establish common measures (and DOL has adopted the NRS definitions for math/literacy/ESOL), and look at the more sweeping and fundamental changes that have been proposed. The argument goes: if we live in a 21st-century economy (global, knowledge-based), why haven't we updated the educational and training models that were developed for the economies of the early 20th century? -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060316/e6285036/attachment.html

From andreawilder at comcast.net Thu Mar 16 08:47:30 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Thu, 16 Mar 2006 08:47:30 -0500 Subject: [Assessment 246] Re: : A National System of Adult Education and Literacy In-Reply-To: <7b751a8711a77147bf40fa2eb0647810@comcast.net> References: <002d01c64782$4b43c150$0402a8c0@frodo> <44183675.8040303@riral.org> <09D7944F-651C-4AC2-B1EA-7F622BA1ABEA@comcast.net> <7b751a8711a77147bf40fa2eb0647810@comcast.net> Message-ID: David, Anyone else on this list-- I feel more and more lately as though I am sunning myself on the Titanic. We have 3 big challenges as a country, and as a member of the world community of nations: 1) global warming 2) energy 3) population. All three of these issues have the capacity to sink us as a global village. People have talked about these issues as though they would happen in some following generation, but they are happening now. One of those Greenland glaciers that is breaking off into icebergs could very well sink our ship. This may seem an odd place to bring this up, but "assessment" is what we should be doing in relation to these three crises, and ANY curriculum we propose or support. I suppose there is the "hope" that bird flu and HIV/AIDS will decimate enough populations so everyone else will survive. The Black Death took 1/3 of the population of Europe, after all, but that seems a chancy way of thinking. If enough Greenland ice melts, the Gulf Stream won't go as far north; it will turn around and head south sooner, and northern Europe and Russia will start experiencing very cold weather, which makes northern living conditions difficult. This probably happened this winter; earth science people have noted a slowing down of the Gulf Stream. Our weather is controlled by ribbon-like ocean currents that link the oceans in a conveyor belt. If you want more info, troll the web. Andrea
From khinson at future-gate.com Thu Mar 16 20:12:42 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Fri, 17 Mar 2006 02:12:42 +0100 Subject: [Assessment 247] Re: : A National System of Adult Education and Literacy In-Reply-To: References: <002d01c64782$4b43c150$0402a8c0@frodo> <44183675.8040303@riral.org> <09D7944F-651C-4AC2-B1EA-7F622BA1ABEA@comcast.net> <7b751a8711a77147bf40fa2eb0647810@comcast.net> Message-ID: <441A1B1A020000A00000225E@fghamn01.ham.de.future-gate.com> I can't do very much about global warming, nor about any energy crisis that may exist, nor can I control the population. What I can do as an educator is teach my students how to work towards stemming global warming in their own environments, conserve energy and weatherproof their homes, save gas, etc., and I can counsel my younger and sometimes older students that having more children than you can afford to take care of, or want, is not the wisest course of action. Teaching our students isn't always about teaching textbook material - it's about helping to create a competent global citizenry. It goes beyond a book or test and into day-to-day survival needs. I think a bigger challenge we face as a country is educating the population. Education is more than tests, and there are many, many people around the country who would argue that we're no longer teaching students to think independently for themselves but are teaching them how to pass various multiple choice tests - whether it's the EOG, CASAS, TABE, GED, etc. - yet we're not giving students the skills they need to be able to think creatively or intuitively. I heard a quote recently - "If you do the same thing, the same way, all the time, you'll always get the same results." The challenge is realizing that doing things the same way doesn't always produce positive results. Perhaps if we worked really hard on addressing the challenge of education, the challenges of global warming, energy and population might be met and overcome by the students we've yet to meet.
Perhaps many of the issues we face in the world would be, could be addressed by having an educated and intelligent population that's open minded enough to see outside their own small sphere of influence. Just a thought. Katrina
>>>>> Remember that the needs of the funder and public are quite >>>>> different than the needs of the teacher and student * and both are >>>>> legitimate needs. >>>>> >>>>> What are your thoughts on these issues? >>>>> >>>>> >>>> >>>> Ignore for the moment the current political political realities, and >>>> consider just the merits and faults, not the practicalities, of what >>>> I propose, a national System of Adult Education and Literacy which >>>> has three aligned components: National Curriculum Standards, (Free) >>>> National Curricula, and Standardized Assessments. Such a system >>>> could have other components, but for now, I suggest we look at these >>>> three. >>>> >>>> 1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, >>>> b) ABE (including adult basic education) c) ASE (adult secondary >>>> education/GED/EDP/ADP) and d) Transition to College programs , >>>> developed through a process which is widely respected by the field. >>>> (Some would argue that we already have that in Equipped for the >>>> Future.) >>>> >>>> 2. National curricula developed based on those standards and >>>> available for states to adopt (or adapt) as they choose. The >>>> curricula need to be comprehensive, modularized, available in >>>> generic >>>> as well as work-contextualized units, in English but also bilingual >>>> in Spanish and possibly other languages. It needs to be available >>>> free online in units that teachers could download and use in their >>>> classrooms, that tutors could use with their one-one-one or small >>>> group instruction, and in self-instructional formats that adult >>>> learners could use directly online. (Yes I know how big a task all >>>> this is.) >>>> >>>> 3. Standardized assessments developed against the national >>>> curriculum >>>> standards (tests, but also performance-based, direct assessments) >>>> which have a high degree of validity for measuring the national >>>> standards. >>>> >>>> Some might think that what I propose is too top-down. I would argue >>>> that it could be very bottom-up if the field -- and adult learner >>>> leaders -- are/have been/will be well-represented in setting the >>>> standards, and if the modules can be be selected to meet specific >>>> learner goals and contexts as well as to the standards. A national >>>> curriculum could be made up of a database of thousands of units of >>>> instruction (modules, learning objects) which could be very easily >>>> found and in minutes organized/reorganized to fit learners' goals >>>> and >>>> contexts. An adult learner or a group who need to improve their >>>> reading skills and who are interested in the context of parenting >>>> could easily access standards-based modules on parenting issues with >>>> reading materials at the right level(s). A teacher whose students >>>> worked in health care and who needed to improve their math skills >>>> could quickly find and download materials/lessons for using numeracy >>>> in health care settings. A student who wanted to learn online and >>>> who >>>> wanted a job in environmental cleanup work could access standards- >>>> based basic skills/occupational education lessons in this area, >>>> accompanied by an online career coach and and online tutor. These >>>> examples just hint at the complexity and sophistication of what I >>>> propose, and will have some shaking their heads at the cost. 
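To make those quoted scenarios concrete: the catalog being described is really just a collection of instructional modules, each tagged with the standard(s) it addresses, a topic or work context, and a level. The sketch below is only an illustration of that idea -- every field name, standard code, module title, and URL in it is invented for the example, and it is not part of any existing system or of the proposal itself.

from dataclasses import dataclass

@dataclass
class Module:
    # One unit of instruction (lesson / module / learning object).
    title: str
    standards: set   # hypothetical standard codes, e.g. {"ABE.R.3"}
    context: str     # e.g. "parenting", "health care", "generic"
    level: int       # rough difficulty / functioning level, 1-6
    url: str

def find_modules(catalog, standard=None, context=None, max_level=None):
    # Filter the catalog by standard, work/life context, and level ceiling.
    matches = []
    for m in catalog:
        if standard and standard not in m.standards:
            continue
        if context and m.context != context:
            continue
        if max_level and m.level > max_level:
            continue
        matches.append(m)
    return matches

# Invented catalog entries, for illustration only.
catalog = [
    Module("Reading a pediatrician's instructions", {"ABE.R.3"}, "parenting", 3,
           "http://example.org/modules/parenting-reading"),
    Module("Converting units on a medication chart", {"ABE.N.2"}, "health care", 2,
           "http://example.org/modules/healthcare-numeracy"),
]

for m in find_modules(catalog, context="parenting", max_level=3):
    print(m.title, "->", m.url)

The point of the sketch is that once modules carry those tags, matching them to a learner's goal, context, and level is the easy part; the real work in such a system would be writing, tagging, and peer-reviewing the modules themselves.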
But, >>>> consider that if this is a national curriculum, the costs of >>>> developing such modules have the benefits of scale, that those >>>> curricula could be widely used -- and freely available. (Sorry >>>> publishers, this could eat into your profits.) >>>> >>>> There is more, but I'll stop with this. >>>> >>>> Okay, let the questions and brickbats fly. >>>> >>>> David J. Rosen >>>> djrosen at comcast.net >>>> ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From varshna at grandecom.net Thu Mar 16 20:24:52 2006 From: varshna at grandecom.net (Varshna Narumanchi-Jackson) Date: Thu, 16 Mar 2006 19:24:52 -0600 Subject: [Assessment 248] Re: : A National System of Adult Education and Literacy In-Reply-To: <441A1B1A020000A00000225E@fghamn01.ham.de.future-gate.com> Message-ID: In both of these posts, I have to note a certain amount of ethnocentrism, which is an interesting tangent to follow as an educator, especially as it concerns assessment. Are adult education materials and testing instruments subject to the same rigorous examination for bias against subcultures as K-12 materials? Furthermore, in the context of teaching 'culture' as well as language to adult ESOL students, when does one move from merely informing to teaching a 'eurocentric' or 'western liberal' or 'religious conservative' agenda in the classroom?
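Varshna's question about bias review can be given a small, concrete shape. Formal bias (differential item functioning) analyses use methods such as Mantel-Haenszel statistics or IRT models; the toy sketch below only illustrates the underlying idea -- comparing how two subgroups of test-takers, already matched on overall ability, perform item by item -- and all of the response data in it is made up.

def item_flags(responses_a, responses_b, threshold=0.15):
    # responses_a / responses_b: per-examinee answer vectors (1 = correct,
    # 0 = incorrect) for two subgroups already matched on overall ability.
    # Returns items whose pass rates differ by more than `threshold`.
    flagged = []
    n_items = len(responses_a[0])
    for i in range(n_items):
        rate_a = sum(r[i] for r in responses_a) / len(responses_a)
        rate_b = sum(r[i] for r in responses_b) / len(responses_b)
        if abs(rate_a - rate_b) > threshold:
            flagged.append((i, round(rate_a, 2), round(rate_b, 2)))
    return flagged

# Made-up response matrices for two matched subgroups (4 items each).
group_a = [[1, 1, 0, 1], [1, 0, 1, 1], [1, 1, 1, 0]]
group_b = [[1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 0, 1]]
print(item_flags(group_a, group_b))

A flagged item is not automatically biased; it is simply a candidate for the kind of content review Varshna is asking about.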
An adult learner or a group who need to improve their >>>>> reading skills and who are interested in the context of parenting >>>>> could easily access standards-based modules on parenting issues > with >>>>> reading materials at the right level(s). A teacher whose > students >>>>> worked in health care and who needed to improve their math skills >>>>> could quickly find and download materials/lessons for using > numeracy >>>>> in health care settings. A student who wanted to learn online and > >>>>> who >>>>> wanted a job in environmental cleanup work could access > standards- >>>>> based basic skills/occupational education lessons in this area, >>>>> accompanied by an online career coach and and online tutor. > These >>>>> examples just hint at the complexity and sophistication of what I >>>>> propose, and will have some shaking their heads at the cost. > But, >>>>> consider that if this is a national curriculum, the costs of >>>>> developing such modules have the benefits of scale, that those >>>>> curricula could be widely used -- and freely available. (Sorry >>>>> publishers, this could eat into your profits.) >>>>> >>>>> There is more, but I'll stop with this. >>>>> >>>>> Okay, let the questions and brickbats fly. >>>>> >>>>> David J. Rosen >>>>> djrosen at comcast.net >>>>> >>>>> ------------------------------- >>>>> National Institute for Literacy >>>>> Assessment mailing list >>>>> Assessment at nifl.gov >>>>> To unsubscribe or change your subscription settings, please go to >>>>> http://www.nifl.gov/mailman/listinfo/assessment >>>>> >>>>> >>>>> >>>> >>>> ------------------------------- >>>> National Institute for Literacy >>>> Assessment mailing list >>>> Assessment at nifl.gov >>>> To unsubscribe or change your subscription settings, please go to >>>> http://www.nifl.gov/mailman/listinfo/assessment >>> >>> ------------------------------- >>> National Institute for Literacy >>> Assessment mailing list >>> Assessment at nifl.gov >>> To unsubscribe or change your subscription settings, please go to >>> http://www.nifl.gov/mailman/listinfo/assessment >>> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment >> > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > From marie.cora at hotspurpartners.com Fri Mar 17 12:43:25 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 17 Mar 2006 12:43:25 -0500 Subject: [Assessment 249] Q&A on New Assessments by ETS Message-ID: <005e01c649ea$4bcb7e00$0402a8c0@frodo> Dear Colleagues, The Assessment Discussion List will be hosting a Q&A during the week of March 27 on 3 new assessments for adult learning being developed by ETS (Educational Testing Service). ETS is seeking states to collaborate on the development of these new standards-based assessments. Presently, 7 Charter states have been working with ETS on this project. 
ETS is recruiting several more states for the next phase of the project, which includes: * Developing, reviewing and selecting tasks to be included in the new measures; * Contributing to the development of diagnostic score reports; * Participating in a standard-setting process that will map the tests to the NRS levels; * Piloting the tests with your adult learners; * Creating a test designed by you with your state's learners', teachers', and administrators' needs in mind. Julie Eastland, of ETS, will be joining us during the week of March 27 to answer your questions and comments regarding the project. You can send your questions to the List before the week of March 27, and I will hold them for that week, or you can post your questions and comments during that week. Julie will be available to respond periodically throughout that week. For more information, please see the attachment. Thanks and looking forward to your questions and comments. marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060317/e4b1901e/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: new_assessments.pdf Type: application/pdf Size: 54895 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060317/e4b1901e/attachment.pdf From andreawilder at comcast.net Sat Mar 18 21:38:20 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Sat, 18 Mar 2006 21:38:20 -0500 Subject: [Assessment 250] Re: : A National System of Adult Education and Literacy In-Reply-To: References: Message-ID: David, As a believer in performance based assessment, I am wondering how this might work with your computer modules. Andrea On Mar 15, 2006, at 9:07 AM, PATRICIA HANDY wrote: > David and All, > As a practitioner for 27 years, now responsible for training new > staff, I applaud your suggestions. I would not be appauding if you had > proposed a rigid "this curriculum fits all" plan, but as to providing > standardized resources from which each teacher or learner could > customize a learning plan, YES! YES! > > > Pat Handy > 410-749-3217 > Coordinator, Wicomico County Adult Learning Center > Philmore Commons, Salisbury > > Confidentiality Note: > This message may contain confidential information intended only for > the use of the person named above and may contain communication > protected by law. If you have received this message in error, you are > hereby notified that any dissemination, distribution, copying or other > use of this message is prohibited and you are requested to notify the > sender immediately at his/her electronic mail. >>>> djrosen at comcast.net 03/14/06 11:05 PM >>> > Assessment Colleagues, > > Marie wrote: >> What do we need? National standards? Is that the most important >> thing that will help combat these issues? >> >> A different way to capture learning? What would that look like? >> Remember that the needs of the funder and public are quite >> different than the needs of the teacher and student * and both are >> legitimate needs. >> >> What are your thoughts on these issues? > > Ignore for the moment the current political political realities, and > consider just the merits and faults, not the practicalities, of what > I propose, a national System of Adult Education and Literacy which > has three aligned components: National Curriculum Standards, (Free) > National Curricula, and Standardized Assessments. 
Such a system > could have other components, but for now, I suggest we look at these > three. > > 1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, > b) ABE (including adult basic education) c) ASE (adult secondary > education/GED/EDP/ADP) and d) Transition to College programs , > developed through a process which is widely respected by the field. > (Some would argue that we already have that in Equipped for the > Future.) > > 2. National curricula developed based on those standards and > available for states to adopt (or adapt) as they choose. The > curricula need to be comprehensive, modularized, available in generic > as well as work-contextualized units, in English but also bilingual > in Spanish and possibly other languages. It needs to be available > free online in units that teachers could download and use in their > classrooms, that tutors could use with their one-one-one or small > group instruction, and in self-instructional formats that adult > learners could use directly online. (Yes I know how big a task all > this is.) > > 3. Standardized assessments developed against the national curriculum > standards (tests, but also performance-based, direct assessments) > which have a high degree of validity for measuring the national > standards. > > Some might think that what I propose is too top-down. I would argue > that it could be very bottom-up if the field -- and adult learner > leaders -- are/have been/will be well-represented in setting the > standards, and if the modules can be be selected to meet specific > learner goals and contexts as well as to the standards. A national > curriculum could be made up of a database of thousands of units of > instruction (modules, learning objects) which could be very easily > found and in minutes organized/reorganized to fit learners' goals and > contexts. An adult learner or a group who need to improve their > reading skills and who are interested in the context of parenting > could easily access standards-based modules on parenting issues with > reading materials at the right level(s). A teacher whose students > worked in health care and who needed to improve their math skills > could quickly find and download materials/lessons for using numeracy > in health care settings. A student who wanted to learn online and who > wanted a job in environmental cleanup work could access standards- > based basic skills/occupational education lessons in this area, > accompanied by an online career coach and and online tutor. These > examples just hint at the complexity and sophistication of what I > propose, and will have some shaking their heads at the cost. But, > consider that if this is a national curriculum, the costs of > developing such modules have the benefits of scale, that those > curricula could be widely used -- and freely available. (Sorry > publishers, this could eat into your profits.) > > There is more, but I'll stop with this. > > Okay, let the questions and brickbats fly. > > David J. 
Rosen > djrosen at comcast.net > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From djrosen at comcast.net Sun Mar 19 10:10:52 2006 From: djrosen at comcast.net (David Rosen) Date: Sun, 19 Mar 2006 10:10:52 -0500 Subject: [Assessment 251] Online Performance-based Assessment In-Reply-To: References: Message-ID: <29D24278-AEF0-4535-96B9-5D8836C22FA0@comcast.net> Andrea, and others, On Mar 18, 2006, at 9:38 PM, Andrea Wilder wrote to the Assessment discussion list: > As a believer in performance based assessment, I am wondering how > this might work with your computer modules. Modularized, online competency-based instruction, which I talked about as modules or learning objects in an earlier posting, opens up a world of interesting new possibilities for adult learning and assessment, including, for example: 1) A video library of short demonstrations. For example, in the numeracy and science context, instead of a written story problem, imagine a streamed video of two students discussing a real problem which requires mathematics to solve it. Let's imagine, for example, room mates or partners who live in a northern clime whose landlord has agreed to pay for a storm door if they install it and if they can get a good price on the door. They need to understand how a storm door works -- how, if it is properly installed, it creates an insulating cushion of air. They need to know how to do linear measurement, and how precise the measurements need to be. They may need to know how to comparison shop, possibly using online pricing information. Such a video could present the problem in steps and invite the learner (s) to offer possible solutions before viewing each next step. Some of these examples could be low-cost video logs (Vlogs) which could also easily be downloaded to a portable video player such as a video i-pod. (For examples of Vlogs, go to http://www.seriousmagic.com/products/vlogit/sampleVideos.cfm Anyone have better examples of Vlogs used for educational purposes?) 2) Simulations. I recently looked at some online instruction for occupational health and safety (OSHA) compliance. [ http:// www.osha.gov/dts/osta/oshasoft/index.html ] The Web site uses virtual reality simulations of rooms in hospitals which have health and safety hazards to identify, and procedures to choose for addressing the hazard. From your keyboard you can move around the virtual room, sometimes 360 degrees, to find the hazards. Video game technology may offer a lot which we could use well in online adult learning. 3) Natural language databases. It is possible to carry on a kind of discussion with a computer, particularly in a topic area where most of the responses can be anticipated. Perhaps this could be used for instruction, possibly even assessment. Some of these technologies, along with online adaptive assessments (where depending on the response to a question, the computer adapts and gives one an easier or harder next question) could be used in performance based assessments. 
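The adaptive idea just described -- an easier next question after a miss, a harder one after a success -- can be shown in a few lines. This is only a toy "staircase" sketch with an invented item bank; operational computer-adaptive tests (see the Wikipedia link below) select items using item response theory rather than a fixed ladder of levels.

import random

# Invented item bank, ordered from easiest (level 1) to hardest (level 5).
ITEM_BANK = {level: ["item %d-%d" % (level, i) for i in range(1, 4)]
             for level in range(1, 6)}

def next_level(current, answered_correctly):
    # Simplest possible adaptation: step up after a correct answer, down after a miss.
    return min(current + 1, 5) if answered_correctly else max(current - 1, 1)

def run_adaptive_session(ask, n_items=6, start_level=3):
    # ask(item) returns True/False for a correct/incorrect response.
    # Returns the sequence of levels visited; the later levels give a rough ability estimate.
    level, visited = start_level, []
    for _ in range(n_items):
        item = random.choice(ITEM_BANK[level])
        visited.append(level)
        level = next_level(level, ask(item))
    return visited

# Simulate a learner who reliably handles items up to level 3 and misses harder ones.
print(run_adaptive_session(lambda item: int(item.split()[1].split("-")[0]) <= 3))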
[ http://en.wikipedia.org/wiki/Computer-adaptive_test ] Some of my favorite examples of assessments have been developed by adult reading specialist and interactive technology designer Mike Hillinger. [ http://www.workingsimulations.com/ ] For example, several years ago Mike developed an online instruction and assessment application for manufacturing which involves the user choosing a tool (a micrometer, for example) and moving it to measure parts of machines. It taught and assessed precise linear measurement -- actual measuring, not just answering questions about measuring. Mike is also the author of the (free) online basic skills simulation called The Office. [ http:// www.workingsimulations.com/main_site/basicskillHome.html ] Some of these applications would not be expensive (teacher-made Vlogs, for example) and some (like Mike Hillinger's high quality work) might only be affordable in a business or military context or with investments by the U.S. Department of Education (which paid for the development of The Office). This is an exciting arena for creative development for adult educators who are interested in technology. None of these examples, I would caution, is intended as a replacement for face-to-face learning or assessment. However, online learning opens up new possibilities for students who are limited in the amount of time they can attend class but who have more time available to study online from work or home, or for students who cannot attend face-to-face classes. Online learning is usually best when facilitated by an online teacher. And, a final caution, there is no evidence that online learning is less expensive than face-to-face learning. I wonder if anyone else on the Assessment or Technology lists has been looking at applications like these for instruction or assessment, and could add other examples. David J. Rosen www.newsomeassociates.com djrosen at comcast.net > > > On Mar 15, 2006, at 9:07 AM, PATRICIA HANDY wrote: > >> David and All, >> As a practitioner for 27 years, now responsible for training new >> staff, I applaud your suggestions. I would not be appauding if you >> had >> proposed a rigid "this curriculum fits all" plan, but as to providing >> standardized resources from which each teacher or learner could >> customize a learning plan, YES! YES! >> >> >> Pat Handy >> 410-749-3217 >> Coordinator, Wicomico County Adult Learning Center >> Philmore Commons, Salisbury >> >> Confidentiality Note: >> This message may contain confidential information intended only for >> the use of the person named above and may contain communication >> protected by law. If you have received this message in error, you >> are >> hereby notified that any dissemination, distribution, copying or >> other >> use of this message is prohibited and you are requested to notify the >> sender immediately at his/her electronic mail. >>>>> djrosen at comcast.net 03/14/06 11:05 PM >>> >> Assessment Colleagues, >> >> Marie wrote: >>> What do we need? National standards? Is that the most important >>> thing that will help combat these issues? >>> >>> A different way to capture learning? What would that look like? >>> Remember that the needs of the funder and public are quite >>> different than the needs of the teacher and student * and both are >>> legitimate needs. >>> >>> What are your thoughts on these issues? 
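Scoring a performance task like the ones described above -- "measure this part," not "answer a question about measuring" -- largely comes down to comparing the learner's reading against a known value and an allowed tolerance. The numbers and scoring rule below are invented for illustration; this is not how Mike Hillinger's applications or The Office actually score responses.

def score_measurement(reported_mm, true_mm, tolerance_mm):
    # Score one hands-on measurement: full credit within tolerance,
    # partial credit within twice the tolerance, otherwise no credit.
    error = abs(reported_mm - true_mm)
    if error <= tolerance_mm:
        return 1.0
    if error <= 2 * tolerance_mm:
        return 0.5
    return 0.0

# Invented example: a shaft that is really 25.40 mm, measured with a micrometer
# to a required precision of 0.05 mm.
attempts = [25.42, 25.48, 26.10]
for reading in attempts:
    print(reading, "->", score_measurement(reading, true_mm=25.40, tolerance_mm=0.05))

The design question worth arguing about is where the tolerance comes from: in a work-contextualized module it should reflect the real task (a storm door can be off by a few millimeters; a machined part usually cannot), which is exactly the kind of judgment a standards-based performance assessment has to make explicit.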
From andreawilder at comcast.net Sun Mar 19 14:09:48 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Sun, 19 Mar 2006 14:09:48 -0500 Subject: [Assessment 252] Re: : A National System of Adult Education and Literacy In-Reply-To: References: Message-ID: <5a76fa3bf4cac90977638dba9a74dbb2@comcast.net> Marie, others, I first taught at a school run by teachers--there was a head, but there were curriculum outlines, deeply humanistic, which we followed. As teachers we were evaluated by how we demonstrated mastery of this curriculum, which we had made up. Meaning, really, that we were evaluated through our students and the topics we had covered as they reflected broad curriculum outlines. Everyone was happy: students, teachers, parents, the school. (True story.) Each fall we gave the kids a spelling test to see where we should start working. Each spring the children took a standardized test to see how our school ranked against others. (At one point I worked, but didn't teach, at an immense city school. I managed book buying. One teacher got, for the second year in a row, the first volume of a two-volume set. She didn't protest that the kids would be taught the same material for two years in a row....) So at the first school we had a dual system: assessment for us, assessment for the school. Now it seems to me, though I may be wrong about this, that when we talk about assessment we don't talk about the value of what we are teaching--is it good or not? Take the TABE. It functions kind of like an index, like taking one's temperature; 98.6 is just an index. No one pretends that the TABE materials are earth-shaking; having read them, I can say they are dreadful. If I were coming into class I would want to read materials that were important to me. (This may be another topic.) At what point do we pay attention to TEXT? What are the words on the page telling us? Who wrote the words, anyway? And all that. Is this where we talk about CRITICAL LITERACY? Is this where standards come in? Adult literacy is really different from kid lit. Adults want to master what is important for their lives. I may have missed or mislaid some large piece of knowledge which Marie and others have gone over already, or what I am asking may not be pertinent at all, so please bear with me. Thanks. Andrea On Mar 18, 2006, at 9:38 PM, Andrea Wilder wrote: > David, > > As a believer in performance based assessment, I am wondering how > this might work with your computer modules. > > Andrea > > > On Mar 15, 2006, at 9:07 AM, PATRICIA HANDY wrote: > >> David and All, >> As a practitioner for 27 years, now responsible for training new >> staff, I applaud your suggestions.
I would not be appauding if you had >> proposed a rigid "this curriculum fits all" plan, but as to providing >> standardized resources from which each teacher or learner could >> customize a learning plan, YES! YES! >> >> >> Pat Handy >> 410-749-3217 >> Coordinator, Wicomico County Adult Learning Center >> Philmore Commons, Salisbury >> >> Confidentiality Note: >> This message may contain confidential information intended only for >> the use of the person named above and may contain communication >> protected by law. If you have received this message in error, you are >> hereby notified that any dissemination, distribution, copying or other >> use of this message is prohibited and you are requested to notify the >> sender immediately at his/her electronic mail. >>>>> djrosen at comcast.net 03/14/06 11:05 PM >>> >> Assessment Colleagues, >> >> Marie wrote: >>> What do we need? National standards? Is that the most important >>> thing that will help combat these issues? >>> >>> A different way to capture learning? What would that look like? >>> Remember that the needs of the funder and public are quite >>> different than the needs of the teacher and student * and both are >>> legitimate needs. >>> >>> What are your thoughts on these issues? >> >> Ignore for the moment the current political political realities, and >> consider just the merits and faults, not the practicalities, of what >> I propose, a national System of Adult Education and Literacy which >> has three aligned components: National Curriculum Standards, (Free) >> National Curricula, and Standardized Assessments. Such a system >> could have other components, but for now, I suggest we look at these >> three. >> >> 1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, >> b) ABE (including adult basic education) c) ASE (adult secondary >> education/GED/EDP/ADP) and d) Transition to College programs , >> developed through a process which is widely respected by the field. >> (Some would argue that we already have that in Equipped for the >> Future.) >> >> 2. National curricula developed based on those standards and >> available for states to adopt (or adapt) as they choose. The >> curricula need to be comprehensive, modularized, available in generic >> as well as work-contextualized units, in English but also bilingual >> in Spanish and possibly other languages. It needs to be available >> free online in units that teachers could download and use in their >> classrooms, that tutors could use with their one-one-one or small >> group instruction, and in self-instructional formats that adult >> learners could use directly online. (Yes I know how big a task all >> this is.) >> >> 3. Standardized assessments developed against the national curriculum >> standards (tests, but also performance-based, direct assessments) >> which have a high degree of validity for measuring the national >> standards. >> >> Some might think that what I propose is too top-down. I would argue >> that it could be very bottom-up if the field -- and adult learner >> leaders -- are/have been/will be well-represented in setting the >> standards, and if the modules can be be selected to meet specific >> learner goals and contexts as well as to the standards. A national >> curriculum could be made up of a database of thousands of units of >> instruction (modules, learning objects) which could be very easily >> found and in minutes organized/reorganized to fit learners' goals and >> contexts. 
An adult learner or a group who need to improve their >> reading skills and who are interested in the context of parenting >> could easily access standards-based modules on parenting issues with >> reading materials at the right level(s). A teacher whose students >> worked in health care and who needed to improve their math skills >> could quickly find and download materials/lessons for using numeracy >> in health care settings. A student who wanted to learn online and who >> wanted a job in environmental cleanup work could access standards- >> based basic skills/occupational education lessons in this area, >> accompanied by an online career coach and an online tutor. These >> examples just hint at the complexity and sophistication of what I >> propose, and will have some shaking their heads at the cost. But, >> consider that if this is a national curriculum, the costs of >> developing such modules have the benefits of scale, that those >> curricula could be widely used -- and freely available. (Sorry >> publishers, this could eat into your profits.) >> >> There is more, but I'll stop with this. >> >> Okay, let the questions and brickbats fly. >> >> David J. Rosen >> djrosen at comcast.net >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment >> > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From andreawilder at comcast.net Sun Mar 19 16:32:15 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Sun, 19 Mar 2006 16:32:15 -0500 Subject: [Assessment 253] Re: : A National System of Adult Education and Literacy In-Reply-To: <5a76fa3bf4cac90977638dba9a74dbb2@comcast.net> References: <5a76fa3bf4cac90977638dba9a74dbb2@comcast.net> Message-ID: <8938fe688a5fd226d9c24d23da017221@comcast.net> I need to follow up on one point--The Teacher Who Didn't Protest. She worked in a bureaucratic system, she had always worked there. She did a really good job with the kids she taught. I can't explain her action any more than this. It was what it was. AW On Mar 19, 2006, at 2:09 PM, Andrea Wilder wrote: > Marie, others, > > I first taught at a school run by teachers--there was a head, but > there were curriculum outlines, deeply humanistic, which we followed. As > teachers we were evaluated by how we demonstrated mastery of this > curriculum, which we had made up. Meaning, really, that we were > evaluated through our students and the topics we had covered as they > reflected broad curriculum outlines. Everyone was happy, students, > teachers, parents, the school. (true story) Each fall we gave the kids > a spelling test to see where we should start working. Each spring the > children took a standardized test to see how our school ranked against > others. > > (At one point I worked (didn't teach) at an immense city school. I > managed book buying. One teacher got for a second year in a row the > first volume of a two-volume set. She didn't protest that the > kids would be taught the same material for two years in a row....) > > So at the first school we had a dual system, assessment for us, > assessment for the school. > > Now it seems to me, but I can be wrong about this, that when we talk > about assessment we don't talk about the value of what we are > teaching--is it good or not? > > Take the TABE. It functions kind of like an index, like say taking > one's temperature. 98.6 is just an index. No one pretends that the > TABE materials are earth shaking; having read them, for a point of > interest, they are dreadful. > > If I were coming into class I would want to read materials that were > important to me. (This may be another topic.) > > At what point do we pay attention to TEXT? What the words on the page > are telling us? Who wrote the words, anyway? And all that. Is this > where we talk CRITICAL LITERACY? Is this where standards come in? > > Adult literacy is really different from kid lit. Adults want to > master what is important for their lives. > > I may have missed, mislaid, some large piece of knowledge which Marie > and others have gone over already, or what I am asking may not be > pertinent at all, so please bear with me. > > Thanks. > > Andrea > > > On Mar 18, 2006, at 9:38 PM, Andrea Wilder wrote: > >> David, >> >> As a believer in performance-based assessment, I am wondering how >> this might work with your computer modules. >> >> Andrea >> >> >> On Mar 15, 2006, at 9:07 AM, PATRICIA HANDY wrote: >> >>> David and All, >>> As a practitioner for 27 years, now responsible for training new >>> staff, I applaud your suggestions. I would not be applauding if you >>> had >>> proposed a rigid "this curriculum fits all" plan, but as to providing >>> standardized resources from which each teacher or learner could >>> customize a learning plan, YES! YES! >>> >>> >>> Pat Handy >>> 410-749-3217 >>> Coordinator, Wicomico County Adult Learning Center >>> Philmore Commons, Salisbury >>> >>> Confidentiality Note: >>> This message may contain confidential information intended only for >>> the use of the person named above and may contain communication >>> protected by law. If you have received this message in error, you >>> are >>> hereby notified that any dissemination, distribution, copying or >>> other >>> use of this message is prohibited and you are requested to notify the >>> sender immediately at his/her electronic mail. >>>>>> djrosen at comcast.net 03/14/06 11:05 PM >>> >>> Assessment Colleagues, >>> >>> Marie wrote: >>>> What do we need? National standards? Is that the most important >>>> thing that will help combat these issues? >>>> >>>> A different way to capture learning? What would that look like? >>>> Remember that the needs of the funder and public are quite >>>> different than the needs of the teacher and student - and both are >>>> legitimate needs. >>>> >>>> What are your thoughts on these issues? >>> >>> Ignore for the moment the current political realities, and >>> consider just the merits and faults, not the practicalities, of what >>> I propose, a national System of Adult Education and Literacy which >>> has three aligned components: National Curriculum Standards, (Free) >>> National Curricula, and Standardized Assessments. Such a system >>> could have other components, but for now, I suggest we look at these >>> three. >>> >>> 1.
Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, >>> b) ABE (including adult basic education) c) ASE (adult secondary >>> education/GED/EDP/ADP) and d) Transition to College programs , >>> developed through a process which is widely respected by the field. >>> (Some would argue that we already have that in Equipped for the >>> Future.) >>> >>> 2. National curricula developed based on those standards and >>> available for states to adopt (or adapt) as they choose. The >>> curricula need to be comprehensive, modularized, available in generic >>> as well as work-contextualized units, in English but also bilingual >>> in Spanish and possibly other languages. It needs to be available >>> free online in units that teachers could download and use in their >>> classrooms, that tutors could use with their one-one-one or small >>> group instruction, and in self-instructional formats that adult >>> learners could use directly online. (Yes I know how big a task all >>> this is.) >>> >>> 3. Standardized assessments developed against the national curriculum >>> standards (tests, but also performance-based, direct assessments) >>> which have a high degree of validity for measuring the national >>> standards. >>> >>> Some might think that what I propose is too top-down. I would argue >>> that it could be very bottom-up if the field -- and adult learner >>> leaders -- are/have been/will be well-represented in setting the >>> standards, and if the modules can be be selected to meet specific >>> learner goals and contexts as well as to the standards. A national >>> curriculum could be made up of a database of thousands of units of >>> instruction (modules, learning objects) which could be very easily >>> found and in minutes organized/reorganized to fit learners' goals and >>> contexts. An adult learner or a group who need to improve their >>> reading skills and who are interested in the context of parenting >>> could easily access standards-based modules on parenting issues with >>> reading materials at the right level(s). A teacher whose students >>> worked in health care and who needed to improve their math skills >>> could quickly find and download materials/lessons for using numeracy >>> in health care settings. A student who wanted to learn online and who >>> wanted a job in environmental cleanup work could access standards- >>> based basic skills/occupational education lessons in this area, >>> accompanied by an online career coach and and online tutor. These >>> examples just hint at the complexity and sophistication of what I >>> propose, and will have some shaking their heads at the cost. But, >>> consider that if this is a national curriculum, the costs of >>> developing such modules have the benefits of scale, that those >>> curricula could be widely used -- and freely available. (Sorry >>> publishers, this could eat into your profits.) >>> >>> There is more, but I'll stop with this. >>> >>> Okay, let the questions and brickbats fly. >>> >>> David J. 
Rosen >>> djrosen at comcast.net >>> >>> ------------------------------- >>> National Institute for Literacy >>> Assessment mailing list >>> Assessment at nifl.gov >>> To unsubscribe or change your subscription settings, please go to >>> http://www.nifl.gov/mailman/listinfo/assessment >>> >>> ------------------------------- >>> National Institute for Literacy >>> Assessment mailing list >>> Assessment at nifl.gov >>> To unsubscribe or change your subscription settings, please go to >>> http://www.nifl.gov/mailman/listinfo/assessment >>> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment >> > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From marie.cora at hotspurpartners.com Mon Mar 20 09:57:10 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 20 Mar 2006 09:57:10 -0500 Subject: [Assessment 254] FW from Content Standards Message-ID: <00ec01c64c2e$9142f990$0402a8c0@frodo> Dear Colleagues, The following post is in response to comments made by David Rosen on the Assessment Discussion List on Sunday, March 19 regarding standards and on-line performance assessment. Thanks, marie ---- all A bottom-up, field (practitioner and learner) - represented set of standards? Sounds like EFF, no? What am I missing here? We have this rich, useful resource in EFF. Are we talking about starting over? Something different? Don't we have this already? Aren't some of us building on it? Janet Isserlis >> >> Some might think that what I propose is too top-down. I would argue >> that it could be very bottom-up if the field -- and adult learner >> leaders -- are/have been/will be well-represented in setting the >> standards, and if the modules can be selected to meet specific >> learner goals and contexts as well as to the standards. David Rosen ---------------------------------------------------- From andreawilder at comcast.net Mon Mar 20 10:02:55 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Mon, 20 Mar 2006 10:02:55 -0500 Subject: [Assessment 255] Re: FW from Content Standards In-Reply-To: <00ec01c64c2e$9142f990$0402a8c0@frodo> References: <00ec01c64c2e$9142f990$0402a8c0@frodo> Message-ID: <7275563f5a931dbfb23359050d201e7b@comcast.net> Janet, Very good questions-- What I am understanding about David's curriculum is that it would consist of units that could be plucked from the air. What are the strengths of EFF? What are the strengths of David's curriculum? Upsides and downsides to each? How is success measured? What values are enhanced? Who puts up the money for each? How about grassroots support? Andrea On Mar 20, 2006, at 9:57 AM, Marie Cora wrote: > Dear Colleagues, > > The following post is in response to comments made by David Rosen on > the > Assessment Discussion List on Sunday, March 19 regarding standards and > on-line performance assessment. Thanks, marie > ---- > > > all > > A bottom-up, field (practitioner and learner) - represented set of > standards? Sounds like EFF, no? What am I missing here? We have this > rich, useful resource in EFF. Are we talking about starting over? > Something different? Don't we have this already? Aren't some of us > building on it?
> > Janet Isserlis > > >>> >>> Some might think that what I propose is too top-down. I would argue >>> that it could be very bottom-up if the field -- and adult learner >>> leaders -- are/have been/will be well-represented in setting the >>> standards, and if the modules can be selected to meet specific >>> learner goals and contexts as well as to the standards. > > David Rosen > ---------------------------------------------------- > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From atrawick at charter.net Mon Mar 20 10:28:54 2006 From: atrawick at charter.net (Amy R. Trawick) Date: Mon, 20 Mar 2006 10:28:54 -0500 Subject: [Assessment 256] Re: : A National System of Adult Education and Literacy References: <002d01c64782$4b43c150$0402a8c0@frodo><44183675.8040303@riral.org> <09D7944F-651C-4AC2-B1EA-7F622BA1ABEA@comcast.net> Message-ID: <00ee01c64c33$0045b810$3002a8c0@ben2ut66kkx7o3> David, I think this is an intriguing idea. Lots of issues are whirling around in my head, but let's deal with this one first: Are you proposing the development of national standards or federal standards? It is possible to have a set of national standards, supporting curriculum, and standardized assessments--developed and referenced by the field as a whole because of the collaborative nature of their development or adoption--without having these codified within the federal bureaucracy. In this scenario, the federal government could even provide support through funding. Is this what you're getting at, or are you seeing a more hands-on role being played by the federal government? Amy Amy R. Trawick, M.S. Ed. North Wilkesboro, North Carolina atrawick at charter.net ----- Original Message ----- From: "David Rosen" To: ; "The Assessment Discussion List" Sent: Wednesday, March 15, 2006 1:09 PM Subject: [Assessment 239] Re: : A National System of Adult Education and Literacy Howard, Thanks for your thoughtful comments. See my replies below. I hope others will join in this discussion, too, from the Assessment list and from the Content Standards list. David J. Rosen djrosen at comcast.net On Mar 15, 2006, at 10:44 AM, Howard L. Dooley, Jr. wrote: > I think David has a good, basic overall plan here. I wouldn't say the > plan is top-down either. I think it recognizes that there is a push > today > to be able to look at success across states and throughout the > country, > and for that we need a way to connect our local efforts into a > national > system. Think globally; act locally -- as always the best politics > and > the best basis for a system of adult ed. But my sense is that, right > now, funders are in favor of such a national system, but most > practitioners are not. I wish that funders _were_ in favor of this. The largest adult education funder, the U.S. Department of Education, is reluctant to establish a set of national curriculum standards. I am not sure why, but guess that it is because of a long-standing tradition that curriculum standards in American Education are in the control of local school committees and state boards of education.
The closest the USDOE has come to this is funding the development of a "warehouse" of state curriculum standards, [ http://www.adultedcontentstandards.org/Source/GetStandard.asp ] and (through the National Institute for Literacy) supporting the development -- but not the endorsement as national curriculum standards - of Equipped for the Future. (I am not sure I have that exactly right so if someone has better information, please let us know.) The problem, as many people have said, is not that we lack standards in the U.S., but that we have too many competing sets of standards. We lack a set of national standards that everyone uses. > Because, really, the benefits of such a system > are largely for the funders, policy makers, and big-picture people; > for > the instructor and learner in the classroom, what is the impact of it? > How does it matter that what I need to learn and am mastering to get a > job in RI is also what someone needs to learn and master to enter a > community college in AL? It may be interesting, but what does it > matter? I agree that a system such as I propose would benefit funders. However, it would also benefit teachers and learners. A lot of curriculum -- often very good curriculum -- is developed in programs and states across the country. But much of it is not published, and if it is, is not easily accessed. It is possible to find some good curriculum through NIFL LINCS, and in other places on the Web, for example, but this takes time, a lot of time. Teachers don't have much time to search for curriculum. It would be of great interest to most teachers if high quality curriculum --ready to download and use -- and adapt to local needs -- in class tomorrow could _easily be found_. Let me give you an example. As I understand it (folks from Arizona correct me if I got this wrong) Arizona has a set of state ESOL standards that are widely used, and respected by ESOL teachers there. A couple of ESOL teachers at Pima County Community College decided that they were useful as far as they went, but they wanted to have good web-based instruction linked to those standards. So they spent hours and hours finding -- and linking -- instruction on a Web page that they call The Splendid ESOL Web [ http://cc.pima.edu/ ~slundquist/index.htm ] When I was doing workshops in Arizona a couple of years ago, ESOL teachers popped up from across the state to tell me about The Splendid ESOL Web and how useful it is to them. This is instructional for us all: a set of standards developed by and respected by teachers, a set of online instructional resources found and organized/linked by ESOL Teachers, and widely used by other ESOL teachers. This sounds like a model to emulate in national curriculum development. Take this a step further. Suppose we had an agreed-upon format for developing instructional resources, nothing fancy, one that most teachers found easy to understand, easy to use, and that was linked to national standards. Suppose further that the format referenced national curriculum standards, that every lesson or module or learning object built by a teacher referenced a national curriculum standard. Then suppose the modules teachers developed were peer- reviewed and those that were approved were stored in an easily- accessible Web-based instructional lesson/module/learning objects database where other teachers could access them by standard, topic, level, etc. Some of the elements of what I have described are in place. 
For example, the Lesson Plan Builder, developed by OTAN in California [ http://www.lessonplanbuilder.org/lessons/ ], has a practical format for creating lesson plans online, and links them to California (and nationally used) standards. OTAN plans to store these lessons in an accessible database. When that's done, the teacher's chore of finding good lesson plans will be easier. Also, I very much like that these are lesson plans created "bottom up" by teachers (or perhaps even by teachers and their students together.) > I also think that many of the standards, curriculum and assessment > pieces already exist. If one has the time -- and right now it takes > time, believe me -- to peruse and ferret the web, you can find a > wealth > of excellent curricula that is the start of a "comprehensive, > modularized [curriculum], available in generic as well as > work-contextualized units, in English". Yes, much of it is there -- and it's hard to find. Some of it is not there, however. Try to find work-contextualized online lessons which students can access directly (not teacher lesson plans but student lessons online.) I have been searching high and low for these -- in health care work -- but haven't found much. Yet, given the good jobs going begging in health care in New England -- and elsewhere -- wouldn't it be useful if health care workers could do some of their basic skills learning online and if the instruction were contextualized or embedded in health care work? > Much of it is "available free > online in units that teachers could download and use in their > classrooms, that tutors could use with their one-on-one or small group > instruction". We use several items for our EL Civics, ESL > listening and > ABE math curricula that are from the web. The weakest link for us is > "material in self-instructional formats that adult learners can use > directly online." Yes, that is the weakest link. > There's a lot of print stuff that's been transferred > to the web, but it's not exciting or constructivist enough to engage > self-directed learners, unless they are high level readers and highly > self-motivated. Right you are. > So, I think we could get there more quickly than we might think, but > only if most of us really want to get there at all. Your state, Rhode Island, the first wireless Internet access state, border-to-border, would be a perfect "testbed" for a system such as I am proposing. I think if teachers and tutors understood how useful this could be they would clamor for it. Maybe you could get teachers in Rhode Island to think about this. > From a sincere, big-picture kind-of-guy, > Howard D. > > > > > > > > David Rosen wrote: > >> Assessment Colleagues, >> >> Marie wrote: >> >> >>> What do we need? National standards? Is that the most important >>> thing that will help combat these issues? >>> >>> A different way to capture learning? What would that look like? >>> Remember that the needs of the funder and public are quite >>> different than the needs of the teacher and student - and both are >>> legitimate needs. >>> >>> What are your thoughts on these issues? >>> >>> >> >> Ignore for the moment the current political realities, and >> consider just the merits and faults, not the practicalities, of what >> I propose, a national System of Adult Education and Literacy which >> has three aligned components: National Curriculum Standards, (Free) >> National Curricula, and Standardized Assessments.
Such a system >> could have other components, but for now, I suggest we look at these >> three. >> >> 1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, >> b) ABE (including adult basic education) c) ASE (adult secondary >> education/GED/EDP/ADP) and d) Transition to College programs , >> developed through a process which is widely respected by the field. >> (Some would argue that we already have that in Equipped for the >> Future.) >> >> 2. National curricula developed based on those standards and >> available for states to adopt (or adapt) as they choose. The >> curricula need to be comprehensive, modularized, available in generic >> as well as work-contextualized units, in English but also bilingual >> in Spanish and possibly other languages. It needs to be available >> free online in units that teachers could download and use in their >> classrooms, that tutors could use with their one-one-one or small >> group instruction, and in self-instructional formats that adult >> learners could use directly online. (Yes I know how big a task all >> this is.) >> >> 3. Standardized assessments developed against the national curriculum >> standards (tests, but also performance-based, direct assessments) >> which have a high degree of validity for measuring the national >> standards. >> >> Some might think that what I propose is too top-down. I would argue >> that it could be very bottom-up if the field -- and adult learner >> leaders -- are/have been/will be well-represented in setting the >> standards, and if the modules can be be selected to meet specific >> learner goals and contexts as well as to the standards. A national >> curriculum could be made up of a database of thousands of units of >> instruction (modules, learning objects) which could be very easily >> found and in minutes organized/reorganized to fit learners' goals and >> contexts. An adult learner or a group who need to improve their >> reading skills and who are interested in the context of parenting >> could easily access standards-based modules on parenting issues with >> reading materials at the right level(s). A teacher whose students >> worked in health care and who needed to improve their math skills >> could quickly find and download materials/lessons for using numeracy >> in health care settings. A student who wanted to learn online and who >> wanted a job in environmental cleanup work could access standards- >> based basic skills/occupational education lessons in this area, >> accompanied by an online career coach and and online tutor. These >> examples just hint at the complexity and sophistication of what I >> propose, and will have some shaking their heads at the cost. But, >> consider that if this is a national curriculum, the costs of >> developing such modules have the benefits of scale, that those >> curricula could be widely used -- and freely available. (Sorry >> publishers, this could eat into your profits.) >> >> There is more, but I'll stop with this. >> >> Okay, let the questions and brickbats fly. >> >> David J. 
Rosen >> djrosen at comcast.net >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment >> >> >> > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From hdooley at riral.org Mon Mar 20 11:13:29 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Mon, 20 Mar 2006 11:13:29 -0500 Subject: [Assessment 257] Re: : A National System of Adult Education andLiteracy In-Reply-To: <00ee01c64c33$0045b810$3002a8c0@ben2ut66kkx7o3> References: <002d01c64782$4b43c150$0402a8c0@frodo><44183675.8040303@riral.org> <09D7944F-651C-4AC2-B1EA-7F622BA1ABEA@comcast.net> <00ee01c64c33$0045b810$3002a8c0@ben2ut66kkx7o3> Message-ID: <441ED4A9.8060100@riral.org> Hi, Amy. Would you say the current federal situation fits your description of "without having these codified within the federal bureaucracy"? Ronna's work on the Standards Warehouse disseminates information about various standards developed, and of course the EFF website/s do that also; but the development of standards remains a local issue, with states developing standards or guidelines and then programs are able to align their curricula with those standards, based on learners' goals, needed certifications, next steps and so on. If all the state and local standards are based on national models -- say, for example, basing all math standards on NCTM's work -- then we'd have a system like what you describe, and which right now seems to "work" for OVAE. I could certainly get behind that. We need to maintain local control over the specific standards taught to, so that instruction continues to be based on the learner's goals and needs and based within the learner / teacher interaction. (I'm preaching to the choir now, right?) Howard D. Amy R. Trawick wrote: >David, I think this is an intriguing idea. Lots of issues are whirling >around in my head, but let's deal with this one first: > >Are you proposing the development of national standards or federal >standards? It is possible to have a set of national standards, supporting >curriculum, and standardized assessments--developed and referenced by the >field as a whole because of the collaborative nature of their development or >adoption--without having these codified within the federal bureaucracy. In >this scenario, the federal government could even provide support through >funding. Is this what your getting at, or are you seeing a more hands-on >role being played by the federal government? > >Amy > >Amy R. Trawick, M.S. Ed. >North Wilkesboro, North Carolina >atrawick at charter.net > > > > From khinson at future-gate.com Mon Mar 20 11:28:29 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Mon, 20 Mar 2006 17:28:29 +0100 Subject: [Assessment 258] Re: FW from Content Standards Message-ID: <441EE63E020000A0000022A3@fghamn01.ham.de.future-gate.com> Not everyone uses EFF. What about them? 
How do you go about addressing the questions/concerns such people might have as to why they don't use a system if it's already in place? Regards Katrina >>> marie.cora at hotspurpartners.com >>> Dear Colleagues, The following post is in response to comments made by David Rosen on the Assessment Discussion List on Sunday, March 19 regarding standards and on-line performance assessment. Thanks, marie ---- all A bottom-up, field (practitioner and learner) - represented set of standards? Sounds like EFF, no? What am I missing here? We have this rich, useful resource in EFF. Are we talking about starting over? Something different? Don't we have this already? Aren't some of us building on it? Janet Isserlis >> >> Some might think that what I propose is too top-down. I would argue >> that it could be very bottom-up if the field -- and adult learner >> leaders -- are/have been/will be well-represented in setting the >> standards, and if the modules can be be selected to meet specific >> learner goals and contexts as well as to the standards. David Rosen ---------------------------------------------------- ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From djrosen at comcast.net Mon Mar 20 12:06:59 2006 From: djrosen at comcast.net (David Rosen) Date: Mon, 20 Mar 2006 12:06:59 -0500 Subject: [Assessment 259] Re: FW from Content Standards In-Reply-To: <7275563f5a931dbfb23359050d201e7b@comcast.net> References: <00ec01c64c2e$9142f990$0402a8c0@frodo> <7275563f5a931dbfb23359050d201e7b@comcast.net> Message-ID: <82343987-9C40-4D0E-A49C-38504E13DF0A@comcast.net> Janet, Andrea, and others: To clarify, and for those who may not have seen my original, March 15th, post, I am proposing a national system of Adult education and Literacy which has three aligned components: National Curriculum Standards, (Free) National Curricula, and Standardized Assessments. I wrote, regarding national curriculum standards: "1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, b) ABE (including adult basic education) c) ASE (adult secondary education/GED/EDP/ADP) and d) Transition to College programs , developed through a process which is widely respected by the field. (Some would argue that we already have that in Equipped for the Future.)" I am among those who would argue that we already have that in EFF and agree with Janet that we do not need to start over on this. I am not sure what Andrea's "units plucked from the air" means but I proposed that curriculum units (and assessments) be based on these standards. Success would be measured based on adult learner goals and contexts and national standards. For this to succeed we would need national leadership and federal funding. Those who are interested in advocating for this could bring it up with Congressional and Presidential candidates in the upcoming elections as well as with their current elected representatives. David J. Rosen www.newsomeassociates.com djrosen at comcast.net On Mar 20, 2006, at 10:02 AM, Andrea Wilder wrote: > Janet, > > Very good questions-- > > What i am understanding about David's curriculum is that it would > consist of units that could be plucked from the air..What are the > strength of EFF? What are the strengths of David's curriculum? > Upsides and downsides to each? > > How is success measured? What values are enhanced? > > Who puts up the money for each? 
> > How about grassroots support? > > Andrea > > On Mar 20, 2006, at 9:57 AM, Marie Cora wrote: > >> Dear Colleagues, >> >> The following post is in response to comments made by David Rosen on >> the >> Assessment Discussion List on Sunday, March 19 regarding standards >> and >> on-line performance assessment. Thanks, marie >> ---- >> >> >> all >> >> A bottom-up, field (practitioner and learner) - represented set of >> standards? Sounds like EFF, no? What am I missing here? We have >> this >> rich, useful resource in EFF. Are we talking about starting over? >> Something different? Don't we have this already? Aren't some of us >> building on it? >> >> Janet Isserlis >> >> >>>> >>>> Some might think that what I propose is too top-down. I would >>>> argue >>>> that it could be very bottom-up if the field -- and adult learner >>>> leaders -- are/have been/will be well-represented in setting the >>>> standards, and if the modules can be be selected to meet specific >>>> learner goals and contexts as well as to the standards. >> >> David Rosen >> ---------------------------------------------------- >> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment >> > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From andreawilder at comcast.net Mon Mar 20 14:30:51 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Mon, 20 Mar 2006 14:30:51 -0500 Subject: [Assessment 260] Re: FW from Content Standards In-Reply-To: <82343987-9C40-4D0E-A49C-38504E13DF0A@comcast.net> References: <00ec01c64c2e$9142f990$0402a8c0@frodo> <7275563f5a931dbfb23359050d201e7b@comcast.net> <82343987-9C40-4D0E-A49C-38504E13DF0A@comcast.net> Message-ID: <67e9a0ef7f2d2c8525a7cc7ab136d425@comcast.net> "Units plucked from the air" is my way of saying that the curriculum is downloadable, eg, plucked from the air. Andrea On Mar 20, 2006, at 12:06 PM, David Rosen wrote: > Janet, Andrea, and others: > > To clarify, and for those who may not have seen my original, March > 15th, post, I am proposing a national system of Adult education and > Literacy which has three aligned components: National Curriculum > Standards, (Free) National Curricula, and Standardized Assessments. > > I wrote, regarding national curriculum standards: > > "1. Sets of national curriculum standards for: a) adult ESL/ESOL/ELL, > b) ABE (including adult basic education) c) ASE (adult secondary > education/GED/EDP/ADP) and d) Transition to College programs , > developed through a process which is widely respected by the field. > (Some would argue that we already have that in Equipped for the > Future.)" > > I am among those who would argue that we already have that in EFF and > agree with Janet that we do not need to start over on this. > > I am not sure what Andrea's "units plucked from the air" means but I > proposed that curriculum units (and assessments) be based on these > standards. > > Success would be measured based on adult learner goals and contexts > and national standards. > > For this to succeed we would need national leadership and federal > funding. 
Those who are interested in advocating for this could bring > it up with Congressional and Presidential candidates in the upcoming > elections as well as with their current elected representatives. > > David J. Rosen > www.newsomeassociates.com > djrosen at comcast.net > > > On Mar 20, 2006, at 10:02 AM, Andrea Wilder wrote: > >> Janet, >> >> Very good questions-- >> >> What i am understanding about David's curriculum is that it would >> consist of units that could be plucked from the air..What are the >> strength of EFF? What are the strengths of David's curriculum? >> Upsides and downsides to each? >> >> How is success measured? What values are enhanced? >> >> Who puts up the money for each? >> >> How about grassroots support? >> >> Andrea >> >> On Mar 20, 2006, at 9:57 AM, Marie Cora wrote: >> >>> Dear Colleagues, >>> >>> The following post is in response to comments made by David Rosen on >>> the >>> Assessment Discussion List on Sunday, March 19 regarding standards >>> and >>> on-line performance assessment. Thanks, marie >>> ---- >>> >>> >>> all >>> >>> A bottom-up, field (practitioner and learner) - represented set of >>> standards? Sounds like EFF, no? What am I missing here? We have >>> this >>> rich, useful resource in EFF. Are we talking about starting over? >>> Something different? Don't we have this already? Aren't some of us >>> building on it? >>> >>> Janet Isserlis >>> >>> >>>>> >>>>> Some might think that what I propose is too top-down. I would >>>>> argue >>>>> that it could be very bottom-up if the field -- and adult learner >>>>> leaders -- are/have been/will be well-represented in setting the >>>>> standards, and if the modules can be be selected to meet specific >>>>> learner goals and contexts as well as to the standards. >>> >>> David Rosen >>> ---------------------------------------------------- >>> >>> >>> ------------------------------- >>> National Institute for Literacy >>> Assessment mailing list >>> Assessment at nifl.gov >>> To unsubscribe or change your subscription settings, please go to >>> http://www.nifl.gov/mailman/listinfo/assessment >>> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From andreawilder at comcast.net Mon Mar 20 17:05:09 2006 From: andreawilder at comcast.net (Andrea Wilder) Date: Mon, 20 Mar 2006 17:05:09 -0500 Subject: [Assessment 261] Re: FW from Content Standards In-Reply-To: <441EE63E020000A0000022A3@fghamn01.ham.de.future-gate.com> References: <441EE63E020000A0000022A3@fghamn01.ham.de.future-gate.com> Message-ID: <16c93f0f35aabea8cf0101ed656e5154@comcast.net> Katrina-- Do adult literacy teachers have a kind of internalized list of what adults need to know? I am not thinking of EFF, i am thinking of a teacherly set of internalized content (and performance) standards. Example: one of the adults living in my house this year can neither swim nor drive a car. This stood out for me. My set of internalized standards says she should have these two skills. Thanks for any light you can give on this subject. Andrea Andrea On Mar 20, 2006, at 11:28 AM, Katrina Hinson wrote: > Not everyone uses EFF. 
What about them? How do you go about > addressing the questions/concerns such people might have as to why > they don't use a system if it's already in place? > > Regards > Katrina > >>>> marie.cora at hotspurpartners.com >>> > Dear Colleagues, > > The following post is in response to comments made by David Rosen on > the > Assessment Discussion List on Sunday, March 19 regarding standards and > on-line performance assessment. Thanks, marie > ---- > > > all > > A bottom-up, field (practitioner and learner) - represented set of > standards? Sounds like EFF, no? What am I missing here? We have this > rich, useful resource in EFF. Are we talking about starting over? > Something different? Don't we have this already? Aren't some of us > building on it? > > Janet Isserlis > > >>> >>> Some might think that what I propose is too top-down. I would argue >>> that it could be very bottom-up if the field -- and adult learner >>> leaders -- are/have been/will be well-represented in setting the >>> standards, and if the modules can be selected to meet specific >>> learner goals and contexts as well as to the standards. > > David Rosen > ---------------------------------------------------- > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From marie.cora at hotspurpartners.com Wed Mar 22 07:31:09 2006 From: marie.cora at hotspurpartners.com (marie.cora at hotspurpartners.com) Date: Wed, 22 Mar 2006 05:31:09 -0700 Subject: [Assessment 262] Re: : A National System of Adult Education and Literacy Message-ID: <20060322053109.25f5a32e926a0964fcd463de8558d100.0ee0bd1d1d.wbe@email.email.secureserver.net> Dear Colleague, The following post is from Linda Perry and was posted on March 21 to the Content Standards Discussion List. marie cora _____ > > Interesting part of a discussion. There are SCANS competencies that are > > followed in many areas in California and the testing for our program > and > other ESL programs is from CAL, the Center for Applied Linguistics, > implying > some applied linguistic guidelines. ORU recommends that doctoral > students > who have studied for their undergraduate and/or master's degrees in > their > education programs, study in other institutions for their doctorates so > they > don't become ingrown and overly focused on one way of doing things. > The > idea is that it is broadening and keeps a person open to new and > alternative > ways of teaching. Is David proposing national guidelines? This is a > good > idea, if it is an open-ended recommendation that is regularly reviewed > and > allows for input, research and inclusion of varied approaches. > > > >From: "Amy R. Trawick" > >Reply-To: The Adult Education Content Standards Discussion > >List > >To: "The Assessment Discussion List" ,"The Adult > > >Education Content Standards Discussion > List" > >Subject: [ContentStandards 75] Re: [Assessment 239] Re: : A National > >System of Adult Education and Literacy > >Date: Mon, 20 Mar 2006 10:28:54 -0500 > > > >David, I think this is an intriguing idea.
Lots of issues are > whirling > >around in my head, but let's deal with this one first: > > > >Are you proposing the development of national standards or federal > >standards? It is possible to have a set of national standards, > supporting > >curriculum, and standardized assessments--developed and referenced by > the > >field as a whole because of the collaborative nature of their > development > >or > >adoption--without having these codified within the federal > bureaucracy. In > >this scenario, the federal government could even provide support > through > >funding. Is this what your getting at, or are you seeing a more > hands-on > >role being played by the federal government? > > > >Amy > > > >Amy R. Trawick, M.S. Ed. > >North Wilkesboro, North Carolina > >atrawick at charter.net > > > > > > > >----- Original Message ----- > >From: "David Rosen" > >To: ; "The Assessment Discussion List" > > > >Sent: Wednesday, March 15, 2006 1:09 PM > >Subject: [Assessment 239] Re: : A National System of Adult Education > >andLiteracy > > > > > >Howard, > > > >Thanks for your thoughtful comments. See my replies below. I hope > >others will join in this discussion, too, from the Assessment list > >and from the Content Standards list. > > > >David J. Rosen > >djrosen at comcast.net > > > >On Mar 15, 2006, at 10:44 AM, Howard L. Dooley, Jr. wrote: > > > > > I think David has a good, basic overall plan here. I wouldn't say > the > > > plan is top-down either. I think it recognizes that there is push > > > today > > > to be able to look at success across states and throughout the > > > country, > > > and for that we need a way to connect our local efforts into a > > > national > > > system. Think globally; act locally -- as always the best > politics > > > and > > > the best basis for a system of adult ed. But my sense is that, > right > > > now, funders are in favor of such a national system, but most > > > practitioners are not. > > > >I wish that funders _were_ in favor of this. The largest adult > >education funder, the U.S. Department of Education, is reluctant to > >establish a set of national curriculum standards. I am not sure why, > >but guess that it is because a long-standing tradition that > >curriculum standards in American Education are in the control of > >local school committees and state boards of education. The closest > >the USDOE has come to this is funding the development of a > >"warehouse" of state curriculum standards, > >[ http://www.adultedcontentstandards.org/Source/GetStandard.asp ] and > >(through the National Institute for Literacy) supporting the > >development -- but not the endorsement as national curriculum > >standards - of Equipped for the Future. (I am not sure I have that > >exactly right so if someone has better information, please let us > >know.) The problem, as many people have said, is not that we lack > >standards in the U.S., but that we have too many competing sets of > >standards. We lack a set of national standards that everyone uses. > > > > > Because, really, the benefits of such a system > > > are largely for the funders, policy makers, and big-picture > people; > > > for > > > the instructor and learner in the classroom, what is the impact of > it? > > > How does it matter that what I need to learn and am mastering to > get a > > > job in RI is also what someone needs to learn and master to enter > a > > > community college in AL? It may be interesting, but what does it > > > matter? > > > >I agree that a system such as I propose would benefit funders. 
> >However, it would also benefit teachers and learners. A lot of > >curriculum -- often very good curriculum -- is developed in programs > >and states across the country. But much of it is not published, and > >if it is, is not easily accessed. It is possible to find some good > >curriculum through NIFL LINCS, and in other places on the Web, for > >example, but this takes time, a lot of time. Teachers don't have > >much time to search for curriculum. It would be of great interest to > >most teachers if high quality curriculum --ready to download and use > >-- and adapt to local needs -- in class tomorrow could _easily be > >found_. > > > >Let me give you an example. As I understand it (folks from Arizona > >correct me if I got this wrong) Arizona has a set of state ESOL > >standards that are widely used, and respected by ESOL teachers > >there. A couple of ESOL teachers at Pima County Community College > >decided that they were useful as far as they went, but they wanted to > >have good web-based instruction linked to those standards. So they > >spent hours and hours finding -- and linking -- instruction on a Web > >page that they call The Splendid ESOL Web [ http://cc.pima.edu/ > >~slundquist/index.htm ] When I was doing workshops in Arizona a > >couple of years ago, ESOL teachers popped up from across the state to > >tell me about The Splendid ESOL Web and how useful it is to them. > >This is instructional for us all: a set of standards developed by and > >respected by teachers, a set of online instructional resources found > >and organized/linked by ESOL Teachers, and widely used by other ESOL > >teachers. This sounds like a model to emulate in national curriculum > >development. > > > >Take this a step further. Suppose we had an agreed-upon format for > >developing instructional resources, nothing fancy, one that most > >teachers found easy to understand, easy to use, and that was linked > >to national standards. Suppose further that the format referenced > >national curriculum standards, that every lesson or module or > >learning object built by a teacher referenced a national curriculum > >standard. Then suppose the modules teachers developed were peer- > >reviewed and those that were approved were stored in an easily- > >accessible Web-based instructional lesson/module/learning objects > >database where other teachers could access them by standard, topic, > >level, etc. Some of the elements of what I have described are in > >place. For example, the Lesson Plan Builder, developed by OTAN in > >California > >[ http://www.lessonplanbuilder.org/lessons/ ], has a practical format > >for creating lesson plans online, and links them to California (and > >nationally used) standards. OTAN plans to store these lessons in an > >accessible database. When that's done, the teacher's chore of finding > >good lesson plans will be easier. Also, I very much like that these > >are lesson plans created "bottom up" by teachers (or perhaps even by > >teachers and their students together.) > > > > > I also think that many of the standards, curriculum and assessment > > > pieces already exist. If one has the time -- and right now it > takes > > > time, believe me -- to peruse and ferret the web, you can find a > > > wealth > > > of excellent curricula that is the start of a "comprehensive, > > > modularized [curriculum], available in generic as well as > > > work-contextualized units, in English". > > > >Yes, much of it is there -- and it's hard to find. Some of it is not > >there, however. 
Try to find work-contextualized online lessons which > >students can access directly (not teacher lesson plans but student > >lessons online.) I have been searching high and low for these -- in > >health care work -- but haven't found much. Yet, given the good jobs > >going begging in health care in New England -- and elsewhere -- > >wouldn't it be useful if health care workers could do some of their > >basic skills learning online and if the instruction were > >contextualized or embedded in health care work? > > > > > Much of it "available in free > > > online in units that teachers could download and use in their > > > classrooms, that tutors could use with their one-one-one or > smallgroup > > > instruction". We use several items for our EL Civics, ESL > > > listening and > > > ABE math curriculua that are from the web. The weakest link for us > is > > > "material in self-instructional formats that adult learners can > use > > > directly online." > > > >Yes, that is the weakest link. > > > > > There's a lot of print stuff that's been transferred > > > to the web, put it's not exciting or constructivist enough to > engage > > > self-directed learners, unless they are high level readers and > highly > > > self-motivated. > > > >Right you are. > > > > > So, I think we could get there more quickly than we might think, > but > > > only if most of us really want to get there at all. > > > >Your state, Rhode Island, the first wireless Internet access state, > >border-to-border, would be a perfect "testbed" for a system such as I > >am proposing. I think if teachers and tutors understood how useful > >this could be they would clamor for it. Maybe you could get teachers > >in Rhode Island to think about this. > > > > > From a sincere, big-picture kind-of-guy, > > > Howard D. > > > > > > > > > > > > > > > > > > > > > > > > > > > > David Rosen wrote: > > > > > >> Assessment Colleagues, > > >> > > >> Marie wrote: > > >> > > >> > > >>> What do we need? National standards? Is that the most > important > > >>> thing that will help combat these issues? > > >>> > > >>> A different way to capture learning? What would that look like? > > >>> Remember that the needs of the funder and public are quite > > >>> different than the needs of the teacher and student ? and both > are > > >>> legitimate needs. > > >>> > > >>> What are your thoughts on these issues? > > >>> > > >>> > > >> > > >> Ignore for the moment the current political political realities, > and > > >> consider just the merits and faults, not the practicalities, of > what > > >> I propose, a national System of Adult Education and Literacy > which > > >> has three aligned components: National Curriculum Standards, > (Free) > > >> National Curricula, and Standardized Assessments. Such a system > > >> could have other components, but for now, I suggest we look at > these > > >> three. > > >> > > >> 1. Sets of national curriculum standards for: a) adult > ESL/ESOL/ELL, > > >> b) ABE (including adult basic education) c) ASE (adult secondary > > >> education/GED/EDP/ADP) and d) Transition to College programs , > > >> developed through a process which is widely respected by the > field. > > >> (Some would argue that we already have that in Equipped for the > > >> Future.) > > >> > > >> 2. National curricula developed based on those standards and > > >> available for states to adopt (or adapt) as they choose. 
The > > >> curricula need to be comprehensive, modularized, available in > generic > > >> as well as work-contextualized units, in English but also > bilingual > > >> in Spanish and possibly other languages. It needs to be > available > > >> free online in units that teachers could download and use in > their > > >> classrooms, that tutors could use with their one-one-one or small > > >> group instruction, and in self-instructional formats that adult > > >> learners could use directly online. (Yes I know how big a task > all > > >> this is.) > > >> > > >> 3. Standardized assessments developed against the national > curriculum > > >> standards (tests, but also performance-based, direct assessments) > > >> which have a high degree of validity for measuring the national > > >> standards. > > >> > > >> Some might think that what I propose is too top-down. I would > argue > > >> that it could be very bottom-up if the field -- and adult learner > > >> leaders -- are/have been/will be well-represented in setting the > > >> standards, and if the modules can be be selected to meet specific > > >> learner goals and contexts as well as to the standards. A > national > > >> curriculum could be made up of a database of thousands of units > of > > >> instruction (modules, learning objects) which could be very > easily > > >> found and in minutes organized/reorganized to fit learners' goals > and > > >> contexts. An adult learner or a group who need to improve their > > >> reading skills and who are interested in the context of parenting > > >> could easily access standards-based modules on parenting issues > with > > >> reading materials at the right level(s). A teacher whose > students > > >> worked in health care and who needed to improve their math skills > > >> could quickly find and download materials/lessons for using > numeracy > > >> in health care settings. A student who wanted to learn online and > who > > >> wanted a job in environmental cleanup work could access > standards- > > >> based basic skills/occupational education lessons in this area, > > >> accompanied by an online career coach and and online tutor. > These > > >> examples just hint at the complexity and sophistication of what I > > >> propose, and will have some shaking their heads at the cost. > But, > > >> consider that if this is a national curriculum, the costs of > > >> developing such modules have the benefits of scale, that those > > >> curricula could be widely used -- and freely available. (Sorry > > >> publishers, this could eat into your profits.) > > >> > > >> There is more, but I'll stop with this. > > >> > > >> Okay, let the questions and brickbats fly. > > >> > > >> David J. 
Rosen > > >> djrosen at comcast.net > > >> > > >> ------------------------------- > > >> National Institute for Literacy > > >> Assessment mailing list > > >> Assessment at nifl.gov > > >> To unsubscribe or change your subscription settings, please go to > > >> http://www.nifl.gov/mailman/listinfo/assessment > > >> > > >> > > >> > > > > > > ------------------------------- > > > National Institute for Literacy > > > Assessment mailing list > > > Assessment at nifl.gov > > > To unsubscribe or change your subscription settings, please go to > > > http://www.nifl.gov/mailman/listinfo/assessment > > > >------------------------------- > >National Institute for Literacy > >Assessment mailing list > >Assessment at nifl.gov > >To unsubscribe or change your subscription settings, please go to > >http://www.nifl.gov/mailman/listinfo/assessment > > > >---------------------------------------------------- > >National Institute for Literacy > >Adult Education Content Standards mailing list > >ContentStandards at nifl.gov > >To unsubscribe or change your subscription settings, please go to > >http://www.nifl.gov/mailman/listinfo/contentstandards > > > > --------------------------------------------------------------------- > ---------------------------------------------------- > National Institute for Literacy > Adult Education Content Standards mailing list > ContentStandards at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/contentstandards From marie.cora at hotspurpartners.com Thu Mar 23 07:29:21 2006 From: marie.cora at hotspurpartners.com (marie.cora at hotspurpartners.com) Date: Thu, 23 Mar 2006 05:29:21 -0700 Subject: [Assessment 263] ETS Assessments Q&A: Reminder Message-ID: <20060323052921.25f5a32e926a0964fcd463de8558d100.5634815c6e.wbe@email.email.secureserver.net> Good morning, afternoon, and evening to you all. I would like to remind folks that next week there will be a Guest Q&A on the 3 new literacy assessments being developed by ETS (Educational Testing Service). They are seeking involvement from the field in the next stages of the test development. Please post your questions or comments - you can post to me and I will forward them next week, or you can wait til next week and post your question yourself. Please note that this discussion is not synchronous as they usually are - Julie Eastland will respond to questions periodically throughout the week. Thanks and looking forward to your questions and comments. marie cora Assessment Discussion List Moderator ----- This original post was sent to the Assessment Discussion List on Friday, March 17. Dear Colleagues, The Assessment Discussion List will be hosting a Q&A during the week of March 27 on 3 new assessments for adult learning being developed by ETS (Educational Testing Service). ETS is seeking states to collaborate on the development of these new standards-based assessments. Presently, 7 Charter states have been working with ETS on this project. ETS is recruiting several more states for the next phase of the project, which includes: * Developing, reviewing and selecting tasks to be included in the new measures; * Contributing to the development of diagnostic score reports; * Participating in a standard-setting process that will map the tests to the NRS levels; * Piloting the tests with your adult learners; * Creating a test designed by you with your state's learners', teachers', and administrators' needs in mind. 
Julie Eastland, of ETS, will be joining us during the week of March 27 to answer your questions and comments regarding the project. You can send your questions to the List before the week of March 27, and I will hold them for that week, or you can post your questions and comments during that week. Julie will be available to respond periodically throughout that week. For more information, please see the attachment. Thanks and looking forward to your questions and comments. marie cora Assessment Discussion List Moderator From marie.cora at hotspurpartners.com Mon Mar 27 11:00:34 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 27 Mar 2006 11:00:34 -0500 Subject: [Assessment 264] Questions about the new ETS assessments Message-ID: <003501c651b7$9575ace0$0402a8c0@frodo> Good morning, afternoon, and evening to you all. I hope this email finds you well. I would like to welcome Julie Eastland from Educational Testing Services, here during this week to respond to your questions about the development of three new assessments in ABE. I understand that there may also be some folks from the Charter states currently involved in the project who can respond as well. Thanks to you all for sharing information and your thoughts with us. I would like to start by asking if any of the Charter state members can share with us how they feel they have benefited from participating in this project, and what types of challenges they may have encountered. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060327/470cec57/attachment.html From marie.cora at hotspurpartners.com Mon Mar 27 11:08:22 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 27 Mar 2006 11:08:22 -0500 Subject: [Assessment 265] More questions for ETS Message-ID: <003f01c651b8$ac694410$0402a8c0@frodo> Hello everyone, I have received a few questions to be posted for the discussion: -Are these assessments that students will take on-line? -Aren't these assessments based on EFF (Equipped for the Future)? What if my state does not use EFF? -What does a state need to do in order to join the project? -If my state joins this project, does that mean we must then use these assessments? If we don't join this project, can my state access these assessments anyway, even if we were not in the development project? If you have questions, please feel free to send your question to me for posting - or feel free to post it yourself. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060327/db4bb704/attachment.html From Marcia.Cook at maine.gov Mon Mar 27 11:09:26 2006 From: Marcia.Cook at maine.gov (Cook, Marcia) Date: Mon, 27 Mar 2006 11:09:26 -0500 Subject: [Assessment 266] Re: Questions about the new ETS assessments Message-ID: <32E5C6B0B949584D9B6168C5F727916BB6E0F2@SOM-TEAQASMAIL1.som.w2k.state.me.us> Maine began to participate as a Charter State. Our biggest challenge is money. Although we have for years been actively involved in EFF and in many areas still are, we do not have the money at this time to continue to participate in the project. Our biggest concern about the EFF assessments is the cost of the assessments to each program. We are a small state population wise and receive a relatively small amount of money. 
I will be anxious to hear this addressed during the week. Thanks, Marcia _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, March 27, 2006 11:01 AM To: assessment at nifl.gov Subject: [Assessment 264] Questions about the new ETS assessments Good morning, afternoon, and evening to you all. I hope this email finds you well. I would like to welcome Julie Eastland from Educational Testing Services, here during this week to respond to your questions about the development of three new assessments in ABE. I understand that there may also be some folks from the Charter states currently involved in the project who can respond as well. Thanks to you all for sharing information and your thoughts with us. I would like to start by asking if any of the Charter state members can share with us how they feel they have benefited from participating in this project, and what types of challenges they may have encountered. Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060327/d90cf193/attachment.html From S.Oconnor at BrooklynPublicLibrary.org Mon Mar 27 11:19:48 2006 From: S.Oconnor at BrooklynPublicLibrary.org (O'Connor, Susan) Date: Mon, 27 Mar 2006 11:19:48 -0500 Subject: [Assessment 267] Re: Questions about the new ETS assessments Message-ID: <7C952CC727B6A94BA342634037F1F93505138DC2@bplwired2.BPL-CENTRAL.local> Brooklyn Public LIbrary considered participation in this project but the cost was so high that we were unable to even consider participation. This leads me to wonder about the efficacy of such an expensive venture for adult literacy programs that have no fat at all to expend on anything other than instruction. -----Original Message----- From: Cook, Marcia To: 'The Assessment Discussion List' Cc: Dyer, Becky Sent: 3/27/06 11:09 AM Subject: [Assessment 266] Re: Questions about the new ETS assessments Maine began to participate as a Charter State. Our biggest challenge is money. Although we have for years been actively involved in EFF and in many areas still are, we do not have the money at this time to continue to participate in the project. Our biggest concern about the EFF assessments is the cost of the assessments to each program. We are a small state population wise and receive a relatively small amount of money. I will be anxious to hear this addressed during the week. Thanks, Marcia _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, March 27, 2006 11:01 AM To: assessment at nifl.gov Subject: [Assessment 264] Questions about the new ETS assessments Good morning, afternoon, and evening to you all. I hope this email finds you well. I would like to welcome Julie Eastland from Educational Testing Services, here during this week to respond to your questions about the development of three new assessments in ABE. I understand that there may also be some folks from the Charter states currently involved in the project who can respond as well. Thanks to you all for sharing information and your thoughts with us. I would like to start by asking if any of the Charter state members can share with us how they feel they have benefited from participating in this project, and what types of challenges they may have encountered. 
Thanks, marie cora Assessment Discussion List Moderator <> From jeastland at ETS.ORG Mon Mar 27 11:22:29 2006 From: jeastland at ETS.ORG (Eastland, Julie) Date: Mon, 27 Mar 2006 11:22:29 -0500 Subject: [Assessment 268] Re: More questions for ETS Message-ID: <95782C26106D904E814E554F47EACA9C01A813FF@rosnt115.etslan.org> Hi Everyone, I will be available periodically this week to answer the questions that are posted. 1. Are these assessments that students will take on-line? Yes. They will be developed for on-line use. The students will take the assessments on-line, and the assessments will be scored and scaled in real-time on -line and immediately produce score reports, which institutions can release directly to the student and/or teacher. There will be a tutorial developed for students to access if they need assistance with basic computer skills such as using a mouse and highlighting, and the tutorial will also allow students to become familiar with the item types that will be seen on the test. 2. Aren't these assessments based on EFF (Equipped for the Future)? What if my state does not use EFF? Yes, they are based on the EFF Reading and Math frameworks, and will use real-life materials to assess student knowledge and proficiency in those areas. We will work with participants to align the tests to meet the NRS requirements, which should make the tests useful even in states where the EFF frameworks are not utilized. 3. What does a state need to do in order to join the project? A state would need to commit to the next phase of the project (a 2 year period), which would mean signing a Memorandum of Understanding with ETS, agreeing to attend development meetings, and piloting all of the tests (reading with Understanding, Reading Components, and Using Math to Solve Problems) with a minimum of 330 adult test candidates. The cost of piloting is expected to be $10,000 per state. If your state is interested, I would encourage you to contact me directly for more information about this effort - I would be happy to share a copy of the MOU with you.. 4. If my state joins this project, does that mean we must then use these assessments? If we don't join this project, can my state access these assessments anyway, even if we were not in the development project? If your state decides to join this project, you will not be required to use the assessments beyond the pilot stage. If you are involved in the pilot stage, you will have input into the development of the tests, including score reports. If you were not in the development project, the tests will be made available for purchase after the development phase. At this time, we anticipate the final costs of the test to be approximately $10 per test. I look forward to further discussion this week. Best regards, Julie ________________________ Julie K. Eastland Program Administrator Center for Global Assessment Educational Testing Service Rosedale Rd. Princeton, NJ 08541 jeastland at ets.org ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, March 27, 2006 11:08 AM To: assessment at nifl.gov Subject: [Assessment 265] More questions for ETS Hello everyone, I have received a few questions to be posted for the discussion: -Are these assessments that students will take on-line? -Aren't these assessments based on EFF (Equipped for the Future)? What if my state does not use EFF? -What does a state need to do in order to join the project? 
-If my state joins this project, does that mean we must then use these assessments? If we don't join this project, can my state access these assessments anyway, even if we were not in the development project? If you have questions, please feel free to send your question to me for posting - or feel free to post it yourself. Thanks, marie cora Assessment Discussion List Moderator -------------------------------------------------- This e-mail and any files transmitted with it may contain privileged or confidential information. It is solely for use by the individual for whom it is intended, even if addressed incorrectly. If you received this e-mail in error, please notify the sender; do not disclose, copy, distribute, or take any action in reliance on the contents of this information; and delete it from your system. Any other use of this e-mail is prohibited. Thank you for your compliance. -------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060327/eb40b63c/attachment.html From djrosen at comcast.net Mon Mar 27 11:38:17 2006 From: djrosen at comcast.net (David Rosen) Date: Mon, 27 Mar 2006 11:38:17 -0500 Subject: [Assessment 269] Re: Questions about the new ETS assessments In-Reply-To: <32E5C6B0B949584D9B6168C5F727916BB6E0F2@SOM-TEAQASMAIL1.som.w2k.state.me.us> References: <32E5C6B0B949584D9B6168C5F727916BB6E0F2@SOM-TEAQASMAIL1.som.w2k.state.me.us> Message-ID: <465E6E65-4CB2-4360-9D25-E2B8E167A765@comcast.net> Assessment Colleagues, Thanks for bringing this up, Marcia. In K-12 and higher education, the costs of assessment relative to instruction are relatively small. In vastly underfunded adult literacy education the additional costs may be relatively significant, and may severely drain resources from already thinly-provided instruction. Does this mean we should abandon assessment? Of course not. But perhaps it is time to cost it out, and to ask Congress and state legislatures to pay for the increased costs. It is possible that some state legislatures -- and possibly Congress -- would understand a line item increase needed to pay for assessment, even in times of fiscal restraint. So, my question for Julie Eastland: What are the costs involved? ? For being a participating state? ? For teachers' time to learn how to conduct the assessments? ? For the assessment instruments? And what might this be in terms of additional cost-per-student? In other words, if a State Director of Adult Education were asked by a legislator how much it would cost to fully implement EFF assessments, what would be the answer? What would the additional cost per student be, recognizing that that would vary for states with larger or fewer numbers of students? David J. Rosen newsomeassociates.com djrosen at comcast.net On Mar 27, 2006, at 11:09 AM, Cook, Marcia wrote: > Maine began to participate as a Charter State. Our biggest > challenge is money. Although we have for years been actively > involved in EFF and in many areas still are, we do not have the > money at this time to continue to participate in the project. Our > biggest concern about the EFF assessments is the cost of the > assessments to each program. We are a small state population wise > and receive a relatively small amount of money. I will be anxious > to hear this addressed during the week. 
> > Thanks, > > Marcia > > From: assessment-bounces at nifl.gov [mailto:assessment- > bounces at nifl.gov] On Behalf Of Marie Cora > Sent: Monday, March 27, 2006 11:01 AM > To: assessment at nifl.gov > Subject: [Assessment 264] Questions about the new ETS assessments > > Good morning, afternoon, and evening to you all. I hope this email > finds you well. > > > > I would like to welcome Julie Eastland from Educational Testing > Services, here during this week to respond to your questions about > the development of three new assessments in ABE. I understand that > there may also be some folks from the Charter states currently > involved in the project who can respond as well. Thanks to you all > for sharing information and your thoughts with us. > > > > I would like to start by asking if any of the Charter state members > can share with us how they feel they have benefited from > participating in this project, and what types of challenges they > may have encountered. > > > > Thanks, > > > > marie cora > > Assessment Discussion List Moderator > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From hdooley at riral.org Mon Mar 27 15:55:46 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Mon, 27 Mar 2006 15:55:46 -0500 Subject: [Assessment 270] Re: Questions about the new ETS assessments In-Reply-To: <465E6E65-4CB2-4360-9D25-E2B8E167A765@comcast.net> References: <32E5C6B0B949584D9B6168C5F727916BB6E0F2@SOM-TEAQASMAIL1.som.w2k.state.me.us> <465E6E65-4CB2-4360-9D25-E2B8E167A765@comcast.net> Message-ID: <44285152.6090501@riral.org> And my question about the costs is, what is the value added? What will learners and instructors receive that will inform future learning, that other assessments don't already provide? What will program managers receive that will inform program design and funding allocations, beyond what current assessments and evaluations already provide? What will legislators and other funders learn that will benefit future funding, that they are not already learning from adult education advocates? And, what will be reportable to the NRS that is not already reported? My understanding of these assessments is that they are developing standardized assessments aligned to the EFF content standards. Do we need this? Aren't other assessments already aligned with EFF enough to provide the information each stakeholder needs to make their decisions? Isn't the current mix of formal and informal, standardized and local, performance and objective assessments essentially doing the necessary tasks (leaving room for continuous improvements, of course)? I can see why some programs may want to have a highly aligned, standardized assessment, and perhaps they should have that option and be wiling to pay for it. But will most of us need it? I believe that in considering the most efficient use of scarce resources, we should look to value added by these investments. Howard D. David Rosen wrote: >Assessment Colleagues, > >Thanks for bringing this up, Marcia. In K-12 and higher education, >the costs of assessment relative to instruction are relatively >small. In vastly underfunded adult literacy education the additional >costs may be relatively significant, and may severely drain resources >from already thinly-provided instruction. Does this mean we should >abandon assessment? Of course not. 
But perhaps it is time to cost >it out, and to ask Congress and state legislatures to pay for the >increased costs. It is possible that some state legislatures -- and >possibly Congress -- would understand a line item increase needed to >pay for assessment, even in times of fiscal restraint. > >So, my question for Julie Eastland: What are the costs involved? > >? For being a participating state? >? For teachers' time to learn how to conduct the assessments? >? For the assessment instruments? > >And what might this be in terms of additional cost-per-student? In >other words, if a State Director of Adult Education were asked by a >legislator how much it would cost to fully implement EFF assessments, >what would be the answer? What would the additional cost per student >be, recognizing that that would vary for states with larger or fewer >numbers of students? > >David J. Rosen >newsomeassociates.com >djrosen at comcast.net

From jgordon at fortunesociety.org Mon Mar 27 17:07:30 2006 From: jgordon at fortunesociety.org (John Gordon) Date: Mon, 27 Mar 2006 17:07:30 -0500 Subject: [Assessment 271] Re: More questions for ETS Message-ID: Did you say $10 per test? Does that mean that each time we test someone, it would cost $10? John Gordon

From jeastland at ETS.ORG Mon Mar 27 18:40:47 2006 From: jeastland at ETS.ORG (Eastland, Julie) Date: Mon, 27 Mar 2006 18:40:47 -0500 Subject: [Assessment 272] Re: More questions for ETS Message-ID: <95782C26106D904E814E554F47EACA9C01AEA7A7@rosnt115.etslan.org> Yes. The costs at this time are expected to be $10 per test. The tests are designed to be web-delivered and administered, scored on-line, and give score reports, and a downloadable database of results in a file format that is compatible with most spreadsheets (excel, access, etc). There is almost no administrative burden, in that the instructors do not have to score the tests or record the results. Administration can be done in a computer lab and instructors do not need to administer the test -once the student is set up at a computer, the test is administered on the computer. Julie

From MKroege at ade.az.gov Tue Mar 28 09:56:25 2006 From: MKroege at ade.az.gov (Kroeger, Miriam) Date: Tue, 28 Mar 2006 07:56:25 -0700 Subject: [Assessment 273] Cost factors for assessments Message-ID: <1DE339C47662EC4E992656C5E72AABC401BF237F@prodmail2.prod.root> $10 - yikes! Particularly if we are conducting these types of assessments more than twice. However, before we give thumbs down because of cost, I think we have to remember to factor in the "hidden" costs of what we do now. We have teachers' or test administrators' time "proctoring" the tests; we have time correcting the tests; sometimes there is time taken from instruction. And if we are testing when we consider that it's time for the student to test (even if it's as little as 4 weeks or 40 hours), as opposed to a set 8 weeks, or 80 - 100 hours, then the teacher needs to keep on teaching those who are not ready to test, so we need another person to administer the test. So, again, what are our true costs for what we do now as compared to this option? Not that I'm gung-ho. I'm trying to figure out how to assure that everyone has access to the Internet!! -Miriam Kroeger Arizona
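Miriam's question about what we really spend now, once proctoring and hand-scoring time are counted, lends itself to a back-of-the-envelope calculation. The short Python sketch below compares a current paper-based testing model with the proposed $10 online test for a hypothetical program. Only the $10-per-test figure and the pre-/post-testing pattern come from this discussion; the wage, minutes-per-test, booklet cost, and enrollment numbers are invented placeholders that each program would swap for its own figures.

# Hypothetical cost comparison: current paper-based testing vs. the
# proposed $10 online test. All staffing and enrollment figures below
# are placeholders, not numbers from ETS or from any state.

def paper_cost_per_admin(wage_per_hour=25.0, proctor_min=45, scoring_min=15,
                         booklet_cost=3.0):
    """Estimated cost of one paper administration (hypothetical inputs)."""
    staff_hours = (proctor_min + scoring_min) / 60.0
    return booklet_cost + staff_hours * wage_per_hour

def online_cost_per_admin(test_fee=10.0, setup_min=10, wage_per_hour=25.0):
    """Estimated cost of one online administration: the fee plus brief staff setup."""
    return test_fee + (setup_min / 60.0) * wage_per_hour

def annual_cost(per_admin, learners=200, tests_per_learner=2):
    """Program-level cost of pre- and post-testing every learner."""
    return per_admin * learners * tests_per_learner

paper = paper_cost_per_admin()    # about $28 per administration with these inputs
online = online_cost_per_admin()  # about $14 per administration with these inputs
print(f"paper:  ${annual_cost(paper):,.0f} per year")
print(f"online: ${annual_cost(online):,.0f} per year")

With different assumptions (group-administered paper tests, volunteer proctors, or extra computer-lab staffing), the comparison can easily flip, which is exactly why it is worth each program running the numbers with its own figures.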
From jtaustin1 at core.com Tue Mar 28 10:17:47 2006 From: jtaustin1 at core.com (james austin) Date: Tue, 28 Mar 2006 10:17:47 -0500 Subject: [Assessment 274] Questions for ETS In-Reply-To: Message-ID: <000001c6527a$c614af90$2efdfea9@JimUpTwo> Folks, this is a reply to the postings of yesterday on the EFF_ETS testing system. Q&A on the EFF_ETS Assessments for NIFL Listserv Assessment My name is Jim Austin. I wanted to provide an aggregate response to the postings that addresses their diversity, but first let me provide some background on my involvement in the EFF_ETS project. MY STAKE: I am an advocate and I've been working on this testing project for Ohio over four meetings since early 2005. The process has been fruitful and completely voluntary. I believe that it carries much potential for advancing the field of adult education. Recently, with the initial NALS results and Performance levels for adult literacy (NRC, 2005) there is a focus on testing. A major advantage that I see is the evidence-based nature of this testing system, when this standard is a consistent chorus from policymakers. I also realize the practical and conceptual objections. Technology requirements & cost ($10, which means $20 if we honor our assurances of WIA Title II and OVAE requirements to pre- and post-test learners) are two practical objections. It is clear from the NRC (2002) report on performance assessment for adults that much less than $1,000 is spent on average per student per year in the ABLE system and that the proposed cost would be an increase over the cheaper paper-pencil and computer-delivered competitors. Cost was an overriding concern to most posting on this question, and for the participant states as well. I believe that $10 per test is a rock-bottom price for ETS. Conceptual objections are raised by participants in the request for value-added logic, in understanding the development and evaluation of the test, and in dealing with linkage to specific state content standards. To me, value-added crosses over into practicality because of the potential benefits of the EFF_ETS tests. HISTORY: Prior discussions included a presentation at NIFL for testing firms after which ETS was the only firm expressing interest in moving forward. Discussions at COABE in 2004 consisted of a dinner meeting at which Dr. Irwin Kirsch of ETS presented about a multi-phase project that would work through constructing test blueprints, developing pilot forms, and completing field testing. Interest in Ohio stemmed from a failure to obtain permission from OVAE to use and continue to evaluate a portfolio system which my work unit had developed to support assessment of a standards-based system developed within EFF. Ohio developed benchmarks with field practitioners, national expert participation or review, and standards validation via committees or surveys to create benchmarks for Reading, Writing, and Mathematics as well as Reading-Writing and Speaking-Listening domains. Those standards and benchmarks have been increasingly accepted across the 130+ local ABLE programs. At this juncture Ohio was a state interested in maintaining the momentum of standards-based education while recognizing that existing test publishers were reluctant to push the envelope (NOTE: this viewpoint is mine and based on my experiences and testing background. It may not be the interpretation of all). Publishers were willing to perform or to commission crosswalks or alignments of their test products against the Ohio benchmarks. 
These alignments vary in the level of detail, and must often be supplemented. Publishers did not appear as willing to extend test designs to honor advances in understanding literacy. Likewise on test delivery and on innovative methods of mapping items and response tasks together. PROCESS: Working as partners with ETS and EFF personnel, states have contributed time and thought to map out a test specification using two general definitions provided below. We were aided by skilled (and patient!) facilitators from ETS who guided the group across the meetings. The basic plan is to merge concepts of Prose, Document, & Quantitative literacy with on-line delivery and advances in reading comprehension and adult numeracy. PDQ is an area in which ETS has been working in its large-scale assessments of adult literacy. On-line (and only on-line) delivery permits multiple advantages in scoring (item response theory) and reporting (comparison to national and international surveys: ALL, NAAL). Lastly, the fruits of the Adult Reading Components Study (ARCS) and Adult Numeracy frameworks are visible in the proposed tests. Using IRT and a construct approach to test creation, the value-added will come from moving away from strictly choice responses, more meaningful reporting via item maps that relate scores to task performances, and diagnostics for teachers. ETS Standards of Fairness and Quality will be applied to ensure that the testing product conforms to recognized standards. Using Math to Solve Problems (UMSP) means acting on, interpreting, and communicating mathematical information in order to fulfill responsibilities as parent/family member, citizen/community member, and worker. Read with Understanding (RWU) means retrieving, analyzing, integrating, and reflecting on information from text (continuous and non-continuous) in order to fulfill responsibilities as parent/family member, citizen/community member, and worker. POST #2 (Marie Cora of NIFL) -Are these assessments that students will take on-line? -Aren't these assessments based on EFF (Equipped for the Future)? What if my state does not use EFF? -What does a state need to do in order to join the project? -If my state joins this project, does that mean we must then use these assessments? If we don't join this project, can my state access these assessments anyway, even if we were not in the development project? POST #3 (Marcia Cook of Maine) Maine began to participate as a Charter State. Our biggest challenge is money. Although we have for years been actively involved in EFF and in many areas still are, we do not have the money at this time to continue to participate in the project. Our biggest concern about the EFF assessments is the cost of the assessments to each program. We are a small state population wise and receive a relatively small amount of money. I will be anxious to hear this addressed during the week. POST #4 (Susan O'Connor of Brooklyn Public Library) Brooklyn Public Library considered participation in this project but the cost was so high that we were unable to even consider participation. This leads me to wonder about the efficacy of such an expensive venture for adult literacy programs that have no fat at all to expend on anything other than instruction. POST #5 (Julie Eastland of ETS) 1. Are these assessments that students will take on-line? Yes. They will be developed for on-line use. 
The students will take the assessments on-line, and the assessments will be scored and scaled in real-time on -line and immediately produce score reports, which institutions can release directly to the student and/or teacher. There will be a tutorial developed for students to access if they need assistance with basic computer skills such as using a mouse and highlighting, and the tutorial will also allow students to become familiar with the item types that will be seen on the test. 2. Aren't these assessments based on EFF (Equipped for the Future)? What if my state does not use EFF? Yes, they are based on the EFF Reading and Math frameworks, and will use real-life materials to assess student knowledge and proficiency in those areas. We will work with participants to align the tests to meet the NRS requirements, which should make the tests useful even in states where the EFF frameworks are not utilized. 3. What does a state need to do in order to join the project? A state would need to commit to the next phase of the project (a 2 year period), which would mean signing a Memorandum of Understanding with ETS, agreeing to attend development meetings, and piloting all of the tests (reading with Understanding, Reading Components, and Using Math to Solve Problems) with a minimum of 330 adult test candidates. The cost of piloting is expected to be $10,000 per state. If your state is interested, I would encourage you to contact me directly for more information about this effort - I would be happy to share a copy of the MOU with you.. 4. If my state joins this project, does that mean we must then use these assessments? If we don't join this project, can my state access these assessments anyway, even if we were not in the development project? If your state decides to join this project, you will not be required to use the assessments beyond the pilot stage. If you are involved in the pilot stage, you will have input into the development of the tests, including score reports. If you were not in the development project, the tests will be made available for purchase after the development phase. At this time, we anticipate the final costs of the test to be approximately $10 per test. POST #6 (David Rosen of Newsome Associates) Thanks for bringing this up, Marcia. In K-12 and higher education, the costs of assessment relative to instruction are relatively small. In vastly underfunded adult literacy education the additional costs may be relatively significant, and may severely drain resources from already thinly-provided instruction. Does this mean we should abandon assessment? Of course not. But perhaps it is time to cost it out, and to ask Congress and state legislatures to pay for the increased costs. It is possible that some state legislatures -- and possibly Congress -- would understand a line item increase needed to pay for assessment, even in times of fiscal restraint. So, my question for Julie Eastland: What are the costs involved? . For being a participating state? . For teachers' time to learn how to conduct the assessments? . For the assessment instruments? And what might this be in terms of additional cost-per-student? In other words, if a State Director of Adult Education were asked by a legislator how much it would cost to fully implement EFF assessments, what would be the answer? What would the additional cost per student be, recognizing that that would vary for states with larger or fewer numbers of students? 
POST #7 (Howard Dooley of Rhode Island Regional Adult Learning) And my question about the costs is, what is the value added? What will learners and instructors receive that will inform future learning, that other assessments don't already provide? What will program managers receive that will inform program design and funding allocations, beyond what current assessments and evaluations already provide? What will legislators and other funders learn that will benefit future funding, that they are not already learning from adult education advocates? And, what will be reportable to the NRS that is not already reported? My understanding of these assessments is that they are developing standardized assessments aligned to the EFF content standards. Do we need this? Aren't other assessments already aligned with EFF enough to provide the information each stakeholder needs to make their decisions? Isn't the current mix of formal and informal, standardized and local, performance and objective assessments essentially doing the necessary tasks (leaving room for continuous improvements, of course)? I can see why some programs may want to have a highly aligned, standardized assessment, and perhaps they should have that option and be wiling to pay for it. But will most of us need it? I believe that in considering the most efficient use of scarce resources, we should look to value added by these investments. POST #7 (John Gordon of ) Did you say $10 per test? Does that mean that each time we test someone, it would cost $10? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060328/1abf2f5f/attachment.html From marie.cora at hotspurpartners.com Tue Mar 28 11:22:08 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 28 Mar 2006 11:22:08 -0500 Subject: [Assessment 275] Qs for ETS Message-ID: <005f01c65283$c35a3460$0402a8c0@frodo> The following questions are posted by request. Thanks, marie cora My own personal concern is that the students have to take the tests on the computer, yes? And if so, what happens if they are not computer literate? And if the program does not have access to computers so they can teach computer literacy? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060328/775e7af8/attachment.html From bryan at cal.org Tue Mar 28 11:30:26 2006 From: bryan at cal.org (Bryan Woerner) Date: Tue, 28 Mar 2006 11:30:26 -0500 Subject: [Assessment 276] Re: Qs for ETS Message-ID: <7E0B624DDF68104F92C38648A4D93D8FF69555@MAIL.cal.local> I don't want distract from the topic at hand, but there is a free website available that that teaches students how to use a computer in a easy to use and friendly manner. It was developed by the Arlington Education and Employment Program in Virginia. www.reepworld.org Anyway, sorry for the aside. 
Bryan Bryan Woerner BEST Plus Operations Assistant bryan at cal.org BEST Plus Contact Information Toll free: 1-866-845-BEST (2378) Fax: 1-888-700-3629 Email: best-plus at cal.org User Support Hours: M-F 9am-7pm EST Mail: BEST Plus c/o Center for Applied Linguistics 4646 40th Street, NW Washington, DC 20016-1859 Web site: www.best-plus.net Center for Applied Linguistics www.cal.org ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, March 28, 2006 11:22 AM To: assessment at nifl.gov Subject: [Assessment 275] Qs for ETS The following questions are posted by request. Thanks, marie cora My own personal concern is that the students have to take the tests on the computer, yes? And if so, what happens if they are not computer literate? And if the program does not have access to computers so they can teach computer literacy? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060328/29ab2cad/attachment.html From jeastland at ETS.ORG Tue Mar 28 11:34:10 2006 From: jeastland at ETS.ORG (Eastland, Julie) Date: Tue, 28 Mar 2006 11:34:10 -0500 Subject: [Assessment 277] Re: Qs for ETS Message-ID: <95782C26106D904E814E554F47EACA9C01AEAA4D@rosnt115.etslan.org> Hi Everyone, Yes, the tests will only be available on computer. We have designed the interface for these tests to be as simple as possible. There will be a tutorial that students can access which will familiarize them not only with specific item types, but with other computer skills such as using a mouse and highlighting. The tutorial will be free and students can access it and practice their skills as often as they want prior to taking the tests. An example of this kind of tutorial can be found by going to www.ets.org/etsliteracy, then clicking on the 'Sample Questions Available' link. Once you click on that link, you will see a page which has the Sample Questions and a "Tutorial" tab. Click on the "Tutorial" tab. Follow the instructions on that page to run the tutorial. Best regards, Julie ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, March 28, 2006 11:22 AM To: assessment at nifl.gov Subject: [Assessment 275] Qs for ETS The following questions are posted by request. Thanks, marie cora My own personal concern is that the students have to take the tests on the computer, yes? And if so, what happens if they are not computer literate? And if the program does not have access to computers so they can teach computer literacy? -------------------------------------------------- This e-mail and any files transmitted with it may contain privileged or confidential information. It is solely for use by the individual for whom it is intended, even if addressed incorrectly. If you received this e-mail in error, please notify the sender; do not disclose, copy, distribute, or take any action in reliance on the contents of this information; and delete it from your system. Any other use of this e-mail is prohibited. Thank you for your compliance. -------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060328/ac115d35/attachment.html From elohimsgirl at adelphia.net Wed Mar 29 10:13:02 2006 From: elohimsgirl at adelphia.net (elohimsgirl at adelphia.net) Date: Wed, 29 Mar 2006 7:13:02 -0800 Subject: [Assessment 277] Re: More questions for ETS Message-ID: <22884717.1143645182145.JavaMail.root@web25> -- What do you all feel about the cost? That sounds expensive to me. I am interested to hear others' thoughts. Marlo Thomas Watson Director of Adult and Workforce Education Northeast Kingdom Learning Services (NEKLS) 364 Railroad Street, Suite 2 St. Johnsbury, VT 05819 (802) 748-5624 phone (802) 751-8071 fax elohimsgirl at adelphia.net You are only as great as your committment to make it happen! ---- John Gordon wrote: ============= Did you say $10 per test? Does that mean that each time we test someone, it would cost $10? John Gordon _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Eastland, Julie Sent: Monday, March 27, 2006 11:22 AM To: The Assessment Discussion List Subject: [Assessment 268] Re: More questions for ETS Hi Everyone, I will be available periodically this week to answer the questions that are posted. 1. Are these assessments that students will take on-line? Yes. They will be developed for on-line use. The students will take the assessments on-line, and the assessments will be scored and scaled in real-time on -line and immediately produce score reports, which institutions can release directly to the student and/or teacher. There will be a tutorial developed for students to access if they need assistance with basic computer skills such as using a mouse and highlighting, and the tutorial will also allow students to become familiar with the item types that will be seen on the test. 2. Aren't these assessments based on EFF (Equipped for the Future)? What if my state does not use EFF? Yes, they are based on the EFF Reading and Math frameworks, and will use real-life materials to assess student knowledge and proficiency in those areas. We will work with participants to align the tests to meet the NRS requirements, which should make the tests useful even in states where the EFF frameworks are not utilized. 3. What does a state need to do in order to join the project? A state would need to commit to the next phase of the project (a 2 year period), which would mean signing a Memorandum of Understanding with ETS, agreeing to attend development meetings, and piloting all of the tests (reading with Understanding, Reading Components, and Using Math to Solve Problems) with a minimum of 330 adult test candidates. The cost of piloting is expected to be $10,000 per state. If your state is interested, I would encourage you to contact me directly for more information about this effort - I would be happy to share a copy of the MOU with you.. 4. If my state joins this project, does that mean we must then use these assessments? If we don't join this project, can my state access these assessments anyway, even if we were not in the development project? If your state decides to join this project, you will not be required to use the assessments beyond the pilot stage. If you are involved in the pilot stage, you will have input into the development of the tests, including score reports. If you were not in the development project, the tests will be made available for purchase after the development phase. At this time, we anticipate the final costs of the test to be approximately $10 per test. 
I look forward to further discussion this week. Best regards, Julie ________________________ Julie K. Eastland Program Administrator Center for Global Assessment Educational Testing Service Rosedale Rd. Princeton, NJ 08541 jeastland at ets.org ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov ] On Behalf Of Marie Cora Sent: Monday, March 27, 2006 11:08 AM To: assessment at nifl.gov Subject: [Assessment 265] More questions for ETS Hello everyone, I have received a few questions to be posted for the discussion: -Are these assessments that students will take on-line? -Aren't these assessments based on EFF (Equipped for the Future)? What if my state does not use EFF? -What does a state need to do in order to join the project? -If my state joins this project, does that mean we must then use these assessments? If we don't join this project, can my state access these assessments anyway, even if we were not in the development project? If you have questions, please feel free to send your question to me for posting - or feel free to post it yourself. Thanks, marie cora Assessment Discussion List Moderator -------------------------------------------------- This e-mail and any files transmitted with it may contain privileged or confidential information. It is solely for use by the individual for whom it is intended, even if addressed incorrectly. If you received this e-mail in error, please notify the sender; do not disclose, copy, distribute, or take any action in reliance on the contents of this information; and delete it from your system. Any other use of this e-mail is prohibited. Thank you for your compliance. -------------------------------------------------- From hdooley at riral.org Wed Mar 29 12:28:36 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Wed, 29 Mar 2006 12:28:36 -0500 Subject: [Assessment 278] Re: More questions about the price In-Reply-To: <22884717.1143645182145.JavaMail.root@web25> References: <22884717.1143645182145.JavaMail.root@web25> Message-ID: <442AC3C4.2050404@riral.org> I think it's a little early to be overly-concerned about the price. I am concerned that most of the postings this week are about this one point. I do think that the field needs to develop these assessments, to see what these assessments are and what they can give our learners, our instructors, our managers, our directors, and our fellow stakeholders, particularly given the investment we have made in EFF over the past ten years. To be effective for us, it needs to be a complete system, which means (among so many other things) having an assessment component. The tests are in development, and it's not certain what the cost/administration will be once they are up and running, and large numbers of programs are using them. Also, my organization uses performance tests in some classes and programs, and objective, pen-and-pencil tests in others, depending on the learners, their learning difficulties, the instructional content, and what we need from the standardized assessment piece of our learner assessments. So, if we used these tests, we wouldn't necessarily be using them for every learner in every program. Third, if you pre-test each learner, and then post-test each learner once or twice a year, that is a $20 or $30 cost/learner at the current rate. Is that too much, to meet federal and state requirements? And thus keep those grants? I think you'd have to look at how much you are spending now, and how much of an increase this represents. 
And, as I said previously, you would need to consider the value added by these assessments over what you are currently using. I don't know that funders would balk at these costs (by which I mean I personally don't think they would), if you have a cogent explanation as to why these assessments are the best -- or perhaps necessary -- choice to demonstrate your learners success in achieving their educational and life goals. Howard D. elohimsgirl at adelphia.net wrote: >-- >What do you all feel about the cost? That sounds expensive to me. I am interested to hear others' thoughts. > >Marlo Thomas Watson >Director of Adult and Workforce Education >Northeast Kingdom Learning Services (NEKLS) >364 Railroad Street, Suite 2 >St. Johnsbury, VT 05819 > >(802) 748-5624 phone (802) 751-8071 fax > >elohimsgirl at adelphia.net > > > From marie.cora at hotspurpartners.com Thu Mar 30 10:32:47 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 30 Mar 2006 10:32:47 -0500 Subject: [Assessment 279] Re: ETS Tutorial In-Reply-To: <95782C26106D904E814E554F47EACA9C01AEAA4D@rosnt115.etslan.org> Message-ID: <01d501c6540f$3312f1c0$0402a8c0@frodo> Hi everyone, Has anyone tried out the Tutorial? What do you think? marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Eastland, Julie Sent: Tuesday, March 28, 2006 11:34 AM To: The Assessment Discussion List Subject: [Assessment 277] Re: Qs for ETS Hi Everyone, Yes, the tests will only be available on computer. We have designed the interface for these tests to be as simple as possible. There will be a tutorial that students can access which will familiarize them not only with specific item types, but with other computer skills such as using a mouse and highlighting. The tutorial will be free and students can access it and practice their skills as often as they want prior to taking the tests. An example of this kind of tutorial can be found by going to www.ets.org/etsliteracy, then clicking on the 'Sample Questions Available' link. Once you click on that link, you will see a page which has the Sample Questions and a "Tutorial" tab. Click on the "Tutorial" tab. Follow the instructions on that page to run the tutorial. Best regards, Julie _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, March 28, 2006 11:22 AM To: assessment at nifl.gov Subject: [Assessment 275] Qs for ETS The following questions are posted by request. Thanks, marie cora My own personal concern is that the students have to take the tests on the computer, yes? And if so, what happens if they are not computer literate? And if the program does not have access to computers so they can teach computer literacy? -------------------------------------------------- This e-mail and any files transmitted with it may contain privileged or confidential information. It is solely for use by the individual for whom it is intended, even if addressed incorrectly. If you received this e-mail in error, please notify the sender; do not disclose, copy, distribute, or take any action in reliance on the contents of this information; and delete it from your system. Any other use of this e-mail is prohibited. Thank you for your compliance. -------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060330/88b0177b/attachment.html From marie.cora at hotspurpartners.com Thu Mar 30 11:25:09 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 30 Mar 2006 11:25:09 -0500 Subject: [Assessment 280] Pending questions and discussion Message-ID: <01da01c65416$83c46de0$0402a8c0@frodo> Hi everyone, There are clearly a series of questions here that we need to address. Some of them Julie and ETS folks can respond to; others are probably for us to pursue discussing and need the attention of policy-makers (local and otherwise). Julie - I wonder if you might respond to these questions posed by David Rosen: What would be the cost: >. For teachers' time to learn how to conduct the assessments? . For the assessment instruments? Also, the description of a state's involvement includes attending development meetings and piloting all the tests - how much time is involved in completing these tasks? I'm unsure who might respond to this series of David's questions - anyone?: >And what might this be in terms of additional cost-per-student? In >other words, if a State Director of Adult Education were asked by a >legislator how much it would cost to fully implement EFF assessments, >what would be the answer? What would the additional cost per student >be, recognizing that that would vary for states with larger or fewer >numbers of students? As for value added and present costs - Howard you asked some questions related to how this might be better than what we have now. I think these are questions for Julie, ETS folks, and Jim Austin and other pilot participants to respond to, if they can: Howard Dooley wrote: And my question about the costs is, what is the value added? What will learners and instructors receive that will inform future learning, that other assessments don't already provide? What will program managers receive that will inform program design and funding allocations, beyond what current assessments and evaluations already provide? What will legislators and other funders learn that will benefit future funding, that they are not already learning from adult education advocates? And, what will be reportable to the NRS that is not already reported? But Howard, I will respond back to you on these couple of questions from my personal point of view: Howard wrote: Aren't other assessments already aligned with EFF enough to provide the information each stakeholder needs to make their decisions? Marie: I would say no, there aren't. I also question "aligned enough" - isn't that exactly what we want to get away from? This Discussion List had quite a conversation on standards a couple of weeks ago, and that discussion clearly expressed people's frustrations with the lack of national standards and aligned curriculum and assessment. Howard: Isn't the current mix of formal and informal, standardized and local, performance and objective assessments essentially doing the necessary tasks (leaving room for continuous improvements, of course)? Marie: Yes, I would say that it is - but at what *cost*? The intense juggling that programs and states must tackle in order to meet all the demands they face is taxing to put it mildly. Wouldn't it be better if we had a system in which the pieces fit together so seamlessly that there wouldn't be any juggling to do? I invite everyone to join this discussion further. Let us know your thoughts, the answers to any of the above questions, or ask us your own question. 
Thanks, marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060330/f596b1ab/attachment.html From samuel.mcgraw at seattlegoodwill.org Thu Mar 30 12:06:43 2006 From: samuel.mcgraw at seattlegoodwill.org (Samuel McGraw III) Date: Thu, 30 Mar 2006 09:06:43 -0800 Subject: [Assessment 281] Re: ETS Tutorial Message-ID: <802F2B4590320142A57872DC43A2BFD20218AF5C@seamail.seagoodwill.org> I just tried it. It is not bad. It has all the usual problems that a user (or the literacy student) will have using a computer to take an assessment. In other words, there are inherent problems when students use a computer to test or assess; however, I believe it is a great start. Samuel McGraw III M. Ed. Program Coordinator Goodwill Tel: 206.860.5789 Fax: 206.325.9845 http://www.seattlegoodwill.org Because jobs change lives -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, March 30, 2006 7:33 AM To: 'The Assessment Discussion List' Subject: [Assessment 279] Re: ETS Tutorial Hi everyone, Has anyone tried out the Tutorial? What do you think? marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Eastland, Julie Sent: Tuesday, March 28, 2006 11:34 AM To: The Assessment Discussion List Subject: [Assessment 277] Re: Qs for ETS Hi Everyone, Yes, the tests will only be available on computer. We have designed the interface for these tests to be as simple as possible. There will be a tutorial that students can access which will familiarize them not only with specific item types, but with other computer skills such as using a mouse and highlighting. The tutorial will be free and students can access it and practice their skills as often as they want prior to taking the tests. An example of this kind of tutorial can be found by going to www.ets.org/etsliteracy, then clicking on the 'Sample Questions Available' link. Once you click on that link, you will see a page which has the Sample Questions and a "Tutorial" tab. Click on the "Tutorial" tab. Follow the instructions on that page to run the tutorial. Best regards, Julie _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, March 28, 2006 11:22 AM To: assessment at nifl.gov Subject: [Assessment 275] Qs for ETS The following questions are posted by request. Thanks, marie cora My own personal concern is that the students have to take the tests on the computer, yes? And if so, what happens if they are not computer literate? And if the program does not have access to computers so they can teach computer literacy? -------------------------------------------------- This e-mail and any files transmitted with it may contain privileged or confidential information. It is solely for use by the individual for whom it is intended, even if addressed incorrectly. If you received this e-mail in error, please notify the sender; do not disclose, copy, distribute, or take any action in reliance on the contents of this information; and delete it from your system. Any other use of this e-mail is prohibited. Thank you for your compliance. -------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060330/1883628b/attachment.html From Tina_Luffman at yc.edu Thu Mar 30 12:17:05 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Thu, 30 Mar 2006 10:17:05 -0700 Subject: [Assessment 282] Re: Pending questions and discussion Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060330/3d0c3ac4/attachment.html From marie.cora at hotspurpartners.com Fri Mar 31 11:23:00 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 31 Mar 2006 11:23:00 -0500 Subject: [Assessment 283] Re: ETS Tutorial Message-ID: <032d01c654df$61649620$0402a8c0@frodo> Dear everyone, I have received the following email to post. Thanks, marie cora Assessment Discussion List Moderator I went online to try the tutorial as you suggested and I wasn't clear if those were sample questions in general or if they are specific to the new EFF assessments. If they are, I saw a great connection to authentic texts, but not a clear tie to the components of the standards. Basically, they looked a lot like GED (or TABE) questions, but in a computerized format. Also, I'm pretty sure there's an error on one of the sample questions in the health section. The question asks how many minutes you'd have to run to burn off calories from eating a tuna sandwich, but then you're supposed to highlight the answer in the chart, but all the chart says is how many calories you burn per minute running. I would think there'd need to be a box to fill in the response. I was hoping that you might be able to clarify for me. Thank you. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060331/8b3aed34/attachment.html From jeastland at ETS.ORG Fri Mar 31 11:30:31 2006 From: jeastland at ETS.ORG (Eastland, Julie) Date: Fri, 31 Mar 2006 11:30:31 -0500 Subject: [Assessment 284] Re: ETS Tutorial Message-ID: <95782C26106D904E814E554F47EACA9C01B3BA59@rosnt115.etslan.org> Hi Everyone, In answer to your questions about the sample items - these items are not specific to the EFF frameworks. The items that we plan to develop for the new assessments will be more closely tied to the EFF standards. The items you saw are examples of the kinds of items that are tied to the IALS and ALL Prose, Document, and Quantitative scales. We worked very hard to develop items using real-life content that adults are likely to encounter in their daily lives. I am glad that the connection to authentic texts was apparent! The sample question you refer to about the tuna sandwich does not have an error - the chart contains information about how many calories there are in a tuna sandwich and the various minutes per activity it would take to burn off those calories. Please feel free to take a second look. Regards, Julie Eastland ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Friday, March 31, 2006 11:23 AM To: assessment at nifl.gov Subject: [Assessment 283] Re: ETS Tutorial Dear everyone, I have received the following email to post. Thanks, marie cora Assessment Discussion List Moderator I went online to try the tutorial as you suggested and I wasn't clear if those were sample questions in general or if they are specific to the new EFF assessments. If they are, I saw a great connection to authentic texts, but not a clear tie to the components of the standards. 
Basically, they looked a lot like GED (or TABE) questions, but in a computerized format. Also, I'm pretty sure there's an error on one of the sample questions in the health section. The question asks how many minutes you'd have to run to burn off calories from eating a tuna sandwich, but then you're supposed to highlight the answer in the chart, but all the chart says is how many calories you burn per minute running. I would think there'd need to be a box to fill in the response. I was hoping that you might be able to clarify for me. Thank you. -------------------------------------------------- This e-mail and any files transmitted with it may contain privileged or confidential information. It is solely for use by the individual for whom it is intended, even if addressed incorrectly. If you received this e-mail in error, please notify the sender; do not disclose, copy, distribute, or take any action in reliance on the contents of this information; and delete it from your system. Any other use of this e-mail is prohibited. Thank you for your compliance. -------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060331/43fdbc4a/attachment.html From jeastland at ETS.ORG Mon Apr 3 09:54:48 2006 From: jeastland at ETS.ORG (Eastland, Julie) Date: Mon, 3 Apr 2006 09:54:48 -0400 Subject: [Assessment 285] Re: Pending questions and discussion Message-ID: <95782C26106D904E814E554F47EACA9C01B3BE60@rosnt115.etslan.org> Hi Everyone, In response to the questions below: >* What would be the cost: For teachers' time to learn how to conduct the assessments? * For the assessment instruments? The tests are designed to be administered on-line, directly with the students. The teachers would only need to be available to get the students started with the tests, so we believe that the time and cost for these activities would be minimal. I believe that we would be able to better determine these costs during the pilot stage. >*Also, the description of a state's involvement includes attending development meetings and piloting all the tests - how much time is involved in completing these tasks? The states would need to be involved in attending 3-4 meetings during the pilot stage (over a two year period), one that is mainly about item development and development of score reports, one for pilot implementation, one to participate in a standard setting process to map the tests to NRS levels, and one at the end of the pilot to discuss the results. The meetings would be 2-3 days each. >And what might this be in terms of additional cost-per-student? In other words, if a State Director of Adult Education were asked by a legislator how much it would cost to fully implement EFF assessments, what would be the answer? What would the additional cost per student be, recognizing that that would vary for states with larger or fewer numbers of students? The tests are planned to cost $10 per test, and there will be 2 forms of each test for pre- and post-testing purposes. I think that the costs per student would vary by state, depending on the number of students tested, the tests they take (Read with Understanding, Reading Components, Using Math to Solve Problems) and the number of times they are tested. And my question about the costs is, what is the value added? What will learners and instructors receive that will inform future learning, that other assessments don't already provide? 
What will program managers receive that will inform program design and funding allocations, beyond what current assessments and evaluations already provide? What will legislators and other funders learn that will benefit future funding, that they are not already learning from adult education advocates? And, what will be reportable to the NRS that is not already reported? We believe the value added will take many forms. These assessments will be based on current research in the areas of reading and numeracy skills; provide diagnostic information, especially with the reading components measures, which can be used to profile adult learners, inform them of their progress, and guide instructional planning; allow measurement of gains at the lowest skill levels; map to existing standards-based professional development and curricula; use open-ended tasks based on authentic materials selected from adult roles and contexts; offer an integrated system which administers, scores, and analyzes responses in real time and which can be used in conjunction with existing state information management systems; be linked to the National Adult Literacy Survey, the International Adult Literacy Survey, and the Adult Literacy and Life Skills Survey, making it easy for policy makers and program managers to make connections to social and economic benchmarks and to track changes over time; and predict student success in educational and workforce environments. Score reports will be designed with input from literacy practitioners, to better reflect the kinds of information that will be useful to test takers, teachers, and policy makers. The discussion about this project has been quite informative and I hope that you will contact me directly if you are interested in participating. Best regards, Julie K. Eastland Program Administrator Center for Global Assessment Educational Testing Service Rosedale Rd. Princeton, NJ 08541 jeastland at ets.org ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, March 30, 2006 11:25 AM To: assessment at nifl.gov Subject: [Assessment 280] Pending questions and discussion Hi everyone, There are clearly a series of questions here that we need to address. Some of them Julie and ETS folks can respond to; others are probably for us to pursue discussing and need the attention of policy-makers (local and otherwise). Julie - I wonder if you might respond to these questions posed by David Rosen: What would be the cost: >* For teachers' time to learn how to conduct the assessments? * For the assessment instruments? Also, the description of a state's involvement includes attending development meetings and piloting all the tests - how much time is involved in completing these tasks? I'm unsure who might respond to this series of David's questions - anyone?: >And what might this be in terms of additional cost-per-student? In >other words, if a State Director of Adult Education were asked by a >legislator how much it would cost to fully implement EFF assessments, >what would be the answer? What would the additional cost per student >be, recognizing that that would vary for states with larger or fewer >numbers of students? As for value added and present costs - Howard you asked some questions related to how this might be better than what we have now. 
I think these are questions for Julie, ETS folks, and Jim Austin and other pilot participants to respond to, if they can: Howard Dooley wrote: And my question about the costs is, what is the value added? What will learners and instructors receive that will inform future learning, that other assessments don't already provide? What will program managers receive that will inform program design and funding allocations, beyond what current assessments and evaluations already provide? What will legislators and other funders learn that will benefit future funding, that they are not already learning from adult education advocates? And, what will be reportable to the NRS that is not already reported? But Howard, I will respond back to you on these couple of questions from my personal point of view: Howard wrote: Aren't other assessments already aligned with EFF enough to provide the information each stakeholder needs to make their decisions? Marie: I would say no, there aren't. I also question "aligned enough" - isn't that exactly what we want to get away from? This Discussion List had quite a conversation on standards a couple of weeks ago, and that discussion clearly expressed people's frustrations with the lack of national standards and aligned curriculum and assessment. Howard: Isn't the current mix of formal and informal, standardized and local, performance and objective assessments essentially doing the necessary tasks (leaving room for continuous improvements, of course)? Marie: Yes, I would say that it is - but at what *cost*? The intense juggling that programs and states must tackle in order to meet all the demands they face is taxing to put it mildly. Wouldn't it be better if we had a system in which the pieces fit together so seamlessly that there wouldn't be any juggling to do? I invite everyone to join this discussion further. Let us know your thoughts, the answers to any of the above questions, or ask us your own question. Thanks, marie cora Assessment Discussion List Moderator -------------------------------------------------- This e-mail and any files transmitted with it may contain privileged or confidential information. It is solely for use by the individual for whom it is intended, even if addressed incorrectly. If you received this e-mail in error, please notify the sender; do not disclose, copy, distribute, or take any action in reliance on the contents of this information; and delete it from your system. Any other use of this e-mail is prohibited. Thank you for your compliance. -------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060403/bda8a110/attachment.html From marie.cora at hotspurpartners.com Mon Apr 3 11:11:20 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 3 Apr 2006 11:11:20 -0400 Subject: [Assessment 286] Discussion thanks Message-ID: <005b01c65730$dd641bd0$0402a8c0@frodo> Dear everyone, I would like to thank Julie Eastland and her colleagues at ETS for answering our questions regarding the new assessments last week. If you have further questions or are interested in participating in this project, I encourage you to email Julie at jeastland at ets.org. If you have further thoughts or ideas that you would like to continue to discuss regarding this or a related topic, please let us carry on. Thanks! 
marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060403/8b9f045c/attachment.html From marie.cora at hotspurpartners.com Mon Apr 3 14:12:46 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 3 Apr 2006 14:12:46 -0400 Subject: [Assessment 287] FW: [ProfessionalDevelopment 301] FW: Oregon Instructor at Sea Message-ID: <00b801c6574a$3629e6f0$0402a8c0@frodo> Hi everyone, I usually don't post messages that are not assessment-related in some way, but the following is too interesting and exciting not to share. So just to be 'aligned' with this List mission, if you decide to take advantage of the following opportunity, tell us about it, and let us know how you ended up assessing your experience. :) marie cora Assessment Discussion List Moderator -----Original Message----- From: professionaldevelopment-bounces at nifl.gov [mailto:professionaldevelopment-bounces at nifl.gov] On Behalf Of jataylor Sent: Monday, April 03, 2006 1:39 PM To: professionaldevelopment at nifl.gov Cc: kristen.kulongoski at state.or.us Subject: [ProfessionalDevelopment 301] FW: Oregon Instructor at Sea PD Colleagues: I am pleased to share the announcement below from Kristen Kulongoski, Oregon State Director of Adult Basic Education. Two Oregon adult education instructors have been chosen this year to go on Teacher at Sea Expeditions to the Mariana Arc and the Antarctic with the National Oceanic and Atmospheric Administration. Instructors and students from all over the country can follow these adult education teachers during their trips via a website and email, ask them questions, participate in "real time" adult education math and science lessons, and use "real time" scientific research. Information about the first expedition (April 18 - May 13, 2006) can be found below. Please note that the information via the web will not be live until closer to the expedition departure date. Please share this teaching and learning opportunity with teachers in your area. Best, Jackie Jackie Taylor, Adult Literacy Professional Development List Moderator, jataylor at utk.edu =================================================== Oregon Adult Education Instructor Chosen for Professional Development Opportunity in the Ring of Fire NOAA Ocean Explorer Research Mission: Submarine Ring of Fire 2006-Mariana Arc The upcoming NOAA-Explorer expedition "Submarine Ring of Fire 2006 - Mariana Arc" sails from Guam on April 18, 2006. Joining the scientists and crew aboard the R/V Melville is Lori Savage, an instructor at Rogue Community College, Grants Pass, Oregon. Lori teaches math, science and life skills classes to teenagers and adults who are working to obtain their General Educational Development (GED) certificates. During this research mission, scientists will explore the active undersea volcanoes of the Mariana Arc. The expedition ends May 13, 2006, in Yokohama, Japan. Students, instructors, and the general public can follow this mission and learn about the scientific research on the Ocean Explorer interactive website (http://oceanexplorer.noaa.gov/explorations/06fire/welcome.html). The website includes background information, live mission logs updated during the expedition, slideshows and videos, and an education module featuring lesson plans and links to other resources. 
You can write to Lori Savage or other scientists while they are at sea by using the "Ask an Explorer" button on the Ocean Explorer website. Using the website, Lori will have a unique opportunity to connect "real time" ocean exploration with adult learners and the general public. Lori Savage is a participant in the Oregon Ocean Sciences and Math Collaborative Project. This project, now in its second year, delivers professional development through a series of institutes for instructors who teach in federally-funded Adult Basic Education (ABE) and Adult Secondary Education (ASE) programs administered through the Oregon Department of Community Colleges and Workforce Development. The website for educational support of our program and our participating teachers at sea is "Ocean Science Station" http://literacyworks.org/ocean. Partners in this project are the College of Oceanic and Atmospheric Sciences, Oregon State University; Oregon Sea Grant; the Hatfield Marine Science Center; and the Oregon Department of Community Colleges and Workforce Development. In this year-long program, instructors learn about ocean sciences and integrate these themes into their classroom instruction. Themes include ocean and earth processes, human impacts on the marine environment, and the application of technology in research. This project provides the instructors and their adult learners with a compelling and relevant focus for strengthening general literacy and numeracy skills and gaining the knowledge necessary for work, further education, family self-sufficiency, and community involvement. Fourteen adult basic instructors are participating in this year's professional development project. They represent such diverse instructional programs as workforce training (pre-employment), workplace education, adult basic education, adult secondary education, English to Speakers of Other Languages, Family Literacy, and corrections education. ------------------------------------------------------ Kristen Kulongoski State Director of Adult Basic Education Oregon Department of Community Colleges and Workforce Development 255 Capitol Street NE/PSB Salem, OR 97310 503-378-8648 ext. 375 503-378-3365 fax kristen.kulongoski at state.or.us -------------- next part -------------- A non-text attachment was scrubbed... Name: OR Instructor at Sea.doc Type: application/octet-stream Size: 33282 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060403/62e306d7/attachment.obj -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: ATT00182.txt Url: http://www.nifl.gov/pipermail/assessment/attachments/20060403/62e306d7/attachment.txt From hdooley at riral.org Wed Apr 5 12:18:52 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Wed, 05 Apr 2006 12:18:52 -0400 Subject: [Assessment 288] (no subject) In-Reply-To: <00b801c6574a$3629e6f0$0402a8c0@frodo> References: <00b801c6574a$3629e6f0$0402a8c0@frodo> Message-ID: <4433EDEC.2030203@riral.org> Hi, everyone. I hope everyone is well, and enjoying spring -- though here, today in Woonsocket RI it's snowing if you can believe it... I am looking for information on adult basic skills certifications, particularly those issued by a state-level agency or a consortium. A certification that is recognized as valid and useful by programs, colleges, providers, businesses across your state. Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? 
If so, what standards are included? What levels of performance are recognized? What assessments are being used? If your state doesn't have such certifications, do you think it would be valuable to have them? If so, I have the same questions as above. RI is considering if and what certifications would benefit our learners as they move from our adult ed programs into their next-steps--work, college, training, citizenship, and so on. We'd appreciate your guidance (and the opportunity to beg, borrow, steal or buy the best from the best....) Thanks for the assist! Howard D. Project RIRAL From marie.cora at hotspurpartners.com Wed Apr 5 13:16:54 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 5 Apr 2006 13:16:54 -0400 Subject: [Assessment 289] Focus on Basics Discussion Message-ID: <00b601c658d4$bd2e7f90$0402a8c0@frodo> Dear Colleagues, The following Guest Discussion will be held April 10-14 on the Focus on Basics Discussion List. marie ______ Focus on Basics (FOB) is a quarterly publication from the National Center for the Study of Adult Learning and Literacy (NCSALL), which connects research to practice in adult literacy education. The report from the recent evaluation of FOB is complete and highlights the variety of ways that it impacts its readers. Barb Garner, FOB's editor, will lead a discussion on the list the week of April 10-14 to discuss the survey results and how these results can guide our use of FOB as professionals. Please take a look at the evaluation and think about comments or questions you may have. If you are not subscribed to the FOB list, you can easily subscribe for this discussion, and unsubscribe afterwards. Just go to: http://www.nifl.gov/mailman/listinfo/focusonbasics Feel free to contact me with any question. All the best, Julie McKinney, moderator, Focus on Basics Discussion List julie_mckinney at worlded.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060405/13e6910d/attachment.html From elohimsgirl at adelphia.net Thu Apr 6 05:59:17 2006 From: elohimsgirl at adelphia.net (elohimsgirl at adelphia.net) Date: Thu, 6 Apr 2006 2:59:17 -0700 Subject: [Assessment 290] Re: (no subject) Message-ID: <26147375.1144317557259.JavaMail.root@web19> -- We do not have such in Vermont, but are having discussions around the 'certification' issue. I will be happy to share as we move forward from discussion. ---- "Howard L. Dooley wrote: ============= Hi, everyone. I hope everyone is well, and enjoying spring -- though here, today in Woonsocket RI it's snowing if you can believe it... I am looking for information on adult basic skills certifications, particularly those issued by a state-level agency or a consortium. A certification that is recognized as valid and useful by programs, colleges, providers, businesses across your state. Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? If so, what standards are included? What levels of performance are recognized? What assessements are being used? If your state doesn't have such certifications, do you think it would be valuable to have them? If so, I have the same questions as above. RI is considering if and what certifications would benefit our learners as they move from our adult ed programs into their next-steps--work, college, training, citizenship, and so on. 
We'd appreciate your guidance, ( and the opportunity to beg, borrow, steal or buy the best from the best....) Thanks for the assist! Howard D. Project RIRAL ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From khinson at future-gate.com Thu Apr 6 12:48:08 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Thu, 06 Apr 2006 18:48:08 +0200 Subject: [Assessment 291] Re: (no subject) Message-ID: <44355A25020000A000002441@fghamn01.ham.de.future-gate.com> We don't have any such certification in NC either, as far as I know. I've heard whispers that the state administrators are looking into certification for Adult Educators though. I don't know where that discussion stands though. >>> elohimsgirl at adelphia.net >>> -- We do not have such in Vermont, but are having discussions around the 'certification' issue. I will be happy to share as we move forward from discussion. ---- "Howard L. Dooley wrote: ============= Hi, everyone. I hope everyone is well, and enjoying spring -- though here, today in Woonsocket RI it's snowing if you can believe it... I am looking for information on adult basic skills certifications, particularly those issued by a state-level agency or a consortium. A certification that is recognized as valid and useful by programs, colleges, providers, businesses across your state. Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? If so, what standards are included? What levels of performance are recognized? What assessements are being used? If your state doesn't have such certifications, do you think it would be valuable to have them? If so, I have the same questions as above. RI is considering if and what certifications would benefit our learners as they move from our adult ed programs into their next-steps--work, college, training, citizenship, and so on. We'd appreciate your guidance, ( and the opportunity to beg, borrow, steal or buy the best from the best....) Thanks for the assist! Howard D. Project RIRAL ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From elohimsgirl at adelphia.net Thu Apr 6 19:08:58 2006 From: elohimsgirl at adelphia.net (elohimsgirl at adelphia.net) Date: Thu, 6 Apr 2006 16:08:58 -0700 Subject: [Assessment 292] Re: (no subject) Message-ID: <8496176.1144364938425.JavaMail.root@web12.mail.adelphia.net> -- It leads to many implications...on one hand it is nice to provide instructors with background information on adult education, assessments, best practices, etc....BUT it will require more funding to pay teachers higher wages if higher credentials are required. I believe in that, but the money is not there...not in Vermont anyway. It is a difficult situation. WHat do you think? Marlo Thomas Watson Director of Adult and Workforce Education Northeast Kingdom Learning Services (NEKLS) 364 Railroad Street, Suite 2 St. 
Johnsbury, VT 05819 (802) 748-5624 phone (802) 751-8071 fax elohimsgirl at adelphia.net You are only as great as your committment to make it happen! ---- Katrina Hinson wrote: ============= We don't have any such certification in NC either, as far as I know. I've heard whispers that the state administrators are looking into certification for Adult Educators though. I don't know where that discussion stands though. >>> elohimsgirl at adelphia.net >>> -- We do not have such in Vermont, but are having discussions around the 'certification' issue. I will be happy to share as we move forward from discussion. ---- "Howard L. Dooley wrote: ============= Hi, everyone. I hope everyone is well, and enjoying spring -- though here, today in Woonsocket RI it's snowing if you can believe it... I am looking for information on adult basic skills certifications, particularly those issued by a state-level agency or a consortium. A certification that is recognized as valid and useful by programs, colleges, providers, businesses across your state. Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? If so, what standards are included? What levels of performance are recognized? What assessements are being used? If your state doesn't have such certifications, do you think it would be valuable to have them? If so, I have the same questions as above. RI is considering if and what certifications would benefit our learners as they move from our adult ed programs into their next-steps--work, college, training, citizenship, and so on. We'd appreciate your guidance, ( and the opportunity to beg, borrow, steal or buy the best from the best....) Thanks for the assist! Howard D. Project RIRAL ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Fri Apr 7 08:16:26 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 7 Apr 2006 08:16:26 -0400 Subject: [Assessment 293] Subject line reminders Message-ID: <01e701c65a3d$18806390$0402a8c0@frodo> Hi everyone, I hope this email finds you well. Just a quick reminder: please remember to put the topic at hand in your subject line so that we know what we are opening and/or what thread of the discussion the post belongs to. Also: if you reply to a post and your message shifts the conversation or changes it completely - please *change* the subject line to a title that better describes the email you are sending. Before the Lists went fully-moderated, the Moderators were able to make these helpful changes themselves, but with the new system we cannot. So we are relying on you to help us - it really makes a difference for the reader as well as for the Moderators. Thank you so much! marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060407/938216db/attachment.html From marie.cora at hotspurpartners.com Fri Apr 7 12:21:20 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 7 Apr 2006 12:21:20 -0400 Subject: [Assessment 294] Linking Research and Practice: NCSALL's Research Strand at COABE Message-ID: <021f01c65a5f$4e8e7630$0402a8c0@frodo> Dear Colleagues: The following announcement describes sessions being offered by NCSALL at the COABE conference. A number of sessions are focused on topics that might be of interest to you including evaluating the impact of strategies used to increase learner persistence, how learners' engagement affects their learning outcomes, and a session on the ARCS Reading Diagnostic. Please see below. Thanks, marie cora For those of you attending the 2006 COABE Conference in Houston April 26-29, the National Center for the Study of Adult Learning and Literacy (NCSALL) invites you to attend its sessions during the conference. Presenters will highlight NCSALL research findings and share professional development activities and instructional strategies for strengthening the quality of adult literacy programs. Attached is a list of the NCSALL sessions. See you at COABE! Stop by and visit our booth (#119) in the exhibit area. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060407/b272db33/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: NCSALL Ad. for COABE 06_Final3.20.06.pdf Type: application/pdf Size: 131223 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060407/b272db33/attachment.pdf From pammenjk at haslett.k12.mi.us Mon Apr 10 13:29:23 2006 From: pammenjk at haslett.k12.mi.us (JO PAMMENT) Date: Mon, 10 Apr 2006 13:29:23 -0400 Subject: [Assessment 295] Re: (no subject) Message-ID: In Michigan, all adult educators are part of the K-12 system and must be certified in a related subject area. However, teachers may get an ESL endorsement attached to their certification if they take an additional 18 hours of credit in the ESL field, and pass a test. Currently, an endorsement is not required in order to be hired as an ESL teacher. Jo Pamment Jo Pamment ESL Coordinator Haslett Public Schools 1118 S. Harrison East Lansing, Michigan 48823 TEL: 517 337-8353 FAX: 517 337-3195 E-Mail: pammenjk at haslett.k12.mi.us >>> khinson at future-gate.com 4/6/06 12:48 PM >>> We don't have any such certification in NC either, as far as I know. I've heard whispers that the state administrators are looking into certification for Adult Educators though. I don't know where that discussion stands though. >>> elohimsgirl at adelphia.net >>> -- We do not have such in Vermont, but are having discussions around the 'certification' issue. I will be happy to share as we move forward from discussion. ---- "Howard L. Dooley wrote: ============= Hi, everyone. I hope everyone is well, and enjoying spring -- though here, today in Woonsocket RI it's snowing if you can believe it... I am looking for information on adult basic skills certifications, particularly those issued by a state-level agency or a consortium. 
A certification that is recognized as valid and useful by programs, colleges, providers, businesses across your state. Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? If so, what standards are included? What levels of performance are recognized? What assessements are being used? If your state doesn't have such certifications, do you think it would be valuable to have them? If so, I have the same questions as above. RI is considering if and what certifications would benefit our learners as they move from our adult ed programs into their next-steps--work, college, training, citizenship, and so on. We'd appreciate your guidance, ( and the opportunity to beg, borrow, steal or buy the best from the best....) Thanks for the assist! Howard D. Project RIRAL ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From Tina_Luffman at yc.edu Mon Apr 10 15:17:49 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Mon, 10 Apr 2006 12:17:49 -0700 Subject: [Assessment 296] Re: (no subject) Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060410/ff13d62d/attachment.html From elohimsgirl at adelphia.net Wed Apr 12 21:23:28 2006 From: elohimsgirl at adelphia.net (elohimsgirl at adelphia.net) Date: Wed, 12 Apr 2006 18:23:28 -0700 Subject: [Assessment 297] Re: certification for adult ed teachers Message-ID: <3527709.1144891408100.JavaMail.root@web24> -- Tina, is your pay comprable to the K-12 teachers? Also, can you give me a website that I could look into the certification? Thanks Marlo Thomas Watson Director of Adult and Workforce Education Northeast Kingdom Learning Services (NEKLS) 364 Railroad Street, Suite 2 St. Johnsbury, VT 05819 (802) 748-5624 phone (802) 751-8071 fax elohimsgirl at adelphia.net You are only as great as your committment to make it happen! ---- Tina_Luffman at yc.edu wrote: ============= ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From ludlowr at maine.com Thu Apr 13 06:06:33 2006 From: ludlowr at maine.com (Ramsey Ludlow) Date: Thu, 13 Apr 2006 06:06:33 -0400 Subject: [Assessment 298] Re: (no subject) References: Message-ID: <001a01c65ee1$f1bf0c80$6401a8c0@Ramsey> Hello- I believe Howard Dooley was asking whether any program awards certificates to adults for skills attainment- not whether teachers are certified: "Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? If so, what standards are included? What levels of performance are recognized? What assessments are being used?" 
I'd be very interested to hear answers to this; there are many learners who attain useful and marketable skills, but who are many semesters away from a diploma. Do programs award certifications that let a learner show a potential employer what the learner knows and is able to do - before a learner has attained a diploma or GED? thank you, Ramsey Ludlow Oxford Hills Adult Education 256 Main Street South Paris, Maine 04281 ludlowr at sad17.k12.me.us ----- Original Message ----- From: "JO PAMMENT" To: Sent: Monday, April 10, 2006 1:29 PM Subject: [Assessment 295] Re: (no subject) > In Michigan, all adult educators are part of the K-12 system and must be > certified in a related subject area. However teachers may get an ESL > endorsement attached to their certification if they take an additional 18 > hours of ceredit in the ESL field, and pass a test. > > Currently, an endorsement is not required in order to be hired as an ESL > teacher. > > Jo Pamment > > Jo Pamment > ESL Coordinator > Haslett Public Schools > 1118 S. Harrison > East Lansing, Michigan 48823 > > TEL: 517 337-8353 > FAX: 517 337-3195 > E-Mail: pammenjk at haslett.k12.mi.us > >>>> khinson at future-gate.com 4/6/06 12:48 PM >>> > We don't have any such certification in NC either, as far as I know. I've > heard whispers that the state administrators are looking into > certification for Adult Educators though. I don't know where that > discussion stands though. > > > >>>> elohimsgirl at adelphia.net >>> > -- > We do not have such in Vermont, but are having discussions around the > 'certification' issue. I will be happy to share as we move forward from > discussion. > > > > > > > > ---- "Howard L. Dooley wrote: > > ============= > Hi, everyone. I hope everyone is well, and enjoying spring -- though > here, today in Woonsocket RI it's snowing if you can believe it... > > I am looking for information on adult basic skills certifications, > particularly those issued by a state-level agency or a consortium. A > certification that is recognized as valid and useful by programs, > colleges, providers, businesses across your state. Does your state have > a process for recognizing skill attainment before a secondary credential > is awarded? Is it considering one? If so, what standards are > included? What levels of performance are recognized? What assessements > are being used? > > If your state doesn't have such certifications, do you think it would be > valuable to have them? If so, I have the same questions as above. RI > is considering if and what certifications would benefit our learners as > they move from our adult ed programs into their next-steps--work, > college, training, citizenship, and so on. We'd appreciate your > guidance, ( and the opportunity to beg, borrow, steal or buy the best > from the best....) > > Thanks for the assist! > > Howard D. 
> Project RIRAL > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > From marie.cora at hotspurpartners.com Thu Apr 13 12:26:42 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 13 Apr 2006 12:26:42 -0400 Subject: [Assessment 299] Seeking math placements Message-ID: <01c201c65f17$0d1bcc90$0402a8c0@frodo> Dear Colleagues, I have received the following email from a colleague who is trying to gather some resources on developing or finding placement assessments for math. Please read on. marie cora Assessment Discussion List Moderator _______________ We are having departmental meetings to address the multi-level offerings of math across all departments (ESOL, ABE, and ASE). We are meeting at the end of April to either develop or use parts of standardized assessments for placement. Can you suggest some resources (either electronic or places you could guide me to) that could assist me/us? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060413/3ea1960e/attachment.html From ltaylor at casas.org Thu Apr 13 12:22:45 2006 From: ltaylor at casas.org (Linda Taylor) Date: Thu, 13 Apr 2006 12:22:45 -0400 Subject: [Assessment 300] Re: Seeking math placements In-Reply-To: <01c201c65f17$0d1bcc90$0402a8c0@frodo> Message-ID: <0IXO00HE85FAQVB5@vms044.mailsrvcs.net> Marie, CASAS Math appraisals and pre/post tests can be used for ABE or ESOL. Linda Taylor, CASAS (800) 255-1036 _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, April 13, 2006 12:27 PM To: assessment at nifl.gov Subject: [Assessment 299] Seeking math placements Dear Colleagues, I have received the following email from a colleague who is trying to gather some resources on developing or finding placement assessments for math. Please read on. marie cora Assessment Discussion List Moderator _______________ We are having departmental meetings to address the multi-level offerings of math across all departments (ESOL, ABE, and ASE). We are meeting at the end of April to either develop or use parts of standardized assessments for placement. Can you suggest some resources (either electronic or places you could guide me to) that could assist me/us? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060413/b32190d4/attachment.html From prwhite at MadisonCounty.NET Thu Apr 13 12:29:56 2006 From: prwhite at MadisonCounty.NET (Patti White) Date: Thu, 13 Apr 2006 11:29:56 -0500 Subject: [Assessment 301] Re: learner certification References: <001a01c65ee1$f1bf0c80$6401a8c0@Ramsey> Message-ID: <05e501c65f17$80808270$6501a8c0@PattiAALRC> Arkansas has a "Workforce Alliance for Growth in the Economy" (WAGE) program that awards certificates to learners in 3 areas: industrial, clerical, and employability. Further information about the WAGE program is located at http://aalrc.org/resources/wage/index.aspx including a PowerPoint that outlines the process for adult education programs, employers, and learners. Patti White Disabilities Project Manager Arkansas Adult Learning Resource Center prwhite at madisoncounty.net ----- Original Message ----- From: Ramsey Ludlow To: The Assessment Discussion List Sent: Thursday, April 13, 2006 5:06 AM Subject: [Assessment 298] Re: (no subject) Hello- I believe Howard Dooley was asking whether any program awards certificates to adults for skills attainment- not whether teachers are certified: "Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? If so, what standards are included? What levels of performance are recognized? What assessments are being used?" I'd be very interested to hear answers to this; there are many learners who attain useful and marketable skills, but who are many semesters away from a diploma. Do programs award certifications that a learner show a potential employer what the learner knows and is able to do- before a learner has attained a diploma or GED? thank you, Ramsey Ludlow Oxford Hills Adult Education 256 Main Street South Paris, Maine 04281 ludlowr at sad17.k12.me.us From tarv at chemeketa.edu Thu Apr 13 14:15:52 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Thu, 13 Apr 2006 11:15:52 -0700 Subject: [Assessment 302] Re: Seeking math placements Message-ID: <89DA2100D59D7341BEA4D938F9FB2A9303B471E7@cccmail2.chemeketa.network> AND GED ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Linda Taylor Sent: Thursday, April 13, 2006 9:23 AM To: 'The Assessment Discussion List' Subject: [Assessment 300] Re: Seeking math placements Marie, CASAS Math appraisals and pre/post tests can be used for ABE or ESOL. Linda Taylor, CASAS (800) 255-1036 ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, April 13, 2006 12:27 PM To: assessment at nifl.gov Subject: [Assessment 299] Seeking math placements Dear Colleagues, I have received the following email from a colleague who is trying to gather some resources on developing or finding placement assessments for math. Please read on. marie cora Assessment Discussion List Moderator _______________ We are having departmental meetings to address the multi-level offerings of math across all departments (ESOL, ABE, and ASE). We are meeting at the end of April to either develop or use parts of standardized assessments for placement. Can you suggest some resources (either electronic or places you could guide me to) that could assist me/us? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060413/5eeb0c6e/attachment.html From lwilkins at mills.edu Thu Apr 13 14:27:50 2006 From: lwilkins at mills.edu (Lynne Wilkins) Date: Thu, 13 Apr 2006 11:27:50 -0700 Subject: [Assessment 303] Re: Seeking math placements In-Reply-To: <01c201c65f17$0d1bcc90$0402a8c0@frodo> References: <01c201c65f17$0d1bcc90$0402a8c0@frodo> Message-ID: <443E9826.1050902@mills.edu> Hello CASAS tests math in their employability series. Best of luck! Lynne Wilkins Lynne Wilkins, Associate Director for Programs English Center for International Women at Mills College P.O. Box 9968, Oakland, CA 94613 (510)430-2285 lwilkins at mills.edu Marie Cora wrote: >Dear Colleagues, > >I have received the following email from a colleague who is trying to >gather some resources on developing or finding placement assessments for >math. Please read on. >marie cora >Assessment Discussion List Moderator >_______________ > > > >We are having departmental meetings to address the multi-level offerings >of math across all departments (ESOL, ABE, and ASE). We are meeting at >the end of April to either develop or use parts of standardized >assessments for placement. Can you suggest some resources (either >electronic or places you could guide me to) that could assist me/us? > > > > > >------------------------------------------------------------------------ > >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > -- Lynne Wilkins, Associate Director for Programs English Center for International Women at Mills College P.O. Box 9968, Oakland, CA 94613 (510)430-2285 lwilkins at mills.edu From stange at gcnet.com Thu Apr 13 14:38:36 2006 From: stange at gcnet.com (Karen Stange) Date: Thu, 13 Apr 2006 13:38:36 -0500 Subject: [Assessment 304] Re: Seeking math placements In-Reply-To: <0IXO00HE85FAQVB5@vms044.mailsrvcs.net> References: <01c201c65f17$0d1bcc90$0402a8c0@frodo> Message-ID: <20060413183810.351F3EF5EE@skat.hubris.net> An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060413/dad040cf/attachment.html From Tina_Luffman at yc.edu Thu Apr 13 15:19:02 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Thu, 13 Apr 2006 12:19:02 -0700 Subject: [Assessment 305] Re: certification for adult ed teachers Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060413/56596441/attachment.html From khinson at future-gate.com Thu Apr 13 19:18:45 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Fri, 14 Apr 2006 01:18:45 +0200 Subject: [Assessment 306] Re: Seeking math placements In-Reply-To: <20060413183810.351F3EF5EE@skat.hubris.net> References: <01c201c65f17$0d1bcc90$0402a8c0@frodo><0IXO00HE85FAQVB5@vms044.mailsrvcs.net> <20060413183810.351F3EF5EE@skat.hubris.net> Message-ID: <443EF875020000A0000024EB@fghamn01.ham.de.future-gate.com> Can I get some more information on the PBS Locator tests mentioned in the email below? Also, I see on the list that a lot of programs use CASAS or appear to use CASAS more than TABE. Is there a particular reason for that? Regards Katrina Hinson >>> stange at gcnet.com 04/13/06 2:38 PM >>> At the FCCLC, we base our class placements on the CASAS diagnostic score. 
Since we only offer two levels of class, the Level one class is for CASAS ABE levels 1 and 2 (up to a 212 score), the Level two class is for CASAS ABE levels 3 and 4 (up to a 235 score), and the fast track class is for scores above 235. It would be better to have four levels, but our level placement is quite flexible. In individual classes I like to use the PBS Locator tests, as they make it easier to place class members according to their specific needs when the goal is most commonly "get my GED". At 12:22 PM 4/13/06 -0400, you wrote: Marie, CASAS Math appraisals and pre/post tests can be used for ABE or ESOL. Linda Taylor, CASAS (800) 255-1036 From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Thursday, April 13, 2006 12:27 PM To: assessment at nifl.gov Subject: [Assessment 299] Seeking math placements Dear Colleagues, I have received the following email from a colleague who is trying to gather some resources on developing or finding placement assessments for math. Please read on. marie cora Assessment Discussion List Moderator _______________ We are having departmental meetings to address the multi-level offerings of math across all departments (ESOL, ABE, and ASE). We are meeting at the end of April to either develop or use parts of standardized assessments for placement. Can you suggest some resources (either electronic or places you could guide me to) that could assist me/us? ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment Karen Stange stange at gcnet.com Curriculum Specialist Finney County Community Learning Center 1401 West Buffalo Jones Avenue Garden City, KS 67846

From elohimsgirl at adelphia.net Thu Apr 13 20:30:48 2006 From: elohimsgirl at adelphia.net (elohimsgirl at adelphia.net) Date: Thu, 13 Apr 2006 17:30:48 -0700 Subject: [Assessment 307] Re: certification for adult ed teachers Message-ID: <2559894.1144974648379.JavaMail.root@web25> -- Thanks Tina! This was amazingly helpful! elohimsgirl at adelphia.net You are only as great as your commitment to make it happen! ---- Tina_Luffman at yc.edu wrote: ============= ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment

From hdooley at riral.org Fri Apr 14 09:33:38 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Fri, 14 Apr 2006 09:33:38 -0400 Subject: [Assessment 308] Re: Seeking math placements In-Reply-To: <443EF875020000A0000024EB@fghamn01.ham.de.future-gate.com> References: <01c201c65f17$0d1bcc90$0402a8c0@frodo><0IXO00HE85FAQVB5@vms044.mailsrvcs.net> <20060413183810.351F3EF5EE@skat.hubris.net> <443EF875020000A0000024EB@fghamn01.ham.de.future-gate.com> Message-ID: <443FA4B2.8020303@riral.org> Katrina, When our program reviewed standardized tests, we decided to use the CASAS because the competencies assessed were more in line with our instruction than the more academic skills approach of the TABE, and it provides diagnostic information that helps guide instruction. We still use the TABE in those instances where a collaborating partner "needs" a very specific grade level score for their purposes, such as a 9.0 reading score to enter a job training program, or for learners who have a specific academic skills goal. We integrated the CASAS assessments into our program, because we did not want it to stand alone as a high-stakes test for accountability. We discovered that the CASAS placement tests placed individuals into class levels as accurately as our previous in-house assessments, and did so in less time. So, we moved to using the CASAS for this purpose. Using it in this way integrates it into our curriculum and provides one benchmark for learner success. Howard D.
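The placement rule Karen Stange describes above amounts to a simple mapping from a CASAS scale score to one of three class levels. A minimal sketch of that mapping in Python, assuming only the 212 and 235 cut points from her message; the function name and class labels are illustrative, not an official CASAS or FCCLC tool:

    def place_by_casas_score(scale_score):
        # Cut points taken from the FCCLC description above:
        #   up to 212 -> Level one class (CASAS ABE levels 1 and 2)
        #   213-235   -> Level two class (CASAS ABE levels 3 and 4)
        #   above 235 -> fast track class
        if scale_score <= 212:
            return "Level one"
        elif scale_score <= 235:
            return "Level two"
        else:
            return "Fast track"

    # Example: a learner with a scale score of 228 would be placed in Level two.
    print(place_by_casas_score(228))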
From stange at gcnet.com Fri Apr 14 11:17:41 2006 From: stange at gcnet.com (Karen Stange) Date: Fri, 14 Apr 2006 10:17:41 -0500 Subject: [Assessment 309] PBS Locator Tests Message-ID: <20060414151704.EC424EF5E5@skat.hubris.net> An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060414/a5113e73/attachment.html

From tarv at chemeketa.edu Mon Apr 17 18:41:16 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Mon, 17 Apr 2006 15:41:16 -0700 Subject: [Assessment 310] Re: Seeking math placements Message-ID: <89DA2100D59D7341BEA4D938F9FB2A9303BA4977@cccmail2.chemeketa.network> CASAS has created grade level charts for those partners who need that non-adult type score. Va

From marie.cora at hotspurpartners.com Tue Apr 18 10:58:10 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 18 Apr 2006 10:58:10 -0400 Subject: [Assessment 311] National Practitioner-Researcher Symposium Message-ID: <002e01c662f8$85f9a620$0b02a8c0@LITNOW> The following post is from Mary Ann Corley. ____________ SAVE the DATES: November 30-December 2, 2006! ANNOUNCING: A MEETING OF THE MINDS II SYMPOSIUM! The National Center for the Study of Adult Learning and Literacy (NCSALL), the California Department of Education (CDE) Adult Education Office, and the California Adult Literacy Professional Development Project (CALPRO) of the American Institutes for Research are pleased to announce a Meeting of the Minds II: A National Adult Education Practitioner-Researcher Symposium. Scheduled for November 30-December 2, 2006, at the Sheraton Grand hotel in Sacramento, California, the symposium is designed to provide opportunities for adult education practitioners and researchers to share and discuss the most current research findings and practitioner wisdom. It will engage practitioners and researchers with questions related to goals, accountability, and efficacy and efficiency in policy, practice, and research. The ultimate goals of the symposium are to highlight systemic changes that can enhance literacy practice and increase student learning gains. The theme of this year's symposium is Systemic Change and Student Success: What Does Research Tell Us? As in the first Meeting of the Minds Symposium that was held in 2004, each session of the 2006 Symposium will be structured so that the research presentation is followed by a panel of practitioners who will discuss implications for practice or policy. In addition, conference attendees will have opportunities for small group interaction and networking with researcher-presenters to discuss not only how research can inform practice and policy, but also how practice and policy can inform and suggest a research agenda. More information about the Meeting of the Minds II symposium will be available soon on the symposium Web site, www.researchtopractice.org. (This Web site currently lists presenters' PowerPoints and abstracts of sessions held at the 2004 Meeting of the Minds symposium as well as thoughts generated by attendees regarding implications of the research findings.) We are in the process of updating this Web site to house information about online registration for the 2006 symposium as well as information about hotel registration.
We will send out another notice after the Web site has been updated. In the meantime, please save the dates and plan to join us in November in Sacramento! Thank you. -Mary Ann Corley, Ph.D. Symposium Coordinator and CALPRO Director, American Institutes for Research

From samuel.mcgraw at seattlegoodwill.org Tue Apr 18 12:51:50 2006 From: samuel.mcgraw at seattlegoodwill.org (Samuel McGraw III) Date: Tue, 18 Apr 2006 09:51:50 -0700 Subject: [Assessment 312] Re: Seeking math placements Message-ID: <802F2B4590320142A57872DC43A2BFD20218AFF7@seamail.seagoodwill.org> Is the chart posted? Samuel McGraw III M. Ed. Goodwill Tel: 206.860.5789 Fax: 206.325.9845 http://www.seattlegoodwill.org Because jobs change lives

From jeguez at casas.org Tue Apr 18 13:13:47 2006 From: jeguez at casas.org (Jane Eguez) Date: Tue, 18 Apr 2006 10:13:47 -0700 Subject: [Assessment 313] Re: Seeking math placements Message-ID: <2716C9F0CFC0A54C8D0E1F429681AE78934E80@XNG.casas.org> Yes, the chart "CASAS SCALED SCORE REFERENCES FOR GRADE LEVELS" can be accessed at: http://www.casas.org/Downloads/more.cfm?mfile_id=866&bhcp=1 You may be interested in another paper that discusses "Why Scaled Scores are Better than Grade Level Equivalents" located at: http://www.casas.org/DirctDwnlds.cfm?mfile_id=1231&selected_id=715&wtarget=body Jane Egüez CASAS Director Program Planning jeguez at casas.org www.casas.org 800-255-1036 x125
From tarv at chemeketa.edu Tue Apr 18 13:59:30 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Tue, 18 Apr 2006 10:59:30 -0700 Subject: [Assessment 314] Re: Seeking math placements References: <802F2B4590320142A57872DC43A2BFD20218AFF7@seamail.seagoodwill.org> Message-ID: <89DA2100D59D7341BEA4D938F9FB2A935D551E@cccmail2.chemeketa.network> Samuel, I haven't looked recently, but I'd guess this chart would be on the CASAS web page; I know it used to be in their manual. Va -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 8573 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060418/93dc2969/attachment.bin

From jgore at readingconnections.org Tue Apr 18 14:52:30 2006 From: jgore at readingconnections.org (Jenny Gore) Date: Tue, 18 Apr 2006 14:52:30 -0400 Subject: [Assessment 315] Re: Seeking math placements References: <2716C9F0CFC0A54C8D0E1F429681AE78934E80@XNG.casas.org> Message-ID: <030301c66319$3f47a530$0201a8c0@RCLAB8> see below. Jennifer B. Gore Executive Director Reading Connections, Inc. 122 N. Elm Street, Suite 520 Greensboro, NC 27401 336-230-2223 www.readingconnections.org Treat people as if they were what they ought to be, and help them become what they are capable of being. Goethe ----- Original Message ----- From: "Jane Eguez" To: "The Assessment Discussion List" Sent: Tuesday, April 18, 2006 1:13 PM Subject: [Assessment 313] Re: Seeking math placements Yes, the chart "CASAS SCALED SCORE REFERENCES FOR GRADE LEVELS" can be accessed at: http://www.casas.org/Downloads/more.cfm?mfile_id=866&bhcp=1 You may be interested in another paper that discusses "Why Scaled Scores are Better than Grade Level Equivalents" located at: http://www.casas.org/DirctDwnlds.cfm?mfile_id=1231&selected_id=715&wtarget=body Jane Egüez CASAS Director Program Planning
From kabeall at comcast.net Tue Apr 18 14:31:41 2006 From: kabeall at comcast.net (Kaye Beall) Date: Tue, 18 Apr 2006 14:31:41 -0400 Subject: [Assessment 316] New from NCSALL Message-ID: <008401c66316$56893220$0202a8c0@your4105e587b6> Two new publications are now available from NCSALL. For more information, please visit the NCSALL Web site at: http://www.ncsall.net An Evidence-based Adult Education Program Model Appropriate for Research by John Comings, Lisa Soricone, and Maricel Santos The document reviews the available empirical evidence and professional wisdom in order to define a program model that meets the requirements for good practice. This program model describes what teachers, adult students, counselors, administrators, volunteers, and program partners should do to provide both effective instruction and the support services adults need to persist in their learning long enough to be successful. This paper describes a program model as having a program quality support component and three chronological program components, which are entrance into a program, participation in a program, and reengagement in learning. Though this model could also be used as a description of good programs for other purposes, here it describes the context in which research on approaches to instruction and support services could be productive. To download the NCSALL Occasional Paper, visit NCSALL's Web site: http://www.ncsall.net/index.php?id=26#ebae Learner's Engagement in Adult Literacy Education by Hal Beder, Jessica Tomkins, Patsy Medina, Regina Riccioni, and Weiling Deng Engagement is mental effort focused on learning and is a precondition to learning progress.
It is important to understand how and why adult learners engage in literacy instruction because engagement is a precondition to learning progress. This study focused on how learning context shapes engagement. The practical reason for doing so is that to a great extent adult educators control the educational context. Thus if they understand how the educational context shapes engagement, they can influence engagement in positive ways. To download the NCSALL Report, visit NCSALL's Web site: http://www.ncsall.net/?id=29#28 To order the NCSALL Report at $10.00/copy, go to the Order Form: http://www.ncsall.net/?id=681 **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060418/07fbdbf3/attachment.html From marie.cora at hotspurpartners.com Mon Apr 24 10:51:27 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 24 Apr 2006 10:51:27 -0400 Subject: [Assessment 317] COABE session Message-ID: <009101c667ae$96364fb0$0202a8c0@LITNOW> Dear Colleagues: Please join us in the following session at this year's COABE National Conference in Houston, Texas: Professional Development From Your "Inbox": Making the Most of National Discussion Lists Saturday, 9:45 - 11:00 Presented by NIFL Discussion List Moderators: Jackie Taylor, Adult Literacy Professional Development List Marie Cora, Assessment List Daphne Greenberg, Women & Literacy List National online discussion lists provide an opportunity for ongoing professional development with colleagues, researchers, nationally-recognized experts and leaders in the field. Presenters will provide information regarding the National Institute for Literacy's discussion lists, emerging and key issues for each topic, upcoming discussion activities, and how to get the most from your discussion list subscription. To this end, we encourage you to attend the session to discuss your own experience being a subscriber (writer or lurker!) on any of the Institute's Lists. Please come and share your thoughts on how newcomers can get the most out of their subscription, as well as provide us with feedback so that we can better serve your needs. We look forward to meeting you in person! Jackie Taylor, Moderator Adult Literacy Professional Development List Marie Cora, Moderator Assessment List Daphne Greenberg, Moderator Women & Literacy List The National Institute for Literacy's Discussion Lists are: Adult Literacy Professional Development, Assessment, Adult Education Content Standards, English Language Learners, Family Literacy, Focus on Basics, Health and Literacy, Learning Disabilities, Poverty, Race, and Literacy, Program Leadership and Improvement, Technology and Literacy, Women and Literacy, and Workplace Literacy. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060424/0b6fdd31/attachment.html From djgbrian at utk.edu Mon Apr 24 11:08:42 2006 From: djgbrian at utk.edu (Donna Brian) Date: Mon, 24 Apr 2006 11:08:42 -0400 Subject: [Assessment 318] Invitation to a COABE Session Message-ID: <5.1.0.14.2.20060424110820.03ba9190@pop.utk.edu> Dear Colleagues: Please join us in the following session at this year's COABE National Conference in Houston, Texas: Learning Disabilities & Work Issues: How Can We Help? Thursday, April 27 at 11:00 AM in the computer lab Presented by Aaron Kohring, Content Coordinator of the Literacy and Learning Disabilities Special Collection and Donna Brian, Content Coordinator of the Workforce Education Special Collection Learning disabilities among workers are common, though often unrecognized, and they frequently contribute to job loss. As an educator, what can you do to prepare adults with LD for greater success in the world of work? How do you assist an adult to self-assess for their strengths, their struggles, and helpful accommodations? What proactive steps can make the difference between success and frustration for LD workers? What is the role of self-advocacy? Are there learning strategies that are especially effective with LD workers? Where can you go for resources? This session begins with a participatory activity to heighten awareness of the nature of LD in the workplace. Participants will then consider ways to discern strengths and weaknesses of an adult with learning disabilities and provide appropriate accommodations to foster success. Learning and work strategies particularly effective for those with LD will be introduced. The recounted experiences of successful LD workers will bring to life the critical role that self-advocacy plays. Throughout the presentation, the presenters will highlight pertinent resources from the Literacy and Learning Disabilities and Workforce Education Special Collections. We will demonstrate on-line how the collections can be used to find free resources. The organization and content of the special collections will be explored through using the collections to find information about issues. Attendees will be able to extend their learning beyond the presentation through access to a self-paced tutorial (authored by the presenters) plus a list of applicable online resources. Participation of attendees in providing examples of their experiences with adults with LD will be encouraged throughout the presentation. Discussion will invite feedback from participants to guide the presenters' continued future efforts in collecting and organizing on-line resources responsive to the needs of workplace and workforce educators in mentoring adults with learning disabilities. Donna JG Brian Moderator, NIFL Workplace Literacy Discussion List, and Coordinator/Developer LINCS Workforce Education Special Collection at http://worklink.coe.utk.edu/ Center for Literacy Studies at The University of Tennessee 600 Henley Street, Suite 312 Knoxville, TN 37996-4135 865-974-3420 (desk phone) FAX 865-974-3857 djgbrian at utk.edu From DonMcCabe at aol.com Mon Apr 24 13:43:56 2006 From: DonMcCabe at aol.com (DonMcCabe at aol.com) Date: Mon, 24 Apr 2006 13:43:56 EDT Subject: [Assessment 319] Re: Invitation to a COABE Session Message-ID: <384.1b1fbf7.317e685c@aol.com> Dear Colleagues: Please join us in the following session at this year's COABE National Conference in Houston, Texas: Practical Applications of Research in the Teaching of Spelling/Reading to Adults. 
Saturday, April 29 8:00 - 11:00 am Presented by Don McCabe, Research Director AVKO Educational Research Foundation Don McCabe spelling.org (http://www.spelling.org) From marie.cora at hotspurpartners.com Thu Apr 27 10:29:36 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 27 Apr 2006 10:29:36 -0400 Subject: [Assessment 320] Family Health and Literacy - A New Resource Guide! Message-ID: <008d01c66a07$08fbe8c0$7b82fea9@LITNOW> Dear colleagues: The following announcement is from Julie McKinney. Marie ************************************** I would like to announce a new health literacy resource guide that has just been published in print and on the Web! Family Health and Literacy This guide to easy-to-read health materials and websites is for adult literacy practitioners and health educators alike. It lists resources to teach health to families with lower literacy skills, but also discusses how to integrate health and literacy education, how to get started and engage adult learners, and how to build connections between literacy programs and local health services. You can find Family Health and Literacy online at: www.worlded.org/us/health/docs/family This is a PDF, and with Adobe Reader 7 or higher you can click on the live links! Hard copies are also available free of charge for a limited time: please contact Leah_Peterson at worlded.org I hope you find it helpful. All the best, Julie Julie McKinney Discussion List Moderator World Education/NCSALL jmckinney at worlded.org From marie.cora at hotspurpartners.com Thu Apr 27 10:37:19 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 27 Apr 2006 10:37:19 -0400 Subject: [Assessment 321] Literacy President 2008 Message-ID: <008e01c66a08$1eeda050$7b82fea9@LITNOW> Good morning, afternoon, and evening to you all. The following post is from David Rosen. Your voice needs to be heard in terms of letting our country's leadership know how important ABE and ESOL services are for our economy, our families, and our ability to compete in the global market. We need to send a strong message to our next President - please vote and let your voice be heard! marie cora Assessment Discussion List Moderator ----------------------- Dear Colleague, We need your opinion and your vote. Literacy President 2008 is a non- partisan effort to increase national awareness of adult literacy regardless of whom is elected. Literacy President provides members of the adult education community with ways to be active participants in the 2008 Presidential election. The first activity was generating possible questions to ask the candidates. We now have 20 possible questions, and they need to be narrowed to the top five. This is your chance to vote on these, to help us narrow them to the top five best questions. In the 2004 election, Literacy President had over 1000 people -- practitioners, adult learners and others -- who voted for the top priority questions. This time our goal is 1500 people participating: students, practitioners and other advocates for adult literacy. To vote, please go to http://www.surveymonkey.com/s.asp?u=85102489618 If the address breaks into two lines, you can try this one instead: http://tinyurl.com/s553p For the Literacy President Group, David Rosen djrosen at comcast.net From hdooley at riral.org Mon May 1 13:29:27 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Mon, 01 May 2006 13:29:27 -0400 Subject: [Assessment 322] Literacy President 2008 Message-ID: <44564577.4090703@riral.org> Hello, everyone hello! 
I received the following email, and believe it is an excellent Civics opportunity for you and your learners. You can simply complete the survey and have your voice heard, or you can use the "Literacy President 2008" materials to develop lessons that delve deeper in the issues with research, discussion, writing, and so on. Even if you decide not to share this with your learners, please consider voting yourself, so that practitioners' voices will be heard in the upcoming elections. Thank you. The following post is from David Rosen. Your voice needs to be heard in terms of letting our country's leadership know how important ABE and ESOL services are for our economy, our families, and our ability to compete in the global market. We need to send a strong message to our next President - please vote and let your voice be heard! marie cora Assessment Discussion List Moderator ----------------------- Dear Colleague, We need your opinion and your vote. Literacy President 2008 is a non- partisan effort to increase national awareness of adult literacy regardless of whom is elected. Literacy President provides members of the adult education community with ways to be active participants in the 2008 Presidential election. The first activity was generating possible questions to ask the candidates. We now have 20 possible questions, and they need to be narrowed to the top five. This is your chance to vote on these, to help us narrow them to the top five best questions. In the 2004 election, Literacy President had over 1000 people -- practitioners, adult learners and others -- who voted for the top priority questions. This time our goal is 1500 people participating: students, practitioners and other advocates for adult literacy. To vote, please go to http://www.surveymonkey.com/s.asp?u=85102489618 If the address breaks into two lines, you can try this one instead: http://tinyurl.com/s553p For the Literacy President Group, David Rosen djrosen at comcast.net ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov From marie.cora at hotspurpartners.com Wed May 3 08:25:59 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 3 May 2006 08:25:59 -0400 Subject: [Assessment 323] FW: [ProgramLeadership 48] Guest next week Message-ID: <011601c66eac$bfbef170$0302a8c0@LITNOW> Good morning, afternoon, and evening to you all. The following post is from Kim Chaney, Moderator of the Program Leadership and Improvement. To subscribe to this Discussion List, go to: http://www.nifl.gov/mailman/listinfo/Programleadership marie ________________________ Dear "Program Leadership and Improvement" List Subscribers: I am pleased to announce that Esmerelda Doreste, Program Director with the Union City (NJ) Adult Learning Center, will be a guest on the list from Monday, May 8 through Friday, May 12. As a participant in the UPS Foundation-funded "Leadership for Community Literacy" Initiative that was administered by the Equipped for the Future (EFF) Center in the late winter, Ms. Doreste worked with her program to implement a program improvement process based on the EFF program quality model. Along with the four other participants, Ms. Doreste wrote about that experience...her story is now accessible on the "Program Leadership and Improvement" web site. Go to [http://pli.cls.utk.edu] and click the "Stories of Program Improvement" button, then click/open "Union City Adult Learning Center: A Program Improvement Process." Please read the story in preparation for Ms. 
Doreste's visit next week. She will be ready to answer your questions about the nuts and bolts of implementing the process, as well as any other related issues. Looking forward to next week, Kim From marie.cora at hotspurpartners.com Wed May 3 08:30:05 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 3 May 2006 08:30:05 -0400 Subject: [Assessment 324] FW: Literacy President Update Message-ID: <012a01c66ead$52479ba0$0302a8c0@LITNOW> Colleagues, The following post is from David Rosen. marie cora -------------- Dear Colleague, The nonpartisan Literacy President survey is off to a good start. We now have over 325 people who have voted on the questions to ask the candidates for President. The top state, way in the lead, is Pennsylvania, with over 68 votes on the questions. Next is California with 28, Illinois with 15, then Florida, Massachusetts, North Carolina, and New Jersey. Other states also have a few participants who have voted. We have a way to go to reach our goal of 1500 survey participants. And currently only one student has voted, so we need to do a lot more to get students involved. If you haven't voted yet, please do. To vote, please go to: http://www.surveymonkey.com/s.asp?u=85102489618 If the address breaks into two lines, you can try this one instead: http://tinyurl.com/s553p Even if you have voted, please help to get the message out to others. Post the message below to adult literacy education electronic lists of your organization or state. Voting will take place over the next several weeks, but it's important to get the word out as soon as possible. Thanks for your help. David J. Rosen Adult Literacy Advocate DJRosen at theworld.com On Apr 25, 2006, at 8:20 AM, David Rosen wrote: > Dear Colleague, > > We need your opinion and your vote. Literacy President 2008 is a non- > partisan effort to increase national awareness of adult literacy > regardless of whom is elected. Literacy President provides members > of the adult education community with ways to be active participants > in the 2008 Presidential election. The first activity was generating > possible questions to ask the candidates. We now have 20 possible > questions, and they need to be narrowed to the top five. > > This is your chance to vote on these, to help us narrow them to the > top five best questions. In the 2004 election, Literacy President > had over 1000 people -- practitioners, adult learners and others -- > who voted for the top priority questions. This time our goal is 1500 > people participating: students, practitioners and other advocates for > adult literacy. > > To vote, please go to > > http://www.surveymonkey.com/s.asp?u=85102489618 > > If the address breaks into two lines, you can try this one instead: > > http://tinyurl.com/s553p > > > For the Literacy President Group, > > David Rosen > djrosen at comcast.net > From info at nifl.gov Wed May 3 09:02:27 2006 From: info at nifl.gov (Sandra Baxter) Date: Wed, 3 May 2006 09:02:27 -0400 (EDT) Subject: [Assessment 325] News from the National Institute for Literacy Message-ID: <20060503130227.7634546555@dev.nifl.gov> Dear Colleagues, We are happy to announce that the National Institute for Literacy has launched a new web page design to help provide easily accessible, high quality information about literacy. New features clearly highlight the Institute's work in all areas of literacy, including early childhood, childhood, adolescence, and adulthood. 
You will continue to find the links to all of the Institute's projects, such as Bridges to Practice, LINCS, Assessment Strategies and Reading Profiles under Programs and Services. The Institute's publications, including the recently released Teaching Reading to Adults, can be found under the Publications link. Please visit http://www.nifl.gov for more information. This is phase one of the redesign. We will soon be incorporating all the Institute's projects into this new design. As many of you know, websites are always a work in progress and we intend to continue improving the Institute's site in order to provide you with the best available resources. We would love to hear your thoughts regarding the new look. Please send your comments to Jo Maralit at mmaralit at nifl.gov. Thanks, Sandra Baxter, Ed.D. Director National Institute for Literacy http://www.nifl.gov

From marie.cora at hotspurpartners.com Wed May 3 12:40:17 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 3 May 2006 12:40:17 -0400 Subject: [Assessment 326] First Discussion on Special Topics List Message-ID: <019401c66ed0$45c68990$0302a8c0@LITNOW> Dear Colleagues, Below please find the announcement of the new Special Topics List sponsored by the National Institute for Literacy. The moderator of the List is David Rosen. Please read further for a full description of this new List. David's first topic will be focused on the Adult Reading Components Study, or ARCS, as it is known. A discussion on ARCS was held on the Assessment Discussion List in April of 2005 - just over a year ago - and you can find information on ARCS as well as a summary of that Discussion at the ALE Wiki by going to the Discussions in the Assessment section: http://wiki.literacytent.org/index.php/Assessment_Information#Discussions and clicking on: ARCS Reading Diagnostic For those of you who are not familiar with ARCS, I highly recommend that you sign on to this Discussion to learn more about the resource tool. For those of you who were here last year for that Discussion, I encourage you to sign on to find out what has been learned about ARCS over the past year, or to ask questions and make comments if you have been using ARCS. Thanks and hope to see you there! marie cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com _____________ New Special Topics Discussion List Dear Colleague, On May 23rd we will begin a week-long discussion on the new National Institute for Literacy Special Topics electronic list. The topic is the Adult Reading Components Study (ARCS). Dr. Rosalind Davidson and Dr. John Strucker, the co-researchers, will join us to answer your questions. Special Topics will be an intermittent discussion list. The topics will open and close throughout the year, so there will be periods where there will be no discussion or postings. You can subscribe to the e-list for a particular topic of interest, and then unsubscribe, or you can stay subscribed throughout the year. To participate in this first topic, the Adult Reading Components Study, and to learn more about the ARCS interactive Web site -- which has lots of reading help for teachers -- please subscribe to the Special Topics list now by going to: http://www.nifl.gov/mailman/listinfo/specialtopics Before the discussion begins on May 23rd please look at a 30-minute streaming video introduction to the discussion with researcher panelists Rosalind Davidson and John Strucker, and practitioners Kay Vaccaro and Jane Meyer.
http://www.nifl.gov/nifl/webcasts/20040204/webcast02-04a.html (Note: Macintosh users will need to have Real Player installed, and for them performance may not be optimal.) After you subscribe, you can send your questions to the discussion list. Note, however, that messages will not be posted until May 22nd. I look forward to having you join us in this discussion. David J. Rosen Special Topics Discussion List Moderator djrosen at comcast.net From marie.cora at hotspurpartners.com Mon May 8 09:08:43 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 8 May 2006 09:08:43 -0400 Subject: [Assessment 327] New FOB on the Web Message-ID: <034501c672a0$8c238630$0302a8c0@LITNOW> Dear Colleagues, The newest Focus On Basics, on Learners' Experiences, is now available on NCSALL's web site, www.ncsall.net. Quick, tell me about your students' self-esteem. Low, because of their academic struggles? That's not what a recent NCSALL-Rutgers study showed. And how about reading? Do your learners know that to increase their reading fluency, they need to...read? What kind of and how much reading do they do outside of class? Another NCSALL-Rutgers study follows three learners as they go about their days and finds quite a variety in the amount of reading the learners do on their own. Teachers, have you ever seen yourself teach? Or noticed just what that clump of students was doing while you were engaged with one person on another side of the classroom? Teachers working with NCSALL-Rutgers found that videos taken of their classroom for research purposes provided them with rich information useful to their own professional development. Learn how useful video can be in helping pinpoint issues and suggest new ways of doing things in the classroom. There's lots more, particularly around learner engagement. Go to www.ncsall.net and click on "Newest Issue of Focus on Basics." Printed copies and a text-only web version will not be out for another two weeks. Regards, Barb Garner, Editor _______________________________________________ From djrosen at comcast.net Wed May 10 04:43:28 2006 From: djrosen at comcast.net (djrosen at comcast.net) Date: Wed, 10 May 2006 08:43:28 +0000 Subject: [Assessment 328] Formative Assessment Message-ID: <051020060843.17767.4461A7AF000B8580000045672200761394020A9C019D060B@comcast.net> Assessment colleagues, The term "formative assessment" may not be familiar to many of us working in adult literacy education in the United States. It is more widely used in Europe, and possibly in Canada, Australia and New Zealand, and more common in elementary and secondary education. I am quite interested in this topic and would be interested to learn about examples in adult literacy education in the U.S., especially of systematic formative assessment. So, what is formative assessment? It's the opposite of summative assessment. Its focus is assessment _for_ learning; whereas summative assessment's focus is assessment _of_ learning. Formative assessments give teachers and learners information about learners' goals and objectives, about how they are progressing toward them, about what methods do -- and do not -- work for them. Formative assessments are carried out by a learner on her/his own (self-assessment), by a group of learners (peer assessment), by a teacher and learner together, and possibly by a learner and a counselor or intake worker. Formative assessments are not usually standardized tests. They do not usually result in a grade, official score or certificate. 
They provide information for decision-making by learners and teachers, often together, about the learning itself. Some people in the U.S. use the term "alternative assessment," or are familiar with one formative assessment process referred to as "portfolio assessment." Some teachers, when they understand what "formative assessment" means, say "Yes, of course, I do this all the time. It's part of the learning process. I just don't have a name for it." I am interested in the details of how this is done, in examples of where it is done especially well, and where it is systematically used. I am especially interested because I am doing a case study right now on how formative assessment is being used in Belgium, part of a larger OECD study being carried out in several countries where formative assessment is used in adult literacy education. I am also interested because there is evidence from elementary and secondary education research that formative assessment works, that is, that its use results in higher student achievement. If you use a formative assessment process in your classroom, or if you have studied formative assessment and/or if you know of good studies of formative assessment, please let us all know -- here -- or email me at djrosen at comcast.net. David J. Rosen newsomeassociates.com djrosen at comcast.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060510/0cbfc303/attachment.html From marie.cora at hotspurpartners.com Thu May 11 08:38:09 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 11 May 2006 08:38:09 -0400 Subject: [Assessment 328] Re: Formative Assessment In-Reply-To: <051020060843.17767.4461A7AF000B8580000045672200761394020A9C019D060B@comcast.net> Message-ID: <001001c674f7$c6207d80$0302a8c0@LITNOW> Hi everyone, Thanks for your post David - subscribers: I'm surprised that no one has responded to David's post! Isn't one of our biggest conundrums/topics of discussion the 'formative assessment versus summative assessment' situation (although we may not call it this)? In the on-line assessment course that I facilitate, participants talked about this fairly in-depth, and pointed out that formative assessment is what happens all the time in teachers' classrooms - where the 'real' substance of teaching and learning is gauged. This has been touched on here before as well - if you go to the Assessment Archives at: http://www.nifl.gov/cgi-bin/texis/webinator/search_discussions?cq=2 and type 'formative assessment' in the search box you get 21 entries. But David is looking for studies and further info - there are nearly 500 subscribers here so I'm pretty sure someone has some resources to share. And if not, you have your experiences to share. David said: "I am interested in the details of how this is done, in examples of where it is done especially well, and where it is systematically used." Perhaps this is a good place to start. Please let's hear from you, Thanks! marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of djrosen at comcast.net Sent: Wednesday, May 10, 2006 4:43 AM To: assessment at nifl.gov Subject: [Assessment 328] Formative Assessment Assessment colleagues, The term "formative assessment" may not be familiar to many of us working in adult literacy education in the United States. 
It is more widely used in Europe, and possibly in Canada, Australia and New Zealand, and more common in elementary and secondary education. I am quite interested in this topic and would be interested to learn about examples in adult literacy education in the U.S., especially of systematic formative assessment. So, what is formative assessment? It's the opposite of summative assessment. Its focus is assessment _for_ learning; whereas summative assessment's focus is assessment _of_ learning. Formative assessments give teachers and learners information about learners' goals and objectives, about how they are progressing toward them, about what methods do -- and do not -- work for them. Formative assessments are carried out by a learner on her/his own (self-assessment), by a group of learners (peer assessment), by a teacher and learner together, and possibly by a learner and a counselor or intake worker. Formative assessments are not usually standardized tests. They do not usually result in a grade, official score or certificate. They provide information for decision-making by learners and teachers, often together, about the learning itself. Some people in the U.S. use the term "alternative assessment," or are familiar with one formative assessment process referred to as "portfolio assessment." Some teachers, when they understand what "formative assessment" means, say "Yes, of course, I do this all the time. It's part of the learning process. I just don't have a name for it." I am interested in the details of how this is done, in examples of where it is done especially well, and where it is systematically used. I am especially interested because I am doing a case study right now on how formative assessment is being used in Belgium, part of a larger OECD study being carried out in several countries where formative assessment is used in adult literacy education. I am also interested because there is evidence from elementary and secondary education research that formative assessment works, that is, that its use results in higher student achievement. If you use a formative assessment process in your classroom, or if you have studied formative assessment and/or if you know of good studies of formative assessment, please let us all know -- here -- or email me at djrosen at comcast.net. David J. Rosen newsomeassociates.com djrosen at comcast.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060511/809993d5/attachment.html From akohring at utk.edu Thu May 11 09:58:07 2006 From: akohring at utk.edu (Aaron Kohring) Date: Thu, 11 May 2006 09:58:07 -0400 Subject: [Assessment 329] Re: Formative Assessment In-Reply-To: <001001c674f7$c6207d80$0302a8c0@LITNOW> References: <051020060843.17767.4461A7AF000B8580000045672200761394020A9C019D060B@comcast.net> Message-ID: <5.1.0.14.2.20060511095549.04881a88@pop.utk.edu> Marie, I hope any teachers or programs using EFF might share what they are doing. The EFF teaching/learning cycle embeds assessment (both formative and summative) within the teaching process. Aaron At 08:38 AM 5/11/2006 -0400, you wrote: >Hi everyone, > > > >Thanks for your post David - subscribers: I'm surprised that no one has >responded to David's post! Isn't one of our biggest conundrums/topics of >discussion the 'formative assessment versus summative assessment' situation >(although we may not call it this)? 
In the on-line assessment course that >I facilitate, participants talked about this fairly in-depth, and pointed >out that formative assessment is what happens all the time in teachers >classrooms where the real substance of teaching and learning are >gauged. This has been touched on here before as well if you go to the >Assessment Archives at: > >http://www.nifl.gov/cgi-bin/texis/webinator/search_discussions?cq=2 > > >and type formative assessment in the search box you get 21 entries. > > > >But David is looking for studies and further info there are nearly 500 >subscribers here so I m pretty sure someone has some resources to >share. And if not, you have your experiences to share. David said: > >I am interested in the details of how this is done, in examples of where >it is done especially well, and where it is systematically used. > > > >Perhaps this is a good place to start. Please let s hear from you, > >Thanks! > >marie cora > >Assessment Discussion List Moderator > > > > > >-----Original Message----- >From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On >Behalf Of djrosen at comcast.net >Sent: Wednesday, May 10, 2006 4:43 AM >To: assessment at nifl.gov >Subject: [Assessment 328] Formative Assessment > > > >Assessment colleagues, > > > >The term ?formative assessment" may not be familiar to many of us working >in adult literacy education in the United States. It is more widely used >in Europe, and possibly in Canada, Australia and New Zealand, and more >common in elementary and secondary education. I am quite interested in >this topic and would be interested to learn about examples in adult >literacy education in the U.S., especially of systematic formative assessment. > > > >So, what is formative assessment? > > > >It's the opposite of summative assessment. Its focus is assessment _for_ >learning; whereas summative assessment's focus is assessment _of_ >learning. Formative assessments give teachers and learners information >about learners' goals and objectives, about how they are progressing >toward them, about what methods do -- and do not -- work for them. >Formative assessments are carried out by a learner on her/his own >(self-assessment) , by a group of learners (peer assessment) by a teacher >and learner together, and possibly by a learner and a counselor or intake >worker. Formative assessments are not usually standardized tests. They do >not usually result in a grade, official score or certificate. They provide >information for decision-making by learners and teachers, often together, >about the learning itself. Some people in the U.S. use the term " >alternative assessment," or are familiar with one formative assessment >process referred to as "portfolio assessment." S ome te achers, when they >understand what "formative assessment" means say "Yes, of course, I do >this all the time. It's part of the learning process. I just don't have a >name for it." > > > >I am interested in the details of how this is done, in examples of where >it is done especially well, and where it is systematically used. I am >especially interested because I am doing a case study right now on how >formative assessment is being used in Belgium, part of a larger OECD study >being carried out in several countries where formative assessment is used >in adult literacy education. I am also interested because there is >evidence from elementary and secondary education research that formative >assessment works, that is, that its use results in higher student achievement. 
> > > >If you use a formative assessment process in your classroom, or if you >have studied formative assessment and/or if you know of good studies of >formative assessment, please let us all know -- here -- or email me a >djrosen at comcast.net. > > > >David J. Rosen > >newsomeassociates.com > >djrosen at comcast.net > > > > >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to >http://www.nifl.gov/mailman/listinfo/assessment Aaron Kohring Coordinator, LINCS Literacy & Learning Disabilities Special Collection (http://ldlink.coe.utk.edu/) Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) Center for Literacy Studies, University of Tennessee EFF Center for Training and Technical Assistance Phone:(865) 974-4109 main (865) 974-4258 direct Fax: (865) 974-3857 e-mail: akohring at utk.edu From MWPotts2001 at aol.com Thu May 11 10:09:58 2006 From: MWPotts2001 at aol.com (MWPotts2001 at aol.com) Date: Thu, 11 May 2006 10:09:58 EDT Subject: [Assessment 330] Re: Formative Assessment Message-ID: <41c.af40de.31949fb6@aol.com> David and All, David says, The term ?formative assessment" may not be familiar to many of us working in adult literacy education in the United States. Well, if many of us do not know the term, "formative assessment," many of us have missed the boat! I am not in the classroom now, but I am an evaluator, and I find that formative assessment is more important to the programs that I work with than summative assessment. Formative assessment is, after all, technical assistance, if done correctly, i.e., data collection with feedback, the purpose of which is program improvement. In the past year, I made 4 trips of two days each to my programs, during which I spent a lot of time observing, some time interviewing, less time writing, and no time judging. What seems to help most is the time we spend reflecting, reviewing and discussing what is working and what is not. Each time I return to the program, I expect to see changes in accordance with our discussion and conclusions from the previous visit. Generally, it happens. What a shame it would be if I waited for summative evaluation to note the need for change and growth. And what a waste. David, you always throw out the best questions. This may not be the best answer, but I had to put in my 2 cents worth. Meta Potts FOCUS on Literacy Glen Allen, VA -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060511/e882cf84/attachment.html From sreid at workbase.org.nz Thu May 11 11:45:56 2006 From: sreid at workbase.org.nz (Susan Reid) Date: Fri, 12 May 2006 03:45:56 +1200 Subject: [Assessment 331] Re: Formative Assessment Message-ID: <14794889A1E3AF419042F64CC5425A1E23BEAE@server1.wbeductrust.local> Hi David I am replying in a rush and so don't have a chance to add a great deal of detail - more to come later Perhaps the most influential text on formative assessment in the UK system which has percolated down to the NZ system and into the adult education sector is Inside the Black Box by Black and Wiliam which advocates exactly what you are saying - that formative assessment, designed to improve teaching and thus learning can improve learner outcomes. 
Here are some references below http://www.pdkintl.org/kappan/kbla9810.htm http://www.kcl.ac.uk/phpnews/wmprint.php?ArtID=225 Formative assessment forms a key part of Reading and Writing Professional Development that Workbase offers in New Zealand, Two colleagues from New Zealand John Benseman and Alison Sutton have also worked on an OECD project around formative assessment and I will put them in touch with you will provide more information later Kind regards Susan Reid Manager Professional Development Workbase the NZ Centre for Workforce Literacy www.workbase.org.nz www.nzliteracyportal.org.nz ________________________________ From: assessment-bounces at nifl.gov on behalf of djrosen at comcast.net Sent: Wed 10/05/2006 8:43 p.m. To: assessment at nifl.gov Subject: [Assessment 328] Formative Assessment Assessment colleagues, The term ?formative assessment" may not be familiar to many of us working in adult literacy education in the United States. It is more widely used in Europe, and possibly in Canada, Australia and New Zealand, and more common in elementary and secondary education. I am quite interested in this topic and would be interested to learn about examples in adult literacy education in the U.S., especially of systematic formative assessment. So, what is formative assessment? It's the opposite of summative assessment. Its focus is assessment _for_ learning; whereas summative assessment's focus is assessment _of_ learning. Formative assessments give teachers and learners information about learners' goals and objectives, about how they are progressing toward them, about what methods do -- and do not -- work for them. Formative assessments are carried out by a learner on her/his own (self-assessment) , by a group of learners (peer assessment) by a teacher and learner together, and possibly by a learner and a counselor or intake worker. Formative assessments are not usually standardized tests. They do not usually result in a grade, official score or certificate. They provide information for decision-making by learners and teachers, often together, about the learning itself. Some people in the U.S. use the term " alternative assessment," or are familiar with one formative assessment process referred to as "portfolio assessment." S ome te achers, when they understand what "formative assessment" means say "Yes, of course, I do this all the time. It's part of the learning process. I just don't have a name for it." I am interested in the details of how this is done, in examples of where it is done especially well, and where it is systematically used. I am especially interested because I am doing a case study right now on how formative assessment is being used in Belgium, part of a larger OECD study being carried out in several countries where formative assessment is used in adult literacy education. I am also interested because there is evidence from elementary and secondary education research that formative assessment works, that is, that its use results in higher student achievement. If you use a formative assessment process in your classroom, or if you have studied formative assessment and/or if you know of good studies of formative assessment, please let us all know -- here -- or email me a djrosen at comcast.net. David J. Rosen newsomeassociates.com djrosen at comcast.net -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/ms-tnef Size: 6431 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060512/88052cf9/attachment.bin From Tina_Luffman at yc.edu Thu May 11 11:59:57 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Thu, 11 May 2006 08:59:57 -0700 Subject: [Assessment 332] Re: Formative Assessment Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060511/e72fedb4/attachment.html From djrosen at comcast.net Fri May 12 15:16:21 2006 From: djrosen at comcast.net (djrosen at comcast.net) Date: Fri, 12 May 2006 19:16:21 +0000 Subject: [Assessment 332] Re: Formative Assessment Message-ID: <051220061916.22075.4464DF04000E3B270000563B2205886442020A9C019D060B@comcast.net> Hello Tina, I would agree that the use of tests, including standardized multiple choice tests for diagnostic or placement purposes, is an example of formative assessment. It is good to hear that you can use the TABE for this purpose and that you and your students find this useful. Are the TABE diagnostics you refer to something put out by a publisher or something you have made yourself? Do other teachers have examples of using formal assessments for formative assessment, to help the students in the learning process, not just as a measure of outcomes or for measurement of pre-post gains? How about informal ongoing assessment strategies or tools? Do you have any of these you could share with us? David J. Rosen djrosen at comcast.net -------------- Original message -------------- From: Tina_Luffman at yc.edu Hi Formative Assessment teammates, Here at Yavapai College, when I give a TABE test to students for follow-up testing, I have a set of TABE diagnostics which tell us which kinds of questions the student is still struggling with. Then we have TABE study booklets put out by Steck-Vaughn that are excellent for helping us teach directly to the area that student still needs to study. We also have them go back and work on those areas in the Contemporary satellite books. I would consider this process formative assessment. Do you agree? Thanks, Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060512/4b3b3ae2/attachment.html -------------- next part -------------- An embedded message was scrubbed... From: Tina_Luffman at yc.edu Subject: [Assessment 332] Re: Formative Assessment Date: Thu, 11 May 2006 16:32:01 +0000 Size: 744 Url: http://www.nifl.gov/pipermail/assessment/attachments/20060512/4b3b3ae2/attachment.mht From djrosen at comcast.net Fri May 12 15:32:50 2006 From: djrosen at comcast.net (djrosen at comcast.net) Date: Fri, 12 May 2006 19:32:50 +0000 Subject: [Assessment 333] Re: Formative Assessment Message-ID: <051220061932.27730.4464E2E200051B1A00006C522200750330020A9C019D060B@comcast.net> Hello Susan and others, You are right on target, Susan, and have perfectly understood my question. It was the Black and Wiliam study I had in mind, and I am currently in Belgium conducting three case studies, with a Belgian colleague, under the auspices of OECD -- the same "What Works" project to which you have referred. It is this experience that has prompted me to learn more about what might be happening with formative assessment in North America now. 
For several years SABES/World Education published an alternative adult literacy education assessment journal called "Adventures in Assessment." You will find these journals online at: http://sabes.org/resources/adventures/index.htm Volume after volume is filled with rich examples from practitioners' experiences of formative assessments, although not usually called that. Since the journal stopped publishing, there is a huge gap in knowledge in the U.S. I think we no longer know if teachers are using these assessments, and if so, how. I was hoping for a rich discussion on this list by teachers, for a picture of what is happening now with the use of formative assessments. I hope that formative assessment has not disappeared or gone underground, or been displaced by summative, outcomes-based assessment. We need both. North American teacher colleagues: are the "adventures" in assessment over? Or do they continue .... for example in your classroom? I would really like to know. David J. Rosen djrosen at comcast.net -------------- Original message -------------- From: "Susan Reid" > Hi David > I am replying in a rush and so don't have a chance to add a great deal of detail > - more to come later > Perhaps the most influential text on formative assessment in the UK system which > has percolated down to the NZ system and into the adult education sector is > Inside the Black Box by Black and Wiliam which advocates exactly what you are > saying - that formative assessment, designed to improve teaching and thus > learning can improve learner outcomes. Here are some references below > > http://www.pdkintl.org/kappan/kbla9810.htm > http://www.kcl.ac.uk/phpnews/wmprint.php?ArtID=225 > > Formative assessment forms a key part of Reading and Writing Professional > Development that Workbase offers in New Zealand, Two colleagues from New Zealand > John Benseman and Alison Sutton have also worked on an OECD project around > formative assessment and I will put them in touch with you > will provide more information later > > Kind regards Susan Reid > Manager Professional Development > Workbase the NZ Centre for Workforce Literacy > www.workbase.org.nz > www.nzliteracyportal.org.nz > > ________________________________ > > From: assessment-bounces at nifl.gov on behalf of djrosen at comcast.net > Sent: Wed 10/05/2006 8:43 p.m. > To: assessment at nifl.gov > Subject: [Assessment 328] Formative Assessment > > > Assessment colleagues, > > The term "formative assessment" may not be familiar to many of us working in > adult literacy education in the United States. It is more widely used in Europe, > and possibly in Canada, Australia and New Zealand, and more common in elementary > and secondary education. I am quite interested in this topic and would be > interested to learn about examples in adult literacy education in the U.S., > especially of systematic formative assessment. > > So, what is formative assessment? > > It's the opposite of summative assessment. Its focus is assessment _for_ > learning; whereas summative assessment's focus is assessment _of_ learning. > Formative assessments give teachers and learners information about learners' > goals and objectives, about how they are progressing toward them, about what > methods do -- and do not -- work for them. Formative assessments are carried out > by a learner on her/his own (self-assessment), by a group of learners (peer > assessment), by a teacher and learner together, and possibly by a learner and a > counselor or intake worker. 
Formative assessments are not usually standardized > tests. They do not usually result in a grade, official score or certificate. > They provide information for decision-making by learners and teachers, often > together, about the learning itself. Some people in the U.S. use the term > "alternative assessment," or are familiar with one formative assessment process > referred to as "portfolio assessment." Some teachers, when they understand > what "formative assessment" means, say "Yes, of course, I do this all the time. > It's part of the learning process. I just don't have a name for it." > > I am interested in the details of how this is done, in examples of where it is > done especially well, and where it is systematically used. I am especially > interested because I am doing a case study right now on how formative assessment > is being used in Belgium, part of a larger OECD study being carried out in > several countries where formative assessment is used in adult literacy > education. I am also interested because there is evidence from elementary and > secondary education research that formative assessment works, that is, that its > use results in higher student achievement. > > If you use a formative assessment process in your classroom, or if you have > studied formative assessment and/or if you know of good studies of formative > assessment, please let us all know -- here -- or email me at djrosen at comcast.net. > > David J. Rosen > newsomeassociates.com > djrosen at comcast.net > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060512/01767f14/attachment.html -------------- next part -------------- An embedded message was scrubbed... From: "Susan Reid" Subject: [Assessment 331] Re: Formative Assessment Date: Thu, 11 May 2006 16:31:44 +0000 Size: 9591 Url: http://www.nifl.gov/pipermail/assessment/attachments/20060512/01767f14/attachment.mht From marie.cora at hotspurpartners.com Sat May 13 08:45:45 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sat, 13 May 2006 08:45:45 -0400 Subject: [Assessment 333] FW: [AAACE-NLA] Literacy President Update May 12, 2006 Message-ID: <012501c6768b$2a4b56b0$0302a8c0@LITNOW> Dear Colleagues: below please read an update on the Literacy President campaign. Have YOU responded? marie cora __________________________ The Literacy President survey to determine the top five questions to ask the Presidential Candidates is going well. Nearly 800 people have responded, and about 6% of the participants are adult learners (not including college students). The two states with the most participants are Pennsylvania and Washington State. Many states have participated, however. Our goal is 1500 people, and we have a little under two weeks left to reach it. To meet the goal we need to be sure the word is getting out. So far I am aware of only one national organization that has asked its members to participate in this survey, VALUE. I know that VALUE Executive Director Marty Finsterbusch has made getting out the word to VALUE members a priority and that this does seem to have made a difference in the number of adult learners participating. But it is disappointing if other national adult literacy organizations are not making this a priority. I know that the Literacy President message has been posted on some state lists -- and clearly advocates in several states are working on this. My congratulations especially to Pennsylvania and Washington. 
If you are a member of a national adult literacy organization, please call or email your leadership and ask them to get the word out to their members now. Thanks. David J. Rosen Adult Literacy Advocate Member, the Literacy President Group DJRosen at the world.org _______________________________________________ From Tina_Luffman at yc.edu Sat May 13 13:35:31 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Sat, 13 May 2006 10:35:31 -0700 Subject: [Assessment 334] Re: Formative Assessment Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060513/13e3662f/attachment.html From marie.cora at hotspurpartners.com Mon May 15 12:10:49 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 15 May 2006 12:10:49 -0400 Subject: [Assessment 335] job announcement Message-ID: <009501c6783a$266be7c0$0702a8c0@LITNOW> Dear Colleagues: The following is an announcement for a new position at the Literacy Assistance Center in New York City: Project Leader The Literacy Assistant Center (LAC), a not-for-profit organization that supports and promotes the expansion of quality literacy services in New York, is looking for a Project Leader to join its dynamic work team. This individual's primary responsibility is to lead the three-year Statewide Staff Development Project beginning June 1, 2006. The goal of the project is to improve instruction in adult basic education, ESOL, and GED classes. Specific responsibilities include developing a specialized training curriculum; providing related professional development sessions to adult educators, program managers and administrators; and creating four policy/resource manuals. This work will incorporate New York State Adult Education Learning Standards, reflect essential concepts for teaching adults, and integrate core principles of teaching reading, writing, mathematics, and English language attainment. Statewide travel required. The Project Leader will be an expert in adult and literacy education, knowledgeable about current trends in the field, and have a proven ability to work with key external constituents, including state education department personnel, and with diverse local organizations and individuals. This highly qualified educator will have an advanced degree (masters required, doctorate preferred) with commensurate experience and demonstrated skill in new project development. S/he will be expected to describe the impact of the project in published articles and/or professional conference presentations. LAC offers a competitive compensation package commensurate with experience. Send resume and cover letter to hr at lacnyc.org or fax to 212-952-1359 by May 25, 2006. No phone calls, please. 
Mariann Fedele Coordinator of Professional Development, Literacy Assistance Center Moderator, NIFL Technology and Literacy Discussion List 32 Broadway 10th Floor New York, New York 10004 212-803-3325 mariannf at lacnyc.org www.lacnyc.org _______________________________________________ National Institute for Literacy Moderators mailing list: Moderators at nifl.gov http://www.nifl.gov/mailman/listinfo/moderators Moderator's Resource Page: http://www.nifl.gov/lincs_dlms/contents.html Moderator's List Archive page: http://www.nifl.gov/mailman/private/moderators From marie.cora at hotspurpartners.com Tue May 16 11:42:52 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 16 May 2006 11:42:52 -0400 Subject: [Assessment 336] Special Topics Discussion next week: ARCS Message-ID: <012601c678ff$67bcd520$0702a8c0@LITNOW> To subscribe to the Special Topics Discussion List, go to: http://www.nifl.gov/mailman/listinfo/specialtopics Dear subscribers, We invite you to join us on the National Institute for Literacy's new SPECIAL TOPICS List. This list was established to provide opportunities throughout the year for focused discussion topics with invited researchers and other experts in the field of adult education and literacy (including English language learning and numeracy). The Special Topics Discussion List will be moderated by David Rosen, Ed.D., Senior Associate, Newsome Associates. This list is an intermittent discussion list. The topics will open and close throughout the year, so there are periods when there will be no discussion or postings. You can subscribe to the discussion list for a particular topic of interest, and then unsubscribe, or you can stay subscribed throughout the year. We look forward to the upcoming discussion, beginning May 23, with Dr. Rosalind Davidson and Dr. John Strucker, co-researchers on the Adult Reading Components Study (ARCS). You may have seen the previous announcement for this on the list, if not, please visit the list archives at: http://www.nifl.gov/pipermail/assessment/2006/000310.html. The DVD of the video is also available free of charge, please send your mailing address to info at nifl.gov to request a copy. For more information, or suggestions of topics, contact David J. Rosen at djrosen1 at comcast.net Regards, Jo Maralit National Institute for Literacy http://www.nifl.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060516/a0559e7b/attachment.html From djrosen at comcast.net Fri May 19 19:24:57 2006 From: djrosen at comcast.net (David Rosen) Date: Fri, 19 May 2006 19:24:57 -0400 Subject: [Assessment 337] ARCS Discussion Begins May 23rd on New SpecialTopics List Message-ID: Assessment Colleagues, A last reminder that on May 23rd we will begin an important discussion on the Adult Reading Components Study (ARCS) with the researchers. To sign up for this discussion go to http://www.nifl.gov/mailman/listinfo/specialtopics You will find a 30-minute video panel discussion with ARCS researchers, Rosalind Davidson and John Strucker, and practitioners Kay Vaccaro and Jane Meyer at http://www.nifl.gov/nifl/webcasts/20040204/webcast02-04.html The video panel introduction is also available on DVD from the National Center for the Study of Adult Learning and Literacy [ http:// www.ncsall.net/?id=24 ] or from the National Institute for Literacy. 
(Send a request for the Adult Reading Components Study (ARCS) Panel (free) DVD to: info at nifl.gov Be sure to include your mailing address.) Other ARCS introductory materials include: 1. Adult Reading Components Study (ARCS) [PDF document] by John Strucker and Rosalind Davidson http://www.ncsall.net/?id=27 (ninth item down) 2. How the ARCS Was Done http://www.ncsall.net/fileadmin/resources/research/op_arcs.pdf 3. Adult Reading Components Study (ARCS) http://www.ncsall.net/?id=27#arcs We do hope you will be able to join us, from May 23rd through May 30th, to learn about and explore the uses of the ARCS. David J. Rosen djrosen at comcast.net From sreid at workbase.org.nz Sat May 27 19:49:38 2006 From: sreid at workbase.org.nz (Susan Reid) Date: Sun, 28 May 2006 11:49:38 +1200 Subject: [Assessment 338] Re: Formative Assessment Message-ID: <14794889A1E3AF419042F64CC5425A1E23BF40@secure.workbase.org.nz> Just a follow-up to my original post about the work of Black and Wiliam, following David's question about formative assessment. Here are a couple of other URLs people might be interested in. The last one is an online professional development workshop that our Ministry of Education developed for K-12 teachers, but it is equally applicable to adult literacy in terms of its explanations etc. Black and Wiliam http://ngfl.northumberland.gov.uk/keystage3ictstrategy/Assessment/blackbox.pdf MOE online workshop http://www.tki.org.nz/r/assessment/atol_online/ppt/online_workshop_1.ppt Regards Susan Reid Manager, Professional Development Workbase the NZ Centre for Workforce Literacy Development www.workbase.org.nz Check out the New Zealand literacy portal www.nzliteracyportal.org.nz ________________________________ From: assessment-bounces at nifl.gov on behalf of Marie Cora Sent: Fri 12/05/2006 12:38 a.m. To: 'The Assessment Discussion List' Subject: [Assessment 328] Re: Formative Assessment Hi everyone, Thanks for your post David - subscribers: I'm surprised that no one has responded to David's post! Isn't one of our biggest conundrums/topics of discussion the 'formative assessment versus summative assessment' situation (although we may not call it this)? In the on-line assessment course that I facilitate, participants talked about this fairly in-depth, and pointed out that formative assessment is what happens all the time in teachers' classrooms - where the 'real' substance of teaching and learning is gauged. This has been touched on here before as well - if you go to the Assessment Archives at: http://www.nifl.gov/cgi-bin/texis/webinator/search_discussions?cq=2 and type 'formative assessment' in the search box you get 21 entries. But David is looking for studies and further info - there are nearly 500 subscribers here so I'm pretty sure someone has some resources to share. And if not, you have your experiences to share. David said: "I am interested in the details of how this is done, in examples of where it is done especially well, and where it is systematically used." Perhaps this is a good place to start. Please let's hear from you, Thanks! marie cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of djrosen at comcast.net Sent: Wednesday, May 10, 2006 4:43 AM To: assessment at nifl.gov Subject: [Assessment 328] Formative Assessment Assessment colleagues, The term "formative assessment" may not be familiar to many of us working in adult literacy education in the United States. 
It is more widely used in Europe, and possibly in Canada, Australia and New Zealand, and more common in elementary and secondary education. I am quite interested in this topic and would be interested to learn about examples in adult literacy education in the U.S., especially of systematic formative assessment. So, what is formative assessment? It's the opposite of summative assessment. Its focus is assessment _for_ learning; whereas summative assessment's focus is assessment _of_ learning. Formative assessments give teachers and learners information about learners' goals and objectives, about how they are progressing toward them, about what methods do -- and do not -- work for them. Formative assessments are carried out by a learner on her/his own (self-assessment) , by a group of learners (peer assessment) by a teacher and learner together, and possibly by a learner and a counselor or intake worker. Formative assessments are not usually standardized tests. They do not usually result in a grade, official score or certificate. They provide information for decision-making by learners and teachers, often together, about the learning itself. Some people in the U.S. use the term " alternative assessment," or are familiar with one formative assessment process referred to as "portfolio assessment." S ome te achers, when they understand what "formative assessment" means say "Yes, of course, I do this all the time. It's part of the learning process. I just don't have a name for it." I am interested in the details of how this is done, in examples of where it is done especially well, and where it is systematically used. I am especially interested because I am doing a case study right now on how formative assessment is being used in Belgium, part of a larger OECD study being carried out in several countries where formative assessment is used in adult literacy education. I am also interested because there is evidence from elementary and secondary education research that formative assessment works, that is, that its use results in higher student achievement. If you use a formative assessment process in your classroom, or if you have studied formative assessment and/or if you know of good studies of formative assessment, please let us all know -- here -- or email me a djrosen at comcast.net. David J. Rosen newsomeassociates.com djrosen at comcast.net -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 10966 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060528/21b43c1f/attachment.bin From marie.cora at hotspurpartners.com Sun May 28 10:19:08 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 28 May 2006 10:19:08 -0400 Subject: [Assessment 339] Upcoming Discussions on FOB List: Learners' Experiences Message-ID: <000501c68261$b03d16f0$0202a8c0@LITNOW> Dear Colleagues, The following post is from Julie McKinney, Moderator of the Focus on Basics (FOB) Discussion List. marie cora Assessment Discussion List Moderator ****************************************************** I want to give you a heads-up on next month's rich schedule of weekly discussions with Focus On Basics authors. The following authors will discuss their articles from the recent issue of Focus On Basics, Vol. 8B, which is about Learners' Experiences. 
Find the whole issue at: http://www.ncsall.net/?id=1103 June 5-9: Jessica Tomkins Video as a Professional Development Tool http://www.ncsall.net/index.php?id=1107 June 12-16: Alisa Belzer Influences on the Reading Practices of Adults in ABE http://www.ncsall.net/index.php?id=1108 and Learners on Learning to Read http://www.ncsall.net/index.php?id=1110 June 25-30: Hal Beder Shaping and Sustaining Learner Engagement in Individualized Group Instruction Classrooms http://www.ncsall.net/index.php?id=1106 Please pass on the word to colleagues who may be interested in these discussions! Anyone can subscribe to the list at: http://www.nifl.gov/mailman/listinfo/focusonbasics ***************************************** Julie McKinney Discussion List Moderator World Education/NCSALL jmckinney at worlded.org From marie.cora at hotspurpartners.com Tue May 30 12:25:30 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 30 May 2006 12:25:30 -0400 Subject: [Assessment 340] FW: [AAACE-NLA] Literacy President Update, May 28, 2006 Message-ID: <001d01c68405$abd4c9f0$0202a8c0@LITNOW> Dear Colleagues, Please see the Literacy President update below. We are SO CLOSE to reaching our goal of 1500 responses! If you have not voted, please do so. And remember: friends encourage friends to vote! Thank you! marie cora Assessment Discussion List Moderator -----Original Message----- From: aaace-nla-bounces at lists.literacytent.org [mailto:aaace-nla-bounces at lists.literacytent.org] On Behalf Of David Rosen Sent: Sunday, May 28, 2006 8:08 PM To: National Literacy Advocacy List sponsored by AAACE Subject: [AAACE-NLA] Literacy President Update, May 28, 2006 AAACE-NLA Colleagues, As of 8:00 P.M. Sunday, May 28th we have 1472 responses to the Literacy President questions online survey. We are close to reaching our goal of 1500 responding by May 31st. Please help us reach that goal. To vote on the questions, go to: http://www.litpresident.org and select "Survey." So far, the top twelve voting states, in order of the most votes, are: 1. PA: 178 2. TN 137 3. WA: 96 4. KS: 94 5. IL:93 6. CA: 90 7. MA: 81 8. OH: 71 9. GA: 64 10. VA: 51 11. NJ: 43 12. MN: 41 Pennsylvania had an early lead, which it has maintained, and Tennessee is now strong in second place, David J. Rosen Adult Literacy Advocate DJRosen at theworld.com _______________________________________________ AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org http://lists.literacytent.org/mailman/listinfo/aaace-nla LiteracyTent: web hosting, news, community and goodies for literacy http://literacytent.org From djrosen at comcast.net Thu Jun 1 06:39:25 2006 From: djrosen at comcast.net (David Rosen) Date: Thu, 1 Jun 2006 06:39:25 -0400 Subject: [Assessment 341] New Assessment study Message-ID: <24087FF0-0BB0-4A8C-9110-B88D530FB1FC@comcast.net> Assessment colleagues, You may find of interest a new study on assessment practices across Canada by Pat Campbell: Student Assessment in Adult Basic Education: (2006), A Canadian Snapshot From the description on the NALD web page: The purpose of this document is to report on the findings from a national survey on student assessment in adult basic education. This report is one of several outcomes from a national project entitled Assessment Practices in Adult Basic Education. This project will also produce an edited book on assessment practices and three videos. http://library.nald.ca/research/item/5995 David J. 
Rosen djrosen at comcast.net From djrosen at comcast.net Thu Jun 1 06:47:52 2006 From: djrosen at comcast.net (David Rosen) Date: Thu, 1 Jun 2006 06:47:52 -0400 Subject: [Assessment 342] Another Canadian study -- assessment from learners' perspectives Message-ID: Assessment colleagues, Another new Canadian assessment study which may be of interest is "I've opened up" (2006) By: Susan Lefebvre, Patricia Belding, Mary Brehaut, Sarah Dermer, Anne-Marie Kaskens, Emily Lord, Wayne McKay, Nadine Sookermany. Here's a description from the NALD web page: This project explored what constitutes progress in community-based literacy programs from the perspective of learners. The research took place between December 2004 and January 2006. This project explored learners? experiences and understanding of progress and sought to define and articulate this knowledge. The research showed the importance of understanding and valuing the perspectives learners have of their progress. We discovered numerous nonacademic outcomes critical to learners? progress that they associated with adult literacy programs. Learners realized very well what literacy can do for them and value the many nonacademic outcomes they experience in various facets of their lives. The learners? comments also provided insight into the richness and complexity of the learning outcomes and of the interactions between these outcomes, their program environment, the learning process and non-academic learning outcomes they achieved. http://library.nald.ca/research/item/6008 David J. Rosen djrosen at comcast.net From mdick at lagcc.cuny.edu Fri Jun 2 12:21:11 2006 From: mdick at lagcc.cuny.edu (Mae Dick) Date: Fri, 02 Jun 2006 12:21:11 -0400 Subject: [Assessment 343] TABE testing for non-native English speakers Message-ID: <44802D370200000A0019D111@mailgate.lagcc.cuny.edu> Hi. I am posting to this list for the first time. Currently I direct an adult literacy program at LaGuardia Community College in the City University of New York. We serve primarily an immigrant population and offer classes in basic education, GED and ESL. We have large numbers of non-native English speakers in our basic education and GED classes. We have WIA funding and, as you know, are required to administer the TABE test for BE and GED students in reading comprehension in order to measure gain and movement from one NRS level to another. While our students do quite well on the GED Practice test and subsequently on the actual GED exam, their TABE reading scores are often erratic, and we can have students scoring at 6th and 7th grade on the TABE M and D who take and pass the GED exam after scoring well on the GED Predictor. I realize that the TABE reading test has its difficulties and it is criticized in NYC (and probably elsewhere) as a measure of progress. Whatever its limitations may be, my understanding is that the TABE was normed on native English speakers. I was wondering if anyone has had experience and/or knowledge of the appropriateness of the TABE for non-native English speakers? From hdooley at riral.org Sat Jun 3 10:58:53 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Sat, 03 Jun 2006 10:58:53 -0400 Subject: [Assessment 344] Re: TABE testing for non-native English speakers In-Reply-To: <44802D370200000A0019D111@mailgate.lagcc.cuny.edu> References: <44802D370200000A0019D111@mailgate.lagcc.cuny.edu> Message-ID: <4481A3AD.3080708@riral.org> Most ESOL practitioners I know believe the current TABE is not appropriate for non-native English speakers. 
However, the TABE folks are currently piloting a version for non-native speakers. They have an anticipated roll-out date of this fall. You might be able to contact a pilot site to get some initial thoughts and responses. I know one program in RI participated. My program also finds discrepancies between TABE scores, GED OPT's and CASAS scores, in the small number of cases where learners have taken multiple tests. I agree that there seems to be more going on than just the facts that each test tests different skills and knowledge, but I don't know of any research or case studies that look at or try to explain these discrepancies. Let me know if you hear anything. Howard D. Project RIRAL Mae Dick wrote: > Hi. I am posting to this list for the first time. Currently I direct an adult literacy program at LaGuardia Community College in the City University of New York. We serve primarily an immigrant population and offer classes in basic education, GED and ESL. > > We have large numbers of non-native English speakers in our basic education and GED classes. We have WIA funding and, as you know, are required to administer the TABE test for BE and GED students in reading comprehension in order to measure gain and movement from one NRS level to another. While our students do quite well on the GED Practice test and subsequently on the actual GED exam, their TABE reading scores are often erratic, and we can have students scoring at 6th and 7th grade on the TABE M and D who take and pass the GED exam after scoring well on the GED Predictor. I realize that the TABE reading test has its difficulties and it is criticized in NYC (and probably elsewhere) as a measure of progress. Whatever its limitations may be, my understanding is that the TABE was normed on native English speakers. I was wondering if anyone has had experience and/or knowledge of the appropriateness of the TABE for non-native English speakers? > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > > From prwhite at MadisonCounty.NET Mon Jun 5 10:45:51 2006 From: prwhite at MadisonCounty.NET (Patti White) Date: Mon, 5 Jun 2006 09:45:51 -0500 Subject: [Assessment 345] Fw: TABE testing for non-native English speakers Message-ID: <00a001c688ae$bdd9ea60$6501a8c0@PattiAALRC> I shot this question to our state ESL expert and here's his reply: ----- Original Message ----- From: Philip Less To: Patti White Sent: Monday, June 05, 2006 9:26 AM Subject: RE: [Assessment 343] TABE testing for non-native English speakers Hi Patti, Interesting question. I wonder what kind of response she will get from other NY folks or people in the field. I know that TABE is developing and pilot testing a TABE-ESL exam this year. I don't know if it will be accepted by the feds to use in NRS instead of the BEST or TABE or help in the situation mentioned below. Of course, the problem in a nutshell (in the example below) is that the GED tests are not the same as the TABE. In other words, it is not really that surprising (if you think about it) that a student with a 6th or 7th grade reading level can pass the GED. Some people's reading skills in 7th grade are probably as good as they will ever get. The only improvements we tend to make are in the realms of critical thinking (gained with wisdom and experience perhaps) and vocabulary development (gained over time). Philip Dr. 
Philip Less, ESL Program Advisor Arkansas Department of Workforce Education Adult Education Section Three Capitol Mall Little Rock, Arkansas 72201 Tel: 501-682-1970 Fax: 501-682-1706 Email: philip.less at arkansas.gov Web: http://dwe.arkansas.gov -------------------------------------------------------------------------------- From: Patti White [mailto:prwhite at MadisonCounty.NET] Sent: Friday, June 02, 2006 8:13 PM To: Philip Less Subject: Fw: [Assessment 343] TABE testing for non-native English speakers Another one with your name all over it.......Thanks~ Patti Patti White, M.Ed. Arkansas Adult Learning Resource Center Disabilities Project Manager 479 232 5760 / 800 569 3539 v/tty/fax prwhite at madisoncounty.net www.aalrc.org/html/ld/disab2.htm ----- Original Message ----- From: Mae Dick To: assessment at nifl.gov Sent: Friday, June 02, 2006 11:21 AM Subject: [Assessment 343] TABE testing for non-native English speakers Hi. I am posting to this list for the first time. Currently I direct an adult literacy program at LaGuardia Community College in the City University of New York. We serve primarily an immigrant population and offer classes in basic education, GED and ESL. We have large numbers of non-native English speakers in our basic education and GED classes. We have WIA funding and, as you know, are required to administer the TABE test for BE and GED students in reading comprehension in order to measure gain and movement from one NRS level to another. While our students do quite well on the GED Practice test and subsequently on the actual GED exam, their TABE reading scores are often erratic, and we can have students scoring at 6th and 7th grade on the TABE M and D who take and pass the GED exam after scoring well on the GED Predictor. I realize that the TABE reading test has its difficulties and it is criticized in NYC (and probably elsewhere) as a measure of progress. Whatever its limitations may be, my understanding is that the TABE was normed on native English speakers. I was wondering if anyone has had experience and/or knowledge of the appropriateness of the TABE for non-native English speakers? ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060605/ee5dca0e/attachment.html From marie.cora at hotspurpartners.com Tue Jun 6 06:12:12 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 6 Jun 2006 06:12:12 -0400 Subject: [Assessment 346] FW: [AAACE-NLA] Results of Literacy President Questions Survey Message-ID: <005701c68951$ae42fda0$0302a8c0@LITNOW> Dear Colleagues, The following post/update is from David Rosen. marie cora Assessment Discussion List Moderator _____________ AAACE-NLA Colleagues, The Literacy President survey was conducted in April and May, 2006. Participants selected their top eight questions from an online survey of twenty possible questions to ask the candidates for President in 2008. The questions from the survey came primarily from practitioners, in response to a series of requests on the AAACE- National literacy Advocacy electronic list, and also from questions on the 2004 Literacy President survey. 1666 people responded. . Practitioners: 82.7% . Learners (e.g. adult new reader, adult education program graduate) 6.3% . 
College or university students in adult education: 3.8% . Other: 10.8% . Female: 82.2%
The top 12 participating states, in order of the number of participants, were:
1. Pennsylvania: 193
2. Tennessee: 142
3. Washington: 140
4. Massachusetts: 104
5. Kansas: 99
6. Illinois: 97
7. California: 95
8. Ohio: 80
9. Georgia: 65
10. Virginia: 61
11. New Jersey: 52
12. Minnesota: 49
Taking into account both the number of people who selected a given question and its assigned priority, it was difficult to determine the top five questions. Instead, I have listed the top seven questions. Note that questions 6 and 7 were close in number of votes and that they had a higher percentage of people choosing them in their top three priorities.
Top Seven Questions
1. The Working Poor
In 2000, 6.4 million American adults were classified as the "working poor." The majority (three-fifths) worked full-time but remained in poverty. To make a "living wage," many of these adults require further education and training. For many, the lack of a high school diploma and strong literacy skills is a barrier to the training needed to obtain a job with a living wage. What will you do to address this issue?
Total respondents who chose this: 790
%age which chose this as one of top 3 priorities: 43%
2. Priority of Adult Education
As President, will you make adult education and literacy one of your top three educational priorities?
Total respondents who chose this: 710
%age which chose this as one of top 3 priorities: 74%
3. Professional Development and Support for Teachers
Given the importance of adult literacy education and its impact on the workforce, what will you do to ensure sufficient training, salaries, and benefits for adult basic education teachers?
Total respondents who chose this: 683
%age which chose this as one of top 3 priorities: 32%
4. Competitiveness in a World Economy
A recent government survey indicates that 93 million individuals are at risk at home, at work, and in the community because of low levels of literacy. What new investment in adult education will you make to increase access for the unemployed, new immigrants, and other at-risk populations in order to keep our nation competitive?
Total respondents who chose this: 673
%age which chose this as one of top 3 priorities: 51%
5. Adults Left Behind
What role should the federal government take in providing services for adults and out-of-school youth who have been "left behind" by the educational system in their states?
Total respondents who chose this: 652
%age which chose this as one of top 3 priorities: 41%
6. Intergenerational Literacy
Children who do not get an education now will become adults who need literacy skills. Then, as parents, they are unable to help their children with schoolwork. Without strong parental support for education, children of these individuals may also be left behind. What do you see as the role of adult education programs in addressing this intergenerational literacy issue?
Total respondents who chose this: 645
%age which chose this as one of top 3 priorities: 44%
7. Funding
Do you believe that Adult Education and Literacy services (including English language learning and family literacy) should be available to all residents who need and seek those services? If so, are you willing to support an increase in funding that would eliminate long waiting lists for these services?
Total respondents who chose this: 641 %age which chose this as one of top 3 priorities: 46% Comments from participants: There were 135 participants who chose to comment on the questionnaire or its process: . Twenty two said they thought the questions were good, that it was hard to choose, that all the questions are important, that all should be asked. . Nine said the survey was hard to complete, cumbersome, complicated, time-consuming, difficult mechanically. . Nine said we shouldn't ask yes/no questions. One said we shouldn't ask open-ended questions. . Three said the questions should be shorter. . Three objected to the wording of some of the questions, which appeared to them to include illegal immigrants. . Two said the questionnaire was too hard for some students. David J. Rosen Adult Literacy Advocate DJRosen at theworld.com From mdick at lagcc.cuny.edu Tue Jun 6 11:49:38 2006 From: mdick at lagcc.cuny.edu (Mae Dick) Date: Tue, 06 Jun 2006 11:49:38 -0400 Subject: [Assessment 347] Re: Fw: TABE testing for non-native English speakers In-Reply-To: <00a001c688ae$bdd9ea60$6501a8c0@PattiAALRC> References: <00a001c688ae$bdd9ea60$6501a8c0@PattiAALRC> Message-ID: <44856BD20200000A0019D694@mailgate.lagcc.cuny.edu> Thank you. Much appreciated. >>> "Patti White" 06/05/06 10:45 AM >>> I shot this question to our state ESL expert and here's his reply: ----- Original Message ----- From: Philip Less To: Patti White Sent: Monday, June 05, 2006 9:26 AM Subject: RE: [Assessment 343] TABE testing for non-native English speakers Hi Patti, Interesting question. I wonder what kind of response she will get from other NY folks or people in the field. I know that TABE is developing and pilot testing a TABE-ESL exam this year. I don't know if it will be accepted by the feds to use in NRS instead of the BEST or TABE or help in the situation mentioned below. Of course, the problem in a nutshell (in the example below) is that the GED tests are not the same as the TABE. In other words, it is not really that surprising (if you think about it) that a student with a 6th or 7th grade reading level can pass the GED. Some people's reading skills in 7th grade are probably as good as they will ever get. The only improvements we tend to make are in the realms of critical thinking (gained with wisdom and experience perhaps) and vocabulary development (gained over time). Philip Dr. Philip Less, ESL Program Advisor Arkansas Department of Workforce Education Adult Education Section Three Capitol Mall Little Rock, Arkansas 72201 Tel: 501-682-1970 Fax: 501-682-1706 Email: philip.less at arkansas.gov Web: http://dwe.arkansas.gov -------------------------------------------------------------------------------- From: Patti White [mailto:prwhite at MadisonCounty.NET] Sent: Friday, June 02, 2006 8:13 PM To: Philip Less Subject: Fw: [Assessment 343] TABE testing for non-native English speakers Another one with your name all over it.......Thanks~ Patti Patti White, M.Ed. Arkansas Adult Learning Resource Center Disabilities Project Manager 479 232 5760 / 800 569 3539 v/tty/fax prwhite at madisoncounty.net www.aalrc.org/html/ld/disab2.htm ----- Original Message ----- From: Mae Dick To: assessment at nifl.gov Sent: Friday, June 02, 2006 11:21 AM Subject: [Assessment 343] TABE testing for non-native English speakers Hi. I am posting to this list for the first time. Currently I direct an adult literacy program at LaGuardia Community College in the City University of New York. 
We serve primarily an immigrant population and offer classes in basic education, GED and ESL. We have large numbers of non-native English speakers in our basic education and GED classes. We have WIA funding and, as you know, are required to administer the TABE test for BE and GED students in reading comprehension in order to measure gain and movement from one NRS level to another. While our students do quite well on the GED Practice test and subsequently on the actual GED exam, their TABE reading scores are often erratic, and we can have students scoring at 6th and 7th grade on the TABE M and D who take and pass the GED exam after scoring well on the GED Predictor. I realize that the TABE reading test has its difficulties and it is criticized in NYC (and probably elsewhere) as a measure of progress. Whatever its limitations may be, my understanding is that the TABE was normed on native English speakers. I was wondering if anyone has had experience and/or knowledge of the appropriateness of the TABE for non-native English speakers? ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From kabeall at comcast.net Wed Jun 7 12:55:14 2006 From: kabeall at comcast.net (Kaye Beall) Date: Wed, 7 Jun 2006 12:55:14 -0400 Subject: [Assessment 348] New from NIFL & NCSALL-ARCS Video Message-ID: <001301c68a53$262893f0$0202a8c0@your4105e587b6> The National Institute for Literacy (NIFL) and the National Center for the Study of Adult Learning and Literacy (NCSALL) announce the "Adult Reading Components Study (ARCS) Panel," a 30-minute video on NCSALL's ARCS research produced by the Institute. This video is available in streaming format and can be viewed by going to: http://www.nifl.gov/nifl/webcasts/20040204/webcast02-04.html ARCS was the first large-scale attempt to use a battery of individually administered reading and language tests to describe the reading of students enrolled in adult basic education (ABE) and English for speakers of other languages (ESOL) programs. Nearly 1,000 adult learners from 30 learning centers in seven states were assessed in order to develop instructionally relevant cluster profiles of adult readers. The video offers a panel discussion about NCSALL's ARCS research and ways in which programs can use the Assessment Strategies and Reading Profiles, an on-line assessment tool based on the ARCS research, to assess students and plan instruction tailored to their specific profiles. Panel participants are: Dr. John Strucker - Researcher and ARCS Director, NCSALL Dr. Rosalind Davidson - Researcher and ARCS Assistant Director, NCSALL Kay Vaccaro - Program Assistant, Harris County, TX Department of Education, Adult Education Division Jane Meyer - Coordinator, ABLE-funded adult literacy project, Canton, OH David J. Rosen (moderator) - Senior Associate, Newsome Associates, Boston, MA To visit the Assessment Strategies and Reading Profiles ARCS Web site, please go to: http://www.nifl.gov/readingprofiles/ To learn more about the ARCS, please see NCSALL's "Seminar Guide - Reading Profiles" http://www.ncsall.net/?id=597 available from the CPPR section of the NCSALL Web site. 
A NCSALL study circle guide on reading research and teaching materials on reading are available from the Publications section of the NCSALL Web site: http://www.ncsall.net/index.php?id=25 The ARCS video panel introduction is also available free on DVD: Order from NCSALL at http://www.ncsall.net/?id=24 for $5.00/copy (shipping and handling), or send your request to NIFL at info at nifl.gov, and be sure to include your mailing address. The ARCS video is the first in a series of videos based on NCSALL research that are being produced by the National Institute for Literacy. As each video is completed, streaming versions will be posted to the Web, with accompanying announcements on the Institute's listservs and web sites and NCSALL's Web site. Once the entire series is completed, all of the videos will be packaged in a single DVD, which the Institute and NCSALL will make available to the field. The National Institute for Literacy and NCSALL present these videos as introductions to key research topics in adult learning and literacy. We hope the field finds them useful as professional and program development tools. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060607/e2423dce/attachment.html From marie.cora at hotspurpartners.com Mon Jun 12 19:59:23 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 12 Jun 2006 19:59:23 -0400 Subject: [Assessment 349] Guest Discussion next week on Mathematics Message-ID: <008d01c68e7c$3af15060$0302a8c0@LITNOW> Dear Colleagues, I hope this email finds you well. I'm pleased to announce the following Guest Discussion, which will begin on Monday of next week: June 19 - 23, 2006 Topic: Assessment in Mathematics Guest: Myrna Manly - please see Myrna's bio below. Myrna will respond to your email posts once per day - feel free to send your post to the Assessment Discussion List, or to me so that I can post it for you (marie.cora at hotspurpartners.com). Bio Myrna Manly, a mathematics teacher with experience at many academic levels, retired in 2001 from a position as professor of mathematics at El Camino College. In addition to instruction, she has been involved with the assessment of the mathematics proficiency of adults in various roles: as the Mathematics Specialist for the 1988 version of the GED test; as a member of the numeracy team for the Adult Literacy and Lifeskills Survey (ALL); and as the numeracy consultant for a similar international survey to be used in developing countries, the Literacy Assessment and Monitoring Programme (LAMP). She is the Past President of the Adult Numeracy Network (ANN), is the author of The GED Math Problem Solver, and also works with states and programs facilitating staff-development workshops aimed at improving mathematics instruction to adults. Myrna is presently writing a paper with Mary Jane Schmidt and Lynda Ginsburg on the components of numeracy for NCSALL (National Center for the Study of Adult Learning and Literacy). The paper reviews the literature, describes the fundamental elements of adult numeracy, and makes recommendations for further research, particularly with respect to curriculum and assessment. Look for this resource soon from NCSALL. 
Recommended preparations for this discussion Myrna has provided several questions below to get you thinking about math assessment: * It is known that students and teachers come to value what is assessed. What is your opinion of the influence that the standardized mathematics assessments (GED, TABE, CASAS) have in your classrooms? Are they assessing the mathematics that is important for the 21st century? Do you think that they all assess the same mathematics? What do you think is missing from each? * Computation skills are easy to assess. How can we assess other important aspects of mathematics like strategic problem solving, conceptual understanding, and reasoning? * Describe instances where you have seen a student's "math anxiety" interfere with an accurate assessment of his/her abilities. Do you assess math anxiety in any way? What strategies have you used to reduce it? Any luck with them? * Which classroom techniques do you recommend for informal, ongoing assessment of a student's progress in learning mathematics? In addition to the above questions to stimulate discussion, Myrna has provided these sites for math assessment. Please take a look at these sites and post your questions and comments to the Discussion: http://www.literacy.org/products/ncal/pdf/TR9805.pdf Assessing Mathematical Knowledge of Adult Learners: Are We Looking at What Counts? This technical report from NCAL was written by Joy Cumming, Iddo Gal, and Lynda Ginsburg in 1998. It discusses assessment principles and evaluates their implementation in common numeracy assessment tools. http://www.ncsall.net/?id=573 The Inclusion of Numeracy in Adult Basic Education, Dave Tout and Mary Jane Schmitt, 2002. This chapter from NCSALL's annual review contains a section on assessment that includes a description of assessments in adult education from Australia and The Netherlands. http://www.nctm.org/news/assessment/2005_12nb.htm Will This Be on the Test? This article discusses the importance of including significant mathematics on tests. It includes a good example of a test item that goes beyond procedural skills. http://standards.nctm.org/document/chapter2/assess.htm This document in an overview of NCTM's assessment principle for K-12 mathematics. Large-scale surveys of adult skills: Adult Literacy and Lifeskills Survey (ALL) Numeracy Framework (begins on p.137): http://www.statcan.ca/cgi-bin/downpub/listpub.cgi?catno=89-552-MIE200501 3 First results: http://www.statcan.ca/english/freepub/89-603-XIE/2005001/pdf.htm Data Tool: http://litdata.ets.org/ialdata/search.asp National Assessment of Adult Literacy (NAAL) First results: http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2006470 Hard Copy Resource: Adult Numeracy Development: Theory, Policy and Practice, Iddo Gal, ed., 2000. Hampton Press, Inc. This book has a section on numeracy assessment with one article discussing assessment issues and principles using examples from the US and Australia and another article describing the use of "Supermarket Strategy" materials for diagnosing the skills of individual learners in The Netherlands. Thanks everyone, and I'm looking forward to seeing you all next week to chat about math assessment! Marie Cora Moderator NIFL Assessment Discussion List marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060612/78a8b7ff/attachment.html From marie.cora at hotspurpartners.com Wed Jun 14 07:12:42 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 14 Jun 2006 07:12:42 -0400 Subject: [Assessment 350] FW: [ContentStandards 160] Using standards for instruction and assessment Message-ID: <00de01c68fa3$752a8580$0302a8c0@LITNOW> Dear all, There is an interesting discussion happening right now on the Content Standards Discussion List, starting with this thread by Moderator Aaron Kohring from Monday. I won't forward messages, so I encourage you to go join that discussion at: http://www.nifl.gov/mailman/listinfo/contentstandards You can also view the archives for any posts you have missed by going to: http://www.nifl.gov/pipermail/contentstandards/2006/date.html marie cora ______________________________________ Greetings all, I've received a request for feedback from several instructors who would like to hear from other teachers using standards for instruction and assessment. What have been your challenges in using standards in the classroom? What types of support do you need from your program or state? Did you participate in any specific professional development activities? Aaron Aaron Kohring Coordinator, LINCS Literacy & Learning Disabilities Special Collection (http://ldlink.coe.utk.edu/) Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) Coordinator, Equipped for the Future Websites (http://eff.cls.utk.edu/) Center for Literacy Studies, University of Tennessee EFF Center for Training and Technical Assistance Phone:(865) 974-4109 main (865) 974-4258 direct Fax: (865) 974-3857 e-mail: akohring at utk.edu From marie.cora at hotspurpartners.com Wed Jun 14 07:17:40 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 14 Jun 2006 07:17:40 -0400 Subject: [Assessment 351] FW: [Moderators 761] Discussion on Health Lit: June 19, Health Literacy Study Circles+ Message-ID: <00e401c68fa4$26b1e8c0$0302a8c0@LITNOW> Hi again! Lots of interesting things going on on the Lists these days! The Health Literacy List is hosting the following discussion. Moderator Julie McKinney notes: "The Health Literacy Study Circles+ is a method of training teachers which may draw some interesting conversation. It would also relate to ESOL and Family Literacy, given the population of literacy students who participated in this pilot. It is about teaching basic skills with health literacy as the context." This should certainly be of interest to some of you, please take advantage! To subscribe to this Discussion, go to: www.nifl.gov/mailman/listinfo/healthliteracy marie cora ********************************************************** I am happy to announce a discussion next week on the Health Literacy list. We will be talking about the Health Literacy Study Circles+, a series of facilitator guides published by NCSALL, which introduces teachers to a skills-based approach to Health Literacy. Our guest speakers will be Winston Lawrence, senior professional development associate with the Literacy Assistance Center in New York City, and Lisa Soricone, a research associate and former fellow at the National Center for the Study of Adult Learning and Literacy (NCSALL). Together they piloted these study circles with adult learners in New York City, and wrote about their experience through an interview in "Focus on Basics" (FOB). 
Please see this article for more information about the study circles, and to prepare for this discussion: A Conversation with FOB: Learning How to Teach Health Literacy http://www.ncsall.net/index.php?id=995 We look forward to our first formal discussion on the Health Literacy List! Please pass on this information to any colleagues who may be interested, they can join the list (and then unsubscribe afterwards if they wish) at the following link: www.nifl.gov/mailman/listinfo/healthliteracy Julie McKinney Discussion List Moderator World Education/NCSALL jmckinney at worlded.org From cathayreta at sbcglobal.net Thu Jun 15 00:16:08 2006 From: cathayreta at sbcglobal.net (Cathay Reta) Date: Wed, 14 Jun 2006 21:16:08 -0700 (PDT) Subject: [Assessment 352] California Literacy Conference Message-ID: <20060615041608.55616.qmail@web82708.mail.mud.yahoo.com> Hello All, California Literacy will be holding its annual statewide conference in Pasadena, California October 19 ? 21, 2006, and we are looking for some outstanding presenters. Please check out the Call for Presenters form on the website: www.caliteracy.org We are particularly interested in proposals for workshops of benefit to basic literacy tutors and teachers, ESL tutors and teachers, and program administrators. And, we are happy to waive the registration fee for the day you present. If you have any questions, please feel free to contact me offline. Cathay Reta Interim Executive Director California Literacy 1000 E. Walnut St., Suite 201 Pasadena CA 91106 (626) 405-9272 cathayreta at caliteracy.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060614/c5bd0619/attachment.html From marie.cora at hotspurpartners.com Fri Jun 16 09:40:09 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 16 Jun 2006 09:40:09 -0400 Subject: [Assessment 353] FW: Adults Can't Learn to Read Message-ID: <003701c6914a$636c30c0$0302a8c0@LITNOW> Dear Colleagues, I thought the following post from Tom Sticht might be of interest to some of you. marie cora _________________________ June 12, 2006 Theoretically You Can't Teach Adults to Read and Write: But Just Keep On Doing It Tom Sticht International Consultant in Adult Education Why is it so hard to get funding for adult literacy education? Innumerable studies, reports, TV shows, and statistical surveys in most of the industrialized nations of the world declare that their nation is being brought to its economic knees because of widespread low basic skills (literacy, numeracy) amongst the adult population. But repeated calls for funding commensurate with the size of the problem go unanswered. Why? Beneath the popular pronouncements of educators, industry leaders, and government officials about the importance of adult basic skills development there flows an undercurrent of disbelief about the abilities of illiterates or the poorly literate to ever improve much above their present learning. This was encountered close to a hundred years ago when Cora Wilson Stewart started the Moonlight Schools of Kentucky in 1911. Her claim that adults could learn to read and write met with skepticism. As she reported, Quote: "Some educators, however, declared preposterous the claims we made that grown people were learning to read and write. It was contrary to the principles of psychology, they said." 
End Quote Today that undercurrent of disbelief still flows, but today it carries with it the flotsam and jetsam of "scientific facts" from genetics science, brain science, and psychological science. Look here at objects snatched from the undercurrent of disbelief stretching back for just a decade and a half. 2006. Ann Coulter is a major voice in the conservative political arena. In her new book, Godless: The Church of Liberalism (Chapter 7 The Left's War on Science: Burning Books to Advance "Science" pages 172-174) she clearly defends the ideas given in Murray & Hernstein's book The Bell Curve regarding the genetic basis of intelligence. By extension, since The Bell Curve uses reading and math tests in the Armed Forces Qualification Test (AFQT), Coulter is discussing the genetic basis of literacy and numeracy. In her book she says about The Bell Curve book: Quote: "Contrary to the party line denying that such a thing as IQ existed, the book methodically demonstrated that IQ exists, it is easily measured, it is heritable, and it is extremely important. .Among many other things, IQ is a better predictor than socioeconomic status of poverty, unemployment, criminality, divorce, single motherhood, workplace injuries, and high school dropout rates. .Although other factors influence IQ, such as a good environment and nutrition, The Bell Curve authors estimated that IQ was about 40 to 80 percent genetic." (p. 173) End Quote Coulter goes on to discuss the misuse of science in the same chapter in relation to AIDS and homosexuality, feminism, trial-lawyers law suits, DDT and environmentalists, abortion and stem cell research, and other topics that are controversial among large segments of the population but of mainstream concern in the far right conservative base in the United States. Because of her position as a best-selling author and spokesperson for conservative groups, Ann Coulter's ideas about the genetic basis of intelligence and high school dropouts can have a profound impact upon political thinking about basic skills education among adults who have not achieved well. 2005. The Nobel Prize winning economist James J. Heckman in an interview at the Federal Reserve Bank region in Chicago discussed his ideas about cognitive skills and their malleability in later life with members of a presidential commission consisting of former U.S. senators, heads of federal agencies, tax attorneys and academic economists. Later in his interview he discusses what Adam Smith, in his The Wealth of Nations said and why he, Heckman, disagrees with Smith. According to Heckman, Adam Smith said, Quote: ". people are basically born the same and at age 8 one can't really see much difference among them. But then starting at age 8, 9, 10, they pursue different fields, they specialize and they diverge. In his mind, the butcher and the lawyer and the journalist and the professor and the mechanic, all are basically the same person at age 8." End Quote Heckman disagrees with this and says: Quote: This is wrong. IQ is basically formed by age 8, and there are huge differences in IQ among people. Smith was right that people specialize after 8, but they started specializing before 8. On the early formation of human skill, I think Smith was wrong, although he was right about many other things. . I think these observations on human skill formation are exactly why the job training programs aren't working in the United States and why many remediation programs directed toward disadvantaged young adults are so ineffective. 
And that's why the distinction between cognitive and noncognitive skill is so important, because a lot of the problem with children from disadvantaged homes is their values, attitudes and motivations. .Cognitive skills such as IQ can't really be changed much after ages 8 to 10. But with noncognitive skills there's much more malleability. That's the point I was making earlier when talking about the prefrontal cortex. It remains fluid and adaptable until the early 20s. That's why adolescent mentoring programs are as effective as they are. Take a 13-year-old. You're not going to raise the IQ of a 13-year-old, but you can talk the 13-year-old out of dropping out of school. Up to a point you can provide surrogate parenting. End Quote Here Heckman seems to think of the IQ as something relatively fixed at an early age and not likely to be changed later in life. But if IQ is measured in The Bell Curve, a book in which Heckman found some merit, using the AFQT, which in turn is a literacy and numeracy test, then this would imply that Heckman thinks the latter may not be very malleable in later life. This seems consistent with his belief that remediation programs for adults are ineffective and do not make very wise investments. 2000. It is easy to slip from talking about adults with low literacy ability to talking about adults with low intelligence. On October 2, 2000, Dan Seligman, columnist at Forbes magazine, wrote about the findings of the National Adult Literacy Survey (NALS) of 1993 and said, Quote: "But note that what's being measured here is not what you've been thinking all your life as "literacy. " The cluster of abilities being examined is obviously a proxy for plain old "intelligence." End Quote He then goes on to argue that government programs won't do much about this problem of low intelligence, and, by extension, of low literacy. These types of popular press articles can stymie funding for adult literacy education. That is one reason why it is critical that when national assessments of cognitive skills, including literacy, are administered, we need to be certain about just what it is we are measuring. Unfortunately, that is not the case with the 1993 NALS or the more recent 2003 National Assessment of Adult Literacy (NAAL). These assessments leave open the possibility of being called "intelligence" tests leading some, like Seligman, to the general conclusion that the less literate are simply the less intelligent and society might as well cast them off - their "intelligence genes" will not permit them to ever reach Level 3 or any other levels at the high end of cognitive tests. 1998. Dr. G. Reid L yon of the National Institute of Child Health and Human Development provided an Overview of Reading and Literacy Initiatives to the U. S. Congress Committee on Labor and Human Resources on April 28, 1998. In his testimony he stated that in learning to read it is important for children to possess good abilities in phonemic analysis. He stated: Quote: Difficulties in developing phoneme awareness can have genetic and neurobiological origins or can be attributable to a lack of exposure to language patterns and usage during the preschool years.. It is for this reason that the National Institute of Child Health and Human Development (NICHD) within the National Institutes of Health (NIH) considers reading failure to reflect not only an educational problem, but a significant public health problem as well. 
Within this context, a large research network consisting of 41 research sites in North America, Europe, and Asia are working hard to identify (1) the critical environmental, experiential, cognitive, genetic, neurobiological, and instructional conditions that foster strong reading development; (2) the risk factors that predispose youngsters to reading failure; and (3) the instructional procedures that can be applied to ameliorate reading deficits at the earliest possible time. End Quote Discussing why some children may have difficulties learning to read, Lyon went on to say: Quote: Children raised in poverty, youngsters with limited proficiency in English, children with speech and hearing impairments, and children from homes where the parent's reading levels are low are relatively predisposed to reading failure. Likewise, youngsters with sub-average intellectual capabilities have difficulties learning to read, particularly in the reading comprehension domain. End Quote Taken together, these statements by a senior government scientist advisor to both the President and the Congress of the United States indicates that the NICHD considers that in some cases low literacy may result from genetic, neurological, sub-average intellectual capability or a combination of these and other factors. Again, this may contribute to wide-spread beliefs that adults with low literacy may possess faulty genes, brains, and/or intellectual abilities and are unlikely to benefit from adult literacy education programs. From a policy perspective, then, policymakers may think that funding such programs may be regarded as a poor use of public funds. 1997. In a January 7, 1997 article in the Washington Times , a prominent newspaper published in Washington DC and read by many members of Congress, columnist Ken Adelman wrote: Quotes: The age-old nature vs. nurture debate assumes immediacy as the new Congress and new administration gin up to address such issues as poverty, crime, drugs, etc. .This, the most intellectually intriguing debate around, is moving far toward nature (and far from nurture) with new evidence presented by an odd pair - gay activist Chandler Burr and conservative scholar Charles Murray. .In brief, their new findings show that 1) homosexuality and 2) educational-economic achievement are each largely a matter of genes - not of upbringing. .If true, as appears so, the scope of effective government programs narrows. Fate, working through chromosomes, bestows both sexual orientation and brainpower, which shape one's life and success. Little can be altered - besides fostering tolerance and helping in any narrow window left open - through even an ideally designed public program. (page B-6) End Quotes The juxtaposition of homosexuals and those of lower educational and economic achievement is an obvious rhetorical device meant to stir negative emotions about both groups, This is a rhetorical device brought back into play by Coulter in her 2006 book cited above. 1991. One of the beliefs in our culture is that the brain and its intellectual capacity is developed in early childhood. There is a widespread belief that if children's early childhood development is not properly stimulated, then there is likely to be intellectual underdevelopment leading to academic failures, low aptitude, and social problems such as criminal activity, teenage pregnancy and welfare. It will be difficult if not impossible to overcome the disadvantages of deficiencies in early childhood stimulation later in adulthood. 
So why invest much in adult education? We need instead to put billions of dollars into early childhood education. That these beliefs about the consequence of early childhood development are widespread is revealed by articles written by prominent journalists in major newspapers. For instance, on Sunday, October 13, 1991 the San Diego Union newspaper reprinted an article by Joan Beck, a columnist for the Chicago Tribune , that argued for early childhood education because, Quote: "Half of adult intellectual capacity is already present by age 4 and 80 percent by age 8, ... the opportunity to influence [a child's] basic intelligence - considered to be a stable characteristic by age 17 - is greatest in early life." End Quote A year earlier in the same newspaper on October 14, 1990 an adult family literacy educator was quoted as saying, Quote: "Between the ages of zero to 4 we have learned half of everything we'll ever learn in our lives. Most of that has to do with language, imagination, and inquisitiveness." End Quote This doesn't hold out much hope for the adults in family literacy programs. Joan Beck was quoting research by Benjamin Bloom in the 1960s. But Bloom did not show that half of one's intellect was achieved by age 4. Rather, he argued that IQ at age 4 was correlated +.70 with IQ at age 17. Since the square of .7 is .49, Bloom stated that half of the variance among a group of adults' IQ scores at age 17 could be predicted from their group of scores at age 4. But half of the variability among a group of people's IQ scores is a long way from the idea that half of a given person's IQ is developed by age 4. This is not even conceptually possible because for one thing there is no universally agreed to understanding of what "intelligence" is. Further, even if we could agree on what "intelligence" is, there is no such thing as "half of one's intellect" because no one knows what 0 or 100 percent intelligence is. Without knowing the beginning and end of something we can't know when we have half of it. 1990. A report by the Department of Defense shows how these beliefs about the possibility of doing much for adults can affect government policy. After studying the job performance and post-service lives of "lower aptitude," less literate personnel, the report claimed that they had been failures both in and out of the military. Then, on February 24, 1990, the Director of Accession Policy of the Department of Defense commented in the Washington Post newspaper, Quote: "The lesson is that low-aptitude people, whether in the military or not, are always going to be at a disadvantage. That's a sad conclusion." End Quote A similar report of the Department of Defense study was carried in the New York Times of March 12, 1990. Then on April 8, 1990 Jack Anderson's column in the Washington Post quoted one of the Department of Defense researchers saying, Quote: "...by the age of 18 or 19, it's too late. The school system in early childhood is the only place to really help, and that involves heavy participation by the parents." End Quote Regarding the news articles about the Department of Defense studies of "low aptitude" troops, the conclusions were based on analyses of the job performance of hundreds of thousands of personnel in both the 1960s and 1980s with Armed Forces Qualification Test (AFQT) scores between the 10th and the 30th percentiles, the range of scores which the Department of Defense studies called "low aptitude." 
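To make the arithmetic behind the Bloom correlation discussed a few paragraphs above concrete, here is a minimal sketch in Python; the 0.70 correlation between age-4 and age-17 IQ is the figure cited in the passage, and everything else is purely illustrative.

    # r is the correlation Bloom reported between IQ at age 4 and IQ at age 17.
    r = 0.70

    # The squared correlation is the share of the group's variance in age-17
    # scores that can be predicted from age-4 scores. It describes spread
    # across a group, not a fraction of any individual's ability that is
    # already "in place" at age 4.
    shared_variance = r ** 2                     # 0.49
    unexplained_variance = 1 - shared_variance   # 0.51

    print(f"variance in age-17 scores predictable from age-4 scores: {shared_variance:.2f}")
    print(f"variance left unexplained by age-4 scores: {unexplained_variance:.2f}")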
But contrary to what the Department of Defense researchers and accession policy maker stated, the actual data show that in both time periods, while the low aptitude personnel did not perform quite as well as those personnel with aptitudes above the 30th percentile, over 80 percent of the low aptitude personnel did, in fact, perform satisfactorily and many performed in an outstanding manner. As veterans they had employment rates and earnings far exceeding their rates and earnings at the beginning of the study. Further investigation by the media would have revealed these discrepancies between what the Department of Defense's researchers said and what the actual findings were. But as it stands, these popular media types of stories reinforce the stereotypes about adults with who score low on intelligence or aptitude tests and perform poorly on tests of the basic skills of literacy and numeracy. We can find these pieces of scientific debris all the way back to the Moonlight Schools of 1911. Following her account of those educators and academics who declared that teaching grown people to read and write was contrary to the principles of psychology, Cora Wilson Stewart said, Quote: While they went around saying it couldn't be done, we went on doing it. We asked the doubters this question, "When a fact disputes a theory, is it not time to discard the theory? There was no reply. End Quote Today when we ask why the funding for adult literacy education is so little so late, there is still no reply. So we just keep on teaching adults to read and write. And we do it on the cheap, even though it is theoretically impossible. Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd. El Cajon, CA 92019-2059 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net From marie.cora at hotspurpartners.com Fri Jun 16 09:56:29 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 16 Jun 2006 09:56:29 -0400 Subject: [Assessment 354] WE LEARN Newsletter Message-ID: <004701c6914c$ab2f3ea0$0302a8c0@LITNOW> The following is posted on behalf of Mev Miller of WE LEARN Women Expanding: Literacy Education Action Resource Network ________________________________ Looking for some good summer reading? Have a look at the news & updates from WE LEARN...Please be sure to check out these links... April 2006 Newsletter: http://www.litwomen.org/news/06april.pdf New titles to our database: http://www.litwomen.org/learnmats/06new.html Annual Conference -- 2006 Report & 2007 early news: http://www.litwomen.org/conference.html Excepts from the 1st issue of Women's Perspectives - Health & Wellness student writing Initiative also available: http://www.litwomen.org/perspectives/2006/06excerpts.pdf For complete details about Women's Perspectives go to: http://www.litwomen.org/perspectives.html We also invite you to get involved in our many projects: http://www.litwomen.org/projects.html 2005 Annual Report: http://www.litwomen.org/reports/2005annual.pdf Enjoy your summer. WE LEARN Women Expanding: Literacy Education Action Resource Network www.litwomen.org/welearn.html Mev Miller, Ed.D., Director 182 Riverside Ave. Cranston, RI 02910 401-383-4374 welearn at litwomen.org From marie.cora at hotspurpartners.com Fri Jun 16 11:49:56 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 16 Jun 2006 11:49:56 -0400 Subject: [Assessment 355] Reminder: Mathematics Discussion begins Monday! Message-ID: <007201c6915c$854d4410$0302a8c0@LITNOW> Dear everyone, I hope this email finds you well. 
I just wanted to remind you all that next week we will host Myrna Manly as our guest discussing assessment in math. I've received a few posts already from folks for next week - keep them coming or feel free to post yourself! If you haven't yet checked out any of the resources that Myrna has suggested, please do so - they are really great! Looking forward to seeing you all during next week's discussion, Thanks! marie cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060616/0ed65e26/attachment.html From marie.cora at hotspurpartners.com Sun Jun 18 10:13:33 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 18 Jun 2006 10:13:33 -0400 Subject: [Assessment 356] Math Discussion Guest on Women & Lit Message-ID: <001901c692e1$6274b370$0302a8c0@LITNOW> Dear colleagues, The following announcement comes from Daphne Greenberg, Moderator of the Women and Literacy Discussion List. She is hosting a discussion on adult numeracy and women with guest Judy Ward next week. Daphne and I will not be cross-posting emails, so I highly encourage you to sign on to the Women and Literacy List so you can participate in this joint math discussion (you can also choose to visit the archives). To sign on to the adult numeracy and women discussion, go to: http://www.nifl.gov/mailman/listinfo/Womenliteracy See you next week for discussions on Assessment in Mathematics and Adult Numeracy and Women. Marie Cora Assessment Discussion List Moderator _____________________________________________ I am pleased to announce that from Monday June 19th through Friday June 30th, Judy Ward, as a guest facilitator on this listserv, will lead a discussion on adult numeracy and women. Here is her introduction: Previous to earning my doctorate in adult education, I was a seventh grade mathematics teacher. Many of my students had experiences that manifested as mathematics anxiety. I found that by changing my teaching techniques to a more hands-on and visual approach and teaching for understanding, the anxiety level decreased drastically. About ten years ago I stumbled into adult education and found there was a need for my expertise. I am in the second year of a two year term as President Elect of the Adult Numeracy Network (ANN). ANN is dedicated to upgrading mathematics instruction for the adult learner. Once again, the discussion will be conducted from Monday June 19th through Friday June 30th. You may want to encourage friends and colleagues to join for this discussion (they can subscribe at: http://www.nifl.gov/mailman/listinfo/Womenliteracy >From now until the 19th, here are some questions that you may want to think about: When you think about yourself, how would you answer the following: How do you feel about math? Do you remember any situations both in and out of school that affected how you feel about math? Which mathematical topics/concepts would you say are your best? Which mathematical topics/concepts would you say are your most difficult? Do you or have you experienced "math anxiety"? If you have or currently teach your students math, how would you answer the following: How do most of your students feel about math? Have they ever shared any situations both in and out of school that affected how they feel about math? Which mathematical topics/concepts would you say are the easiest for your students to learn? 
Which mathematical topics/concepts would you say are the most difficult for your students to learn? Have any students shared with you that they experience "math anxiety"? Judy will be joining us on the 19th of June and looks forward to your comments and reactions. Daphne Daphne Greenberg Assistant Professor Educational Psych. & Special Ed. Georgia State University P.O. Box 3979 Atlanta, Georgia 30302-3979 phone: 404-651-0127 fax: 404-651-4901 dgreenberg at gsu.edu From jataylor at utk.edu Sun Jun 18 12:00:57 2006 From: jataylor at utk.edu (jataylor) Date: Sun, 18 Jun 2006 12:00:57 -0400 Subject: [Assessment 357] Math anxiety and assessment Message-ID: <44A80ED0@webmail.utk.edu> Hello Myrna and All ~ Myrna, thanks for joining us for this discussion! Sounds like it will be a good week ahead. I have a question for you and the group. I believe math anxiety can be a clear barrier for some students in seeing themselves as successful at math (self-efficacy), and ultimately inhibiting their ability to make progress with math. What strategies do you use to help learners reduce math anxiety? How do you know (assess) whether or not the strategy does indeed reduce anxiety? (Of course, there's simply asking them what they thought of the activity.) But are there also quick tools or other formative assessments you use to gauge student comfort with learning math? I've asked a similar question on the Women and Literacy List. I'd be glad to collect any strategies or resources posted to either list (pertaining to math anxiety) and share them with the group when the discussion is over. Best, Jackie Jackie Taylor, Adult Literacy Professional Development List Moderator, jataylor at utk.edu From marie.cora at hotspurpartners.com Mon Jun 19 08:42:44 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 19 Jun 2006 08:42:44 -0400 Subject: [Assessment 358] LD and math Message-ID: <000501c6939d$dcd8ebb0$0902a8c0@LITNOW> Hi, I have one for Mrs. Manly: My student has a learning disability with math. He is in his thirties and made 500's on all his GED tests except writing, which was a 420. He passed. But he cannot get past 400 on math. He does not respond with logical methods when doing problems. He cannot make change, so he cannot pass a math test for employment either. This is sad. This has been going on for two years. I have started giving him worksheets with the answers, and he has to set the problems up and find how I got them. Do you think this reversal will help him? What more can I do? He and I are both frustrated. What seems so easy to me is impossible for him. What he knew yesterday he cannot do today. I teach at Halifax Community College (NC) FT and at SVCC in Emporia (VA) PT. Brenda Cousins From djrosen at comcast.net Mon Jun 19 09:30:36 2006 From: djrosen at comcast.net (djrosen at comcast.net) Date: Mon, 19 Jun 2006 13:30:36 +0000 Subject: [Assessment 359] Adult Literacy Education (ALE) Wiki Message-ID: <061920061330.14293.4496A6FB000D4DBB000037D52206998499020A9C019D060B@comcast.net> Colleagues, The Adult Literacy Education (ALE) Wiki now has 30 topics, a newly designed front page, over 730 pages of content, and more than 600 subscribers. Every week adult literacy educators add new content. The ALE Wiki is a community of practice and a professional development treasure house. Check it out -- or visit again -- at: http://wiki.literacytent.org/index.php/Main_Page For some of the topic areas we still need Topic Area Leaders. If you are interested in learning more about this, please email me. David J.
Rosen djrosen at comcast.net From marie.cora at hotspurpartners.com Mon Jun 19 10:51:23 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 19 Jun 2006 10:51:23 -0400 Subject: [Assessment 360] Assessing abstract math concepts Message-ID: <000901c693af$d61c8130$0902a8c0@LITNOW> Dear Everyone, I would like to welcome Myrna Manly to our List this week. A reminder that Myrna will be responding to our posts once per day. I am very interested in hearing what you all, as well as Myrna, have to say about the following prep question that Myrna provided: * Computation skills are easy to assess. How can we assess other important aspects of mathematics like strategic problem solving, conceptual understanding, and reasoning? How do folks approach assessing these skills? Do you use any existing tests or assessments to do this? Which ones? Do you develop your own? What do they look like? What are some of the things you do in the classroom to try and gauge a person's knowledge and abilities of the more abstract concepts in math? Thanks! Marie Cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060619/78d26c61/attachment.html From Tina_Luffman at yc.edu Mon Jun 19 14:14:53 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Mon, 19 Jun 2006 11:14:53 -0700 Subject: [Assessment 361] Re: Math anxiety and assessment Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060619/4dbc0e64/attachment.html From juddia at sbcglobal.net Mon Jun 19 16:26:40 2006 From: juddia at sbcglobal.net (Judith Diamond) Date: Mon, 19 Jun 2006 13:26:40 -0700 (PDT) Subject: [Assessment 362] Re: Math anxiety and assessment In-Reply-To: Message-ID: <20060619202640.86038.qmail@web80824.mail.yahoo.com> Just a brief comment about Math and LD. If skills are taught through connections both with each other and daily life (one we all use, money and percentages ) and students need to practice activities that integrate those skills (such as making a budget, creating graphs and charts for that budget -- including percentages, fractions, percents and addition, mult., div., sub.), then using one skill will generate memory of others. ----- Original Message ---- From: Tina_Luffman at yc.edu To: The Assessment Discussion List Sent: Monday, June 19, 2006 1:14:53 PM Subject: [Assessment 361] Re: Math anxiety and assessment Hi Jackie, I had a similar student in my GED class who would understand algebra, and then would forget percentages. Then she would relearn percentages and forget algebra, for example. She continued taking the GED test and getting 380, 400, 370, etc. She had been with our GED program for almost three years. What finally helped her were 1) the instructor creating lots of additional problems for her to practice beyond what was in the book 2) Skills Tutor software which she did additional study on at home and through the entire summer and 3) one of her classmates started peer tutoring with her after passing the GED exam. For some reason students sometimes can explain concepts better to another student than we can. Students sometimes have the same challenges with learning the concepts, and having this additional support really helped my student reach the 410 mark. I hope this helps. I know how it feels to wonder if a student will ever pass the GED test. 
:) I believe that no matter how fancy we get with learning tools, there is no substitute for practice, practice, practice, good old fashioned determination, and a lot of support and confidence from someone who believes in them. Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060619/63e9e28c/attachment.html From mmanly at earthlink.net Mon Jun 19 23:23:22 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Mon, 19 Jun 2006 20:23:22 -0700 Subject: [Assessment 363] Re: Math anxiety and assessment In-Reply-To: <44A80ED0@webmail.utk.edu> Message-ID: Jackie said: I believe math anxiety can be a clear barrier for some students in seeing themselves as successful at math (self-efficacy), and ultimately inhibiting their ability to make progress with math. What strategies do you use to help learners reduce math anxiety? How do you know (assess) whether or not the strategy does indeed reduce anxiety? (of course there's simply asking them what they thought of the activity.) But are there also quick tools or other formative assessments you use to gauge student comfort with learning math? Tina responded: I know how it feels to wonder if a student will ever pass the GED test. :) I believe that no matter how fancy we get with learning tools, there is no substitute for practice, practice, practice, good old fashioned determination, and a lot of support and confidence from someone who believes in them. ---------------------------------------------------------------------------- I will add that, in my experience, math anxiety varies with each person who suffers it and teachers need many different interventions to try in order to conquer it. Certainly math-anxious students need work to reverse their negative perception of their math abilities. Many can do it by opening up about their feelings and their understanding of math with a supportive and knowledgeable person. However, we need to recognize that those feelings most often have a basis in their experiences with math. They may be the victims of teachers who thought that there was only one way to find an answer or to solve a problem. Often students memorize procedures without understanding what each step does and why it works that way. It's no wonder that they panic at the thought of having to remember the procedure when they are under the pressure of taking a test! I'd like to start a list of interventions or techniques that you all can add to during the week. - For computation problems, (including all operations with whole numbers, fractions and percents) work on estimating an answer. Use different approaches to estimating so that students recognize that there is not just one 'rule' to follow. - Insist that your student talks about his/her reasoning while solving a simple problem that you know she/he can do. (It may be easier to begin by sharing with another student.) You want students to know that their reasoning is as important as getting the right answers. 
- Judith Diamond mentioned practicing computation with real-life familiar situations that can help them to understand what they are doing as well as providing a reference when trying to remember how they did it. It's your turn - what can you add? Myrna From mmanly at earthlink.net Mon Jun 19 23:23:22 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Mon, 19 Jun 2006 20:23:22 -0700 Subject: [Assessment 364] Re: LD and math In-Reply-To: <000501c6939d$dcd8ebb0$0902a8c0@LITNOW> Message-ID: Brenda, You describe a very difficult situation. I really have no particular expertise in dealing with math learning disabilities. The GEDTS does allow some accommodations for those with documented disabilities - Have you tried to qualify him for that? Are there others who can give Brenda some advice? Myrna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, June 19, 2006 4:43 AM To: Assessment at nifl.gov Subject: [Assessment 358] LD and math Hi, I have one for Mrs. Manley: My student has a learning disability with math. He is in his thirties, made 500's on all his GED tests except writing which was a 420. HE Passed. But he cannot get passed 400 on math. He does not respond with logical methods when doing problems. He cannot make change so he cannot pass a math test for employment either. This is sad. This has been going on for two years. I have started giving him worksheets with the answers and he has to set it up and find how I got it. Do you think this reversal will help him? what more can I do??? He and I are both frustrated./ What seems so easy to me is impossible for him. What he know yesterday he cannot do today. I teach at Halifax Community College (NC) FT and at SVCC in Emporia (VA) PT. Brenda Cousins ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From Tina_Luffman at yc.edu Mon Jun 19 23:32:48 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Mon, 19 Jun 2006 20:32:48 -0700 Subject: [Assessment 365] Re: Math anxiety and assessment Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060619/ae11e414/attachment.html From ginsburg at rci.rutgers.edu Tue Jun 20 10:31:58 2006 From: ginsburg at rci.rutgers.edu (Lynda Ginsburg) Date: Tue, 20 Jun 2006 10:31:58 -0400 Subject: [Assessment 366] Re: LD and math In-Reply-To: References: Message-ID: <449806DE.5020708@rci.rutgers.edu> Brenda and others, Regarding LD and adult math learning, there is a very helpful book chapter titled, "Teaching mathematics to adults with specific learning difficulties." It is in the book, "Adult Numeracy Development" (2000) edited by Iddo Gal and published by Hampton Press. The authors, Martha Sacks and Dorothy Cebula, are two individuals with special education/disability credentials and a particular interest in adults. The authors describe a number of different types of learning difficulties relating to numeracy learning, provide suggestions for identifying them (informally, not using validated assessment instruments) and suggest appropriate strategies. Lynda -- Lynda Ginsburg Senior Research Associate, MetroMath Rutgers University 118 Frelinghuysen Road Piscataway, NJ 08854 Tel: 732-445-1409 Fax: 732-445-2894 Myrna Manly wrote: >Brenda, > >You describe a very difficult situation. 
I really have no particular >expertise in dealing with math learning disabilities. The GEDTS does allow >some accommodations for those with documented disabilities - Have you tried >to qualify him for that? > >Are there others who can give Brenda some advice? > >Myrna > > >-----Original Message----- >From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On >Behalf Of Marie Cora >Sent: Monday, June 19, 2006 4:43 AM >To: Assessment at nifl.gov >Subject: [Assessment 358] LD and math > > >Hi, I have one for Mrs. Manley: > >My student has a learning disability with math. He is in >his thirties, made 500's on all his GED tests except >writing which was a 420. HE Passed. But he cannot get >passed 400 on math. He does not respond with logical >methods when doing problems. He cannot make change so he >cannot pass a math test for employment either. This is >sad. This has been going on for two years. > >I have started giving him worksheets with the answers and >he has to set it up and find how I got it. Do you think >this reversal will help him? what more can I do??? He and >I are both frustrated./ What seems so easy to me is >impossible for him. What he know yesterday he cannot do >today. > >I teach at Halifax Community College (NC) FT and at SVCC >in Emporia (VA) PT. > >Brenda Cousins > >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to >http://www.nifl.gov/mailman/listinfo/assessment > > >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > > From ginsburg at rci.rutgers.edu Tue Jun 20 10:49:05 2006 From: ginsburg at rci.rutgers.edu (Lynda Ginsburg) Date: Tue, 20 Jun 2006 10:49:05 -0400 Subject: [Assessment 367] Re: Math anxiety and assessment In-Reply-To: References: Message-ID: <44980AE1.50503@rci.rutgers.edu> One strategy I have used for bringing issues of fear, anxiety, and negative self-image regarding math to the fore for open discussion is creating a "math history graph." Each person creates his/her own graph, with school grade levels (K-12) and then adult years across the horizontal axis and 1-10 on the vertical axis. They then draw a line graph charting how they felt about math ("On a scale of 1 to 10, how did you feel about math at each grade?"). All my learners have at least one deep dip and maybe some shallow ones. We all (me included -- I sometimes go first) explain our graphs and tell the stories that go with the dips. My own negative stories are pretty minor, but I think everyone's feelings are validated when we all participate. Often there are opportunities for most to see that feelings were caused by events and the situations don't have to be repeated. Often the stories are, "I didn't understand and the teacher wouldn't explain it again" or "I didn't want to ask again..." or "I couldn't pass the time test..." Lynda -- Lynda Ginsburg Senior Research Associate, MetroMath Rutgers University 118 Frelinghuysen Road Piscataway, NJ 08854 Tel: 732-445-1409 Fax: 732-445-2894 Myrna Manly wrote: >Jackie said: > I believe math anxiety can be a clear barrier for some students in >seeing themselves as successful at math (self-efficacy), and ultimately >inhibiting their ability to make progress with math. 
What strategies do you >use to help learners reduce math anxiety? How do you know (assess) whether >or not the strategy does indeed reduce anxiety? (of course there's simply >asking them what they thought of the activity.) But are there also quick >tools or other formative assessments you use to gauge student comfort with >learning math? > >Tina responded: > I know how it feels to wonder if a student will ever pass the GED >test. :) I believe that no matter how fancy we get with learning tools, >there is no substitute for practice, practice, practice, good old fashioned >determination, and a lot of support and confidence from someone who believes >in them. >---------------------------------------------------------------------------- > > I will add that, in my experience, math anxiety varies with each >person who suffers it and teachers need many different interventions to try >in order to conquer it. Certainly math-anxious students need work to >reverse their negative perception of their math abilities. Many can do it by >opening up about their feelings and their understanding of math with a >supportive and knowledgeable person. > > However, we need to recognize that those feelings most often have a basis >in their experiences with math. They may be the victims of teachers who >thought that there was only one way to find an answer or to solve a problem. >Often students memorize procedures without understanding what each step does >and why it works that way. It's no wonder that they panic at the thought of >having to remember the procedure when they are under the pressure of taking >a test! > >I'd like to start a list of interventions or techniques that you all can add >to during the week. > - For computation problems, (including all operations with whole >numbers, fractions and percents) work on estimating an answer. Use different >approaches to estimating so that students recognize that there is not just >one 'rule' to follow. > - Insist that your student talks about his/her reasoning while >solving a simple problem that you know she/he can do. (It may be easier to >begin by sharing with another student.) You want students to know that their >reasoning is as important as getting the right answers. > - Judith Diamond mentioned practicing computation with real-life >familiar situations that can help them to understand what they are doing as >well as providing a reference when trying to remember how they did it. > >It's your turn - what can you add? > >Myrna > > > > >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > > From lmullins89 at yahoo.com Tue Jun 20 11:01:44 2006 From: lmullins89 at yahoo.com (Lisa Mullins) Date: Tue, 20 Jun 2006 08:01:44 -0700 (PDT) Subject: [Assessment 368] Re: Math anxiety and assessment In-Reply-To: Message-ID: <20060620150144.18178.qmail@web30214.mail.mud.yahoo.com> Myrna, In your book The Problem Solver you tackle algebraic concepts in the very beginning of the book. This is in contrast to many books on the market. I use this technique as well. My students are caught by the fact that algebra (a scary term for some) is so simple and can be used for many reasons. However, some people are skeptical that this will result in better scores or better understanding. 
Can you discuss the contrasts of learning math beginning with whole numbers and working up to algrebra versus using algebra as a problem solving method with all number systems throughout the math learning process. Are the results better scores sooner? Thanks, Lisa Mullins Hawkins County Adult Ed Rogersville, Tennessee --- Myrna Manly wrote: > Jackie said: > I believe math anxiety can be a clear barrier for > some students in > seeing themselves as successful at math > (self-efficacy), and ultimately > inhibiting their ability to make progress with math. > What strategies do you > use to help learners reduce math anxiety? How do you > know (assess) whether > or not the strategy does indeed reduce anxiety? (of > course there's simply > asking them what they thought of the activity.) But > are there also quick > tools or other formative assessments you use to > gauge student comfort with > learning math? > > Tina responded: > I know how it feels to wonder if a student will > ever pass the GED > test. :) I believe that no matter how fancy we get > with learning tools, > there is no substitute for practice, practice, > practice, good old fashioned > determination, and a lot of support and confidence > from someone who believes > in them. > ---------------------------------------------------------------------------- > > I will add that, in my experience, math anxiety > varies with each > person who suffers it and teachers need many > different interventions to try > in order to conquer it. Certainly math-anxious > students need work to > reverse their negative perception of their math > abilities. Many can do it by > opening up about their feelings and their > understanding of math with a > supportive and knowledgeable person. > > However, we need to recognize that those feelings > most often have a basis > in their experiences with math. They may be the > victims of teachers who > thought that there was only one way to find an > answer or to solve a problem. > Often students memorize procedures without > understanding what each step does > and why it works that way. It's no wonder that they > panic at the thought of > having to remember the procedure when they are under > the pressure of taking > a test! > > I'd like to start a list of interventions or > techniques that you all can add > to during the week. > - For computation problems, (including all > operations with whole > numbers, fractions and percents) work on estimating > an answer. Use different > approaches to estimating so that students recognize > that there is not just > one 'rule' to follow. > - Insist that your student talks about his/her > reasoning while > solving a simple problem that you know she/he can > do. (It may be easier to > begin by sharing with another student.) You want > students to know that their > reasoning is as important as getting the right > answers. > - Judith Diamond mentioned practicing computation > with real-life > familiar situations that can help them to understand > what they are doing as > well as providing a reference when trying to > remember how they did it. > > It's your turn - what can you add? > > Myrna > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, > please go to > http://www.nifl.gov/mailman/listinfo/assessment > __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! 
Mail has the best spam protection around http://mail.yahoo.com From mmanly at earthlink.net Tue Jun 20 17:03:22 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Tue, 20 Jun 2006 14:03:22 -0700 Subject: [Assessment 369] Re: Math anxiety and assessment In-Reply-To: <20060620150144.18178.qmail@web30214.mail.mud.yahoo.com> Message-ID: Hi Lisa, I'm happy to hear that you and your students are enjoying the book. Introducing algebraic thinking early in student's math study has now become widespread in the reform math efforts in K-12. (It is also a hallmark of the new EMPower series for adults.) In 1992, when I wrote the first edition of the book, I based my early-algebra-integration decision on my own experience as one who had taught algebra to students at many levels and as an 'insider' with respect to the GED Math test. (I had just left my job at GEDTS.) The overarching principle when formulating items for the GED math test is to assess the "major and lasting outcomes and skills of a high school education." For the most part, this means that the skills and concepts that are tested are ones that have some practical value. With respect to algebra, I felt that using the concept of a variable, solving simple equations, and graphing linear functions were the most obvious topics to be represented. As an algebra teacher, I had seen the difficulty that students had in making the transition to using variables and had added extra lessons to the textbooks that reviewed arithmetic principles by using variables in place of specific numbers - that is, I used algebra to generalize arithmetic. So, it was an easy decision for me to integrate algebra early - both from a mathematics pedagogy standpoint and from an adult student attitude perspective (knowing that many feel insulted by a review of arithmetic even if their entrance scores indicate that need). As to your question about the results obtained when students are introduced to algebraic ideas early in their mathematics study, I'm afraid that I have no data to substantiate better scores sooner. (That topic may be one that a practitioner would like to investigate as a project for the ANN practitioner research grants.) Thanks for the question, Myrna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Lisa Mullins Sent: Tuesday, June 20, 2006 7:02 AM To: The Assessment Discussion List Subject: [Assessment 368] Re: Math anxiety and assessment Myrna, In your book The Problem Solver you tackle algebraic concepts in the very beginning of the book. This is in contrast to many books on the market. I use this technique as well. My students are caught by the fact that algebra (a scary term for some) is so simple and can be used for many reasons. However, some people are skeptical that this will result in better scores or better understanding. Can you discuss the contrasts of learning math beginning with whole numbers and working up to algrebra versus using algebra as a problem solving method with all number systems throughout the math learning process. Are the results better scores sooner? Thanks, Lisa Mullins Hawkins County Adult Ed Rogersville, Tennessee http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Wed Jun 21 07:06:43 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 21 Jun 2006 07:06:43 -0400 Subject: [Assessment 370] Suggestions on GED Math Message-ID: <004101c69522$c7ee2ea0$0302a8c0@LITNOW> The following post is from Tom Mechem. 
Brenda, et al.--- In the situation you describe, I would recommend starting the GED Math instruction with learning the GED math notation (how the test asks you to add, subtract, multiply, and divide) and the Order of Operations. This will start to use your student's high literacy level to "get into" the math test. Then I would give only set-up problems, which necessitate the thinking through of the problems but do not require calculations. Meanwhile, I would have the student officially diagnosed with a math learning disability so that a calculator could be used on both parts of the GED Math test. (Go onto our website, www.doe.mass.edu/ged, then click on the link, "Applicants with Disabilities." From that you can print the appropriate Accommodations Request form, which will give you a good idea of what kind of documentation is needed.) Hopefully your student can then go from the proper set-up of the problem to the answer using the calculator to do the calculations. In any case, if you have any further questions, please feel free to e-mail me directly: rmechem at doe.mass.edu. Good luck in your quest. Tom Mechem GED State Chief Examiner Massachusetts Department of Education 781-338-6621 "GED to Ph.D." -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060621/0f941c31/attachment.html From mmanly at earthlink.net Wed Jun 21 14:44:20 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Wed, 21 Jun 2006 11:44:20 -0700 Subject: [Assessment 371] Re: Math anxiety and assessment In-Reply-To: Message-ID: I've added Lynda's graph idea and Tina's suggestions below to the list of ideas. Great additions! Tina mentioned using manipulatives to learn about fractions and that reminded me of the website from NCTM that I found: http://www.nctm.org/news/assessment/2005_12nb.htm Will This Be on the Test? Check it out and see what you think about the test item involving fractions. Too tricky or really clever? Myrna _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Tina_Luffman at yc.edu Sent: Monday, June 19, 2006 7:33 PM To: Judith Diamond; The Assessment Discussion List Subject: [Assessment 365] Re: Math anxiety and assessment Hi Judy, Yes, I do agree with you about using the practical aspects of daily life to help math connect with students. It is a rule of mnemonics to link what is known to what is not known to help students remember. I also like to have the students work together in groups after giving the lesson to hear how they perceive the material and to locate error in understanding. When students work together, they have to restate what the teacher teaches and then explain it to someone else. That helps with memory retention. Another great tool with math is the manipulatives. We have various colored disks that represent fractions. One disk is whole which equals 1 or 1/1. The next is broken in half, and another into thirds and so on. When students lay three 1/4 disks on top of a 1/2 and a 1/4 disk, they can see in a tangible manner how 1/2 + 1/4 really does = 3/4. I also tell students to draw the word problems to figure out how to solve the math problems. Some students really work well with drawing the five piles of ten logs to know that they need to multiply to see how many logs they have. Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060621/7d53f63c/attachment.html From mdick at lagcc.cuny.edu Wed Jun 21 17:19:06 2006 From: mdick at lagcc.cuny.edu (Mae Dick) Date: Wed, 21 Jun 2006 17:19:06 -0400 Subject: [Assessment 372] Re: Math anxiety and assessment In-Reply-To: References: <20060620150144.18178.qmail@web30214.mail.mud.yahoo.com> Message-ID: <44997F8A0200000A0019F4D9@mailgate.lagcc.cuny.edu> Hi there. I thought you might be interested in a math resource that was developed by Steve Hinds, a staff developer for adult literacy programs in the City University of New York. Here's a quote from a workshop Steve recently offered at the Literacy Assistance Center in NYC. He says " Adult Literacy programs traditionally limit students in low-level classes to computation practice out of workbooks. Algebra, data and geometry topics are considered too difficult for these students until they have 'mastered the basics.' Steve believes that students can increase their mathematical reasoning, number sense and enjoyment of math through the kinds of exercises he presents on the CUNY web site. Check it out. Go to . The user name is literacy and the password is resources06. >>> "Myrna Manly" 06/20/06 5:03 PM >>> Hi Lisa, I'm happy to hear that you and your students are enjoying the book. Introducing algebraic thinking early in student's math study has now become widespread in the reform math efforts in K-12. (It is also a hallmark of the new EMPower series for adults.) In 1992, when I wrote the first edition of the book, I based my early-algebra-integration decision on my own experience as one who had taught algebra to students at many levels and as an 'insider' with respect to the GED Math test. (I had just left my job at GEDTS.) The overarching principle when formulating items for the GED math test is to assess the "major and lasting outcomes and skills of a high school education." For the most part, this means that the skills and concepts that are tested are ones that have some practical value. With respect to algebra, I felt that using the concept of a variable, solving simple equations, and graphing linear functions were the most obvious topics to be represented. As an algebra teacher, I had seen the difficulty that students had in making the transition to using variables and had added extra lessons to the textbooks that reviewed arithmetic principles by using variables in place of specific numbers - that is, I used algebra to generalize arithmetic. So, it was an easy decision for me to integrate algebra early - both from a mathematics pedagogy standpoint and from an adult student attitude perspective (knowing that many feel insulted by a review of arithmetic even if their entrance scores indicate that need). As to your question about the results obtained when students are introduced to algebraic ideas early in their mathematics study, I'm afraid that I have no data to substantiate better scores sooner. (That topic may be one that a practitioner would like to investigate as a project for the ANN practitioner research grants.) Thanks for the question, Myrna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Lisa Mullins Sent: Tuesday, June 20, 2006 7:02 AM To: The Assessment Discussion List Subject: [Assessment 368] Re: Math anxiety and assessment Myrna, In your book The Problem Solver you tackle algebraic concepts in the very beginning of the book. This is in contrast to many books on the market. 
I use this technique as well. My students are caught by the fact that algebra (a scary term for some) is so simple and can be used for many reasons. However, some people are skeptical that this will result in better scores or better understanding. Can you discuss the contrasts of learning math beginning with whole numbers and working up to algebra versus using algebra as a problem-solving method with all number systems throughout the math learning process? Are the results better scores sooner? Thanks, Lisa Mullins Hawkins County Adult Ed Rogersville, Tennessee http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From Mdr151 at aol.com Thu Jun 22 09:02:49 2006 From: Mdr151 at aol.com (Mdr151 at aol.com) Date: Thu, 22 Jun 2006 09:02:49 EDT Subject: [Assessment 373] Re: Math anxiety and assessment Message-ID: <506.11e0e13.31cbeef9@aol.com> I would agree with the recent posts about the materials presented in both Myrna's GED Math Problem Solver and the new Empower series. Since our topic of discussion is math anxiety and assessment, I would say that this approach to teaching math content conceptually definitely helps to reduce the anxiety students bring to the math classroom. Recently I have been piloting the Empower books on Benchmark Fractions and Split it Up and am amazed by how quickly and confidently my students can see the relationship between decimals, fractions, and percents. What is even more amazing is how easily they can solve percent problems, particularly percent problems that require finding the total. All of this is done conceptually, thinking about what the part is and building to the total. Pam Meader President Adult Numeracy Network -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060622/237b7bd5/attachment.html From lmullins89 at yahoo.com Thu Jun 22 09:14:33 2006 From: lmullins89 at yahoo.com (Lisa Mullins) Date: Thu, 22 Jun 2006 06:14:33 -0700 (PDT) Subject: [Assessment 374] Re: Math anxiety and assessment In-Reply-To: Message-ID: <20060622131433.5950.qmail@web30204.mail.mud.yahoo.com> Hello Myrna and all, I enjoyed the link you gave us yesterday. I am very interested in the open-ended questions section. I think it is a great guide for formulating open-ended math questions. One challenge I encounter when I provide open-ended questions for my learners is being prepared for the "out of the box" thinking. In other words, when a learner thinks of the problem in an original way and solves it correctly, I am left dumbfounded as to why it worked. What should I do at that point? Also, the learners often want a fast, set-in-stone rule to help them solve a problem. For example, percentages can be calculated using a number of strategies. Some students want one good way to solve percentages and they are confused if I provide alternative methods of solving. I think this contributes to math anxiety. How would you handle this situation? Thanks, Lisa Mullins Tennessee --- Myrna Manly wrote: > I've added Lynda's graph idea and Tina's suggestions > below to the list of > ideas. Great additions!
> > > > Tina mentioned using manipulatives to learn about > fractions and that > reminded me of the website from NCTM that I found: > http://www.nctm.org/news/assessment/2005_12nb.htm > Will This Be on the Test? > > > > > Check it out and see what you think about the test > item involving fractions. > Too tricky or really clever? > > > > Myrna > > > > _____ > > From: assessment-bounces at nifl.gov > [mailto:assessment-bounces at nifl.gov] On > Behalf Of Tina_Luffman at yc.edu > Sent: Monday, June 19, 2006 7:33 PM > To: Judith Diamond; The Assessment Discussion List > Subject: [Assessment 365] Re: Math anxiety and > assessment > > > > Hi Judy, > > > > Yes, I do agree with you about using the practical > aspects of daily life to > help math connect with students. It is a rule of > mnemonics to link what is > known to what is not known to help students > remember. > > > > I also like to have the students work together in > groups after giving the > lesson to hear how they perceive the material and to > locate error in > understanding. When students work together, they > have to restate what the > teacher teaches and then explain it to someone else. > That helps with memory > retention. > > > > Another great tool with math is the manipulatives. > We have various colored > disks that represent fractions. One disk is whole > which equals 1 or 1/1. The > next is broken in half, and another into thirds and > so on. When students lay > three 1/4 disks on top of a 1/2 and a 1/4 disk, they > can see in a tangible > manner how 1/2 + 1/4 really does = 3/4. > > > > I also tell students to draw the word problems to > figure out how to solve > the math problems. Some students really work well > with drawing the five > piles of ten logs to know that they need to multiply > to see how many logs > they have. > > Tina > > > Tina Luffman > Coordinator, Developmental Education > Verde Valley Campus > 928-634-6544 > tina_luffman at yc.edu > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, > please go to > http://www.nifl.gov/mailman/listinfo/assessment > __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com From mmanly at earthlink.net Thu Jun 22 14:36:36 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Thu, 22 Jun 2006 11:36:36 -0700 Subject: [Assessment 375] Re: Math anxiety and assessment In-Reply-To: <44997F8A0200000A0019F4D9@mailgate.lagcc.cuny.edu> Message-ID: Thanks, Mae, for this web site. It is encouraging to see that more people around the country are recommending that all content areas of math be studied in ABE at all levels of student learning. It brings up an interesting question: "What is influencing so many ABE teachers to focus only on numerical computation procedures at the early levels?" Is it the state standard documents, the workbooks, the NRS indicators, the TABE test, or their own experience learning math? What do you see out there? Myrna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Mae Dick Sent: Wednesday, June 21, 2006 1:19 PM To: 'The Assessment Discussion List' Subject: [Assessment 372] Re: Math anxiety and assessment Hi there. 
I thought you might be interested in a math resource that was developed by Steve Hinds, a staff developer for adult literacy programs in the City University of New York. Here's a quote from a workshop Steve recently offered at the Literacy Assistance Center in NYC. He says " Adult Literacy programs traditionally limit students in low-level classes to computation practice out of workbooks. Algebra, data and geometry topics are considered too difficult for these students until they have 'mastered the basics.' Steve believes that students can increase their mathematical reasoning, number sense and enjoyment of math through the kinds of exercises he presents on the CUNY web site. Check it out. Go to . The user name is literacy and the password is resources06. .nifl.gov/mailman/listinfo/assessment From bddavis at butlercc.edu Thu Jun 22 16:14:02 2006 From: bddavis at butlercc.edu (Beverly Davis) Date: Thu, 22 Jun 2006 15:14:02 -0500 (CDT) Subject: [Assessment 376] Re: Math anxiety and assessment Message-ID: <4583411.1151007242913.JavaMail.bddavis@butlercc.edu> Mae, I went to the website you listed and couldn't get to any math problems. Could you tell me what I am doing wrong. Thank you! Mae Dick wrote: >Hi there. I thought you might be interested in a math resource that was developed by Steve Hinds, a staff developer for adult literacy programs in the City University of New York. Here's a quote from a workshop Steve recently offered at the Literacy Assistance Center in NYC. He says " Adult Literacy programs traditionally limit students in low-level classes to computation practice out of workbooks. Algebra, data and geometry topics are considered too difficult for these students until they have 'mastered the basics.' Steve believes that students can increase their mathematical reasoning, number sense and enjoyment of math through the kinds of exercises he presents on the CUNY web site. Check it out. Go to . The user name is literacy and the password is resources06. > >>>> "Myrna Manly" 06/20/06 5:03 PM >>> >Hi Lisa, >I'm happy to hear that you and your students are enjoying the book. >Introducing algebraic thinking early in student's math study has now become >widespread in the reform math efforts in K-12. (It is also a hallmark of the >new EMPower series for adults.) In 1992, when I wrote the first edition of >the book, I based my early-algebra-integration decision on my own experience >as one who had taught algebra to students at many levels and as an 'insider' >with respect to the GED Math test. (I had just left my job at GEDTS.) > >The overarching principle when formulating items for the GED math test is to >assess the "major and lasting outcomes and skills of a high school >education." For the most part, this means that the skills and concepts that >are tested are ones that have some practical value. With respect to algebra, >I felt that using the concept of a variable, solving simple equations, and >graphing linear functions were the most obvious topics to be represented. > >As an algebra teacher, I had seen the difficulty that students had in making >the transition to using variables and had added extra lessons to the >textbooks that reviewed arithmetic principles by using variables in place of >specific numbers - that is, I used algebra to generalize arithmetic. 
>So, it was an easy decision for me to integrate algebra early - both from a >mathematics pedagogy standpoint and from an adult student attitude >perspective (knowing that many feel insulted by a review of arithmetic even >if their entrance scores indicate that need). > >As to your question about the results obtained when students are introduced >to algebraic ideas early in their mathematics study, I'm afraid that I have >no data to substantiate better scores sooner. (That topic may be one that a >practitioner would like to investigate as a project for the ANN practitioner >research grants.) > >Thanks for the question, >Myrna > > > >-----Original Message----- >From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On >Behalf Of Lisa Mullins >Sent: Tuesday, June 20, 2006 7:02 AM >To: The Assessment Discussion List >Subject: [Assessment 368] Re: Math anxiety and assessment > >Myrna, >In your book The Problem Solver you tackle algebraic >concepts in the very beginning of the book. This is >in contrast to many books on the market. I use this >technique as well. My students are caught by the fact >that algebra (a scary term for some) is so simple and >can be used for many reasons. However, some people are >skeptical that this will result in better scores or >better understanding. > >Can you discuss the contrasts of learning math >beginning with whole numbers and working up to >algrebra versus using algebra as a problem solving >method with all number systems throughout the math >learning process. Are the results better scores >sooner? > >Thanks, >Lisa Mullins >Hawkins County Adult Ed >Rogersville, Tennessee > >http://www.nifl.gov/mailman/listinfo/assessment > > >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment >------------------------------- >National Institute for Literacy >Assessment mailing list >Assessment at nifl.gov >To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > --------------------------------------- It is not the load that breaks you, it is the way you carry it. Beverly Davis ABE/GEDButler Community College Instructional Coordinator (316) 321-4030, ext. 113 From djrosen at comcast.net Thu Jun 22 16:23:04 2006 From: djrosen at comcast.net (David Rosen) Date: Thu, 22 Jun 2006 16:23:04 -0400 Subject: [Assessment 377] Re: Math anxiety and assessment In-Reply-To: <4583411.1151007242913.JavaMail.bddavis@butlercc.edu> References: <4583411.1151007242913.JavaMail.bddavis@butlercc.edu> Message-ID: Beverly and others, On Jun 22, 2006, at 4:14 PM, Beverly Davis wrote: > Mae, I went to the website you listed and couldn't get to any math > problems. Could you tell me what I am doing wrong. Thank you! 1. Go to http://www.literacy.cuny.edu/ 2. Select "Resources" 3. At the prompt, type in the user name literacy and the password resources06 David Rosen djrosen at comcast.ne > > > Mae Dick wrote: > > >> Hi there. I thought you might be interested in a math resource >> that was > developed by Steve Hinds, a staff developer for adult literacy > programs > in the City University of New York. Here's a quote from a workshop > Steve recently offered at the Literacy Assistance Center in NYC. He > says " Adult Literacy programs traditionally limit students in > low-level classes to computation practice out of workbooks. 
Algebra, > data and geometry topics are considered too difficult for these > students until they have 'mastered the basics.' Steve believes that > students can increase their mathematical reasoning, number sense and > enjoyment of math through the kinds of exercises he presents on the > CUNY web site. Check it out. Go to . The user > name is literacy and the password is resources06. >> >>>>> "Myrna Manly" 06/20/06 5:03 PM >>> >> Hi Lisa, >> I'm happy to hear that you and your students are enjoying the book. >> Introducing algebraic thinking early in student's math study has now > become >> widespread in the reform math efforts in K-12. (It is also a hallmark > of the >> new EMPower series for adults.) In 1992, when I wrote the first >> edition > of >> the book, I based my early-algebra-integration decision on my own > experience >> as one who had taught algebra to students at many levels and as an > 'insider' >> with respect to the GED Math test. (I had just left my job at GEDTS.) >> >> The overarching principle when formulating items for the GED math >> test > is to >> assess the "major and lasting outcomes and skills of a high school >> education." For the most part, this means that the skills and >> concepts > that >> are tested are ones that have some practical value. With respect to > algebra, >> I felt that using the concept of a variable, solving simple >> equations, > and >> graphing linear functions were the most obvious topics to be > represented. >> >> As an algebra teacher, I had seen the difficulty that students had in > making >> the transition to using variables and had added extra lessons to the >> textbooks that reviewed arithmetic principles by using variables in > place of >> specific numbers - that is, I used algebra to generalize arithmetic. >> So, it was an easy decision for me to integrate algebra early - both > from a >> mathematics pedagogy standpoint and from an adult student attitude >> perspective (knowing that many feel insulted by a review of >> arithmetic > even >> if their entrance scores indicate that need). >> >> As to your question about the results obtained when students are > introduced >> to algebraic ideas early in their mathematics study, I'm afraid >> that I > have >> no data to substantiate better scores sooner. (That topic may be one > that a >> practitioner would like to investigate as a project for the ANN > practitioner >> research grants.) >> >> Thanks for the question, >> Myrna >> >> >> >> -----Original Message----- >> From: assessment-bounces at nifl.gov [mailto:assessment- >> bounces at nifl.gov] > On >> Behalf Of Lisa Mullins >> Sent: Tuesday, June 20, 2006 7:02 AM >> To: The Assessment Discussion List >> Subject: [Assessment 368] Re: Math anxiety and assessment >> >> Myrna, >> In your book The Problem Solver you tackle algebraic >> concepts in the very beginning of the book. This is >> in contrast to many books on the market. I use this >> technique as well. My students are caught by the fact >> that algebra (a scary term for some) is so simple and >> can be used for many reasons. However, some people are >> skeptical that this will result in better scores or >> better understanding. >> >> Can you discuss the contrasts of learning math >> beginning with whole numbers and working up to >> algrebra versus using algebra as a problem solving >> method with all number systems throughout the math >> learning process. Are the results better scores >> sooner? 
>> >> Thanks, >> Lisa Mullins >> Hawkins County Adult Ed >> Rogersville, Tennessee >> >> http://www.nifl.gov/mailman/listinfo/assessment >> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment >> > > --------------------------------------- > It is not the load that breaks you, it > is the way you carry it. > > Beverly Davis > ABE/GEDButler Community College > Instructional Coordinator > (316) 321-4030, ext. 113 > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment t From mmanly at earthlink.net Thu Jun 22 17:39:32 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Thu, 22 Jun 2006 14:39:32 -0700 Subject: [Assessment 378] Re: Math anxiety and assessment In-Reply-To: <20060622131433.5950.qmail@web30204.mail.mud.yahoo.com> Message-ID: Lisa said: "Also, the learners often want a fast, set in stone, rule to help them solve a problem. For example, percentages can be calculated using a number of strategies. Some students want one good way to solve percentages and they are confused if I provide alternative methods of solving. I think this contributes to math anxiety." I encounter this issue a lot when I do workshops with teachers. From previous posts you know that, in my opinion, it induces anxiety when students think that there is only one right way to get an answer to a problem and they need to remember it during the stressful atmosphere of a high stakes test like the GED. A person who can be flexible in solving problems is one who is confident both in testing situations and in meeting the demands of the real world. I recommended using estimation, not only for its intrinsic value in the real world, but also as a vehicle to convince students that they can think differently and still be correct. (They are more likely to accept that with estimation than with something like computation techniques that were taught in such a didactic manner.) Others have commented that group work is also helpful in that they see others' methods and have the opportunity to discuss them. Being flexible instead of being at the mercy of "the right way" is a goal that teachers need to work on every day with their students. We all know that the key to being flexible is having a deeper conceptual understanding of the subject than merely knowing a procedure. Sheila Tobias uses the words, "learned helplessness" in describing math anxious people. On a recent Oprah show where Bill Gates was the guest and the topic was the burgeoning HS dropout problem, they used the same phrase to characterize the attitude of the majority of dropouts. They have been encouraged to imitate, not to think for themselves. 
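To make the "more than one right way" idea concrete, here is one small worked example (the numbers are chosen purely for illustration and are not taken from any particular test or workbook). Take 15% of 80: - One route: 10% of 80 is 8, and 5% is half of that, or 4, so 15% of 80 is 8 + 4 = 12. - Another route: write 15% as 0.15, and 0.15 x 80 = 12. - A third: 15/100 x 80 = 1200/100 = 12. All three paths land on the same answer, which is exactly the point - no one of them is "the" right way.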
Myrna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Lisa Mullins Sent: Thursday, June 22, 2006 5:15 AM To: The Assessment Discussion List Subject: [Assessment 374] Re: Math anxiety and assessment Hello Myrna and all, I enjoyed the link you gave us yesterday. I am very interested in the open-ended questions section. I think it is a great guide for formulating open-ended math questions. One challenge I encounter when I provide open-ended questions for my learners is being prepared for the "out of the box" thinking. In other words, when a learner thinks of the problem in a orginial way and solves it correctly, I am left dumbfounded as to why it worked. What should I do at that point? Also, the learners often want a fast, set in stone, rule to help them solve a problem. For example, percentages can be calculated using a number of stratgies. Some students want one good way to solve percentages and they are confused if I provide alternative methods of solving. I think this contributes to math anxiety. How would you handle this situation? Thanks, Lisa Mullins Tennessee --- Myrna Manly wrote: > I've added Lynda's graph idea and Tina's suggestions > below to the list of > ideas. Great additions! > > > > Tina mentioned using manipulatives to learn about > fractions and that > reminded me of the website from NCTM that I found: > http://www.nctm.org/news/assessment/2005_12nb.htm > Will This Be on the Test? > > > > > Check it out and see what you think about the test > item involving fractions. > Too tricky or really clever? > > > > Myrna > > > > _____ > > From: assessment-bounces at nifl.gov > [mailto:assessment-bounces at nifl.gov] On > Behalf Of Tina_Luffman at yc.edu > Sent: Monday, June 19, 2006 7:33 PM > To: Judith Diamond; The Assessment Discussion List > Subject: [Assessment 365] Re: Math anxiety and > assessment > > > > Hi Judy, > > > > Yes, I do agree with you about using the practical > aspects of daily life to > help math connect with students. It is a rule of > mnemonics to link what is > known to what is not known to help students > remember. > > > > I also like to have the students work together in > groups after giving the > lesson to hear how they perceive the material and to > locate error in > understanding. When students work together, they > have to restate what the > teacher teaches and then explain it to someone else. > That helps with memory > retention. > > > > Another great tool with math is the manipulatives. > We have various colored > disks that represent fractions. One disk is whole > which equals 1 or 1/1. The > next is broken in half, and another into thirds and > so on. When students lay > three 1/4 disks on top of a 1/2 and a 1/4 disk, they > can see in a tangible > manner how 1/2 + 1/4 really does = 3/4. > > > > I also tell students to draw the word problems to > figure out how to solve > the math problems. Some students really work well > with drawing the five > piles of ten logs to know that they need to multiply > to see how many logs > they have. > > Tina > > > Tina Luffman > Coordinator, Developmental Education > Verde Valley Campus > 928-634-6544 > tina_luffman at yc.edu > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, > please go to > http://www.nifl.gov/mailman/listinfo/assessment > __________________________________________________ Do You Yahoo!? Tired of spam? 
Yahoo! Mail has the best spam protection around http://mail.yahoo.com ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From djrosen at comcast.net Fri Jun 23 07:59:11 2006 From: djrosen at comcast.net (David Rosen) Date: Fri, 23 Jun 2006 07:59:11 -0400 Subject: [Assessment 379] Math as a puzzle, or swimming Message-ID: Myrna, Judy, and others, I am cross-posting this question to both the Women Literacy and Assessment lists, and hope that anyone who wishes to will join in. On National Public Radio Weekend Edition Sunday, in the Will Shortz "Puzzle Master" segment, the Public Radio host, Liane Hansen, often asks the contestant, "Are you a puzzle person?" How would you answer this question? For me, it's complicated. If I knew I wouldn't have to compete on the radio, and if I had as much time as I needed, I might say "sometimes," depending on the kind of puzzle. Those who would without waffling say yes, do not have "puzzle anxiety." They confidently dive into the deepest, coldest puzzle knowing that even if they thrash about they won't sink, and that they also know several strokes (strategies) in addition to treading water. Those who hesitate, qualify their "yes", or answer "no" have probably gulped water a few times, and it wasn't fun. They may be thinking that these waters are dangerous. So here's my question. How do you as a teacher help those who are not "puzzle people," or "math people," become more confident? Is it best for them to learn a few strokes first in shallow water? Or to dive right in to the deep parts with a buddy who can swim? What is the teacher's role as lifeguard? What are some strategies to help the most anxious to put their toes in the water? How do you help a mature fish to not feel foolish learning to swim next to all these smart fry swimming circles around them? How do you help an cautious swimmer become a strong swimmer? And since overcoming any anxiety is tough work, what do you tell your students is the reward? What's so great about swimming when you can enjoy sitting on a sunny beach or walking on the shore? And do you have any good stories? Let's hear about one of your students who was "aquaphobic" and who now loves to dive and to play water polo, or who at least can enjoy an occasional swim. How, exactly did that transformation happen? What was your role? David Rosen djrosen at comcast.net From marie.cora at hotspurpartners.com Fri Jun 23 13:57:04 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 23 Jun 2006 13:57:04 -0400 Subject: [Assessment 380] Re: Math as a puzzle, or swimming Message-ID: <002a01c696ee$7052df60$6f49e947@LITNOW> The following post is from Bertha Mo. Marie Cora I attended some of San Francisco's best public schools and so I thought I was to blame for my learning anxieties. I really didn't understand the process and mechanics of writing until I participated in the Bay Area Writer's Process while I was struggling with my Ph.D. dissertation. Here are some of the things I learned about writing and learning almost anything: 1) Writing (learning) is a process 2) Remote writing, mind mapping, experimentation all help one to relax...use colors or crayons to begin if that will relax you... 2) Writing need not be an isolating and lonely process...writing can be and probably should be shared. 
4) Finished, published products have been shared with many people before reaching your eyes. 5) "Writing Prose" is the book that helped me crack a pretty serious writing anxiety. Bertie Bertha Mo -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060623/52c27406/attachment.html From varshna at grandecom.net Fri Jun 23 14:09:55 2006 From: varshna at grandecom.net (Varshna Narumanchi-Jackson) Date: Fri, 23 Jun 2006 13:09:55 -0500 Subject: [Assessment 381] Re: Math as a puzzle, or swimming In-Reply-To: <002a01c696ee$7052df60$6f49e947@LITNOW> Message-ID: I've found that reading about writing from prolific and successful writers - their struggles, methods, anxiety - has been inspirational. A recent website I visited is Thomas Sowell's page on his own experiences as a writer (http://www.tsowell.com/About_Writing.html). I just want to make the disclaimer that if you're an editor or have editorial aspirations (like me), you will probably want to skip the parts where he talks about working with editors, copywriters, and the rest of the publishing world...While he is humorous(!) and takes a light-hearted approach, it's hit home a couple of times... Good luck, and keep writing! on 6/23/06 12:57 PM, Marie Cora at marie.cora at hotspurpartners.com wrote: > The following post is from Bertha Mo. > > Marie Cora > > > I attended some of San Francisco's best public schools and so I thought I was > to blame for my learning anxieties. > > I really didn't understand the process and mechanics of writing until I > participated in the Bay Area Writer's Process while I was struggling with my > Ph.D. dissertation. > > Here are some of the things I learned about writing and learning almost > anything: > > 1) Writing (learning) is a process > > 2) Remote writing, mind mapping, experimentation all help one to relax...use > colors or crayons to begin if that will relax you... > > 3) Writing need not be an isolating and lonely process...writing can be and > probably should be shared. > > 4) Finished, published products have been shared with many people before > reaching your eyes. > > 5) "Writing Prose" is the book that helped me crack a pretty serious writing > anxiety. > > Bertie > Bertha Mo > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060623/287f7091/attachment.html From marie.cora at hotspurpartners.com Fri Jun 23 16:08:27 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 23 Jun 2006 16:08:27 -0400 Subject: [Assessment 382] Final questions on math Message-ID: <000501c69700$cab71c70$6f49e947@LITNOW> Hi everyone, What a great and rich discussion this is, thank you all for making this so! There has been a wonderful exchange of interventions, techniques, and strategies to draw from in working with students in math. This is the last day with our guest, so I encourage you to ask any final questions or make any comments. Myrna asked this question a post or two ago and I also would like to hear from folks on what they think: "What is influencing so many ABE teachers to focus only on numerical computation procedures at the early levels?"
Is it the state standard documents, the workbooks, the NRS indicators, the TABE test, or their own experience learning math? What do you see out there? Myrna also mentioned ANN in another post, and I just wanted to make sure folks had the information on this: Adult Numeracy Network http://www.literacynet.org/ann/ Ok! Let's hear any final thoughts! It's your last chance! Marie Cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060623/ffb6fc4a/attachment.html From Tina_Luffman at yc.edu Fri Jun 23 16:52:25 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Fri, 23 Jun 2006 13:52:25 -0700 Subject: [Assessment 383] Re: Final questions on math Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060623/74530b7b/attachment.html From Mdr151 at aol.com Fri Jun 23 22:52:17 2006 From: Mdr151 at aol.com (Mdr151 at aol.com) Date: Fri, 23 Jun 2006 22:52:17 EDT Subject: [Assessment 384] Re: Math as a puzzle, or swimming Message-ID: <2c5.9efbf7b.31ce02e1@aol.com> One of the greatest joys of being a math teacher is to see students' dread and hate of math turn to joy for math. One student who comes to mind just recently graduated from a technical college. She shared with me that it took her 9 years to consider coming back to school and facing math. I was so fortunate to have her for a student and watch her fear of math melt away. One way I address math phobia is through journal writing. Students are required to respond to a math prompt, provide examples, make connections, and lastly reflect on their learning. It is in the reflections that the fear and struggles are revealed. I journal back, giving them encouragement and offering suggestions. This particular student's journals showed her excitement in finally understanding math and seeing connections. She continues today to send me emails about math websites she has found or successes she has had since being in college. She was so proud to share that she got an A in technical physics and graduated with honors. I truly believe we must develop a safe environment for math learners, one in which they dare to take risks and not feel stupid in trying and one that offers support in their struggles. Pam Meader President, Adult Numeracy Network -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060623/4f271536/attachment.html From juddia at sbcglobal.net Sat Jun 24 08:04:37 2006 From: juddia at sbcglobal.net (Judith Diamond) Date: Sat, 24 Jun 2006 08:04:37 -0400 Subject: [Assessment 385] Re: Math as a puzzle, or swimming In-Reply-To: <2c5.9efbf7b.31ce02e1@aol.com> Message-ID: <000501c69786$60f468c0$81bd044b@Judith> Hi Pam, One of the things that is so enjoyable about an online discussion is that occasionally you read something that changes the way you see things and that you will clip and save and use. Your idea of journals and how you structure them might not resonate with all students - for some, writing is just another headache - but I can see it being a really meaningful connection both to math and to their teacher for others. Thanks.
Judith Diamond Adult Learning Resource Center -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Mdr151 at aol.com Sent: Friday, June 23, 2006 10:52 PM To: assessment at nifl.gov Subject: [Assessment 384] Re: Math as a puzzle, or swimming One of the greatest joys of being a math teacher is to see students dread and hate of math turn to joy for math. One student that comes to mind just recently graduated from a technical college. She shared with me that it took her 9 years to consider coming back to school and facing math. I was so fortunate to have her for a student and watch her fear of math melt away. One way I address math phobia is through journal writing. Students are required to respond to a math prompt, provide examples, make connections and lastly to reflect on their learning. It is in the reflections that the fear and struggles are revealed. I journal back, giving them encouragement and offering suggestions. This particular student's journals showed her excitement in finally understanding math and seeing connections. She continues today to send me emails about math websites she has found or successes she has had since being in college.She was so proud to share that she got an A in technical physics and graduated with honors. I truly believe we must develop a safe environment for math learners, one in which they dare to take risks and not feel stupid in trying and one that offers support in their stuggles. Pam Meader President, Adult Numeracy Network -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060624/b5298436/attachment.html From mmanly at earthlink.net Sat Jun 24 15:01:16 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Sat, 24 Jun 2006 12:01:16 -0700 Subject: [Assessment 386] Re: Final questions on math In-Reply-To: Message-ID: Tina and all, Each of your postings brings up so many interesting aspects. For example, in Tina's message below, I see that the program uses the TABE to diagnose learner weaknesses in computation when they enter the program and then focuses instruction on the computation basics by using workbooks and study guides. It seems as though Tina is agreeing with that policy in that the students show a quick gain in skills and funding is maintained. That is certainly an important aspect - we need to keep the doors open. She also mentions that she includes all levels of students in an algebra lesson and even the low-level learners gain self esteem (and learn about variables and making generalizations) by succeeding with it. However, it is safe to assume that this learning does not have direct value in helping to raise their scores when they retake the TABE for accountability purposes because the test items at the early levels concern themselves mostly with naked computation (no context). This exposes a huge problem for math assessment, and in turn for math instruction, in ABE. If there is value added to the students' learning by early algebra exposure, it should be recognized in the accountability measures. In the meantime, what can a program or its teachers do? 
You all have made great suggestions: use manipulatives or realia to facilitate deeper understanding of the operations with number, teach computational skills along with conceptual understanding, insist that computations be motivated by a real situation or an application in measurement or data, use benchmarks and estimation as a check for reasonableness. These tactics serve both masters - improving student understanding and attitude and, at the same time, raising test scores. Actually, there may be a third - keeping students in the program longer than a month. Thanks for sharing this, Tina. I also would like to hear others' opinions. Myrna _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Tina_Luffman at yc.edu Sent: Friday, June 23, 2006 12:52 PM To: The Assessment Discussion List Subject: [Assessment 383] Re: Final questions on math Myrna and all, I believe the reason I place many students into numeracy workbooks first is that I want to be sure students can do basic adding, subtracting, multiplying, and dividing so that when they perform the other functions of fractions, decimals, percents, algebra, and geometry, their incorrect responses are due to not understanding the new level of math rather than just making a calculation error. Of course, even we as instructors still make calculation errors, but I would like to minimize the frustration. I have had numerous students with math anxiety or other math disability get really frustrated with themselves when getting the answer wrong adding fractions, for example, when all they did was make a simple calculation error. Identifying where the error came from is paramount to these students' sense of capability. The primary reason I begin most students with basic calculation skills is due to our program's methodology. We are instructed as new instructors to give the TABE and then to give a math pretest even if the student is 11th grade + level. (One interesting finding I have is that recent high school students come to me able to do geometry and algebra but have forgotten how to divide. I am sure many of you have had similar experiences.) This pretest identifies challenge points in +, -, x, and ÷, as well as decimals, fractions, percents, ratio, proportion, and measurement. Then we are instructed to place the students into the lowest level book and work them up to the top level using study guides. We are a highly linearly structured program. The reason for this linear system setup, I believe, is indeed because we often only see students for a month before they disappear, and we need to get an educational gain from these students in about as many weeks to maintain our funding. However, I do have the entire class do all lessons no matter what their grade level is on their most recent TABE. Even if a student just started coming to class and is at a fourth grade math level, he or she will do the algebra lesson with the rest of us. I am a firm believer that just because a student doesn't know his/her times tables yet, that doesn't mean that same student won't be able to understand that if 10 + a = 12, a = 2. I also believe that when a student sees that he/she can do algebra, it builds self esteem. I hope this answers your question. I hope to hear other people's ideas as well! Thanks, Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060624/be65aacb/attachment.html From mmanly at earthlink.net Sat Jun 24 17:37:39 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Sat, 24 Jun 2006 14:37:39 -0700 Subject: [Assessment 387] Re: Math as a puzzle, or swimming In-Reply-To: Message-ID: It is uncanny that David would mention puzzles and swimming as analogies to math anxiety. We just met a few months ago and he doesn't know that, not only am I a devoted puzzle person, but I taught swimming for many years! I started to write some detailed comparisons between the steps in learning to swim and similar benchmarks when learning to be comfortable with math, but that soon got too complex. However, I can confidently say that in both cases, an anxious person overcomes fear by becoming familiar with how the "medium" works. For example, the learner gains security by knowing how buoyancy is affected by the different positions of your body (you can float and glide on top of the water more effectively when your head is submerged) and how numbers act in consistent ways in various applications (the properties hold whether you are finding an average or a perimeter. Stimulating questions! Myrna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of David Rosen Sent: Friday, June 23, 2006 3:59 AM To: The Assessment Discussion List; The Women and Literacy Discussion List Subject: [Assessment 379] Math as a puzzle, or swimming Myrna, Judy, and others, I am cross-posting this question to both the Women Literacy and Assessment lists, and hope that anyone who wishes to will join in. On National Public Radio Weekend Edition Sunday, in the Will Shortz "Puzzle Master" segment, the Public Radio host, Liane Hansen, often asks the contestant, "Are you a puzzle person?" How would you answer this question? For me, it's complicated. If I knew I wouldn't have to compete on the radio, and if I had as much time as I needed, I might say "sometimes," depending on the kind of puzzle. Those who would without waffling say yes, do not have "puzzle anxiety." They confidently dive into the deepest, coldest puzzle knowing that even if they thrash about they won't sink, and that they also know several strokes (strategies) in addition to treading water. Those who hesitate, qualify their "yes", or answer "no" have probably gulped water a few times, and it wasn't fun. They may be thinking that these waters are dangerous. So here's my question. How do you as a teacher help those who are not "puzzle people," or "math people," become more confident? Is it best for them to learn a few strokes first in shallow water? Or to dive right in to the deep parts with a buddy who can swim? What is the teacher's role as lifeguard? What are some strategies to help the most anxious to put their toes in the water? How do you help a mature fish to not feel foolish learning to swim next to all these smart fry swimming circles around them? How do you help an cautious swimmer become a strong swimmer? And since overcoming any anxiety is tough work, what do you tell your students is the reward? What's so great about swimming when you can enjoy sitting on a sunny beach or walking on the shore? And do you have any good stories? Let's hear about one of your students who was "aquaphobic" and who now loves to dive and to play water polo, or who at least can enjoy an occasional swim. How, exactly did that transformation happen? What was your role? 
David Rosen djrosen at comcast.net ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From djrosen at comcast.net Sat Jun 24 18:07:31 2006 From: djrosen at comcast.net (David Rosen) Date: Sat, 24 Jun 2006 18:07:31 -0400 Subject: [Assessment 388] Re: Final questions on math In-Reply-To: References: Message-ID: <95FBAD05-B8BE-4A70-BF8D-1967DE35012E@comcast.net> Hello Myrna, As you know, the National Council of Teachers of Mathematics (NCTM) several years ago established standards for the teaching of mathematics. These were the inspiration and basis for adult mathematics curriculum frameworks standards in my state, and possibly others. I believe these standards are embraced by the Adult Numeracy Network. Curriculum materials (EMPower, for example) have been developed to help teachers who want to teach to these standards. Do we have any adult math assessments which are congruent with these standards? If so, which ones? If the most widely used pre-post assessment for measuring adult math level gains (Educational Functional Level in NRS language) is the TABE, and if the TABE is measuring the wrong things, then doesn't that mean that teachers who are effectively teaching math in the right way, and students who are learning to think mathematically, to get the most important mathematics learning, are being punished and penalized? And, from a national perspective, doesn't that mean that the NRS math level gain data are not valid? Is it time for a revolution in adult numeracy assessment? Should the federal government be making that investment if the private sector is not interested? David J. Rosen newsomeassociates.com djrosen at comcast.net On Jun 24, 2006, at 3:01 PM, Myrna Manly wrote: > Tina and all, > > > > Each of your postings brings up so many interesting aspects. For > example, in Tina's message below, I see that the program uses the > TABE to diagnose learner weaknesses in computation when they enter > the program and then focuses instruction on the computation basics > by using workbooks and study guides. It seems as though Tina is > agreeing with that policy in that the students show a quick gain in > skills and funding is maintained. That is certainly an important > aspect - we need to keep the doors open. > > > > She also mentions that she includes all levels of students in an > algebra lesson and even the low-level learners gain self esteem > (and learn about variables and making generalizations) by > succeeding with it. However, it is safe to assume that this > learning does not have direct value in helping to raise their > scores when they retake the TABE for accountability purposes > because the test items at the early levels concern themselves > mostly with naked computation (no context). This exposes a huge > problem for math assessment, and in turn for math instruction, in > ABE. If there is value added to the students' learning by early > algebra exposure, it should be recognized in the accountability > measures. > > > > In the meantime, what can a program or its teachers do?
You all > have made great suggestions: use manipulatives or realia to > facilitate deeper understanding of the operations with number, > teach computational skills along with conceptual understanding, > insist that computations be motivated by a real situation or an > application in measurement or data, use benchmarks and estimation > as a check for reasonableness. These tactics serve both masters - > improving student understanding and attitude and, at the same time, > raising test scores. Actually, there may be a third - keeping > students in the program longer than a month. > > > > Thanks for sharing this, Tina. I also would like to hear others' > opinions. > > > > Myrna > > > > > > From: assessment-bounces at nifl.gov [mailto:assessment- > bounces at nifl.gov] On Behalf Of Tina_Luffman at yc.edu > Sent: Friday, June 23, 2006 12:52 PM > To: The Assessment Discussion List > Subject: [Assessment 383] Re: Final questions on math > > > > Myrna and all, > > > > I believe the reason I place many students into numeracy workbooks > first is that I want to be sure students can do basic adding, > subtracting, multiplying, and dividing so that when they perform > the other functions of fractions, decimals, percents, algebra, and > geometry, that their incorrect responses are due to not > understanding the new level of math rather than just making a > calculation error. Of course, even we as instructors still make > calculation errors, but I would like to minimize the frustration. I > have had numerous students with math anxiety or other math > disability get really frustrated with themselves when getting the > answer wrong adding fractions, for example, when all they did was > make a simple calculation error. Identifying where the error came > from is paramount to these students' sense of capability. > > > > The primary reason I begin most students with basic calculation > skills is due to our program's methodology. We are instructed as > new instructors to give the TABE and then to give a math pretest > even if the student is 11th grade + level. (One interesting finding > I have is having recent high school students come to me able to do > geometry and algebra, but they have forgotten how to divide. I am > sure many of you have had similar experiences.) This pretest > identifies challenge points in +, -, X, & ./. as well as decimals, > fractions, percents, ratio, proportion, and measurement. Then we > are instructed to place the students into the lowest level book and > work them up to the top level using study guides. We are a highly > linearly structured program. The reason for this linear system > setup, I believe, is indeed because we often only see students for > a month before they disappear, and we need to get an educational > gain from these students in about as many weeks to maintain our > funding. > > > > However, I do have the entire class do all lessons no matter what > their grade level is on their most recent TABE. Even if a student > just started coming to class and are at a fourth grade math level, > he or she will do the algebra lesson with the rest of us. I am a > firm believer that just because a student doesn't know his/her > timestables yet, that doesn't mean that same student won't be able > to understand that if 10 + a = 12, a = 2. I also believe that when > a student sees that he/she can do algebra, it builds self esteem. > > > > I hope this answers you question. I hope to hear other people's > ideas as well!
> > Thanks, > > > Tina > > > > > Tina Luffman > Coordinator, Developmental Education > Verde Valley Campus > 928-634-6544 > tina_luffman at yc.edu > > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From djrosen at comcast.net Sun Jun 25 08:22:39 2006 From: djrosen at comcast.net (David Rosen) Date: Sun, 25 Jun 2006 08:22:39 -0400 Subject: [Assessment 389] Re: Math as a puzzle, or swimming In-Reply-To: References: Message-ID: <78F20CD9-83F1-49C8-8C3A-9F0821728A97@comcast.net> Hello Myrna, Please do write a detailed comparison between learning to swim, doing puzzles, and learning math -- on the ALE Wiki and/or for publication elsewhere. The complexities and details would be fascinating. Many of us could benefit, as teachers and learners, from your experience. David J. Rosen newsomeassociates.com djrosen at comcast.net On Jun 24, 2006, at 5:37 PM, Myrna Manly wrote: > It is uncanny that David would mention puzzles and swimming as > analogies to > math anxiety. We just met a few months ago and he doesn't know > that, not > only am I a devoted puzzle person, but I taught swimming for many > years! > > I started to write some detailed comparisons between the steps in > learning > to swim and similar benchmarks when learning to be comfortable with > math, > but that soon got too complex. However, I can confidently say that > in both > cases, an anxious person overcomes fear by becoming familiar with > how the > "medium" works. For example, the learner gains security by knowing how > buoyancy is affected by the different positions of your body (you > can float > and glide on top of the water more effectively when your head is > submerged) > and how numbers act in consistent ways in various applications (the > properties hold whether you are finding an average or a perimeter. > > Stimulating questions! > > Myrna > > -----Original Message----- > From: assessment-bounces at nifl.gov [mailto:assessment- > bounces at nifl.gov] On > Behalf Of David Rosen > Sent: Friday, June 23, 2006 3:59 AM > To: The Assessment Discussion List; The Women and Literacy > Discussion List > Subject: [Assessment 379] Math as a puzzle, or swimming > > Myrna, Judy, and others, > > I am cross-posting this question to both the Women Literacy and > Assessment lists, and hope that anyone who wishes to will join in. > > On National Public Radio Weekend Edition Sunday, in the Will Shortz > "Puzzle Master" segment, the Public Radio host, Liane Hansen, often > asks the contestant, "Are you a puzzle person?" How would you answer > this question? For me, it's complicated. If I knew I wouldn't have to > compete on the radio, and if I had as much time as I needed, I might > say "sometimes," depending on the kind of puzzle. > > Those who would without waffling say yes, do not have "puzzle > anxiety." They confidently dive into the deepest, coldest puzzle > knowing that even if they thrash about they won't sink, and that they > also know several strokes (strategies) in addition to treading water. > Those who hesitate, qualify their "yes", or answer "no" have probably > gulped water a few times, and it wasn't fun. They may be thinking > that these waters are dangerous. > > So here's my question. How do you as a teacher help those who are > not "puzzle people," or "math people," become more confident? 
Is it > best for them to learn a few strokes first in shallow water? Or to > dive right in to the deep parts with a buddy who can swim? What is > the teacher's role as lifeguard? What are some strategies to help the > most anxious to put their toes in the water? How do you help a > mature fish to not feel foolish learning to swim next to all these > smart fry swimming circles around them? How do you help an cautious > swimmer become a strong swimmer? > > And since overcoming any anxiety is tough work, what do you tell your > students is the reward? What's so great about swimming when you can > enjoy sitting on a sunny beach or walking on the shore? > > And do you have any good stories? Let's hear about one of your > students who was "aquaphobic" and who now loves to dive and to play > water polo, or who at least can enjoy an occasional swim. How, > exactly did that transformation happen? What was your role? > > David Rosen > djrosen at comcast.net > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment David Rosen djrosen at comcast.net From donnac at gwi.net Sun Jun 25 11:35:37 2006 From: donnac at gwi.net (Donna Curry) Date: Sun, 25 Jun 2006 11:35:37 -0400 Subject: [Assessment 390] Re: Final questions on math References: Message-ID: <019c01c6986d$3e2fe020$8793c3d8@donnarq57f7nbg> Tina and all, Most adult education workbooks still use the scope and sequence that you've referred to below (add, subtract, etc. with whole numbers before moving on to fractions, then decimals, then percents . . . and, if students are fortunate enough to make it through all that, they tackle algebra). However, in K - 12, the curriculum is now typically based on the research on how people learn math. Algebra, data, number sense, and geometry are taught at all levels and are integrated. It's very rare to find an adult who cannot figure out what half of an amount is. Yet, we don't allow them to work on fractions until they have mastered all their whole number facts. It's actually much easier to figure what 1/2 of 240 is than to multiply 240 x 24 yet that's not the order of skills presented in typical adult ed. math workbooks. You can show students the application of number sense skills by having them explore geometry concepts such as perimeter and area. Students, when they draw pictures of rectangular shapes can add and group and see how multiplication is repeated addition. They can explore algebraic reasoning by looking at patterns in their lives: for example, if I make $10 an hour, I can make a simple table to show how much I would make in 2 hours, 5 hours, etc. Again, this gives them an opportunity to practice their basic facts but in the context of algebra. Those simple patterns (such as $$ per hour) lead to general rules, which can then eventually lead to expressions (total pay = $10 x number of hours worked). Given the opportunity to collect simple data (such as those who have children and those who do not), students can then create simple graphs and begin to make verbal comparisons, including fractions and percents (over 50% of the class have children). 
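To put a concrete shape on that hourly-pay pattern, here is a minimal sketch (written in Python purely for illustration; the $10 wage comes from the example above, and the particular hours listed are made up):

```python
# Illustrative sketch only: tabulate the $10-an-hour pattern described above,
# then state the general rule once instead of recomputing every case.
wage = 10  # dollars per hour, taken from the example in this message

for hours in [1, 2, 5, 8, 40]:  # hours chosen arbitrarily for the sketch
    pay = wage * hours          # the general rule: total pay = $10 x number of hours worked
    print(hours, "hours ->", pay, "dollars")
```

Seeing the same multiplication repeated row after row is what motivates writing the rule once as an expression rather than starting each calculation over from scratch.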
This early exposure to algebra, geometry, and data, along with number sense, will ensure that students can apply what they are learning in different situations rather than having to learn isolated skills before they understand how those skills are used in other contexts. Donna Donna Curry Center for Literacy Studies University of Tennessee, Knoxville ----- Original Message ----- From: Tina_Luffman at yc.edu To: The Assessment Discussion List Sent: Friday, June 23, 2006 4:52 PM Subject: [Assessment 383] Re: Final questions on math Myrna and all, I believe the reason I place many students into numeracy workbooks first is that I want to be sure students can do basic adding, subtracting, multiplying, and dividing so that when they perform the other functions of fractions, decimals, percents, algebra, and geometry, that their incorrect responses are due to not understanding the new level of math rather than just making a calculation error. Of course, even we as instructors still make calculation errors, but I would like to minimize the frustration. I have had numerous students with math anxiety or other math disability get really frustrated with themselves when getting the answer wrong adding fractions, for example, when all they did was make a simple calculation error. Identifying where the error came from is paramount to these students' sense of capability. The primary reason I begin most students with basic calculation skills is due to our program's methodology. We are instructed as new instructors to give the TABE and then to give a math pretest even if the student is 11th grade + level. (One interesting finding I have is having recent high school students come to me able to do geometry and algebra, but they have forgotten how to divide. I am sure many of you have had similar experiences.) This pretest identifies challenge points in +, -, X, & ./. as well as decimals, fractions, percents, ratio, proportion, and measurement. Then we are instructed to place the students into the lowest level book and work them up to the top level using study guides. We are a highly linearly structured program. The reason for this linear system setup, I believe, is indeed because we often only see students for a month before they disappear, and we need to get an educational gain from these students in about as many weeks to maintain our funding. However, I do have the entire class do all lessons no matter what their grade level is on their most recent TABE. Even if a student just started coming to class and are at a fourth grade math level, he or she will do the algebra lesson with the rest of us. I am a firm believer that just because a student doesn't know his/her timestables yet, that doesn't mean that same student won't be able to understand that if 10 + a = 12, a = 2. I also believe that when a student sees that he/she can do algebra, it builds self esteem. I hope this answers you question. I hope to hear other people's ideas as well! Thanks, Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -----assessment-bounces at nifl.gov wrote: ----- To: From: "Marie Cora" Sent by: assessment-bounces at nifl.gov Date: 06/23/2006 01:08PM Subject: [Assessment 382] Final questions on math Hi everyone, What a great and rich discussion this is, thank you all for making this so! There has been a wonderful exchange of interventions, techniques, and strategies to draw from in working with students in math. 
This is the last day with our guest, so I encourage you to ask any final questions or make any comments. Myrna asked this question in a post or two ago and I also would like to hear from folks on what they think: "What is influencing so many ABE teachers to focus only on numerical computation procedures at the early levels?" Is it the state standard documents, the workbooks, the NRS indicators, the TABE test, or their own experience learning math? What do you see out there? Myrna also mentioned ANN in another post, and I just wanted to make sure folks had the information on this: Adult Numeracy Network http://www.literacynet.org/ann/ Ok! Let?s hear any final thoughts! It?s your last chance! Marie Cora Assessment Discussion List Moderator ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------------------------------------------------------ ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060625/b94b73cf/attachment.html From mmanly at earthlink.net Sun Jun 25 14:07:55 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Sun, 25 Jun 2006 11:07:55 -0700 Subject: [Assessment 391] Re: Final questions on math In-Reply-To: <95FBAD05-B8BE-4A70-BF8D-1967DE35012E@comcast.net> Message-ID: Hi David and all. Your statements in the message below are powerful ones. It is indeed time for a revolution in adult numeracy assessment. And, as in any revolution, we can use help from both the grass roots and the establishment. Although our present-day politicians seem to believe that they need to overstate their side of an argument in order to get what they really want when a compromise is reached, I think we should be clear from the start as to our position. So I will try to amend your statement a little (and welcome suggestions from others): "If the most widely used pre-post assessment for measuring adult math level gains (Educational Functional Level in NRS language) is the TABE, and if the TABE is measuring ONLY A NARROW BAND OF MATHEMATICAL SKILLS, then doesn't that mean that teachers who are effectively teaching MATHEMATICS IN A FULLER SENSE, and students who are learning to think mathematically to get the most important mathematics learning, are being punished and penalized? And, from a national perspective, doesn't that mean that the NRS math level gain data are not valid?" We are not saying that computation algorithms are the WRONG thing, but that they are not the ONLY thing. In addition, we do not yet have the kind of evidence that is required to say that our (and NCTM's) way of instruction is THE RIGHT way. But we are saying that the most important mathematics for today's requirements is not being assessed so the validity of the assessment is in question (using validity to mean the degree to which an instrument is representative of the domain it is testing.) I have heard numerous complaints about adult numeracy assessment from all areas of the country. In Massachusetts they funded the development of a new test. What do you know about its composition? 
Myrna -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of David Rosen Sent: Saturday, June 24, 2006 2:08 PM To: The Assessment Discussion List Subject: [Assessment 388] Re: Final questions on math Hello Myrna, As you know, the National Council of Teachers of Mathematics (NCTM) several years ago established standards for the teaching of mathematics. These were the inspiration and basis for adult mathematics curriculum frameworks standards in my state, and possibly others. I believe these standards are embraced by the Adult Numeracy Network. Curriculum materials (EMPower, for example) have been developed to help teachers who want to teach to these standards. Do we have any adult math assessments which are congruent with these standards? If so, which ones? If the most widely used pre-post assessment for measuring adult math level gains (Educational Functional Level in NRS language) is the TABE, and if the TABE is measuring the wrong things, then doesn't that mean that teachers who are effectively teaching math in the right way, and students who are learning to think mathematically, to get the most important mathematics learning, are being punished and penalized? Is it time for a revolution in adult numeracy assessment? Should the federal government be making that investment if the private sector is not interested? David J. Rosen newsomeassociates.com djrosen at comcast.net On Jun 24, 2006, at 3:01 PM, Myrna Manly wrote: > Tina and all, > > > > Each of your postings brings up so many interesting aspects. For > example, in Tina's message below, I see that the program uses the > TABE to diagnose learner weaknesses in computation when they enter > the program and then focuses instruction on the computation basics > by using workbooks and study guides. It seems as though Tina is > agreeing with that policy in that the students show a quick gain in > skills and funding is maintained. That is certainly an important > aspect - we need to keep the doors open. > > > > She also mentions that she includes all levels of students in an > algebra lesson and even the low-level learners gain self esteem > (and learn about variables and making generalizations) by > succeeding with it. However, it is safe to assume that this > learning does not have direct value in helping to raise their > scores when they retake the TABE for accountability purposes > because the test items at the early levels concern themselves > mostly with naked computation (no context). This exposes a huge > problem for math assessment, and in turn for math instruction, in > ABE. If there is value added to the students' learning by early > algebra exposure, it should be recognized in the accountability > measures. > > > > In the meantime, what can a program or its teachers do? You all > have made great suggestions: use manipulatives or realia to > facilitate deeper understanding of the operations with number, > teach computational skills along with conceptual understanding, > insist that computations be motivated by a real situation or an > application in measurement or data, use benchmarks and estimation > as a check for reasonableness. These tactics serve both masters - > improving student understanding and attitude and, at the same time, > raising test scores. Actually, there may be a third - keeping > students in the program longer than a month. > > > > Thanks for sharing this, Tina. I also would like to hear other's > opinions. 
> > > > Myrna > > > > > > From: assessment-bounces at nifl.gov [mailto:assessment- > bounces at nifl.gov] On Behalf Of Tina_Luffman at yc.edu > Sent: Friday, June 23, 2006 12:52 PM > To: The Assessment Discussion List > Subject: [Assessment 383] Re: Final questions on math > > > > Myrna and all, > > > > I believe the reason I place many students into numeracy workbooks > first is that I want to be sure students can do basic adding, > subtracting, multiplying, and dividing so that when they perform > the other functions of fractions, decimals, percents, algebra, and > geometry, that their incorrect responses are due to not > understanding the new level of math rather than just making a > calculation error. Of course, even we as instructors still make > calculation errors, but I would like to minimize the frustration. I > have had numerous students with math anxiety or other math > disability get really frustrated with themselves when getting the > answer wrong adding fractions, for example, when all they did was > make a simple calculation error. Identifying where the error came > from is paramount to these students' sense of capability. > > > > The primary reason I begin most students with basic calculation > skills is due to our program's methodology. We are instructed as > new instructors to give the TABE and then to give a math pretest > even if the student is 11th grade + level. (One interesting finding > I have is having recent high school students come to me able to do > geometry and algebra, but they have forgotten how to divide. I am > sure many of you have had similar experiences.) This pretest > identifies challenge points in +, -, X, & ./. as well as decimals, > fractions, percents, ratio, proportion, and measurement. Then we > are instructed to place the students into the lowest level book and > work them up to the top level using study guides. We are a highly > linearly structured program. The reason for this linear system > setup, I believe, is indeed because we often only see students for > a month before they disappear, and we need to get an educational > gain from these students in about as many weeks to maintain our > funding. > > > > However, I do have the entire class do all lessons no matter what > their grade level is on their most recent TABE. Even if a student > just started coming to class and are at a fourth grade math level, > he or she will do the algebra lesson with the rest of us. I am a > firm believer that just because a student doesn't know his/her > timestables yet, that doesn't mean that same student won't be able > to understand that if 10 + a = 12, a = 2. I also believe that when > a student sees that he/she can do algebra, it builds self esteem. > > > > I hope this answers you question. I hope to hear other people's > ideas as well! 
> > Thanks, > > > Tina > > > > > Tina Luffman > Coordinator, Developmental Education > Verde Valley Campus > 928-634-6544 > tina_luffman at yc.edu > > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From mmanly at earthlink.net Sun Jun 25 14:29:05 2006 From: mmanly at earthlink.net (Myrna Manly) Date: Sun, 25 Jun 2006 11:29:05 -0700 Subject: [Assessment 392] Re: Math as a puzzle, or swimming In-Reply-To: <000501c69786$60f468c0$81bd044b@Judith> Message-ID: Greetings, Judith has summed up the experience of this week of discussion very well. While every comment may not have been perfectly suited for your particular situation, we did learn from each other. My week as a guest on the list serve is over, but I will continue to be available here as just another member of the assessment discussion list. Thanks to Marie for arranging this opportunity to bring math and numeracy to the forefront and thanks to all of you who participated. Myrna _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Judith Diamond Sent: Saturday, June 24, 2006 4:05 AM To: 'The Assessment Discussion List' Subject: [Assessment 385] Re: Math as a puzzle, or swimming Hi Pam, One of the things that is so enjoyable about an online discussion is that occasionally you read something that changes the way you see things and that you will clip and save and use. Your idea of journals and how you structure them might not resonate with all students.for some writing is just another headache.but I can see it being a really meaningful connection both to math and to their teacher for others. Thanks. Judith Diamond Adult Learning Resource Center -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Mdr151 at aol.com Sent: Friday, June 23, 2006 10:52 PM To: assessment at nifl.gov Subject: [Assessment 384] Re: Math as a puzzle, or swimming One of the greatest joys of being a math teacher is to see students dread and hate of math turn to joy for math. One student that comes to mind just recently graduated from a technical college. She shared with me that it took her 9 years to consider coming back to school and facing math. I was so fortunate to have her for a student and watch her fear of math melt away. One way I address math phobia is through journal writing. Students are required to respond to a math prompt, provide examples, make connections and lastly to reflect on their learning. It is in the reflections that the fear and struggles are revealed. I journal back, giving them encouragement and offering suggestions. This particular student's journals showed her excitement in finally understanding math and seeing connections. She continues today to send me emails about math websites she has found or successes she has had since being in college.She was so proud to share that she got an A in technical physics and graduated with honors. I truly believe we must develop a safe environment for math learners, one in which they dare to take risks and not feel stupid in trying and one that offers support in their stuggles. 
Pam Meader President, Adult Numeracy Network -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060625/d15949f3/attachment.html From marie.cora at hotspurpartners.com Mon Jun 26 11:56:47 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 26 Jun 2006 11:56:47 -0400 Subject: [Assessment 393] Discussion Thanks! Message-ID: <005d01c69939$21907a20$6f49e947@LITNOW> Dear Colleagues, I would like to thank Myrna Manly for being our Guest last week and facilitating a super conversation on math and assessment! Thanks as well to all of you who contributed to the discussion - I'm sure subscribers found the discussion as interesting and as informative as I did. I will prepare the Math Discussion in user-friendly format, and post it this week at the following locations for your use (I'll send out an alert when it's posted): * The archive on the NIFL Discussion Lists Homepage (http://www.nifl.gov/lincs/discussions/discussions.html): click on the Guest Speaker button in the left toolbar for the archives of Guests on the NIFL Lists. * The ALE Wiki Assessment Section (http://wiki.literacytent.org/index.php/Assessment_Information): click on Discussions. * You can read the entire thread now if you go to the Assessment archive at: http://www.nifl.gov/mailman/listinfo/Assessment: click on Read Current Posted Messages. Thanks again to Myrna for leading such a wonderful discussion on math and assessment. For anyone who wishes to continue the discussion, I encourage you to do so - there were a number of points and questions raised toward the end of the week that I've no doubt folks have opinions about. Thanks to all again, Marie Cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060626/eaaf9995/attachment.html From mdick at lagcc.cuny.edu Mon Jun 26 13:46:54 2006 From: mdick at lagcc.cuny.edu (Mae Dick) Date: Mon, 26 Jun 2006 13:46:54 -0400 Subject: [Assessment 394] Re: Math anxiety and assessment In-Reply-To: References: <4583411.1151007242913.JavaMail.bddavis@butlercc.edu> Message-ID: <449FE54E0200000A0019FC94@mailgate.lagcc.cuny.edu> Hi David et al. I'm sorry, but the adult literacy site I recommended for math resources is intended as an internal one, and is not supposed to be public. Putting out the username and password was an error. Having said that, what is it you are interested in? If you email Steve at steve.hinds at mail.cuny.edu he could make suggestions for math material appropriate for your students. >>> David Rosen 06/22/06 4:23 PM >>> Beverly and others, On Jun 22, 2006, at 4:14 PM, Beverly Davis wrote: > Mae, I went to the website you listed and couldn't get to any math > problems. Could you tell me what I am doing wrong. Thank you! 1. Go to http://www.literacy.cuny.edu/ 2. Select "Resources" 3. At the prompt, type in the user name literacy and the password resources06 David Rosen djrosen at comcast.ne > > > Mae Dick wrote: > > >> Hi there. I thought you might be interested in a math resource >> that was > developed by Steve Hinds, a staff developer for adult literacy > programs > in the City University of New York. Here's a quote from a workshop > Steve recently offered at the Literacy Assistance Center in NYC. He > says " Adult Literacy programs traditionally limit students in > low-level classes to computation practice out of workbooks. 
Algebra, > data and geometry topics are considered too difficult for these > students until they have 'mastered the basics.' Steve believes that > students can increase their mathematical reasoning, number sense and > enjoyment of math through the kinds of exercises he presents on the > CUNY web site. Check it out. Go to . The user > name is literacy and the password is resources06. >> >>>>> "Myrna Manly" 06/20/06 5:03 PM >>> >> Hi Lisa, >> I'm happy to hear that you and your students are enjoying the book. >> Introducing algebraic thinking early in student's math study has now > become >> widespread in the reform math efforts in K-12. (It is also a hallmark > of the >> new EMPower series for adults.) In 1992, when I wrote the first >> edition > of >> the book, I based my early-algebra-integration decision on my own > experience >> as one who had taught algebra to students at many levels and as an > 'insider' >> with respect to the GED Math test. (I had just left my job at GEDTS.) >> >> The overarching principle when formulating items for the GED math >> test > is to >> assess the "major and lasting outcomes and skills of a high school >> education." For the most part, this means that the skills and >> concepts > that >> are tested are ones that have some practical value. With respect to > algebra, >> I felt that using the concept of a variable, solving simple >> equations, > and >> graphing linear functions were the most obvious topics to be > represented. >> >> As an algebra teacher, I had seen the difficulty that students had in > making >> the transition to using variables and had added extra lessons to the >> textbooks that reviewed arithmetic principles by using variables in > place of >> specific numbers - that is, I used algebra to generalize arithmetic. >> So, it was an easy decision for me to integrate algebra early - both > from a >> mathematics pedagogy standpoint and from an adult student attitude >> perspective (knowing that many feel insulted by a review of >> arithmetic > even >> if their entrance scores indicate that need). >> >> As to your question about the results obtained when students are > introduced >> to algebraic ideas early in their mathematics study, I'm afraid >> that I > have >> no data to substantiate better scores sooner. (That topic may be one > that a >> practitioner would like to investigate as a project for the ANN > practitioner >> research grants.) >> >> Thanks for the question, >> Myrna >> >> >> >> -----Original Message----- >> From: assessment-bounces at nifl.gov [mailto:assessment- >> bounces at nifl.gov] > On >> Behalf Of Lisa Mullins >> Sent: Tuesday, June 20, 2006 7:02 AM >> To: The Assessment Discussion List >> Subject: [Assessment 368] Re: Math anxiety and assessment >> >> Myrna, >> In your book The Problem Solver you tackle algebraic >> concepts in the very beginning of the book. This is >> in contrast to many books on the market. I use this >> technique as well. My students are caught by the fact >> that algebra (a scary term for some) is so simple and >> can be used for many reasons. However, some people are >> skeptical that this will result in better scores or >> better understanding. >> >> Can you discuss the contrasts of learning math >> beginning with whole numbers and working up to >> algrebra versus using algebra as a problem solving >> method with all number systems throughout the math >> learning process. Are the results better scores >> sooner? 
>> >> Thanks, >> Lisa Mullins >> Hawkins County Adult Ed >> Rogersville, Tennessee >> >> http://www.nifl.gov/mailman/listinfo/assessment >> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment >> > > --------------------------------------- > It is not the load that breaks you, it > is the way you carry it. > > Beverly Davis > ABE/GED Butler Community College > Instructional Coordinator > (316) 321-4030, ext. 113 > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From mnguyen at nifl.gov Wed Jun 28 11:41:24 2006 From: mnguyen at nifl.gov (My Linh Nguyen) Date: Wed, 28 Jun 2006 11:41:24 -0400 (EDT) Subject: [Assessment 395] National Institute for Literacy Appoints New Staff Message-ID: <20060628154124.4535546555@dev.nifl.gov> Hello everyone, My name is My Linh Nguyen, and I am the new Associate Director of Communications for the National Institute for Literacy. I would like to take this opportunity to introduce myself along with three other new hires at the Institute. The press release below introduces each of the four new hires and identifies our roles and backgrounds. Thank you, and I look forward to working with all of you. National Institute for Literacy Appoints New Staff The National Institute for Literacy has appointed four new staff members to expand its capacity to contribute to improvements in adult, adolescent, childhood and early childhood literacy. Andrea Grimaldi has joined the Institute as Senior Project Officer in Early Childhood Literacy. She will be responsible for planning and managing the Institute's work on early childhood literacy. She will also oversee dissemination of the National Early Literacy Panel Report, expected to be released in late 2006. My Linh Nguyen has joined the Institute as Associate Director of Communications. In this capacity, she will plan and manage communications activities designed to raise awareness of literacy issues and the Institute's products and services. The two other new staffers will join the Institute in July. Susan Boorse, who will serve as Executive Officer, will have responsibility for budgetary and financial management activities as well as administrative functions. Heather Wright has been selected to serve as Dissemination Specialist. In this capacity, she will plan and oversee implementation of the Institute's print and electronic products to ensure that they are widely and easily available. "The arrival of these four individuals signals an exciting new era for the National Institute for Literacy," said Dr. Sandra Baxter, the Institute's Director. "Each of them brings fresh ideas and new perspectives along with their diverse backgrounds and expertise.
Their presence will reenergize our existing programs and help us carry out new efforts to better serve the adult and childhood literacy communities." All four new appointees have a wide variety of experience in their fields. Ms. Grimaldi has more than a decade of experience as an early childhood education teacher, program manager, and trainer. Most recently, she served as the training manager for professional development with the Public Broadcasting Service's (PBS) five-year Ready To Learn Initiative to prepare young children for success in school through educational television, web-based media and training for parents and teachers. Ms. Nguyen comes to the Institute from the Delaware River Port Authority of Pennsylvania and New Jersey, where she served as manager of corporate communications and public information officer. She is a former general assignment reporter and copy editor for the St. Louis Post-Dispatch. Ms. Boorse served in the Peace Corps for eight years as an administrative and budget officer. She also served four years as a VISTA volunteer, working with a literacy program in Philadelphia, a rural school district in Mississippi, and a fledgling Philadelphia Habitat for Humanity affiliate. Ms. Wright comes to the Institute from the Montgomery County Department of Public Libraries in Maryland, where, as Children's Librarian, she was involved in early literacy programs for babies, toddlers, preschoolers and elementary-school children. Her previous professional background is in the field of marketing research, where she conducted many research studies to measure the needs of customers and target markets, both in the library setting and in the private sector. The National Institute for Literacy provides leadership on literacy issues, including the improvement of reading instruction for children, youth, and adults. In consultation with the U.S. Departments of Education, Labor, and Health and Human Services, the Institute serves as a national resource on current, comprehensive literacy research, practice, and policy. My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From marie.cora at hotspurpartners.com Thu Jun 29 12:33:27 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 29 Jun 2006 12:33:27 -0400 Subject: [Assessment 396] Math discussion available Message-ID: <001d01c69b99$c0b5f690$6f49e947@LITNOW> Hi everyone, I've prepared the Math Discussion with Myrna and it is now posted at the ALE Wiki: http://wiki.literacytent.org/index.php/Assessment_Information and click on Discussions. This will also be posted shortly at the NIFL Assessment Discussion Guest Archives: http://www.nifl.gov/lincs/discussions/list_guests.html I hope you find this useful - if you do use it in some way, please let me know how you have done so. This is really helpful information for us - the more we know about how you use the Discussion Lists, the better we can serve your needs. Also - I just wanted to note that the math discussion on the Women & Lit List is carrying on and it continues to be extremely interesting. Thanks!! Marie Cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060629/854cbb68/attachment.html From marie.cora at hotspurpartners.com Fri Jun 30 09:02:52 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 30 Jun 2006 09:02:52 -0400 Subject: [Assessment 397] FW: [AAACE-NLA] testing Message-ID: <000f01c69c45$7f92ee50$6f49e947@LITNOW> Dear Colleagues, I thought you would find this email exchange of interest. The following response to both Janet's and Hal's posts comes from Bob Bickerton. (Read from the bottom up to get the thread in the correct order!). Thoughts? Comments? Is your state satisfied with the elements that make up its accountability system? Is anyone else developing their own assessments? How do we hold onto the good purpose of standardization ("level the playing field") and improve how we (or test publishers) understand the scope of content as well as test design? Do you find that our present landscape of tests and testing "adds value" or not? What do you think? Marie Cora Assessment Discussion List Moderator ______________________________________ Dear Colleagues, Janet's question is key, as are the observations from Hal Beder's study. There is an aspect of this conversation that I hope will be explored and vetted on this list and that is: Under what circumstances does testing have/add value, to what extent, and what other measures must accompany the value that testing (may) bring? My concern is that the volume of criticism of standardized testing leads some (many?) to a conclusion that there is no such thing as "added value" by such testing. I disagree. Much of the criticism of standardized testing is right on target. Many such tests are poorly designed. Perhaps even worse, many standardized tests, whether well or poorly designed, are used for purposes for which they were not designed and are not appropriate for. Complicating this second point is that many test marketers are all too ready to blur the line between what is and is not an appropriate use of the test -- for them, testing ethics come second after profits. When I started as an adult educator (some 35 years ago) I was an adamant opponent of standardized testing -- primarily based on what I had heard in the media re: test bias -- particularly when used on African American and other "minority" populations. There was already some pressure back then to use standardized tests so I decided to learn everything I could about the "enemy" so I could push back with "facts" rather than just my opinions and my emotional response. While teaching (part time in 3 different programs) I completed four semesters of relatively advanced statistics and two courses in tests and measures. What I learned is that it isn't as clear cut as I had thought. I have continued to read the research and study the pros and cons of standardized testing (I highly recommend reading the "Standards for Educational and Psychological Testing," AERA, APA, NCME) and have concluded that it is possible to develop and conduct standardized testing that adds real value, but that it is extraordinarily difficult (and expensive) to do so.
In response to all the problems we were aware of with the TABE and other standardized tests in Massachusetts, we entered into a partnership with the REEP program in Arlington, VA to adapt their ESOL writing assessment to our purposes, helped field test and ultimately adopted the BEST+ for ESOL oral assessment, and this July 1st we will be transitioning from using the TABE (we held our noses for 2 years) to a MAPT, a brand new ABE online test we developed in partnership with the U.Mass Center for Educational Assessment -- a portion of which is computer adaptive. We've invested a lot of time and money (including hundreds of adult educators and thousands of students) to align this test with the learning standards in our curriculum frameworks (more than 5 years in the making and based on the work of hundreds of adult educators), to pilot the test getting reams of feedback from teachers and students including many face-to-face interviews and real time observations, and we believe the test we begin implementing in a few days WILL add value to the teaching and learning process as well as provide more valid, reliable, fair and legitimate data for our state's ABE performance accountability system. I've copied Jane Schwerdtfeger because she's labored tirelessly in the office -- but even more in the field with our colleagues to help us reach this point. Thank you Jane! Each state will approach these issues differently -- and for the most part, this can be good. What I believe we shouldn't do is: 1. Accept the claims of test publishers. It's the test users responsibility to determine how appropriate a test is for the intended use. 2. Write off all standardized tests as bad and hope they'll just go away -- or spend hours commiserating when they don't. We can use our time better than that. [NOTE: the "Standards" referenced above took 10 years of debate among AERA, APA and NCME during the very time when criticisms of standardized testing were at their peak. Many of those involved in the debate had very strong reservations about the quality of many standardized tests and how they were often misused. The standards are designed to address these very issues.] take care, bob bickerton, MA associate commissioner of education and come 7/1/06 past chair of NCSDAE/NAEPDC (welcome back to the chair role to Israel Mendoza of WA) and former MA state director of adult education (3 cheers for Anne Serino, MA SDAE!). -----Original Message----- From: aaace-nla-bounces at lists.literacytent.org [mailto:aaace-nla-bounces at lists.literacytent.org]On Behalf Of Janet Isserlis Sent: Wednesday, June 28, 2006 9:14 AM To: National Literacy Advocacy List sponsored by AAACE Subject: [AAACE-NLA] testing AND what do the tests actually test? Janet Isserlis > From: Jon Steinberg > Reply-To: National Literacy Advocacy List sponsored by AAACE >> Date: Tue, 27 Jun 2006 11:48:37 -0400 > > Re: Adult Literacy Assessment > Another NCSALL publication, "Lessons from NCSALL's Outcomes and Impacts > Study" by Hal Beder in FOB (http://www.ncsall.net/?id=386) summarizes an > analysis of 17 studies of adult literacy programs. This article notes > that although standardized tests showed little evidence of progress, > most learners asserted that they had made significant gains. Beder > offers various hypotheses that might explain this discrepancy. 
At a > minimum, his analysis should make us wary of asserting that > disappointing test results accurately measure the effectiveness of adult > education programs even though the students in them are so convinced > they are learning that they attend class week after week, often despite > great obstacles. As Marx (Groucho) said, "Who are you going to believe, > me or your own eyes?" > _______________________________________________ AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org http://lists.literacytent.org/mailman/listinfo/aaace-nla LiteracyTent: web hosting, news, community and goodies for literacy http://literacytent.org From edl at world.std.com Thu Jun 29 16:55:25 2006 From: edl at world.std.com (Esther D Leonelli) Date: Thu, 29 Jun 2006 16:55:25 -0400 Subject: [Assessment 398] Re: Math discussion available In-Reply-To: <001d01c69b99$c0b5f690$6f49e947@LITNOW> Message-ID: Hello, Marie - some of the math discussion spilled over on to the Numeracy list as well with people responding to the NIFL announcement which I cross-posted there. If you want I will copy them also to your discussion although they may be more responsive to the women & literacy list. Thanks for hosting the discussion and thanks to Myrna for a lively week of math talk. Esther Moderator, ANN Numeracy list (numeracy at world.std.com) On Thu, 29 Jun 2006, Marie Cora wrote: > Hi everyone, > > I've prepared the Math Discussion with Myrna and it is now posted at the > ALE Wiki: http://wiki.literacytent.org/index.php/Assessment_Information > and click on Discussions. > > This will also be posted shortly at the NIFL Assessment Discussion Guest > Archives: http://www.nifl.gov/lincs/discussions/list_guests.html > > I hope you find this useful - if you do use it in some way, please let > me know how you have done so. This is really helpful information for us > - the more we know about how you use the Discussion Lists, the better we > can serve your needs. > > Also - I just wanted to note that the math discussion on the Women & Lit > List is carrying on and it continues to be extremely interesting. > > Thanks!! > > Marie Cora > Assessment Discussion List Moderator > > > > From marie.cora at hotspurpartners.com Fri Jun 30 09:28:02 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 30 Jun 2006 09:28:02 -0400 Subject: [Assessment 399] Re: Math discussion available In-Reply-To: Message-ID: <000501c69c49$037961b0$6f49e947@LITNOW> Hi Esther - thanks for this. FYI everyone: to subscribe to the Numeracy Discussion List and to view the archives, go to: http://www.literacynet.org/ann/numeracylist.html Marie Cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Esther D Leonelli Sent: Thursday, June 29, 2006 4:55 PM To: Marie Cora Cc: Assessment at nifl.gov Subject: [Assessment 398] Re: Math discussion available Hello, Marie - some of the math discussion spilled over on to the Numeracy list as well with people responding to the NIFL announcement which I cross-posted there. If you want I will copy them also to your discussion although they may be more responsive to the women & literacy list. Thanks for hosting the discussion and thanks to Myrna for a lively week of math talk. 
------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From bgiven at gmu.edu Fri Jun 30 09:46:03 2006 From: bgiven at gmu.edu (Barbara K Given) Date: Fri, 30 Jun 2006 09:46:03 -0400 Subject: [Assessment 400] Re: FW: [AAACE-NLA] testing In-Reply-To: <000f01c69c45$7f92ee50$6f49e947@LITNOW> References: <000f01c69c45$7f92ee50$6f49e947@LITNOW> Message-ID: Since my colleagues and I are in the throes of developing a computer-driven adult assessment tool in reading, I'd like to learn more about the MAPT to avoid redundancy. We have a prototype developed and are moving forward with item development, but we are a long way from having a tool ready for field testing. When we get that far, I trust there will be practitioners out there willing to assist us in standardization. Yes, standardization is essential even for tests used to measure specific progress. barb given Barbara K. Given, Ph.D. Director, Adolescent and Adult Learning Research Center Krasnow Institute for Advanced Study, and Director, Center for Honoring Individual Learning Diversity, an International Learning Styles Center George Mason University Fairfax, VA 22030-4444 Fax: 703-993-4325 Ph: 703-993-4406 
From edl at world.std.com Fri Jun 30 10:31:25 2006 From: edl at world.std.com (Esther D Leonelli) Date: Fri, 30 Jun 2006 10:31:25 -0400 Subject: [Assessment 401] Re: Math discussion available In-Reply-To: <000501c69c49$037961b0$6f49e947@LITNOW> Message-ID: Hello, Marie - unfortunately the Numeracy list is no longer archived on the NIFL LINCS web-site. The Numeracy list is still archived on The Math Forum at Drexel: http://mathforum.org/kb/forum.jspa?forumID=219 Folks can also subscribe without joining ANN (although we welcome all adult math educators to do so): To subscribe to the Numeracy list, write to: majordomo at world.std.com In the message area, type: subscribe numeracy 
There is also a Numeracy Research and Practice topic section on the ALE Wiki - http://wiki.literacytent.org/index.php/Numeracy_Research_and_Practice I will post a link there to your discussion section and copy some of the Numeracy list discussion there. Esther Moderator, Numeracy list ALE Wikiteer for Numeracy ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From barguedas at sfccnm.edu Fri Jun 30 11:11:11 2006 From: barguedas at sfccnm.edu (Barbara Arguedas) Date: Fri, 30 Jun 2006 09:11:11 -0600 Subject: [Assessment 402] Re: FW: [AAACE-NLA] testing Message-ID: <4CFDD6B88B634C409A76C0F44B3509BE029C47CD@ex01.sfcc.edu> Thank you for the great dialogue on standardized assessment. Bob Bickerton says "...this July 1st we will be transitioning from using the TABE (we held our noses for 2 years) to a MAPT, a brand new ABE online test we developed in partnership with the U.Mass Center for Educational Assessment -- a portion of which is computer adaptive." May I ask if this test -- the MAPT -- is available to other states to purchase? 
I understand that there was a large investment made so I would expect that there would be a cost for others to use it, if in fact it is found to meet the needs of programs outside Massachusetts. It seems we should also adopt the curriculum standards that the MAPT is correlated to. Anyway, just wondering about how to access this instrument rather than reinventing. Thank you. Barbara Arguedas ABE Director Santa Fe Community College Santa Fe, NM 
------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Thu Jul 6 09:05:38 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 6 Jul 2006 09:05:38 -0400 Subject: [Assessment 403] Persistence Study to be discussed on Special Topics Discussion List Message-ID: <005501c6a0fc$e0efc310$0302a8c0@LITNOW> Colleagues, On the Special Topics discussion list from July 10th-18th, Dr. John Comings, Director of the National Center for the Study of Adult Learning and Literacy (NCSALL), will be a guest to discuss his research on student persistence in adult literacy education. John introduces the discussion this way: "When a group of us at World Education were preparing to write the proposal for the funds that have supported NCSALL, we surveyed practitioners and policy makers around the country to help us design our research agenda. Almost 500 people participated in the survey. We asked the survey participants to send us the questions that they wanted answered to help them improve practice in ABE, ESOL, and GED programs. One question was at the top of the list for teachers and second on everyone else's list. One teacher phrased it this way, "Just when they begin to make progress, many students leave the program. How can I keep those students long enough that they can meet their educational goals?" That question formed the basis of a three-phase study of persistence. The first two phases are complete. The first phase surveyed the literature, interviewed 150 students in the six New England states, and identified ways that programs were trying to support the persistence of their students. The report of that first phase can be found at: In the second phase, 9 library literacy programs were provided with funds to implement interventions that might help improve persistence, and our study team observed the programs and interviewed their staff and students. We also followed a cohort of 180 students for 14 months. The report of that second phase can be found at: We are prepared to implement the third phase, but NCSALL no longer has funding to begin a new research project. This next phase would test three interventions. One would add persistence supports to existing classroom programs, one would use a wide range of modes of learning (in programs and through self study on-line and in other ways) that more closely match the way adults manage their learning, and the third would combine these two approaches. I believe the third approach is a promising way to solve the persistence problem, as well as it can be solved. 
I'm looking forward to your questions, but I would also be interested in practical ideas of how to build support for persistence and how to expand opportunities for learning." ----- Special Topics is an intermittent discussion list. The topics open and close throughout the year, so there are periods where there will be no discussion or postings. You can subscribe to the e-list for a particular topic of interest, and then unsubscribe, or you can stay subscribed throughout the year. To participate in this topic, you can subscribe by going to: http://www.nifl.gov/mailman/listinfo/specialtopics David J. Rosen Special Topics Discussion List Moderator djrosen at comcast.net From marie.cora at hotspurpartners.com Fri Jul 7 20:28:50 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 7 Jul 2006 20:28:50 -0400 Subject: [Assessment 404] Testing Message-ID: <001b01c6a225$7cdbb120$0302a8c0@LITNOW> Dear Colleagues, The following post is from Tom Sticht. Marie Cora Assessment Discussion List Moderator ___________________________________________________________________________ In discussing test development efforts Marie Cora asked: "How do we hold onto the good purpose of standardization ("level the playing field") and improve how we (or test publishers) understand the scope of content as well as test design?" My reply: The purpose of "standardization" in testing is not to "level the playing field" if by that we think that all examinees start from the same "level" of knowledge and/or skill. Rather, the purpose of standardization is to have the testing situation approach the methods of experimental research, in which all variables are held constant or standard from one test situation to another while the experimental variable, say reading, is the only variable which is allowed to vary and affect the test performance. This is the purpose of "standardization." But note that this is not the same as "norming" or "scaling" the tests. The latter result from the need to develop scales of measurement to interpret the scores on the tests in some way so that the test performance is indicative of something beyond the test scores themselves. One can conduct standardized testing without providing a norm-referenced scale for interpreting the test scores. A teacher who gives a 100-point test to students about what was taught in the preceding six weeks may make certain that everyone gets the same standards of test administration and then interpret the test scores as 90-100 gets an A, 80-89 a B, 70-79 a C, and so forth. This is a criterion-referenced means of interpreting the scores. In a norm-referenced approach, the teacher would look at the distribution of test scores and decide that 10 percent of students will get As, 20 percent Bs, 30 percent Cs, 20 percent Ds, and the remainder Fs. One then divides the distribution of scores so that these percentages are obtained. In this case, scores of 60-75 may get As, 45-59 Bs, etc. This is known frequently as "grading on the curve." In general, the tests made by publishers (e.g., TABE, ABLE, CASAS, NAAL) are both "normed" in some way and administered in some standardized manner (e.g., time limits are specified, all take the tests in similar conditions to those used in developing the tests, etc.). Interestingly, it is generally the conditions of standardization rather than other factors (e.g., content knowledge) that are permitted to vary when accommodations for the learning disabled are made. 
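To make the contrast above concrete, here is a minimal sketch, in Python, of the two interpretations Tom describes: fixed criterion-referenced cutoffs versus "grading on the curve." The scores are invented and the function names are illustrative only; the cutoffs and percentages come from the example in the post, not from any published test.

# A hypothetical class of ten scores on the same 100-point test.
scores = [75, 72, 68, 66, 61, 58, 55, 52, 47, 40]

def criterion_referenced(score):
    """Fixed cutoffs: 90-100 = A, 80-89 = B, 70-79 = C, 60-69 = D, below 60 = F."""
    for cutoff, letter in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
        if score >= cutoff:
            return letter
    return "F"

def grade_on_the_curve(scores):
    """Norm-referenced: the top 10% of papers get A, the next 20% B,
    the next 30% C, the next 20% D, and the remainder F."""
    ranked = sorted(scores, reverse=True)
    n = len(ranked)
    cutpoints = [round(n * share) for share in (0.10, 0.30, 0.60, 0.80)]  # cumulative quotas
    grades = {}
    for rank, score in enumerate(ranked):
        letter = "ABCDF"[sum(rank >= c for c in cutpoints)]
        grades.setdefault(score, letter)  # tied scores keep the higher letter
    return grades

curve = grade_on_the_curve(scores)
for s in scores:
    print(s, criterion_referenced(s), curve[s])
# Against the fixed cutoffs this class earns only Cs, Ds, and Fs; graded on the
# curve, the very same scores are spread from A down to F.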
The second part of Marie's question, "and improve how we (or test publishers) understand the scope of content as well as test design?", falls into these questions about how we go about developing scales of measurement that are useful for various purposes. Tom Sticht From kabeall at comcast.net Mon Jul 10 19:19:18 2006 From: kabeall at comcast.net (Kaye Beall) Date: Mon, 10 Jul 2006 19:19:18 -0400 Subject: [Assessment 404] New from NCSALL--Adult Student Persistence Message-ID: <002401c6a477$4523dec0$0202a8c0@your4105e587b6> Study Circle Guide: Adult Student Persistence Newly revised to include the second phase of the NCSALL research on adult student persistence, this guide provides comprehensive instructions for facilitating a 10½-hour study circle. It explores what the research says about adult student persistence and ideas for how to apply what is learned in classrooms and programs. The guide is based on a review of the NCSALL research on adult student persistence conducted by John Comings and others, summarized in an article entitled "Supporting the Persistence of Adult Basic Education Students," and other studies on student motivation and retention. It includes articles, resources, and action research reports to help practitioners consider strategies for increasing adult student persistence. This guide provides all the necessary materials and clear instructions to plan and facilitate a three-session study circle with an option for a fourth. Each session lasts three-and-a-half hours. To download the study circle guide, visit NCSALL's Web site: http://www.ncsall.net/?id=896 **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060710/5ba8e9de/attachment.html From MMaralit at NIFL.gov Wed Jul 12 10:08:29 2006 From: MMaralit at NIFL.gov (Maralit, Mary Jo) Date: Wed, 12 Jul 2006 10:08:29 -0400 Subject: [Assessment 405] Persistence Among Adult Education Students Video and transcript Message-ID: <4062487BDB6029428A763CAEF4E1FE5B0B932FA0@wdcrobe2m03.ed.gov> The National Institute for Literacy and the National Center for the Study of Adult Learning and Literacy present Persistence Among Adult Education Students Panel Discussion This 30-minute video focuses on persistence in ABE, ESOL, and GED programs, and features an NCSALL study entitled "Supporting the Persistence of Adult Basic Education Students." Dr. John Comings' presentation examines student persistence in adult education programs. He presents a working definition of persistence, examines existing research, and describes NCSALL's three-phase study of the factors that support and inhibit persistence. Other panelists include two practitioners, Kathy Endaya and Ernest Best. You will find the streamed video and transcript by going to: http://www.nifl.gov/nifl/webcasts/persistence/persistence_cast.html You may need to cut and paste the whole web address in your browser, or you could try this shorter version: http://tinyurl.com/s6tcu Macintosh users will need to select the Quicktime format for viewing the presentation. The DVD of the panel will be available within the next two months; for more information, contact info at nifl.gov. Also, it is not too late to join in on the Special Topics list discussion with Dr. 
Comings; for more information, go to: http://www.nifl.gov/pipermail/specialtopics/2006/000088.html Jo Maralit National Institute for Literacy mmaralit at nifl.gov http://www.nifl.gov/ From marie.cora at hotspurpartners.com Mon Jul 17 07:58:46 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 17 Jul 2006 07:58:46 -0400 Subject: [Assessment 406] Questions on ESOL Level Descriptors in NRS Message-ID: <008901c6a998$5c0125a0$0302a8c0@LITNOW> Good morning, afternoon, and evening to you all. I hope this email finds you well. I wanted to let everyone know that during this week, Larry Condelli from AIR (American Institutes for Research), who works with the NRS, and Sarah Young from CAL (Center for Applied Linguistics), who works with BEST Plus, will be available to answer any questions you might have regarding the changes in ESOL Level Descriptors, which go into effect this summer (this month, I believe). I also encourage anyone who has questions regarding other ESOL tests (CASAS or EFF, for example) to join in this Q&A. Because the Level Descriptors have been adjusted, the tests used to track learning gains have also undergone some shifting, and it is important that we understand what these changes are. Larry and Sarah will be present on the List during this week, but perhaps intermittently - replies may not come immediately. I encourage you to post your question to the List, or to send your question to me for posting, if you prefer that. Larry, Sarah, and others working with any of the ESOL tests - feel free to jump in and give us a thumbnail sketch of what the changes are and how they might affect our work in programs and with students. The NRS homepage is located at: http://www.nrsweb.org/ To view information on the NRS Level Descriptors, please go to: http://www.nrsweb.org/reports/NewESLdescriptors.pdf At the bottom of the NRS homepage, see also: NRS Changes for Program Year 2006 Thanks so much - I'm looking forward to understanding this information and hearing what folks' questions are regarding the changes. Marie T. Cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060717/d72a26d9/attachment.html From marie.cora at hotspurpartners.com Mon Jul 17 11:45:28 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 17 Jul 2006 11:45:28 -0400 Subject: [Assessment 407] FW: [SpecialTopics 141] Keeping students' eyes on the prize Message-ID: <00d701c6a9b8$09cb41b0$0302a8c0@LITNOW> Dear colleagues, The following post is from David Rosen and the Special Topics List, which is hosting a discussion on student persistence now. I thought perhaps some of you might be interested in David's second paragraph especially, in which he asks a series of questions on what might help students to persist when their goals are quite far away. For example, David asks about recognition ceremonies, formative assessment, and awarding certificates for incremental achievements, something that was asked on this List a couple of months back, first by Howard Dooley, and then followed up by the email I am pasting below from Ramsey Ludlow: Hello- I believe Howard Dooley was asking whether any program awards certificates to adults for skills attainment- not whether teachers are certified: "Does your state have a process for recognizing skill attainment before a secondary credential is awarded? Is it considering one? 
If so, what standards are included? What levels of performance are recognized? What assessments are being used?" I'd be very interested to hear answers to this; there are many learners who attain useful and marketable skills, but who are many semesters away from a diploma. Do programs award certifications that let a learner show a potential employer what the learner knows and is able to do - before a learner has attained a diploma or GED? thank you, Ramsey Ludlow Oxford Hills Adult Education 256 Main Street South Paris, Maine 04281 ludlowr at sad17.k12.me.us Your thoughts? Reactions? Suggestions? Do any folks out there "formally" recognize formative assessment results or incremental achievements? What do you do? How does it work? Is it successful? Please share your thoughts and experiences. marie cora Assessment Discussion List Moderator ********************************************************************* -----Original Message----- From: specialtopics-bounces at nifl.gov [mailto:specialtopics-bounces at nifl.gov] On Behalf Of David Rosen Sent: Sunday, July 16, 2006 4:26 PM To: Specialtopics at nifl.gov Subject: [SpecialTopics 141] Keeping students' eyes on the prize John, and other colleagues, Part of the persistence challenge is that some adult learners make progress very slowly and have so very far to go before they see the prize they may have their eyes on. The prize might be a high school diploma, a better job, a living wage, a good job with a decent salary and good benefits, or going to college, but these may be basic literacy or beginning English language students who need years of study to achieve one of these goals. One answer might be to increase intensity, more time on task, more hours of study. But this is not always possible for programs, because they lack the funding to increase intensity of classroom instruction, or for learners, who usually have other commitments like working and parenting. Funders -- especially companies when they fund "workplace literacy" -- often want results in a few weeks or months, and even major federal and state funders want results at the end of the fiscal year, either one of these prizes or evidence of progress toward its attainment. Are there some ways we could sustain the student's original motive or goal (the GED diploma, a good job, or an admission to college prize) over several years, if needed? What do we know about strategies like awarding certificates for small achievements, holding annual recognition ceremonies, and providing good formative assessment so students can see they have reached some milestones? How about strategies like building community, providing food, helping students to learn skills that they can use in daily living? Can we articulate from research and/or professional wisdom what strategies work (if any) in sustaining long-term students' motivation and convince funders that we need their support for these strategies? David J. Rosen djrosen at comcast.net From bonniesophia at adelphia.net Mon Jul 17 12:18:45 2006 From: bonniesophia at adelphia.net (Bonnie Odiorne) Date: Mon, 17 Jul 2006 12:18:45 -0400 Subject: [Assessment 408] Re: FW: [SpecialTopics 141] Keeping students' eyes onthe prize In-Reply-To: <00d701c6a9b8$09cb41b0$0302a8c0@LITNOW> Message-ID: <007901c6a9bc$adb8c0f0$0202a8c0@PC979240272114> Re: persistence Sorry I can't join another list serve just now, but I can add this, which I hope Marie will forward; since it concerns "measurable outcomes," I would think it addresses assessment interests as well. 
When I was involved in a WIA-funded Adult Ed/Technology/Workplace program, CT, in accordance with NRS guidelines, was not able to count increased test scores as a positive outcome. We had students who were GED or HS graduates, so we could not report their degree. Still, whether in test scores or in less tangible areas such as study skills, behavior, organization, time management, and the level of discourse and/or comprehension that college transition requires, they were far from ready to go on to higher education. Our program could not justify continued funding based on the need for skills improvement alone. And given the economic downturn that continues in this area, they couldn't obtain jobs that allowed them to utilize the increased education and technological skills they did have, so they were little better off than when they started. I'd imagine that would have an impact on persistence even were our program allowed to continue to serve them. Bonnie Odiorne, Ph.D. Director, Writing Center, Adjunct faculty University Learning Center, Post University Waterbury, CT 
------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From Tina_Luffman at yc.edu Mon Jul 17 13:27:43 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Mon, 17 Jul 2006 10:27:43 -0700 Subject: [Assessment 409] Re: FW: [SpecialTopics 141] Keeping students' eyes on the prize Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060717/96a822af/attachment.html From marie.cora at hotspurpartners.com Mon Jul 17 23:04:20 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 17 Jul 2006 23:04:20 -0400 Subject: [Assessment 410] NRS changes--message for the discussion list Message-ID: <006b01c6aa16$de0ec470$0302a8c0@LITNOW> The following post is from Lisa McKinney. Hello, Marie, Thanks for the opportunity to ask questions about the NRS's ESOL level descriptors and recent changes. I have two questions: 1) My director recently told me that we will still be able to use the BEST Oral test for our NRS reporting, even though I've been told for months that the test would no longer be accepted after September 1, 2006. Are we still able to use the Oral BEST (not BEST Plus, the old Oral test)? 
2) What is the rationale for the 2nd level being so much larger than all the other levels? Yes, I know it was recently changed from a range of 8-41 to a range of 8-36, but it's still larger than any other level. Why? Thanks, Lisa McKinney Coosa Valley Technical College Calhoun, GA lmckinney at coosavalleytech.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060717/04fa9f07/attachment.html From marie.cora at hotspurpartners.com Tue Jul 18 11:32:53 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 18 Jul 2006 11:32:53 -0400 Subject: [Assessment 411] FW: [SpecialTopics 149] Re: Keeping students' eyes on the prize Message-ID: <000d01c6aa7f$7018a670$0302a8c0@LITNOW> Colleagues, The following post is from John Strucker, in response to the discussion focused on capturing incremental learning gains or goals. Marie Cora Assessment Discussion List Moderator ************************************************ Hi David and colleagues, One part of a total approach to improved persistence that we should explore is the one they are trying in the UK. Their adult students take a series of nationally developed curriculum-based benchmark tests that give them feedback on their mastery of various specific competencies and also give them a sense of how much closer they are getting to reaching their long-term goals. Best, John Strucker 
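As a toy illustration of the kind of benchmark feedback John describes, the Python sketch below reports a learner's progress against a checklist of specific competencies, the sort of incremental milestones the certificate question above is about. The goal, competencies, and results are invented for illustration; nothing here comes from the UK system itself.

# Invented benchmark checklist for one learner; competencies and results are hypothetical.
long_term_goal = "GED diploma"
benchmarks = {
    "reads a bus schedule": True,
    "fills out a job application": True,
    "writes a short formal letter": False,
    "interprets a pay stub": True,
    "summarizes a short news article": False,
}

def progress_report(checklist, goal):
    """Report mastered benchmarks and overall progress toward the long-term goal."""
    mastered = [name for name, done in checklist.items() if done]
    share = len(mastered) / len(checklist)
    print(f"Progress toward {goal}: {len(mastered)} of {len(checklist)} benchmarks ({share:.0%})")
    for name in mastered:
        print(f"  certificate earned: {name}")

progress_report(benchmarks, long_term_goal)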
John Strucker, EdD Nichols House 303 Harvard Graduate School of Education 7 Appian Way Cambridge, MA 02138 617 495 4745 617 495 4811 (fax) ------------------------------- National Institute for Literacy Special Topics mailing list SpecialTopics at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/SpecialTopics From sarah at cal.org Tue Jul 18 13:19:44 2006 From: sarah at cal.org (Sarah Young) Date: Tue, 18 Jul 2006 13:19:44 -0400 Subject: [Assessment 411] Re: NRS changes--message for the discussion list Message-ID: <7E0B624DDF68104F92C38648A4D93D8F01845653@MAIL.cal.local> Hi Lisa and all, Thanks for your questions. In a nutshell: 1) The Center for Applied Linguistics (CAL) is retiring the BEST oral interview on September 30, 2006 and will no longer distribute it after that date. Adult ESL programs can continue to use existing supplies of the BEST oral interview, or they can purchase it until September 30. The BEST oral interview can continue to be used for federal reporting purposes even after September 30, 2006, provided that it is on your state's approved list of assessments. It will be up to each state (and, ultimately, the NRS) to determine how long the BEST oral interview will be approved for NRS reporting. The rationale behind the retirement of the BEST oral interview is to better focus on our much newer and improved oral language assessment, BEST Plus. For general information about BEST Plus, please visit the website at www.best-plus.net or email best-plus at cal.org. An additional note for those of you who use the BEST literacy skills section: The current BEST literacy skills section will not be affected by the retirement of the BEST oral interview. However, an updated version of the BEST literacy skills section is in development. This new version will be called BEST Literacy. Forms B and C will be made current without affecting the basic integrity of the test, and a new, parallel form will be available - Form D. Programs will be able to pre-test with the old version of BEST Literacy and post-test with the updated forms. BEST Literacy is scheduled for launch on October 1, 2006. There will be no change in pricing of these forms. More information can be found here: http://www.cal.org/BEST/ 2) Your second question is referring to BEST Literacy (Student Performance Level 2 = new Low Beginning ESL level with a score range of 8-35). As you may know, all of CAL's adult ESL assessments are correlated to Student Performance Levels (SPLs). For background information on SPLs, CAL assessments, and the NRS educational functioning level changes, see http://www.cal.org/BEST/NRSchanges.pdf . When the NRS level changes took effect on July 1, ESL assessments that are correlated to SPLs were also affected. Although 8-35 may seem like a wide range, the description of an SPL 2 in reading and writing best matches student performances on BEST Literacy within that range. Reading and writing SPL descriptors have been posted by the Center for Adult English Language Acquisition (CAELA) here: http://www.cal.org/caela/esl%5Fresources/ I hope this has been helpful. Please feel free to shoot more questions our way! 
Thanks, Sarah Young Center for Applied Linguistics 4646 40th St. NW Washington, DC 20016 Phone: (202) 362-0700 ext. 529 Fax: (202) 362-3740 Web: www.cal.org Email: sarah at cal.org CAL: "Improving communication through better understanding of language and culture" -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060718/ac985ce1/attachment.html From marie.cora at hotspurpartners.com Tue Jul 18 13:47:29 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 18 Jul 2006 13:47:29 -0400 Subject: [Assessment 412] BEST Plus questions Message-ID: <003b01c6aa92$3e04d8d0$0302a8c0@LITNOW> The following post is from Mary Jane Jerde. *************************************************** Greetings, This is primarily for the BEST Plus person. It seemed worthwhile to use BEST Plus in my multilevel Even Start ESL class for several reasons. (The old BEST was an excellent assessment tool for entry into another program where I previously taught. CASAS has some out-of-date items, and the repetition of Forms 53 and 54 is not good for assessment; this is especially obvious when students test into Forms 55 and 56 and go into shock. Finally, BEST Plus is meant to assess the students' comprehension and production of English.) Unfortunately, in our program we've had trouble getting consistent results from assessor to assessor. Yes, we all heard in training that the correlation was over 95%, but we've found that a student's ability to speak clearly seems to pull her/his overall score up. For example, a student may receive a top mark, 3, for oral communication, while really answering very briefly and simply, and end up with a high score. Also, some questions merely require a yes or no response. For an academically or linguistically savvy beginner, it's easy to follow the grammar pattern and respond appropriately. Do you have any ideas on how to counteract these tendencies to score students higher than their normal speech? Thanks, Mary Jane Jerde mjjerdems at yahoo.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060718/d944dc4c/attachment.html From dsnell at racineliteracy.com Tue Jul 18 14:52:11 2006 From: dsnell at racineliteracy.com (Diane Snell) Date: Tue, 18 Jul 2006 13:52:11 -0500 Subject: [Assessment 413] Re: BEST Plus questions In-Reply-To: <003b01c6aa92$3e04d8d0$0302a8c0@LITNOW> Message-ID: <20060718185250.1B6F611D0C@mail.nifl.gov> Currently we "team" administer the BEST Plus. It seems that I (the ESL Coordinator) consistently score higher on the Communication section and lower on the Language Complexity than does our ABE Coordinator. We started using the team approach so that we would be consistent. I still do the BEST Oral (not plus) for low level ESL learners. The team approach is very labor intensive and we are going to have to go it alone soon. Diane Diane K. Snell ESL Education Coordinator Racine Literacy Council 734 Lake Avenue Racine, WI 53403 262-632-9495 www.racineliteracy.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060718/99a53a30/attachment.html From alantoops at cs.com Tue Jul 18 15:42:23 2006 From: alantoops at cs.com (Alan Toops) Date: Tue, 18 Jul 2006 15:42:23 -0400 Subject: [Assessment 414] Re: BEST Plus questions In-Reply-To: <003b01c6aa92$3e04d8d0$0302a8c0@LITNOW> Message-ID: Mary, CASAS is releasing a completely revised ESL piece with natural language prompts and revised items. Alan Toops Executive Director Ohio Literacy Network (614) 505-0717 atoops at ohioliteracynetwork.org http://www.ohioliteracynetwork.org On 7/18/06 1:47 PM, "Marie Cora" wrote: > The following post if from Mary Jane Jerde. > *************************************************** > > Greetings, > > This is primarily for the BEST Plus person. > > It seemed worthwhile to use BEST Plus in my multilevel Even Start ESL > class for several reasons. (The old BEST was an excellent assessment > tool for entry into another program where I previously taught. CASAS has some > out of date items, and the repetition of Forms 53 and 54 is not good for > assessment; this is especially obvious when students test into Forms 55 and 56 > and go into shock. Finally, BEST Plus is to assess the students' > comprehension and production of English.) > > Unfortunately, in our program we've had trouble getting consistent results > from assessor to assessor. Yes, we all heard in training that the correlation > was over 95%, but we've found that a student's ability to speak clearly seems > to pull her/his overall score up. For example, a student may receive a top > mark, 3, for oral communication, while really answering very briefly and > simply, and end up with a high score. Also, some questions merely require a > yes or no response. For an academically or linguistically savvy beginner, it's > easy to follow the grammar pattern and respond appropriately. > > Do you have any ideas on how to counteract these tendencies to score students > higher than their normal speech? > > Thanks, > > Mary Jane Jerde > mjjerdems at yahoo.com > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060718/88477ac4/attachment.html From sarah at cal.org Thu Jul 20 00:59:34 2006 From: sarah at cal.org (Sarah Young) Date: Thu, 20 Jul 2006 00:59:34 -0400 Subject: [Assessment 415] Re: BEST Plus questions Message-ID: <7E0B624DDF68104F92C38648A4D93D8F129AF8@MAIL.cal.local> Hi Mary Jane, Thanks for sharing some of the background about your use of BEST Plus in a multilevel class. Your question about accurately scoring lower level proficiency students with BEST Plus often comes up. I'm not sure if this discussion list would be the best place to get into the details of scoring the different components of BEST Plus, as there is so much that can be said about it. To very briefly answer your question, the level of proficiency of the examinee has no fundamental bearing on her ability to communicate her response in a comprehensible way. For example, if you asked me, "Do you like ice cream?" and I responded, "Yes," you would have no problem understanding my response. True, it's brief and simple, but it was clearly communicated. In other words, communication ability shouldn't be confused with level of language complexity. However, if your program has noticed some inconsistent data or test results from one assessor to another, there may be other factors at play. Any language assessment that requires test administrators to rate a language sample (rather than simply scoring a multiple choice test, for example) must have standardized administration and scoring procedures to ensure reliability. These procedures are accompanied by benchmark samples that correspond to scores on the rating scale (or scoring rubric). As you know, BEST Plus test administrators are required to complete a 6-hour training to learn how to score accurately using the scoring rubric. As with any language assessment that relies on language samples, BEST Plus test administrators sometimes need to "recalibrate" their understanding of the scoring rubric and benchmarks. This recalibration can be done in a few different ways:
- an individual or group review of the Test Administrator Guide, scoring rubric, and benchmark samples
- a consensus scoring session of a videotaped (or live) test administration in your program
- has anyone out there conducted similar refresher sessions with your test administrators, for BEST Plus or another assessment?
Since you brought up scoring accuracy, you may be interested in some new materials that we're getting ready to release in September. It will be a toolkit for conducting BEST Plus scoring refresher sessions (at the program level) and will include a new scoring refresher training video with additional benchmark samples and practices, pass/fail scoring activities, a test administrator workbook, and guidelines for planning and implementing scoring refreshers for programs and facilitators. This toolkit will not replace the original 6-hour BEST Plus training, but rather serves as an additional tool for test administrators who have already been trained and approved, but who wish to refresh their scoring abilities. If you (or anyone else) have specific questions about BEST Plus scoring or would like more information about the Scoring Refresher Toolkit, please feel free to contact me off the list. :) Thanks, Sarah Young Center for Applied Linguistics 4646 40th St. NW Washington, DC 20016 (202) 362-0700 ext.
529 sarah at cal.org www.cal.org -----Original Message----- From: assessment-bounces at nifl.gov on behalf of Marie Cora Sent: Tue 7/18/2006 1:47 PM To: Assessment at nifl.gov Subject: [Assessment 412] BEST Plus questions The following post if from Mary Jane Jerde. *************************************************** Greetings, This is primarily for the BEST Plus person. It seemed worthwhile to use BEST Plus in my multilevel Even Start ESL class for several reasons. (The old BEST was an excellent assessment tool for entry into another program where I previously taught. CASAS has some out of date items, and the repetition of Forms 53 and 54 is not good for assessment; this is especially obvious when students test into Forms 55 and 56 and go into shock. Finally, BEST Plus is to assess the students' comprehension and production of English.) Unfortunately, in our program we've had trouble getting consistent results from assessor to assessor. Yes, we all heard in training that the correlation was over 95%, but we've found that a student's ability to speak clearly seems to pull her/his overall score up. For example, a student may receive a top mark, 3, for oral communication, while really answering very briefly and simply, and end up with a high score. Also, some questions merely require a yes or no response. For an academically or linguistically savvy beginner, it's easy to follow the grammar pattern and respond appropriately. Do you have any ideas on how to counteract these tendencies to score students higher than their normal speech? Thanks, Mary Jane Jerde mjjerdems at yahoo.com -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 4954 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060720/fcf975cc/attachment.bin From marie.cora at hotspurpartners.com Thu Jul 20 10:14:56 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 20 Jul 2006 10:14:56 -0400 Subject: [Assessment 416] FW: [SpecialTopics 164] Re: Keeping students' eyes on the prize Message-ID: <010a01c6ac06$e17bb540$0302a8c0@LITNOW> Colleagues - from the Special Topics List: -----Original Message----- From: specialtopics-bounces at nifl.gov [mailto:specialtopics-bounces at nifl.gov] On Behalf Of John Comings Sent: Wednesday, July 19, 2006 8:30 AM To: specialtopics at nifl.gov Subject: [SpecialTopics 164] Re: Keeping students' eyes on the prize I, too, think national curriculum frameworks for subgroups of students (based on their goals and their learning needs) would be helpful to persistence. It would make it much easier to build learning plans that included guides self-study, classes, tutoring, one-day intensive workshops and other modes of learning into a coherent learning activity. --On Tuesday, July 18, 2006 10:49 AM -0400 john strucker wrote: > Hi David and colleagues, > One part of a total approach to improved persistence that we should > explore is the one they are trying in the UK. Their adult students take > a series of nationally developed curriculum-based benchmark tests that > give them feedback on their mastery of various specific competencies and > also give them a sense of how much closer they are getting to reaching > their long-term goals. 
> Best, > John Strucker > > --On Sunday, July 16, 2006 4:25 PM -0400 David Rosen > wrote: > >> John, and other colleagues, >> >> Part of the persistence challenge is that some adult learners make >> progress very slowly and have so very far to go before they see the >> prize they may have their eyes on. The prize might be a high school >> diploma, a better job, a living wage, a good job with a decent salary >> and good benefits, or going to college, but these may be basic >> literacy or beginning English language students who need years of >> study to achieve one of these goals. One answer might be to increase >> intensity, more time on task, more hours of study. But this is not >> always possible for programs, because they lack the funding to >> increase intensity of classroom instruction, or for learners, who >> usually have other commitments like working and parenting. Funders >> -- especially companies when they fund "workplace literacy" -- often >> want results in a few weeks or months, and even major federal and >> state funders want results at the end of the fiscal year, either one >> of these prizes or evidence of progress toward its attainment. >> >> Are there some ways we could sustain the student's original motive or >> goal (the GED diploma, a good job, or an admission to college prize) >> over several years, if needed. What do we know about strategies >> like awarding certificates for small achievements, holding annual >> recognition ceremonies, and providing good formative assessment so >> students can see they have reached some milestones? How about >> strategies like building community, providing food, helping students >> to learn skills that they can use in daily living? Can we articulate >> from research and/or professional wisdom what strategies work (if >> any) in sustaining long-term students' motivation and convince >> funders that we need their support for these strategies? >> >> David J. Rosen >> djrosen at comcast.net >> >> From pmcnaughton at language.ca Thu Jul 20 14:22:25 2006 From: pmcnaughton at language.ca (Pauline Mcnaughton) Date: Thu, 20 Jul 2006 14:22:25 -0400 Subject: [Assessment 417] Re: FW: [SpecialTopics 164] Re: Keeping students' eyeson the prize In-Reply-To: <010a01c6ac06$e17bb540$0302a8c0@LITNOW> Message-ID: <005501c6ac29$740885b0$9a01a8c0@language.ca> I attended a workshop on the European Language Portfolio, based on the Common European Framework. It has really excited me about the possibilities of a standardized, widely recognized portfolio model. I think this portfolio model has a lot to offer in terms of adult ESL learners managing and taking charge of their learning progress. A standardized portfolio model such as the ELP moves beyond standardized testing which can at best provide only a snapshot of general language proficiency at key points in time. I think it's essential to move beyond standardized assessment tools and focus on the documentation of specific language competencies relevant to specific learner goals and objectives. A standardized portfolio tool, perhaps initiated in an adult ESL classroom, which goes with the learner from class to class, program to program and even to the workplace - would provide a much more detailed self-assessment. In the ESL classroom it would inform the development of individualized learning plans and encourage learner autonomy and self-direction both in and out of the classroom. 
Quoting from some of the research done on the ELP, it could provide "an important interface between language learning, teaching and assessment" and achieve these "invisible learning outcomes":
- commitment to and ownership of one's language learning
- tolerance of ambiguity and uncertainty in communicative situations and learning
- willingness to take risks in order to cope with communicative tasks
- learning skills and strategies necessary for continuous, independent language learning
- reflective basic orientation to language learning, with abilities for self-assessment of language competence
Pauline McNaughton Executive Director / Directrice executive Centre for Canadian Language Benchmarks/Centre des niveaux de competence linguistique canadiens 200 Elgin Street, Suite 803 / 200 rue Elgin, piece 803 Ottawa, ON K2P 1L5 T (613) 230-7729 F (613) 230-9305 pmcnaughton at language.ca < http://www.language.ca/> This communication is intended for the use of the recipient to which it is addressed, and may contain confidential, personal, and or privileged information. Please contact us immediately if you are not the intended recipient of this communication, and do not copy, distribute, or take action relying on it. Any communication received in error, or subsequent reply, should be deleted or destroyed. Le present message n'est destine qu'a la personne ou l'organisme auquel il est adresse et peut contenir de l'information confidentielle, personnelle ou privilegiee. Si vous n'etes pas le destinataire de ce message, informez-nous immediatement. Il est interdit de copier, diffuser ou engager des poursuites fondees sur son contenu. Si vous avez recu ce communique par erreur, ou une reponse subsequente, veuillez le supprimer ou le detruire. -----Original Message----- From: Marie Cora [mailto:marie.cora at hotspurpartners.com] Sent: July 20, 2006 9:15 AM To: Assessment at nifl.gov Subject: [Assessment 416] FW: [SpecialTopics 164] Re: Keeping students' eyeson the prize Colleagues - from the Special Topics List: -----Original Message----- From: specialtopics-bounces at nifl.gov [mailto:specialtopics-bounces at nifl.gov] On Behalf Of John Comings Sent: Wednesday, July 19, 2006 8:30 AM To: specialtopics at nifl.gov Subject: [SpecialTopics 164] Re: Keeping students' eyes on the prize I, too, think national curriculum frameworks for subgroups of students (based on their goals and their learning needs) would be helpful to persistence. It would make it much easier to build learning plans that included guides self-study, classes, tutoring, one-day intensive workshops and other modes of learning into a coherent learning activity. --On Tuesday, July 18, 2006 10:49 AM -0400 john strucker wrote: > Hi David and colleagues, > One part of a total approach to improved persistence that we should > explore is the one they are trying in the UK. Their adult students take > a series of nationally developed curriculum-based benchmark tests that > give them feedback on their mastery of various specific competencies and > also give them a sense of how much closer they are getting to reaching > their long-term goals. > Best, > John Strucker > > --On Sunday, July 16, 2006 4:25 PM -0400 David Rosen > wrote: > >> John, and other colleagues, >> >> Part of the persistence challenge is that some adult learners make >> progress very slowly and have so very far to go before they see the >> prize they may have their eyes on.
The prize might be a high school >> diploma, a better job, a living wage, a good job with a decent salary >> and good benefits, or going to college, but these may be basic >> literacy or beginning English language students who need years of >> study to achieve one of these goals. One answer might be to increase >> intensity, more time on task, more hours of study. But this is not >> always possible for programs, because they lack the funding to >> increase intensity of classroom instruction, or for learners, who >> usually have other commitments like working and parenting. Funders >> -- especially companies when they fund "workplace literacy" -- often >> want results in a few weeks or months, and even major federal and >> state funders want results at the end of the fiscal year, either one >> of these prizes or evidence of progress toward its attainment. >> >> Are there some ways we could sustain the student's original motive or >> goal (the GED diploma, a good job, or an admission to college prize) >> over several years, if needed. What do we know about strategies >> like awarding certificates for small achievements, holding annual >> recognition ceremonies, and providing good formative assessment so >> students can see they have reached some milestones? How about >> strategies like building community, providing food, helping students >> to learn skills that they can use in daily living? Can we articulate >> from research and/or professional wisdom what strategies work (if >> any) in sustaining long-term students' motivation and convince >> funders that we need their support for these strategies? >> >> David J. Rosen >> djrosen at comcast.net >> >> From LCondelli at air.org Thu Jul 20 18:20:18 2006 From: LCondelli at air.org (Condelli, Larry) Date: Thu, 20 Jul 2006 18:20:18 -0400 Subject: [Assessment 418] Re: Questions on ESOL Level Descriptors in NRS Message-ID: <8123523374E3514EACEE6B70E8FE31D7025A16EE@dc1ex3.air.org> Hi to all, I thought it would be helpful to give a brief background on the NRS descriptors and why the ESL levels and descriptors were changed, one of the topics that Marie wanted us to address in the brief discussion this week. Those of you that have been in adult literacy for a while know that we have divided students into ABE, ASE and ESL for some time. There were levels within each of these areas defined by student literacy, language and functional skills long before the NRS. (I have heard from the ancients that way back there were no levels, but that was before my time. I am not that old!) When I started in adult education in 1990, there were only 3 ESL levels, two ABE levels and one ASE level. In 1995, we added what is now the beginning literacy level to both ABE and ESL and that is where we were when the NRS development started in 1997. With the NRS, the importance of showing student educational gain by level became the key measure and we focused a lot of attention on developing levels that reflected what is taught in adult education, determining skills that would define levels and identifying tests that could measure student skills within these levels. We also wanted to narrow the levels because we realized the range of levels was too broad to reflect student learning. So we created 6 ESL levels, 4 ABE levels and 2 ASE levels. We hoped these levels would provide a meaningful framework for measuring educational progress.
In early 2004, with three years of experience with this system, some dissatisfaction from states and the federal office emerged with the beginning and high advanced ESL levels:
- Many state staff and practitioners claimed that the wide range of skills encompassed within the beginning ESL level made it difficult to show educational gain and to demonstrate student progress accurately.
- Enrollment in high advanced ESL was low over the three years - about 40,000 students or about 4 percent of total ESL students. In contrast, over 340,000 students were enrolled in beginning ESL (about 30 percent of total ESL students) during that same period.
- There were limited ways of showing completion of the advanced ESL level. Until the development of BEST Plus in 2003, there were no valid and reliable NRS assessments that could measure completion from high advanced ESL, which was defined as SPL 8.
For these reasons, the Division of Adult Education and Literacy (DAEL) in the Department of Education changed the levels for ESL by splitting the beginning level into low beginning and high beginning. The high advanced ESL was eliminated and the existing low advanced ESL was renamed as advanced ESL. The exit criteria for this level were lowered to better match currently available assessments. Before making these changes, DAEL consulted with the NRS technical working group and the state directors of adult education and it received extensive support. The proposed change was first announced in spring 2004 and we re-wrote the descriptors with help from CAL and CASAS staff in 2005. The new levels went into effect this month and we hope they will help us provide a better picture of adult ESL students' educational progress. _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, July 17, 2006 7:59 AM To: Assessment at nifl.gov Subject: [Assessment 406] Questions on ESOL Level Descriptors in NRS Good morning, afternoon, and evening to you all. I hope this email finds you well. I wanted to let everyone know that during this week, Larry Condelli from AIR (American Institutes of Research), who works with the NRS, and Sarah Young, from CAL (Center for Applied Linguistics) who works with BEST Plus will be available to answer any questions you might have regarding the changes in ESOL Level Descriptors, which go into effect this summer (this month I believe). I also encourage anyone who has questions regarding other ESOL tests (CASAS or EFF for example) to join in this Q&A. Because the Level Descriptors have been adjusted, the tests used to track learning gains also have undergone some shifting and it is important that we understand what these changes are. Larry and Sarah will be present on the List during this week, but perhaps intermittently - replies may not come immediately. I encourage you to post your question to the List, or to send your question to me for posting, if you prefer that. Larry, Sarah, and others working with any of the ESOL tests - feel free to jump in and give us a thumbnail sketch of what the changes are and how they might affect our work in programs and with students.
The NRS homepage is located at: http://www.nrsweb.org/ To view information on the NRS Level Descriptors, please go to: http://www.nrsweb.org/reports/NewESLdescriptors.pdf At the bottom of the NRS homepage, see also: NRS Changes for Program Year 2006 Thanks so much - I'm looking forward to understanding this information, and hearing what folks questions are regarding the changes. Marie T. Cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060720/e1021d0a/attachment.html From marie.cora at hotspurpartners.com Thu Jul 20 22:11:29 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 20 Jul 2006 22:11:29 -0400 Subject: [Assessment 419] Guest in Workplace Literacy next week Message-ID: <01a401c6ac6a$fb1f2130$0302a8c0@LITNOW> Dear Colleagues, Please note the up-coming Guest Discussion on the Workplace Literacy List from Moderator Donna Brian. Marie Cora Assessment Discussion List Moderator ************************************************************************ ** Guest Discussion: Workplace Literacy Monday, July 24 - Friday, July 28 Guest: Alison Campbell - please see Alison's bio below To subscribe to the Workplace List, go to http://www.nifl.gov/mailman/listinfo/Workplace/ Colleagues, Next week, Monday July 24 - Friday July 28, we are privileged to have , as a guest on the Workplace Literacy Discussion List , Alison Campbell of the Conference Board of Canada. Many of you already know of her work, but for those of you who don't, she has 3 websites that I consider sister sites of the Workforce Education site. Where our Workforce site is geared more toward adult education instructors with a workforce focus, her sites are geared more toward business and industry employers who want and/or need to upgrade their workers' literacy skills. The Conference Board is also responsible for quite a lot of good research which Alison has been a part of. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Bio Alison Campbell Alison Campbell is a Senior Research Associate with the Education and Learning practice at The Conference Board of Canada. She acts as lead researcher, author and web site manager on various research projects in the area of workplace education and learning. This year, Alison is also managing the Conference Board's International Workplace Education and Learning Conference: Sharing Global Solutions (Toronto, December 5-6, 2006). In 2005, Alison authored Profiting from Literacy: Creating a Sustainable Workplace Literacy Program and co-authored Literacy, Life and Employment: An Analysis of Canadian International Adult Literacy Survey (IALS) Microdata. In 2003, Alison authored Strength from Within: Overcoming the Barriers to Workplace Literacy Development as part of a national research study on the challenges employers face in designing and implementing workplace literacy and basic skills programs. In 2002, she co-managed a national study in the U.S. on the impacts of joint labor-management education programs. She co-authored the final report: Success by Design: What Works in Workforce Development. Alison currently manages a pilot project on the benefits of a national credit review service to improve credentialing opportunities for workplace education. Her work on workplace literacy and basic skills development extends beyond Canada to the United States. 
Alison manages and makes updates to www.work-basedlearning.org , www.scorecardforskills.com and www.workplacebasicskills.com - a suite of web sites funded by the U.S. Department of Education that act as free resources to American employers and their partners who wish to improve employees' skills. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ As you can tell from her bio, Alison has expertise in many areas of workplace literacy. Attached to this message is a complete listing of her publications and presentations, which are available for free download on the Conference Board site. You must, however, register with the site to access them. (Sign in at http://www.conferenceboard.ca/boardwiseii/Signin.asp and when you have the option, browse documents by "author" choosing "Campbell, Alison.") Access to the web sites does not require registration, and they are linked in her bio above. Alison is willing to discuss with us any of the areas of workplace literacy on her web sites or in her publications. It should be a wide-ranging discussion! I hope you will be able to participate! Donna Donna Brian, moderator Workplace Literacy Discussion List Center for Literacy Studies at The University of Tennessee djgbrian at utk.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060720/639ff01f/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: ACampbell Pubs Presentations.doc Type: application/msword Size: 49664 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060720/639ff01f/attachment.doc -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: ATT00418.txt Url: http://www.nifl.gov/pipermail/assessment/attachments/20060720/639ff01f/attachment.txt From pmcnaughton at language.ca Fri Jul 21 10:10:36 2006 From: pmcnaughton at language.ca (Pauline Mcnaughton) Date: Fri, 21 Jul 2006 10:10:36 -0400 Subject: [Assessment 420] Re: Questions on ESOL Level Descriptors in NRS In-Reply-To: <8123523374E3514EACEE6B70E8FE31D7025A16EE@dc1ex3.air.org> Message-ID: <001901c6accf$70dac0f0$9a01a8c0@language.ca> Thanks very much for this summary overview. It's very helpful. There are many similarities in your overview to the Canadian situation. Canadian Language Benchmarks were developed by the federal government in 1996 with 12 benchmark levels and a separate set of ESL literacy benchmarks levels - Foundation, Phase I, II and III. The national assessment tools that exist to measure CLB, measure in benchmark levels for each skill - reading, writing, speaking and listening. Some practitioners think that 12 benchmark levels is too many, and that it may have been better to have had only 6 levels. (It takes a lot of work to document all the competencies necessary to move up one level for each of the 4 skills.) However, even with 12 levels there are concerns that progress can only be measured by achievement of an entire benchmark level - and that a great deal of progress made by learners, is not recognized because it may not result in completion of an entire level. There are many ways to measure and document the achievement of specific competencies in the classroom. We've developed collections of formal, standardized, exit assessment tasks that allow classroom teachers to accurately measure specific competencies. 
But record keeping and certificates of completion at this point only measure whole benchmark level achievements. This is why the idea of the European Language Portfolio model which I talked about yesterday in response to the email "Keeping Students eyes on teh prize" interests me - because it does get at the competency level and helps the learner (and teacher) target important competencies and measure those achievements. -----Original Message----- From: Condelli, Larry [mailto:LCondelli at air.org] Sent: July 20, 2006 5:20 PM To: The Assessment Discussion List Subject: [Assessment 418] Re: Questions on ESOL Level Descriptors in NRS Hi to all, I thought it would be helpful to give a brief background on the NRS descriptors and why the ESL levels and descriptors were changed, one of the topics that Marie wanted us to address in the brief discussion this week. Those of you that have been in adult literacy for a while know that we have divided students into ABE, ASE and ESL for some time. There were levels within each of these areas defined by student literacy, language and functional skills long before the NRS. (I have heard from the ancients that way back there were no levels, but that was before my time. I am not that old!) When I started in adult education in 1990, there were only 3 ESL levels, two ABE levels and one ASE level. In 1995, we added what is now the beginning literacy level to both ABE and ESL and that is where we were when the NRS development started in 1997. With the NRS, the importance of showing student educational gain by level became the key measure and we focused a lot of attention on developing levels that reflected what is taught in adult education, determining skills that would define levels and identifying tests that could the test levels measure student skills within these levels. We also wanted to narrow the levels because we realized the range of levels was too broad to reflect student learning. So we created 6 ESL levels, 4 ABE levels and 2 ASE levels. We hoped these levels would provide a meaningful framework for measuring educational progress. In early 2004, with three years of experience with this system, some dissatisfaction from states and the federal office emerged with the beginning and high advanced ESL levels: ? Many state staff and practitioners claimed that the wide range of skills encompassed within the beginning ESL level made it difficult to show educational gain and to demonstrate student progress accurately. ? Enrollment in high advanced ESL was low over the three years ? about 40,000 students or about 4 percent of total ESL students. In contrast, over 340,000 students were enrolled in beginning ESL (about 30 percent of total ESL students) during that same period. ? There were limited ways of showing completion of the advanced ESL level. Until the development of BEST Plus in 2003, there were no valid and reliable NRS assessments that could measure completion from high advanced ESL, which was defined as SPL 8. For these reasons, the Division of Adult Education and Literacy (DAEL) in the Department of Education, changed the levels for ESL by splitting the beginning level into low beginning and high beginning. The high advanced ESL was been eliminated and the existing low advanced ESL was renamed as advanced ESL. The exit criteria for this level was lowered to better match currently available assessments. 
Before making these changes, DAEL consulted with the NRS technical working group and the state directors of adult education and it received extensive support. The proposed change was first announced in spring 2004 and we re-wrote the descriptors with help from CAL and CASAS staff in 2005. The new levels went into effect this month and we hope they will help us provide a better picture of adult ESL student's education progress. ---------------------------------------------------------------------------- -- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, July 17, 2006 7:59 AM To: Assessment at nifl.gov Subject: [Assessment 406] Questions on ESOL Level Descriptors in NRS Good morning, afternoon, and evening to you all. I hope this email finds you well. I wanted to let everyone know that during this week, Larry Condelli from AIR (American Institutes of Research), who works with the NRS, and Sarah Young, from CAL (Center for Applied Linguistics) who works with BEST Plus will be available to answer any questions you might have regarding the changes in ESOL Level Descriptors, which go into effect this summer (this month I believe). I also encourage anyone who has questions regarding other ESOL tests (CASAS or EFF for example) to join in this Q&A. Because the Level Descriptors have been adjusted, the tests used to track learning gains also have undergone some shifting and it is important that we understand what these changes are. Larry and Sarah will be present on the List during this week, but perhaps intermittently ? replies may not come immediately. I encourage you to post your question to the List, or to send your question to me for posting, if you prefer that. Larry, Sarah, and others working with any of the ESOL tests ? feel free to jump in and give us a thumbnail sketch of what the changes are and how they might affect our work in programs and with students. The NRS homepage is located at: http://www.nrsweb.org/ To view information on the NRS Level Descriptors, please go to: http://www.nrsweb.org/reports/NewESLdescriptors.pdf At the bottom of the NRS homepage, see also: NRS Changes for Program Year 2006 Thanks so much ? I?m looking forward to understanding this information, and hearing what folks questions are regarding the changes. Marie T. Cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060721/af60835a/attachment.html From LCondelli at air.org Fri Jul 21 11:22:52 2006 From: LCondelli at air.org (Condelli, Larry) Date: Fri, 21 Jul 2006 11:22:52 -0400 Subject: [Assessment 421] Re: Questions on ESOL Level Descriptors in NRS Message-ID: <8123523374E3514EACEE6B70E8FE31D7021B9B78@dc1ex3.air.org> The CEF for Languages is very comprehensive and is designed to apply to all learners at all ages and levels for all European languages. I think it is an excellent and comprehensive way of guiding instruction and assessment of language learners , both for learners and teachers, and it provides a standard reference for determining language ability within the EU. 
While I think it is excellent for teaching and assessment, the difficulty with using it in an accountability framework, such as the NRS or Canadian system, is that the assessment requirements are quite burdensome (e.g., evaluating a learner's portfolio) and the skills and competencies represented in the higher levels of the framework are beyond those of learners in adult literacy classes (so are inapplicable to our students). Also, in the US, getting all of the states to go along with a single national framework is quite difficult, with our tradition of state and local control over instructional decisions. For those interested in this, the framework is described in Council of Europe (2001), A Common European Framework of Reference for Languages: Learning, teaching, assessment. Cambridge: Cambridge University Press. A short paper on the framework and a description of a project to adapt the framework for use with low-educated adult ESOL students in Holland will soon be available in the Proceedings from an international research conference held in Holland last year (see www.leslla.org). For now, the author's (Willemijn Stockmann) Powerpoint presentation of the project is available on the web site under "Workshops." Larry Condelli _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Pauline Mcnaughton Sent: Friday, July 21, 2006 10:11 AM To: 'The Assessment Discussion List' Subject: [Assessment 420] Re: Questions on ESOL Level Descriptors in NRS Thanks very much for this summary overview. It's very helpful. There are many similarities in your overview to the Canadian situation. Canadian Language Benchmarks were developed by the federal government in 1996 with 12 benchmark levels and a separate set of ESL literacy benchmarks levels - Foundation, Phase I, II and III. The national assessment tools that exist to measure CLB, measure in benchmark levels for each skill - reading, writing, speaking and listening. Some practitioners think that 12 benchmark levels is too many, and that it may have been better to have had only 6 levels. (It takes a lot of work to document all the competencies necessary to move up one level for each of the 4 skills.) However, even with 12 levels there are concerns that progress can only be measured by achievement of an entire benchmark level - and that a great deal of progress made by learners, is not recognized because it may not result in completion of an entire level. There are many ways to measure and document the achievement of specific competencies in the classroom. We've developed collections of formal, standardized, exit assessment tasks that allow classroom teachers to accurately measure specific competencies. But record keeping and certificates of completion at this point only measure whole benchmark level achievements. This is why the idea of the European Language Portfolio model which I talked about yesterday in response to the email "Keeping Students eyes on teh prize" interests me - because it does get at the competency level and helps the learner (and teacher) target important competencies and measure those achievements.
-----Original Message----- From: Condelli, Larry [mailto:LCondelli at air.org] Sent: July 20, 2006 5:20 PM To: The Assessment Discussion List Subject: [Assessment 418] Re: Questions on ESOL Level Descriptors in NRS Hi to all, I thought it would be helpful to give a brief background on the NRS descriptors and why the ESL levels and descriptors were changed, one of the topics that Marie wanted us to address in the brief discussion this week. Those of you that have been in adult literacy for a while know that we have divided students into ABE, ASE and ESL for some time. There were levels within each of these areas defined by student literacy, language and functional skills long before the NRS. (I have heard from the ancients that way back there were no levels, but that was before my time. I am not that old!) When I started in adult education in 1990, there were only 3 ESL levels, two ABE levels and one ASE level. In 1995, we added what is now the beginning literacy level to both ABE and ESL and that is where we were when the NRS development started in 1997. With the NRS, the importance of showing student educational gain by level became the key measure and we focused a lot of attention on developing levels that reflected what is taught in adult education, determining skills that would define levels and identifying tests that could the test levels measure student skills within these levels. We also wanted to narrow the levels because we realized the range of levels was too broad to reflect student learning. So we created 6 ESL levels, 4 ABE levels and 2 ASE levels. We hoped these levels would provide a meaningful framework for measuring educational progress. In early 2004, with three years of experience with this system, some dissatisfaction from states and the federal office emerged with the beginning and high advanced ESL levels: ? Many state staff and practitioners claimed that the wide range of skills encompassed within the beginning ESL level made it difficult to show educational gain and to demonstrate student progress accurately. ? Enrollment in high advanced ESL was low over the three years - about 40,000 students or about 4 percent of total ESL students. In contrast, over 340,000 students were enrolled in beginning ESL (about 30 percent of total ESL students) during that same period. ? There were limited ways of showing completion of the advanced ESL level. Until the development of BEST Plus in 2003, there were no valid and reliable NRS assessments that could measure completion from high advanced ESL, which was defined as SPL 8. For these reasons, the Division of Adult Education and Literacy (DAEL) in the Department of Education, changed the levels for ESL by splitting the beginning level into low beginning and high beginning. The high advanced ESL was been eliminated and the existing low advanced ESL was renamed as advanced ESL. The exit criteria for this level was lowered to better match currently available assessments. Before making these changes, DAEL consulted with the NRS technical working group and the state directors of adult education and it received extensive support. The proposed change was first announced in spring 2004 and we re-wrote the descriptors with help from CAL and CASAS staff in 2005. The new levels went into effect this month and we hope they will help us provide a better picture of adult ESL student's education progress. 
_____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, July 17, 2006 7:59 AM To: Assessment at nifl.gov Subject: [Assessment 406] Questions on ESOL Level Descriptors in NRS Good morning, afternoon, and evening to you all. I hope this email finds you well. I wanted to let everyone know that during this week, Larry Condelli from AIR (American Institutes of Research), who works with the NRS, and Sarah Young, from CAL (Center for Applied Linguistics) who works with BEST Plus will be available to answer any questions you might have regarding the changes in ESOL Level Descriptors, which go into effect this summer (this month I believe). I also encourage anyone who has questions regarding other ESOL tests (CASAS or EFF for example) to join in this Q&A. Because the Level Descriptors have been adjusted, the tests used to track learning gains also have undergone some shifting and it is important that we understand what these changes are. Larry and Sarah will be present on the List during this week, but perhaps intermittently - replies may not come immediately. I encourage you to post your question to the List, or to send your question to me for posting, if you prefer that. Larry, Sarah, and others working with any of the ESOL tests - feel free to jump in and give us a thumbnail sketch of what the changes are and how they might affect our work in programs and with students. The NRS homepage is located at: http://www.nrsweb.org/ To view information on the NRS Level Descriptors, please go to: http://www.nrsweb.org/reports/NewESLdescriptors.pdf At the bottom of the NRS homepage, see also: NRS Changes for Program Year 2006 Thanks so much - I'm looking forward to understanding this information, and hearing what folks questions are regarding the changes. Marie T. Cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060721/26dd6c9b/attachment.html From LCondelli at air.org Fri Jul 21 12:24:24 2006 From: LCondelli at air.org (Condelli, Larry) Date: Fri, 21 Jul 2006 12:24:24 -0400 Subject: [Assessment 422] Re: Questions on ESOL Level Descriptors in NRS Message-ID: <8123523374E3514EACEE6B70E8FE31D7025A16FE@dc1ex3.air.org> Correction, the Stockmann paper applies the CEF for language to low-educated adults learning Dutch, not English. _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Condelli, Larry Sent: Friday, July 21, 2006 11:23 AM To: The Assessment Discussion List Subject: [Assessment 421] Re: Questions on ESOL Level Descriptors in NRS The CEF for Languages is very comprehensive and is designed to apply to all learners at all ages and levels for all European languages. I think it is an excellent and comprehensive way of guiding instruction and assessment of language learners , both for learners and teachers, and it provides a standard reference for determining language ability within the EU. While I thin it is excellent for teaching and assessment, the difficulty with using it in accountability framework, such as the NRS or Canadian system, is that the assessment requirements are quite burdensome (e.g., evaluating a learner's portfolio) and the skills and competencies represented in the higher levels of the framework are beyond those of learners in adult literacy classes (so are inapplicable to our students).
Also, in the US, getting all of the states to along with a single national, framework is quite difficult, with our tradition of state and local control over instructional decisions. For those interested in this, the framework is described in Council of Europe (2001), A Common European Framework of Reference for Languages: Learning, teaching, assessment. Cambridge: Cambridge University Press. A short presentation on paper on the framework and a description of a project to adapt the framework for use with low-educated adult ESOL students in Holland will soon be available in the Proceedings from an international research conference held in Holland last year (see www.leslla.org). For now, the author's (Willemijn Stockmann ) Powerpoint presentation of the project is available on the web site under "Workshops." Larry Condelli _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Pauline Mcnaughton Sent: Friday, July 21, 2006 10:11 AM To: 'The Assessment Discussion List' Subject: [Assessment 420] Re: Questions on ESOL Level Descriptors in NRS Thanks very much for this summary overview. It's very helpful. There are many similarities in your overview to the Canadian situation. Canadian Language Benchmarks were developed by the federal government in 1996 with 12 benchmark levels and a separate set of ESL literacy benchmarks levels - Foundation, Phase I, II and III. The national assessment tools that exist to measure CLB, measure in benchmark levels for each skill - reading, writing, speaking and listening. Some practitioners think that 12 benchmark levels is too many, and that it may have been better to have had only 6 levels. (It takes a lot of work to document all the competencies necessary to move up one level for each of the 4 skills.) However, even with 12 levels there are concerns that progress can only be measured by achievement of an entire benchmark level - and that a great deal of progress made by learners, is not recognized because it may not result in completion of an entire level. There are many ways to measure and document the achievement of specific competencies in the classroom. We've developed collections of formal, standardized, exit assessment tasks that allow classroom teachers to accurately measure specific competencies. But record keeping and certificates of completion at this point only measure whole benchmark level achievements. This is why the idea of the European Language Portfolio model which I talked about yesterday in response to the email "Keeping Students eyes on teh prize" interests me - because it does get at the competency level and helps the learner (and teacher) target important competencies and measure those achievements. -----Original Message----- From: Condelli, Larry [mailto:LCondelli at air.org] Sent: July 20, 2006 5:20 PM To: The Assessment Discussion List Subject: [Assessment 418] Re: Questions on ESOL Level Descriptors in NRS Hi to all, I thought it would be helpful to give a brief background on the NRS descriptors and why the ESL levels and descriptors were changed, one of the topics that Marie wanted us to address in the brief discussion this week. Those of you that have been in adult literacy for a while know that we have divided students into ABE, ASE and ESL for some time. There were levels within each of these areas defined by student literacy, language and functional skills long before the NRS. (I have heard from the ancients that way back there were no levels, but that was before my time. 
I am not that old!) When I started in adult education in 1990, there were only 3 ESL levels, two ABE levels and one ASE level. In 1995, we added what is now the beginning literacy level to both ABE and ESL and that is where we were when the NRS development started in 1997. With the NRS, the importance of showing student educational gain by level became the key measure and we focused a lot of attention on developing levels that reflected what is taught in adult education, determining skills that would define levels and identifying tests that could the test levels measure student skills within these levels. We also wanted to narrow the levels because we realized the range of levels was too broad to reflect student learning. So we created 6 ESL levels, 4 ABE levels and 2 ASE levels. We hoped these levels would provide a meaningful framework for measuring educational progress. In early 2004, with three years of experience with this system, some dissatisfaction from states and the federal office emerged with the beginning and high advanced ESL levels: ? Many state staff and practitioners claimed that the wide range of skills encompassed within the beginning ESL level made it difficult to show educational gain and to demonstrate student progress accurately. ? Enrollment in high advanced ESL was low over the three years - about 40,000 students or about 4 percent of total ESL students. In contrast, over 340,000 students were enrolled in beginning ESL (about 30 percent of total ESL students) during that same period. ? There were limited ways of showing completion of the advanced ESL level. Until the development of BEST Plus in 2003, there were no valid and reliable NRS assessments that could measure completion from high advanced ESL, which was defined as SPL 8. For these reasons, the Division of Adult Education and Literacy (DAEL) in the Department of Education, changed the levels for ESL by splitting the beginning level into low beginning and high beginning. The high advanced ESL was been eliminated and the existing low advanced ESL was renamed as advanced ESL. The exit criteria for this level was lowered to better match currently available assessments. Before making these changes, DAEL consulted with the NRS technical working group and the state directors of adult education and it received extensive support. The proposed change was first announced in spring 2004 and we re-wrote the descriptors with help from CAL and CASAS staff in 2005. The new levels went into effect this month and we hope they will help us provide a better picture of adult ESL student's education progress. _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, July 17, 2006 7:59 AM To: Assessment at nifl.gov Subject: [Assessment 406] Questions on ESOL Level Descriptors in NRS Good morning, afternoon, and evening to you all. I hope this email finds you well. I wanted to let everyone know that during this week, Larry Condelli from AIR (American Institutes of Research), who works with the NRS, and Sarah Young, from CAL (Center for Applied Linguistics) who works with BEST Plus will be available to answer any questions you might have regarding the changes in ESOL Level Descriptors, which go into effect this summer (this month I believe). I also encourage anyone who has questions regarding other ESOL tests (CASAS or EFF for example) to join in this Q&A. 
Because the Level Descriptors have been adjusted, the tests used to track learning gains also have undergone some shifting and it is important that we understand what these changes are. Larry and Sarah will be present on the List during this week, but perhaps intermittently - replies may not come immediately. I encourage you to post your question to the List, or to send your question to me for posting, if you prefer that. Larry, Sarah, and others working with any of the ESOL tests - feel free to jump in and give us a thumbnail sketch of what the changes are and how they might affect our work in programs and with students. The NRS homepage is located at: http://www.nrsweb.org/ To view information on the NRS Level Descriptors, please go to: http://www.nrsweb.org/reports/NewESLdescriptors.pdf At the bottom of the NRS homepage, see also: NRS Changes for Program Year 2006 Thanks so much - I'm looking forward to understanding this information, and hearing what folks questions are regarding the changes. Marie T. Cora Assessment Discussion List Moderator marie.cora at hotspurpartners.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060721/a3dc7967/attachment.html From kabeall at comcast.net Mon Jul 24 11:20:18 2006 From: kabeall at comcast.net (Kaye Beall) Date: Mon, 24 Jul 2006 11:20:18 -0400 Subject: [Assessment 423] New from NCSALL--Training Guides Message-ID: <003601c6af34$abe32c70$0202a8c0@your4105e587b6> Practitioner Research Training Guide: Research-based Adult Reading Instruction This practitioner research training guide provides comprehensive instructions for facilitating a 31-hour training that guides practitioners through an investigation of a problem related to reading. The practitioners conduct the research in their own classrooms. This guide provides all the necessary materials and clear instructions to plan and facilitate a four-session practitioner research training. The sessions vary in length. To download the training guide, go to http://www.ncsall.net/index.php?id=1143 Training Guide: Study Circle Facilitators This training guide provides comprehensive instructions for preparing experienced adult education practitioners to facilitate NCSALL study circles. The training focuses on the NCSALL study circle, Research-based Adult Reading Instruction. However, the training can be adapted to prepare facilitators for NCSALL study circles in general or on another topic. The guide provides all the necessary materials and clear instructions to plan and facilitate a one-day, study circle facilitators training. The training is six hours in length. To download the training guide, go to http://www.ncsall.net/index.php?id=1137 **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060724/283c30b4/attachment.html From marie.cora at hotspurpartners.com Tue Jul 25 15:37:16 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 25 Jul 2006 15:37:16 -0400 Subject: [Assessment 424] Workplace assessment discussion Message-ID: <053701c6b021$bcfffc20$0302a8c0@LITNOW> Hi everyone, Just a quick note to say there is an interesting thread of discussion focused on assessment in the workplace on the NIFL Workplace Literacy Discussion List presently. They are hosting a Guest Discussion this week that is touching on a broad (and interesting!) array of topics related to workplace literacy, but I wanted to alert you that one thread is on assessment. You can subscribe to the Workplace Literacy List by going to: http://www.nifl.gov/mailman/listinfo/Workplace/ And/or you can view the archives at that site by clicking on Read Current Posted Messages (and look for messages that have 'assessment' in the Subject Line). Marie Cora Assessment Discussion List Moderator -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060725/fbc788d9/attachment.html From marie.cora at hotspurpartners.com Fri Jul 28 12:02:28 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 28 Jul 2006 12:02:28 -0400 Subject: [Assessment 425] Guest Speaker on Learning Disabilities List Message-ID: <004d01c6b25f$3a399f90$0302a8c0@LITNOW> Hello everyone, I hope this email finds you well. Please note the following Guest Discussion on the Learning Disabilities List: ' To subscribe: http://www.nifl.gov/mailman/listinfo/LearningDisabilities To read archived messages: http://www.nifl.gov/linc/discussions/list_archives.html Hello all, I am very pleased to announce that Nancie Payne has accepted my invitation as a Guest Speaker during the week of August 7-11, 2006. Beginning in July, we had a thread being discussed on the List that eventually became the "LD Discrepancy Model" topic. That topic is the one that I have asked Nancie to address during her week with us. I want to ask subscribers to begin thinking about questions they want to propose to Nancie. Since Nancie is already a subscriber on this List, she will see each of your suggestions as they are posted. Please feel free to let your colleagues know of this wonderful opportunity to hear from and interact with one of the leaders in the field of adults with Learning Disabilities. Information about subscribing to the List is at the bottom of this message. Lastly, I am including Nancie Payne's resume for your information. Nancie Payne, President of Payne & Associates, Inc. and the Northwest Center for the Advancement of Learning, is nationally recognized for thirty years of work in education and workplace-based services for children and adults with learning and cognitive disabilities. She consults with adult education, literacy, basic skills and GED instruction programs as well as correction facilities, employment and training agencies, human service organizations, and business on ways to create productive learning environments and maximize the potential of those with special learning needs. 
She has provided consultation in twenty-nine states and has developed and implemented the Payne Learning Needs Inventory and screening tools, facilitating long-term, system-wide change of service delivery models in the District of Columbia, Indiana, California, Oregon, Arkansas, West Virginia, Kentucky, Tennessee, Vermont, Oklahoma, Nebraska, Illinois, Mississippi, Rhode Island, North Carolina and Washington. She is a consultant for GED Testing Services. Ms. Payne has written numerous articles and book chapters on facilitating learning, assessment of special needs, transition to employment, and workplace accommodations. In 2000 a Brookes publication entitled Meeting the Challenge of Learning Disabilities in Adulthood by Arlyn Roffman, Ph.D. features Ms. Payne's personal insight about the impact of learning disabilities. Ms. Payne has a B.A. from the Evergreen State College in Liberal Arts, emphasis in Education-Administration and a M.S. from Chapman University School of Business and Economics in Human Resource Management and Organizational Development. Her civic work includes serving on the President's Committee for Employment of People with Disabilities Taskforce, Washington D.C.; participating in a National Institute for Literacy National Congress; past member of the National Learning Disabilities Research & Training Center Advisory Board; and serving her third term as a member of the National Learning Disabilities Association Professional Advisory Board. She is a member of the National Rehabilitation Association, the National Learning Disabilities Association, and the Commission on Adult Basic Education. She is currently an 18-year board member and past president of the Thurston County Economic Development Board of Directors and she serves on the Pacific Mountain Workforce Development Council Board of Directors as Chairperson. I look forward to reading the questions you post for Nancie Payne. Thanks very much, Rochelle Rochelle Kenyon, Moderator National Institute for Literacy Learning Disabilities Discussion List RKenyon721 at aol.com To subscribe: http://www.nifl.gov/mailman/listinfo/LearningDisabilities To read archived messages: http://www.nifl.gov/linc/discussions/list_archives.html -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060728/7e37ee8f/attachment.html From marie.cora at hotspurpartners.com Fri Jul 28 12:07:52 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 28 Jul 2006 12:07:52 -0400 Subject: [Assessment 426] Guest Discussion on English Language List Message-ID: <005201c6b25f$fb3464f0$0302a8c0@LITNOW> Hello again! Lots of interesting discussions to choose from - or not: why not join them both!!! Marie Cora Assessment Discussion List Moderator To subscribe to go: http://www.nifl.gov/mailman/listinfo/Englishlanguage Dear colleagues, I am happy to announce an upcoming panel discussion on the adult English language list on working with literacy-level adult English language learners. The discussion will be the week of August 7-11, with further questions, comments, and information-sharing welcome after that. Background Information Some teachers-especially those new to teaching adult English language learners-express concern about teaching learners who aren't literate in their native language or never went to school. In many ways, this concern is unwarranted. 
Having or not having had access to formal education does not correlate with cognitive functioning, interest, and energy. Most literacy-level learners will need explicit instruction in basic literacy skills (e.g., phonological processing, vocabulary development, syntactical processing). However, these learners bring an array of lifeskills knowledge (often including some oral proficiency and knowledge of American culture), problem-solving skills, and enthusiasm to the process. Still, teachers and administrators sometimes feel challenged by questions such as: * Who are the literacy-level adult ESL learners? * What skills do literacy-level learners need to develop? * How can programs and administrators effectively support literacy-level adult English language learners and their teachers? * What are effective instructional practices in the literacy class? * What are effective needs assessment activities for literacy-level adult English language learners? * What other approaches and activities are effective with literacy-level learners? * What resources are helpful for teachers? * What instructional materials are effective for literacy-level learners - to help them acquire the skills they need to reach their personal goals? Process of the Discussion To address these and other questions, nine adult ESL and refugee content experts have graciously accepted my invitation to answer questions and share ideas on the topic of literacy-level learners in adult ESL. Within this group are teachers, program administrators, cultural orientation specialists, curriculum designers, assessment experts, and authors of teacher resources and literacy-level materials for learners. Members of the panel have worked extensively as volunteers, teachers, and administrators, in learning labs and online, in general ESL, workplace and work readiness programs, transition programs, family literacy, and refugee programs, in the United States and overseas, from Mongolia to (the then) Zaire. I started adding up the panelists' years of experience, but stopped when it topped 100 years. To organize this discussion with so many panelists, I will offer a short biography of each panelist, which includes their areas of particular expertise - although each panelist is knowledgeable in many areas related to adult ESL, refugees, and immigration. In this way, you can direct a question or comment to a specific panelist (e.g., a question about literacy-level learners in family literacy would be directed to the family literacy expert). However, all panelists, as well as the very many of you on the list who are also experts, please jump in at any time. I will post the nine biographies next week, a few days before the panel begins. 
The panelists will be: Sanja Bebic, Director, Cultural Orientation Resource Center, Center for Applied Linguistics, Washington, DC http://www.culturalorientation.net/ MaryAnn Cunningham Florez, Lead ESL Specialist, Arlington Education and Employment Program (REEP), Arlington, Virginia http://www.arlington.k12.va.us/instruct/ctae/adult_ed/REEP/ Debbie Jones, EL/Civics Literacy Coordinator, Arlington Education and Employment Program, Arlington, Virginia http://www.arlington.k12.va.us/instruct/ctae/adult_ed/REEP/ Sharon McKay, ESL Specialist, Center for Adult English Language Acquisition, Washington, DC http://ww.cal.org/caela Donna Moss, Family Literacy Coordinator, Arlington Education and Employment Program (REEP), Arlington, Virginia http://www.arlington.k12.va.us/instruct/ctae/adult_ed/REEP/ Barb Sample, Director of Educational Services, Spring Institute for Intercultural Learning, Denver, Colorado http://www.spring-institute.org/ Kate Singleton, Healthcare Social Worker, Fairfax INOVA Hospital, Fairfax, Virginia Sharyl Tanck, Program Coordinator, Cultural Orientation Resource Center, Center for Applied Linguistics, Washington, DC http://www.culturalorientation.net/ Betsy Lindeman Wong, Online facilitator, ESOL Basics, Virginia Adult Learning Resource Center, Richmond, Virginia http://www.valrc.org/ Pre-Discussion Reading If you are interested in reading more about literacy-level adult English language learners before August 7, here a few selected resources, with more to come later during the discussion: "Beginning ESOL Learners' Advice to Their Teachers." Mental Health and the Adult Refugee: The Role of the ESL Teacher What Non-readers or Beginning Readers Need to Know: Performance-based ESL Adult Literacy (Brod, 1999, ERIC No. ED 433 730 available from www.eric.ed.gov ) Working With Literacy-Level Adult English Language Learners Lynda Terrill Adult English language discussion list moderator Center for Adult English Language Acquisition Center for Applied Linguistics 4646 40th St, NW Washington, DC 20016 202-362-0700, ext 543 lterrill at cal.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060728/34cf7cf9/attachment.html From marie.cora at hotspurpartners.com Mon Jul 31 14:12:46 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 31 Jul 2006 14:12:46 -0400 Subject: [Assessment 427] PROLITERACY WORLDWIDE JOB OPPORTUNITY Message-ID: <01d301c6b4cc$ed9b8c00$0302a8c0@LITNOW> Dear all, The following announcement may be of interest to you. Marie Cora Assessment Discussion List Moderator *************************** Position: Project Manager Full-time position responsible for the implementation and ongoing management of the performance accountability project as well as all reporting required. This project runs through April 2009. Bachelor's in a relevant field (e.g., ed., mgt, etc.). Graduate degree preferred. Demonstrated success in literacy or ABE program management and implementing accountability systems. Experience in the development and delivery of professional development for adult education and literacy practitioners. Contracting experience with trainers and materials developers a plus. A minimum of three years experience in project management. Excellent written and oral communication, and public presentation skills. Must have team-oriented working style. Strong word processing skills required. Structured for telecommuting if outside central New York. 10-15% travel required. 
Send resume to ProLiteracy Worldwide/ HR, 1320 Jamesville Avenue, Syracuse, NY 13210, or e-mail: frontdesk at proliteracy.org. See website for information about ProLiteracy Worldwide and a detailed job description, www.proliteracy.org. Deadline for applications is August 7th. EEO From r.millar at uwinnipeg.ca Mon Jul 31 15:22:50 2006 From: r.millar at uwinnipeg.ca (Robin Millar) Date: Mon, 31 Jul 2006 14:22:50 -0500 Subject: [Assessment 428] definitions of diagnosis Message-ID: Hello, I am writing an article on diagnostic assessment and I am looking for a definition. I know what it is and that's what I end up describing. But I'm wondering if people have a definition they use. Help is welcome! Robin Millar Dr. Robin Millar Executive Director Centre for Education and Work 515 Portage Avenue Winnipeg, MB R3B 2E9 204-786-9395 From djrosen at comcast.net Mon Jul 31 17:49:46 2006 From: djrosen at comcast.net (David Rosen) Date: Mon, 31 Jul 2006 17:49:46 -0400 Subject: [Assessment 429] Re: definitions of diagnosis In-Reply-To: References: Message-ID: <55B907CD-16D6-4EB2-A661-14BE5B8778D9@comcast.net> Hi Robin, Diagnostic assessment is a kind of formative assessment of a student?s strengths, weaknesses, knowledge, and skills used by a teacher to adapt instruction to meet the student's unique needs. David David J. Rosen djrosen at comcast.net On Jul 31, 2006, at 3:22 PM, Robin Millar wrote: > Hello, I am writing an article on diagnostic assessment and I am > looking > for a definition. I know what it is and that's what I end up > describing. But I'm wondering if people have a definition they use. > Help is welcome! Robin Millar > > Dr. Robin Millar > Executive Director > Centre for Education and Work > 515 Portage Avenue > Winnipeg, MB R3B 2E9 > 204-786-9395 > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From ropteacher at gmail.com Mon Jul 31 18:13:53 2006 From: ropteacher at gmail.com (Gloria Fuentes) Date: Mon, 31 Jul 2006 15:13:53 -0700 Subject: [Assessment 430] Re: Skills Bank Message-ID: <38914de00607311513pc84ebdegb828e5f6dba92f61@mail.gmail.com> Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060731/2a7ce537/attachment.html From tarv at chemeketa.edu Mon Jul 31 20:00:46 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Mon, 31 Jul 2006 17:00:46 -0700 Subject: [Assessment 431] Re: Skills Bank In-Reply-To: <38914de00607311513pc84ebdegb828e5f6dba92f61@mail.gmail.com> Message-ID: We use skills bank every day here in our basic skills lab. We use it for credit as well as non credit students. We have the students run their own tracking sheet rather than the tracking portion of the program (due to our limitations). In Salem, they have a lab assistant who keeps track of the students. This software works great for basic reading, writing and math. It also has some science, etc. depending upon what you purchase. Students engage with it well and will stick to it long enough to gain skills using the software. Haven't worked enough with PLATO to comment. 
Va ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gloria Fuentes Sent: Monday, July 31, 2006 3:14 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060731/d9907d6c/attachment.html From tarv at chemeketa.edu Mon Jul 31 20:03:26 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Mon, 31 Jul 2006 17:03:26 -0700 Subject: [Assessment 432] Re: definitions of diagnosis In-Reply-To: <55B907CD-16D6-4EB2-A661-14BE5B8778D9@comcast.net> Message-ID: I guess I'd add student goals to what David said, other wise, that's it -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of David Rosen Sent: Monday, July 31, 2006 2:50 PM To: The Assessment Discussion List Subject: [Assessment 429] Re: definitions of diagnosis Hi Robin, Diagnostic assessment is a kind of formative assessment of a student's strengths, weaknesses, knowledge, and skills used by a teacher to adapt instruction to meet the student's unique needs. David David J. Rosen djrosen at comcast.net On Jul 31, 2006, at 3:22 PM, Robin Millar wrote: > Hello, I am writing an article on diagnostic assessment and I am > looking > for a definition. I know what it is and that's what I end up > describing. But I'm wondering if people have a definition they use. > Help is welcome! Robin Millar > > Dr. Robin Millar > Executive Director > Centre for Education and Work > 515 Portage Avenue > Winnipeg, MB R3B 2E9 > 204-786-9395 > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From djrosen at comcast.net Mon Jul 31 22:12:11 2006 From: djrosen at comcast.net (David Rosen) Date: Mon, 31 Jul 2006 22:12:11 -0400 Subject: [Assessment 433] Re: definitions of diagnosis In-Reply-To: References: Message-ID: <1F7E4864-064C-4398-8273-437EC7BA46DB@comcast.net> Thanks Virginia. That's an important addition. I have added it to the definition in the glossary section of the Adult Literacy Education Wiki http://wiki.literacytent.org/index.php/Diagnostic_assessment David David J. Rosen djrosen at comcast.net On Jul 31, 2006, at 8:03 PM, Virginia Tardaewether wrote: > I guess I'd add student goals to what David said, other wise, > that's it > > -----Original Message----- > From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] > On Behalf Of David Rosen > Sent: Monday, July 31, 2006 2:50 PM > To: The Assessment Discussion List > Subject: [Assessment 429] Re: definitions of diagnosis > > Hi Robin, > > Diagnostic assessment is a kind of formative assessment of a > student's strengths, weaknesses, knowledge, and skills used by a > teacher to adapt instruction to meet the student's unique needs. > > David > > David J. 
Rosen > djrosen at comcast.net > > On Jul 31, 2006, at 3:22 PM, Robin Millar wrote: > >> Hello, I am writing an article on diagnostic assessment and I am >> looking >> for a definition. I know what it is and that's what I end up >> describing. But I'm wondering if people have a definition they use. >> Help is welcome! Robin Millar >> >> Dr. Robin Millar >> Executive Director >> Centre for Education and Work >> 515 Portage Avenue >> Winnipeg, MB R3B 2E9 >> 204-786-9395 >> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment > > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment From hdooley at riral.org Tue Aug 1 11:17:44 2006 From: hdooley at riral.org (Howard L. Dooley, Jr.) Date: Tue, 01 Aug 2006 11:17:44 -0400 Subject: [Assessment 434] Re: Skills Bank In-Reply-To: References: Message-ID: <44CF7098.7080508@riral.org> We use My Skills Tutor, a web based program, which works very well with our basic skills and GED or EDP preparation students. RI purchased a license as part of Project IDEAL participation, a distance learning initiative. At our main learning center, all High ESL, ABE and ASE students experience the program, and have varying levels of participation on it. Several of our teachers have been using the program with some learners, in preparation for an expanded use this fall with our general ed, community based programs (which are mostly part-time evening programs). We intend to offer learners a blended model (distance learning/ classroom instruction). Learners who can't attend every class or need to stop out will have support in continuing with My Skills Tutor. For learners in class, homework assignments will include using the program, or to add reinforcement and more time on task for individual needs. I agree with Virginia: we find most learners can access and stay with the program long enough to achieve some skill development. Very few students can use it as a stand-alone however, mostly because they lack independent study skills and self-monitoring. Staff is discussing if and how we can explicitly instruct in these areas. Howard D. Virginia Tardaewether wrote: > > We use skills bank every day here in our basic skills lab. We use it > for credit as well as non credit students. We have the students run > their own tracking sheet rather than the tracking portion of the > program (due to our limitations). In Salem, they have a lab assistant > who keeps track of the students. This software works great for basic > reading, writing and math. It also has some science, etc. depending > upon what you purchase. Students engage with it well and will stick > to it long enough to gain skills using the software. Haven't worked > enough with PLATO to comment. 
> > Va > > > > ------------------------------------------------------------------------ > > *From:* assessment-bounces at nifl.gov > [mailto:assessment-bounces at nifl.gov] *On Behalf Of *Gloria Fuentes > *Sent:* Monday, July 31, 2006 3:14 PM > *To:* Assessment at nifl.gov > *Subject:* [Assessment 430] Re: Skills Bank > > > > Is anyone familiar with the SkillsBank software or Plato, if so what > do you think about it for GED preparation? > > -- > Gloria Fuentes > > ------------------------------------------------------------------------ > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060801/c6a848dd/attachment.html From Shirley.Penn at MorganCC.edu Tue Aug 1 11:04:44 2006 From: Shirley.Penn at MorganCC.edu (Penn, Shirley) Date: Tue, 1 Aug 2006 09:04:44 -0600 Subject: [Assessment 435] Re: Skills Bank References: <38914de00607311513pc84ebdegb828e5f6dba92f61@mail.gmail.com> Message-ID: We have used Plato for about 12 years now. I have used Skillsbank in the past. The question is how are you planning on using it? Some programs use it as the only instruction. I do not agree with that approach. However, I think Plato could be used quite effectively. I especially like the fact that I can build instructional strands to meet the specific needs of the student. ________________________________ From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes Sent: Mon 7/31/2006 4:13 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 3561 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20060801/09324038/attachment.bin From JWilletts at bccc.edu Tue Aug 1 12:16:33 2006 From: JWilletts at bccc.edu (Willetts, John) Date: Tue, 1 Aug 2006 12:16:33 -0400 Subject: [Assessment 436] Re: Skills Bank Message-ID: <6A4977C473B56C48AB5F0D1FF4BD8D281D86C7@mainpo.bccc.edu> We have both skills bank and plato for tutorial purposes for GED and Pre GED classes, and both are good, but by far skills bank is more user friendly! Call if you need more info. JBW John B. Willetts ACE Instructional Specialist Baltimore City Community College 700 E Lombard St., Baltimore, MD 21201 410-986-5458 jwilletts at bccc.edu -----Original Message----- From: Gloria Fuentes [mailto:ropteacher at gmail.com] Sent: Monday, July 31, 2006 6:14 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060801/b82470c6/attachment.html From pmcnaughton at language.ca Tue Aug 1 14:09:21 2006 From: pmcnaughton at language.ca (Pauline Mcnaughton) Date: Tue, 1 Aug 2006 14:09:21 -0400 Subject: [Assessment 437] Re: Assessment Digest, Vol 11, Issue 1 In-Reply-To: Message-ID: <00fc01c6b595$9da4dde0$9a01a8c0@language.ca> I like David's definition as well, and Virginia's amendment. I think it encapsulates the meaning well. I thought I would add the following anyway as additional "insights" from a book by Tara Holmes titled, "Integrating CLB Assessment into your ESL Classroom" ( Note: CLB = Canadian Language Benchmarks, and is the "standard" referred to below) Copyright 2005: Centre for Canadian Language Benchmarks. Tara mentions diagnostic assessment in the broader context of "assessment for learning" (formative) as opposed to "assessment of learning" (summative). She talks about "assessment for learning" as follows: Purpose: Assessment supports ongoing learner growth. Assessment information is used to help learners and teachers decide what to do next. Role of Teacher: Teacher gives clear descriptive feedback to learners to help them improve. Requires that the teacher understand the standards, analyze the gap between present and desired performance, and give feedback to learners. Teacher modifies and individualizes teaching/learning activities. Role of Learner: Requires that learners understand the standards (what is expected), are involved in self-assessment, and act on assessment information to improve. Timing: Is an integrated component of the teaching and learning process. Criteria for Effectiveness: Is considered effective when learners use the assessment information to support their learning and teachers use the information to adjust the teaching/learning activities in the classroom. Although this is not specifically focused on "diagnostic assessment" I thought it was interesting. Diagnostic assessment is a kind of formative assessment of a > student's strengths, weaknesses, knowledge, and skills used by a > teacher to adapt instruction to meet the student's unique needs or goals. -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of assessment-request at nifl.gov Sent: August 1, 2006 11:00 AM To: assessment at nifl.gov Subject: Assessment Digest, Vol 11, Issue 1 Send Assessment mailing list submissions to assessment at nifl.gov To subscribe or unsubscribe via the World Wide Web, visit http://www.nifl.gov/mailman/listinfo/assessment or, via email, send a message with subject or body 'help' to assessment-request at nifl.gov You can reach the person managing the list at assessment-owner at nifl.gov When replying, please edit your Subject line so it is more specific than "Re: Contents of Assessment digest..." From Tina_Luffman at yc.edu Tue Aug 1 16:28:16 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Tue, 1 Aug 2006 13:28:16 -0700 Subject: [Assessment 438] Re: Skills Bank Message-ID: An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060801/cd689522/attachment.html From ropteacher at gmail.com Tue Aug 1 18:15:50 2006 From: ropteacher at gmail.com (Gloria Fuentes) Date: Tue, 1 Aug 2006 15:15:50 -0700 Subject: [Assessment 439] Re: Skills Bank In-Reply-To: <44CF7098.7080508@riral.org> References: <44CF7098.7080508@riral.org> Message-ID: <38914de00608011515l5a0ff5dmb34dd2165ce632bd@mail.gmail.com> Hi Howard, I didn't know they had a web based program as well. What version are you using over there? I came into this teaching job not knowing anything about GED preparation. I am teaching Office Occupations and didn't know I would also be preparing my students for the GED as well, which I don't mind even though I am a brand new teacher. I took over this classroom where there had been no teacher for 4 months. The only one that was working with the students was the sweet Case Technician, who knows very little about computers. I had never used Skills Bank and no one was there to show me the ropes on anything; first teaching job, and I have been learning everything the hard way. I finally just figured out how to work the SkillsBank4 version they have there. I did try to get some support on it but found out they don't support version 4 anymore since they are now on version 5. SO--I have been figuring it all out on my own. So far I think it's great, but I have come to a problem where one of my students is ready to go on to Intermediate Math and they say I don't have that with our package. I have been told that when my boss gets back from vacation he has found some money for some new hardware and software. So since we are working on ANTIQUE PC's I thought I would put in for some new ones and also have been checking out some good GED-Prep programs. So far I think SkillsBank is winning. Plato seems VERY expensive and I am hearing it isn't very user friendly. I have also found that working with my students, along with the Skills Bank program, seems to be working BEST! Thank you very much for your response, Gloria On 8/1/06, Howard L. Dooley, Jr. wrote: > > We use My Skills Tutor, a web based program, which works very well with > our basic skills and GED or EDP preparation students. RI purchased a > license as part of Project IDEAL participation, a distance learning > initiative. At our main learning center, all High ESL, ABE and ASE students > experience the program, and have varying levels of participation on it. > Several of our teachers have been using the program with some learners, in > preparation for an expanded use this fall with our general ed, community > based programs (which are mostly part-time evening programs). We intend to > offer learners a blended model (distance learning/ classroom instruction). > Learners who can't attend every class or need to stop out will have support > in continuing with My Skills Tutor. For learners in class, homework > assignments will include using the program, or to add reinforcement and more > time on task for individual needs. I agree with Virginia: we find most > learners can access and stay with the program long enough to achieve some > skill development. Very few students can use it as a stand-alone however, > mostly because they lack independent study skills and self-monitoring. > Staff is discussing if and how we can explicitly instruct in these areas. > > Howard D. > > > Virginia Tardaewether wrote: > > We use skills bank every day here in our basic skills lab. We use it for > credit as well as non credit students. 
We have the students run their own > tracking sheet rather than the tracking portion of the program (due to our > limitations). In Salem, they have a lab assistant who keeps track of the > students. This software works great for basic reading, writing and math. > It also has some science, etc. depending upon what you purchase. Students > engage with it well and will stick to it long enough to gain skills using > the software. Haven't worked enough with PLATO to comment. > > Va > > > ------------------------------ > > *From:* assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] > *On Behalf Of *Gloria Fuentes > *Sent:* Monday, July 31, 2006 3:14 PM > *To:* Assessment at nifl.gov > *Subject:* [Assessment 430] Re: Skills Bank > > > > Is anyone familiar with the SkillsBank software or Plato, if so what do > you think about it for GED preparation? > > -- > Gloria Fuentes > > ------------------------------ > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060801/8ce469ef/attachment.html From Khinson at future-gate.com Tue Aug 1 19:19:40 2006 From: Khinson at future-gate.com (Katrina Hinson) Date: Wed, 02 Aug 2006 01:19:40 +0200 Subject: [Assessment 440] Re: Skills Bank In-Reply-To: References: <38914de00607311513pc84ebdegb828e5f6dba92f61@mail.gmail.com> Message-ID: <44CFA94B.121D.00A0.0@future-gate.com> We have Skills Bank 4, My Skills Tutor (online), MHC Interactive, Keytrain and A+Anywhere. Additionally I have experience with using PLATO as well. We've kind of taken a different approach and use different programs with different levels of students. My Skills Tutor is one we use with our midrange level students while A+ and MHC are programs we use with our higher level students. Keytrain is one we use for students who might only need to "brush" up and can do so quickly. PLATO is similar to A+ in that it is very academic oriented and the readability level of the questions and content in both is higher than that of My Skills Tutor. All of the software programs, when appropriately used, are excellent resources and are all about the same in terms of "user friendliness." I have to do the administration for most of the programs I've listed and I always create a student account so that I can practice the same assignments I give with my students. My students like all of them...and recognize that each has its own different difficulty levels. One may work on fractions in one program and %'s in another and use still another for writing practice. The key is finding where your own comfort level is as well as the comfort level of your student. 
Like someone else, I'm not sure I like the idea of them being stand alone - albeit with A+Anywhere our program does utilize A+Anywhere for Adult High School Students who need a credit for a class that we don't necessarily offer every semester but is needed for graduation. It correlates with our state's Standard Course of Study and also offers a college readiness module. Regards, Katrina Hinson ________________________________ From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes Sent: Mon 7/31/2006 4:13 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes From ropteacher at gmail.com Wed Aug 2 08:04:12 2006 From: ropteacher at gmail.com (Gloria Fuentes) Date: Wed, 2 Aug 2006 05:04:12 -0700 Subject: [Assessment 441] Re: Skills Bank In-Reply-To: <44CFA94B.121D.00A0.0@future-gate.com> References: <38914de00607311513pc84ebdegb828e5f6dba92f61@mail.gmail.com> <44CFA94B.121D.00A0.0@future-gate.com> Message-ID: <38914de00608020504o70b46c16w775202c0d4a81d5b@mail.gmail.com> Hi Katrina, Thank you so much for your reply. I have Skills Bank 4, and I do like it. I teach at-risk kids, ages 18 to 21. Some of my kids just drifted through school not gaining anything from it. Others are stuck in their math or other areas, but a LOT of them are at an early elementary level; it really makes me wonder how they made it through school all the years they did. But that is beside the point; my main objective is to teach them the skills they need to pass the GED! We don't have My Skills Tutor - is that something that is designed by the Skills Bank people? MHC Interactive, Keytrain, and A+Anywhere - are these through Skills Bank? I would really love to take a look at them. We don't have a lot of money for our program and my own pocket book is tapped out! So whatever I decide on will have to be a good price. Plato is pretty expensive and I don't think I will be able to get my Director to go along with it. Anyways, thank you so much for your input. Gloria On 8/1/06, Katrina Hinson wrote: > > > We have Skills Bank 4, My Skills Tutor (online), MHC Interactive, > Keytrain and A+Anywhere. Additionally I have experience with using > PLATO as well. We've kind of taken a different approach and use > different programs with different levels of students. My Skills Tutor is > one we use with our midrange level students while A+ and MHC are > programs we use with our higher level students. Keytrain is one we use > for students who might only need to "brush" up and can do so quickly. > PLATO is similar to A+ in that it is very academic oriented and the > readability level of the questions and content in both is higher than > that of My Skills Tutor. > > All of the software programs when appropriately used are excellent > resources are all about the same in terms of "user friendliness." I have > to do the administration for most of the programs I've listed and I > always create a student account so that I can practice the same > assignments I give with my students. My students like all of them...and > recognize that each has its own different difficulty levels. One may > work on fractions in one program and %'s in another and use still > another for writing practice. The key is finding where your own comfort > level is as well as the comfort level of your student. 
> > They are excellent ways to reinforce skills or provide practice at home > for students who want homework or for students who may have to take a > "break" from class for whatever reason. It gives them a chance to keep > their skills up if they so desire. Like someone else, I'm not sure I > like the idea of them being stand alone - albeit with A+Anywhere our > program does utilize A+Anywhere for Adult High School Students who need > a credit for a class that we don't necessarily offer every semester but > is needed for graduation. It correlates with our states Standard Course > of Study and also offers a college readiness module. > > > Regards, > > Katrina Hinson > > > ________________________________ > > From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes > Sent: Mon 7/31/2006 4:13 PM > To: Assessment at nifl.gov > Subject: [Assessment 430] Re: Skills Bank > > > Is anyone familiar with the SkillsBank software or Plato, if so what do > you think about it for GED preparation? > > -- > Gloria Fuentes > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/3fda8f4d/attachment.html From Karen.Limkemann at fwliteracyalliance.org Wed Aug 2 11:00:57 2006 From: Karen.Limkemann at fwliteracyalliance.org (Limkemann, Karen) Date: Wed, 2 Aug 2006 11:00:57 -0400 Subject: [Assessment 442] Re: Skills Bank Message-ID: We also use KeyTrain as an instructional method but we use it with all levels including ESL. The auditory support is very popular with the ESL students who frequently have minimal opportunity to listen to standard spoken English. They love the repeat button! The beginning skills portions of the software are the most critical for lower level students. We had a situation here in Indiana where the South Bend area bought the KeyTrain program but NOT the beginning skills. They had to go back and get those portions. Without the beginning skills sections Katrina is right, it would only be useful for higher end students. Karen Limkemann Ft. Wayne, IN -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Katrina Hinson Sent: Wednesday, August 02, 2006 8:46 AM To: The Assessment Discussion List Subject: [Assessment 440] Re: Skills Bank We have Skills Bank 4, My Skills Tutor (online), MHC Interactive, Keytrain and A+Anywhere. Additionally I have experience with using PLATO as well. We've kind of taken a different approach and use different programs with different levels of students. My Skills Tutor is one we use with our midrange level students while A+ and MHC are programs we use with our higher level students. Keytrain is one we use for students who might only need to "brush" up and can do so quickly. PLATO is similar to A+ in that it is very academic oriented and the readability level of the questions and content in both is higher than that of My Skills Tutor. All of the software programs when appropriately used are excellent resources are all about the same in terms of "user friendliness." I have to do the administration for most of the programs I've listed and I always create a student account so that I can practice the same assignments I give with my students. 
My students like all of them...and recognize that each has its own different difficulty levels. One may work on fractions in one program and %'s in another and use still another for writing practice. The key is finding where your own comfort level is as well as the comfort level of your student. They are excellent ways to reinforce skills or provide practice at home for students who want homework or for students who may have to take a "break" from class for whatever reason. It gives them a chance to keep their skills up if they so desire. Like someone else, I'm not sure I like the idea of them being stand alone - albeit with A+Anywhere our program does utilize A+Anywhere for Adult High School Students who need a credit for a class that we don't necessarily offer every semester but is needed for graduation. It correlates with our states Standard Course of Study and also offers a college readiness module. Regards, Katrina Hinson ________________________________ From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes Sent: Mon 7/31/2006 4:13 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From tarv at chemeketa.edu Wed Aug 2 12:01:46 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Wed, 2 Aug 2006 09:01:46 -0700 Subject: [Assessment 443] Re: Skills Bank In-Reply-To: <38914de00608011515l5a0ff5dmb34dd2165ce632bd@mail.gmail.com> Message-ID: I agree with Howard Gloria. We used the web-based. The only issue is cost as is true for all software. I have many ESL, ABE and GED students who use it along with other instructional strategies. Have you tried getting your students to figure it out instead of you? You might have some great computer "geeks" hidden there in the group....that gives them focus and frees your time. va ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gloria Fuentes Sent: Tuesday, August 01, 2006 3:16 PM To: hdooley at riral.org; The Assessment Discussion List Subject: [Assessment 439] Re: Skills Bank Hi Howard, I didn't know they had a web based program as well. What version are you using over there? I came into this teaching job not knowing anything about GED preparation. I am teaching Office Occupations and didn't know I would also be preparing my students for GED as well. Which I don't mind even though I am a brand new teacher. I took over this classroom where their had been no teacher for 4 months. The only one that was working with the students was the sweet Case Technician who knows very little about computers. I had never used Skills Bank and no one was there to show me the ropes on anything, first teaching job and I have been learning everything the hard way. I finally just figured out how to work the SkillsBank4 version they have there. I did try to get some support on it but found out they don't support version 4 anymore since they are now on version 5. SO--I have been figuring it all out on my own. So far I think its great but I have come to a problem where one of my students is ready to go on to Intermediate Math and they say I don't have that, with our package. 
I have been told that when my boss gets back from vacation he has found some money for some new hardware and software. So since we are working on ANTIQUE PC's I thought I would put in for some new ones and also have been checking out some good GED-Prep programs. So far I think SkillsBank is winning. Plato seems VERY expensive and I am hearing it isn't very user friendly. I have also found working with my students, along with the Skills Bank program seems to be working BEST! Thank you very much for your response, Gloria On 8/1/06, Howard L. Dooley, Jr. wrote: We use My Skills Tutor, a web based program, which works very well with our basic skills and GED or EDP preparation students. RI purchased a license as part of Project IDEAL participation, a distance learning initiative. At our main learning center, all High ESL, ABE and ASE students experience the program, and have varying levels of participation on it. Several of our teachers have been using the program with some learners, in preparation for an expanded use this fall with our general ed, community based programs (which are mostly part-time evening programs). We intend to offer learners a blended model (distance learning/ classroom instruction). Learners who can't attend every class or need to stop out will have support in continuing with My Skills Tutor. For learners in class, homework assignments will include using the program, or to add reinforcement and more time on task for individual needs. I agree with Virginia: we find most learners can access and stay with the program long enough to achieve some skill development. Very few students can use it as a stand-alone however, mostly because they lack independent study skills and self-monitoring. Staff is discussing if and how we can explicitly instruct in these areas. Howard D. Virginia Tardaewether wrote: We use skills bank every day here in our basic skills lab. We use it for credit as well as non credit students. We have the students run their own tracking sheet rather than the tracking portion of the program (due to our limitations). In Salem, they have a lab assistant who keeps track of the students. This software works great for basic reading, writing and math. It also has some science, etc. depending upon what you purchase. Students engage with it well and will stick to it long enough to gain skills using the software. Haven't worked enough with PLATO to comment. Va ________________________________ From: assessment-bounces at nifl.gov [ mailto:assessment-bounces at nifl.gov ] On Behalf Of Gloria Fuentes Sent: Monday, July 31, 2006 3:14 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes ________________________________ ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/9946fc3b/attachment.html From tarv at chemeketa.edu Wed Aug 2 12:10:40 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Wed, 2 Aug 2006 09:10:40 -0700 Subject: [Assessment 444] Re: Skills Bank In-Reply-To: Message-ID: Tina, I've used skills tutor with Math 20, math 052, GED, ESL transition students, ABE, GED and reading 080, 090 and 115. It has a large enough system that there are lessons that can be linked to individuals for skills gain. Va ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Tina_Luffman at yc.edu Sent: Tuesday, August 01, 2006 1:28 PM To: The Assessment Discussion List Subject: [Assessment 438] Re: Skills Bank Hi List members, We use Skills Tutor, a Web based version of Skills Bank here at Yavapai College, and it has been highly valuable for both developmental education and freshman level work in English and Math classes. Our ADEAL GED Online classes have actually taken a turn this semester. Three semesters ago the state of Arizona bought us GED Online from MHC to serve our online student population. Yavapai College had already bought us the rights to use Skills Tutor. After experimenting with both software programs, most of our students this year have self selected Skills Tutor as the favorite program of the two, and are using it almost exclusively on the Verde Campus. I have seen PLATO and believe that Skills Tutor is the better program over that one as well, although each program has its value. Anyone else have any ideas? Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -----assessment-bounces at nifl.gov wrote: ----- To: "The Assessment Discussion List" From: "Virginia Tardaewether" Sent by: assessment-bounces at nifl.gov Date: 07/31/2006 05:00PM Subject: [Assessment 431] Re: Skills Bank We use skills bank every day here in our basic skills lab. We use it for credit as well as non credit students. We have the students run their own tracking sheet rather than the tracking portion of the program (due to our limitations). In Salem, they have a lab assistant who keeps track of the students. This software works great for basic reading, writing and math. It also has some science, etc. depending upon what you purchase. Students engage with it well and will stick to it long enough to gain skills using the software. Haven't worked enough with PLATO to comment. Va ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gloria Fuentes Sent: Monday, July 31, 2006 3:14 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/fab58aaa/attachment.html From tarv at chemeketa.edu Wed Aug 2 12:11:34 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Wed, 2 Aug 2006 09:11:34 -0700 Subject: [Assessment 445] Re: definitions of diagnosis In-Reply-To: <1F7E4864-064C-4398-8273-437EC7BA46DB@comcast.net> Message-ID: Wow Thanks -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of David Rosen Sent: Monday, July 31, 2006 7:12 PM To: The Assessment Discussion List Subject: [Assessment 433] Re: definitions of diagnosis Thanks Virginia. That's an important addition. I have added it to the definition in the glossary section of the Adult Literacy Education Wiki http://wiki.literacytent.org/index.php/Diagnostic_assessment David David J. Rosen djrosen at comcast.net On Jul 31, 2006, at 8:03 PM, Virginia Tardaewether wrote: > I guess I'd add student goals to what David said, other wise, > that's it > > -----Original Message----- > From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] > On Behalf Of David Rosen > Sent: Monday, July 31, 2006 2:50 PM > To: The Assessment Discussion List > Subject: [Assessment 429] Re: definitions of diagnosis > > Hi Robin, > > Diagnostic assessment is a kind of formative assessment of a > student's strengths, weaknesses, knowledge, and skills used by a > teacher to adapt instruction to meet the student's unique needs. > > David > > David J. Rosen > djrosen at comcast.net > > On Jul 31, 2006, at 3:22 PM, Robin Millar wrote: > >> Hello, I am writing an article on diagnostic assessment and I am >> looking >> for a definition. I know what it is and that's what I end up >> describing. But I'm wondering if people have a definition they use. >> Help is welcome! Robin Millar >> >> Dr. Robin Millar >> Executive Director >> Centre for Education and Work >> 515 Portage Avenue >> Winnipeg, MB R3B 2E9 >> 204-786-9395 >> >> >> ------------------------------- >> National Institute for Literacy >> Assessment mailing list >> Assessment at nifl.gov >> To unsubscribe or change your subscription settings, please go to >> http://www.nifl.gov/mailman/listinfo/assessment > > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From Tina_Luffman at yc.edu Wed Aug 2 13:17:35 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Wed, 2 Aug 2006 10:17:35 -0700 Subject: [Assessment 446] Re: Skills Bank In-Reply-To: Message-ID: Hi Virginia and all, Yes, I have used Skills Tutor with ENG 140 (a freshman level reading class) for the college as well as for GED classes. The college had Skills Bank and then upgraded to Skills Tutor about 3 years ago. 
The parts I like about Skills Tutor are: 1) the way the software engages the students, 2) the fact that the software self-diagnoses the student lessons after the pretests, 3) the software grades the lessons so we can see right away if the student needs to go back and do the lesson over, or perhaps get additional lessons from another program or a textbook, or teacher instruction, 4) the versatility of the program whereby the teacher actually designs which lessons he/she thinks are important for this particular student or class, 5) the many levels of the software. Yavapai College bought Learning Milestones as well as the regular Skills Tutor lessons, so we can use the materials all the way down into higher level elementary school all the way into high school level work. What I really like about the GED Online software by McGraw Hill is that it has pretests and posttests that simulate the actual GED exam, and then offer lessons specific to building those skills the student misses in the pretest, and then offers the posttest to show improvement. The program is definitely less broad than most and needs more outside support, but is a great tool for students from about 8th grade level on up to prepare them for the GED exam. This software program offers a lot of great video presentations as well primarily in Science and Social Studies. For the online students I get who are close to passing the GED exam, this program has been highly effective. I am not qualified to talk about PLATO as I have not used it with students, only saw it at a training in Phoenix and played with it there for a while. The best software program I have seen for reading improvement is Reading Power Modules, which I mentioned before in the NIFL list. This program is from Steck Vaughn, 1999, and I am not sure if it is still available for sale. The program is lan based rather than web based, but is highly engaging with students and has brought up reading rates as much as 30 percent in one semester with my ENG 140 classes, and nearly always brings up a TABE score at least one level within three months' usage. Thanks for letting me share. Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/7ee3c3cd/attachment.html From ropteacher at gmail.com Wed Aug 2 15:44:32 2006 From: ropteacher at gmail.com (Gloria Fuentes) Date: Wed, 2 Aug 2006 12:44:32 -0700 Subject: [Assessment 447] Re: Skills Bank In-Reply-To: References: Message-ID: <38914de00608021244s7b10be70ke8c72ba07ff87eca@mail.gmail.com> Virginia, That is an excellent idea! Right now most of my students are going away to a camp for the next 10 days to the University of Redlands, a great program they have going on there. They will also get paid for this at the end of the 10 days. I think I may have one student that didn't want to participate so now I know what to do with him! Thank you, Gloria On 8/2/06, Tina_Luffman at yc.edu wrote: > > > Hi Virginia and all, > > Yes, I have used *Skills Tutor* with ENG 140 (a freshman level reading > class) for the college as well as for GED classes. The college had Skills > Bank and then upgraded to Skills Tutor about 3 years ago. 
The parts I like > about Skills Tutor are: > > 1) the way the software engages the students, > > 2) the fact that the software self-diagnoses the student lessons after the > pretests, > > 3) the software grades the lessons so we can see right away if the student > needs to go back and do the lesson over, or perhaps get additional lessons > from another program or a textbook, or teacher instruction, > > 4) the versatility of the program whereby the teacher actually designs > which lessons he/she thinks are important for this particular student or > class, > > 5) the many levels of the software. Yavapai College bought Learning > Milestones as well as the regular Skills Tutor lessons, so we can use the > materials all the way down into higher level elementary school all the way > into high school level work. > > What I really like about the *GED Online* software by McGraw Hill is that > it has pretests and posttests that simulate the actual GED exam, and then > offer lessons specific to building those skills the student misses in the > pretest, and then offers the posttest to show improvement. The program is > definitely less broad than most and needs more outside support, but is a > great tool for students from about 8th grade level on up to prepare them for > the GED exam. This software program offers a lot of great video > presentations as well primarily in Science and Social Studies. For the > online students I get who are close to passing the GED exam, this program > has been highly effective. > > I am not qualified to talk about *PLATO* as I have not used it with > students, only saw it at a training in Phoenix and played with it there for > a while. > > The best software program I have seen for reading improvement is* Reading > Power Modules*, which I mentioned before in the NIFL list. This program is > from Steck Vaughn, 1999, and I am not sure if it is still available for > sale. The program is lan based rather than web based, but is highly engaging > with students and has brought up reading rates as much as 30 percent in one > semester with my ENG 140 classes, and nearly always brings up a TABE score > at least one level within three months' usage. > > Thanks for letting me share. > > > Tina > > > > > > > > Tina Luffman > Coordinator, Developmental Education > Verde Valley Campus > 928-634-6544 > tina_luffman at yc.edu > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/b4206d80/attachment.html From ryanryanc at yahoo.com Wed Aug 2 15:52:08 2006 From: ryanryanc at yahoo.com (Ryan Hall) Date: Wed, 02 Aug 2006 15:52:08 -0400 Subject: [Assessment 448] Re: Skills Bank In-Reply-To: <38914de00607311513pc84ebdegb828e5f6dba92f61@mail.gmail.com> Message-ID: I have never heard of Plato, but Skills Bank is used in the college tutoring lab where I work. We use the reading, writing, and math sections. The program is mostly used by students who are in the remedial courses (reading, writing, and math), but students taking courses for credit also use the program. The students say they like it because it teaches the skills they need without giving tons of details they get lost in, and it asks questions throughout each section that keeps them engaged. 
I think it would help your students prepare for the GED. I know this is a very brief answer to your question, but if you have other questions, let me know. Ryan On 7/31/06 6:13 PM, "Gloria Fuentes" wrote: > Is anyone familiar with the SkillsBank software or Plato, if so what do you > think about it for GED preparation? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/b7575d18/attachment.html From ropteacher at gmail.com Wed Aug 2 17:09:55 2006 From: ropteacher at gmail.com (Gloria Fuentes) Date: Wed, 2 Aug 2006 14:09:55 -0700 Subject: [Assessment 449] Re: Skills Bank In-Reply-To: References: <38914de00607311513pc84ebdegb828e5f6dba92f61@mail.gmail.com> Message-ID: <38914de00608021409g593b059al8ab7a00b8a01d888@mail.gmail.com> Thank you Ryan, What version are you guys using? From what I have heard so far of Plato, I am thinking it may be a little bit too much for my students, and WAY too much for our district's budget, whereas SkillsBank seems much more affordable. Gloria On 8/2/06, Ryan Hall wrote: > > I have never heard of Plato, but Skills Bank is used in the college > tutoring lab where I work. We use the reading, writing, and math sections. > The program is mostly used by students who are in the remedial courses > (reading, writing, and math), but students taking courses for credit also > use the program. The students say they like it because it teaches the skills > they need without giving tons of details they get lost in, and it asks > questions throughout each section that keeps them engaged. I think it would > help your students prepare for the GED. > I know this is a very brief answer to your question, but if you have other > questions, let me know. > > Ryan > > > On 7/31/06 6:13 PM, "Gloria Fuentes" wrote: > > Is anyone familiar with the SkillsBank software or Plato, if so what do > you think about it for GED preparation? > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/45bd3633/attachment.html From tarv at chemeketa.edu Wed Aug 2 20:49:16 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Wed, 2 Aug 2006 17:49:16 -0700 Subject: [Assessment 450] Re: Skills Bank In-Reply-To: <38914de00608021244s7b10be70ke8c72ba07ff87eca@mail.gmail.com> Message-ID: I have found students to be terrific classroom resources....and why should I do all the work when they can gain skills by doing it? ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gloria Fuentes Sent: Wednesday, August 02, 2006 12:45 PM To: The Assessment Discussion List Subject: [Assessment 447] Re: Skills Bank Virginia, That is an excellent idea! Right now most of my students are going away to a camp for the next 10 days to the University of Redlands, a great program they have going on there. They will also get paid for this at the end of the 10 days. I think I may have one student that didn't want to participate so now I know what to do with him!
Thank you, Gloria On 8/2/06, Tina_Luffman at yc.edu wrote: Hi Virginia and all, Yes, I have used Skills Tutor with ENG 140 (a freshman level reading class) for the college as well as for GED classes. The college had Skills Bank and then upgraded to Skills Tutor about 3 years ago. The parts I like about Skills Tutor are: 1) the way the software engages the students, 2) the fact that the software self-diagnoses the student lessons after the pretests, 3) the software grades the lessons so we can see right away if the student needs to go back and do the lesson over, or perhaps get additional lessons from another program or a textbook, or teacher instruction, 4) the versatility of the program whereby the teacher actually designs which lessons he/she thinks are important for this particular student or class, 5) the many levels of the software. Yavapai College bought Learning Milestones as well as the regular Skills Tutor lessons, so we can use the materials all the way down into higher level elementary school all the way into high school level work. What I really like about the GED Online software by McGraw Hill is that it has pretests and posttests that simulate the actual GED exam, and then offer lessons specific to building those skills the student misses in the pretest, and then offers the posttest to show improvement. The program is definitely less broad than most and needs more outside support, but is a great tool for students from about 8th grade level on up to prepare them for the GED exam. This software program offers a lot of great video presentations as well primarily in Science and Social Studies. For the online students I get who are close to passing the GED exam, this program has been highly effective. I am not qualified to talk about PLATO as I have not used it with students, only saw it at a training in Phoenix and played with it there for a while. The best software program I have seen for reading improvement is Reading Power Modules, which I mentioned before in the NIFL list. This program is from Steck Vaughn, 1999, and I am not sure if it is still available for sale. The program is lan based rather than web based, but is highly engaging with students and has brought up reading rates as much as 30 percent in one semester with my ENG 140 classes, and nearly always brings up a TABE score at least one level within three months' usage. Thanks for letting me share. Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060802/186ce567/attachment.html From khinson at future-gate.com Wed Aug 2 21:58:58 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Thu, 03 Aug 2006 03:58:58 +0200 Subject: [Assessment 451] Re: Skills Bank Message-ID: <44D17482020000A000002F9E@fgwiel01a.wie.de.future-gate.com> I'm glad you pointed this out. I'm going to have to investigate this on our end. We've been looking for something like this and this might be something for us to add. Thanks! Katrina >>> "Limkemann, Karen" 08/02/06 8:00 AM >>> We also use KeyTrain as an instructional method but we use it with all levels including ESL. 
The auditory support is very popular with the ESL students who frequently have minimal opportunity to listen to standard spoken English. They love the repeat button! The beginning skills portions of the software are the most critical for lower level students. We had a situation here in Indiana where the South Bend area bought the KeyTrain program but NOT the beginning skills. They had to go back and get those portions. Without the beginning skills sections Katrina is right, it would only be useful for higher end students. Karen Limkemann Ft. Wayne, IN -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Katrina Hinson Sent: Wednesday, August 02, 2006 8:46 AM To: The Assessment Discussion List Subject: [Assessment 440] Re: Skills Bank We have Skills Bank 4, My Skills Tutor (online), MHC Interactive, Keytrain and A+Anywhere. Additionally I have experience with using PLATO as well. We've kind of taken a different approach and use different programs with different levels of students. My Skills Tutor is one we use with our midrange level students while A+ and MHC are programs we use with our higher level students. Keytrain is one we use for students who might only need to "brush" up and can do so quickly. PLATO is similar to A+ in that it is very academic oriented and the readability level of the questions and content in both is higher than that of My Skills Tutor. All of the software programs when appropriately used are excellent resources are all about the same in terms of "user friendliness." I have to do the administration for most of the programs I've listed and I always create a student account so that I can practice the same assignments I give with my students. My students like all of them...and recognize that each has its own different difficulty levels. One may work on fractions in one program and %'s in another and use still another for writing practice. The key is finding where your own comfort level is as well as the comfort level of your student. They are excellent ways to reinforce skills or provide practice at home for students who want homework or for students who may have to take a "break" from class for whatever reason. It gives them a chance to keep their skills up if they so desire. Like someone else, I'm not sure I like the idea of them being stand alone - albeit with A+Anywhere our program does utilize A+Anywhere for Adult High School Students who need a credit for a class that we don't necessarily offer every semester but is needed for graduation. It correlates with our states Standard Course of Study and also offers a college readiness module. Regards, Katrina Hinson ________________________________ From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes Sent: Mon 7/31/2006 4:13 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? 
-- Gloria Fuentes ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From khinson at future-gate.com Wed Aug 2 21:59:17 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Thu, 03 Aug 2006 03:59:17 +0200 Subject: [Assessment 452] Re: Skills Bank Message-ID: <44D17495020000A000002FA2@fgwiel01a.wie.de.future-gate.com> MHC Interactive is developed by McGraw-Hill. MySkills Tutor is developed by Achievement Technologies. I don't think this is part of the same package as Skills Bank. It's a different company/product. A+Anywhere Learning System is developed by the American Education Corporation. Off the top of my head, I can't tell you who developed Keytrain. Keytrain is part of a larger system. I don't know what state you're in but you can probably find the representatives for each of these companies for your area. If not, I can provide the contacts I have at least as a jumping off point. Regards, Katrina Hinson >>> "Gloria Fuentes" 08/02/06 5:04 AM >>> Hi Katrina, Thank you so much for your reply. I have the Skills Bank 4, and I do like it. I teach at risk kids, ages 18 to 21. Some of my kids just drifted through school not gaining anything from it. Others are stuck in their math or other areas but a LOT of them are at an early elementary level, it really makes me wonder how they made it through school all the years they did. But that is besides the point, my main objective is to teach them the skills they need to pass the GED! We don't have the My Skills Tutor is that something that is designed by the Skills Bank people? MCH Interactive, Keytrain and A+Anywhere are these through Skills Bank? I would really love to take a look at them. We don't have a lot of money for our program and my own pocket book is tapped out! So whatever I decide on will have to be a good price. The Plato is pretty expensive and I don't think I will be able to get my Director to go along with it. Anyways thank you so much for your input. Gloria On 8/1/06, Katrina Hinson wrote: > > > We have Skills Bank 4, My Skills Tutor (online), MHC Interactive, > Keytrain and A+Anywhere. Additionally I have experience with using > PLATO as well. We've kind of taken a different approach and use > different programs with different levels of students. My Skills Tutor is > one we use with our midrange level students while A+ and MHC are > programs we use with our higher level students. Keytrain is one we use > for students who might only need to "brush" up and can do so quickly. > PLATO is similar to A+ in that it is very academic oriented and the > readability level of the questions and content in both is higher than > that of My Skills Tutor. > > All of the software programs when appropriately used are excellent > resources are all about the same in terms of "user friendliness." I have > to do the administration for most of the programs I've listed and I > always create a student account so that I can practice the same > assignments I give with my students. My students like all of them...and > recognize that each has its own different difficulty levels.
One may > work on fractions in one program and %'s in another and use still > another for writing practice. The key is finding where your own comfort > level is as well as the comfort level of your student. > > They are excellent ways to reinforce skills or provide practice at home > for students who want homework or for students who may have to take a > "break" from class for whatever reason. It gives them a chance to keep > their skills up if they so desire. Like someone else, I'm not sure I > like the idea of them being stand alone - albeit with A+Anywhere our > program does utilize A+Anywhere for Adult High School Students who need > a credit for a class that we don't necessarily offer every semester but > is needed for graduation. It correlates with our states Standard Course > of Study and also offers a college readiness module. > > > Regards, > > Katrina Hinson > > > ________________________________ > > From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes > Sent: Mon 7/31/2006 4:13 PM > To: Assessment at nifl.gov > Subject: [Assessment 430] Re: Skills Bank > > > Is anyone familiar with the SkillsBank software or Plato, if so what do > you think about it for GED preparation? > > -- > Gloria Fuentes > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > -- Gloria Fuentes From Karen.Limkemann at fwliteracyalliance.org Thu Aug 3 09:14:36 2006 From: Karen.Limkemann at fwliteracyalliance.org (Limkemann, Karen) Date: Thu, 3 Aug 2006 09:14:36 -0400 Subject: [Assessment 453] Re: Skills Bank Message-ID: KeyTrain is published by Thinking Media. It is actually part of a larger Employment System that includes the WorkKeys Assessment by ACT. WorkKeys is used as a hiring tool by many companies. You can check out www.keytrain.com for more info. We tell students that even though KeyTrain never talks about GED, math is math, reading is reading etc. It is a nice tool to crosswalk the language barrier between education and employment. Karen Limkemann Ft. Wayne, IN -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Katrina Hinson Sent: Wednesday, August 02, 2006 10:31 PM To: assessment at nifl.gov Subject: [Assessment 452] Re: Skills Bank MHC Interactive is developed by McGrawHill MySkills Tutor is developed by Achievement technologies. I don't think this is part of the same package as Skills Bank. It's a different company/product. A+Anywhere Learning system is developed by the American Education Corporation Off the top of my head, I can't tell you developed Keytrain. Keytrain is part of a larger system. I don't know what state you're in but you can probably find the representatives for each of these companies for your area. If not, I can provide the contacts I have at least as a jumping off point. Regards, Katrina Hinson >>> "Gloria Fuentes" 08/02/06 5:04 AM >>> Hi Katrina, Thank you so much for your reply. I have the Skills Bank 4, and I do like it. I teach at risk kids, ages 18 to 21. Some of my kids just drifted through school not gaining anything from it. Others are stuck in their math or other areas but a LOT of them are at an early elementary level, it really makes me wonder how they made it through school all the years they did. But that is besides the point, my main objective is to teach them the skills they need to pass the GED! 
We don't have the My Skills Tutor is that something that is designed by the Skills Bank people? MCH Interactive, Keytrain and A+Anywhere are these through Skills Bank? I would really love to take a look at them. We don't have a lot of money for our program and my own pocket book is tapped out! So whatever I decide on will have to be a good price. The Plato is pretty expensive and I don't think I will be able to get my Director to go along with it. Anyways thank you so much for your input. Gloria On 8/1/06, Katrina Hinson wrote: > > > We have Skills Bank 4, My Skills Tutor (online), MHC Interactive, > Keytrain and A+Anywhere. Additionally I have experience with using > PLATO as well. We've kind of taken a different approach and use > different programs with different levels of students. My Skills Tutor is > one we use with our midrange level students while A+ and MHC are > programs we use with our higher level students. Keytrain is one we use > for students who might only need to "brush" up and can do so quickly. > PLATO is similar to A+ in that it is very academic oriented and the > readability level of the questions and content in both is higher than > that of My Skills Tutor. > > All of the software programs when appropriately used are excellent > resources are all about the same in terms of "user friendliness." I have > to do the administration for most of the programs I've listed and I > always create a student account so that I can practice the same > assignments I give with my students. My students like all of them...and > recognize that each has its own different difficulty levels. One may > work on fractions in one program and %'s in another and use still > another for writing practice. The key is finding where your own comfort > level is as well as the comfort level of your student. > > They are excellent ways to reinforce skills or provide practice at home > for students who want homework or for students who may have to take a > "break" from class for whatever reason. It gives them a chance to keep > their skills up if they so desire. Like someone else, I'm not sure I > like the idea of them being stand alone - albeit with A+Anywhere our > program does utilize A+Anywhere for Adult High School Students who need > a credit for a class that we don't necessarily offer every semester but > is needed for graduation. It correlates with our states Standard Course > of Study and also offers a college readiness module. > > > Regards, > > Katrina Hinson > > > ________________________________ > > From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes > Sent: Mon 7/31/2006 4:13 PM > To: Assessment at nifl.gov > Subject: [Assessment 430] Re: Skills Bank > > > Is anyone familiar with the SkillsBank software or Plato, if so what do > you think about it for GED preparation? 
> > -- > Gloria Fuentes > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > -- Gloria Fuentes ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From marie.cora at hotspurpartners.com Thu Aug 3 09:40:32 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 3 Aug 2006 09:40:32 -0400 Subject: [Assessment 454] Poverty, Race, & Literacy Guest next week Message-ID: <020901c6b702$6503e710$0302a8c0@LITNOW> Dear colleagues, the following announcement is from Donna Brian, Moderator of the Poverty, Race, and Literacy Discussion List. Marie Cora ******************************* Guest Discussion: Poverty, Race, & Literacy Monday, August 7- Friday, August 11 Guest: Andy Nash- please see Andy's bio below To participate, sign up for the list at http://www.nifl.gov/mailman/listinfo/Povertyliteracy Literacy Discussion List Colleagues, Next week, Monday August 7 - Friday August 11, we have the great good fortune to have as a guest on the Poverty, Race, & Literacy Discussion List Andy Nash, Staff Development Specialist at the New England Literacy Resource Center at World Education. As you can see from her bio below, Andy has experience in lots of different adult education literacy areas, but her overarching concern has been relating literacy to social justice and advocacy for participation in our democracy. Andy introduces this discussion by asking *us* some questions (see paragraph 2 below). She intends to learn from us as we learn from her. Please read her bio and look up The Change Agent ( www.nelrc.org/changeagent) and, if you would like to participate in the discussion, join the list at and be ready starting Monday to participate in a lively discussion about literacy and social justice issues! Donna Donna Brian, Moderator Poverty, Race, and Literacy Discussion List Center for Literacy Studies at The University of Tennessee djgbrian at utk.edu ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Andy Nash's bio: My work in adult education over the past 20 years has focused on building the capacity of adults to use their developing skills to be more informed and active participants in a democracy. I've brought this perspective to my work in ESOL, civic participation, worker education, family literacy, standards-based education, and many years of resource and professional development. Having just finished editing a new resource about bringing issues of social justice into the classroom (see below), I am interested in thinking about the role such materials can play in adult ed. Do you find resources for talking about issues such as gentrification or globalization useful, or do you think educators should stick to more immediately tangible issues such as advocating for more affordable daycare, interpreters at clinics, etc.? In the short amount of time we have, is it necessary to stick with the "local," which is often speaking up for better community services, or are your students also interested in more general problems such as growing incarceration rates, the war(s), or the current debate over whether a president has the right to sidestep federal laws passed by Congress? 
In the interest of being as participatory and responsive to students as possible, does it matter if an issue gets raised by the teacher rather than being initiated by the students? And, of course, what does it all have to do with improving basic academic, language, and job skills? These are all questions we think about when we work on The Change Agent (www.nelrc.org/changeagent), a biannual social justice newspaper for adult educators and learners published by the New England Literacy Resource Center at World Education. It was conceived in 1994 as a tool to educate and mobilize teachers and learners to apply advocacy skills in response to impending federal funding cutbacks for adult education. The first issue was so well received by teachers that we continued to produce more issues. Now well established as a unique resource within the adult education community, The Change Agent continues to promote social action as an important part of the adult learning experience. Each issue explores a different social justice theme through news articles, opinion pieces, classroom activities and lessons, poems, cartoons, interviews, project descriptions, and printed and Web-based resources. "Through the Lens of Social Justice: Using The Change Agent in Adult Education" is a newly published book that celebrates The Change Agent's first decade by gathering its best and most timeless pieces and by offering guidance for educators in how to use the paper. Chapter 1 introduces readers to the kinds of articles and tools that are available in The Change Agent and how they can be used. These include: "Ways In," short visual or textual prompts that can be used with students to draw out their experiences, questions, and concerns about social issues; "Issue Analyses," articles that examine an issue (prisons, school vouchers, health care, etc.) by looking at how our systems work and for whose benefit; and "Students Making Change," accounts of students who have used what they have learned to take some kind of individual or collective action outside the classroom. Chapter 2 provides guidance in how teachers can use the articles to build thematic curriculum units, with sample units for ABE, ESOL, and GED. And Chapter 3 is a collection of articles about the challenges of bringing social justice issues into the classroom and the creative strategies that teachers have used to deal with those challenges. To see sample pages from the book, go to www.nelrc.org/publications/cabook.html -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060803/d1559da5/attachment.html From ropteacher at gmail.com Thu Aug 3 09:55:05 2006 From: ropteacher at gmail.com (Gloria Fuentes) Date: Thu, 3 Aug 2006 06:55:05 -0700 Subject: [Assessment 455] Re: Skills Bank In-Reply-To: References: Message-ID: <38914de00608030655l1c8153aet8739a21274f746d3@mail.gmail.com> Excellent, I will check it out. And being how we are teaching my students career skills as well this sounds great! Thank you, Gloria On 8/3/06, Limkemann, Karen wrote: > > KeyTrain is published by Thinking Media. It is actually part of a > larger Employment System that includes the WorkKeys Assessment by ACT. > WorkKeys is used as a hiring tool by many companies. You can check out > www.keytrain.com for more info. We tell students that even though > KeyTrain never talks about GED, math is math, reading is reading etc. > It is a nice tool to crosswalk the language barrier between education > and employment. 
> > Karen Limkemann > Ft. Wayne, IN > > > -----Original Message----- > From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] > On Behalf Of Katrina Hinson > Sent: Wednesday, August 02, 2006 10:31 PM > To: assessment at nifl.gov > Subject: [Assessment 452] Re: Skills Bank > > MHC Interactive is developed by McGrawHill > MySkills Tutor is developed by Achievement technologies. I don't think > this is part of the same package as Skills Bank. It's a different > company/product. > A+Anywhere Learning system is developed by the American Education > Corporation > Off the top of my head, I can't tell you developed Keytrain. Keytrain is > part of a larger system. I don't know what state you're in but you can > probably find the representatives for each of these companies for your > area. If not, I can provide the contacts I have at least as a jumping > off point. > > > > Regards, > Katrina Hinson > > >>> "Gloria Fuentes" 08/02/06 5:04 AM >>> > Hi Katrina, > > Thank you so much for your reply. I have the Skills Bank 4, and I do > like > it. I teach at risk kids, ages 18 to 21. Some of my kids just drifted > through school not gaining anything from it. Others are stuck in their > math > or other areas but a LOT of them are at an early elementary level, it > really > makes me wonder how they made it through school all the years they did. > But > that is besides the point, my main objective is to teach them the skills > they need to pass the GED! > > > We don't have the My Skills Tutor is that something that is designed by > the > Skills Bank people? MCH Interactive, Keytrain and A+Anywhere are these > through Skills Bank? I would really love to take a look at them. > > We don't have a lot of money for our program and my own pocket book is > tapped out! So whatever I decide on will have to be a good price. The > Plato > is pretty expensive and I don't think I will be able to get my Director > to > go along with it. > > Anyways thank you so much for your input. > > Gloria > > > On 8/1/06, Katrina Hinson wrote: > > > > > > We have Skills Bank 4, My Skills Tutor (online), MHC Interactive, > > Keytrain and A+Anywhere. Additionally I have experience with using > > PLATO as well. We've kind of taken a different approach and use > > different programs with different levels of students. My Skills Tutor > is > > one we use with our midrange level students while A+ and MHC are > > programs we use with our higher level students. Keytrain is one we use > > for students who might only need to "brush" up and can do so quickly. > > PLATO is similar to A+ in that it is very academic oriented and the > > readability level of the questions and content in both is higher than > > that of My Skills Tutor. > > > > All of the software programs when appropriately used are excellent > > resources are all about the same in terms of "user friendliness." I > have > > to do the administration for most of the programs I've listed and I > > always create a student account so that I can practice the same > > assignments I give with my students. My students like all of > them...and > > recognize that each has its own different difficulty levels. One may > > work on fractions in one program and %'s in another and use still > > another for writing practice. The key is finding where your own > comfort > > level is as well as the comfort level of your student. 
> > > > They are excellent ways to reinforce skills or provide practice at > home > > for students who want homework or for students who may have to take a > > "break" from class for whatever reason. It gives them a chance to keep > > their skills up if they so desire. Like someone else, I'm not sure I > > like the idea of them being stand alone - albeit with A+Anywhere our > > program does utilize A+Anywhere for Adult High School Students who > need > > a credit for a class that we don't necessarily offer every semester > but > > is needed for graduation. It correlates with our states Standard > Course > > of Study and also offers a college readiness module. > > > > > > Regards, > > > > Katrina Hinson > > > > > > ________________________________ > > > > From: assessment-bounces at nifl.gov on behalf of Gloria Fuentes > > Sent: Mon 7/31/2006 4:13 PM > > To: Assessment at nifl.gov > > Subject: [Assessment 430] Re: Skills Bank > > > > > > Is anyone familiar with the SkillsBank software or Plato, if so what > do > > you think about it for GED preparation? > > > > -- > > Gloria Fuentes > > ------------------------------- > > National Institute for Literacy > > Assessment mailing list > > Assessment at nifl.gov > > To unsubscribe or change your subscription settings, please go to > > http://www.nifl.gov/mailman/listinfo/assessment > > > > > > -- > Gloria Fuentes > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060803/bc1afaad/attachment.html From cook.sandra at northlandscollege.sk.ca Thu Aug 3 11:03:12 2006 From: cook.sandra at northlandscollege.sk.ca (Sandra Cook) Date: Thu, 3 Aug 2006 09:03:12 -0600 Subject: [Assessment 456] Re: Skills Bank In-Reply-To: <44CF7098.7080508@riral.org> Message-ID: Is there a web site that I can go to check out this skills bank as we have also been looking for something to use..we have plato but it sounds like this skills bank may be a bit more user friendly to our literacy students. Sandra Cook Northlands College La Ronge, Sask. Canada _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Howard L. Dooley, Jr. Sent: Tuesday, August 01, 2006 9:18 AM To: The Assessment Discussion List Subject: [Assessment 434] Re: Skills Bank We use My Skills Tutor, a web based program, which works very well with our basic skills and GED or EDP preparation students. RI purchased a license as part of Project IDEAL participation, a distance learning initiative. At our main learning center, all High ESL, ABE and ASE students experience the program, and have varying levels of participation on it. Several of our teachers have been using the program with some learners, in preparation for an expanded use this fall with our general ed, community based programs (which are mostly part-time evening programs). We intend to offer learners a blended model (distance learning/ classroom instruction). 
Learners who can't attend every class or need to stop out will have support in continuing with My Skills Tutor. For learners in class, homework assignments will include using the program, or to add reinforcement and more time on task for individual needs. I agree with Virginia: we find most learners can access and stay with the program long enough to achieve some skill development. Very few students can use it as a stand-alone however, mostly because they lack independent study skills and self-monitoring. Staff is discussing if and how we can explicitly instruct in these areas. Howard D. Virginia Tardaewether wrote: We use skills bank every day here in our basic skills lab. We use it for credit as well as non credit students. We have the students run their own tracking sheet rather than the tracking portion of the program (due to our limitations). In Salem, they have a lab assistant who keeps track of the students. This software works great for basic reading, writing and math. It also has some science, etc. depending upon what you purchase. Students engage with it well and will stick to it long enough to gain skills using the software. Haven't worked enough with PLATO to comment. Va _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Gloria Fuentes Sent: Monday, July 31, 2006 3:14 PM To: Assessment at nifl.gov Subject: [Assessment 430] Re: Skills Bank Is anyone familiar with the SkillsBank software or Plato, if so what do you think about it for GED preparation? -- Gloria Fuentes _____ ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060803/72db6cb5/attachment.html From Tina_Luffman at yc.edu Thu Aug 3 11:54:04 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Thu, 3 Aug 2006 08:54:04 -0700 Subject: [Assessment 457] Re: Skills Bank Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060803/b6189605/attachment.html From Tina_Luffman at yc.edu Thu Aug 3 11:55:46 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Thu, 3 Aug 2006 08:55:46 -0700 Subject: [Assessment 458] Re: Skills Bank Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060803/def6fe0a/attachment.html From marie.cora at hotspurpartners.com Thu Aug 3 13:47:44 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 3 Aug 2006 13:47:44 -0400 Subject: [Assessment 459] Reading discussion on Women & Literacy List Message-ID: <001401c6b724$edb4c2b0$0302a8c0@LITNOW> Colleagues, The following book discussion will take place on the Women & Literacy Discussion List. 
To subscribe, go to: http://www.nifl.gov/mailman/listinfo/Womenliteracy **************************** After counting the votes, two readings have been selected for our first reading discussion that Mev Miller will facilitate from August 15 - 22, 2006: 1) Moving Beyond "Stupid": Taking Account of the Impact of Violence on Women's Learning (12 pages) 2) Chapter 5 from Too Scared to Learn, "Learning in the Context of Trauma: The Challenge of Setting Goals" (37 pages) If you are interested in participating in the discussion, please download and read these two articles from the Jenny Horsman website: http://www.jennyhorsman.com (if you look at the right side bar of her website, you will see the two readings listed. Click on the readings and you will be taken to the material. Depending on your computer, it may take a few minutes for the article to download): *** These articles are posted as a courtesy on the Internet for the purposes of this discussion. They will be available for download only until August 22, 2006. *** As supplemental reading, you may also want to look at this article available on the WE LEARN website: "But Is It Education?" The Challenge of Creating Effective Learning for Survivors of Trauma -- Women's Studies Quarterly, 32: #1&2, 2004-- (16 pages) http://www.litwomen.org/Research/horsman_wsq.pdf Guidelines for Discussion * Do not begin discussing the articles until the group is formally opened by the facilitator on Tuesday, Aug. 15, 2006 (this will give participants time to read prior to discussion). * During the designated discussion period, use the assigned discussion subject line each time you post. (Mev will announce it in email that opens the discussion) * Be mindful that many people only check email once a day or sporadically. As with any discussion, if you have made a post, please allow space and time for others to come into the discussion. * Remember, this is an open, public discussion. If you have something private or sensitive to respond, you may want to take it off list with an individual. The discussion will begin on Aug. 15, 2006 with an opening statement by the facilitator, Mev Miller. We hope you will join us. ------------------------------- Mev Miller, Ed.D. is director and founder of WE LEARN (Women expanding Literacy Education action resource Network -- http://litwomen.org/welearn.html). A long time feminist activist, Mev has years of experience in facilitating reading-discussion circles on a variety of women's issues. Her experience also includes facilitating Women Leading Through Reading Reading-Discussion Circles with women in both ABE and ESOL learning Settings.WE LEARN Women Expanding: Literacy Education Action Resource Network www.litwomen.org/welearn.html Please encourage your friends/colleagues to join us. They can subscribe at: http://www.nifl.gov/mailman/listinfo/Womenliteracy _______________________________________________ From marie.cora at hotspurpartners.com Thu Aug 3 13:48:41 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 3 Aug 2006 13:48:41 -0400 Subject: [Assessment 460] WE LEARN Opportunity in Minnesota Message-ID: <001501c6b725$0f267790$0302a8c0@LITNOW> Professional Development opportunity for people in Minnesota and environs... Mev Miller, founder and director of WE LEARN, will be offering a workshop, Women & Literacy: Moving to Power and Participation, on Thurs., August 24, 2006 from 3:00 - 5:30 at the Minnesota Literacy Council in St. Paul, MN. 
In this workshop, participants will: a) strengthen their understanding of issues and challenges for women in adult basic/literacy education programs and, b) learn about specific resources to use in curriculum and lesson planning in order to support women's learning. Participants will also have opportunity to discuss options for developing a regional WE LEARN network in order to create on-going support for working with women's literacy issues and needs. For more details, go to: http://www.litwomen.org/regions/2006mnflyer.pdf WE LEARN Women Expanding: Literacy Education Action Resource Network www.litwomen.org/welearn.html Mev Miller, Ed.D., Director 182 Riverside Ave. Cranston, RI 02910 401-383-4374 welearn at litwomen.org From ropteacher at gmail.com Thu Aug 3 15:29:22 2006 From: ropteacher at gmail.com (Gloria Fuentes) Date: Thu, 3 Aug 2006 12:29:22 -0700 Subject: [Assessment 461] Re: Skills Bank In-Reply-To: References: <44CF7098.7080508@riral.org> Message-ID: <38914de00608031229x4aa8c66ax5e598f35c39c9938@mail.gmail.com> Sandra, What is the Plato program like? How do you like it? What do your students think about it? I believe you can just go to www.skillsbank.com and check it out! I really do like it but they don't support the version we are using anymore. Gloria On 8/3/06, Sandra Cook wrote: > > Is there a web site that I can go to check out this skills bank as we > have also been looking for something to use?.we have plato but it sounds > like this skills bank may be a bit more user friendly to our literacy > students. > > Sandra Cook > > Northlands College > > La Ronge, Sask. Canada > > > ------------------------------ > > *From:* assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] *On > Behalf Of *Howard L. Dooley, Jr. > *Sent:* Tuesday, August 01, 2006 9:18 AM > > *To:* The Assessment Discussion List > *Subject:* [Assessment 434] Re: Skills Bank > > > > We use My Skills Tutor, a web based program, which works very well with > our basic skills and GED or EDP preparation students. RI purchased a > license as part of Project IDEAL participation, a distance learning > initiative. At our main learning center, all High ESL, ABE and ASE students > experience the program, and have varying levels of participation on it. > Several of our teachers have been using the program with some learners, in > preparation for an expanded use this fall with our general ed, community > based programs (which are mostly part-time evening programs). We intend to > offer learners a blended model (distance learning/ classroom instruction). > Learners who can't attend every class or need to stop out will have support > in continuing with My Skills Tutor. For learners in class, homework > assignments will include using the program, or to add reinforcement and more > time on task for individual needs. I agree with Virginia: we find most > learners can access and stay with the program long enough to achieve some > skill development. Very few students can use it as a stand-alone however, > mostly because they lack independent study skills and self-monitoring. > Staff is discussing if and how we can explicitly instruct in these areas. > > Howard D. > > > Virginia Tardaewether wrote: > > We use skills bank every day here in our basic skills lab. We use it for > credit as well as non credit students. We have the students run their own > tracking sheet rather than the tracking portion of the program (due to our > limitations). In Salem, they have a lab assistant who keeps track of the > students. 
This software works great for basic reading, writing and math. > It also has some science, etc. depending upon what you purchase. Students > engage with it well and will stick to it long enough to gain skills using > the software. Haven't worked enough with PLATO to comment. > > Va > > > ------------------------------ > > *From:* assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] > *On Behalf Of *Gloria Fuentes > *Sent:* Monday, July 31, 2006 3:14 PM > *To:* Assessment at nifl.gov > *Subject:* [Assessment 430] Re: Skills Bank > > > > Is anyone familiar with the SkillsBank software or Plato, if so what do > you think about it for GED preparation? > > -- > Gloria Fuentes > > > > > > ------------------------------ > > > > > ------------------------------- > > National Institute for Literacy > > Assessment mailing list > > Assessment at nifl.gov > > To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment > > > > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > -- Gloria Fuentes -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060803/b55dc256/attachment.html From Mylinh.Nguyen at ed.gov Tue Aug 8 17:00:57 2006 From: Mylinh.Nguyen at ed.gov (Nguyen, My Linh) Date: Tue, 8 Aug 2006 17:00:57 -0400 Subject: [Assessment 462] NIFL Hosts Live Webcast on NAAL Findings for Below Basic & Basic Adults Message-ID: Join the National Institute for Literacy for a LIVE webcast on: Adults with Basic and Below Basic Literacy Levels: Findings from NAAL and Implications for Practice. Featuring Dr. Sheida White, Dr. John Strucker, & Brian Bosworth, and moderated by Lori Aratani WHEN: August 15, 2006 1:30 p.m. - 3:15 p.m. EST The webcast can be viewed from your computer. We encourage you to register in advance. To register for this webcast go to: For more information about this webcast, go to: ---------------------------------------------------------------------------- The National Institute for Literacy is hosting a live webcast on Tuesday, August 15 at 1:30 p.m. to discuss the results of the National Assessement of Adult Literacy (NAAL) 2003, specifically relating to Americans who tested in the Below Basic and Basic literacy levels. The webcast will feature Dr. Sheida White, of the National Center for Education Statistics, who served as project officer for the NAAL, who will present the findings of the NAAL for Below Basic and Basic levels. In addition, there will be a panel of subject-matter experts who will discuss what implications the NAAL findings for Below Basic and Basic adults will have on programs. The panelists include John Strucker, of the National Center for Adult Literacy and Learning, will discuss basic skills; and Brian Bosworth, of the consulting firm FutureWorks, will discuss implications for workforce programs. The live webcast will feature: * Dr. Sheida White directs the National Assessment of Adult Literacy at the National Center for Education Statistics (or NCES). After working as a full-time reading researcher for 6 years, she joined NCES in 1991. During the first 8 years at NCES, she monitored the National Assessment of Education Progress (NAEP). Since 1999, she has been directing the NAAL project. 
Her articles have appeared in journals such as Language in Society and Reading Research Quarterly. * John Strucker, Ed.D., is a Lecturer in Education and Research Associate at the National Center for the Study of Adult Learning and Literacy (NCSALL) at the Harvard Graduate School of Education. He teaches a laboratory practicum course at Harvard, "Developing Reading in Adults and Older Adolescents," and he has been the principal investigator on two large-scale assessment projects, NCSALL's Adult Reading Components Study (ARCS) and the joint NCSALL/ETS Level 1 Study. * Brian Bosworth is the founder and President of FutureWorks, a private consulting and public policy research firm in Belmont, Massachusetts, that builds regional institutions and strategies for economic growth, workforce education, and civic improvement. The webcast will be moderated by Lori Aratani, Education Staff Writer at the Washington Post. My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From marie.cora at hotspurpartners.com Fri Aug 11 12:41:41 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 11 Aug 2006 12:41:41 -0400 Subject: [Assessment 463] Assessment Special Collection launched! Message-ID: <00d301c6bd65$06a7e1d0$2917c047@LITNOW> Dear colleagues, I hope this email finds you well. I'm so happy to announce the launch of the newly revised site of the LINCS Assessment Special Collection. Please go to: http://literacy.kent.edu/Midwest/assessment/ to check it out. The site has on-line resources and materials that are organized based on the roles of people involved in the work, but please do not limit yourself to any one particular role area - many resources will be of interest to you in other areas. In addition, while many resources are cross-posted, many are not, so I encourage you to surf around or do a keyword search at the site. Got a great cyber resource that you don't see in the Assessment Collection and you think it should be there? Definitely let me know about it and I will make sure it gets into the review process for possible addition to the Collection. I'm very interested in resources for use by teachers and tutors in the classroom, self assessment materials for students/learners, and any assessment materials from our international colleagues (I would like to build an international section). Thanks! Let me know what you think!! marie Marie Cora NIFL Assessment Discussion List Moderator marie.cora at hotspurpartners.com Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060811/9eb1f143/attachment.html From JaneS at doe.mass.edu Fri Aug 11 13:48:23 2006 From: JaneS at doe.mass.edu (Schwerdtfeger, Jane) Date: Fri, 11 Aug 2006 13:48:23 -0400 Subject: [Assessment 464] Re: Assessment Special Collection launched! Message-ID: This is wonderful! Thank you so much! Jane Schwerdtfeger Massachusetts Department of Education -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]On Behalf Of Marie Cora Sent: Friday, August 11, 2006 12:42 PM To: Assessment at nifl.gov Subject: [Assessment 463] Assessment Special Collection launched! Dear colleagues, I hope this email finds you well. I'm so happy to announce the launch of the newly revised site of the LINCS Assessment Special Collection.
Please go to: http://literacy.kent.edu/Midwest/assessment/ to check it out. The site has on-line resources and materials that are organized based on the roles of people involved in the work, but please do not limit yourself to any one particular role area - many resources will be of interest to you in other areas. In addition, while many resources are cross-posted, many are not, so I encourage you to surf around or do a keyword search at the site. Got a great cyber resource that you don't see in the Assessment Collection and you think it should be there? Definitely let me know about it and I will make sure it gets into the review process for possible addition to the Collection. I'm very interested in resources for use by teachers and tutors in the classroom, self assessment materials for students/learners, and any assessment materials from our international colleagues (I would like to build an international section). Thanks! Let me know what you think!! marie Marie Cora NIFL Assessment Discussion List Moderator marie.cora at hotspurpartners.com Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ From Mylinh.Nguyen at ed.gov Mon Aug 14 09:52:26 2006 From: Mylinh.Nguyen at ed.gov (Nguyen, My Linh) Date: Mon, 14 Aug 2006 09:52:26 -0400 Subject: [Assessment 465] Reminder NIFL Webcast on NAAL findings for Basic & Below Basic Adults Message-ID: Hi All, Just a reminder to join the National Institute for Literacy for a LIVE webcast on: Adults with Basic and Below Basic Literacy Levels: Findings from NAAL and Implications for Practice. Featuring Dr. Sheida White, Dr. John Strucker, & Brian Bosworth, and moderated by Lori Aratani WHEN: August 15, 2006 1:30 p.m. - 3:15 p.m. EDT We encourage you to register in advance. To register for this webcast go to: For more information about this webcast, go to: ---------------------------------------------------------------------------- The National Institute for Literacy is hosting a live webcast on Tuesday, August 15 at 1:30 p.m. EASTERN TIME to discuss the results of the National Assessment of Adult Literacy (NAAL) 2003, specifically relating to Americans who tested in the Below Basic and Basic literacy levels. The webcast will feature Dr. Sheida White, of the National Center for Education Statistics, who served as project officer for the NAAL and will present the findings of the NAAL for Below Basic and Basic levels. In addition, there will be a panel of subject-matter experts who will discuss what implications the NAAL findings for Below Basic and Basic adults will have on programs. The panelists include John Strucker, of the National Center for the Study of Adult Learning and Literacy, who will discuss basic skills, and Brian Bosworth, of the consulting firm FutureWorks, who will discuss implications for workforce programs. The live webcast will feature: * Dr. Sheida White directs the National Assessment of Adult Literacy at the National Center for Education Statistics (or NCES). After working as a full-time reading researcher for 6 years, she joined NCES in 1991. During the first 8 years at NCES, she monitored the National Assessment of Educational Progress (NAEP). Since 1999, she has been directing the NAAL project. Her articles have appeared in journals such as Language in Society and Reading Research Quarterly. * Dr. John Strucker is a lecturer at the Harvard Graduate School of Education whose research for NCSALL has focused on adult reading development.
He previously taught and assessed adults with reading difficulties at the Community Learning Center in Cambridge, Massachusetts. * Brian Bosworth is the founder and President of FutureWorks, a private consulting and public policy research firm in Belmont, Massachusetts, that builds regional institutions and strategies for economic growth, workforce education, and civic improvement. The webcast will be moderated by Lori Aratani, Education Staff Writer at the Washington Post. Please note: For anyone unable to view the webcast live, the National Institute for Literacy will be archiving this webcast on its website www.nifl.gov approximately one week later. My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From Mylinh.Nguyen at ed.gov Wed Aug 16 08:54:57 2006 From: Mylinh.Nguyen at ed.gov (Nguyen, My Linh) Date: Wed, 16 Aug 2006 08:54:57 -0400 Subject: [Assessment 466] NAAL Webcast to be Archived Message-ID: On behalf of the National Institute for Literacy, thank you to everyone who tuned in to yesterday's live webcast on the results of the NAAL findings for adults who scored in the Basic and Below Basic categories. And thank you to our panelists: Dr. Sheida White of the National Center for Education Statistics, Dr. John Strucker of the National Center for the Study of Adult Learning and Literacy, and Mr. Brian Bosworth of FutureWorks. For those who missed the live webcast or would like to see it again, we will be archiving the webcast on our website www.nifl.gov in about one week. We'll send an announcement when it is ready. Thank you. My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From marie.cora at hotspurpartners.com Fri Aug 18 09:04:30 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 18 Aug 2006 09:04:30 -0400 Subject: [Assessment 467] Workplace Literacy Resources: testing Message-ID: <010401c6c2c6$d831bfd0$0202a8c0@LITNOW> Dear Colleagues: I'm forwarding the following post from Tom Sticht from the Workplace Literacy Discussion List. I thought some of you might find this of interest. Marie Cora Assessment Discussion List Moderator ************************************** Colleagues: Here are some resources that may be of interest in better understanding work readiness and workplace literacy program development and evaluation. One of the problems with work readiness testing is that such tests rarely are validated using predictive validity methods. In this method of validating work readiness tests, groups of people with the full range of skills and knowledge assessed by the tests are hired somewhere, and the correlation between test scores and scores on some criterion or criteria of failure or success in performing the job is computed. The Department of Defense has conducted the most extensive work-readiness testing and predictive validity assessment of any employer. The Armed Services Vocational Aptitude (Achievement) Battery (ASVAB) uses ten tests to predict people's likelihood of being successful in job training and on the job. In over 50 years of research, the DoD has not been able to predict success in numerous jobs with predictive validity correlations of much above .35-.40, which accounts for some 12-16 percent of the variation in job success of employees. More about predictive validity for work-readiness validation and the military can be found in: The Military Experience and Workplace Literacy.
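(A brief illustrative aside on the figures just cited: the "12-16 percent of the variation" follows from squaring the correlation coefficient - the coefficient of determination, r squared, estimates the share of variance in a criterion such as rated job success that a predictor accounts for. A minimal sketch of that arithmetic in Python, using only the .35-.40 range quoted above; nothing else is assumed.)

```python
# Coefficient of determination: the proportion of criterion variance (e.g.,
# job performance ratings) accounted for by a predictor correlated r with it.
for r in (0.35, 0.40):
    r_squared = r ** 2
    print(f"r = {r:.2f} -> r^2 = {r_squared:.2f} ({r_squared:.0%} of variance explained)")
```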
A Review and Synthesis for Policy and Practice http://literacyonline.org/products/ncal/pdf/TR9401.pdf This report reviews extensive research and development on workplace literacy and literacy training programs as interpreted within a conceptual framework of cognitive development and also considers the implications of this research for policy on adult workforce and workplace literacy and provisions for lifelong learning. A second resource for workplace literacy with over seven methods for trying to identify literacy demands of jobs, including extensive predictive validity research is Reading for Working: A functional literacy anthology. This book presents the research that originally led to the formation of the National Workplace Literacy Program. http://www.nald.ca/fulltext/sticht/rfw/cover.htm Table of Contents Chapter 1. Introduction Part I Determining Functional Literacy Demands of Jobs Introduction 2. Readability of Job Materials 3. Performing Job Reading Task 4. Literacy in Relation to Job Knowledge, Job Performance, and Supervisor Ratings 5. Using Personnel Data Files to Estimate Reading Demands of Jobs 6. Commentary on Methodologies for Determining Literacy Demands of Jobs Part II Reducing Discrepancies Between Literacy Skills of Personnel and Literacy Demands of Jobs Introduction 7. Methods for Reducing Literacy Demands of Jobs. 8. Functional Literacy Training: A Case Study Part III Collected Papers on Functional Literacy Introduction 9. Reading and Career Education 10. A Career-Oriented Literacy Training System for the Armed Services 11. Needed: A Functional Literacy Curriculum for the Secondary School A third resource for workplace literacy program design and assessment is: Testing and Accountability in Adult Literacy Education: Focus on Workplace Literacy Resources for Program Design, Assessment, Testing, & Evaluation http://www.nald.ca/fulltext/report4/rep36-40/rep36-01.htm TABLE OF CONTENTS Preface Chapter 1 Knowledge Resources for Designing and Delivering Workplace Literacy Programs Chapter 2 Q & A on the Evaluation of Workplace Literacy Programs Chapter 3 Case Study Using the "DO ED" Approach for Evaluating Workplace Literacy Programs Chapter 4 Testing and Accountability in Adult Literacy Programs in the Workforce Investment Act of 1998 Chapter 5 Determining How Many Adults Are Lacking in Workforce Literacy: The National and International Adult Literacy Surveys Appendix Reviews of Eight Tests Used in ABE & ESL From djgbrian at utk.edu Wed Aug 23 09:10:26 2006 From: djgbrian at utk.edu (Brian, Dr Donna J G) Date: Wed, 23 Aug 2006 09:10:26 -0400 Subject: [Assessment 468] Job Posting: Special Projects Coordinator, Adult Learner Program, Queens Library, NY Message-ID: <6A5CE13D731DE249BC61CB8C5C474B0A13CF123F@UTKFSVS1.utk.tennessee.edu> ------------------------------------------------------- Special Projects Coordinator Queens Library Adult Learner Program This is a temporary grant funded position. The Adult Learner Program (ALP) Special Projects Coordinator is responsible for administration and implementation of ALP's special projects. 
Responsible for development of distance learning instruction via video teleconferencing to increase use of technology throughout the Adult Learner Programs; manages the Wireless Computer Centers; supervises the Basic Computer Literacy and Health Literacy classes; revises and maintains Computer Literacy and Health Literacy curricula; writes all reporting required for grant funded projects; hires and trains staff for special projects; visits classes and evaluates classroom instruction. Performs other duties as required. The schedule for this position will include Saturdays and evenings as required. Requires a Master's Degree in Education or related area, and/or ESOL Certification. Adult Education experience required, at least two years working with literacy and/or ESOL programs. Knowledge of current trends in literacy and ESOL instruction. Must have knowledge of Computer Assisted Instruction such as Internet, educational software, and MS Office Software. Experience in staff /curriculum development and supervision preferred. Excellent written and verbal communication skills. Ability to work with diversified community. Must be able to complete multiple projects with competing deadlines. About Queens Library: Situated in New York City, the Queens Library has one of the highest circulations of any library in the world and serves more than two million people in one of the most ethnically diverse counties in the United States. The Library pulses with the multiculturalism and excitement of life in "the greatest city in the world". Queens County is one of the five boroughs of New York City. Situated across the East River from Manhattan, Queens enjoys 7,000 acres of beautiful parks, 196 miles of waterfront and an excellent mass transit system. Queens has diverse and charming neighborhoods, excellent shopping and a wealth of ethnic eateries and shops reflecting the unique multicultural mosaic that defines Queens. To apply, please send your resume with cover letter to: QUEENS LIBRARY Human Resources Department 89-11 Merrick Boulevard, Jamaica, NY 11432 Fax: 718-658-2919 E-mail: employment at queenslibrary.org The Queens Library is an Equal Opportunity Employer www.queenslibrary.org From akohring at utk.edu Thu Aug 24 17:01:29 2006 From: akohring at utk.edu (Kohring, Aaron M) Date: Thu, 24 Aug 2006 17:01:29 -0400 Subject: [Assessment 469] Content Standards Guest Discussion Next Week - CASAS basic skills Content Standards Project Message-ID: <6A5CE13D731DE249BC61CB8C5C474B0A13F18850@UTKFSVS1.utk.tennessee.edu> Greetings colleagues, Next week, Monday, August 28 thru Friday, September 1, the Content Standards Discussion List will be hosting a guest discussion on the CASAS Basic Skills Content Standards Project. Our guests will be Jane Eguez, Jim Harrison, and Linda Taylor from CASAS. Please read the introductory information below which includes a link to the CASAS website to help prepare you for the discussion. To participate, sign up for the list at: http://www.nifl.gov/mailman/listinfo/Contentstandards Aaron Aaron Kohring Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) ************************************ Since its inception, CASAS (Comprehensive Adult Student Assessment System) has focused on teaching and assessing basic skills in contexts that are relevant and important to adult learners. 
CASAS has developed and continues to refine a highly formalized hierarchy of competencies, the application of basic skills that adults need to be fully functional and productive members of society. In the past few years, at the request of the CASAS National Consortium, representing approximately 30 states, CASAS has begun development of basic skills content standards as a formal part of the CASAS system. This enhancement of the CASAS system is intended to assist and encourage teachers to more fully integrate basic skills content standards and functional competencies in instruction. The basic skills content standards for Reading and Listening contain simple, clearly stated, detailed statements that are leveled according to the NRS Educational Functioning Levels, and are also related to CASAS test items in several CASAS test series. The statements are divided into Categories to assist teachers to navigate through the standards. In the past two years, CASAS has worked with Iowa and California to pilot these standards with teachers in a variety of adult education programs. A number of useful teacher worksheets and other tools have emerged from these efforts. We invite you to learn more about the CASAS basic skills Content Standards Project and to ask questions about it during the listserv discussion next week. To prepare for this discussion, we refer you to the CASAS website where you will find more detailed information about the development of the standards, the standards themselves, worksheets for teachers, and information about the pilot project in Iowa. Go to http://www.casas.org/DirctDwnlds.cfm?mfile_id=4504&selected_id=1720&wtar get=body We look forward to engaging with you in this discussion next week. Jane Eguez (jeguez at casas.org) , Jim Harrison (jharrison at casas.org ) and Linda Taylor (ltaylor at casas.org ), CASAS From marie.cora at hotspurpartners.com Tue Aug 29 10:51:52 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 29 Aug 2006 10:51:52 -0400 Subject: [Assessment 470] Family Literacy List Guest Discussion Message-ID: <00c401c6cb7a$aa8eed20$0202a8c0@LITNOW> Dear Colleagues, The following announcement comes from Gail Price, Moderator of the Family Literacy Discussion List. To participate in the discussion, sign up for the Family Literacy Discussion List at http://www.nifl.gov/mailman/listinfo/familyliteracy Marie Cora Assessment Discussion List Moderator ************ On September 11th through the 13th, Cyndy Colletti, Literacy Program Manager at the Illinois State Library, will join the Family Literacy Discussion List as a guest speaker/discussion leader. Cyndy's topic will be "Implementing Interactive Parent Child Activities"-- a topic of much interest to those working with families. Cyndy's biography is given below. Before she begins her discussion on September 11, I will post some questions for your consideration. They will be the questions that will guide Cyndy's discussion. We look forward to having Cyndy join us and know that you will make her time with us rewarding and valuable by responding to her comments and questions. I will remind you of this discussion again as we get closer to the date. Read on for Cyndy's biography. Cyndy Colletti, currently the Literacy Program Manager at the Illinois State Library (ISL), worked as the Family Literacy Coordinator at ISL for nine years. 
In that position, she was responsible for comprehensive grant administration including developing and implementing the Family Literacy Grant Program, a comprehensive five component program including library services as the fifth component. The Illinois State Library has consistently funded between 40 and 55 family literacy projects annually since 1991. She has worked cooperatively with the practitioners in Illinois to develop programmatic resources for the Family Literacy projects such as parent-child activities (The Story Kits, online at http://leep.lis.uiuc.edu/publish/ccollett/storykit/sitemap.html are an example.) and workshops on other issues vital to family literacy. She has a master's degree from the University of Illinois and more than 20 years experience in the field of adult education and literacy and social service. Her current responsibility as Literacy Program Manager includes grants management and facilitating the effectiveness of program implementation on the local level by providing resource materials, training and support for local adult education and family literacy providers throughout Illinois. Gail J. Price Multimedia Specialist National Center for Family Literacy 325 West Main Street, Suite 300 Louisville, KY 40205 Phone: 502 584-1133, ext. 112 Fax: 502 584-0172 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060829/b102f59d/attachment.html From marie.cora at hotspurpartners.com Tue Aug 29 10:57:40 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 29 Aug 2006 10:57:40 -0400 Subject: [Assessment 471] Guest discussion - CASAS Basic Skills Content Standards Project Message-ID: <00c901c6cb7b$7a388c20$0202a8c0@LITNOW> Dear colleagues, Please note that the CASAS discussion began yesterday - apologies for not sending you this reminder earlier. If you would like to see any of yesterday's posts, please check out the archives at http://www.nifl.gov/mailman/listinfo/Contentstandards and click on Read Current Posted Messages. Marie Cora Assessment Discussion List Moderator ************ Greetings all, Today begins our guest discussion on the CASAS Basic Skills Content Standards Project. Please welcome our guests Jane Eguez, Jim Harrison, and Linda Taylor from CASAS. I know our guests will attempt to answer your questions in a timely manner, but as always, remember this list represents colleagues in multiple time zones across the U.S. as well as International subscribers- so we'll work together to make this a lively exchange. I am re-posting the introductory information below for those who may need it. If you wish to forward this message to others who are not currently subscribed, they can participate by signing up for the list at: http://www.nifl.gov/mailman/listinfo/Contentstandards Aaron Aaron Kohring Moderator, National Institute for Literacy's Content Standards Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) ************************************ Since its inception, CASAS (Comprehensive Adult Student Assessment System) has focused on teaching and assessing basic skills in contexts that are relevant and important to adult learners. CASAS has developed and continues to refine a highly formalized hierarchy of competencies, the application of basic skills that adults need to be fully functional and productive members of society. 
In the past few years, at the request of the CASAS National Consortium, representing approximately 30 states, CASAS has begun development of basic skills content standards as a formal part of the CASAS system. This enhancement of the CASAS system is intended to assist and encourage teachers to more fully integrate basic skills content standards and functional competencies in instruction. The basic skills content standards for Reading and Listening contain simple, clearly stated, detailed statements that are leveled according to the NRS Educational Functioning Levels, and are also related to CASAS test items in several CASAS test series. The statements are divided into Categories to assist teachers to navigate through the standards. In the past two years, CASAS has worked with Iowa and California to pilot these standards with teachers in a variety of adult education programs. A number of useful teacher worksheets and other tools have emerged from these efforts. We invite you to learn more about the CASAS basic skills Content Standards Project and to ask questions about it during the listserv discussion next week. To prepare for this discussion, we refer you to the CASAS website where you will find more detailed information about the development of the standards, the standards themselves, worksheets for teachers, and information about the pilot project in Iowa. Go to http://www.casas.org/DirctDwnlds.cfm?mfile_id=4504&selected_id=1720&wtar get=body We look forward to engaging with you in this discussion next week. Jane Eguez (jeguez at casas.org), Jim Harrison (jharrison at casas.org ) and Linda Taylor (ltaylor at casas.org ), CASAS From kabeall at comcast.net Wed Aug 30 07:58:20 2006 From: kabeall at comcast.net (Kaye Beall) Date: Wed, 30 Aug 2006 07:58:20 -0400 Subject: [Assessment 471] New from NCSALL--NCSALL by Role Message-ID: <006801c6cc2b$97280440$0202a8c0@your4105e587b6> NCSALL by Role This new section of NCSALL's Web site offers a variety of professional development ideas on: * adult multiple intelligences * adult student persistence * authentic context * General Educational Development (GED) * reading Professional developers and program administrators access guides for facilitating half-day seminars and multi-session study circles. Policymakers read relevant research articles and reflect on policy-related questions. Teachers and tutors access self-studies that invite them to (1) read the related research, (2) reflect on this research and their practice, and (3) focus on an aspect of their practice. Check out NCSALL by Role at http://www.ncsall.net/?id=787. The reading topic offers ideas for accessing and using the Assessment Strategies and Reading Profiles Web site. It includes assessment tools. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060830/ccbda9b1/attachment.html From Mylinh.Nguyen at ed.gov Wed Aug 30 13:17:30 2006 From: Mylinh.Nguyen at ed.gov (Nguyen, My Linh) Date: Wed, 30 Aug 2006 13:17:30 -0400 Subject: [Assessment 472] NIFL Webcast Now Available in Archive Message-ID: Hi Everyone, The National Institute for Literacy has now made available an archived version of its latest webcast: "Adults with Basic and Below Basic Literacy Levels: Findings from NAAL and Implications for Practice" from August 15, 2006. We have made the entire webcast available (include transcript and slides for download) on our website at www.nifl.gov. Look under the heading "What's New." Thank you. My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From marie.cora at hotspurpartners.com Thu Aug 31 09:12:41 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 31 Aug 2006 09:12:41 -0400 Subject: [Assessment 473] International Community Virtual Visit Project Message-ID: <016301c6ccff$2401eab0$0202a8c0@LITNOW> Colleagues, The following post is from David Rosen. Marie Cora Assessment Discussion List Moderator *********************** The International Classroom and School Virtual Visit (Virtual School) project is beginning its eighth year, linking classrooms across the world to enable students to meet each other virtually, share information about their cultures, their classrooms, and their communities, and to build cultural understanding. Classes can include English as a Second or Other Language (ESOL/ESL), Adult Basic Education (ABE, GED), elementary or secondary education, or family literacy. Students can be from age seven to adult. As in past years, we hope classes will engage in lively written discussion, and possibly choose a film, book or current event to discuss. This year we have set up a free wiki, so classes don't have to create their own web pages, and we will help teachers to use free Internet telephony so their classes can talk to each other if they can find a time that works to do that. If you would like to participate in this year's project, 1. Sign up on the I.C.V.V. e-list by going to: http://lists.literacytent.org/mailman/listinfo/icvv Scroll down the page to choose an ID and password. That's it, easy and free. 2. Once you receive confirmation that you are on the I.C.V.V. e-list, send an e-mail to: icvv at lists.literacytent.org indicating your interest in participating this year. Be sure to describe your class, when it will begin, and what age group or nationality you would prefer to partner with. If you would like to look at classroom virtual visit projects from previous years go to: http://www.otan.us/webfarm/emailproject/school.htm and then choose http://www.otan.us/webfarm/emailproject/school2003.htm We look forward to your joining the project. Let one of us know if you have questions. And please pass this information on -- by e-mail or electronic list -- to teachers who you think might be interested. All the best, David J. 
Rosen djrosen at comcast.net Susan Gaer sgaer at yahoo.com From marie.cora at hotspurpartners.com Thu Aug 31 10:27:31 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 31 Aug 2006 10:27:31 -0400 Subject: [Assessment 474] Special Topic: Formative Assessment in International Education Message-ID: <018b01c6cd09$98608560$0202a8c0@LITNOW> Dear List Members: The following post is from David Rosen, Moderator of the Special Topics Discussion List. I highly encourage you to join this discussion as it relates directly to our interests here on this List. The Discussion is brief - 3 days - and it is my hope that we can learn a lot from Janet Looney and carry on the discussion here on this List afterwards. I've no doubt that many of you will have comments and questions regarding formative assessment, and how this fits into our assessment landscape on the program, state, and national levels. Please subscribe to this discussion at the Special Topics Discussion List at http://www.nifl.gov/mailman/listinfo/SpecialTopics Looking forward to seeing you there! Marie Cora Assessment Discussion List Moderator ------- Colleagues, In preparation for celebrating International Literacy Day, on September 5th-7th the Special Topics Discussion List is pleased to welcome Ms. Janet Looney representing the Organisation for Economic Co-operation and Development (OECD). Janet is the leader of the Centre for Educational Research and Innovation program known as What Works in Innovation in Education. Its current focus is formative assessment. The discussion will serve to introduce some of OECD's work in international education. The primary focus of the discussion will be on the value of formative assessment for promoting higher levels of learner achievement, greater equity of outcomes, and the development of "learning to learn" skills. Not a term widely known in the U.S., formative assessment refers to what teachers and learners do in the classroom to assess learning progress. An assessment is _formative_ when information gathered in the assessment process is used to modify teaching and learning activities. It's an assessment _for_ learning not just _of_ learning. Between 2002 and 2004, the What Works program explored formative assessment in lower secondary classrooms in eight international systems [see Formative Assessment: Improving Learning in Secondary Classrooms (2005)]. OECD will publish a second study addressing formative assessment for adult basic skill learners in 2007. Together, the two studies are intended to strengthen understanding of effective approaches to lifelong learning. FORMATIVE ASSESSMENT IN LOWER SECONDARY SCHOOLS While many teachers incorporate aspects of formative assessment into their teaching, it is not often practiced systematically. The What Works study, Formative Assessment: Improving Learning in Secondary Classrooms (2005), features exemplary cases from secondary schools in eight systems and international research reviews, and relates these to the broader policy environment. The study shows how teachers have addressed barriers to systematic practice, and how school and policy leaders may apply the principles of formative assessment to promote constructive cultures of assessment and evaluation throughout education systems. FORMATIVE ASSESSMENT IN ADULT BASIC SKILL PROGRAMS Formative approaches may be particularly appropriate for adults with basic skill needs, the focus of the current What Works study.
Instructors using formative approaches are able to tailor instruction more closely to the needs of diverse adult learners. Formative approaches also place an explicit focus on identifying and building upon learners' prior knowledge and skills - whether gained in formal education settings, or informal work or other settings. The OECD study on "Improving Teaching and Learning for Adults with Basic Skill Needs through Formative Assessment" , now underway, is: 1. Developing studies of exemplary teaching and assessment practice for adults with basic skill needs 2. Bringing together international scholarship on teaching and assessment for adults with basic skill needs 3. Identifying effective policy levers for improving the quality of provision in the adult basic skills sector, and 4. Creating opportunities for policy officials, researchers and practitioners to exchange insights and ideas on promoting effective teaching, assessment and evaluation. We look forward to your subscribing to this three-day discussion. To do so, go to: http://www.nifl.gov/mailman/listinfo/SpecialTopics David J. Rosen Special Topics Discussion List Moderator djrosen at comcast.net _______________________________________________ From marie.cora at hotspurpartners.com Sat Sep 2 16:26:36 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sat, 2 Sep 2006 16:26:36 -0400 Subject: [Assessment 475] Webcast casts web of doubt Message-ID: <003801c6cece$172f79d0$0202a8c0@LITNOW> Colleagues: The following post is from Tom Sticht. I wonder if subscribers have any questions or comments regarding either the NAAL Webcast, or Tom's discussion of the webcast? Let us hear your thoughts and questions. You can view the webcast at the archives at www.nifl.gov - click on "What's New". Marie Cora Assessment Discussion List Moderator ********** August 31, 2003 National Institute for Literacy Webcast Casts Web of Doubt About Commitment to Adult Literacy Education Tom Sticht International Consultant in Adult Education On 15 August 2006 the National Institute for Literacy (NIFL) presented a webcast about the National Assessment of Adult Literacy (NAAL) of 2003. Entitled "Adults with Basic and Below Basic Literacy Levels: Findings from NAAL and Implications for Practice" the webcast focussed on the adults who scored at the lowest two levels of the NAAL, those in the Basic and Below Basic levels, and implications for reading instruction and workforce development. Unfortunately, as far as I can discern, the webcast presented nothing of any substance for policy or practice for adult literacy/numeracy education nor for workforce development. In fact, it presented a number of statements about education, literacy, and workforce development of dubious validity. Here are some of these statements. 1. Sheida White from the National Center for Education Statistics, which sponsored the NAAL, made the statement that "Nearly two-thirds, which is actually 67% of all the jobs created over the next decade, will require a college degree." But in the Statistics and Facts section of the NIFL web site it is claimed that 69.8% percent of job openings from 2000 through 2010 will NOT require college but only some sort of work-related training, 57% of which will be short or moderate term training. Other data from the Department of Labor indicates that in 1998 78% of jobs required non-college levels of education while in 2008 76% will be non-college jobs. 
A 2006 report from the Educational Testing Service (ETS) by Paul Barton also raises questions about the education levels required by jobs. He presents data showing that the 44 occupations that account for half of the 26 million average annual job openings during 2001-2012 require only short-term education or on-the-job training, not post-secondary education, and 25 of the 44 occupations have workforces with 50 percent or more having high school or less education. Clearly, there is reason to question the claim that two-thirds of all jobs created over the next decade will require a college degree. 2. John Strucker called attention to the well known gaps in performance on the tests among whites, blacks, and Hispanics, but he had nothing to say about what to do about the situation other than we need to do something. He talked about age and literacy and focussed on the problems of younger adults, but he did not comment on the fact that the NAAL may not be valid across the lifespan, especially for older adults, as other research has suggested. He also commented on the fact that quantitative literacy (numeracy) had larger percentages of adults in the lower two levels than on the prose or document literacy scales. He also mentioned that there could be real problems with decoding, vocabulary, and fluency for adults in the Below Basic and Basic levels, and there could be large numbers of learning disabilities in these groups. But there was really nothing that I read that lead to insights regarding how teachers or programs should go about changing their adult reading, numeracy, or English language instruction. 3. Brian Bosworth simply repeated the oft stated notions that low literacy can consign workers to low paying jobs and reduce America's global competitiveness. He made a plea for a demand side approach to skills development that seemed very much like a call for a return to workplace literacy programs in which employers and employees determine their skills needs and work together to design and deliver instruction. Again, however, there was nothing that I read that produced solid evidence of how workers with low skills actually perform important job tasks in specific jobs or what returns to investment in workforce education business, workers, or the rest of the nation might experience if investments in worker literacy or numeracy education were increased. It would be useful if the NIFL or some other government agency would support this type of research. Noticeably missing from the presentation was a discussion of just how arbitrary the whole enterprise of literacy assessment in the NAAL was, including the naming of levels as Below Basic or Basic (instead of Below Average and Average for instance). There was also no discussion of how Prose, Document, and Quantitative literacy might "add up" across the three scales to form a person's total literacy ability. Nor was there any discussion of the very large differences between what the test developers said about adults' reading and math skills based on the standardized tests and what adults have said about their own skills as they perceive their adequacy to be for work and daily life. Some 95 percent of adults in the NALS thought their skills met their needs and the recent international Adult Literacy and Life Skills (ALL) report developed a methodology for examining the mismatch between workers skills and their job demands for these skills. 
The report said that 80 percent of adults had literacy and numeracy skills that matched or exceeded their job demands, while 20 percent were working in jobs with demands that exceed their skills in these areas. These huge differences between tested and self-perceived skills should receive considerable study because it is adults' self-perceptions of their skill needs that will eventually move them to seek help in upgrading their skills. Perhaps a future webcast can address some of these serious issues in determining the scale of need for and the desire for adult literacy education. Brian Bosworth said "I think that it's unlikely that we are going to see a significant change and reform from the federal level to deal with most of these workplace literacy issues." I think this was probably the most significant policy- and practice-related statement in the entire webcast. It has been clear since the NALS of 1993 in which 90 million (47%) of adults were said to lack the skills needed to cope with contemporary society, including the world of work, that the federal government that produced this result did not actually believe it. For three years after that report the federal budget for the Adult Education and Literacy System went down. After that it rose for a while, but stayed at a pitiful level in which per adult enrollee funds equaled about $200. After the 2003 NAAL which indicated that over 93 million adults possessed only Basic or Below Basic prose literacy, the present administration (1) asked for a cut in funding for adult literacy education from around $575 million to $200 million and a complete drop of funds for the Head Start family literacy program; (2) formed an interagency committee to coordinate their work; and (3) the committee met in early 2006 and it was reported that the meeting went well and it would meet again later on. At no time in the Bush administration has it called for more funding for the Adult Education and Literacy System, even while repeatedly making dire warnings of impending disasters in global competition and the American economy due to the poor literacy skills of the workforce. Perhaps the webcast about the NAAL, reading instruction, and the workforce will have some positive effects on some aspect of adult literacy education. Clearinghouses, committees, meetings, and web discussions may possibly be useful in meeting the need for adult literacy education in the Nation. But there is nothing like a large infusion of money into an obscenely under-funded education system to move the Nation forward. So far, for me, the NIFL NAAL webcast has reinforced a web of doubt about the federal government's sincerity and commitment to providing the funds needed to move the Adult Education and Literacy System from the margins to the mainstream of education in the United States. Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd. El Cajon, CA 92019 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net From marie.cora at hotspurpartners.com Sat Sep 2 16:55:01 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sat, 2 Sep 2006 16:55:01 -0400 Subject: [Assessment 476] FW: [ContentStandards 246] Re: Teaching beyond the GED? Message-ID: <003901c6ced2$0f89c920$0202a8c0@LITNOW> Colleagues, I am forwarding an excerpt from the discussion that was held last week on the Content Standards Discussion List. The context of this discussion was to examine the CASAS basic skills content standards. 
Toward the end of this discussion, some attention was turned toward the GED and I have posted below 3 emails, beginning with Ajit's email at the bottom. Ajit discusses the relationship between the NRS and the CASAS standards, noting that: "the GED test which is a goal/outcome for many learners is attainable to those functioning at lower NRS levels i.e. the ABE High Intermediate and the Low Adult Secondary levels. Many of these learners squeak through the GED with minimal pass scores but then face challenges with postsecondary entrance/success. I would anticipate that students with higher abilities (i.e. the High Adult Secondary level - CASAS scale 246 and higher) are even better prepared to enter/succeed in postsecondary situations." Donna's and Aaron's subsequent remarks and questions raise important issues for us, and I am extremely interested in hearing your thoughts and responses (and further questions). The next post that I will forward is a response by David Rosen in which he suggests that there are mainly 3 reasons why students pursue a GED, and that programs should strive to serve all 3 of these purposes. Look for this next post, and please let us know what your comments are. What do you think about this? Do you find yourself or your program in this situation? Do you feel that structures such as the one David proposes would address some of the challenges that we face and that have been raised by the comments Ajit and Donna have made? What are your thoughts? Marie Cora Assessment Discussion List Moderator *********** Donna, You raise some interesting points about the GED and whether the focus of instruction and assessment should go beyond preparation for the GED. It sounds like you believe that metacognitive skills - reasoning/thinking/analyzing skills - are also very important. I know there is some level of tension in the field when you have learners and/or programs stressing achievement of the GED in as short a time as possible as the ultimate goal vs. "preparing the adult learner for today's economy" as you have suggested. What do others think about this? What is our role as instructors? Aaron _____ From: contentstandards-bounces at nifl.gov [mailto:contentstandards-bounces at nifl.gov] On Behalf Of Donna Chambers Sent: Thursday, August 31, 2006 8:53 AM To: 'The Adult Education Content Standards Discussion List' Subject: [ContentStandards 239] Re: Questions on CASASBasicSkillsContentStandards Project Ajit, Thanks for your very thoughtful response. You and Jane both clarified your comment. I see that states who are dealing with adult high school completion programs should also look at the content standards that must be measured for K12 requirements for the individual state. These may be in addition to CASAS standards. As you mentioned, the GED as a goal/outcome for many learners can be achieved without the learner having mastered skills that are measured by the Level D CASAS assessment. This poses a concern when the individual learner's need is to become more gainfully employed and go to college or any other postsecondary training. Must we then prepare the individual to go beyond the GED? This may mean that the focus of instruction and assessment should be on reasoning/thinking/analyzing skills so that the learner understands concepts such as in math, not just manipulating formulas. The quandary arises from the definition of "basic skills". My work in adult education has always led me to focus on what adults need to know and be able to do to survive.
What math, reading and writing skills must a learner need to adequately function as a parent, citizen and worker? However, my recent work has required that I look closer at what adults need to know and be able to do and this closer look changes the picture somewhat. The list of skills I would have come up with five years ago, today becomes the very basic skills. What adults need to know today goes beyond these basics. When the question becomes "What does an adult need to know in order to pass a test that the employer requires or the Accuplacer Test in order to move into credit bearing college classes, etc?" the list changes. Why does an adult GED student need to know how to demonstrate the symbolic manipulation of polynomial expressions or analyze properties of three dimensional geometric shapes when they can pass the GED without knowing this? The answer is simple, even if we know that all students are not going to college. Because developing these concepts helps a student develop necessary reasoning/thinking skills and positions the student to advance in his/her education if they so choose. Looking at and working toward this big picture better prepares students for success as they exit our programs. My work in RI and Massachusetts has caused me to look closely at current K12 standards and align these standards with ABE/ASE instruction since both states require competency determination in the K12 standards to earn a high school diploma. Rather than focusing on the lower levels to move forward, we as instructors are looking at the whole picture. What understanding of number sense must the student have from the beginning level that will prepare that student to understand the number sense concepts at the higher level? We are looking across all levels in introducing content standards that begin to develop good thinking skills and integrating all the content areas. This does not necessarily change the content standards, but does require that we look at the instruction differently. How can we integrate the instruction to assure that concepts are learned in a way that can be applied to any life and/or academic situation. If we see our job as preparing the adult learner for today's economy, we must consider all students at every level capable of developing the thinking skills necessary to meet whatever goal they want to achieve. Thanks, Donna Chambers ----- Original Message ----- From: Ajit Gopalakrishnan To: 'The Adult Education Content Standards Discussion List' Sent: Tuesday, August 29, 2006 11:09 PM Subject: [ContentStandards 228] Re: Questions on CASASBasicSkillsContentStandards Project Hi Donna, It is nice to hear from you. I can see how my email would have led to your question. I accidentally hit the send button before I had fully finished composing my email! I said: >>I would imagine that many states will adopt them while others may need to reference 9-12 high school standards especially for their adult high school diploma programs. I would have added "also" after "may" in the above sentence. I meant to say that reference to 9-12 high school standards may also be necessary in addition to the basic skill content standards. The basic skills of reading, writing, math, listening, and speaking are the focus of most adult education efforts. In addition to helping learners improve their basic skills, the adult credit diploma programs also help learners to earn high school credits toward graduation. The curricula in these programs tend to mirror that of the regular high school. 
Therefore, in addition to the basic skill content standards, adult credit diploma programs may be expected to have additional content standards in areas like science, social studies, arts (visual/performing), world languages, etc. I hope this clarifies my comment. The content standards that CASAS is developing most definitely address the expectations for secondary levels functioning with respect to the basic skills. CASAS assessments also measure student abilities well into the adult secondary levels. The High Adult Secondary NRS level for reading and math begins at 246 on the CASAS scale. Level D CASAS assessments measure student performance into the high 250s (may be even a little higher). As an aside, the GED test which is a goal/outcome for many learners is attainable to those functioning at lower NRS levels i.e. the ABE High Intermediate and the Low Adult Secondary levels. Many of these learners squeak through the GED with minimal pass scores but then face challenges with postsecondary entrance/success. I would anticipate that students with higher abilities (i.e. the High Adult Secondary level - CASAS scale 246 and higher) are even better prepared to enter/succeed in postsecondary situations. You raise a whole other topic with high school exit testing. It raises questions about: (i) which of the standards are measured on these exit tests (i.e. just basic skills or also science, social studies, etc.); (ii) how they are measured (e.g. selected response versus constructed response; problem solving-applied performance focus versus non-contextual/abstract academic subcomponent focus); and (iii) the level of mastery that is expected (10th, 11th, or 12th or even 9th grade standards). Ajit Ajit Gopalakrishnan -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060902/34c38836/attachment.html From marie.cora at hotspurpartners.com Sat Sep 2 16:56:31 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sat, 2 Sep 2006 16:56:31 -0400 Subject: [Assessment 477] FW: [ContentStandards 247] Re: Teaching beyond the GED? Message-ID: <003e01c6ced2$4530fc10$0202a8c0@LITNOW> Here is the post from David Rosen - let us hear what you think. Marie Marie Cora NIFL Assessment Discussion List Moderator marie.cora at hotspurpartners.com Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ ********** Hello Aaron and others, On Sep 1, 2006, at 9:53 AM, you wrote: > Donna, > > You raise some interesting points about the GED and whether the > focus of instruction and assessment should go beyond preparation > for the GED. It sounds like you believe that metacognitive skills > - reasoning/thinking/analyzing skills- are also very important. I > know there is some level of tension in the field when you have > learners and/or programs stressing achievement of the GED in as > short a time as possible as the ultimate goal vs. "preparing the > adult learner for today's economy" as you have suggested. What do > others think about this? What is our role as instructors? I have thought about this recurrent question and propose the following: The key is for each learner, in many cases with the help of a teacher or counsel, to examine what "I want my GED" means. " _Why_ do you want to get a GED (or ADP or EDP) ?" "If you had it, what would you hope it would do for you?" The answers will mostly fall in three categories: 1. GED as a terminal diploma or certificate . 
I want to hold my head up as a holder of a GED or adult high school diploma. Personal pride and satisfaction. Not so much for my job or my career. or . I have to have a GED or h.s. diploma to keep my job. I need this as soon as possible. or . I need a job now. I can't get one because I don't have a high school diploma. I think, with a GED I can get a job. I don't care if it's a low-paying job. I need money as soon as possible. 2. GED as a key to entering Post-Secondary education . I want a good job, one that will enable me (and my family) to be self-sufficient. I understand that the GED is not enough, that I have to get at least a year of college, too, but the GED is needed first. . I want to succeed in college, I understand that a GED may be enough to get in, but I want to take regular, not developmental study courses so I want to be prepared to do academic work in college, and in other ways to be prepared for college before I enroll. 3. Limbo . I don't know. I really don't. I was told to come here by my social worker (parole officer, mother....) Category 1 folks are "true GED" people. Category 2 folks are college prep people. They need a GED or h.s. diploma _and_ transition to college/college prep work. Category 3 people may or may not belong in an adult education program. For example, some out-of-school youth programs are designed help young adults get motivated. A high quality, seamless adult education system (not necessarily every program in the system) should offer all three options, and the screening process should be such that students get referred to the right option for their goal, so that "fast track for employment: GED students get 1, GED for increased lifetime earning folks get 2, and those who need motivation and counseling, and maybe a stimulating program of education with try-out work, get 3. David J. Rosen djrosen at comcast.net From marie.cora at hotspurpartners.com Mon Sep 4 17:03:06 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 4 Sep 2006 17:03:06 -0400 Subject: [Assessment 478] Correction to Webcast casts web of doubt from Tom Sticht Message-ID: <002f01c6d065$85799590$0202a8c0@LITNOW> Colleagues, Please note that there was an error in Tom's previous post. Tom writes: "I said the present administration called for eliminating Head Start when I meant Even Start." Following is a corrected version of his email. I invite your comments, suggestions, and questions - let's hear from others on this topic. Thanks, Marie Cora Assessment Discussion List Moderator ******************* August 31, 2003 National Institute for Literacy Webcast Casts Web of Doubt About Commitment to Adult Literacy Education Tom Sticht International Consultant in Adult Education On 15 August 2006 the National Institute for Literacy (NIFL) presented a webcast about the National Assessment of Adult Literacy (NAAL) of 2003. Entitled "Adults with Basic and Below Basic Literacy Levels: Findings from NAAL and Implications for Practice" the webcast focussed on the adults who scored at the lowest two levels of the NAAL, those in the Basic and Below Basic levels, and implications for reading instruction and workforce development. Unfortunately, as far as I can discern, the webcast presented nothing of any substance for policy or practice for adult literacy/numeracy education nor for workforce development. In fact, it presented a number of statements about education, literacy, and workforce development of dubious validity. Here are some of these statements. 1. 
Sheida White from the National Center for Education Statistics, which sponsored the NAAL, made the statement that "Nearly two-thirds, which is actually 67% of all the jobs created over the next decade, will require a college degree." But in the Statistics and Facts section of the NIFL web site it is claimed that 69.8% percent of job openings from 2000 through 2010 will NOT require college but only some sort of work-related training, 57% of which will be short or moderate term training. Other data from the Department of Labor indicates that in 1998 78% of jobs required non-college levels of education while in 2008 76% will be non-college jobs. A 2006 report from the Educational Testing Service (ETS) by Paul Barton also raises questions about the education levels required by jobs. He presents data showing that the 44 occupations that account for half of the 26 million average annual job openings during 2001-2012 require only short-term education or on-the-job training, not post-secondary education, and 25 of the 44 occupations have workforces with 50 percent or more having high school or less education. Clearly, there is reason to question the claim that two-thirds of all jobs created over the next decade will require a college degree. 2. John Strucker called attention to the well known gaps in performance on the tests among whites, blacks, and Hispanics, but he had nothing to say about what to do about the situation other than we need to do something. He talked about age and literacy and focussed on the problems of younger adults, but he did not comment on the fact that the NAAL may not be valid across the lifespan, especially for older adults, as other research has suggested. He also commented on the fact that quantitative literacy (numeracy) had larger percentages of adults in the lower two levels than on the prose or document literacy scales. He also mentioned that there could be real problems with decoding, vocabulary, and fluency for adults in the Below Basic and Basic levels, and there could be large numbers of learning disabilities in these groups. But there was really nothing that I read that lead to insights regarding how teachers or programs should go about changing their adult reading, numeracy, or English language instruction. 3. Brian Bosworth simply repeated the oft stated notions that low literacy can consign workers to low paying jobs and reduce America's global competitiveness. He made a plea for a demand side approach to skills development that seemed very much like a call for a return to workplace literacy programs in which employers and employees determine their skills needs and work together to design and deliver instruction. Again, however, there was nothing that I read that produced solid evidence of how workers with low skills actually perform important job tasks in specific jobs or what returns to investment in workforce education business, workers, or the rest of the nation might experience if investments in worker literacy or numeracy education were increased. It would be useful if the NIFL or some other government agency would support this type of research. Noticeably missing from the presentation was a discussion of just how arbitrary the whole enterprise of literacy assessment in the NAAL was, including the naming of levels as Below Basic or Basic (instead of Below Average and Average for instance). There was also no discussion of how Prose, Document, and Quantitative literacy might "add up" across the three scales to form a person's total literacy ability. 
Nor was there any discussion of the very large differences between what the test developers said about adults' reading and math skills based on the standardized tests and what adults have said about their own skills as they perceive their adequacy to be for work and daily life. Some 95 percent of adults in the NALS thought their skills met their needs and the recent international Adult Literacy and Life Skills (ALL) report developed a methodology for examining the mismatch between workers skills and their job demands for these skills. The report said that 80 percent of adults had literacy and numeracy skills that matched or exceeded their job demands, while 20 percent were working in jobs with demands that exceed their skills in these areas. These huge differences between tested and self-perceived skills should receive considerable study because it is adults' self-perceptions of their skill needs that will eventually move them to seek help in upgrading their skills. Perhaps a future webcast can address some of these serious issues in determining the scale of need for and the desire for adult literacy education. Brian Bosworth said "I think that it's unlikely that we are going to see a significant change and reform from the federal level to deal with most of these workplace literacy issues." I think this was probably the most significant policy- and practice-related statement in the entire webcast. It has been clear since the NALS of 1993 in which 90 million (47%) of adults were said to lack the skills needed to cope with contemporary society, including the world of work, that the federal government that produced this result did not actually believe it. For three years after that report the federal budget for the Adult Education and Literacy System went down. After that it rose for a while, but stayed at a pitiful level in which per adult enrollee funds equaled about $200. After the 2003 NAAL which indicated that over 93 million adults possessed only Basic or Below Basic prose literacy, the present administration (1) asked for a cut in funding for adult literacy education from around $575 million to $200 million and a complete drop of funds for the Even Start family literacy program; (2) formed an interagency committee to coordinate their work; and (3) the committee met in early 2006 and it was reported that the meeting went well and it would meet again later on. At no time in the Bush administration has it called for more funding for the Adult Education and Literacy System, even while repeatedly making dire warnings of impending disasters in global competition and the American economy due to the poor literacy skills of the workforce. Perhaps the webcast about the NAAL, reading instruction, and the workforce will have some positive effects on some aspect of adult literacy education. Clearinghouses, committees, meetings, and web discussions may possibly be useful in meeting the need for adult literacy education in the Nation. But there is nothing like a large infusion of money into an obscenely under-funded education system to move the Nation forward. So far, for me, the NIFL NAAL webcast has reinforced a web of doubt about the federal government's sincerity and commitment to providing the funds needed to move the Adult Education and Literacy System from the margins to the mainstream of education in the United States. Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd. 
El Cajon, CA 92019 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net From Mylinh.Nguyen at ed.gov Tue Sep 5 09:44:42 2006 From: Mylinh.Nguyen at ed.gov (Nguyen, My Linh) Date: Tue, 5 Sep 2006 09:44:42 -0400 Subject: [Assessment 479] National Institute for Literacy Launches International Perspective Webpages Message-ID: Just in time for International Literacy Day on September 8, the National Institute for Literacy has launched a series of webpages on its website (www.nifl.gov) to highlight worldwide efforts to address and combat literacy problems. The International Perspectives webpages allow American adult literacy and English language teachers and students quick access to information about: * adult literacy education in other countries and cultures, including both developing and industrialized countries, and including curriculum and outcomes standards for adult education in other countries * international comparative studies of adult literacy and PreK-12 education, and * international efforts to raise literacy levels (e.g.UNESCO, International Reading Association, and the Venezuelan and Argentinian literacy campaigns) The Institute plans to continue to build on the information on the International Perspective pages () as they develop into a central site for worldwide literacy resources. The National Institute for Literacy provides leadership on literacy issues, including the improvement of reading instruction for children, youth, and adults. In consultation with the U.S. Departments of Education, Labor, and Health and Human Services, the Institute serves as a national resource on current, comprehensive literacy research, practice, and policy. My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From marie.cora at hotspurpartners.com Wed Sep 6 09:48:54 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 6 Sep 2006 09:48:54 -0400 Subject: [Assessment 480] Family Literacy Guest Discussion Leader Message-ID: <004c01c6d1bb$31e61690$0302a8c0@LITNOW> Colleagues, The following reminder is from Gail Price, Moderator of the Family Literacy Discussion List. If you are interested in joining the discussion, you can subscribe to the Family Literacy Discussion List by going to http://www.nifl.gov/mailman/listinfo/familyliteracy *************************** This is a reminder that Cyndy Colletti will be joining the Family Literacy Discussion List Monday, September 11, through Wednesday, September 13. Cyndy, currently the Literacy Program Manager at the Illinois State Library (ISL), worked as the Family Literacy Coordinator at ISL for nine years. She has worked cooperatively with the practitioners in Illinois to develop programmatic resources for the Family Literacy projects such as parent-child activities. She will be facilitating a discussion on "Implementing Interactive Parent Child Activities." Following are some questions about PACT to consider in anticipation of the discussion. Design and development -- How do we design PACT that includes diverse learners and low level learners? How do we communicate the value of PACT to enrolled parents who just "want to go to GED class today?" Do you have some Web or hard copy resources on developing PACT activities you want to share? Implementation -- What are some successful activities you want to share? Outcomes -- What are the outcomes for the program of a successful PACT component? What are the outcomes for the participants of a successful PACT component? 
How do we communicate the value of PACT to funders and the community? If you are interested in joining the discussion, you can subscribe to the Family Literacy Discussion List by going to http://www.nifl.gov/mailman/listinfo/familyliteracy Gail J. Price Multimedia Specialist National Center for Family Literacy 325 West Main Street, Suite 300 Louisville, KY 40205 Phone: 502 584-1133, ext. 112 Fax: 502 584-0172 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060906/5d7fdd20/attachment.html From marie.cora at hotspurpartners.com Wed Sep 6 10:23:37 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 6 Sep 2006 10:23:37 -0400 Subject: [Assessment 481] The Health Literacy of America's Adults: Results from the 2003 National Assessment of Adult Literacy Message-ID: <009b01c6d1c0$0b76aa10$0302a8c0@LITNOW> ============================================================== News Flash The Health Literacy of America's Adults: Results from the 2003 National Assessment of Adult Literacy (9/6/2006) Results from the Health Literacy component of the 2003 National Assessment of Adult Literacy (NAAL) were just released. The health literacy findings are based on the first large-scale national assessment designed specifically to measure the health literacy of adults living in America. This report measures health literacy among American adults including their ability to read, understand, and apply health-related information in English. Findings include: * The majority of American adults (53 percent) had Intermediate health literacy. Fewer than 15 percent of adults had either Below Basic or Proficient health literacy. * Women had higher average health literacy than men. * Adults who were ages 65 and older had lower average health literacy than younger adults. * Hispanic adults had lower average health literacy than adults in any other racial/ethnic group. To view the reports and for more information, visit http://nces.ed.gov/naal Jaleh Behroozi Soroui Education Statistics Services Institute (ESSI-Stat) American Institutes for Research 1990 K Street, NW Suite 500 Washington, DC 20006 Phone: 202/403-6958 email: jsoroui at air.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060906/0f0a9319/attachment.html From jcrawford at nifl.gov Wed Sep 6 11:44:34 2006 From: jcrawford at nifl.gov (Crawford, June) Date: Wed, 6 Sep 2006 11:44:34 -0400 Subject: [Assessment 482] GED and College Admissions Message-ID: <9B35BF1886881547B5DFF88364AF31A3081E8B8F@wdcrobe2m03.ed.gov> For many years prior to my employment with the federal government, I directed a university learning center that offered developmental classes, ran the university's placement testing program, and offered tutoring, ESL classes, and services for those with learning disabilities and other physical disabilities. Over a 20 year period I saw the test results for at least 15,000 students and I can say without any doubt that having a GED was NOT a guarantee that an entering student had the reading, writing, or math skills that were required as the basic skills before attempting college-level classes. We saw many adults enter college with a GED who had large gaps between what we anticipated would be the skill level of high school graduates and those who just passed high school with minimum skill levels. 
And, unfortunately, we saw many of them leave college in academic difficulty - and with debts for tuition. (I was the person, in the end, who interviewed all these people and had to send the final letters of dismissal.) Adults who wish to go on for more education need to be advised that having a piece of paper that says you have a high school diploma is not sufficient. There are basic skills and then there are more advanced skills and the person who will be successful at the college level has to be able to perform competently from the beginning. Just as about 1/3 of high school graduates are not ready for the level of work required at a college, the GED does not adequately prepare most students. If we could connect jobs to skill levels and make this clear to students and parents and employers and employees, this would be a real boon to the American economy and school system. Perhaps it is time to consider levels of readiness and make it clear to high school students and to adults in adult education that there are varying levels depending on the end goal. People need to know how they need to be able to perform for the goals they set for themselves. Paper just doesn't do it; performance is the key to success. June Crawford From Tina_Luffman at yc.edu Wed Sep 6 13:25:39 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Wed, 6 Sep 2006 10:25:39 -0700 Subject: [Assessment 483] Re: GED and College Admissions Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060906/1aa01b63/attachment.html From Donna.Albanese at ode.state.oh.us Wed Sep 6 13:50:48 2006 From: Donna.Albanese at ode.state.oh.us (Albanese, Donna) Date: Wed, 6 Sep 2006 13:50:48 -0400 Subject: [Assessment 484] Re: GED and College Admissions Message-ID: <76CE091B5AD1F940BC577F0D01A29A3202BFA604@mailb02.ode.state.oh.us> Getting students "college ready" has been an issue for years, particularly not knowing exactly what college ready means from state to state or from college to college. In Ohio, we are crosswalking college readiness standards with our ABLE standards to identify the gaps and make needed revisions. I'd be interested to hear from states or programs that are currently involved in or have completed this process. Donna Albanese, Consultant Ohio Department of Education Adult Basic and Literacy Education 25 South Front Street, Stop 614 Columbus, Ohio 43215-4183 Phone: 614-466-5015 Fax: 614-728-8470 e-mail: donna.albanese at ode.state.oh.us -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Crawford, June Sent: Wednesday, September 06, 2006 11:45 AM To: Assessment at nifl.gov Subject: [Assessment 482] GED and College Admissions For many years prior to my employment with the federal government, I directed a university learning center that offered developmental classes, ran the university's placement testing program, and offered tutoring, ESL classes, and services for those with learning disabilities and other physical disabilities. Over a 20 year period I saw the test results for at least 15,000 students and I can say without any doubt that having a GED was NOT a guarantee that an entering student had the reading, writing, or math skills that were required as the basic skills before attempting college-level classes. 
We saw many adults enter college with a GED who had large gaps between what we anticipated would be the skill level of high school graduates and those who just passed high school with minimum skill levels. And, unfortunately, we saw many of them leave college in academic difficulty - and with debts for tuition. (I was the person, in the end, who interviewed all these people and had to send the final letters of dismissal.) Adults who wish to go on for more education need to be advised that having a piece of paper that says you have a high school diploma is not sufficient. There are basic skills and then there are more advanced skills and the person who will be successful at the college level has to be able to perform competently from the beginning. Just as about 1/3 of high school graduates are not ready for the level of work required at a college, the GED does not adequately prepare most students. If we could connect jobs to skill levels and make this clear to students and parents and employers and employees, this would be a real boon to the American economy and school system. Perhaps it is time to consider levels of readiness and make it clear to high school students and to adults in adult education that there are varying levels depending on the end goal. People need to know how they need to be able to perform for the goals they set for themselves. Paper just doesn't do it; performance is the key to success. June Crawford ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From MMaralit at NIFL.gov Wed Sep 6 15:27:05 2006 From: MMaralit at NIFL.gov (Maralit, Mary Jo) Date: Wed, 6 Sep 2006 15:27:05 -0400 Subject: [Assessment 484] Re: The Health Literacy of America's Adults: Results from the 2003 National Assessment of Adult Literacy Message-ID: <4062487BDB6029428A763CAEF4E1FE5B0B93313A@wdcrobe2m03.ed.gov> The following announcement is posted on behalf of The National Center for Education Statistics: The Health Literacy of America's Adults: Results from the 2003 National Assessment of Adult Literacy (9/6/2006) Results from the Health Literacy component of the 2003 National Assessment of Adult Literacy (NAAL) were just released. The health literacy findings are based on the first large-scale national assessment designed specifically to measure the health literacy of adults living in America. This report measures health literacy among American adults including their ability to read, understand, and apply health-related information in English. Findings include: * The majority of American adults (53 percent) had Intermediate health literacy. Fewer than 15 percent of adults had either Below Basic or Proficient health literacy. * Women had higher average health literacy than men. * Adults who were ages 65 and older had lower average health literacy than younger adults. * Hispanic adults had lower average health literacy than adults in any other racial/ethnic group. 
To download, view and print the publication as a PDF file, please visit: To view other NAAL reports and for more information, visit Jaleh Behroozi Soroui Education Statistics Services Institute (ESSI-Stat) American Institutes for Research 1990 K Street, NW Suite 500 Washington, DC 20006 Phone: 202/403-6958 email: jsoroui at air.org From bonniesophia at adelphia.net Wed Sep 6 19:57:44 2006 From: bonniesophia at adelphia.net (Bonnie Odiorne) Date: Wed, 6 Sep 2006 19:57:44 -0400 Subject: [Assessment 485] Re: GED and College Admissions In-Reply-To: <9B35BF1886881547B5DFF88364AF31A3081E8B8F@wdcrobe2m03.ed.gov> Message-ID: <004f01c6d210$40811970$0202a8c0@PC979240272114> Hello, all, I've been waiting to contribute to this discussion, since I've never been involved in a GED program, but the last ABE program I facilitated and taught was an employability program integrating technology, and taught a lot of "soft" skills (based on PBS-KET's Workplace Essential Skills), such as communication, appropriate behavior, ways to be organized not just job hunting but on the job, time management and prioritizing. Now I direct the Writing Center at Post University, Waterbury, CT, and between me and the University Learning Center we're doing many of the things June describes. The "skills" gap is one factor, but it's more an "awareness" issue. University-wide, we're piloting a program this year for freshmen that will extend into Senior Year that focuses on self-assessment, planning, creating a college success/career profile, and focuses on many of the "other" skills beyond content areas that a student needs to succeed and to plan a career. In adult community education, very often basic skills education as well as ESL is a "window" to integrate a lot of other survival/"soft" skills, but I'd suspect GED preparation, being content-area and test-driven, would be a more difficult sell. Yes, many of the students we get, while not necessarily GEDs but H.S. diplomas, have many areas in which improvement is needed, that in addition to the profiling, planning and communicative skills, can only be described as critical thinking, reflection on the "metacompetencies," the awareness of how one thinks, learns, solves problems, makes decisions etc. It remains to be seen how well the program works, an integration of a text, CD, and software program, and if the students will see and dismiss it as "what they already know" or as skills worth knowing. These are big concerns; it seems to me that in any adult ed program whose ultimate goals, particularly in advanced ESL and/or adult literacy students who are already H.S. graduates might be college transition, these issues should be raised. One of our volunteer tutors years ago was teaching what were then called "study skills," and it had never occurred to me that these were skills that didn't come "naturally," and that they could be taught. I'd like to hear more about these issues, also. Bonnie Odiorne, Ph.D. 
Director, Writing Center, Adjunct Professor Post University, Waterbury, CT From KHinson at future-gate.com Wed Sep 6 21:31:48 2006 From: KHinson at future-gate.com (Katrina Hinson) Date: Thu, 07 Sep 2006 03:31:48 +0200 Subject: [Assessment 486] Re: GED and College Admissions In-Reply-To: References: Message-ID: <44FF3E42.121C.00A0.0@future-gate.com> See, for me, as a GED instructor, I want my students to be ready for the college level. I don't want them to meet the frustration of multiple developmental classes. From prior discussions on other lists, I remember one statistic: when a GED graduate has to take more than one developmental course, he/she is less likely to finish a postsecondary degree. I want to limit this for my students, yet the struggle is to figure out how to accomplish that (as I stated in my original e-mail). I want them to know from the beginning what it takes to get where they want to go, be it a good job that pays well or a college campus. What does it mean to be "college ready"?
How can we do a better job of ensuring they are college ready? It's statistically sad and disheartening as an instructor to see the number of GED graduates in various reports nationwide who never finish any postsecondary training because they still lacked skills, hit a wall, grew frustrated, and quit. How can we give them a good foundation at the starting line so that they see the GED as a step, a beginning point in their future - see that it's not an end or a destination (unless they are solely returning to school for personal satisfaction)? Regards, Katrina >>> 9/6/2006 1:25 pm >>> June, What you are saying is so true. At our college we have a COMPASS placement test that all students must take before being placed into Math or English classes regardless of whether they have a high school diploma, GED, or lack of credentials. Most of our GED graduates place into Fundamentals of Math or Beginning Algebra, which are developmental courses, before moving on to college level work. Our GED classes are filling the gap for many to at least get themselves up to a level to fit into developmental college courses, but they do not prepare students to walk into College Composition or College Algebra, for example. Our Adult Education Director, Dr. Carolyn Beckman, has had additional responsibilities added to her duty list to be in charge of developmental education for the college as well. I am hearing that this additional responsibility for our ESOL & GED directors is becoming a trend in junior colleges in a number of states. Thanks for your insight, Tina Tina Luffman Coordinator, Developmental Education Verde Valley Campus 928-634-6544 tina_luffman at yc.edu From marie.cora at hotspurpartners.com Thu Sep 7 10:05:24 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 7 Sep 2006 10:05:24 -0400 Subject: [Assessment 487] Re: GED and College Admissions Message-ID: <01dc01c6d286$aa65d990$0302a8c0@LITNOW> Colleagues: The following post is cross-posted from the Content Standards Discussion List. Marie Cora ******************************************** I know I am a little bit late jumping into this discussion. But I dropped out of high school in the 9th grade, and I came from an immigrant family, meaning I spent more years working the fields with my family than going to elementary school. I had not been able to keep up with the other students, so by the time I hit junior high school I was completely lost. My father officially signed me out of school at the age of 15, and I then went to work full time. When I had my own children, I began to understand the value of an education: I couldn't help my children with their homework, and two of my boys decided they didn't want to finish school. Well, by this time I knew I had to do something, so I went to the adult education school where I obtained my GED. From there my oldest daughter talked me into going to the community college. I started going to a community college soon afterwards and had a really hard time. But with the help of the Learning Resource Center I was able to catch up to the other students who had finished all the K through 12 years in school. Today I am a secretary at a community college and also teach an ROP/Office Occupations & GED preparation class part-time. I teach 16 to 21 year olds. My students are at all different levels; some were like me and never really grasped things at the elementary level, others just got sidetracked in the junior or senior years of school.
I stress to my students the importance of learning more than just what the GED teaches. We have a career development skills section we also teach in our classroom. Some of my students only want to get their GED to get a job; others need their GED to go on to higher education. Whatever my students want to pursue, that is what we work for. If they plan on going on to higher education, then we study more in depth with those students. With the ones who only want a GED for a job, and that's all they care about getting, we only study the basics to pass the GED and job skills. One thing I really stress when they first come into my classroom is that if they want to go on to higher education, they are really going to have to, and WANT to, work hard for it. Some of them will have to be tutored in different areas that they never grasped during their K-12 years. We do that with them in our classroom. I have an awesome case technician who helps me with this. We do go out of our way to work with our students in whatever they desire. But they know they have to have the desire if they want to succeed. We have some students who are coming back to us now after having gotten their GED but are struggling with a college course they are taking. We make the time to help them even if we don't get paid for it. I guess for our classroom it is once our student, always our student! Okay, I have rambled on enough this morning and I hope I made some sense with all of this. Gloria Fuentes [ropteacher at gmail.com] From djgbrian at utk.edu Thu Sep 7 10:10:49 2006 From: djgbrian at utk.edu (Brian, Dr Donna J G) Date: Thu, 7 Sep 2006 10:10:49 -0400 Subject: [Assessment 488] Re: GED and College Admissions In-Reply-To: <76CE091B5AD1F940BC577F0D01A29A3202BFA604@mailb02.ode.state.oh.us> Message-ID: Here's a very relevant article from the New York Times on the "readiness" of students for college work and remedial classes. Donna djgbrian at utk.edu From marie.cora at hotspurpartners.com Fri Sep 8 09:49:17 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 8 Sep 2006 09:49:17 -0400 Subject: [Assessment 489] GED and college admissions Message-ID: <028801c6d34d$94c29340$0302a8c0@LITNOW> Colleagues: The following article, from Public Education Network's Weekly Newsblast, should be of interest to you in regard to our discussion on GED and preparedness for higher education.
Marie Cora *********************************************** PAYING DOUBLE According to "Paying Double: Inadequate High Schools and Community College Remediation," a new issue brief from the Alliance for Excellent Education, the United States spends over $1.4 billion each year to provide community college remediation education for recent high school graduates who did not acquire the basic skills necessary to succeed in college or at work. The brief, which was produced with support from MetLife Foundation, also finds that the nation loses almost $2.3 billion annually in wages as a result of the significantly reduced earnings potential of students whose need for remedial reading makes them more likely to drop out of college without a degree. Therefore, if the number of students graduating from high school prepared to succeed in college were increased, an additional $3.7 billion annually would flow into the nation's economy. The brief offers no simple solutions but does point out that improving the nation's high schools could certainly reduce the number of students who need remediation in college. It points to "weak curricula, vague standards, and lack of alignment between high school content and the expectations of colleges and employers" as reasons for the need for remediation. It adds that students who take a rigorous high school curriculum are less likely to need remedial courses than students whose course load is less demanding. Finally, it suggests that statewide performance standards for college admission would enable educators to assess student progress toward readiness for college. To view the complete issue brief, which includes a breakdown of state-by-state costs, visit: http://www.all4ed.org/publications/remediation.pdf Marie Cora NIFL Assessment Discussion List Moderator marie.cora at hotspurpartners.com Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ From cynthia_zafft at worlded.org Fri Sep 8 10:47:28 2006 From: cynthia_zafft at worlded.org (Cynthia Zafft) Date: Fri, 08 Sep 2006 10:47:28 -0400 Subject: [Assessment 490] GED and College Admissions Message-ID: I feel like every message, every sentence, in this discussion touches on an important aspect of the ASE/ESOL to college transition. It's hard to figure out where to start. Focusing primarily on college placement, there is good news and bad news all mixed up together. First, the statistics on persistence are discouraging for both traditional and nontraditional students, especially in community college, where many adult education students begin. The problem is that adults are under a lot of pressure to do well from the start and have a hard time justifying tiers of coursework that doesn't count toward graduation. That said, over time, "despite a higher rate of remediation and more family obligations, low-income adult students [that includes GED recipients] earn slightly better grades, on average, than traditional students" (from Low-Income Adults in Profile by Lumina Foundation). So, it appears that if adults can make it over the transition point, they can do well. Second, one developmental education course won't do you in, but the type and amount of developmental education does matter.
"Among students who were in remedial reading for more than one course, nearly 80% were in two or more other remedial courses, and less than 9% earned bachelor's degrees." (The Kiss of Death? An Alternative View of College Remediation" by Adelman. See http://www.highereducation.org/crosstalk/ct0798/voices0798-adelman.shtml). That said, students who place into developmental reading do best to take it (it is optional in some colleges) and do go on to be stronger students. So, the take-home message here is that the level of preparation in adult education is key. (See Research to Practice Brief at http://www.collegetransition.org/promising/rp2.html) I'll end here but just want to say, there are a lot of folks interested in this issue. I run the National College Transition Network (www.collegetransition.org). We are a member organization (free for all interested individuals but focused around the needs of adult educators) and our website has many resources you might find helpful. Regards, Cynthia Cynthia Zafft, Director National College Transition Network (NCTN) World Education, Inc. 44 Farnsworth Street Boston, MA 02210 (617) 482-9485 www.collegetransition.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060908/a0ecd54d/attachment.html From marie.cora at hotspurpartners.com Fri Sep 8 11:26:56 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 8 Sep 2006 11:26:56 -0400 Subject: [Assessment 491] Remedial courses, preparedness, HS, and GED Message-ID: <02ae01c6d35b$38c5ce50$0302a8c0@LITNOW> Hi everyone, this is kinda long. I had the opportunity to read thru the 2 articles that have been posted during this exchange on the GED and college admissions. I have some thoughts, comments, and questions for you all. In the article posted earlier today ("Paying Double: Inadequate High Schools and Community College Remediation") there are some alarming statistics: * Nearly half (42%) of students in Community Colleges (CCs) are taking remedial courses * 20% of students in 4 year colleges are taking remedial courses * CCs are quickly becoming the space where students who need to take catch up courses must go (11 states have passed LAWS banning remedial courses in 4 year institutions - I am assuming that other states will follow suit) The article notes that in addition to students arriving at higher ed institutions without adequate skills in reading, writing, and math, they are also poorly prepared in their study habits and in understanding and managing complex material. While about half of the students in CC remedial courses fall into categories including retraining, re-education, and learning English language skills (so these folks could be of any age), the other half are in fact recent or fairly recent High School grads who are taking remedial courses in order to gain the skills and knowledge that they should have achieve/received in their high schools. In addition, the leading predictor for dropping out of college is the need upon entry for a remedial course in reading. The other article, sent by Donna Brian the other day ("At 2-Year Colleges, Students Eager But Unready"), the following text-bits really hit me hard: * The young man featured in the article can balance his checkbook (a complex action and necessary practical life skill), but struggles to remember mathematical notions like Pythagorean theory, sine, cosine, and tangent. 
* That there is presently a movement among public universities to "crack down on ill-prepared students" (also noted in the first article above). * That many students are "shocked" when they learn upon entry to college that their high school experience has not prepared them for higher ed. It is their assumption that HS would/should do this, but instead, they find they need remedial courses first. * "It's the math that's killing us." (see page 2 of this article) What does this mean for us in adult education? If it's so clear that our youth is not prepared for the rigors of higher ed in terms of both academic skills and critical thinking and study skills, then we can all expect to see a lot of these folks in our adult education programs. How do we work with this? Will we become the next space where all the "remediation" takes place? Are we already there? What role does testing and accountability play at the high school level (which has invented the new term "Push Outs") - is it exacerbating this trend at the expense of our children's minds and capabilities, and of course the future of our economy and generations? How about the GED? Where does this fit in with this scenario? Which is better, which is worse at this point: a HS diploma or a GED, and does this matter? I know the research says that it does, but the emerging information we keep reading is that our HSs are also not up to snuff. What about that other 50% in the CCs - those folks of any age who are there because they need retraining (they lost their job for whatever reason, or got laid off), re-education as older students (they need way better reading, writing, math, computer, and critical thinking skills in order to compete for living-wage jobs), or improvement of their English language skills? All these folks sound familiar to me when I think back on my work as a program administrator and practitioner in the field. Does this mean that ABE/ESOL needs to prepare students to go into remedial courses in CCs?? Do we need to surpass what HSs are doing, and what remedial courses are there for? Is ABE equal to the remedial level of study in the CCs? What do we need to do to go beyond the GED level with our students? If high school diplomas do not mean that, with continued diligence and work, you can enter higher ed at a successful level and stay there until you graduate, then where does that leave us helping students to achieve a GED? marie Marie Cora NIFL Assessment Discussion List Moderator marie.cora at hotspurpartners.com Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ From marie.cora at hotspurpartners.com Fri Sep 8 13:08:53 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 8 Sep 2006 13:08:53 -0400 Subject: [Assessment 492] New from CAELA: Brief on Transitions Message-ID: <02e701c6d369$77bf7260$0302a8c0@LITNOW> Hi again, The following resource comes from Miriam Burt at CAELA - this is very relevant and timely for our present discussion. Marie ****************************************************** Hello, everyone, The latest brief from CAELA, Supporting Adult English Language Learners' Transitions to Postsecondary Education, by Julie Mathews-Aydinli, is now available online at http://www.cal.org/caela/esl_resources/briefs/transition.html. Adult immigrants studying English in the United States have diverse educational backgrounds. Some have earned graduate degrees, while others have had little or no access to education. Their goals and expectations for future education and employment are also diverse. This CAELA brief focuses on transitions from adult ESL programs to postsecondary education. For a discussion of classroom-level (e.g., how to develop vocabulary needed for academic classes, types of reading to do in class, etc.) and programmatic (e.g., orientation needed, suggestions for how the adult ESL programs can collaborate with the associated postsecondary institutions, etc.) approaches that can further such transitions, read the brief at http://www.cal.org/caela/esl_resources/briefs/transition.html. Coming this fall: briefs on content standards for the adult English language classroom and on integrating instruction, content standards, and assessment in the adult ESL classroom. Miriam ********** Miriam Burt Center for Adult English Language Acquisition (CAELA) Center for Applied Linguistics 4646 40th Street NW Washington, DC 20016 (202) 362-0700 (202) 363-7204 (fax) miriam at cal.org From marie.cora at hotspurpartners.com Mon Sep 11 10:56:29 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 11 Sep 2006 10:56:29 -0400 Subject: [Assessment 493] Corrections Education Discussion on Special Topics List Message-ID: <039201c6d5b2$770db5c0$0302a8c0@LITNOW> Good morning, afternoon, and evening to you all. The following post is from David Rosen, Moderator of the Special Topics Discussion List. Marie Cora Assessment Discussion List Moderator ======================================================================== To subscribe to the Special Topics Discussion List, go to http://www.nifl.gov/mailman/listinfo/specialtopics , fill in your name and email address, and pick a password. After you have subscribed you will receive an email asking you to confirm your subscription. Please reply immediately. ======================================================================== Dear Colleagues, From September 18-22, on the Special Topics Discussion List, we are pleased to have a panel of expert guests in corrections education. The topic will focus on research and professional wisdom in corrections family literacy, and on the transition from corrections education to community education for inmates who have been released. Our guests are: John Linton, Correctional Education, Office of Safe and Drug Free Schools, U.S. Department of Education John is the program officer for two correctional education grant programs ("Lifeskills for State and Local Prisoners" and "Grants to States for Workplace and Community Transition Training for Incarcerated Youth Offenders") in the Office of Safe and Drug Free Schools of the U.S. Department of Education. John formerly served the State of Maryland as the director of adult correctional education programs. He has been with the federal agency since 2001, originally with the Office of Vocational and Adult Education. Stephen J. Steurer, Ph.D., Executive Director, Correctional Education Association The Correctional Education Association is a professional organization of educators who work in prisons, jails and juvenile settings. William R.
Muth, PhD, Assistant Professor, Reading Education and Adult Literacy, Virginia Commonwealth University Bill is an Assistant Professor of Adult and Adolescent Literacy at Virginia Commonwealth University. Until August 2005, he was the Education Administrator for the Federal Bureau of Prisons. Other positions with the FBOP included: reading teacher, principal, and Chief of the Program Analysis Branch. In 2004 Bill earned his doctorate in adult literacy from George Mason University. His dissertation, "Performance and Perspective: Two Assessments of Federal Prisoners in Literacy Programs" won the College Reading Association's Dissertation of the Year Award. His research interests include Thirdspace and Reading Components theories, especially as these apply to prison-based family literacy programs and children of incarcerated parents. The following readings are recommended by the panelists as background for the discussion: 1. "Locked Up and Locked Out, An Educational Perspective on the US Prison Population," Coley, Richard J. and Barton, Paul E., 2006 Available on line at the ETS web site: http://tinyurl.com/qmzfa (short URL) 2. "Learning to Reduce Recidivism: A 50-state analysis of postsecondary correctional education policy," Institute for Higher Education Policy, Erisman, Wendy and Contardo, Jeanne B., 2005. Available on line at the IHEP web site: http://tinyurl.com/pj2sh (short URL) 3. "Understanding California Corrections" from the California Policy Research Center, U of C. (Chapter 4) http://www.ucop.edu/cprc/ documents/understand_ca_corrections.pdf John Linton believes that California is a watershed state in corrections issues and policies , and that how things unfold there has great national significance. He says that this is a thoughtful and well-informed report on the "overview" of the corrections situation in California -- including the role of treatment programs. Education is not presented as a central issue, but it has a place -- as a piece of a bigger puzzle. 4. An article by Bill Muth in Exploring Adult Literacy can be found at http://literacy.kent.edu/cra/2006/wmuth/index.html The article contains other on-line links related to prison-based intergenerational programs. He recommends especially the link to the Hudson River Center's excellent publication, Bringing Family Literacy to Incarcerated Settings: An Instructional Guide at: http://www.hudrivctr.org/products_ce.htm David J. Rosen Special Topics Discussion Moderator djrosen at comcast.net From marie.cora at hotspurpartners.com Wed Sep 13 07:09:57 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 13 Sep 2006 07:09:57 -0400 Subject: [Assessment 494] FW: [EnglishLanguage 626] Ability to Benefit Message-ID: <045201c6d725$26606090$0302a8c0@LITNOW> Colleagues, The following post is from the English Language Discussion List. I wonder if anyone here has experience with this test and has any comments? Marie Cora Assessment Discussion List Moderator ****************************************************** Dear NIFL ESL List serv: This past summer, the federal government added the COMPASS ESL to the list of federally-approved "Ability to Benefit" ESL tests, and they increased the minimum raw score on the one existing ESL "Ability to Benefit" test which already existed. (Ability to Benefit addresses whether or not a student will benefit from financial aid.) The minimum raw score of 90 on the CELSA (ESL Ability to Benefit Test) was raised to a 97 for Ability to Benefit/financial aid purposes. 
Do any of you have experience with this new score of 97? Are any of you using the CELSA for this purpose? How did this change in minimum score affect your program? How are you dealing with this higher minimum score? Thank you very much, Tracy vonMaluski El Paso Community College -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060913/c813904e/attachment.html From marie.cora at hotspurpartners.com Thu Sep 14 08:31:22 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 14 Sep 2006 08:31:22 -0400 Subject: [Assessment 495] FW: [ProfessionalDevelopment 526] ESOL to ABE Transition Message-ID: <04d301c6d7f9$b0a9bba0$0302a8c0@LITNOW> Colleagues, This email query is from the Professional Development Discussion List. What are your thoughts and comments? Marie Cora Assessment Discussion List Moderator *********************** Professional Development Colleagues: I'm wondering what policies and supports your states are putting into place to assist programs in transitioning their ESOL students into ABE, now that the cut point for NRS level six has been lowered. In New York City, some programs are creating special classes for students who score above 540 on the BEST Plus, but still need to improve their oral proficiency. These students will be tested on the TABE and designated ABE for NRS purposes, but their classes will be comprised only of non-native English speakers and will incorporate much more oral language development than a typical ABE class. What are some other strategies you might suggest? Ira Yankwitt, Director NYC Regional Adult Education Network Literacy Assistance Center 32 Broadway, 10th Floor NY, NY 10004 212-803-3356 ---------------------------------------------------- From andresmuro at aol.com Thu Sep 14 09:11:51 2006 From: andresmuro at aol.com (andresmuro at aol.com) Date: Thu, 14 Sep 2006 09:11:51 -0400 Subject: [Assessment 496] Re: FW: [ProfessionalDevelopment 526] ESOL to ABE Transition In-Reply-To: <04d301c6d7f9$b0a9bba0$0302a8c0@LITNOW> References: <04d301c6d7f9$b0a9bba0$0302a8c0@LITNOW> Message-ID: <8C8A6167A6E524E-A48-4B4F@WEBMAIL-DC15.sysops.aol.com> The logical process should be ABE to ESL. First, students acquire basic literacy in L1. Then, you can transition them into ESL and they will do very well. If you try to do the opposite, it will be very frustrating for students and teachers. I understand that there are things that make this process very difficult, e.g. too many languages, people from oral societies, etc. However, if at all possible, the process should be from ABE to ESL. Andres Please take a look at my artwork: www.geocities.com/andresmuro/art.html -----Original Message----- From: marie.cora at hotspurpartners.com To: Assessment at nifl.gov Sent: Thu, 14 Sep 2006 6:31 AM Subject: [Assessment 495] FW: [ProfessionalDevelopment 526] ESOL to ABE Transition Colleagues, This email query is from the Professional Development Discussion List. What are your thoughts and comments? Marie Cora Assessment Discussion List Moderator *********************** Professional Development Colleagues: I'm wondering what policies and supports your states are putting into place to assist programs in transitioning their ESOL students into ABE, now that the cut point for NRS level six has been lowered. In New York City, some programs are creating special classes for students who score above 540 on the BEST Plus, but still need to improve their oral proficiency. 
These students will be tested on the TABE and designated ABE for NRS purposes, but their classes will be comprised only of non-native English speakers and will incorporate much more oral language development than a typical ABE class. What are some other strategies you might suggest? Ira Yankwitt, Director NYC Regional Adult Education Network Literacy Assistance Center 32 Broadway, 10th Floor NY, NY 10004 212-803-3356 ---------------------------------------------------- ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ________________________________________________________________________ Check out the new AOL. Most comprehensive set of free safety and security tools, free access to millions of high-quality videos from across the web, free AOL Mail and more. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060914/6d91caf5/attachment.html From IraY at lacnyc.org Thu Sep 14 09:39:04 2006 From: IraY at lacnyc.org (Ira Yankwitt) Date: Thu, 14 Sep 2006 09:39:04 -0400 Subject: [Assessment 497] Re: FW: [ProfessionalDevelopment 526] ESOL to ABE Transition Message-ID: <6E8BC13A30982C44BCD32B38FB8F5AB81A4527@lac-exch.lacnyc.local> I completely agree, but this is actually a different point. For students with limited or no literacy in their native language, the evidence seems overwhelming that they should begin their studies in a basic education class in their native language (BENL). Unfortunately, this presents a challenge for programs funded through WIA because NRS doesn't recognize BENL as an instructional type. In NYC, WIA-funded programs incorporate native language literacy development into instruction, but their classes have to be designated ESOL, and educational gain must be measured by the BEST Plus. This is obviously a very dubious, problematic solution. The problem I was referring to is actually at the other end of the continuum -- what to do about higher level ESOL students who, because of the elimination of the former NRS Advanced ESOL level, are now placing out of ESOL, and, paradoxically, must now be considered ABE students in order to continue to develop their oral proficiency. So we actually have two complimentary problems -- students who are considered ESOL but should be ABE (i.e., BENL); and students who are considered ABE who should be ESOL. My question was about the second issue, but I would also be very happy to hear suggested strategies to address the first! Ira Yankwitt, Director NYC Regional Adult Education Network Literacy Assistance Center 32 Broadway, 10th Floor NY, NY 10004 212-803-3356 ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of andresmuro at aol.com Sent: Thursday, September 14, 2006 9:12 AM To: assessment at nifl.gov Subject: [Assessment 496] Re: FW: [ProfessionalDevelopment 526] ESOL to ABE Transition The logical process should be ABE to ESL. First, students acquire basic literacy in L1. Then, you can transition them into ESL and they will do very well. If you try to do the opposite, it will be very frustrating for students and teachers. I understand that there are things that make this process very difficult, e.g. too many languages, people from oral societies, etc. However, if at all possible, the process should be from ABE to ESL. 
Andres Please take a look at my artwork: www.geocities.com/andresmuro/art.html -----Original Message----- From: marie.cora at hotspurpartners.com To: Assessment at nifl.gov Sent: Thu, 14 Sep 2006 6:31 AM Subject: [Assessment 495] FW: [ProfessionalDevelopment 526] ESOL to ABE Transition Colleagues, This email query is from the Professional Development Discussion List. What are your thoughts and comments? Marie Cora Assessment Discussion List Moderator *********************** Professional Development Colleagues: I'm wondering what policies and supports your states are putting into place to assist programs in transitioning their ESOL students into ABE, now that the cut point for NRS level six has been lowered. In New York City, some programs are creating special classes for students who score above 540 on the BEST Plus, but still need to improve their oral proficiency. These students will be tested on the TABE and designated ABE for NRS purposes, but their classes will be comprised only of non-native English speakers and will incorporate much more oral language development than a typical ABE class. What are some other strategies you might suggest? Ira Yankwitt, Director NYC Regional Adult Education Network Literacy Assistance Center 32 Broadway, 10th Floor NY, NY 10004 212-803-3356 ---------------------------------------------------- ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ________________________________ Check out the new AOL . Most comprehensive set of free safety and security tools, free access to millions of high-quality videos from across the web, free AOL Mail and more. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060914/f55438c6/attachment.html From djrosen at comcast.net Thu Sep 14 09:40:37 2006 From: djrosen at comcast.net (David Rosen) Date: Thu, 14 Sep 2006 09:40:37 -0400 Subject: [Assessment 498] Re: [ProfessionalDevelopment 527] FW: Re: ESOL to ABE Transition In-Reply-To: References: Message-ID: <939699A1-D83E-40FD-A57E-D195C38D6154@comcast.net> Andres, Marie, Jackie and others, An ideal situation, for ESOL programs in the United States, would be the option for learners to purse English (ESOL) and also some basic skills content in their first language and English: numeracy/ mathematics, U.S. history, political system, and culture, first language reading and writing, and perhaps other subjects. The goals would be: learning English _and_ learning basic skills (using the first language and English). For programs that cannot imagine doing this because, for example they have no teachers of numeracy/ mathematics who speak the students' first language, perhaps they do. Some programs in Boston, for example, hire English language students who happen to be expert math teachers with years of experience teaching numeracy and math in schools and colleges in their home countries. Many ESOL students want to learn math as much as they want to learn English. We should provide them with that opportunity as soon as possible, not make them wait until they have sufficient English skills. David J. Rosen djrosen at comcast.net On Sep 14, 2006, at 9:20 AM, Taylor, Jackie wrote: > Hello All, > > Marie Cora, Moderator of the Assessment List, forwarded Ira?s > message on to the Assessment List. Below is a response from Andres > Muro. 
> > > > So how are professional developers and programs grappling with this > issue? Let's hear from you. Best, Jackie > > > > From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] > On Behalf Of andresmuro at aol.com > Sent: Thursday, September 14, 2006 9:12 AM > To: assessment at nifl.gov > Subject: [Assessment 496] Re: FW: [ProfessionalDevelopment 526] > ESOL to ABE Transition > > > > The logical process should be ABE to ESL. First, students acquire > basic literacy in L1. Then, you can transition them into ESL and > they will do very well. If you try to do the opposite, it will be > very frustrating for students and teachers. I understand that there > are things that make this process very difficult, e.g. too many > languages, people from oral societies, etc. However, if at all > possible, the process should be from ABE to ESL. > > Andres > > > > Please take a look at my artwork: www.geocities.com/andresmuro/art.html > > > > > -----Original Message----- > From: marie.cora at hotspurpartners.com > To: Assessment at nifl.gov > Sent: Thu, 14 Sep 2006 6:31 AM > Subject: [Assessment 495] FW: [ProfessionalDevelopment 526] ESOL to > ABE Transition > > Colleagues, This email query is from the Professional Development > Discussion List. What are your thoughts and comments? Marie Cora > Assessment Discussion List Moderator *********************** > Professional Development Colleagues: I'm wondering what policies > and supports your states are putting into place to assist programs > in transitioning their ESOL students into ABE, now that the cut > point for NRS level six has been lowered. In New York City, some > programs are creating special classes for students who score above > 540 on the BEST Plus, but still need to improve their oral > proficiency. These students will be tested on the TABE and designated > ABE for NRS purposes, but their classes will be comprised only of > non-native English speakers and will incorporate much more oral > language development than a typical ABE class. What are some other > strategies you might suggest? Ira Yankwitt, Director NYC Regional > Adult Education Network Literacy Assistance Center 32 Broadway, > 10th Floor NY, NY 10004 212-803-3356 > ---------------------------------------------------- > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > ---------------------------------------------------- > National Institute for Literacy > Adult Literacy Professional Development mailing list > ProfessionalDevelopment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/professionaldevelopment > > Professional Development section of the Adult Literacy Education Wiki > http://wiki.literacytent.org/index.php/Adult_Literacy_Professional_Development David J.
Rosen djrosen at comcast.net From mjjerdems at yahoo.com Thu Sep 14 10:19:06 2006 From: mjjerdems at yahoo.com (Mary Jane Jerde) Date: Thu, 14 Sep 2006 07:19:06 -0700 (PDT) Subject: [Assessment 499] Re: FW: [ProfessionalDevelopment 526] ESOL to ABE Transition In-Reply-To: <8C8A6167A6E524E-A48-4B4F@WEBMAIL-DC15.sysops.aol.com> Message-ID: <20060914141906.7910.qmail@web54001.mail.yahoo.com> I suspect that we're differing in verbiage and not classroom technique. I taught pre-literacy to ESL students for seven years. These people needed it in an ESL context because they didn't have the vocabulary to deal with the normal native English speaker vocabulary in the ABE curriculum. These were usually women from east Africa or Arabic speaking cultures, to generalize further, students who had no or little experience in schooling ever and who did not speak English. Sometimes those from west Africa, or with some strong English background, were able to attend an ABE class. Frequently even they spent time in the conversational ESL classes gathering the vocabulary and experience to prepare for the ABE class. The classes were all taught by instructors who were well prepared to teach. The students needed to be prepared to learn in an ABE environment. Mary Jane Jerde andresmuro at aol.com wrote: The logical process should be ABE to ESL. First, students acquire basic literacy in L1. Then, you can transition them into ESL and they will do very well. If you try to do the opposite, it will be very frustrating for students and teachers. I understand that there are things that make this process very difficult, e.g. too many languages, people from oral societies, etc. However, if at all possible, the process should be from ABE to ESL. Andres Please take a look at my artwork: www.geocities.com/andresmuro/art.html -----Original Message----- From: marie.cora at hotspurpartners.com To: Assessment at nifl.gov Sent: Thu, 14 Sep 2006 6:31 AM Subject: [Assessment 495] FW: [ProfessionalDevelopment 526] ESOL to ABE Transition Colleagues, This email query is from the Professional Development Discussion List. What are your thoughts and comments? Marie Cora Assessment Discussion List Moderator *********************** Professional Development Colleagues: I'm wondering what policies and supports your states are putting into place to assist programs in transitioning their ESOL students into ABE, now that the cut point for NRS level six has been lowered. In New York City, some programs are creating special classes for students who score above 540 on the BEST Plus, but still need to improve their oral proficiency. These students will be tested on the TABE and designated ABE for NRS purposes, but their classes will be comprised only of non-native English speakers and will incorporate much more oral language development than a typical ABE class. What are some other strategies you might suggest? Ira Yankwitt, Director NYC Regional Adult Education Network Literacy Assistance Center 32 Broadway, 10th Floor NY, NY 10004 212-803-3356 ---------------------------------------------------- ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment --------------------------------- Check out the new AOL. Most comprehensive set of free safety and security tools, free access to millions of high-quality videos from across the web, free AOL Mail and more. 
------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060914/54879164/attachment.html From marie.cora at hotspurpartners.com Thu Sep 14 11:34:08 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 14 Sep 2006 11:34:08 -0400 Subject: [Assessment 500] FW: [ProfessionalDevelopment 530] Re: ESOL to ABE Transition Message-ID: <051901c6d813$38aa9740$0302a8c0@LITNOW> Hello, We developed a new category of learners ... Transition ... for our very Advanced ESL learners. We used to put them in our ABE materials, but found that they were too culturally specific and didn't provide the necessary work in vocabulary, pronunciation, grammar, etc. We use NorthStar: Speaking and Listening as the text. The level is determined by the ABE TABE. Diane K. Snell ESL Education Coordinator Racine Literacy Council 734 Lake Avenue Racine, WI 53403 262-632-9495 www.racineliteracy.com From marie.cora at hotspurpartners.com Thu Sep 14 13:27:55 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 14 Sep 2006 13:27:55 -0400 Subject: [Assessment 501] Re: ESOL to ABE Transition Message-ID: <054b01c6d823$1de994a0$0302a8c0@LITNOW> I have been reading with interest the posts that concern teaching L1 and then L2. LESLLA (Low-educated Second Language and Literacy Acquisition -- for Adults) is an international organization that will be holding a conference, Research, Practice, and Policy for Low-educated Second Language and Literacy Acquisition - for Adults (LESLLA), at Virginia Commonwealth University in Richmond, November 2-3, 2006. Please visit the LESLLA website at www.leslla.org for more information about its focus. There is still space for those who are interested in participating in this very interesting and seminal event. You can contact me if you are interested. Nancy ********************************************************* Nancy R. Faux ESOL Specialist Virginia Adult Learning Resource Center Virginia Commonwealth University Richmond, VA nfaux at vcu.edu http://www.valrc.org 1-800-237-0178 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060914/d4161163/attachment.html From marie.cora at hotspurpartners.com Thu Sep 14 15:00:11 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 14 Sep 2006 15:00:11 -0400 Subject: [Assessment 502] FW: [Technology 528] Guest discussion next week Message-ID: <059001c6d830$01cd7630$0302a8c0@LITNOW> Colleagues, the following reminder is from Mariann Fedele, Moderator of the Technology Discussion List. To sign up for this discussion, go to: http://www.nifl.gov/mailman/listinfo/Technology One of the first questions posed so far concerns specific assistive technology assessment -- i.e. a protocol to investigate an individual's needs and to find the best hardware/software to address those needs. I encourage you to check out this discussion.
Marie Cora Assessment Discussion List Moderator ************************************** Dear Tech list colleagues, Just a reminder that beginning Tuesday, September 19th through Friday, September 22nd, there will be a guest discussion on the Technology list on "Assistive Technology, Instructional Technology, and Universal Design Strategies for Adult Literacy" with guest facilitator Dr. Dave Edyburn of the University of Wisconsin-Milwaukee. More information about the content of the discussion and our guest facilitator follows. As with past discussions your questions, contributions and professional wisdom will make this a great learning experience for everyone on the list. What questions do you have that you would specifically like to see addressed? What general questions do you have about this topic that can help inform what Dr. Edyburn presents? Send any questions that will help shape the discussion to the Tech list as a whole or to me off list and I will forward them to Dr. Edyburn. Here are a few of the questions that have been sent to date: 1. Does anyone in an adult education setting use a specific assistive technology assessment -- i.e. a protocol to investigate an individual's needs and to find the best hardware/software to address those needs? 2. I'm not familiar with the meaning of assistive technology. If it means helping the learning/physically disabled, I'd like to know anything there is about assessment/diagnosis tools/instruments/programs when their first language isn't English. 3. If a program was going to become a "Universal Design" program, what are the essential UD elements/features that should be in place at each stage of the program: recruitment, orientation, intake, instruction, testing, transition, etc.? 4. What is available either as separate entities from the computer or on the computer itself, that will promote universal design principles? Being libraries, our computers often have a lot of security on them and it is not practical to take it on and off so specific patrons can gain access. What can we do to make it more accommodating for the patron with disabilities (learning or otherwise) and for the library staff which is often few in number? 5. What research has been done with assistive technology and reading/adult literacy which shows the efficacy of AT as a learning tool: retention, achievement, etc. 6. We are creating curricula for developmental students (pre-college skill levels), to prepare them for college-level reading and writing. I was just asked "what guidelines can we give our designing faculty so that they can incorporate Universal Design Principles into this?" A primary concern is, in fact, our students with labeled and unlabeled LDs. (So, really, she was asking "how can we make sure they break things down enough, and give directions for stuff that teachers tend to assume students know; how do we prevent "write a 2 page paper that responds to a political cartoon?" ) So... my question would be: how can we make "basic level" college assignments more accessible to students with LDs? All the best, Mariann **************************** Title: Assistive Technology, Instructional Technology, and Universal Design Strategies for Adult Literacy Overview Adult literacy professionals and volunteers are well aware of the effects of school failure and the lifelong impact of failing to acquire functional reading skills. In this online event, Dr. 
Dave Edyburn a professor at the University of Wisconsin-Milwaukee, will engage participants in a discussion about three forms of technology and their application for adult literacy learners and programs. On day one, participants will be introduced to the concept of assistive technology and learn about products that have been designed to support struggling readers. On day two, conversations will focus on instructional technology. That is, how can technology be used to teach and assess critical literacy skills. On day three, participants will learn about universal design for learning and the promise of this approach to address the needs of diverse learners in ways that combine the best attributes of assistive and instructional technology. Participants in this online event will have the opportunity to learn about practical applications of technology in adult literacy programs, ask questions, and obtain information about software and web resources. Bio Dave L. Edyburn, Ph.D. Dave L. Edyburn, Ph.D., is a Professor in the Department of Exceptional Education at the University of Wisconsin-Milwaukee. Dr. Edyburn's teaching and research interests focus on the use of technology to enhance teaching, learning, and performance. He has authored over 100 articles and book chapters on assistive and instructional technology. He is a co-editor of the recently published book, Handbook of Special Education Technology Research and Practice. He is a past president of the Special Education Technology Special Interest Group (SETSIG) in the International Society for Technology in Education (ISTE) as well as a past president of the Technology and Media (TAM) Division of the Council for Exceptional Children (CEC). He is a frequent conference presenter and national workshop leader. Mariann Fedele Associate Director, NYC Regional Adult Education Network Literacy Assistance Center Moderator, NIFL Technology and Literacy Discussion List 32 Broadway 10th Floor New York, New York 10004 212-803-3325 mariannf at lacnyc.org www.lacnyc.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060914/4b4f0d35/attachment.html From marie.cora at hotspurpartners.com Fri Sep 15 09:52:26 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 15 Sep 2006 09:52:26 -0400 Subject: [Assessment 503] Discussion on ELL in the workplace Message-ID: <002301c6d8ce$2e080d20$0302a8c0@LITNOW> Colleagues: Donna Brian and Lynda Terrill, Moderators of the Workplace Literacy Discussion List and English Language Learners Discussion List proudly announce a discussion on ELL in the workplace. You can partake in the discussion by subscribing to either list: Adult English Language list http://www.nifl.gov/mailman/listinfo/Englishlanguage OR Workplace Literacy list: http://www.nifl.gov/mailman/listinfo/Workplace/ Here is some information about the discussion: One issue that has been of interest on the Workplace Literacy Discussion List has been serving the needs and goals of employers while at the same time serving the needs of adult immigrants on the job and in their lives. On the Adult English Language Discussion List, issues related to workplace and adult English language learners have been discussed for many years. Workplace ESOL classes are not new, but, as more workplaces throughout the country hire immigrants who may not be proficient in English, new teachers come on board and new needs (and funding sources) arise. New questions also arise. 
We have home-grown expertise on both the Workplace and ELL Discussion Lists. Subscribers on both lists have been involved in managing state and federal workplace grants, developing curricula and materials, teaching or training other teachers in workplace contexts. We could all learn a lot from sharing questions and experiences. Lynda Terrill, moderator of the English Language Learner Discussion List and Donna Brian, moderator of the Workplace Literacy Discussion List, invite you to access this combined expertise in a cross-list discussion: a focused, simultaneous, shared discussion on both lists on issues related to the workplace and adult immigrants. We hope that you will join us in sharing philosophies, approaches, and techniques -- lessons learned -- with each other in a week-long dialogue combining the two lists. We have set next week -- September 18-22 -- as the time scheduled for this shared discussion to take place. Discussion Questions Some important questions we hope may be addressed in the discussion are: * What are effective ways of planning, implementing, and evaluating (adult ESL) workplace classes? * What types of workplace classes have proven most effective and why? * How can teachers and administrators develop curricula and materials that meet the needs and goals of the learners in class as well as the needs and expectations of employers? * What are effective and appropriate approaches for teaching issues related to culture, civil rights, and responsibilities on the job? Background Reading and Resources Below is a small sample of the best available materials we know of. We hope subscribers will suggest others that have been useful to them: This resource is added by Janet Isserlis: One reading I would add (and that in fact was just discussed in the workplace literacy share that grew out of a CAELA study circle), is this piece by Gary Pharness, which very much focuses on learners' needs in workplace learning contexts. Learner-Centered Worker Education Program. http://www.eric.ed.gov/ERICDocs/data/ericdocs2/content_storage_01/0000000b/80/2a/14/f4.pdf Issues in Improving Immigrant Workers' English Language Skills (Burt, M., 2003, Washington, DC: Center for Applied Linguistics) http://www.cal.org/caela/esl_resources/digests/Workplaceissues.html. ESOL in the Workplace: A Training Manual for ESOL Supervisors and Instructors. (Tennessee Department of Labor and Workforce Development Office of Adult Education and University of Tennessee Center for Literacy Studies, 2003). http://www.cls.utk.edu/pdf/esol_workplace/Tenn_ESOL_in_the_Workplace.pdf Getting to Work: A Report on How Workers with Limited English Skills Can Prepare for New Jobs (Working for America Institute) http://www.workingforamerica.org/documents/PDF/GTW50704.pdf Steps to Employment in Ontario. http://209.121.217.200/main.html ************* From marie.cora at hotspurpartners.com Fri Sep 15 12:48:13 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 15 Sep 2006 12:48:13 -0400 Subject: [Assessment 504] Re: Teaching beyond the GED? Message-ID: <002901c6d8e6$bca7a960$0302a8c0@LITNOW> The following post is from Donna Chambers. ********************************* Ajit and All, I agree with Ajit that we must push for the alignment of knowledge measured, whatever the student's goal may be.
Preparing a learner for a college entrance test should not be any different from preparing a learner for work and/or life. A changing society and economic system requires a continual redefinition of the context of literacy. As I referenced in my August 31 message, as adult educators we must adapt to the change and be aware of developing conceptual thinking skills to position the learner to advance in his/her education if they so choose. This requires faith in a learner's ability to acquire whatever skills are necessary, even if it means changing the way we have been teaching for years. For example, I am currently working in Rhode Island with a group of adults who entered our program to get their GED. Since my work is primarily with the External Diploma Program, a competency-based assessment system, I tend to focus on a learner understanding concepts and being able to apply the learning in a number of different situations. Placement testing showed that these individuals had low-level math skills and they all declared that they "were not good in math". I told the group that we were not going to work in GED books, except maybe to do some practice work, but that I would help them learn math in a way that was different, and this may mean learning math beyond what they needed to know to pass the test. Our work would include understanding numbers and operations and how they relate to each other, so that once they had this understanding, they should be better prepared for the math that they would encounter on any test, not just the GED. They all agreed and so we began using colored popsicle sticks and other manipulatives to see, touch, and understand what it means to add, subtract, divide and multiply numbers. Before long we were discussing fractions, percents, and even ratio and proportion with a clear understanding. The learners in my class now know what a variable is and can not only show an example of an algebraic expression using the manipulatives, but can also write and solve simple algebraic equations. They may be able to pass the GED without knowing algebra, but this understanding has led to a true understanding of 20% of 300; of 4800 divided by 8 or by 80; and of 1:2 as 2:4, 3:6, etc., which will help them to pass the test. It didn't take long for the learners to start saying, "I get it" and "this is fun". Using a hands-on approach for math and getting away from going page by page in a workbook takes some planning, but the benefits are tremendous. How can we challenge an adult's thinking and help them become critical thinkers? The answer to this question may mean looking at what we are doing differently. Developing activities centered on building thinking skills, critical reading, separating fact from fiction, and making predictions will help prepare our learners for the real world. Looking at and working toward this big picture better prepares students for success as they exit our programs. Exactly how we all do this and make it all relevant is what I see as the great challenge. Change is not easy, but often necessary. Let us begin to look at a bigger picture beyond the GED and develop a list of standards to measure the concrete and abstract knowledge and skills that are needed for life. I invite anyone and everyone to chime in here. Donna Chambers ----- Original Message ----- From: Ajit Gopalakrishnan To: 'The Adult Education Content Standards Discussion List' Sent: Sunday, September 10, 2006 9:26 AM Subject: [Content 267] Re: Teaching beyond the GED?
I am glad that this discussion on teaching beyond the GED has continued into the next week. Let me jump back in! I don't see the skills to enter college/work that are needed in addition to those required for passing the GED as "beyond the basics". To me, they are still basic skills that all high school graduates (whether K-12 or adult ed) should be proficient in prior to graduation. That said, it is not good enough for me to simply accept that some of these skills (a few of which may be highly abstract) are necessary only to pass "gatekeeper" postsecondary entrance examinations and have no relevance for life or work. Instead of simply accepting this, I believe that we should push for the alignment of knowledge measured in such entrance exams with the student's proposed course of study and interested profession. The relevance is absolutely there but needs to be made explicit. In the CASAS system for example, higher level reading/math test items don't become abstract but retain their connection to relevant priority competencies. However, few students including GED graduates achieve these higher levels of proficiency. People also tend to forget abstract knowledge if there is not some application. Computer training is a classic example. Millions are spent on teaching people how to utilize every feature within Microsoft products and three weeks later, the individual may remember 20% of the content - often the 20% that is used regularly. The transition ability gap is real and can be bridged with both rigor and relevance. Ajit Gopalakrishnan CT _____ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060915/57b473f1/attachment.html From marie.cora at hotspurpartners.com Sun Sep 17 11:24:53 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 17 Sep 2006 11:24:53 -0400 Subject: [Assessment 505] Professional Development through the ALE Wiki Message-ID: <002001c6da6d$6d091500$0302a8c0@LITNOW> Colleagues, The Adult Literacy Education Wiki (now with over 750 pages and nearly 700 registered users) is becoming a very useful resource for teachers and other practitioners in adult literacy education. It includes 30 topics linking research, professional wisdom, and practice. It offers easy-to-read archived discussions that were held on this and other discussion lists, links to research and other resources, questions (and sometimes answers) from teachers and other practitioners and researchers, a comprehensive adult literacy glossary, and more. It's free, and it's designed for you. Best of all, it's not only a set of informative web pages. It's a community of practice. You -- and your colleagues -- can easily add to and improve it. It's a wiki! To check out the ALE wiki, go to: http://wiki.literacytent.org/index.php/Main_Page You will see that some ALE topics need to be nurtured, and to grow. They need a Topic Leader. Perhaps you would be the right person to be a Topic Leader. To see a list of topics and leaders, go to http://wiki.literacytent.org/index.php/Topic_Leaders If you are interested in being a (volunteer) Topic Leader for an existing topic, or if you would like to help develop a new topic, e- mail me and tell me about yourself and your interest. David J. 
Rosen ALE Wiki Organizer and Wikiteer djrosen at comcast.net From marie.cora at hotspurpartners.com Mon Sep 18 08:44:50 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 18 Sep 2006 08:44:50 -0400 Subject: [Assessment 506] Special Topics List Discussion Announcement Message-ID: <003f01c6db20$3ba14920$0302a8c0@LITNOW> Colleagues: Some of you may be interested in the following Guest Discussion, beginning today, on the Special Topics Discussion List. Marie Cora Assessment Discussion List Moderator =========================================== To subscribe to the Special Topics Discussion List, go to http://www.nifl.gov/mailman/listinfo/specialtopics, fill in your name, email address and pick a password. After you have subscribed you will receive an email asking you to confirm your subscription. Please reply immediately. =========================================== Dear Colleagues, Beginning today, on the Special Topics Discussion List, we are pleased to have a panel of expert guests in corrections education. The topic will focus on research and professional wisdom in corrections family literacy, and on the transition from corrections education to community education for inmates who have been released. Our guests are: John Linton, Correctional Education, Office of Safe and Drug Free Schools, U.S. Department of Education John is the program officer for two correctional education grant programs ("Lifeskills for State and Local Prisoners" and "Grants to States for Workplace and Community Transition Training for Incarcerated Youth Offenders") in the Office of Safe and Drug Free Schools of the U. S. Department of Education. John formerly served the State of Maryland as the director of adult correctional education programs. He has been with the federal agency since 2001, originally with the Office of Vocational and Adult Education. Stephen J. Steurer, Ph.D., Executive Director, Correctional Education Association. The Correctional Education Association is a professional organization of educators who work in prisons, jails and juvenile settings. William R. Muth, PhD, Assistant Professor, Reading Education and Adult Literacy, Virginia Commonwealth University Bill is an Assistant Professor of Adult and Adolescent Literacy at Virginia Commonwealth University. Until August 2005, he was the Education Administrator for the Federal Bureau of Prisons. Other positions with the FBOP included: reading teacher, principal, and Chief of the Program Analysis Branch. In 2004 Bill earned his doctorate in adult literacy from George Mason University. His dissertation, "Performance and Perspective: Two Assessments of Federal Prisoners in Literacy Programs" won the College Reading Association's Dissertation of the Year Award. His research interests include Thirdspace and Reading Components theories, especially as these apply to prison-based family literacy programs and children of incarcerated parents. ******************************* The following readings are recommended by the panelists as background for the discussion: 1. "Locked Up and Locked Out, An Educational Perspective on the US Prison Population," Coley, Richard J. and Barton, Paul E., 2006 Available on line at the ETS web site: http://tinyurl.com/qmzfa (short URL) 2. "Learning to Reduce Recidivism: A 50-state analysis of postsecondary correctional education policy," Institute for Higher Education Policy, Erisman, Wendy and Contardo, Jeanne B., 2005. Available on line at the IHEP web site: http://tinyurl.com/pj2sh (short URL) 3. 
"Understanding California Corrections" from the California Policy Research Center, U of C. (Chapter 4) http://www.ucop.edu/cprc/documents/understand_ca_corrections.pdf John Linton believes that California is a watershed state in corrections issues and policies , and that how things unfold there has great national significance. He says that this is a thoughtful and well-informed report on the "overview" of the corrections situation in California -- including the role of treatment programs. Education is not presented as a central issue, but it has a place -- as a piece of a bigger puzzle. 4. An article by Bill Muth in Exploring Adult Literacy can be found at http://literacy.kent.edu/cra/2006/wmuth/index.html The article contains other on-line links related to prison-based intergenerational programs. He recommends especially the link to the Hudson River Center's excellent publication, Bringing Family Literacy to Incarcerated Settings: An Instructional Guide at: http://www.hudrivctr.org/products_ce.htm David J. Rosen Special Topics Discussion Moderator djrosen at comcast.net From marie.cora at hotspurpartners.com Tue Sep 19 10:22:03 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 19 Sep 2006 10:22:03 -0400 Subject: [Assessment 507] Guest Speaker on Content Standards List Next Week Message-ID: <011a01c6dbf6$fb67cf00$0302a8c0@LITNOW> Hello everyone, Please sign up for the guest discussion on the Content Standards List with Ronna Spacone next week. I would think that many on this List would find that discussion of great interest. To participate in the discussion, sign up for the list at: http://www.nifl.gov/mailman/listinfo/Contentstandards Marie Cora Assessment Discussion List Moderator ************************************************************************ ****************** Greetings colleagues, Next week, Monday, September 25 through Friday, September 29, the Content Standards Discussion List will be hosting a discussion on the U.S. Department of Education's efforts to support state-level adoption and institutionalization of content standards for adult learning. Our guest will be Ronna Spacone from the Department's Office of Vocational and Adult Education (OVAE), Division of Adult Education and Literacy (DAEL). Please read the introductory information below, which includes a link to the Adult Education Content Standards Warehouse Website, to help prepare you for the discussion. You may begin posting your questions to Ronna this week. I will collect the questions together and re-post when the discussion starts next week. Aaron Aaron Kohring Moderator, National Institute for Literacy's Content Discussion List (http://www.nifl.gov/mailman/listinfo/Contentstandards) ************************************************************************ ******************** For the past several years, the U.S. Department of Education, OVAE, has used National Leadership Activity funds to provide technical assistance and support to states already committed to standards-based education reform. OVAE's efforts to promote the implementation of state-level content standards began with the Adult Education Content Standards Warehouse Project, operated by the American Institutes of Research (AIR), 2003-2005. The project included: 1. Technical assistance and networking for state collaborative working groups or consortia in 14 states, 2. The development and publication of "A Process Guide for Establishing State Adult Education Content Standards" and 3. 
The development of the Adult Education Content Standards Warehouse (AECSW) Website. The AECSW site provides universal access to existing state standards as well as nationally developed content standards in the areas of reading, mathematics, and English language acquisition. Since it was launched in May 2005, eleven states and CASAS and Equipped for the Future have contributed their standards for posting. The site also serves to disseminate the "Process Guide for Establishing State Adult Education Content Standards" as well as the professional development materials that were developed for the State Standards Consortia project. In preparation for our listserv discussion next week, I invite you to please visit the AECSW Website located at: (http://www.adultedcontentstandards.org ), which AIR continues to operate with OVAE. In September 2005, with the conclusion of the State Standards Consortia activities, OVAE funded a new activity to identify how best to continue to support states committed to the implementation of standards. A six-month planning project was then conducted by MPR Associates, Inc., along with partner organizations Design Perspectives and World Education. Planning activities included a literature review of noteworthy practices, an evaluation of the electronic warehouse, and an assessment of the needs of states to support standards-based education. Twenty-four states chose to participate in the project. Based on the results, OVAE has moved ahead and made plans for a new project scheduled to begin next month. As in the past, the new activities will provide opportunities for interested states to work together and learn about standards-based education. The project is expected to focus especially on the implementation of standards, including: how to translate standards into classroom instruction and curriculum and how to assess the implementation of standards to guide instructional improvement and program practice. I invite you to learn more about the Department of Education's efforts to promote state-level adoption of content standards and to ask questions about these activities during the listserv discussion. Please refer to the various sections of the AECSW Website (http://www.adultedcontentstandards.org ) including the Guide, Warehouse, and Field Resources as well as the OVAE DAEL Website located at: (http://www.ed.gov/about/offices/list/ovae/pi/AdultEd/index.html). I look forward to an interesting, engaging discussion and appreciate the opportunity to take part. Thanks. Ronna Spacone Education Program Specialist Office of Vocational and Adult Education U.S. Department of Education Ronna.spacone at ed.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060919/d4b70c08/attachment.html From marie.cora at hotspurpartners.com Tue Sep 19 10:23:29 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 19 Sep 2006 10:23:29 -0400 Subject: [Assessment 508] Self-Esteem Discussion on FOB list next week! Message-ID: <011f01c6dbf7$2e570de0$0302a8c0@LITNOW> Colleagues: Another interesting guest discussion next week, on the FOB List: You can join the FOB discussion list at the following address: http://www.nifl.gov/mailman/listinfo/focusonbasics (Be sure to reply to the confirmation e-mail to complete the process.) Marie Cora Assessment Discussion List Moderator **************************************************** Next week (Sept. 
25-29) we are pleased to have Anastasiya Lipnevich as a guest on the Focus on Basics Discussion List to discuss her recent FOB article about self-esteem in adult learners. Anastasiya is a PhD student in educational psychology at Rutgers University. She has a master's degree in counseling psychology from Rutgers University and a master's degree in psychology and education from the University of Minsk. Her research interests include self-esteem, motivation, and self-regulation. Low Self-Esteem: Myth or Reality? http://www.ncsall.net/index.php?id=1105 I will send out some discussion questions to think about tomorrow. You can join the FOB discussion list at the following address: http://www.nifl.gov/mailman/listinfo/focusonbasics (Be sure to reply to the confirmation e-mail to complete the process.) I hope you can join us! Julie Julie McKinney Discussion List Moderator World Education/NCSALL jmckinney at worlded.org From barguedas at sfccnm.edu Tue Sep 19 14:03:17 2006 From: barguedas at sfccnm.edu (Barbara Arguedas) Date: Tue, 19 Sep 2006 12:03:17 -0600 Subject: [Assessment 509] Re: FW: [ProfessionalDevelopment 526] ESOL to ABETransition Message-ID: <4CFDD6B88B634C409A76C0F44B3509BE029C4AEE@ex01.sfcc.edu> We have grappled with this exact situation here at Santa Fe Community College (in NM). We have determined that the first decision point is based on the student's goal. For example, if their goal is to get the GED then we place them in ABE/GED classes and use the TABE as the assessment. However, many of them still want to improve their English and that is their goal. This group (who have topped out of the CASAS with a score of 236 or above) is being encouraged to take the developmental level credit classes at our community college. There are several lower level English classes that they can place into (a placement test is required). Though the emphasis is not necessarily ESOL, they will make additional progress in these credit classes. They are proud to move on and to be considered "college students" and often can apply for financial aid as well (undocumented students are the exception for financial aid). Our developmental studies department has been instrumental in transitioning these students into the college's system. Hope this helps. Barbara Arguedas Director of Adult Basic Education Santa Fe Community College Santa Fe, NM -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Ira Yankwitt Sent: Thursday, September 14, 2006 7:39 AM To: The Assessment Discussion List Subject: [Assessment 497] Re: FW: [ProfessionalDevelopment 526] ESOL to ABETransition I completely agree, but this is actually a different point. For students with limited or no literacy in their native language, the evidence seems overwhelming that they should begin their studies in a basic education class in their native language (BENL). Unfortunately, this presents a challenge for programs funded through WIA because NRS doesn't recognize BENL as an instructional type. In NYC, WIA-funded programs incorporate native language literacy development into instruction, but their classes have to be designated ESOL, and educational gain must be measured by the BEST Plus. This is obviously a very dubious, problematic solution. 
The problem I was referring to is actually at the other end of the continuum -- what to do about higher level ESOL students who, because of the elimination of the former NRS Advanced ESOL level, are now placing out of ESOL, and, paradoxically, must now be considered ABE students in order to continue to develop their oral proficiency. So we actually have two complimentary problems -- students who are considered ESOL but should be ABE (i.e., BENL); and students who are considered ABE who should be ESOL. My question was about the second issue, but I would also be very happy to hear suggested strategies to address the first! Ira Yankwitt, Director NYC Regional Adult Education Network Literacy Assistance Center 32 Broadway, 10th Floor NY, NY 10004 212-803-3356 ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of andresmuro at aol.com Sent: Thursday, September 14, 2006 9:12 AM To: assessment at nifl.gov Subject: [Assessment 496] Re: FW: [ProfessionalDevelopment 526] ESOL to ABE Transition The logical process should be ABE to ESL. First, students acquire basic literacy in L1. Then, you can transition them into ESL and they will do very well. If you try to do the opposite, it will be very frustrating for students and teachers. I understand that there are things that make this process very difficult, e.g. too many languages, people from oral societies, etc. However, if at all possible, the process should be from ABE to ESL. Andres Please take a look at my artwork: www.geocities.com/andresmuro/art.html -----Original Message----- From: marie.cora at hotspurpartners.com To: Assessment at nifl.gov Sent: Thu, 14 Sep 2006 6:31 AM Subject: [Assessment 495] FW: [ProfessionalDevelopment 526] ESOL to ABE Transition Colleagues, This email query is from the Professional Development Discussion List. What are your thoughts and comments? Marie Cora Assessment Discussion List Moderator *********************** Professional Development Colleagues: I'm wondering what policies and supports your states are putting into place to assist programs in transitioning their ESOL students into ABE, now that the cut point for NRS level six has been lowered. In New York City, some programs are creating special classes for students who score above 540 on the BEST Plus, but still need to improve their oral proficiency. These students will be tested on the TABE and designated ABE for NRS purposes, but their classes will be comprised only of non-native English speakers and will incorporate much more oral language development than a typical ABE class. What are some other strategies you might suggest? Ira Yankwitt, Director NYC Regional Adult Education Network Literacy Assistance Center 32 Broadway, 10th Floor NY, NY 10004 212-803-3356 ---------------------------------------------------- ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment ________________________________ Check out the new AOL . Most comprehensive set of free safety and security tools, free access to millions of high-quality videos from across the web, free AOL Mail and more. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://www.nifl.gov/pipermail/assessment/attachments/20060919/538e6415/attachment.html From marie.cora at hotspurpartners.com Wed Sep 20 09:05:45 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 20 Sep 2006 09:05:45 -0400 Subject: [Assessment 510] News from the Special Collection Message-ID: <021101c6dcb5$7d14bc20$0302a8c0@LITNOW> Good morning, afternoon, and evening to you all. I would like to let you know that some final revisions have been made to the new Special Collection in Assessment website. You can find the Collection at: http://literacy.kent.edu/Midwest/assessment/ These final revisions should be helpful to users: On the homepage, click on About This Site in the right toolbar; this provides you with an overview of the Collection and its features, as well as a brief history of the Special Collection and information on the CKG (Core Knowledge Group) - who are responsible for populating the site with quality resources. A listing of CKG members, past and present, is also there. A number of international resources have been added; these are sprinkled throughout the Collection - click on the 'roles' and look in the areas entitled "Program Planning" or "Planning Assessment". International resources are identified by name at the end of the abstract, e.g.: (New Zealand). Presently, resources from Canada, Australia, and New Zealand are offered. Click on Links & Directories in the right toolbar on the homepage; many more links have been added here and they have been organized according to topic areas including International Resources, K-12 Resources, and Research Organizations, as well as a wealth of resources focused directly on adult education, evaluation and testing. I hope that this re-organization of the links will be easier for the user than searching down a lengthy, alphabetized listing. Do you have suggestions for the Collection? Do you have resources or links in mind that you think should be a part of the Collection? Do you have a glossary that you think should be linked to the Collection? Do you have feedback for what's missing at the Collection? The fewest resources are in the Student area - what should be in this section? How can the Collection be improved to best suit your needs? Please let me know - this site is for your use, you should be a part of building it. I would like to send out a special thank you to Chris Cugino, Web Specialist at the Ohio Literacy Resource Center - Chris' work and creativity on the Collection are much appreciated!! marie Marie Cora NIFL Assessment Discussion List Moderator marie.cora at hotspurpartners.com Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060920/a699387d/attachment.html From Mylinh.Nguyen at ed.gov Wed Sep 20 13:19:35 2006 From: Mylinh.Nguyen at ed.gov (Nguyen, My Linh) Date: Wed, 20 Sep 2006 13:19:35 -0400 Subject: [Assessment 511] CROSS-POSTED: Tips to handling multiple postings Message-ID: From: My Linh Nguyen National Institute for Literacy CROSSPOSTED: Tips to handling multiple postings As you know, the National Institute for Literacy's Discussion Lists are very active. In addition to messages directly related to the subject of each Discussion List, often we have announcements that are posted to all the lists.
If you are subscribed to two, three or all lists, you may receive multiple copies of one post - this is called "cross-posting." We do this to ensure that the maximum number of subscribers receives the messages - usually an announcement that may be of interest to subscribers on more than one list. While this can be cumbersome, this is done to ensure that you, our subscribers, are kept up-to-date on the latest news, guest speakers, etc. Here are a few tips to help you manage the number of repeat emails into your inbox. 1) Change your subscriber settings to receive postings in digest format. 2) Disable mail delivery from the lists while you are away. You still remain subscribed to the list, but will not receive postings while you are away. When you return, you can review the archives for posts that you missed while you were away. 3) You do not have to subscribe to a list to benefit from a discussion. Discussions are archived at the Institute's website and can be sorted by thread, date, and author, and can also be searched by keyword. 4) When sending a message to multiple lists, start with CROSS-POSTED in the subject line, so that other subscribers can recognize that it is a cross-post that they may have already received. For information and instructions on changing your subscriber settings please visit our Discussion List Help page at To access Discussion List archives, visit We hope that these tips will help you reduce the number of emails you receive every day, while still keeping you well-informed. Thank you for your continued support and participation in the National Institute for Literacy's Discussion Lists. My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From marie.cora at hotspurpartners.com Thu Sep 21 09:24:56 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 21 Sep 2006 09:24:56 -0400 Subject: [Assessment 512] Assessment discussion on workplace/ESOL List Message-ID: <000601c6dd81$5497ec00$360110ac@LITNOW> Colleagues, There is an interesting thread in the discussion being hosted by both the English Language Learners Discussion List and the Workplace Discussion List. If you want to follow via the archives, go to: http://www.nifl.gov/mailman/listinfo/Englishlanguage or http://www.nifl.gov/mailman/listinfo/Workplace/ and click on the archives. But also: I am collecting that thread and when the discussion has completed, I will post it in user-friendly format at both the NIFL Discussion Guest Archives and the ALE Wiki. I will announce this when it's ready, and give you the urls to find it. And don't hesitate to subscribe and jump in - you all have much to contribute. marie Marie Cora NIFL Assessment Discussion List Moderator marie.cora at hotspurpartners.com Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060921/6aad7388/attachment.html From marie.cora at hotspurpartners.com Thu Sep 21 09:30:44 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 21 Sep 2006 09:30:44 -0400 Subject: [Assessment 513] Discussion Questions for FOB List's Self-Esteem Discussion Message-ID: <001101c6dd82$23dc1270$360110ac@LITNOW> Colleagues: Read on for info on joining the discussion on self-esteem next week on FOB. 
To sign on to this discussion, go to: http://www.nifl.gov/mailman/listinfo/Focusonbasics Marie Cora Assessment Discussion List Moderator ********************************************** Our discussion on the FOB article "Low Self-Esteem: Myth or Reality?" will be next week, September 25-29. See below for some questions to think about. Link to the article (in Volume 8B) http://www.ncsall.net/index.php?id=1105 Discussion Questions for "Low Self-Esteem: Myth or Reality? 1. How much do you and your programs presume low self-esteem among your learners? 2. How do you think this presumption affects the delivery of education to adult learners? 3. The findings of this study may conflict with some people's notions of adult learners. How does this article affect your thoughts about the self-esteem of learners in your program? 4. How might you adjust your teaching or delivery of services if your learners had a higher self-esteem than you thought? 5. There was some discussion of science's ability to "understand the human mind", and of the validity of research on a concept such as self-esteem. It is a broad and variably-defined concept, which may be affected by many external factors related to one's situation in life. How do we, as consumers of research, handle this question? I hope you can join us! Julie Julie McKinney Discussion List Moderator World Education/NCSALL jmckinney at worlded.org From jataylor at utk.edu Thu Sep 21 23:04:35 2006 From: jataylor at utk.edu (Taylor, Jackie) Date: Thu, 21 Sep 2006 23:04:35 -0400 Subject: [Assessment 514] Meeting of the Minds II Symposium Message-ID: Hi All, I thought this might be of interest to you! Please read on ~ Jackie Taylor ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ National Adult Education Researcher-Practitioner Symposium: A Meeting of the Minds II Join us for this exciting dialogue among adult education researchers, practitioners, and policy makers! Nationally recognized adult education researchers will discuss their studies in such areas as reading, learner persistence, English as a Second Language instructional strategies, technology innovations, transitioning adults to college, authentic materials, health literacy, adult numeracy, family literacy, social justice, innovations in statewide assessment, practitioner inquiry, professional development, and many more. In addition, a featured concurrent session consists of a panel of adult literacy learners. Presenters include Kathleen Bailey, Hal Beder, Alisa Belzer, Beth Bingman, John Comings, Larry Condelli, Ros Davidson, Ron Glass, John Fleishman, Daphne Greenberg, Kathy Harris, Erik Jacobson, Jere Johnston, Tara Joyce, Cheryl Keenan, Mark Kutner, Susan Levine, Myrna Manly, Dennis Porter, Paul Porter, Steve Reder, Pat Rickard, Rima Rudd, Maricel Santos, Robin Schwarz, Renee Sherman, Heidi Silver-Pacuilla, Cristine Smith, John Strucker, Robin Waterman, Cynthia Zafft, and others. Throughout the symposium, each research presentation will be followed by a panel of practitioners who will respond to the presentations, and then by group discussions among participants who will share their reactions and explore implications from their perspectives as practitioners, researchers, and policy makers. The opening plenary session on Thursday features presentations by John Comings on Advice from NCSALL Research on Building High Quality Programs and by Mark Kutner on results of the National Assessment of Adult Literacy (NAAL) and the Health Literacy survey. 
The plenary session on Friday features a panel discussion on the topic of how research influences policy in adult literacy education. Dates of the Symposium are November 30-December 2, 2006, at the Sheraton Grand Hotel, Sacramento. To register online for the Symposium, visit the Web site www.researchtopractice.org . The complete program schedule will be posted to the Symposium Web site within the next few days. Registration is open now. Sponsors of the Symposium are the California Department of Education (CDE), the American Institutes for Research, the California Adult Literacy Professional Development Project (CALPRO), and the National Center for the Study of Adult Learning and Literacy (NCSALL). Don't miss this exciting opportunity! Registration is limited to the first 300 people. Visit the Symposium Web site and register now! We look forward to seeing you in Sacramento on November 30. -Mary Ann Corley, Ph.D. CALPRO Director and Symposium Coordinator American Institutes for Research -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060921/531b3ee9/attachment.html -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: ATT8711054.txt Url: http://www.nifl.gov/pipermail/assessment/attachments/20060921/531b3ee9/attachment.txt From KURIENS at lagcc.cuny.edu Sun Sep 24 15:01:54 2006 From: KURIENS at lagcc.cuny.edu (Suma Kurien) Date: Sun, 24 Sep 2006 15:01:54 -0400 Subject: [Assessment 515] Re: [ProfessionalDevelopment 541] Meeting of the Minds II Symposium Message-ID: <45169DE2020000FA00005483@mailgate.lagcc.cuny.edu> Greetings, October is almost here. That means it's time to register for the Bridges to Opportunity: Workforce Development for English Language Learners conference taking place October 27th and 28th. Registration is limited to 100 participants, and pre-registration is required. You must register in advance to attend. There will be no on-site registration! The conference fee is just $100 for a two-day event, or $75 for a group of five or more. For more information please visit the conference web site at: http://www.lagcc.cuny.edu/ace/bridges . Sign up soon to reserve your space. Use the registration form attached to this email, or register online at the conference website: http://www.lagcc.cuny.edu/ace/bridges/registration.htm . If you have any questions about registration please contact Ros Orgel at 718-482-5448 or roslyno at lagcc.cuny.edu . Bringing together those who provide key services for ELL students developing both language and job-related skills, Bridges to Opportunity is a proactive, working conference. The keynote speaker is Dr. Heide Spruck Wrigley, president of Literacy International. Three panels made up of a range of program managers, instructors, policy makers, and career counselors will grapple with essential questions, including (just to name a few): * How can employers, educators and other stakeholders be brought together to identify and develop solutions to address the workforce development needs of English language learners? What are national models for developing such initiatives in needs assessment? * Many immigrant workers come with prior credentials and experience which remain unused because there are no easily accessible ways to re-enter their previous careers. The challenge is to re-integrate skilled immigrant workers into careers where their skills can be best utilized. How is this best achieved?
* Employer expectations and assumptions about English language instructional goals including the length of training are often different from those of educators. How can these different sets of expectations and goals be managed in customized training programs ? Your participation and recommendations are a key ingredient in this event - it is not a lecture-style conference, but a pro-active, participatory event where your input and experience count. Sponsored by the Center for Immigrant Education and Training and the Center for Teaching and Learning of LaGuardia Community College, Bridges for Opportunity is funded in part by a grant from the President's High Growth Job Training Initiative, as implemented by the U.S. Department of Labor's Employment and Training Administration. Sincerely, Suma Kurien Director, Center for Immigrant Education & Training Division of Adult & Continuing Education LaGuardia Community College Tel: (718) 482 5361 www.lagcc.cuny.edu/ciet From jataylor at utk.edu Tue Sep 26 12:15:41 2006 From: jataylor at utk.edu (Taylor, Jackie) Date: Tue, 26 Sep 2006 12:15:41 -0400 Subject: [Assessment 516] Cross-post: Action Research as Professional Development Message-ID: Assessment List Colleagues: I am very pleased to announce an exciting discussion lined up for the Adult Literacy Professional Development List, beginning this Monday, October 2nd, on "Action Research as Professional Development." University researchers from the University of the District of Columbia and at least five of the 17 teacher researchers will be joining us to share their rich experiences with the practice of action research as a vehicle for teacher change. I hope you will join us! Best, Jackie Taylor, Adult Literacy Professional Development List Moderator, jataylor at utk.edu ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Discussion Topic: Action Research as Professional Development Discussion Dates: October 2 - 9, 2006 To Participate: Subscribe by visiting: http://www.nifl.gov/mailman/listinfo/Professionaldevelopment To Prepare: View this short streaming video for background about action research in the District of Columbia http://www.nifl.gov/lincs/discussions/professionaldevelopment/webcast_ac tion.html (or try: http://tinyurl.com/krah5 ) General Overview: Join our guests from the University of the District of Columbia and teacher researchers (listed below) on the Adult Literacy Professional Development Discussion List to discuss a broad range of topics related to action research in adult literacy professional development (PD), including: * Defining action research * Problems/questions from instruction that could become action research * Teachers' experiences with action research * Action Research in Adult Basic Education in the District of Columbia * Using practitioner inquiry as professional development * Ancillary support systems for action research * Tools that give teachers different modes for examining their practice and to build a PD community * Role of action research in "evidence-based practice" * Action research as a transformative professional development ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ GUESTS: Teacher Researchers: Patricia DeFerrari works for Academy of Hope which runs several adult education programs including ABE/GED classes. Her research was on the use of authentic materials to improve attendance in ABE reading class. Adrienne Jones works for Catholic Charities GED program. 
Her research was on how daily interactive, self-paced computer learning and discussion time with peers and the instructor affect written posttest scores in science, reading and social studies. Kris Garvin works for Notre Dame Adult Education Center. Her research was on using authentic materials to improve social studies scores in GED. Her research also focused on building, highlighting, and reinforcing personal connections learners have to community, history, and current events. Cheryl Jackson works as an independent consultant teaching workplace education classes for the District Department of Transportation employees. Her research topic focused on how computer technology could be used to improve reading comprehension and word recognition skills for low-level readers participating in a workplace education program. Chenniah Randolph works for Metropolitan Delta GED program. Her research was on the instructional gap between the CASAS assessment instrument and GED instructional materials. University of the District of Columbia Researchers: Maigenet Shifferraw, Ph.D., Associate Professor and Coordinator of the Graduate Certificate Program in Adult Education, Department of Education, University of the District of Columbia Dr. Shifferraw is the Principal Investigator for the Action Research project in Adult Basic Education at the University of the District of Columbia. The actual researchers are the adult education teachers who are teaching in community-based organizations, but we (the team) are also responsible for evaluating the benefits of guided action research in enhancing the professional development of adult education teachers. Janet Burton, DSW, Professor and Director, Social Work Program, University of the District of Columbia As a member of the Action Research team, Dr. Janet Burton provides consultation on research particularly related to social factors and adult education. She is conducting a study that examines how social factors impact participation in adult basic education. George W. Spicely, Adjunct Professor, Department of Education, University of the District of Columbia; and Education Consultant Professor George Spicely coordinates the work of the Action Research Project Team and provides support to the participating teacher-researchers. Specifically, he coordinates planning, implementation, and follow-up of project activities, and leads research-related discussions on administrative and research issues using Blackboard software. Supplemental Materials: (a) Streaming Video: Action Research in Adult Basic Education in the District of Columbia http://www.nifl.gov/lincs/discussions/professionaldevelopment/webcast_action.html (or try: http://tinyurl.com/krah5 ) (b) About the D.C. Action Research Project: http://www.nifl.gov/lincs/discussions/professionaldevelopment/action.html (c) "What is Research?" Focus on Basics, Volume 1, Issue A: February 1997, National Center for the Study of Adult Learning and Literacy: http://www.ncsall.net/index.php?id=166 Includes articles: "Research with Words: Qualitative Inquiry" http://www.ncsall.net/?id=468 "Knowing, Learning, Doing: Participatory Action Research" http://www.ncsall.net/?id=479 (d) "Learning from Practice" http://www.pde.state.pa.us/able/cwp/view.asp?a=215&Q=110064 A Project of the Pennsylvania ABLE Lifelong Learning. Shares information on the Project's three Learning from Practice Models: o Pennsylvania Action Research Network (PAARN) o Pennsylvania Adult Literacy Practitioner Inquiry Network o Agency Research Projects (e) New!
The Action Research Topic Area of the ALE Wiki: http://wiki.literacytent.org/index.php/Action_Research ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ To Subscribe, Visit: http://www.nifl.gov/mailman/listinfo/Professionaldevelopment I hope you will join us! Jackie Jackie Taylor, Adult Literacy Professional Development List Moderator, jataylor at utk.edu National Institute for Literacy http://www.nifl.gov/ Association of Adult Literacy Professional Developers http://www.aalpd.org/ From marie.cora at hotspurpartners.com Sat Sep 30 16:37:18 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sat, 30 Sep 2006 16:37:18 -0400 Subject: [Assessment 517] Brief on Adult ESL Content Standards - New from CAELA Message-ID: <001001c6e4d0$39bd5e00$0302a8c0@LITNOW> Colleagues, The following information is from Miriam Burt. Marie Cora Assessment Discussion List Moderator _____ From: englishlanguage-bounces at nifl.gov [mailto:englishlanguage-bounces at nifl.gov] On Behalf Of Miriam Burt Sent: Friday, September 29, 2006 4:13 PM To: The Adult English Language Learners Discussion List Subject: [EnglishLanguage 748] Brief on Adult ESL Content Standards - Newfrom CAELA Hello, everyone: The latest brief from the Center for Adult English Language Acquisition (CAELA), Understanding Adult ESL Content Standards, by CAELA staff member Sarah Young and by Cristine Smith from the National Center for the Study of Adult Learning and Literacy (NCSALL), has been released and can be found on the Web site at http://www.cal.org/caela/esl_resources/briefs/contentstandards.html. It can be downloaded in both html and in pdf formats. This brief defines different types of standards and describes the instructional benefits of using adult ESL content standards. It also describes uses of content standards in the adult ESL field and discusses research about the implementation of content standards. Coming this fall: Another brief on content standards, Aligning Content Standards with Instruction and Assessment for Adult ESL Instruction, by Kirsten Schaetzel and Sarah Young. Thanks. Miriam ******* Miriam Burt CAELA Center for Applied Linguistics 4646 40th Street NW Washington, DC 20016 (202) 362-0700, ext 556 (202) 363-7204 (fax) miriam at cal.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20060930/85544703/attachment.html From macorley1 at earthlink.net Mon Oct 2 16:38:30 2006 From: macorley1 at earthlink.net (Mary Ann Corley) Date: Mon, 2 Oct 2006 16:38:30 -0400 (GMT-04:00) Subject: [Assessment 518] Full Schedule of the Meeting of the Minds II Symposium Now Available at Web Site Message-ID: <11237048.1159821510029.JavaMail.root@elwamui-polski.atl.sa.earthlink.net> Dear List Subscribers: I'm writing to let you know that the full conference schedule for the Meeting of the Minds II Symposium is now available at www.researchtopractice.org. This is a national adult education practitioner-researcher conference, the goal of which is to create dialogue between adult education researchers and adult education teachers and administrators, with the aim of enhancing literacy practice. The Symposium is scheduled for November 30 through December 2, 2006, at the Sheraton Grand Hotel, Sacramento, California. Participating researchers are from the National Center for the Study of Adult Learning and Literacy (NCSALL), from the American Institutes for Research (AIR), as well as from various universities and non-profit organizations. 
The opening plenary session on Thursday morning will feature Mark Kutner from AIR, who will present results of the National Assessment of Adult Literacy and the Health Literacy Survey, and John Comings, who will provide an overview of what NCSALL has learned from 10 years of research in adult literacy. A plenary session on Friday afternoon will feature a discussion on "how research has influenced adult literacy education policy at the national and state levels." Hal Beder from Rutgers University will provide an overview of the topic and moderate this session. Panelists include Cheryl Keenan, Director, Division of Adult Education and Literacy, US Department of Education; Sandra Baxter, Director of the National Institute for Literacy; and three state-level administrators: Bob Bickerton from Massachusetts; Jean Scott from California; and Israel Mendoza from Washington state. The three-day schedule is structured to include six strands of six concurrent sessions each, for a total of 36 sessions. Each session consists of a presentation of research, followed by a brief discussion/reaction from two practitioners, followed by an activity that involves session attendees, in small groups, in brainstorming implications of the research for practice, policy, and further research. CALPRO will post the list of implications to the Symposium Web site following the Symposium. Deadline for registering for the Symposium is November 15, 2006. There is no on-site registration. Deadline for registering for a hotel room at the Sheraton Grand (at the CA state rate of $84/night) is November 9, 2006. The Symposium Web site, www.researchtopractice.org, will take you to registration links for both the symposium and the hotel. Plan to attend this Symposium, network with other practitioners and researchers, and consider implications of research for your delivery of adult literacy education! Hope to see you in Sacramento in November!! -Mary Ann Corley, Ph.D. CALPRO Director and Symposium Coordinator American Institutes for Research From gspangenberg at caalusa.org Mon Oct 9 10:26:25 2006 From: gspangenberg at caalusa.org (Gail Spangenberg) Date: Mon, 9 Oct 2006 10:26:25 -0400 Subject: [Assessment 519] Launch of National Commission on Adult Literacy (cross posted) Message-ID: Friends, I think you will be pleased by the attached news release, issued jointly today by Dollar General Corporation and CAAL. This has been in the making for some months. I hope it will bring a new sense of hope and possibility to those who toil in the adult education and literacy trenches of service, planning, and policy development and who appreciate the findings of the 2003 NAAL. Should you be unable to access the PDF attachment, you can see essentially the same information at the CAAL website (www.caalusa.org). - Gail Spangenberg President Council for Advancement of Adult Literacy 1221 Avenue of the Americas - 46th Fl New York, NY 10020 212-512-2362, F: 212-512-2610 www.caalusa.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061009/75766c95/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: dg-caal GENERAL News Release 100906.pdf Type: application/pdf Size: 135650 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20061009/75766c95/attachment.pdf -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20061009/75766c95/attachment-0001.html From kabeall at comcast.net Tue Oct 10 08:35:47 2006 From: kabeall at comcast.net (Kaye Beall) Date: Tue, 10 Oct 2006 08:35:47 -0400 Subject: [Assessment 520] Back Issues of Focus on Basics Available Message-ID: <008a01c6ec68$9d0963b0$0202a8c0@your4105e587b6> After ten years of research and development, the National Center for the Study of Adult Learning and Literacy (NCSALL) project is coming to an end. NCSALL's dissemination efforts will end in March 2007. The Web site (www.ncsall.net) will remain available for free downloading of NCSALL materials. NCSALL is happy to offer printed copies of our magazine, Focus on Basics. Attached is a list of the back issues of Focus on Basics that are available either free (for orders of less than 100 copies) or for minimal shipping costs (for orders of more than 100 copies). Order Requirements: Due to staffing, we will only accept orders on a first-come, first-served basis with the following requirements: - Minimum quantity per order: 50 copies - Minimum quantity per issue: 10 copies - Orders of more than 100 copies will be charged a minimal shipping cost; see order form for details. Order Instructions: E-mail Caye Caplan at ccaplan at worlded.org with order information: Volume and Issue, Quantity per Issue, Mailing Address (provide street address), and Shipping Payment Method (if applicable). Or Fill out the attached "Comp FOB Order Form", Fax to: 617 482-0617 attn: NCSALL/ Caye Caplan or, Mail to: Caye Caplan, NCSALL/World Education, 44 Farnsworth St., Boston, MA 02210 Shipment will be UPS Ground; please provide street address (physical address, "NO" PO Box please!). Allow 4 - 5 weeks for delivery. Please forward this e-mail to interested programs / parties. Caye Caplan Coordinator of NCSALL Dissemination World Education 44 Farnsworth Street Boston, MA 02210-1211 Tel: (617) 482-9485 Fax: (617) 482-0617 E-mail: ccaplan at worlded.org Web-site: www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061010/fd1c7afb/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: Focus on Basics back issues II.pdf Type: application/pdf Size: 241177 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20061010/fd1c7afb/attachment.pdf -------------- next part -------------- A non-text attachment was scrubbed... Name: Comp FOB Order Form.pdf Type: application/pdf Size: 98020 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20061010/fd1c7afb/attachment-0001.pdf From djrosen at comcast.net Tue Oct 10 21:28:14 2006 From: djrosen at comcast.net (David Rosen) Date: Tue, 10 Oct 2006 21:28:14 -0400 Subject: [Assessment 521] (no subject) Message-ID: <16052EAD-1017-4052-8CF3-99D59423F1EC@comcast.net> Assessment and Technology Colleagues, First, a cross-post: ================= Tom Sticht saw this story on the BBC News website and thought you should see it. ** Message ** Aaace-nla Colleagues: This article illustrates differences in news sources among younger and older youth and adults and raises questions about literacy assessment. ** Web browsing beats page-turning ** Europeans now spend more of their week online than they do reading papers or magazines, a report says.
< http://news.bbc.co.uk/go/em/fr/-/2/hi/business/6034433.stm > ** BBC Daily E-mail ** Choose the news and sport headlines you want - when you want them, all in one daily e-mail < http://www.bbc.co.uk/email > ** Disclaimer ** The BBC is not responsible for the content of this e-mail, and anything written in this e-mail does not necessarily reflect the BBC's views or opinions. Please note that neither the e-mail address nor name of the sender have been verified. ================= Now some questions for you: 1. Have you assessed your students' web page reading skills? Has anyone assessed adult learners' web page reading skills? Is there such an assessment? 2. What impact do adult literacy programs have on students' access to or use of computers or the Internet? I have seen an unpublished study which found they have --- none -- and that makes me wonder why. Any ideas? Are you aware of any studies of adult literacy programs' impact on students' access to or use of computers? 3. Are adult literacy programs helping students to use assistive technology -- for example, (free) text-to-speech web page reader software that would enable them to join the community of internet users even if they have difficulty reading text? If not, should this be a program responsibility? David J. Rosen djrosen at comcast.net From marie.cora at hotspurpartners.com Sun Oct 15 08:09:03 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 15 Oct 2006 08:09:03 -0400 Subject: [Assessment 522] AELS marks milestone Message-ID: <012b01c6f052$b57b2990$0302a8c0@LITNOW> Good morning, afternoon, and evening to you all. The following post is from Tom Sticht. Marie Cora Assessment Discussion List Moderator ------ Colleagues: The following article appears in Reading TODAY, the official newspaper of the International Reading Association with a readership of some 160,000 worldwide. I hope all of you are planning celebrations for the 40th anniversary of the AELS on November 3rd. Tom Sticht Reading Today October/November 2006 Vol. 24, No. 2 page 22 U. S. Adult Education and Literacy System marks milestone This year marks the 40th anniversary of the Adult Education and Literacy System (AELS) in the United States, which continues today as Title 2: The Adult Education and Family Literacy Act of the Workforce Investment Act of 1998. Over the past four decades, adults have produced some 100 million enrollments in AELS. Yet establishing the system took years of effort. A merger of interests. By the beginning of the 1960s, the adult education community had become fragmented into several factions: those seeking recognition for adult education as a broad, liberal educational component of the national education system; those seeking education for the least educated, least literate adults; and those seeking to enhance America's security and increase the industrial productivity of the nation by giving education and job training to adults stuck in poverty. None of these groups, however, was having much success getting adult education or adult literacy education implemented in federal legislation. Finally, leverage to break the log jam came from the nation's military. In the summer of 1963, a task force on manpower conservation was established by the Department of Labor. The task force, led by Daniel Patrick Moynihan, set out to understand why so many young men were failing the military's standardized entrance screening exam, the Armed Forces Qualification Test (AFQT), and to recommend what might be done to alleviate this problem. 
The task force's report was delivered on January 1, 1964, to President Lyndon B. Johnson, who had taken office in November following the assassination of John F. Kennedy. The report revealed that one third of the young men called for military service did not meet the standards of health and education. It went on to recommend methods for using the AFQT to identify young adults with remediable problems and to provide them services, and it also recommended the enactment of new legislation that would provide additional education and training. In launching his "Great Society" programs in May 1964, Johnson argued that "The Great Society rests on abundance and liberty for all. It demands an end to poverty and racial injustice, to which we are totally committed in our time" By appealing to "abundance and liberty," Johnson captured the interest of those in Congress concerned with employment, productivity, and poverty as well as those concerned with national security. In August 1964, Johnson signed the Economic Opportunity Act into law. It contained within it Title IIB: the Adult Basic Education program. In 1966, adult educators lobbied to move the Adult Basic Education program to the U. S. Office of Education and to change the name to the Adult Education Act, broadening its applicability beyond basic education. Congress agreed, and, on November 3, 1966, Johnson signed an amendment to the Elementary and Secondary Education Act of 1965 that included Title III: The Adult Education Act of 1966. With the passing of the Adult Education Act, the seed from which the AELS would grow was finally planted. For 40 years, adults have used the AELS to help them find abundance and liberty from the bonds of poverty and underemployment for themselves and their families. For tens of millions of adults this hope has been fulfilled. [Note: Most of the foregoing is adapted from " The rise of the Adult Education and Literacy System in the United States: 1600-2000" by Thomas Sticht, in John Comings, Barbara Garner, and Cristine Smith (Eds.), The annual review of adult learning and literacy (volume 3, pages 10-43). San Francisco: Jossey-Bass, 2001. Thomas G. Sticht International Consultant in Adult Education El Cajon, California, USA From marie.cora at hotspurpartners.com Sun Oct 15 08:17:37 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Sun, 15 Oct 2006 08:17:37 -0400 Subject: [Assessment 523] ALE Wiki soon to be two years old Message-ID: <013101c6f053$e7588dd0$0302a8c0@LITNOW> Colleagues, Have you been introduced to the ALE Wiki? If so, when was the last time you visited? Have you contributed your valuable experience, knowledge, or wisdom there yet? Why not? Do you have comments, suggestions, or feedback for the "wikiteers"? We would love to hear from you. Projects are nothing without timely, specific, and productive assessment and evaluation by the people involved. As we continue to build the ALE Wiki, it's important for you in the field to let us know if you use it and how, and whether there are topics missing, or resources to add. Take a moment to read the post below from David Rosen, and find out about the ALE Wiki and where it is today! Marie Cora Assessment Discussion List Moderator ================ ALE Wiki soon to be two years old Colleagues, To improve practice in our field, teachers need to quickly and easily find the results of research and professional wisdom. This is a practical, everyday concern. 
A teacher has a question that needs an answer, such as "What are effective ways to increase student persistence?". "How do you handle a multilevel classroom?" "What is the optimum class size for beginning ESOL or basic literacy?" "What assessments are used in our field?" "Does my state offer free professional development or training?", "Does getting a GED lead to increased earnings?" or "How can I be an effective advocate for adult literacy?" Suppose there were one place to find answers to these questions, one place organized by topic -- and within each topic by teachers' questions -- and with lists of web-accessible research and professional wisdom sources. Suppose the topic area included some of the best discussions in the field. Suppose that this gold mine of professional development, designed to be accessed "just-in-time", were free. That's what the Adult Literacy Education Wiki is becoming. Some topics are nearly there, while others have just scratched the surface. Increasingly, it is becoming the "go to" place for teachers, researchers, administrators, and grant writers, both those new to the field and old hands. Launched in December, 2004, at the Meeting of the Minds I practitioner-researcher Symposium in Sacramento, California, it will have its second birthday this year at Meeting of the Minds II, November 30-December 2. The ALE Wiki now has 31 topics, 14 topic leaders, over 700 registered users -- 65 of whom have posted a brief bio statement, and nearly 800 pages of text. It was presented at an international conference on Wikis at Harvard this year. A chapter of a new book on communities of practice will be devoted to the ALE Wiki. It includes the work and the writing, or links to writing of many of the top people in our field from across the world. Not bad for a two-year-old, especially one that was created and raised entirely by volunteers. You can use the ALE Wiki. Check it out at: http://wiki.literacytent.org You can contribute to it -- it's easy! Go to: http://wiki.literacytent.org/index.php/New_Here%3F You might want to be a Topic Leader. http://wiki.literacytent.org/index.php/Topic_Leaders If so, e-mail me. And, of course, the volunteer "wikiteers" appreciate your comments. What is useful? What would you like to see added or changed? David J. Rosen djrosen at comcast.net From marie.cora at hotspurpartners.com Mon Oct 16 12:56:36 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 16 Oct 2006 12:56:36 -0400 Subject: [Assessment 524] Assessing On-Line Reading Message-ID: <01c901c6f144$0b098040$0302a8c0@LITNOW> Dear Colleagues, Last week, David Rosen posted the following set of questions accompanied by some information on the frequency with which people now read on-line (see posting "no subject" from Oct. 11; archives: http://www.nifl.gov/pipermail/assessment/2006/date.html) There have been a number of responses to this from the Technology Discussion List (http://www.nifl.gov/mailman/listinfo/Technology) - I wonder if folks out there have any comments, information, or further questions regarding this topic? What are your thoughts? 1. Have you assessed your students' web page reading skills? Has anyone assessed adult learners' web page reading skills? Is there such an assessment? 2. What impact do adult literacy programs have on students' access to or use of computers or the Internet? I have seen an unpublished study which found they have --- none -- and that makes me wonder why. Any ideas?
Are you aware of any studies of adult literacy programs' impact on students' access to or use of computers? 3. Are adult literacy programs helping students to use assistive technology -- for example, (free) text-to-speech web page reader software that would enable them to join the community of internet users even if they have difficulty reading text? If not, should this be a program responsibility? David J. Rosen djrosen at comcast.net Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061016/00a9b903/attachment.html From marie.cora at hotspurpartners.com Tue Oct 17 21:06:34 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 17 Oct 2006 21:06:34 -0400 Subject: [Assessment 525] Measuring Education Gains in Adult Literacy Message-ID: <02b601c6f251$a81ddd00$0302a8c0@LITNOW> Dear Subscribers, I hope this email finds you well. There has been an interesting discussion regarding this subject taking place over the past week on the NLA Discussion List. (http://lists.literacytent.org/mailman/listinfo/aaace-nla/) I have compiled the discussion in user-friendly format and I attach it here for your convenience - and for your comments, questions, and suggestions. I believe that when to test, how often, and what type of testing is one of our greatest concerns in Adult Literacy in terms of policy and practice. What do you think? I would like to hear your thoughts. Please take a little time to read through the discussion, and post here your ideas. Here are some questions related to the discussion to spark your interest: Do you think that the present system of testing annually is effective or not? Why? What are some of the consequences of an annual assessment cycle? What do you think would be a good timeframe for measuring learning gains of adult students? Why? Do you think a timeframe for measuring learning gains is affected by the level (beginning, intermediate, advanced, etc) of the adult student? Is it affected by the subject matter or content - for example, reading, math, writing, etc? If our system were based on multi-year funding, how would that affect other parts of the system like follow-up? What would be some suggestions for managing this? What are some of your suggestions for changing our present policy of testing on an annual basis? Do you believe that we could change policy on this issue? How could we do that? Thanks, and I'm very much looking forward to your replies. Marie Cora Assessment Discussion List Moderator Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061017/ea7fbafb/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... 
Name: Over What Period of Time Should We Measure Gains.doc Type: application/msword Size: 81920 bytes Desc: not available Url : http://www.nifl.gov/pipermail/assessment/attachments/20061017/ea7fbafb/attachment.doc From Mylinh.Nguyen at ed.gov Wed Oct 18 11:42:20 2006 From: Mylinh.Nguyen at ed.gov (Nguyen, My Linh) Date: Wed, 18 Oct 2006 11:42:20 -0400 Subject: [Assessment 526] CROSSPOSTED: Update on Discussion Lists Message-ID: Dear Discussion List subscribers, As the National Institute for Literacy begins a new fiscal year, we have taken steps to streamline the way we deliver professional development to our discussion list members. We would like to let you know about some changes ahead for some of the National Institute for Literacy Discussion Lists. The changes affect the Women and Literacy List; Poverty, Race, and Literacy List; Content Standards List; and Program Leadership and Improvement List. First, effective October 30, 2006, we will be closing the Content Standards and Program Leadership & Improvement lists. We have chosen to close these two lists because we recognize that many of the issues that impact Content Standards and Program Leadership & Improvement carry across all the subject areas of our other Discussion Lists. Second, effective November 6, 2006, we will be merging the Poverty, Race and Literacy List with the Women and Literacy List. We have chosen to combine the two lists because we recognize that many of the issues that impact one group also affect the other group, and believe that many of the topics discussed on one list can benefit the other list. Race and gender issues often intersect, and it is both practical and appropriate to have them intersect on one combined list. We will continue to provide access to discussion archives on our website. Thank you for your active participation in the Institute's lists. We invite you to explore all of our lists to help you further your own professional development. The Institute's lists include: Adult Literacy Professional Development Assessment Adult English Language Learners Family Literacy Focus on Basics Health & Literacy Learning Disabilities Poverty, Race, Women and Literacy Special Topics Technology & Literacy Workplace Literacy Descriptions and instructions on how to register for the Institute's Discussion Lists can be found at http://www.nifl.gov/lincs/discussions/discussions.html My Linh Nguyen Associate Director of Communications National Institute for Literacy (202) 233-2041 fax (202) 233-2050 mnguyen at nifl.gov From tarv at chemeketa.edu Wed Oct 18 13:16:29 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Wed, 18 Oct 2006 10:16:29 -0700 Subject: [Assessment 527] Re: Measuring Education Gains in Adult Literacy In-Reply-To: <02b601c6f251$a81ddd00$0302a8c0@LITNOW> Message-ID: Do you think that the present system of testing annually is effective or not? Why? I think testing should be done on a more project-based model. I wouldn't assess my field workers on a stream monitoring team once a year; I'd assess them on their project, and it would be daily, weekly, monthly, or every 6 weeks. Assessment should be linked to the reason for it. If one is assessing to build a program for a student, it should be tried out for a bit and then checked to see if it is working. Twice a year doesn't seem to be enough, but many of our classes meet only 2 hours a week or 4, so common sense has to come into play also. One doesn't want to spend what little class time one has doing assessments.
Students are only interested in assessment if it tells them something useful: instructors should listen to students in this matter. We need to figure out how to use assessments to inform instruction instead of running two side-by-side systems. If I assessed my water monitoring crew once a year, an entire year of data would be invalid and all that time would be wasted. I think instructors need to have their fingers on the heartbeat of learning, and assessment needs to be the fuel for that learning. What are some of the consequences of an annual assessment cycle? Too little, too late. What do you think would be a good timeframe for measuring learning gains of adult students? Why? A systematic way to capture student learning gains. Time is not the important factor here. Time is an arbitrary factor introduced due to the funding cycle. Do you think a timeframe for measuring learning gains is affected by the level (beginning, intermediate, advanced, etc) of the adult student? Is it affected by the subject matter or content - for example, reading, math, writing, etc? It makes sense to assess based upon the subject matter covered, discussed, etc. If our system were based on multi-year funding, how would that affect other parts of the system like follow-up? What would be some suggestions for managing this? Our students ebb and flow in their lives; they often complete their GED in times when they are not in our classes. This means that, while we have been the instructional inspiration, we often get no credit for their completion. This means that an accountability system that was larger than a year would be useful: 1) for graduation rates and 2) for tracking students who take longer than a year to complete or to make gains. What are some of your suggestions for changing our present policy of testing on an annual basis? Do you believe that we could change policy on this issue? How could we do that? Institutions have ways to track students through time even if the system or policy does not address this. I see no particular need to change the policy. ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, October 17, 2006 6:07 PM To: Assessment at nifl.gov Subject: [Assessment 525] Measuring Education Gains in Adult Literacy Dear Subscribers, I hope this email finds you well. There has been an interesting discussion regarding this subject taking place over the past week on the NLA Discussion List. (http://lists.literacytent.org/mailman/listinfo/aaace-nla/) I have compiled the discussion in user-friendly format and I attach it here for your convenience - and for your comments, questions, and suggestions. I believe that when to test, how often, and what type of testing is one of our greatest concerns in Adult Literacy in terms of policy and practice. What do you think? I would like to hear your thoughts. Please take a little time to read through the discussion, and post here your ideas. Here are some questions related to the discussion to spark your interest: Do you think that the present system of testing annually is effective or not? Why? What are some of the consequences of an annual assessment cycle? What do you think would be a good timeframe for measuring learning gains of adult students? Why? Do you think a timeframe for measuring learning gains is affected by the level (beginning, intermediate, advanced, etc) of the adult student? Is it affected by the subject matter or content - for example, reading, math, writing, etc?
If our system were based on multi-year funding, how would that affect other parts of the system like follow-up? What would be some suggestions for managing this? What are some of your suggestions for changing our present policy of testing on an annual basis? Do you believe that we could change policy on this issue? How could we do that? Thanks, and I'm very much looking forward to your replies. Marie Cora Assessment Discussion List Moderator Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061018/f1636051/attachment.html From marie.cora at hotspurpartners.com Thu Oct 19 10:22:59 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 19 Oct 2006 10:22:59 -0400 Subject: [Assessment 528] Re: Measuring Education Gains in Adult Literacy In-Reply-To: Message-ID: <03dd01c6f38a$14fb5590$0302a8c0@LITNOW> Hi Virginia, Thank you so much for your thoughts on this. Regarding the GED issue - if you track the students via data match, then I believe a program can get credit for that achievement even if the student has left your program - up to 12 months upon exiting your program. That's the way it's done here in Massachusetts. Others? What are you thoughts on this? Do others use data match for some of the goals achievement? Also, I'm a bit surprised by your comment that you see no need to change policy - but you note that "Time is an arbitrary introduced due to funding cycle." Wouldn't it be a good thing if we did change this issue of time so that it addressed the learning process instead of the funding cycle? Marie Cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Virginia Tardaewether Sent: Wednesday, October 18, 2006 1:16 PM To: The Assessment Discussion List Subject: [Assessment 527] Re: Measuring Education Gains in Adult Literacy Do you think that the present system of testing annually is effective or not? Why? I think testing should be done on a more project based model. I wouldn't assess my field workers, on stream monitoring team once a year; I'd assess them on their project and it would be daily, weekly, monthly or every 6 weeks. Assessment should be linked to the reason for it. If one is assessing to build a program for a student, it should be tried out for a bit then checked to see if it is working. Twice a year doesn't seem to be enough, but many of our classes meet only 2 hours a week or 4, so common sense has to come into play also. One doesn't want to spend what little class time one has doing assessments. Students are only interested in assessment if it tells them something useful: instructors should listen to students in this matter. We need to figure out how to use assessments to inform instruction instead of running two side by side systems. If I assessed my water monitoring crew once a year, an entire year of data would be invalid and all that time would be wasted. I think instructors need to have their fingers on the heartbeat of learning and assessment need to be the fuel for that learning. What are some of the consequences of an annual assessment cycle? 
Too little too late What do you think would be a good timeframe for measuring learning gains of adult students? Why? A systematic way to capture student learning gains. Time is not the important factor here. Time is an arbitrary introduced due to funding cycle. Do you think a timeframe for measuring learning gains is affected by the level (beginning, intermediate, advanced, etc) of the adult student? Is it affected by the subject matter or content - for example, reading, math, writing, etc? It makes sense to assess based upon the subject matter covered, discussed, etc. If our system were based on multi-year funding, how would that affect other parts of the system like follow-up? What would be some suggestions for managing this? Our students ebb and flow in their lives; they often complete their GED in times when they are not in our classes. This means that, while we have been the instructional inspiration, we often get no credit for their completion. This means that an accountability system that was larger than a year would be useful: 1) for graduation rates and 2) for tracking students who take longer than a year to complete or to make gains. What are some of your suggestions for changing our present policy of testing on an annual basis? Do you believe that we could change policy on this issue? How could we do that? Institutions have ways to track students through time even if the system or policy does not address this. I see no particular need to change the policy. _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Tuesday, October 17, 2006 6:07 PM To: Assessment at nifl.gov Subject: [Assessment 525] Measuring Education Gains in Adult Literacy Dear Subscribers, I hope this email finds you well. There has been an interesting discussion regarding this subject taking place over the past week on the NLA Discussion List. (http://lists.literacytent.org/mailman/listinfo/aaace-nla/) I have compiled the discussion in user-friendly format and I attach it here for your convenience - and for your comments, questions, and suggestions. I believe that when to test, how often, and what type of testing is one of our greatest concerns in Adult Literacy in terms of policy and practice. What do you think? I would like to hear your thoughts. Please take a little time to read through the discussion, and post here your ideas. Here are some questions related to the discussion to spark your interest: Do you think that the present system of testing annually is effective or not? Why? What are some of the consequences of an annual assessment cycle? What do you think would be a good timeframe for measuring learning gains of adult students? Why? Do you think a timeframe for measuring learning gains is affected by the level (beginning, intermediate, advanced, etc) of the adult student? Is it affected by the subject matter or content - for example, reading, math, writing, etc? If our system were based on multi-year funding, how would that affect other parts of the system like follow-up? What would be some suggestions for managing this? What are some of your suggestions for changing our present policy of testing on an annual basis? Do you believe that we could change policy on this issue? How could we do that? Thanks, and I'm very much looking forward to your replies. 
Marie Cora Assessment Discussion List Moderator Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061019/0c74807b/attachment.html From marie.cora at hotspurpartners.com Mon Oct 23 10:41:51 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Mon, 23 Oct 2006 10:41:51 -0400 Subject: [Assessment 529] peep? Message-ID: <059901c6f6b1$612421c0$0302a8c0@LITNOW> Dear colleagues, Are you out there? Is this a bad time for a discussion? Is the topic not of interest? Aside from Virginia's post last week, I haven't heard from any of the 550 subscribers to the List. I'm assuming that the topic (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of interest. But you need to let me know if it is or not. I generally gauge interest based on Subscriber responses, so if this is an indication, the topic is not hot (which surprises me). Are there other topics you'd prefer to engage in? What are they? The purpose of this List is to provide a forum for discussion that goes well beyond your program's walls, so if I'm not hitting on the right stuff, I really do need to hear from you. Membership here continues to climb, but you are all very silent. I really want to hear your thoughts and I want to know how this List can serve you well. Please let me know. Feel free to respond to the discussion on Measuring Gains, but also, feel free to start your own discussion topic, or feel free to send your thoughts to the List or to me personally regarding other discussions that you would like to see happen - and I'll make them happen. I value and appreciate your membership highly - but a Discussion List is only as good as the discussions that occur. If you've never posted and that makes you a bit reticent, feel free to send me your post and I can do a couple of things like help you compose your message, or I can post your message for you anonymously. The important thing is for your voices to be heard. Thanks! marie Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061023/003ad445/attachment.html From mjjerdems at yahoo.com Mon Oct 23 11:10:22 2006 From: mjjerdems at yahoo.com (Mary Jane Jerde) Date: Mon, 23 Oct 2006 08:10:22 -0700 (PDT) Subject: [Assessment 530] Re: peep? In-Reply-To: <059901c6f6b1$612421c0$0302a8c0@LITNOW> Message-ID: <20061023151022.21479.qmail@web54004.mail.yahoo.com> Summer is a much better time for me to respond than the middle of the fall. Sorry, Mary Jane Jerde Marie Cora wrote: Dear colleagues, Are you out there?
Is this a bad time for a discussion? Is the topic not of interest? Aside from Virginia's post last week, I haven't heard from any of the 550 subscribers to the List. I'm assuming that the topic (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of interest. But you need to let me know if it is or not. I generally gauge interest based on Subscriber responses, so if this is an indication, the topic is not hot (which surprises me). Are there other topics you'd prefer to engage in? What are they? The purpose of this List is to provide a forum for discussion that goes well beyond your program's walls, so if I'm not hitting on the right stuff, I really do need to hear from you. Membership here continues to climb, but you are all very silent. I really want to hear your thoughts and I want to know how this List can serve you well. Please let me know. Feel free to respond to the discussion on Measuring Gains, but also, feel free to start your own discussion topic, or feel free to send your thoughts to the List or to me personally regarding other discussions that you would like to see happen - and I'll make them happen. I value and appreciate your membership highly - but a Discussion List is only as good as the discussions that occur. If you've never posted and that makes you a bit reticent, feel free to send me your post and I can do a couple of things like help you compose your message, or I can post your message for you anonymously. The important thing is for your voices to be heard. Thanks! marie Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment --------------------------------- All-new Yahoo! Mail - Fire up a more powerful email and get things done faster. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061023/b7c809ca/attachment.html From Karen.Limkemann at fwliteracyalliance.org Mon Oct 23 11:33:49 2006 From: Karen.Limkemann at fwliteracyalliance.org (Limkemann, Karen) Date: Mon, 23 Oct 2006 11:33:49 -0400 Subject: [Assessment 531] Re: peep? Message-ID: Hi Marie, We're still here.... At least I am. My program is funded in part by the Department of Ed for the State of Indiana so we have to follow the state testing policy, requiring TABE or CASAS, testing after 40-80 hours or at teacher discretion etc. I am in favor of more not less testing. Most of our students come in thinking they are going after the GED. Periodic retesting is not only good for information but also for testing practice for the student. We test at least three times a year. Karen Ft. Wayne, IN ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, October 23, 2006 11:01 AM To: Assessment at nifl.gov Subject: [Assessment 529] peep? Dear colleagues, Are you out there? Is this a bad time for a discussion? Is the topic not of interest? Aside from Virginia's post last week, I haven't heard from any of the 550 subscribers to the List. I'm assuming that the topic (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of interest.
But you need to let me know if it is or not. I generally gauge interest based on Subscriber responses, so if this is an indication, the topic is not hot (which surprises me). Are there other topics you'd prefer to engage in? What are they? The purpose of this List is to provide a forum for discussion that goes well beyond your program's walls, so if I'm not hitting on the right stuff, I really do need to hear from you. Membership here continues to climb, but you are all very silent. I really want to hear your thoughts and I want to know how this List can serve you well. Please let me know. Feel free to respond to the discussion on Measuring Gains, but also, feel free to start your own discussion topic, or feel free to send your thoughts to the List or to me personally regarding other discussions that you would like to see happen - and I'll make them happen. I value and appreciate your membership highly - but a Discussion List is only as good as the discussions that occur. If you've never posted and that makes you a bit reticent, feel free to send me your post and I can do a couple of things like help you compose your message, or I can post your message for you anonymously. The important thing is for your voices to be heard. Thanks! marie Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061023/5861cc15/attachment.html From mmhefner at charter.net Mon Oct 23 15:30:36 2006 From: mmhefner at charter.net (Melinda Hefner) Date: Mon, 23 Oct 2006 15:30:36 -0400 Subject: [Assessment 532] Re: peep? In-Reply-To: Message-ID: <012f01c6f6d9$b7745b30$74349e18@Melinda> I'm definitely interested in the subject! I simply have been too busy to actively participate. Melinda _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Limkemann, Karen Sent: Monday, October 23, 2006 11:34 AM To: The Assessment Discussion List Subject: [Assessment 531] Re: peep? Hi Marie, We're still here.. At least I am. My program is funded in part by the Department of Ed for the State of Indiana so we have to follow the state testing policy, requiring TABE or CASAS, testing after 40-80 hours or at teacher discretion etc. I am in favor of more not less testing. Most of our students come in thinking they are going after the GED. Periodic retesting is not only good for information but also for testing practice for the student. We test at least three times a year. Karen Ft. Wayne, IN _____ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Marie Cora Sent: Monday, October 23, 2006 11:01 AM To: Assessment at nifl.gov Subject: [Assessment 529] peep? Dear colleagues, Are you out there? Is this a bad time for a discussion? Is the topic not of interest? Aside from Virginia's post last week, I haven't heard from any of the 550 subscribers to the List. I'm assuming that the topic (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of interest. But you need to let me know if it is or not. I generally gauge interest based on Subscriber responses, so if this is an indication, the topic is not hot (which surprises me). Are there other topics you'd prefer to engage in? What are they? 
The purpose of this List is to provide a forum for discussion that goes well beyond your program's walls, so if I'm not hitting on the right stuff, I really do need to hear from you. Membership here continues to climb, but you are all very silent. I really want to hear your thoughts and I want to know how this List can serve you well. Please let me know. Feel free to respond to the discussion on Measuring Gains, but also, feel free to start your own discussion topic, or feel free to send your thoughts to the List or to me personally regarding other discussions that you would like to see happen - and I'll make them happen. I value and appreciate your membership highly - but a Discussion List is only as good as the discussions that occur. If you've never posted and that makes you a bit reticent, feel free to send me your post and I can do a couple of things like help you compose your message, or I can post your message for you anonymously. The important thing is for your voices to be heard. Thanks! marie Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061023/daf93a5a/attachment.html From sfallsliteracy at yahoo.com Mon Oct 23 15:56:32 2006 From: sfallsliteracy at yahoo.com (Nancy Hansen) Date: Mon, 23 Oct 2006 12:56:32 -0700 (PDT) Subject: [Assessment 533] Re: peep? In-Reply-To: <059901c6f6b1$612421c0$0302a8c0@LITNOW> Message-ID: <20061023195632.99404.qmail@web34707.mail.mud.yahoo.com> Marie - Without a peep .... I have been lurking ... every once in a while but not regularly on this thread. Part of the reason I haven't replied is the emails that were posted were so loooong. I felt it would require research to read thoroughly and respond, so I didn't. And I'm as always busy with many projects as the only full-time paid staff. It's not that I am not interested personally, but via my scanning the posts I feel The Movement is not taking into consideration one (of a couple) very important factors: Some adult learners cannot commit to the kind of time that many of you speak about. That means their time with their study is also very precious even though elongated. If testing takes away that time, it would be resented. Ours is an adult literacy program driven by volunteer instructors. The focus of the materials includes periodic check-ups built into the lessons. (Note: Not called Tests.) However, I sense that by your colleagues' standards my program would be deemed ineffective. You know .... the learners aren't gaining a grade level every year. Quite frankly the learners don't *care* about that form of measurement. So I lurk. I feel, No. 2, The System places too *little* importance on what it is that the adult learner has brought with them as goals in their need to read, write and spell better. It cannot be measured in many cases ... except, perhaps, in smiles, self-confidence and improved worth. *That* our learners *do* treasure! How do members of the adult education system intend those skill development factors to be measured? Learner Portfolios are part of *our* system, yet unacceptable to the NRS. It used to be that the check-up scores counted. But no longer. Until the answers are clear, this agency director will remain on the perimeters of the assessment discussion ... 
and *consequently* the agency will continue without funding that is tied to a grade level increase requirement. The kicker is: The learners *like* what they are receiving and that matters more. At least to me. Nancy Hansen Executive Director Sioux Falls Area Literacy Council Sioux Falls, SD -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061023/4e0aba94/attachment.html From mjjerdems at yahoo.com Mon Oct 23 18:49:56 2006 From: mjjerdems at yahoo.com (Mary Jane Jerde) Date: Mon, 23 Oct 2006 15:49:56 -0700 (PDT) Subject: [Assessment 534] Re: peep? In-Reply-To: <20061023195632.99404.qmail@web34707.mail.mud.yahoo.com> Message-ID: <20061023224956.48480.qmail@web54003.mail.yahoo.com> Hi, The testing that is required for government funding tends to fly in the face of some serious principles for assessment: never depend on one tool, don't allow the students to become familiar with a specific test form, use a variety of testing methods, numbers do not give a full, realistic assessment. CASAS does do several things fairly well: easy training, easy scoring, easy make up. How many students know Form 54 by sheer repetition? It's always a shock when they hit Form 56. I use BEST Plus with CASAS.
It's not perfect either, but I have the "luxury" of a trained assessor willing to come on site to give it. Between the two, I'm getting a better picture and more options for reporting. Mary Jane Jerde Howard Community College Columbia, MD
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061023/d59f054a/attachment.html From donnaedp at cox.net Tue Oct 24 11:40:53 2006 From: donnaedp at cox.net (Donna Chambers) Date: Tue, 24 Oct 2006 11:40:53 -0400 Subject: [Assessment 535] Re: peep? References: <20061023224956.48480.qmail@web54003.mail.yahoo.com> Message-ID: <006a01c6f782$caf830b0$edd5ac46@DH89L251> Wow! Like others, I lose track of the discussion because I am just too busy to keep up. There is so much work to be done in thinking through this issue, but in the meantime, we must keep up with our duties in the classroom and keep our programs running. Now I just heard on the news that sitting at a computer for hours at a time can be addictive and may require medical and psychiatric treatment. Here is my quick "peep" because I don't have the time for another addiction. I agree with both Nancy and Mary Jane. The testing requirement for government funding is not enough and sometimes not appropriate. I have spent my career working with competency-based assessment and/or authentic assessment, and so that is what I inherently use to see if a learner understands something. By testing this way, I am also challenging the thinking skills of the learner, which I believe is critical in the process for the adult. Lately I have been doing a bit of informal experimentation. Here is what I find more times than not. The learner knows the skill because it was demonstrated to me when asked to solve the problem verbally and explain the solution. Yet the learner got the item wrong on the paper-and-pencil, multiple-choice test. The learner demonstrates an ability to do a problem and apply the skill in several examples and then gets the same type of problem wrong when asked to do it on a test.
Pre-test scores do not always correlate with what I believe a learner knows, and yet we are asked to place such importance on them. Sometimes scores even go down between pre- and post-testing. I do believe this begs the question, "What is happening here?" It is definitely worth considering more varied assessment methods. We are working with adults who may be test-anxious, and certainly language plays a huge role in being able to answer the question correctly. This is not to say that the tests we use are not valid, but getting a question right or wrong depends on more circumstances than knowledge or skill, especially for adults. How do we know when they know it, will be able to retain it, and can apply it again in other circumstances? This is more than a "peep," but I feel this topic is critical to our programs and the whole "accountability/assessment" issue in education today. Donna Chambers
-------------- next part -------------- An HTML attachment was scrubbed...
URL: http://www.nifl.gov/pipermail/assessment/attachments/20061024/93c5a5e1/attachment.html From khinson at future-gate.com Tue Oct 24 12:36:08 2006 From: khinson at future-gate.com (Katrina Hinson) Date: Tue, 24 Oct 2006 17:36:08 +0100 Subject: [Assessment 536] Re: peep? Message-ID: <453E5D18020000A000003998@fgwiel01a.wie.de.future-gate.com> I'm in the same boat as Donna below. Normally I try to keep up with the discussions, but I've been silent on the lists I'm on lately simply because I wear way too many hats at the moment and it's difficult to keep up with the sheer volume of emails generated sometimes. Likewise, I have my own 'peep' to add to the discussion: We use TABE (9 & 10) at the moment for ABE/GED students and CASAS for ESL, Family Literacy, and Comp. Ed. One problem occurs when students move between programs. Students tested via CASAS are not given a math component - EVER - just Reading and Listening, and those in Family Literacy are only given Reading. They may or may not ever have a math placement score. Additionally, the program I'm in places great emphasis on these scores due to the nature of the funding - so much so that paperwork now has to go through multiple hands to ensure that it's all correctly filled in and that the numbers are all accurate. Another area where there is a problem is the move from those students tested on TABE 8 to TABE 9. We saw a dramatic decline in test scores and were left asking if it was accurate. Why the huge drop? Had these students regressed? Had the test not been administered properly beforehand? Had they memorized the test (which I do agree is a major issue with standardized testing)? There were no easy answers, and we're still seeing scores all over the place sometimes. Instructors here and administrators are very much tied to the "tests" as if that is the only measure. I sometimes feel like a lone voice saying "Yes, BUT..." at a lot of meetings...or trying to explain that the data isn't always a valid reflection of student ability. I'm always met with the same response - the tests need to match student ability to ensure funding. It's a catch 22 for instructors. We're caught in a loop of having to meet performance measures that may or may not truly reflect student ability, yet the work a student does that shows his or her ability is often ignored. Regards, Katrina Hinson
From templege at hotmail.com Tue Oct 24 13:56:06 2006 From: templege at hotmail.com (Grace Temple) Date: Tue, 24 Oct 2006 13:56:06 -0400 Subject: [Assessment 537] Re: peep? In-Reply-To: <20061023195632.99404.qmail@web34707.mail.mud.yahoo.com> Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061024/cf2bc4b1/attachment.html From marie.cora at hotspurpartners.com Tue Oct 24 15:30:08 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Tue, 24 Oct 2006 15:30:08 -0400 Subject: [Assessment 538] Re: Measuring Outside the NRS In-Reply-To: Message-ID: <070a01c6f7a2$d18980d0$0302a8c0@LITNOW> Dear Grace and Nancy - (First a big thank you to everyone who has jumped in - I knew that this was important to you all. As your busy time permits, keep on posting!) Thanks so much for your perspectives. I'm sure that you are not the only ones whose programs conduct assessment outside of the NRS requirements. You both have hit on a number of extremely important issues. If you do not receive federal dollars, then you do not have to adhere to NRS requirements - what types of assessments and tools do you use if this is your case? What sort of schedule or structure do you use for intake, pre-assessment, on-going assessment, post-assessment? Who does the assessing? Do you gauge non-academic skills? If so, how? Do you set up benchmarks such as gaining grade levels, or is this not a part of the program? How about timeframes? Is that not as important because you are not bound by the federal system? Nancy - you said: "I sense that by your colleagues' standards my program would be deemed ineffective." I have many questions: who is to say what is effective if your students stay and keep learning, and improve their skills? How about Nancy's question, Subscribers: is this true? Do you feel that Grace and Nancy's programs are less effective because of their focus and structure? Or do you feel a bit envious that their programs are not bound by the strict requirements of NRS-tied funding? If your program receives federal dollars, do you feel that you can really truly focus on the student's goal, or do you feel like you need to figure their goal out for them (with the best of intentions) in order to make sure you can show learning gains across your program and keep your funding? Katrina lamented about this in her post, I think. Do you feel that helping the student work on her goal and following the federal requirements are at odds, or can they work together? Let's hear from others on the List who are in a similar position to Grace and Nancy. If you are not tied to using the tools, structures, and timeframes required by federal funding, then what does your program look like - and how do you feel about that? And for those folks who do receive federal funding, what are your thoughts and questions for folks whose programs work outside of the NRS requirements? Thanks! Marie Cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Grace Temple Sent: Tuesday, October 24, 2006 1:56 PM To: assessment at nifl.gov Subject: [Assessment 537] Re: peep? A great big AMEN!
goes out to Nancy. This echoes my position exactly. I, too, am a director of our county Literacy Council. I read all the posts but respond to only a few, as most don't seem to connect to our program. All post-assessment is done by the tutors (volunteers), who have been trained to use the SLOSSON lists. This only gives us a ballpark figure, but it is the student's satisfaction that counts the most. Do they feel they have gained? If they feel better about their ability to learn, think they can handle everyday reading better, and are willing to continue, then our tutors have done well. Grace Temple Executive Director Sanilac Literacy Council Sanilac County, MI
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061024/b9575be8/attachment.html From donnaedp at cox.net Tue Oct 24 16:31:15 2006 From: donnaedp at cox.net (Donna Chambers) Date: Tue, 24 Oct 2006 16:31:15 -0400 Subject: [Assessment 539] Re: peep? References: <453E5D18020000A000003998@fgwiel01a.wie.de.future-gate.com> Message-ID: <002e01c6f7ab$5ace84f0$edd5ac46@DH89L251> Katrina, You are not alone and it is a catch 22. I know that the test/data doesn't often show the accurate picture. However, in reality, we are forced to deal with standardized tests as the main measurement tool. Truth be told, adult literacy learners are often not good traditional testers. What can we do? Donna Chambers
From Karen.Limkemann at fwliteracyalliance.org Tue Oct 24 17:10:46 2006 From: Karen.Limkemann at fwliteracyalliance.org (Limkemann, Karen) Date: Tue, 24 Oct 2006 17:10:46 -0400 Subject: [Assessment 540] Re: peep? Message-ID: I feel some sort of testing is essential since we are in the business of education. Sometimes students develop an attachment to a teacher or tutor that allows for a comfortable feeling that they will perceive as a "benefit" from the service. Those students might actually be in need of mental health services or case management of some type that could be better provided by another agency. Just a thought... Karen Ft. Wayne, IN
> > Instructors here and administrators are very much tied to the "tests" as > if that is the only measure. I sometimes feel like a lone voice saying > "Yes, BUT..." at a lot of meetings...or trying to explain that the data > isn't always a valid reflection of student ability. I'm always met with > the same response - the tests need to match student ability to ensure > funding. > > It's a catch 22 for instructors. We're caught in a loop of having to meet > performance measures that may or may not truly reflect student ability yet > the work a student does that shows his or her ability is often ignored. > > Regards, > Katrina Hinson > >>>> "Donna Chambers" 10/24/06 8:40 AM >>> > Wow! Like others, I lose track of the discussion because I am just too > busy to keep up. There is so much work to be done in thinking through this > issue, but in the meantime, we must keep up with our duties in the > classroom and running our programs . Now I just heard on the news that > sitting at a computer for hours at a time can be addictive and may > require medical and psychiatric treatment Here is my quick "peep" > because I don't have the time for another addiction. > > I agree with both Nancy and Mary Jane. The testing requirement for > government funding is not enough and sometimes not appropriate. I have > spent my career working with competency based assessment and/or authentic > assessment and so that is what I inherently use to see if a learner > understands something. By testing this way, I am also challenging the > thinking skills of the learner which I believe is critical in the process > for the adult. > > Lately I have been doing a bit of informal experimentation. Here is what > I find more times than not. The learner knows the skill because it was > demonstrated to me when asked to verbally to solve the problem and > explain the solution. Yet the learner got the item wrong on the paper > and pencil, multiple choice test. The learner demonstrates an ability to > do a problem and apply the skill in several examples and then gets the > same type of problem wrong when asked to do it on a test . Pre test > scores do not always correlate with what I believe a learner knows and yet > such we are asked to place such importance on them. Sometimes even scores > go down between pre and post testing. I do believe this begs the question, > "What is happening here?" It is definitely worth considering more varied > assessment methods. We are working with adults that may be test anxious > and certainly language plays a huge role in being able to answer the > question correctly. This is not to say that th! > e tests we use are not valid, but getting a question right more > circumstances than knowledge or skill, especially for adults. How do we > know when they know it, be able to retain it, and to apply it again in > other circumstances? " > > This is more than a "peep" ,but I feel this topic is critical to our > programs and the whole "accountability/assessment" issue in education > today. > > Donna Chambers > > ----- Original Message ----- > From: Mary Jane Jerde > To: sfallsliteracy at yahoo.com ; The Assessment Discussion List > Sent: Monday, October 23, 2006 6:49 PM > Subject: [Assessment 534] Re: peep? 
> > > Hi, > > The testing that is required for government funding tends to fly in the > face of some serious principles for assessment: never depend on one tool, > don't allow the students to become familiar with a specific test form, use > a variety of testing methods, numbers do not give a full, realistic > assessment. CASAS does do several things fairly well: easy training, easy > scoring, easy make up. How many students know Form 54 by sheer repetition? > It's always a shock when they hit Form 56. > > I use BEST Plus with CASAS. It's not perfect either, but I have the > "luxury" of a trained assessor willing to come on site to give it. Between > the two, I'm getting a better picture and more options for reporting. > > Mary Jane Jerde > Howard Community College > Columbia, MD > > > > Nancy Hansen wrote: > Marie - > > Without a peep .... I have been lurking ... every once in a while but > not regularly on this thread. Part of the reason I haven't replied is the > emails that were posted were so loooong. I felt it would require research > to read thoroughly and respond, so I didn't. And I'm as always busy with > many projects as the only full-time paid staff. > > It's not that I am not interested personally, but via my scanning the > posts I feel The Movement is not taking into consideration one (of a > couple) very important factors: Some adult learners cannot commit to the > kind of time that many of you speak about. That means their time with > their study is also very precious even though elongated. If testing takes > away that time, it would be resented. > > Ours is an adult literacy program driven by volunteer instructors. The > focus of the materials includes periodic check-ups built into the lessons. > (Note: Not called Tests.) However, I sense that by your colleagues' > standards my program would be deemed ineffective. You know .... the > learners aren't gaining a grade level every year. Quite frankly the > learners don't *care* about that form of measurement. > > So I lurk. I feel, No. 2, The System places too *little* importance on > what it is that the adult learner has brought with them as goals in their > need to read, write and spell better. It cannot be measured in many cases > ... except, perhaps, in smiles, self-confidence and improved worth. > *That* our learners *do* treasure! How do members of the adult education > system intend those skill development factors to be measured? Learner > Portfolios are part of *our* system, yet unacceptable to the NRS. It used > to be that the check-up scores counted. But no longer. > > Until the answers are clear, this agency director will remain on the > perimeters of the assessment discussion ... and *consequently* the agency > will continue without funding that is tied to a grade level increase > requirement. The kicker is: The learners *like* what they are receiving > and that matters more. At least to me. > > Nancy Hansen > Executive Director > Sioux Falls Area Literacy Council > Sioux Falls, SD > > Marie Cora wrote: > Dear colleagues, > > Are you out there? Is this a bad time for a discussion? Is the > topic not of interest? > > Aside from Virginia??????s post last week, I haven??????t heard from > any of the 550 subscribers to the List. I??????m assuming that the topic > (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of > interest. But you need to let me know if it is or not. I generally gauge > interest based on Subscriber responsese topic is not hot (which surprises > me). 
> > Are there other topics you'd prefer to engage in? What are they? The purpose of this List is to provide a forum for discussion that goes well beyond your program's walls, so if I'm not hitting on the right stuff, I really do need to hear from you. Membership here continues to climb, but you are all very silent. > > I really want to hear your thoughts and I want to know how this List can serve you well. Please let me know. Feel free to respond to the discussion on Measuring Gains, but also feel free to start your own discussion topic, or to send your thoughts to the List or to me personally regarding other discussions that you would like to see happen - and I'll make them happen. > > I value and appreciate your membership highly - but a Discussion List is only as good as the discussions that occur. If you've never posted and that makes you a bit reticent, feel free to send me your post and I can do a couple of things, like help you compose your message, or I can post your message for you anonymously. The important thing is for your voices to be heard. > > Thanks! > > marie > > Marie Cora > marie.cora at hotspurpartners.com > NIFL Assessment Discussion List Moderator > http://www.nifl.gov/mailman/listinfo/assessment > Coordinator, LINCS Assessment Special Collection > http://literacy.kent.edu/Midwest/assessment/ > ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From baera at floridaliteracy.org Tue Oct 24 16:03:48 2006 From: baera at floridaliteracy.org (Alyssa Baer) Date: Tue, 24 Oct 2006 16:03:48 -0400 Subject: [Assessment 541] 2007 Florida Literacy Conference Message-ID: <002e01c6f7a7$858f06a0$1e02a8c0@floridaliteracy.org> Please join us for the 2007 Florida Literacy Conference! 
Dates: May 2-4, 2007, with a May 1, 2007 pre-conference Location: Orlando Marriott, Lake Mary, Florida One of Florida's premier literacy events, this three-day annual conference offers a wide range of training and networking opportunities to literacy practitioners and volunteers. Full Conference Early Bird: postmarked by March 9: Member $170 / Non-Member $195 Full Conference: postmarked by April 13: Member $195 / Non-Member $220 Full Conference: on-site, after April 13: $235 Full Conference: Adult Learner: $70 (no fee for adult learners attending May 2nd only) Call for Presenters! To help make the 2007 Conference a success, the Florida Literacy Coalition seeks session proposals from throughout the state and nation in the following topic areas: Adult Learner, Corrections, Family Literacy, English Literacy, Learning Disabilities, Library Literacy, Program Management, Reading, Technology, Volunteers in Literacy, and Workforce Education (ABE, GED and adult high school). We welcome your participation and encourage you to propose a session by December 13 by downloading the Call for Presenters form from www.floridaliteracy.org. On behalf of the Florida Literacy Coalition, thank you, and we hope to see you in May. Alyssa Baer, AmeriCorps*VISTA Member Florida Literacy Coalition baera at floridaliteracy.org Telephone: 407.246.7110 extension 207 Facsimile: 407.246.7104 934 North Magnolia Avenue, Suite 104 Orlando, Florida 32803 www.floridaliteracy.org From Tina_Luffman at yc.edu Tue Oct 24 21:04:15 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Tue, 24 Oct 2006 18:04:15 -0700 Subject: [Assessment 542] Re: peep? Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061024/c99a3fd3/attachment.html From pammenjk at haslett.k12.mi.us Wed Oct 25 09:56:42 2006 From: pammenjk at haslett.k12.mi.us (JO PAMMENT) Date: Wed, 25 Oct 2006 09:56:42 -0400 Subject: [Assessment 543] Re: Measuring Outside the NRS Message-ID: <453F34DA0200009E000037E9@10.1.0.15> Hello Marie, I have been following the assessment discussion. In Michigan we are mandated to use CASAS and TABE 9 and 10, and we must test every 90 hours of student attendance. These tests have not met all of our needs, however. I am interested in learning about the TABE ESL test that is coming out. Has it been approved by the Federal government yet? Has anyone seen or tried it yet? Jo Pamment Jo Pamment Director Adult Ed. ESL Haslett Public Schools 1118 S. Harrison East Lansing, Michigan 48823 TEL: 517 337-8353 FAX: 517 337-3195 E-Mail: pammenjk at haslett.k12.mi.us >>> "Marie Cora" 10/24/06 3:30 PM >>> Dear Grace and Nancy - (First, a big thank you to everyone who has jumped in - I knew that this was important to you all. As your busy time permits, keep on posting!) Thanks so much for your perspectives. I'm sure that you are not the only ones whose programs conduct assessment outside of the NRS requirements. You both have hit on a number of extremely important issues. If you do not receive federal dollars, then you do not have to adhere to NRS requirements - what types of assessments and tools do you use if this is your case? What sort of schedule or structure do you use for intake, pre-assessment, on-going assessment, post-assessment? Who does the assessing? Do you gauge non-academic skills? If so, how? 
Do you set up benchmarks such as gaining grade levels, or is this not a part of the program? How about timeframes? Is that not as important because you are not bound by the federal system? Nancy - you said: "I sense that by your colleagues' standards my program would be deemed ineffective." I have many questions: who is to say what is effective if your students stay, keep learning, and improve their skills? How about Nancy's question, Subscribers: is this true? Do you feel that Grace and Nancy's programs are less effective because of their focus and structure? Or do you feel a bit envious that their programs are not bound by the strict requirements of NRS-tied funding? If your program receives federal dollars, do you feel that you can really truly focus on the student's goal, or do you feel like you need to figure their goal out for them (with the best of intentions) in order to make sure you can show learning gains across your program and keep your funding? Katrina lamented about this in her post, I think. Do you feel that helping the student work on her goal and following the federal requirements are at odds, or can they work together? Let's hear from others on the List who are in a similar position to Grace and Nancy. If you are not tied to using the tools, structures, and timeframes required by federal funding, then what does your program look like - and how do you feel about that? And for those folks who do receive federal funding, what are your thoughts and questions for folks whose programs work outside of the NRS requirements? Thanks! Marie Cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Grace Temple Sent: Tuesday, October 24, 2006 1:56 PM To: assessment at nifl.gov Subject: [Assessment 537] Re: peep? A great big AMEN! goes out to Nancy. This echoes my position exactly. I, too, am a director of our county Literacy Council. I read all the posts but respond to only a few, as most don't seem to connect to our program. All post-assessment is done by the tutors (volunteers), who have been trained to use the SLOSSON lists. This only gives us a ballpark figure, but it is the student's satisfaction that counts the most. Do they feel they have gained? If they feel better about their ability to learn, think they can handle everyday reading better, and are willing to continue, then our tutors have done well. Grace Temple Executive Director Sanilac Literacy Council Sanilac County, MI 
From tarv at chemeketa.edu Wed Oct 25 15:20:57 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Wed, 25 Oct 2006 12:20:57 -0700 Subject: [Assessment 544] Re: Measuring Outside the NRS In-Reply-To: <453F34DA0200009E000037E9@10.1.0.15> Message-ID: Wow, TABE and CASAS - that seems like unnecessary testing to me. Keeping up with one is plenty; I hope we never go to more. va -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of JO PAMMENT Sent: Wednesday, October 25, 2006 6:57 AM To: assessment at nifl.gov Subject: [Assessment 543] Re: Measuring Outside the NRS From mdaniels at proliteracy.org Wed Oct 25 19:26:06 2006 From: mdaniels at proliteracy.org (mdaniels) Date: Wed, 25 Oct 2006 19:26:06 -0400 Subject: [Assessment 545] Seeking Model Programs for Accountability Project Message-ID: <221E918332E818488687199E5CBD74CE0537EAF3@keats.proliteracy.org> Dear Colleagues, As you know, funders in both the public and private sectors are placing heightened expectations upon people who manage literacy and adult education programs. The funders want to know: o Are programs able to demonstrate real learning gains among students? o Are programs worthy of the trust placed in them by community partners, business volunteers, students, and other stakeholders? o Do program personnel have the information they need to improve their programs and allocate precious resources wisely? ProLiteracy, in partnership with the Dollar General Literacy Foundation, is working to help program managers acquire the knowledge and tools they need to answer questions like these. This new, three-year project is called the Dollar General/ProLiteracy Performance Accountability Initiative. Together, we will 1) identify successful accountability practices being used in a variety of literacy and adult education programs, and 2) use the practices to develop and deliver training modules and print resources to improve performance accountability. Right now we are looking for up to eight exemplary literacy and adult education programs to serve as models as we develop the first training module: Data Collection and Management. You may be part of a program that does a good job collecting and managing data. If so, please consider nominating your program. If you know of another program that would be a good model, please forward this message to that program's director. We would also appreciate your help in disseminating this call for model programs throughout your networks. The nomination information and form may be downloaded from the ProLiteracy Web site by using the following links: o www.proliteracy.org/downloads/guidelines.doc o www.proliteracy.org/downloads/application_form.doc It is important to act quickly, as our deadline for accepting nominations via e-mail or mail is 5 p.m. EST on Wednesday, Nov. 8, 2006. You may fill out and submit the nomination form on paper or online. See the form for detailed instructions. Model programs will receive a $1,500 stipend, increased visibility, and an opportunity to make a positive impact on the adult education and literacy field. We appreciate your help in this critical endeavor and hope to hear from you by Nov. 8. If you have any questions, please contact me via e-mail at mdaniels at proliteracy.org. 
Thank you, Melanie Daniels, Project Manager Dollar General/ProLiteracy Performance Accountability Initiative ProLiteracy Worldwide 1320 Jamesville Avenue Syracuse, NY 13210 mdaniels at proliteracy.org http://www.proliteracy.org From pammenjk at haslett.k12.mi.us Thu Oct 26 12:13:38 2006 From: pammenjk at haslett.k12.mi.us (JO PAMMENT) Date: Thu, 26 Oct 2006 12:13:38 -0400 Subject: [Assessment 546] Re: Measuring Outside the NRS In-Reply-To: References: <453F34DA0200009E000037E9@10.1.0.15> Message-ID: <4540A6720200009E00003996@10.1.0.15> Currently we can choose either TABE or CASAS for our program. Most of us use CASAS for ESL and TABE for ABE. However, we are looking at how to meet the needs of the students who can test out of CASAS because they can read well but still can't speak or understand well. I've tried the CASAS listening test for the higher-level testing population, but it works more as a reading test, so it doesn't help us much. Jo Pamment Michigan Jo Pamment Director Adult Ed. ESL Haslett Public Schools 1118 S. Harrison East Lansing, Michigan 48823 TEL: 517 337-8353 FAX: 517 337-3195 E-Mail: pammenjk at haslett.k12.mi.us >>> "Virginia Tardaewether" 10/25/06 3:20:57 PM >>> Wow, TABE and CASAS - that seems like unnecessary testing to me. Keeping up with one is plenty; I hope we never go to more. va From ltaylor at casas.org Thu Oct 26 14:56:56 2006 From: ltaylor at casas.org (Linda Taylor) Date: Thu, 26 Oct 2006 11:56:56 -0700 Subject: [Assessment 547] Re: Measuring Outside the NRS References: <453F34DA0200009E000037E9@10.1.0.15> <4540A6720200009E00003996@10.1.0.15> Message-ID: CASAS is currently completing development of a new Life and Work Listening Series in which there are no written distractors for students to read during the test, only aural prompts and distractors at the Intermediate and Advanced levels. Everything is repeated to minimize the demand on short-term memory. We believe this will provide a more accurate measure of students' listening ability, as distinct from their reading ability. If any programs would like to participate in field testing this new series, please let me know. Also, some programs use the CASAS Functional Writing Assessment Picture Task for NRS reporting for students at higher levels, since writing is an area in which they often still need instruction. Linda Taylor, Director of Assessment Development, CASAS ltaylor at casas.org (800) 255-1036, ext. 186 ________________________________ From: assessment-bounces at nifl.gov on behalf of JO PAMMENT Sent: Thu 10/26/2006 9:13 AM To: The Assessment Discussion List Subject: [Assessment 546] Re: Measuring Outside the NRS 
From andresmuro at aol.com Thu Oct 26 16:01:28 2006 From: andresmuro at aol.com (andresmuro at aol.com) Date: Thu, 26 Oct 2006 16:01:28 -0400 Subject: [Assessment 548] Re: Measuring Outside the NRS In-Reply-To: <4540A6720200009E00003996@10.1.0.15> References: <453F34DA0200009E000037E9@10.1.0.15> <4540A6720200009E00003996@10.1.0.15> Message-ID: <8C8C750A84B9E81-768-143E@webmail-db06.sysops.aol.com> We have a migrant GED program in El Paso and several other GED classes funded by several non-state ABE sources. Because our programs are not funded by ABE monies, we don't have to do NRS. We graduate approximately 250 students per year with a Spanish GED. This is more than any of the local state-funded ABE programs graduate. I told the state assistant director that if he wanted, we would report them, if there was a way to do this. He said that we would have to have them progress through the TABE to get to pre-GED level. Well, we don't do any testing. Our students are Spanish speakers, so in Texas they would first have to go into ESL. Most tend to place in the lowest levels of the ESL test used in Texas, the BEST. 
We would have to provide English literacy together with the Spanish GED, and we would have to show that they are demonstrating gains on the BEST for the state to get credit for these students. We are not going to do this, because it is a waste of time. Rather, we focus on intensive and extensive Spanish GED instruction. Once the students have their GED certificate, we transition them to credit ESL classes at the CC, where they can get financial aid. They also do well because they have the necessary academic skills. If NRS would allow us to give the students one test and then would allow the GED test to count as progress from the original placement, we would do this. However, in Texas we can't do this. I don't know if this is doable in other states. However, because the GED is an attractive goal for the students to work towards, they have a motivation. On the other hand, for students to work towards completing a level is frustrating. Progressing from literacy ESL to Beginning ESL means nothing to the students. It only means something to some mysterious bureaucracy. It also interferes with the ability of teachers and programs to teach meaningful stuff. This is very sad. All the mandatory testing that has been imposed at all levels, from K-12 to adult education, has been very detrimental. My wife has been teaching remedial reading at a university. The other day she took poetry to a class. The students had never interacted with poetry throughout high school. Since the only concern was for the students to show progress on some test, teachers only focused on preparing them for the stupid test. They didn't do well on the standardized test, and they were never exposed to poetry and read little prose. Our Spanish GED students don't take standardized tests to show progress, but they read a lot of poetry and prose. They have published two books of poetry and prose. You can see the books at http://bordersenses.com/memorias. Even if they never got their GEDs, this is a lot more meaningful than showing that you can go from literacy to intermediate literacy according to some test. Sorry, my rant is over. Andres -----Original Message----- From: pammenjk at haslett.k12.mi.us To: assessment at nifl.gov Sent: Thu, 26 Oct 2006 10:13 AM Subject: [Assessment 546] Re: Measuring Outside the NRS 
These tests have not met all of our needs however. I am interested in learning about the TABE ESL test that is coming out. Has it been approved by the Federal government yet? Has anyone seen or tried it yet? Jo Pamment Jo Pamment Director Adult Ed. ESL Haslett Public Schools 1118 S. Harrison East Lansing, Michigan 48823 TEL: 517 337-8353 FAX: 517 337-3195 E-Mail: pammenjk at haslett.k12.mi.us >>> "Marie Cora" 10/24/06 3:30 PM >>> Dear Grace and Nancy - (First a big thank you to everyone who has jumped in - I knew that this was important to you all. As your busy time permits, keep on posting!) Thanks so much for your perspectives. I'm sure that you are not the only ones whose programs conduct assessment outside of the NRS requirements. You both have hit on a number of extremely important issues. If you do not receive federal dollars, then you do not have to adhere to NRS requirements - what types of assessments and tools do you use if this is your case? What sort of schedule or structure do you use for intake, pre-assessment, on-going assessment, post-assessment? Who does the assessing? Do you gauge non-academic skills? If so, how? Do you set up benchmarks such as gaining grade levels or is this not a part of the program? How about timeframes? Is that not as important because you are not bound by the federal system? Nancy - you said: "I sense that by your colleagues' standards my program would be deemed ineffective." I have many questions: who is to say what is effective if your students stay and keep learning, and improve their skills? How about Nancy's question Subscribers: is this true? Do you feel that Grace and Nancy's programs are less effective because of their focus and structure? Or do you feel a bit envious that their programs are not bound by the strict requirements of NRS-tied funding? If your program receives federal dollars, do you feel that you can really truly focus on the student's goal, or do you feel like you need to figure their goal out for them (with the best of intentions) in order to make sure you can show learning gains across your program and keep your funding? Katrina lamented about this in her post I think. Do you feel that helping the student work on her goal and following the federal requirements are at odds or can they work together? Let's hear from others on the List who are in a similar position to Grace and Nancy. If you are not tied to using the tools, structures, and timeframes required by federal funding, then what does your program look like - and how do you feel about that? And for those folks who do receive federal funding, what are your thoughts and questions for folks whose programs work outside of the NRS requirements? Thanks! Marie Cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Grace Temple Sent: Tuesday, October 24, 2006 1:56 PM To: assessment at nifl.gov Subject: [Assessment 537] Re: peep? A great big AMEN! goes out to Nancy. This echos my position exactly. I too, am a director of our county Literacy Council. I read all the posts but respond to only a few as most don't seem to connect to our program. All post assessment is done by the tutors (volunteers) who have been trained to use the SLOSSON lists. This only gives us a ballpark figure but it is the student's satisfaction that counts the most. Do they feel they have gained? 
If they feel better about their ability to learn, think they can handle everyday reading better and are willing to continue, then our tutors have done well. Grace Temple Executive Director Sanilac Literacy Council Sanilac County, MI _____ From: Nancy Hansen To: The Assessment Discussion List Subject: [Assessment 533] Re: peep? Date: Mon, 23 Oct 2006 12:56:32 -0700 (PDT) Marie - Without a peep .... I have been lurking ... every once in a while but not regularly on this thread. Part of the reason I haven't replied is the emails that were posted were so loooong. I felt it would require research to read thoroughly and respond, so I didn't. And I'm as always busy with many projects as the only full-time paid staff. It's not that I am not interested personally, but via my scanning the posts I feel The Movement is not taking into consideration one (of a couple) very important factors: Some adult learners cannot commit to the kind of time that many of you speak about. That means their time with their study is also very precious even though elongated. If testing takes away that time, it would be resented. Ours is an adult literacy program driven by volunteer instructors. The focus of the materials includes periodic check-ups built into the lessons. (Note: Not called Tests.) However, I sense that by your colleagues' standards my program would be deemed ineffective. You know .... the learners aren't gaining a grade level every year. Quite frankly the learners don't *care* about that form of measurement. So I lurk. I feel, No. 2, The System places too *little* importance on what it is that the adult learner has brought with them as goals in their need to read, write and spell better. It cannot be measured in many cases ... except, perhaps, in smiles, self-confidence and improved worth. *That* our learners *do* treasure! How do members of the adult education system intend those skill development factors to be measured? Learner Portfolios are part of *our* system, yet unacceptable to the NRS. It used to be that the check-up scores counted. But no longer. Until the answers are clear, this agency director will remain on the perimeters of the assessment discussion ... and *consequently* the agency will continue without funding that is tied to a grade level increase requirement. The kicker is: The learners *like* what they are receiving and that matters more. At least to me. Nancy Hansen Executive Director Sioux Falls Area Literacy Council Sioux Falls, SD Marie Cora wrote: Dear colleagues, Are you out there? Is this a bad time for a discussion? Is the topic not of interest? Aside from Virginia's post last week, I haven't heard from any of the 550 subscribers to the List. I'm assuming that the topic (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of interest. But you need to let me know if it is or not. I generally gauge interest based on Subscriber responses, so if this is an indication, the topic is not hot (which surprises me). Are there other topics you'd prefer to engage in? What are they? 
The purpose of this List is to provide a forum for discussion that goes well beyond your program's walls, so if I'm not hitting on the right stuff, I really do need to hear from you. Membership here continues to climb, but you are all very silent. I really want to hear your thoughts and I want to know how this List can serve you well. Please let me know. Feel free to respond to the discussion on Measuring Gains, but also, feel free to start your own discussion topic, or feel free to send your thoughts to the List or to me personally regarding other discussions that you would like to see happen - and I'll make them happen. I value and appreciate your membership highly - but a Discussion List is only as good as the discussions that occur. If you've never posted and that makes you a bit reticent, feel free to send me your post and I can do a couple of things like help you compose your message, or I can post your message for you anonymously. The important thing is for your voices to be heard. Thanks! marie Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061026/0179b8bd/attachment.html From kabeall at comcast.net Fri Oct 27 11:24:49 2006 From: kabeall at comcast.net (Kaye Beall) Date: Fri, 27 Oct 2006 11:24:49 -0400 Subject: [Assessment 549] New from NCSALL Message-ID: <008d01c6f9dc$0d3f9570$0202a8c0@your4105e587b6> Research on the Economic Impact of the GED Diploma Panel The National Institute for Literacy (Institute) and the National Center for the Study of Adult Learning and Literacy (NCSALL) announce the Research on the Economic Impact of the GED Diploma Panel, a 30-minute video produced by the Institute. This panel discussion focuses on the economic benefits that accrue to holders of the General Educational Development (GED) credential. It is based on a review by John Tyler of eight recent (published and working) research papers on the GED. 
Several of these papers were authored by John Tyler, Richard Murnane, and John Willett, researchers with NCSALL whose work has influenced what we know about the economic benefits of the GED. Presenters include John Tyler, Sara Fass, and Sue Snider; the moderator is David Rosen. To view in streaming format, go to: http://www.nifl.gov/nifl/webcasts/ged/webcast_ged.html To order in DVD for $5.00 from NCSALL, go to: www.ncsall.net/?id=675 To order DVD version from NIFL, send request with mailing address to: info at nifl.gov Transitioning Adults to College: Adult Basic Education Program Models by Cynthia Zafft, Silja Kallenbach, and Jessica Spohn This NCSALL Occasional Paper describes five models that the staff at the New England Literacy Resource Center at World Education, Inc., categorized through a survey of adult education centers with transition components from around the United States. This NCSALL Occasional Paper describes the five models-Advising, GED-Plus, ESOL, Career Pathways, and College Preparatory-and themes and recommendations that others contemplating adult transition services might find helpful. It also chronicles the experiences of four states (Connecticut, Kentucky, Maine, and Oregon) in their efforts to institutionalize transitions for adults. To download the paper, go to http://www.ncsall.net/?id=26 Beyond the GED: Making Conscious Choices About the GED and Your Future Newly revised to include new data and information on the Internet, this guide for GED instructors offers lesson plans and helps teachers develop as professionals. It also gives adult learners an opportunity to practice writing, use graphs, read charts, and analyze research findings on the economic impact of the GED. To download the guide, go to http://www.ncsall.net/?id=35. **************** Kaye Beall Outreach Coordinator/NCSALL Dissemination Project World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.ncsall.net -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061027/63d12143/attachment.html From sfallsliteracy at yahoo.com Fri Oct 27 12:31:01 2006 From: sfallsliteracy at yahoo.com (Nancy Hansen) Date: Fri, 27 Oct 2006 09:31:01 -0700 (PDT) Subject: [Assessment 550] Re: peep? In-Reply-To: <002e01c6f7ab$5ace84f0$edd5ac46@DH89L251> Message-ID: <20061027163101.23611.qmail@web34715.mail.mud.yahoo.com> Had to send a brief reply. Who is the advocate for learner rights in this issue? If what Katrina is true in her concluding paragraph: << > It's a catch 22 for instructors. We're caught in a loop of having to meet performance measures that may or may not truly reflect student ability yet the work a student does that shows his or her ability is often ignored.>> Who is The Loser here? Isn't it the students in these programs? What does this inaccurate performance measure do to the adult learners' self-confidence - their self-esteem - the way they feel about being successful? I'll bet they know they know *more* than the tests show yet they are being told differently. Aren't the consequences going to be a higher dropout rate? And what is yours? Nancy Hansen Sioux Falls Area Literacy Council Donna Chambers wrote: Katrina, You are not alone and it is a catch 22. I know that the test/data doesn't often show the accurate picture. However, in reality, we are forced to deal with standardized tests as the main measurement tool. 
Truth be told, adult literacy learners are often not good traditional testers. What can we do? Donna Chambers ----- Original Message ----- From: "Katrina Hinson" To: Sent: Tuesday, October 24, 2006 12:36 PM Subject: [Assessment 536] Re: peep? > I'm in the same boat as Donna below. Normally I try to keep up with the > discussions but I've been silent on the lists I'm on lately simply because > I wear way too many hats at the moment and it's difficult to keep up with > sheer volume of emails generated sometimes. Likewise, I have my own > 'peep' to add to the discussion: > > We use TABE (9 & 10) at the moment, for ABE/GED students and CASAS for > ESL and Family Literacy and Comp. Ed.. One problem occurs when students > move between programs. Students tested via CASAS, are not given a math > component - EVER - just Reading and Listening and those in Family Literacy > are only given Reading. They may or may not ever have a math placement > score. Additionally, the program I'm in places great emphasis on these > scores due to the nature of the funding - so much so that paperwork now > has to go through multiple hands to ensure that it's all correctly filled > in to ensure that the numbers are all accurate. Another area where there > is a problem is the move from those students tested on TABE 8 to TABE 9. > We saw a dramatic decline in test scores and were left asking if it was > accurate? We asked why the huge drop? Had these students regressed? Had > the test not been administered properly before hand? Had they memorized > (which I do agree is a major issue with standa! > rdized testing) the test? > > There were no easy answers and we're still seeing scores all over the > place sometimes. > > Instructors here and administrators are very much tied to the "tests" as > if that is the only measure. I sometimes feel like a lone voice saying > "Yes, BUT..." at a lot of meetings...or trying to explain that the data > isn't always a valid reflection of student ability. I'm always met with > the same response - the tests need to match student ability to ensure > funding. > > It's a catch 22 for instructors. We're caught in a loop of having to meet > performance measures that may or may not truly reflect student ability yet > the work a student does that shows his or her ability is often ignored. > > Regards, > Katrina Hinson > >>>> "Donna Chambers" 10/24/06 8:40 AM >>> > Wow! Like others, I lose track of the discussion because I am just too > busy to keep up. There is so much work to be done in thinking through this > issue, but in the meantime, we must keep up with our duties in the > classroom and running our programs . Now I just heard on the news that > sitting at a computer for hours at a time can be addictive and may > require medical and psychiatric treatment Here is my quick "peep" > because I don't have the time for another addiction. > > I agree with both Nancy and Mary Jane. The testing requirement for > government funding is not enough and sometimes not appropriate. I have > spent my career working with competency based assessment and/or authentic > assessment and so that is what I inherently use to see if a learner > understands something. By testing this way, I am also challenging the > thinking skills of the learner which I believe is critical in the process > for the adult. > > Lately I have been doing a bit of informal experimentation. Here is what > I find more times than not. 
The learner knows the skill because it was > demonstrated to me when asked to verbally to solve the problem and > explain the solution. Yet the learner got the item wrong on the paper > and pencil, multiple choice test. The learner demonstrates an ability to > do a problem and apply the skill in several examples and then gets the > same type of problem wrong when asked to do it on a test . Pre test > scores do not always correlate with what I believe a learner knows and yet > such we are asked to place such importance on them. Sometimes even scores > go down between pre and post testing. I do believe this begs the question, > "What is happening here?" It is definitely worth considering more varied > assessment methods. We are working with adults that may be test anxious > and certainly language plays a huge role in being able to answer the > question correctly. This is not to say that th! > e tests we use are not valid, but getting a question right more > circumstances than knowledge or skill, especially for adults. How do we > know when they know it, be able to retain it, and to apply it again in > other circumstances? " > > This is more than a "peep" ,but I feel this topic is critical to our > programs and the whole "accountability/assessment" issue in education > today. > > Donna Chambers > > ----- Original Message ----- > From: Mary Jane Jerde > To: sfallsliteracy at yahoo.com ; The Assessment Discussion List > Sent: Monday, October 23, 2006 6:49 PM > Subject: [Assessment 534] Re: peep? > > > Hi, > > The testing that is required for government funding tends to fly in the > face of some serious principles for assessment: never depend on one tool, > don't allow the students to become familiar with a specific test form, use > a variety of testing methods, numbers do not give a full, realistic > assessment. CASAS does do several things fairly well: easy training, easy > scoring, easy make up. How many students know Form 54 by sheer repetition? > It's always a shock when they hit Form 56. > > I use BEST Plus with CASAS. It's not perfect either, but I have the > "luxury" of a trained assessor willing to come on site to give it. Between > the two, I'm getting a better picture and more options for reporting. > > Mary Jane Jerde > Howard Community College > Columbia, MD > > > > Nancy Hansen wrote: > Marie - > > Without a peep .... I have been lurking ... every once in a while but > not regularly on this thread. Part of the reason I haven't replied is the > emails that were posted were so loooong. I felt it would require research > to read thoroughly and respond, so I didn't. And I'm as always busy with > many projects as the only full-time paid staff. > > It's not that I am not interested personally, but via my scanning the > posts I feel The Movement is not taking into consideration one (of a > couple) very important factors: Some adult learners cannot commit to the > kind of time that many of you speak about. That means their time with > their study is also very precious even though elongated. If testing takes > away that time, it would be resented. > > Ours is an adult literacy program driven by volunteer instructors. The > focus of the materials includes periodic check-ups built into the lessons. > (Note: Not called Tests.) However, I sense that by your colleagues' > standards my program would be deemed ineffective. You know .... the > learners aren't gaining a grade level every year. Quite frankly the > learners don't *care* about that form of measurement. > > So I lurk. I feel, No. 
2, The System places too *little* importance on > what it is that the adult learner has brought with them as goals in their > need to read, write and spell better. It cannot be measured in many cases > ... except, perhaps, in smiles, self-confidence and improved worth. > *That* our learners *do* treasure! How do members of the adult education > system intend those skill development factors to be measured? Learner > Portfolios are part of *our* system, yet unacceptable to the NRS. It used > to be that the check-up scores counted. But no longer. > > Until the answers are clear, this agency director will remain on the > perimeters of the assessment discussion ... and *consequently* the agency > will continue without funding that is tied to a grade level increase > requirement. The kicker is: The learners *like* what they are receiving > and that matters more. At least to me. > > Nancy Hansen > Executive Director > Sioux Falls Area Literacy Council > Sioux Falls, SD > > Marie Cora wrote: > Dear colleagues, > > Are you out there? Is this a bad time for a discussion? Is the > topic not of interest? > > Aside from Virginia??????s post last week, I haven??????t heard from > any of the 550 subscribers to the List. I??????m assuming that the topic > (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of > interest. But you need to let me know if it is or not. I generally gauge > interest based on Subscriber responsese topic is not hot (which surprises > me). > > Are there other topics you??????d prefer to engage in? What are > they? The purpose of this List is to provide a forum for discussion that > goes well beyond your program??????s walls, so if I??????m not hitting on > the right stuff, I really do need to hear from you. Membership here > continues to climb, but you are all very silent. > > I really want to hear your thoughts and I want to know how this List > can serve you well. Please let me know. Feel free to respond to the > discussion on Measuring Gains, but also, feel free to start your own > discussion topic, or feel free to send your thoughts to the List or to me > personally regarding other discussions that you would like to see happen > ?????? and I??????ll make them happen. > > I value and appreciate your membership highly ?????? but a Discussion > List is only as good as the discussions that occur. If you??????ve never > posted and that makes you a bit reticent, feel free to send me your post > and I can do a couple of things like help you compose your message, or I > can post your message for you anonymously. The important thing is for > your voices to be heard. > > Thanks! > > marie > > Marie Cora > marie.cora at hotspurpartners.com > NIFL Assessment Discussion List Moderator > http://www.nifl.gov/mailman/listinfo/assessment > Coordinator, LINCS Assessment Special Collection > http://literacy.kent.edu/Midwest/assessment/ > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > > > ---------------------------------------------------------------------------- > Yahoo! Messenger with Voice. 
Make PC-to-Phone Calls to the US (and 30+ > countries) for 2??????/min or less.------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > > > > ------------------------------------------------------------------------------ > Want to be your own boss? Learn how on Yahoo! Small Business. > > > ------------------------------------------------------------------------------ > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > -------------------------------------------------------------------------------- > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment --------------------------------- Do you Yahoo!? Everyone is raving about the all-new Yahoo! Mail. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061027/b81e677b/attachment.html From KHinson at future-gate.com Mon Oct 30 19:40:13 2006 From: KHinson at future-gate.com (Katrina Hinson) Date: Tue, 31 Oct 2006 01:40:13 +0100 Subject: [Assessment 551] Re: peep? In-Reply-To: <20061027163101.23611.qmail@web34715.mail.mud.yahoo.com> References: <002e01c6f7ab$5ace84f0$edd5ac46@DH89L251> <20061027163101.23611.qmail@web34715.mail.mud.yahoo.com> Message-ID: <4546551E.121C.00A0.0@future-gate.com> I don't know the answer to this and apologize for not getting to this sooner but it's been busy. The instructor should be the advocate for student rights but ultimately the instructor is bound by the constraints of his or her program - rightly or wrongly. All we can do is personally motivate our students to see beyond any score - to give positive reinforcement all the time so the focus isn't on a test score - those are just my thoughts. I have good retention in my class so far but I try not to ever make the focus on the tests - whether it's the TABE or GED. I put the focus on what my students show me they know and show me they can do. I make a lot out of their successful moments. I think that goes a long way. Regards, Katrina Hinson >>> Nancy Hansen 10/27/2006 12:31 pm >>> Had to send a brief reply. Who is the advocate for learner rights in this issue? If what Katrina is true in her concluding paragraph: << > It's a catch 22 for instructors. We're caught in a loop of having to meet performance measures that may or may not truly reflect student ability yet the work a student does that shows his or her ability is often ignored.>> Who is The Loser here? Isn't it the students in these programs? What does this inaccurate performance measure do to the adult learners' self-confidence - their self-esteem - the way they feel about being successful? I'll bet they know they know *more* than the tests show yet they are being told differently. Aren't the consequences going to be a higher dropout rate? 
And what is yours? Nancy Hansen Sioux Falls Area Literacy Council Donna Chambers wrote: Katrina, You are not alone and it is a catch 22. I know that the test/data doesn't often show the accurate picture. However, in reality, we are forced to deal with standardized tests as the main measurement tool. Truth be told, adult literacy learners are often not good traditional testers. What can we do? Donna Chambers ----- Original Message ----- From: "Katrina Hinson" To: Sent: Tuesday, October 24, 2006 12:36 PM Subject: [Assessment 536] Re: peep? > I'm in the same boat as Donna below. Normally I try to keep up with the > discussions but I've been silent on the lists I'm on lately simply because > I wear way too many hats at the moment and it's difficult to keep up with > sheer volume of emails generated sometimes. Likewise, I have my own > 'peep' to add to the discussion: > > We use TABE (9 & 10) at the moment, for ABE/GED students and CASAS for > ESL and Family Literacy and Comp. Ed.. One problem occurs when students > move between programs. Students tested via CASAS, are not given a math > component - EVER - just Reading and Listening and those in Family Literacy > are only given Reading. They may or may not ever have a math placement > score. Additionally, the program I'm in places great emphasis on these > scores due to the nature of the funding - so much so that paperwork now > has to go through multiple hands to ensure that it's all correctly filled > in to ensure that the numbers are all accurate. Another area where there > is a problem is the move from those students tested on TABE 8 to TABE 9. > We saw a dramatic decline in test scores and were left asking if it was > accurate? We asked why the huge drop? Had these students regressed? Had > the test not been administered properly before hand? Had they memorized > (which I do agree is a major issue with standa! > rdized testing) the test? > > There were no easy answers and we're still seeing scores all over the > place sometimes. > > Instructors here and administrators are very much tied to the "tests" as > if that is the only measure. I sometimes feel like a lone voice saying > "Yes, BUT..." at a lot of meetings...or trying to explain that the data > isn't always a valid reflection of student ability. I'm always met with > the same response - the tests need to match student ability to ensure > funding. > > It's a catch 22 for instructors. We're caught in a loop of having to meet > performance measures that may or may not truly reflect student ability yet > the work a student does that shows his or her ability is often ignored. > > Regards, > Katrina Hinson > >>>> "Donna Chambers" 10/24/06 8:40 AM >>> > Wow! Like others, I lose track of the discussion because I am just too > busy to keep up. There is so much work to be done in thinking through this > issue, but in the meantime, we must keep up with our duties in the > classroom and running our programs . Now I just heard on the news that > sitting at a computer for hours at a time can be addictive and may > require medical and psychiatric treatment Here is my quick "peep" > because I don't have the time for another addiction. > > I agree with both Nancy and Mary Jane. The testing requirement for > government funding is not enough and sometimes not appropriate. I have > spent my career working with competency based assessment and/or authentic > assessment and so that is what I inherently use to see if a learner > understands something. 
By testing this way, I am also challenging the > thinking skills of the learner which I believe is critical in the process > for the adult. > > Lately I have been doing a bit of informal experimentation. Here is what > I find more times than not. The learner knows the skill because it was > demonstrated to me when asked to verbally to solve the problem and > explain the solution. Yet the learner got the item wrong on the paper > and pencil, multiple choice test. The learner demonstrates an ability to > do a problem and apply the skill in several examples and then gets the > same type of problem wrong when asked to do it on a test . Pre test > scores do not always correlate with what I believe a learner knows and yet > such we are asked to place such importance on them. Sometimes even scores > go down between pre and post testing. I do believe this begs the question, > "What is happening here?" It is definitely worth considering more varied > assessment methods. We are working with adults that may be test anxious > and certainly language plays a huge role in being able to answer the > question correctly. This is not to say that th! > e tests we use are not valid, but getting a question right more > circumstances than knowledge or skill, especially for adults. How do we > know when they know it, be able to retain it, and to apply it again in > other circumstances? " > > This is more than a "peep" ,but I feel this topic is critical to our > programs and the whole "accountability/assessment" issue in education > today. > > Donna Chambers > > ----- Original Message ----- > From: Mary Jane Jerde > To: sfallsliteracy at yahoo.com ; The Assessment Discussion List > Sent: Monday, October 23, 2006 6:49 PM > Subject: [Assessment 534] Re: peep? > > > Hi, > > The testing that is required for government funding tends to fly in the > face of some serious principles for assessment: never depend on one tool, > don't allow the students to become familiar with a specific test form, use > a variety of testing methods, numbers do not give a full, realistic > assessment. CASAS does do several things fairly well: easy training, easy > scoring, easy make up. How many students know Form 54 by sheer repetition? > It's always a shock when they hit Form 56. > > I use BEST Plus with CASAS. It's not perfect either, but I have the > "luxury" of a trained assessor willing to come on site to give it. Between > the two, I'm getting a better picture and more options for reporting. > > Mary Jane Jerde > Howard Community College > Columbia, MD > > > > Nancy Hansen wrote: > Marie - > > Without a peep .... I have been lurking ... every once in a while but > not regularly on this thread. Part of the reason I haven't replied is the > emails that were posted were so loooong. I felt it would require research > to read thoroughly and respond, so I didn't. And I'm as always busy with > many projects as the only full-time paid staff. > > It's not that I am not interested personally, but via my scanning the > posts I feel The Movement is not taking into consideration one (of a > couple) very important factors: Some adult learners cannot commit to the > kind of time that many of you speak about. That means their time with > their study is also very precious even though elongated. If testing takes > away that time, it would be resented. > > Ours is an adult literacy program driven by volunteer instructors. The > focus of the materials includes periodic check-ups built into the lessons. > (Note: Not called Tests.) 
However, I sense that by your colleagues' > standards my program would be deemed ineffective. You know .... the > learners aren't gaining a grade level every year. Quite frankly the > learners don't *care* about that form of measurement. > > So I lurk. I feel, No. 2, The System places too *little* importance on > what it is that the adult learner has brought with them as goals in their > need to read, write and spell better. It cannot be measured in many cases > ... except, perhaps, in smiles, self-confidence and improved worth. > *That* our learners *do* treasure! How do members of the adult education > system intend those skill development factors to be measured? Learner > Portfolios are part of *our* system, yet unacceptable to the NRS. It used > to be that the check-up scores counted. But no longer. > > Until the answers are clear, this agency director will remain on the > perimeters of the assessment discussion ... and *consequently* the agency > will continue without funding that is tied to a grade level increase > requirement. The kicker is: The learners *like* what they are receiving > and that matters more. At least to me. > > Nancy Hansen > Executive Director > Sioux Falls Area Literacy Council > Sioux Falls, SD > > Marie Cora wrote: > Dear colleagues, > > Are you out there? Is this a bad time for a discussion? Is the > topic not of interest? > > Aside from Virginia??????s post last week, I haven??????t heard from > any of the 550 subscribers to the List. I??????m assuming that the topic > (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of > interest. But you need to let me know if it is or not. I generally gauge > interest based on Subscriber responsese topic is not hot (which surprises > me). > > Are there other topics you??????d prefer to engage in? What are > they? The purpose of this List is to provide a forum for discussion that > goes well beyond your program??????s walls, so if I??????m not hitting on > the right stuff, I really do need to hear from you. Membership here > continues to climb, but you are all very silent. > > I really want to hear your thoughts and I want to know how this List > can serve you well. Please let me know. Feel free to respond to the > discussion on Measuring Gains, but also, feel free to start your own > discussion topic, or feel free to send your thoughts to the List or to me > personally regarding other discussions that you would like to see happen > ?????? and I??????ll make them happen. > > I value and appreciate your membership highly ?????? but a Discussion > List is only as good as the discussions that occur. If you??????ve never > posted and that makes you a bit reticent, feel free to send me your post > and I can do a couple of things like help you compose your message, or I > can post your message for you anonymously. The important thing is for > your voices to be heard. > > Thanks! > > marie > > Marie Cora > marie.cora at hotspurpartners.com > NIFL Assessment Discussion List Moderator > http://www.nifl.gov/mailman/listinfo/assessment > Coordinator, LINCS Assessment Special Collection > http://literacy.kent.edu/Midwest/assessment/ > > > ------------------------------- > National Institute for Literacy > Assessment mailing list > Assessment at nifl.gov > To unsubscribe or change your subscription settings, please go to > http://www.nifl.gov/mailman/listinfo/assessment > > > > > ---------------------------------------------------------------------------- > Yahoo! Messenger with Voice. 
Make PC-to-Phone Calls to the US (and 30+ countries) for 2¢/min or less. ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment From tarv at chemeketa.edu Mon Oct 30 21:10:37 2006 From: tarv at chemeketa.edu (Virginia Tardaewether) Date: Mon, 30 Oct 2006 18:10:37 -0800 Subject: [Assessment 552] Re: Measuring Outside the NRS In-Reply-To: <8C8C750A84B9E81-768-143E@webmail-db06.sysops.aol.com> Message-ID: Great rant, Andres! I don't see standardized testing as so far removed from instruction, but there should be a better fit. va ________________________________ From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of andresmuro at aol.com Sent: Thursday, October 26, 2006 1:01 PM To: assessment at nifl.gov Subject: [Assessment 548] Re: Measuring Outside the NRS We have a migrant GED program in El Paso and several other GED classes funded by several non state ABE sources. Because our programs are not funded by ABE monies, we don't have to do NRS. We graduated approximately 250 students per year with a Spanish GED. This is more than any of the local state funded ABE programs graduate. I told the state assistant director that if he wanted, we would report them, if there was a way to do this. He said that we would have to have them progress through the TABE to get to pre-GED level. Well, we don't do any testing. Our students are Spanish Speakers so, in Texas the would first have to go into ESL. Most tend to place in the lowest levels of the ESL test used in Texas, the BEST. We would have to provide English literacy together with Spanish GED, and we would have to show that they are demonstrating gains in the BEST for the state to get credit for these students. We are not going to do this, because it is a waste of time. rather, we focus on intensive and extensive Sapnish GED instruction. Once the students have their GED certificate, we transition them to credit ESL classes at the CC where they can get financial aid. They also do well because they have the necessary academic skills. If NRS would allow us to give the students one test and then would allow the GED test to count as progress from the original placement, we would do this. However, in Texas we can't do this. 
I don't know if this is doable in other states. However, because the GED is an attractive goal for the students to work towards, they have a motivation. On the other hand, for students to work towards completing a level is frustrating. progressing from literacy ESL to Beginning ESL means nothing to the students. It only means something to some mysterious bureaucracy. It also interferes with the ability of teachers and programs to teach meaningful stuff. This is very sad. all the mandatory testing that has been imposed at all levels from K-12 to adult education has been very detrimental. My wife has been teaching remedial reading at a University. The other day she took poetry to a class. The students had never interacted with poetry throughout High school. Since the only concern was for the students to show progress in some test, teachers only focused in preparing them for the stupid test. They didn't do well in the standardized test nor they were ever exposed to poetry and little prose. Our spanish GED students don't take standardized test to show progress, but they read a lot of poetry and prose. They have published two books of poetry and prose. You can see the books at http://bordersenses.com/memorias. Even if they never got their GEDs, this is a lot more menaingful than showing that you can go from literacy to intermedaite literacy according to some test. Sorry, my rant is over. Andres -----Original Message----- From: pammenjk at haslett.k12.mi.us To: assessment at nifl.gov Sent: Thu, 26 Oct 2006 10:13 AM Subject: [Assessment 546] Re: Measuring Outside the NRS Currently we can choose either TABE or CASAS for our program. Most of us use CASAS for ESL and TABE for ABE. However we are looking at how to meet the needs of the students who can test out of CASAS because they can read well, but still can't speak or understand well. I've tried the CASAS listening test for the higher level testing population, but it works more as a reading test so doesn't help us much. Jo Pamment Michigan Jo Pamment Director Adult Ed. ESL Haslett Public Schools 1118 S. Harrison East Lansing, Michigan 48823 TEL: 517 337-8353 FAX: 517 337-3195 E-Mail: pammenjk at haslett.k12.mi.us >>> "Virginia Tardaewether" > 10/25/06 3:20:57 PM >>> Wow TABE And CASAS, that seems like unnecessary testing to me. Keeping up with one is plenty enough I hope we never go to more. va -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov ] On Behalf Of JO PAMMENT Sent: Wednesday, October 25, 2006 6:57 AM To: assessment at nifl.gov Subject: [Assessment 543] Re: Measuring Outside the NRS Hello Marie, I have been following the assessment discussion. In Michigan we are mandated to use CASAS and TABE 9 and 10. And we must test every 90 hours of student attendance. These tests have not met all of our needs however. I am interested in learning about the TABE ESL test that is coming out. Has it been approved by the Federal government yet? Has anyone seen or tried it yet? Jo Pamment Jo Pamment Director Adult Ed. ESL Haslett Public Schools 1118 S. Harrison East Lansing, Michigan 48823 TEL: 517 337-8353 FAX: 517 337-3195 E-Mail: pammenjk at haslett.k12.mi.us >>> "Marie Cora" > 10/24/06 3:30 PM >>> Dear Grace and Nancy - (First a big thank you to everyone who has jumped in - I knew that this was important to you all. As your busy time permits, keep on posting!) Thanks so much for your perspectives. 
I'm sure that you are not the only ones whose programs conduct assessment outside of the NRS requirements. You both have hit on a number of extremely important issues. If you do not receive federal dollars, then you do not have to adhere to NRS requirements - what types of assessments and tools do you use if this is your case? What sort of schedule or structure do you use for intake, pre-assessment, on-going assessment, post-assessment? Who does the assessing? Do you gauge non-academic skills? If so, how? Do you set up benchmarks such as gaining grade levels or is this not a part of the program? How about timeframes? Is that not as important because you are not bound by the federal system? Nancy - you said: "I sense that by your colleagues' standards my program would be deemed ineffective." I have many questions: who is to say what is effective if your students stay and keep learning, and improve their skills? How about Nancy's question Subscribers: is this true? Do you feel that Grace and Nancy's programs are less effective because of their focus and structure? Or do you feel a bit envious that their programs are not bound by the strict requirements of NRS-tied funding? If your program receives federal dollars, do you feel that you can really truly focus on the student's goal, or do you feel like you need to figure their goal out for them (with the best of intentions) in order to make sure you can show learning gains across your program and keep your funding? Katrina lamented about this in her post I think. Do you feel that helping the student work on her goal and following the federal requirements are at odds or can they work together? Let's hear from others on the List who are in a similar position to Grace and Nancy. If you are not tied to using the tools, structures, and timeframes required by federal funding, then what does your program look like - and how do you feel about that? And for those folks who do receive federal funding, what are your thoughts and questions for folks whose programs work outside of the NRS requirements? Thanks! Marie Cora Assessment Discussion List Moderator -----Original Message----- From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov ] On Behalf Of Grace Temple Sent: Tuesday, October 24, 2006 1:56 PM To: assessment at nifl.gov Subject: [Assessment 537] Re: peep? A great big AMEN! goes out to Nancy. This echos my position exactly. I too, am a director of our county Literacy Council. I read all the posts but respond to only a few as most don't seem to connect to our program. All post assessment is done by the tutors (volunteers) who have been trained to use the SLOSSON lists. This only gives us a ballpark figure but it is the student's satisfaction that counts the most. Do they feel they have gained? If they feel better about their ability to learn, think they can handle everyday reading better and are willing to continue, then our tutors have done well. Grace Temple Executive Director Sanilac Literacy Council Sanilac County, MI _____ From: Nancy Hansen > To: The Assessment Discussion List > Subject: [Assessment 533] Re: peep? Date: Mon, 23 Oct 2006 12:56:32 -0700 (PDT) Marie - Without a peep .... I have been lurking ... every once in a while but not regularly on this thread. Part of the reason I haven't replied is the emails that were posted were so loooong. I felt it would require research to read thoroughly and respond, so I didn't. And I'm as always busy with many projects as the only full-time paid staff. 
It's not that I am not interested personally, but via my scanning the posts I feel The Movement is not taking into consideration one (of a couple) very important factors: Some adult learners cannot commit to the kind of time that many of you speak about. That means their time with their study is also very precious even though elongated. If testing takes away that time, it would be resented. Ours is an adult literacy program driven by volunteer instructors. The focus of the materials includes periodic check-ups built into the lessons. (Note: Not called Tests.) However, I sense that by your colleagues' standards my program would be deemed ineffective. You know .... the learners aren't gaining a grade level every year. Quite frankly the learners don't *care* about that form of measurement. So I lurk. I feel, No. 2, The System places too *little* importance on what it is that the adult learner has brought with them as goals in their need to read, write and spell better. It cannot be measured in many cases ... except, perhaps, in smiles, self-confidence and improved worth. *That* our learners *do* treasure! How do members of the adult education system intend those skill development factors to be measured? Learner Portfolios are part of *our* system, yet unacceptable to the NRS. It used to be that the check-up scores counted. But no longer. Until the answers are clear, this agency director will remain on the perimeters of the assessment discussion ... and *consequently* the agency will continue without funding that is tied to a grade level increase requirement. The kicker is: The learners *like* what they are receiving and that matters more. At least to me. Nancy Hansen Executive Director Sioux Falls Area Literacy Council Sioux Falls, SD Marie Cora > wrote: p.MsoNormal, li.MsoNormal, div.MsoNormal {margin:0in;margin-bottom:.0001pt;font-size:12.0pt;font-family:'Times New Roman';} a:link, span.MsoHyperlink {color:blue;text-decoration:underline;text-underline:single;} a:visited, span.MsoHyperlinkFollowed {color:purple;text-decoration:underline;text-underline:single;} span.EmailStyle17 {font-family:Arial;color:windowtext;} span.SpellE {;} span.GramE {;} @page Section1 {size:8.5in 11.0in;margin:1.0in 1.25in 1.0in 1.25in;} div.Section1 {page:Section1;} Dear colleagues, Are you out there? Is this a bad time for a discussion? Is the topic not of interest? Aside from Virginia's post last week, I haven't heard from any of the 550 subscribers to the List. I'm assuming that the topic (Measuring Education Gains in Adult Literacy, 10/17 and 10/18) is of interest. But you need to let me know if it is or not. I generally gauge interest based on Subscriber responses, so if this is an indication, the topic is not hot (which surprises me). Are there other topics you'd prefer to engage in? What are they? The purpose of this List is to provide a forum for discussion that goes well beyond your program's walls, so if I'm not hitting on the right stuff, I really do need to hear from you. Membership here continues to climb, but you are all very silent. I really want to hear your thoughts and I want to know how this List can serve you well. Please let me know. hts to the List or to me personally regarding other discussions that you would like to see happen - and I'll make them happen. I value and appreciate your membership highly - but a Discussion List is only as good as the discussions that occur. 
If you've never posted and that makes you a bit reticent, feel free to send me your post and I can do a couple of things like help you compose your message, or I can post your message for you anonymously. The important thing is for your voices to be heard. Thanks! marie Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ ------------------------------- National Institute for Literacy Assessment mailing list Assessment at nifl.gov To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061030/c15e856b/attachment.html From Tina_Luffman at yc.edu Tue Oct 31 12:35:46 2006 From: Tina_Luffman at yc.edu (Tina_Luffman at yc.edu) Date: Tue, 31 Oct 2006 10:35:46 -0700 Subject: [Assessment 553] Re: peep? Message-ID: An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061031/13a6438b/attachment.html From marie.cora at hotspurpartners.com Thu Nov 2 08:49:18 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 2 Nov 2006 08:49:18 -0500 Subject: [Assessment 554] Update on Speaking Tour Message-ID: <0b1701c6fe85$b2147b30$0302a8c0@LITNOW> Colleagues, The following post is from Tom Sticht - some of you may be interested in attending one of Tom's speaking engagements. Marie Cora Assessment Discussion List Moderator ----- Update on speaking engagements for 2006-2007. Tom Sticht International Consultant in Adult Education For those who have contacted me to find out if, when, and where I might be speaking in their region, the following shows dates, locations, and contacts where I will be giving invited presentations at special meetings or conferences during the remainder of 2006 and first half of 2007. 1. November 9, 2006. Atlanta, Georgia. Contact: Daphne Greenberg (dgreenberg at gsu.edu). 
Topic: Celebrating 40 Years of the Adult Education & Literacy System (AELS) of the United States: A Vision of the Future Through a Prism of the Past This presentation looks at the past and reviews the lives of pioneer adult literacy educators whose work influenced the policies and practices of the AELS. It considers the current state of the AELS and presents evidence for multiplier effects that adult literacy educators can achieve using Functional Context Education, working in collaboration with other agencies and organizations. It then presents a vision of a future in which national education policy is changed from a focus on one life span or life cycle to a Multiple Life Cycles education policy that explicitly recognizes and fosters one multiplier effect of adult literacy education, the intergenerational transfer of literacy from parents to children. 2. November 17, 2006. Merrillville, Indiana. Contact: Mary Corder (mcorder at thediscoveryalliance.com) Topic: Literacy Frees the World. This presentation focuses on the present United Nations Literacy Decade with its theme of Literacy as Freedom. Within the past contexts of President Franklin D. Roosevelt's 1941 World War II message of the Four Freedoms, contemporary issues of the functionality of literacy, globalization, sustained development, multiple literacies, and the intergenerational transmission of literacy are explored for their implications in future adult literacy education policy, practice, and research. 3. March 4, 2007. Orlando, Florida. National Family Literacy Conference. Contact: Shannon Baete (sbaete at famlit.org) Topic: The "Hard Data" for Increasing Investments in Adult Literacy Education: Moving from a one life cycle to a Multiple Life Cycles education policy. In this presentation I review, with numerous graphics, eight lines of research that establish the value of adult literacy education in preschool programs and adult literacy programs for improving the literacy of children across life cycles and providing multiple returns to investments in adult literacy education. 4. March 8, 2007. Springfield, Illinois. IACEA annual conference. Contact: Laura Bercovitz (lbercovitz at thecenterweb.org) Topic: The Shoulders on Which We Stand. This keynote is a survey of over two hundred years of the professional wisdom and accomplishments of outstanding teachers of adult literacy and the remarkable changes they have wrought in our nation. This is a motivational presentation for volunteers, teachers, students, and others who are dedicated to adult literacy education. I will present two additional breakout sessions following the keynote address. 5. June 11, 2007. Fayetteville, Arkansas - Keynote for the South Central Literacy Action Conference. Contact: Kerri Miles Email: kerri-woklearn at sbcglobal.net. Topic: To be announced. I will present a keynote and a breakout session. I always enjoy meeting folks from various discussion lists at these presentations and have a chance to put faces with names. I look forward to seeing many of you at one or more of these sessions. Tom Sticht Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd. 
El Cajon, CA 92019-2059 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net From marie.cora at hotspurpartners.com Fri Nov 3 10:52:08 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Fri, 3 Nov 2006 10:52:08 -0500 Subject: [Assessment 555] Poverty, Race, Women and Literacy Message-ID: <0bc301c6ff60$04fe0f20$0302a8c0@LITNOW> Dear List members, You may have heard that the National Institute for Literacy is combining the Poverty, Race, and Literacy Discussion List with the Women and Literacy Discussion List. Below is an invitation from the moderator of the new list, Daphne Greenberg, to join. Please read on... ______________ On November 6th, a new list will be starting called: Poverty, Race, Women, and Literacy. If you are interested in subscribing, please do so by going to: http://www.nifl.gov/mailman/listinfo/PovertyRaceWomen The purpose of this list is to provide an on-going professional development forum for providers, advocates, researchers, learners, policy makers, and all other persons who are interested in exploring the linkages between poverty, race, women and literacy. Examples of topics include: the relationships among poverty, race, women and literacy in the United States and in other countries; health as it pertains to women and poverty issues; the hidden rules of persons living with the effects of poverty, the intersection of these effects with gender and race, and the misunderstandings these can cause in the teaching/learning process; the role of women's literacy in family literacy programs, and the assumptions about race and poverty often made in these programs; domestic violence and its intersection with poverty, race, and literacy; women's literacy levels and their ties to the economics and welfare of families; access to literacy in different cultures based on gender, racial, and economic status; connection between women's literacy, race, poverty and public policy; identification of supportive communication networks; and discussion of action steps addressing women, race, poverty and literacy. If the above description interests you, please go to the following address to subscribe: http://www.nifl.gov/mailman/listinfo/PovertyRaceWomen. The list opens on November 6th. Thanks, Daphne Daphne Greenberg Assistant Professor Educational Psych. & Special Ed. Georgia State University P.O. Box 3979 Atlanta, Georgia 30302-3979 phone: 404-651-0127 fax: 404-651-4901 dgreenberg at gsu.edu Daphne Greenberg Associate Director Center for the Study of Adult Literacy Georgia State University P.O. Box 3977 Atlanta, Georgia 30302-3977 phone: 404-651-0127 fax: 404-651-4901 dgreenberg at gsu.edu From marie.cora at hotspurpartners.com Wed Nov 29 13:11:26 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 29 Nov 2006 13:11:26 -0500 Subject: [Assessment 556] Workplace/ESOL discussion on assessment Message-ID: <008701c713e1$c99764e0$0202a8c0@LITNOW> Dear Colleagues, I have finally fixed up the assessment thread of a discussion held on the Workplace Literacy and English Language Learners Discussion List from September. You can find this discussion at the ALE Wiki assessment area at: http://wiki.literacytent.org/index.php/Assessment_Information Click on Discussions and you will see it listed at the top with a tag that says "NEW!!" I hope you find this interesting and useful - and if anyone would like to discuss the issues further here, I encourage us to do so. Thanks! 
marie cora Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061129/77651589/attachment.html From marie.cora at hotspurpartners.com Wed Nov 29 13:38:47 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Wed, 29 Nov 2006 13:38:47 -0500 Subject: [Assessment 557] Measuring Gains discussion resource Message-ID: <008f01c713e5$9c373210$0202a8c0@LITNOW> Hello again! A second discussion has just been posted to the ALE Wiki Assessment Area at: http://wiki.literacytent.org/index.php/Assessment_Information Click on Discussions and look for "Over What Period of Time Should We Measure Gains?" This discussion took place in October on the NLA Discussion List - and I also sent the discussion to you all as an attachment - a quite lengthy document. I hope that you find this resource helpful and useful as well. I would really like to revive that discussion here - many interesting questions and points were raised. Please feel free!! Thanks, marie Marie Cora marie.cora at hotspurpartners.com NIFL Assessment Discussion List Moderator http://www.nifl.gov/mailman/listinfo/assessment Coordinator, LINCS Assessment Special Collection http://literacy.kent.edu/Midwest/assessment/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061129/ebf2ca36/attachment.html From kabeall at comcast.net Wed Nov 29 17:15:28 2006 From: kabeall at comcast.net (Kaye Beall) Date: Wed, 29 Nov 2006 17:15:28 -0500 Subject: [Assessment 558] New Focus on Basics Message-ID: <008f01c71403$e087ebd0$0202a8c0@your4105e587b6> It's with pleasure and sadness that I announce that there's a new issue of "Focus on Basics" now available at http://www.ncsall.net/index.php?id=1150 The pleasure? It's a great issue. The sadness? It's the last issue. It includes articles from a variety of NCSALL researchers on their research: --John Strucker on the need for curriculum structure --Steve Reder and Clare Strawn on the tendency for adults without high school diplomas to study on their own, and what that means for adult basic education. (Molly Robertson and Lauri Schoneck write about programs that capitalize on the motivation to "self-study") --John Tyler on whether GED attainers are entering postsecondary education at a rate on par with regular high school completers --Rima Rudd and Jennie Anderson on why it's so hard to find your way around in healthcare facilities and what we can do about it --Cristine Smith, Mary Beth Bingman, and Kaye Beall on lessons learned from ten years of disseminating research Enjoy! After ten years of research and development, the National Center for the Study of Adult Learning and Literacy (NCSALL) project is coming to an end. NCSALL's dissemination efforts will end in March 2007. The Web site (www.ncsall.net) will remain available for free downloading of NCSALL materials. Due to our limited budget, we have discontinued the production of Focus on Basics: * Volume 8, Issue B, "Learners' Experiences" was the last issue published in print. Subscribers: Your subscription(s) to "Focus on Basics" ends with this issue and we will issue a refund for undelivered issue(s).
You will receive an email from Caye Caplan that will explain how to file for a refund. * Volume 8, Issue C, "Self-Study, Health, GED to Postsecondary, Disseminating Research" is the last issue of Focus on Basics. It is the only issue to be "internet only". It is posted on NCSALL's Web site only at http://www.ncsall.net/index.php?id=1150 * All issues of "Focus on Basics" will continue to be available on the NCSALL website at http://www.ncsall.net/index.php?id=31. We would like to take this opportunity to thank you for your support throughout the years. We hope that you find "Focus on Basics" useful and will continue to use it as a staff development tool and for program design guidance. If you have any questions about the refund process, please contact Caye Caplan at ccaplan at worlded.org. Barb Garner Editor, "Focus on Basics" **************** Kaye Beall World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.worlded.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061129/e026ced0/attachment.html From kabeall at comcast.net Mon Dec 18 14:21:12 2006 From: kabeall at comcast.net (Kaye Beall) Date: Mon, 18 Dec 2006 14:21:12 -0500 Subject: [Assessment 559] New from NCSALL Message-ID: <00ac01c722d9$ae9a1690$0302a8c0@your4105e587b6> How Do You Teach Content in Adult Education? An Annotated Bibliography (PDF) by Elizabeth M. Zachary and John P. Comings The occasional paper provides sources of research and professional wisdom that are useful to the design of evidence-based instruction. This annotated bibliography is divided into seven subsections that focus on reading, writing, math and numeracy, English as a second language, GED, adult learning theory, and technology. Each section presents adult education sources and then additional resources based on K-12 research, instruction, and professional development resources. To download the paper, go to: http://www.ncsall.net/?id=26#content Using Going Beyond the GED (PDF) This 4-hour seminar introduces teachers and tutors to Going Beyond the GED: Making Conscious Choices about the GED and Your Future (PDF), a set of classroom materials designed for use in GED classrooms. The materials provide learners with practice in graph and chart reading, calculation, information analysis, and writing, while they examine the labor market, the role of higher education and the economic impact of the GED. To download the seminar, go to: http://www.ncsall.net/?id=597#using_going_beyond_GED **************** Kaye Beall World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 208-694-8262 kaye_beall at worlded.org http://www.worlded.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061218/800dd576/attachment.html From gspangenberg at caalusa.org Tue Dec 19 09:23:33 2006 From: gspangenberg at caalusa.org (Gail Spangenberg) Date: Tue, 19 Dec 2006 09:23:33 -0500 Subject: [Assessment 560] Document from National Commission on Adult Literacy Message-ID: <72D5C30B-C91D-418D-A0A3-655DA1CE971B@caalusa.org> Colleagues, From time to time over the next couple of years, the National Commission on Adult Literacy, which CAAL manages, will make background papers developed for the Commission available to the general public.
The first set of four papers -- all summary information papers on the role of the federal government in adult literacy -- has been pulled together into an informal publication that can be downloaded from the CAAL website (www.caalusa.org). The papers were written for the Commission by Lennox McLendon, Garrett Murphy, and Jim Parker. They are titled as follows: 1. Adult Education and Literacy Legislation and Its Effects on the Field (McLendon) 2. Adult Education & Literacy in the United States: Need for Services, What the Current Delivery System Looks Like (Murphy) 3. Introduction to Main Strands of Federal Adult Literacy Programming (Parker) 4. Federal Role in Adult Literacy, FY05-06 (Murphy) The combined document is quite large (2.2 MB) but it will download quickly for those with high speed connections. For those who prefer, CAAL can make copies available by regular mail at $25 per copy plus postage (contact bheitner at caalusa.org for ordering instructions). Happiest of holidays to you all, and a wonderful new year -- Gail Spangenberg President Council for Advancement of Adult Literacy 1221 Avenue of the Americas - 46th Fl New York, NY 10020 212-512-2362, F: 212-512-2610 www.caalusa.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061219/9869666e/attachment.html From gspangenberg at caalusa.org Wed Dec 20 17:00:03 2006 From: gspangenberg at caalusa.org (Gail Spangenberg) Date: Wed, 20 Dec 2006 17:00:03 -0500 Subject: [Assessment 561] A Holiday Bon Bon for Everyone Message-ID: Friends, I am pleased to share this wonderful bon bon with you. It is authored by Samuel Halperin, who was in on the founding of our adult education system. Sam is a member of the new National Commission on Adult Literacy and, I am proud to say, a member of CAAL's board of directors. Enjoy, remember, and dare to hope. Merry Holidays and Happy New Year. Gail Spangenberg REFLECTIONS ON THE FORTIETH BIRTHDAY OF THE ADULT EDUCATION ACT OF 1966 Samuel Halperin December 22, 2006 In November 1966, the United States took a small, but potentially momentous, legislative step to support a federally aided network of adult education providers under the Adult Education Act of 1966. (Technically, the Act is Title III of the Elementary and Secondary Education Act of 1965 (ESEA), as amended.) Who would have guessed then that this relatively unheralded act would spur a national network providing education and literacy services to over 2.5 million adult learners annually, including one million 16-24 year-olds, about half of whom study English as a second language? With hindsight, making sure that adults have a second chance to raise their literacy skills and continue their education beyond high school would seem to be clearly in the public interest. Yet, this obvious no-brainer required no less than a massive realignment of Congressional attitudes, a perceived serious threat to national security, and quite possibly a presidential assassination to turn a simple idea into legislative reality. The legislative paths to enactment of the Adult Education Act merit reflection in this 40th anniversary year because they demonstrate how extremely malleable, porous, and often quirky the process of making our laws is, and also because they illuminate the many opportunities for advocates who perceive openings and know how to seize them to make progress in the public interest.
Here are brief personal memories of those early days, as viewed from my experience as an executive branch "lobbyist" for Presidents John F. Kennedy and Lyndon B. Johnson:* In the early 1960s, adult educators were barely a presence in the halls of Congress. "None of its advocates," notes veteran educator Thomas Sticht, "was having much success getting adult education or adult literacy education implemented in federal legislation."** Ever since the demands of World War I had revealed how poorly prepared for military service were so many potential recruits -- intellectually, through very low literacy, as well as physically -- an Adult Basic Education bill (ABE) had been intermittently introduced in Congress beginning in 1918, and then promptly ignored. In a Congress long dominated by southern conservatives, "adult basic education" became conflated with efforts by liberals and the growing civil rights movement to teach "Negroes" how to pass the literacy tests that southern states had erected as effective barriers to the exercise of voting rights. (Southerners also noted with suspicion that the U.S. Office of Education's small adult education branch was headed, and almost exclusively staffed, by a de facto segregated staff of distinguished African American educators in a federal agency where Black senior executives were notable mostly by their absence.) After the defeat of President Kennedy's education proposals in 1961-62, his Administration devised an omnibus education bill, the National Education Improvement Act of 1963 (NEIA), consisting of 14 parts and incorporating teacher salaries, vocational education, public libraries, student financial aid, higher education construction, and several long-languishing proposals, including adult basic education. Commissioner of Education Francis Keppel, leading the NEIA legislative effort, knew that the entire 14-part package would not survive the church-state hurdles that had doomed earlier Kennedy proposals, but hoped that it might reduce the internecine warfare then prevailing among Washington's many diverse and fragmented education associations. Through hard work by House and Senate education committees headed, respectively, by Rep. Adam Clayton Powell and Sen. Wayne Morse, major parts of NEIA advanced in early 1963. But progress soon stalled as House and Senate chairs and various education associations quarreled and checkmated each other. It took the shock of President Kennedy's assassination and Lyndon Johnson's rise to vigorous leadership to open the legislative floodgates. By year's end, major bills for vocational and higher education were signed into law. Indeed, by the end of 1964, 12 of the less controversial parts of the NEIA had become law. During that period, too, several developments made it conceivable that adult education, with its anti-poverty focus, could at last get attention on Capitol Hill. In 1963, Daniel Patrick Moynihan, then an assistant secretary in the U.S. Department of Labor, was struck by the fact that among potential draftees under the Selective Service System at least one-third were found unfit for induction due to poor health or mental limitations, that is, very low levels of literacy. (Analysts believed that if all 18 year-olds had been examined, fully one-half would be found unfit.)
At the urging of Moynihan, Secretary of Labor Willard Wirtz, Secretary of Defense Robert McNamara, and General Lewis Hershey of the Selective Service System, President Kennedy ordered a Task Force on Manpower Conservation to develop appropriate plans for federal action. The Task Force report, One-Third of a Nation, was delivered to President Johnson on January 1, 1964. The report did not call for immediate legislation, nor is there any evidence that it led to Congressional action. Nevertheless, the critical connections between low literacy, national security, and poverty were given new and high-level visibility in the Nation's Capital. A mood was fast developing that some kind of federal action was long overdue. Then, in May 1964, President Johnson committed his administration to wage War on Poverty. He directed federal agencies to suggest what they could contribute to the development of what soon became the Economic Opportunity Act of 1964 and its new federal agency, the Office of Economic Opportunity (OEO). Assigned as the Department of Health, Education and Welfare's liaison to the OEO legislative task force headed by Adam Yarmolinsky, I argued there that several parts of the still-pending NEIA bill had relevance for any effort to combat poverty, and that such Administration proposals as adult basic education (ABE), libraries, and college work-study could be appropriately incorporated in the emerging OEO bill. Future OEO director Sargent Shriver and Yarmolinsky, however, rejected any targeted earmarking of anti-poverty funds, preferring to wield the broadest possible blanket authority to wage war on poverty in all its forms. Moreover, OEO people wanted nothing to do with a state grant program like ABE that would, they argued, be administered by unsympathetic, possibly even racist, state and local officials. Because passage of the OEO bill in the Congress depended on gaining the support of southerners, many of whom saw ABE as a wedge to undercut state literacy voting laws, ABE would have no place in the fast-developing OEO bill. But legislative possibilities changed dramatically when Congress passed the historic Voting Rights Act of 1964. The power of state literacy tests to thwart voting by Blacks would sharply decline, if not entirely disappear. The mood and tactics among southern lawmakers shifted accordingly. As one leading southern senator said in closed caucus, "If we are going to have to let 'them' vote, we had better be sure they can at least read." In the House of Representatives, responsibility for overseeing the contents of the draft Economic Opportunity bill was assigned to a subcommittee chaired by Carl Perkins, the ranking majority member on the House Education and Labor Committee, who represented an East Kentucky district characterized by high poverty and even higher illiteracy. During a meeting of Committee members and our HEW legislative staff to consider the provisions of the draft OEO bill, I raised with Mr. Perkins the relevance of including HEW's proposals for ABE and college work-study. Mr. Perkins immediately and enthusiastically embraced incorporating both provisions in the OEO bill when it was reported to the House of Representatives for its approval. Despite opposition from OEO to these inclusions, Perkins argued that the added provisions would strengthen support for the overall bill.
Thus, when President Johnson signed the Economic Opportunity Act on August 20, 1964 (Public Law 88-452), its Title IIB, the Adult Basic Education Act, authorized OEO to make grants to state education agencies to advance adult literacy. OEO promptly assigned administration of the two new programs to the U.S. Office of Education. On March 1, 1966, ABE and the college work-study legislative authorizations were formally transferred from OEO to the Office of Education. These transfers were much less the result of adult educators' lobbying efforts than of OEO's desire to rid itself of an unwelcome burden and, more especially, of the energetic campaign of Edith Green of Oregon, subcommittee chairman for higher education issues on the House Committee on Education and Labor. Mrs. Green, a formidable education leader, was strongly critical of President Johnson's war on poverty and, particularly, of the powers and funds it conferred on the new OEO "super-czar agency" to intervene in the traditional operations of many levels of government, including schools. Amid mounting sharp criticism of OEO's initial ventures in community action and legal services, Mrs. Green met scant resistance to "returning" HEW's original proposals to the U.S. Office of Education. Thus, forty years ago, the Adult Education Act was born, a small but durable foundation stone on which to build a much-needed adult learning system for the American people. Today, however, research shows that 93 million Americans over age 16 lack the literacy and skill levels needed to function effectively in a globally competitive, economically challenging world, one characterized by massive immigration of low-literacy workers. We must question whether a 40-year-old, generally under-funded adult education "system," staffed 80 percent by part-time instructors and often detached from the needs of cutting-edge economic developments, is even faintly adequate to meet the challenges of the 21st Century. This is clearly not America's moment to rest on the anniversary laurels of 1966. Rather, we must forge ahead to help our nation's children and adults become the most skilled, the most literate, and the most empowered generations in our national history. * As Assistant U.S. Commissioner of Education for Legislation and Deputy Assistant Secretary for Legislation in the Department of Health, Education and Welfare during the years 1961-1969. ** Sticht, T. (2002). The Rise of the Adult Education and Literacy System of the United States: 1600-2000. In J. Comings, B. Garner, and C. Smith (eds.), Annual Review of Adult Learning and Literacy, vol. 3. San Francisco: Jossey-Bass, pp. 10-43. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061220/00371ced/attachment.html From marie.cora at hotspurpartners.com Thu Dec 21 10:21:59 2006 From: marie.cora at hotspurpartners.com (Marie Cora) Date: Thu, 21 Dec 2006 10:21:59 -0500 Subject: [Assessment 562] Message for the New Year Message-ID: <0a8801c72513$c31184a0$0202a8c0@LITNOW> The following message is from Tom Sticht. Marie Cora Assessment Discussion List Moderator ------- December 20, 2006 Peeling Potatoes for Freedom from Want: A Message for the New Year of 2007 Tom Sticht International Consultant in Adult Education On February 13, 2003, the United Nations Literacy Decade was launched, with the theme of Literacy as Freedom. To celebrate the U.N.
Decade of Literacy's theme of Literacy as Freedom, I conducted a speaking tour in the United Kingdom, Canada and the United States called Literacy Frees the World (LFW). Part of the LFW presentation discussed the work of adult literacy educators contributing to the achievement by adult learners of each of the Four Freedoms discussed by President Franklin D. Roosevelt in 1941. The presentation also discussed the importance of teaching literacy following Functional Context Education (FCE) principles, in which the teaching of basic skills is integrated with or embedded in the teaching of knowledge within the contexts of the adult students' lives. One illustration of this FCE approach occurred during World War II. During that war, over a quarter million young men learned to read in Special Training Units in the Army. One of the resources used to teach reading was a newspaper, Our War, which was published monthly from June 1942 through September 1945. Each issue of Our War included a cartoon strip about the fictional Private Pete and his buddy, Daffy. The January 1944 issue discussed the New Year. It has a message pertinent to today's times and circumstances, including the United Nations Literacy Decade theme of Literacy as Freedom. Following is a synopsis of the strip. Our War January 1944 Private Pete Starts the New Year Right The strip opens with a panel showing Private Pete and Daffy dancing at a U.S.O. party. Then they walk back to their barracks and Private Pete says to Daffy, "That was a swell party. Sure makes you want to start the New Year right." The next strip shows Pete saying to Daffy, who is still in his bunk, "Time to get up! Remember your New Year's resolutions. Let's go!" After making up his bunk, Pete says, "Are these corners o.k., Daffy?" Daffy says, "They sure are. And look at mine, too." Daffy then says, "Look at the shine on these shoes, Pete." Pete says, "Good! But what about your laundry?" The next panel is two weeks later, on Jan. 15. Daffy says, "Gee, I've kept my promises for two whole weeks." Next panel, Pete and Daffy walk into the barracks day room. Daffy says, "What are you going to do, Pete?" Pete says, "I'm going to find out more about the war. Remember, I promised to do that all year long." Looking at a magazine in the day room, Pete says, "This magazine shows what happened when the war started." Several panels on, Pete says, "And there's another big job to do after the war is over." Next panel, Pete says, "Remember the poster we saw about the Four Freedoms?" Daffy says excitedly, "I know them: FREEDOM FROM WANT, FREEDOM FROM FEAR, FREEDOM OF RELIGION, FREEDOM OF SPEECH." Next the cartoon shows Pete and Daffy walking from their barracks to the mess hall. Pete says, "Everything we do helps to win the war." Daffy says, "You mean K.P., too?" In the final panel, Daffy and Pete are peeling potatoes while on K.P., and Daffy says, with a smile on his face, "Guess this takes care of freedom from want!" Still, today, over sixty years after the fictional discussion of Private Pete and Daffy, men and women around the world are still seeking literacy as one of the key tools for securing the Four Freedoms for themselves and their families. And hundreds of thousands of adult literacy teachers are working, often under dangerous and oppressive conditions, to help these adults achieve life-sustaining and life-enhancing literacy skills. With great resolve, this work by literacy learners and their teachers will continue in the New Year of 2007! Literacy Frees the World! Happy New Year!
Thomas G. Sticht International Consultant in Adult Education 2062 Valley View Blvd. El Cajon, CA 92019-2059 Tel/fax: (619) 444-9133 Email: tsticht at aznet.net From kabeall at comcast.net Wed Dec 27 19:51:38 2006 From: kabeall at comcast.net (Kaye Beall) Date: Wed, 27 Dec 2006 19:51:38 -0500 Subject: [Assessment 563] New from NCSALL Message-ID: <008001c72a1a$56756940$0302a8c0@your4105e587b6> The Components of Numeracy, by Lynda Ginsburg, Myrna Manly, and Mary Jane Schmitt. This occasional paper attempts to describe the complex nature of numeracy as it exists today. While there are large-scale assessments, standards documents, and position papers, there has not been a field- and research-based synthesis of the components required for adults to be numerate, to act numerately, and to acquire numeracy skills. This paper attempts to identify and clarify the nature of these components with the hope that such identification and clarification will guide instruction, contribute to the design of assessments, frame research, and inform policy. To download the paper, go to: http://www.ncsall.net/?id=26#numeracy **************** Kaye Beall World Education 4401 S. Madison St. Muncie, IN 47302 Tel: 765-717-3942 Fax: 617-482-0617 kaye_beall at worlded.org http://www.worlded.org -------------- next part -------------- An HTML attachment was scrubbed... URL: http://www.nifl.gov/pipermail/assessment/attachments/20061227/c5154f3f/attachment.html