This is the accessible text file for GAO report number GAO-06-948T 
entitled 'No Child Left Behind Act: States Face Challenges Measuring 
Academic Growth' which was released on July 27, 2006. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Testimony: 

Before the House Committee on Education and the Workforce: 

United States Government Accountability Office: 

GAO: 

For Release on Delivery Expected at 10:00 a.m. EDT: 

Thursday, July 27, 2006: 

No Child Left Behind Act: 

States Face Challenges Measuring Academic Growth: 

Statement of Marnie S. Shaul, Director, Education, Workforce, and 
Income Security Issues: 

GAO-06-948T: 

GAO Highlights: 

Highlights of GAO-06-948T, a testimony before the House Committee on 
Education and the Workforce 

Why GAO Did This Study: 

The No Child Left Behind Act (NCLBA) requires that states improve 
academic performance so that all students reach proficiency in reading 
and mathematics by 2014 and that achievement gaps close among student 
groups. States set annual proficiency targets using an approach known 
as a status model, which calculates test scores 1 year at a time. Some 
states are interested in using growth models that measure changes in 
test scores over time to determine if schools are meeting proficiency 
targets. 

The Chairman of the Committee on Education and the Workforce asked GAO 
to testify on its recent report on measuring academic growth. 
Specifically, this testimony discusses (1) how many states are using 
growth models and for what purposes, (2) how growth models can measure 
progress toward achieving key NCLBA goals, and (3) what challenges 
states face in using growth models especially to meet the law’s key 
goals. 

While growth models may be defined as tracking the same students over 
time, GAO used a definition that also included tracking the performance 
of schools and groups of students. In comments on the report, Education 
said that this definition could be confusing. GAO used this definition 
of growth to reflect the variety of approaches states were taking. 

What GAO Found: 

Nearly all states were using or considering growth models for a variety 
of purposes in addition to their status models as of March 2006. Twenty-
six states were using growth models, and another 22 were either 
considering or in the process of implementing them. Most states using 
growth models measured progress for schools and for student groups, and 
7 also measured growth for individual students. States used growth 
models to target resources for students who need extra help or to 
award teachers bonuses based on their school’s performance. 

Figure: States That Reported Using or Considering Growth Models, as of 
March 2006: 

[See PDF for Image] 

Source: GAO analysis. 

[End of Figure] 

Certain growth models are capable of tracking progress toward the goals 
of universal proficiency by 2014 and closing achievement gaps. For 
example, Massachusetts uses its model to set targets based on the 
growth that it expects from schools and their student groups. Schools 
can make adequate yearly progress (AYP) if they reach these targets, 
even if they fall short of reaching the statewide proficiency targets 
set with the state’s status model. Tennessee designed a model that 
projects students’ test scores and whether they will be proficient in 
the future. In this model, if 79 percent of a school’s students are 
predicted to be proficient in 3 years, the school would reach the 
state’s 79 percent proficiency target for the current school year (2005-
2006). 

States face challenges measuring academic growth, such as creating data 
and assessment systems to support growth models and having personnel to 
analyze and communicate results. The use of growth models to determine 
AYP may also challenge states to make sure that students in low-
performing schools receive needed assistance. U.S. Department of 
Education (Education) initiatives may help states address these 
challenges. Education started a pilot project for states to use growth 
models that meet the department’s specific criteria, including models 
that track progress of individual students, to determine AYP. Education 
also provided grants to states to track individual test scores over 
time. 

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-948T]. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Marnie S. Shaul (202) 512-
7215 or shaulm@gao.gov. 

[End of Section] 

Mr. Chairman and Members of the Committee: 

I am pleased to be here today to discuss our report, which describes 
how states use growth models to measure academic performance and how 
these models can measure progress toward achieving key goals of the No 
Child Left Behind Act of 2001 (NCLBA). With annual expenditures 
approaching $13 billion for Title I alone, NCLBA represents the 
federal government's single largest investment in the education of the 
48 million students who attend public schools. The NCLBA--the most 
recent reauthorization of the Elementary and Secondary Education Act of 
1965--requires states to improve academic performance so that all 
students are proficient by 2014 and that achievement gaps close among 
groups such as economically disadvantaged students. The upcoming 
reauthorization of the law presents an opportunity to discuss some key 
issues associated with the act. 

To measure whether schools are making adequate yearly progress (AYP) 
toward having all students proficient by 2014, states set annual 
proficiency targets using an approach known as a status model, which 
calculates test scores 1 year at a time. With status models, states or 
districts determine whether schools make AYP based on performance for 
the year while generally not taking into account how much better or 
worse the school performed than in the previous year. Thus, a school that 
is showing significant improvement in student achievement but has too 
few students at the proficient level would not likely make AYP. 

In addition to determining whether schools meet proficiency targets, 
some states are also interested in recognizing schools that make 
progress toward NCLBA goals. Growth models can measure progress in 
achievement or proficiency over time and vary in complexity, such as 
calculating annual progress in a school's average test scores from year 
to year; estimating test score progress while taking into account how 
factors such as student background may affect such progress; or 
projecting future scores based on current and prior years' results. 
While growth models are sometimes defined as tracking the same students 
over time, because of the committee's interest in the range of models 
states are using to measure academic improvement, we define a growth 
model as a model that measures changes in proficiency levels or test 
scores of a student, group, grade, school, or district for 2 or more 
years. We included models that track schools and student groups in 
order to provide a broad assessment of options that may be available to 
states. 

My testimony today will focus on how growth models may provide useful 
information on academic performance. Specifically, I will discuss (1) 
how many states are using growth models and for what purposes, (2) how 
growth models can measure progress toward achieving key NCLBA goals, 
and (3) what challenges states face in using growth models especially 
to meet the law's key goals. 

My written statement is drawn from our recent report on growth models, 
which we completed for the committee.[Footnote 1] For this report, we 
conducted a survey of all states to determine whether they were using 
growth models. We conducted telephone interviews with state and local 
education agency officials in eight states that collectively use a 
variety of growth models, and conducted site visits to California, 
Massachusetts, North Carolina, and Tennessee. For Massachusetts and 
Tennessee, we analyzed student-level data from selected schools to 
illustrate how their models measure progress toward key NCLBA goals. We 
conducted this work in accordance with generally accepted government 
auditing standards. 

Summary: 

In summary, nearly all states were using or considering growth models 
for a variety of purposes in addition to their status models as of 
March 2006. Twenty-six states were using growth models, and another 22 
were either considering or in the process of implementing them. Most 
states that used growth models did so for schools as a whole and for 
particular groups of students, and 7 also measured growth for 
individual students. For example, Massachusetts measured growth for 
schools and groups of students but did not track individual students' 
scores, while Tennessee set different expectations for growth for each 
student based on the student's previous test scores. Seventeen of the 
states that used growth models had been doing so prior to passage of 
the NCLBA, while 9 began after the law's passage. States used their 
growth models for a variety of purposes, such as targeting resources 
for students who need extra help or awarding teachers bonus money 
based on their school's relative performance. 

Certain growth models are capable of tracking progress toward the goals 
of universal proficiency by 2014 and closing achievement gaps. For 
example, Massachusetts uses its model to set targets based on the 
growth that it expects from schools and their student groups. Schools 
can make AYP if they reach these targets, even if they fall short of 
reaching the statewide proficiency targets set with the state's status 
model. Tennessee designed a model, different from the one used for 
state purposes described above, that projects students' test scores and 
whether they will be proficient in the future. In this model, if 79 
percent of a school's students are predicted to be proficient in 3 
years, the school would reach the state's 79 percent proficiency target 
for the current school year (2005-2006). 

States face challenges in developing and implementing growth models 
that would allow them to meet NCLBA goals. Technical challenges include 
creating data and assessment systems to meet the substantial data 
requirements of growth models and having personnel that can analyze and 
communicate growth model results. For example, states need to have 
tests that are comparable from one year to the next to accurately 
measure progress. Further, some models require sophisticated data 
systems that have the capacity to track individual student performance 
across grades and schools. Using growth models can present risks for 
states if schools are designated as making AYP while still needing 
assistance to progress. For example, one school in Tennessee that did 
not make AYP under the status model would make AYP under the state's 
proposed growth model. This school is located in a high-poverty, inner- 
city neighborhood and has been receiving federal assistance targeted to 
improving student performance. If the school continues to make AYP 
under the growth model, its students would no longer receive federally 
required services, such as tutoring or the option of transferring to a 
higher performing school. On the other hand, the school's progress may 
result in its making AYP in the future under the state's status model. 
U.S. Department of Education (Education) initiatives may help states 
address these challenges. For example, Education started a pilot 
project for states to use growth models that meet the department's 
specific criteria to determine AYP. Education also provided grants to 
states to support their efforts to track individual test scores over 
time. 

By proceeding with a pilot project with clear goals and criteria and by 
requiring states to compare results from their growth model with status 
model results, Education is poised to gain valuable information on 
whether growth models are overstating progress or appropriately giving 
credit to fast-improving schools. In comments on a 
draft of our recent report, Education expressed concern that the use of 
a broader definition of growth models would be confusing. GAO used this 
definition in order to reflect the variety of approaches states have 
been taking to measure growth in academic performance. 

Background: 

The NCLBA[Footnote 2] requires states to set challenging academic 
content and achievement standards in reading or language arts and 
mathematics[Footnote 3] and determine whether school districts and 
schools make AYP toward meeting these standards.[Footnote 4] To make 
AYP, schools generally must: 

* show that the percentage of students scoring at the proficient level 
or higher meets the state proficiency target for the school as a whole 
and for designated student groups, 

* test 95 percent of all students and those in designated groups, and: 

* meet goals for an additional academic indicator, such as the state's 
graduation rate. 

The purpose of Title I Part A is to improve academic achievement for 
disadvantaged students. Schools receiving Title I federal funds that do 
not make AYP for 2 or more years in a row must take action to assist 
students, such as offering students the opportunity to transfer to 
other schools or providing additional educational services like 
tutoring. 

States measure AYP using a status model that determines whether or not 
schools and students in designated groups meet proficiency targets on 
state tests 1 year at a time. States generally used data from the 2001- 
2002 school year to set the initial percentage of students that needed 
to be proficient for a school to make AYP, known as a starting point. 
From this point, they set annual proficiency targets that increase up 
to 100 percent by 2014. For example, for schools in a state with a 
starting point of 28 percent to achieve 100 percent by 2014, the 
percentage of students who scored at or above proficient on the state 
test would have to increase by 6 percentage points each year, as shown 
in figure 1.[Footnote 5] Schools that do not reach the state target 
will generally not make AYP. 

Figure 1: Hypothetical Example of Annual Proficiency Targets Set under 
a Status Model: 

[See PDF for image] 

Source: GAO. 

[End of figure] 
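
To make the arithmetic behind figure 1 concrete, the following sketch 
(in Python, with hypothetical values) computes evenly spaced annual 
targets from a 28 percent starting point; it is an illustration only, 
since states could choose other paths to 100 percent, as noted in 
footnote 5. 

def annual_targets(starting_point, start_year=2002, end_year=2014): 
    # Evenly spaced yearly increases; actual state paths may differ 
    # (for example, stair-step increases at least once every 3 years). 
    years = end_year - start_year 
    step = (100.0 - starting_point) / years  # 72 / 12 = 6 points per year for a 28% start 
    return {start_year + i: starting_point + step * i for i in range(years + 1)} 

targets = annual_targets(28) 
print(targets[2003], targets[2014])  # 34.0 100.0 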

The law indicates that states are expected to close achievement gaps, 
but does not specify annual targets to measure progress toward doing 
so. States thus have flexibility in the rate at which they close 
these gaps. To determine the extent to which achievement gaps are closing, 
states measure the difference in the percentage of students in 
designated student groups and their peers that reach proficiency. For 
example, an achievement gap exists if 40 percent of a school's non- 
economically disadvantaged students were proficient compared with only 
16 percent of economically disadvantaged students, a gap of 24 
percentage points. To close the gap, the percentage of students in the 
economically disadvantaged group that reaches proficiency would have to 
increase at a faster rate than that of their peers. 

If a school misses its status model target in a given year, the law's 
"safe harbor" provision offers a way for schools that are showing 
significant increases in the proficiency rates of student groups to 
make AYP. Safe harbor measures academic performance in a way similar to 
certain growth models: it allows a school to make AYP by reducing the 
percentage of students in designated student groups who were not 
proficient by 10 percent, so long as the groups also show progress on 
another academic indicator. For example, in a state with a status model 
target of 40 percent proficient, a school could make AYP under safe 
harbor if 63 percent of a student group was not proficient, compared 
with 70 percent in the previous year. 
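
The safe harbor arithmetic in this example can be expressed as a 
minimal sketch (a hypothetical Python illustration; it omits the 
additional academic indicator that groups must also meet): 

def meets_safe_harbor(pct_not_proficient_prev, pct_not_proficient_now): 
    # The share of non-proficient students in a group must fall by at 
    # least 10 percent relative to the previous year. 
    return pct_not_proficient_now <= 0.90 * pct_not_proficient_prev 

print(meets_safe_harbor(70, 63))  # True: 63 percent is a 10 percent reduction from 70 percent 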

Nearly All States Reported Using or Considering Growth Models to 
Measure Academic Performance: 

Twenty-six states reported in our March 2006 survey that they were 
using growth models, in addition to their status models, to track the 
performance of schools, designated student groups, or individual 
students. Nearly all of the remaining states were considering or in the 
process of implementing growth models (see fig. 2). 

Figure 2: States That Reported Using or Considering Growth Models, as 
of March 2006: 

[See PDF for image] 

Source: GAO survey. 

[End of figure] 

Of the 26 states using growth models, 19 states reported measuring 
changes for schools and student groups, while 7 states reported 
measuring changes for schools, student groups, and individuals, as 
shown in table 1. 

Table 1: Types of Growth Models and States Using Them, as of March 
2006: 

Measures growth of schools and groups: Compares the change in scores or 
proficiency levels of schools or groups of students over time; Data 
requirements, such as measuring proficiency rates for schools or 
groups, are similar to those for status models. 
Measures growth of schools, groups, and individual students: Compares 
the change in scores or proficiency levels of schools, groups of 
students, and individual students over time; Data requirements, such as 
tracking the proficiency levels or test scores for individual students, 
are typically more involved than those for status models. 

Measures growth of schools and groups: Arizona, California, Colorado, 
Connecticut, Delaware, Indiana, Kentucky, Louisiana, Massachusetts, 
Michigan, Minnesota, Missouri, New York, Ohio, Oklahoma, Oregon, 
Pennsylvania, Vermont, Washington; 
Measures growth of schools, groups, and individual students: Florida, 
Mississippi, North Carolina, South Carolina, Tennessee, Texas, Utah. 

Source: GAO survey. 

[End of table] 

For example, Massachusetts used a model that measures growth for the 
school as a whole and for designated student groups. The state awards 
points to schools in 25-point increments for each student,[Footnote 6] 
depending on how students scored on the state test. Schools earn 100 
points for each student who reaches proficiency, but fewer points for 
students below proficiency. The state averages the points to award a 
final score to schools. Growth in Massachusetts is calculated by taking 
the difference in the annual scores that a school earns between 2 
years. Figure 3 illustrates the growth a school can make from one year 
to the next as measured by the Massachusetts model. 

Figure 3: Illustration of School-Level Growth: 

[See PDF for image] 

Source: GAO and Art Explosion. 

[End of figure] 
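
A minimal sketch of the point-averaging logic described above follows; 
the 100 points for a proficient student comes from this testimony, 
while the labels and point values for lower performance levels are 
assumptions made only for illustration. 

# Assumed illustrative point values in 25-point increments; only the 
# 100-point award for proficient students is stated in this testimony. 
POINTS = {"proficient": 100, "needs_improvement_high": 75, 
          "needs_improvement_low": 50, "warning_high": 25, "warning_low": 0} 

def school_score(student_levels): 
    # Average the per-student points to get the school's annual score. 
    return sum(POINTS[level] for level in student_levels) / len(student_levels) 

def growth(levels_last_year, levels_this_year): 
    # Growth is the change in the school's average score between 2 years. 
    return school_score(levels_this_year) - school_score(levels_last_year) 

print(growth(["warning_low", "proficient"], ["needs_improvement_high", "proficient"]))  # 37.5 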

Tennessee reported using a growth model that sets different goals for 
each individual student based on the student's previous test scores. 
The goal is the score that a student would be expected to receive, and 
any difference between a student's expected and actual score is 
considered that student's amount of yearly growth,[Footnote 7] as shown 
in figure 4. 

Figure 4: Example of Higher-than-Expected Growth for a Fourth-Grade 
Student under Tennessee's Model: 

[See PDF for image] 

Source: GAO illustration based on information provided by the state of 
Tennessee. 

[End of figure] 
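
In its simplest form, the calculation described above reduces to the 
difference between a student's actual and expected scores, as in this 
sketch with hypothetical numbers: 

def yearly_growth(expected_score, actual_score): 
    # A student's growth is the actual score minus the score expected 
    # from the student's previous test results. 
    return actual_score - expected_score 

print(yearly_growth(expected_score=520, actual_score=545))  # 25 points of higher-than-expected growth 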

In addition, Tennessee's model, known as the Tennessee Value-Added 
Assessment System, estimates the unique contribution--the value added-
-that the teacher and school make to each individual student's growth 
in test scores over time.[Footnote 8] The state then uses that amount 
of growth, the unique contribution of the school, and other information 
to determine whether schools are below, at, or above their level of 
expected performance. The model also grades schools with an A, B, C, D, 
or F, which is considered a reflection of the extent to which the 
school is meeting its requirements for student learning. 

Seventeen of the 26 states using growth models reported that their 
models were in place before the passage of the NCLBA during the 2001- 
2002 school year, and the remaining 9 states implemented them after the 
law was passed. States used them for purposes such as rewarding 
effective teachers and designing intervention plans for struggling 
schools. For example, North Carolina used its model as a basis to 
decide whether teachers receive bonus money. Tennessee used its value- 
added model to provide information about which teachers are most 
effective with which student groups. In addition to predicting 
students' expected scores on state tests, Tennessee's model was used to 
predict scores on college admissions tests, which is helpful for 
students who want to pursue higher education. In addition, California 
used its model to identify schools eligible for a voluntary improvement 
program. 

Certain Growth Models Can Measure Progress toward Key NCLBA Goals: 

Certain growth models can measure progress in achieving key NCLBA goals 
of reaching universal proficiency by 2014 and closing achievement gaps. 
While states developed growth models for purposes other than NCLBA, 
states such as Massachusetts and Tennessee have adjusted their state 
models to use them to meet NCLBA goals. The Massachusetts model has 
been used to make AYP determinations as part of the state's 
accountability plan in place since 2003. Tennessee submitted a new 
model to Education for the growth models pilot that differs from the 
value-added model described earlier. This new model gives schools 
credit for students projected to reach proficiency within 3 years in 
order to meet key NCLBA goals. Our analysis of how models in 
Massachusetts and Tennessee can measure progress toward the law's two 
key goals is shown in table 2. 

Table 2: How a Status Model and Certain Growth Models Measure Progress 
in Achieving Key NCLBA Goals: 

Universal proficiency by 2014; 
Status model: Sets same annual proficiency target for all schools in 
the state; 
Growth models: Massachusetts (school- level and group-level): Sets 
biennial growth targets for each school/group in the state; 
Growth models: Tennessee[A] (Student-Level): Sets same annual 
proficiency target for all schools in the state. 

Universal proficiency by 2014; 
Status model: State proficiency targets increase incrementally to 100% 
by 2014; 
Growth models: Massachusetts (school- level and group-level): 
School/group growth targets increase incrementally to 100% proficiency 
by 2014; increments may be different by school/group; 
Growth models: Tennessee[A] (Student-Level): State proficiency targets 
increase incrementally to 100% by 2014; Projects future test scores to 
determine if students may be proficient. 


Universal proficiency by 2014; 
Status model: School makes AYP if it reaches the state proficiency 
target; 
Growth models: Massachusetts (school- level and group-level): School 
makes AYP if it reaches the state proficiency target or its own growth 
model targets; 
Growth models: Tennessee[A] (Student-Level): School makes AYP if it 
reaches the state proficiency target based on students projected to be 
proficient in the future. 

Closing achievement gaps; 
Status model: State proficiency target applies to each student group in 
all schools; 
Growth models: Massachusetts (school- level and group-level): Each 
student group in a school has its own growth target; 
Growth models: Tennessee[A] (Student-Level): State proficiency target 
applies to each student group in all schools. 

Closing achievement gaps; 
Status model: School makes AYP if each student group reaches the state 
proficiency target; 
Growth models: Massachusetts (school- level and group-level): School 
makes AYP if each student group reaches the state proficiency target or 
its own growth model target; 
Growth models: Tennessee[A] (Student-Level): School makes AYP if each 
student group reaches the state proficiency target based on students 
projected to be proficient in the future.  

Source: GAO analysis of NCLBA and of information provided by the states 
of Massachusetts and Tennessee. 

Note: Additional requirements for schools to make AYP are described in 
the background section of our report. Massachusetts refers to 
proficiency targets as performance targets and refers to growth targets 
as improvement targets. 

[A] The information presented in this table reflects the model 
Tennessee proposed to use as part of Education's growth model pilot 
project, as opposed to the value-added model it uses for state 
purposes. The information is based on the March 2006 revision of the 
proposal the state initially made in February 2006. 

[End of table]

Massachusetts designed a model that can measure progress toward the key 
goals of NCLBA by setting targets for the improvement of schools and 
their student groups that increase over time until all students are 
proficient in 2014. Schools can get credit for improving student 
proficiency even if, in the short term, the requisite number of 
students has yet to reach the state's status model proficiency targets. 
For example, figure 5 illustrates a school that is on track to make AYP 
annually through 2014 by reaching its growth targets. While these 
growth targets increase at a faster pace than the state's annual 
proficiency target until 2014, they do provide the school with an 
additional measure by which it can make AYP. 

Figure 5: Targets for a Selected School in Massachusetts Compared to 
State Status Model Targets: 

[See PDF for image] 

Source: GAO analysis of data provided by Massachusetts Department of 
Education; Commonwealth of Massachusetts Consolidated State Application 
Accountability Workbook, June 29, 2005. 

[End of figure] 

The model also measures whether achievement gaps are closing by setting 
targets for designated student groups, similar to how it sets targets 
for schools as a whole. Schools that increase proficiency too slowly-- 
that is, do not meet status or growth targets--will not make AYP. For 
example, one selected school in Massachusetts showed significant gains 
for several designated student groups that were measured against their 
own targets. However, the school did not make AYP because gains for one 
student group were not sufficient. This group--students with 
disabilities--fell short of its growth target, as shown in figure 6. 

Figure 6: Results for a Selected School in Massachusetts in 
Mathematics: 

[See PDF for image] 

Source: GAO analysis of data provided by the Massachusetts Department 
of Education. 

[End of figure] 

Tennessee developed a different model that can also measure progress 
toward the NCLBA goals of universal proficiency and closing achievement 
gaps. Tennessee created a new version of the model it had been using 
for state purposes to better align with NCLBA.[Footnote 9] Referred to 
as a projection model, this approach projects individual students' test 
scores into the future to determine when they may reach the state's 
status model proficiency targets. 

In order to make AYP under this proposal, a school could reach the 
state's status model targets by counting as proficient in the current 
year those students who are predicted to be proficient in the future. 
The state projects scores for elementary and middle school students 3 
years into the future to determine if they are on track to reach 
proficiency, as follows: 

* fourth-grade students projected to reach proficiency by seventh 
grade, 

* fifth-grade students projected to reach proficiency by eighth grade, 
and: 

* sixth-, seventh-, and eighth-grade students projected to reach 
proficiency on the state's high school proficiency test. 

These projections are based on prior test data and assume that the 
student will attend middle or high schools with average performance (an 
assumption known as average schooling experience).[Footnote 10] At our 
request, Tennessee provided analyses for students in several schools 
that would make AYP under the proposed model. To demonstrate how the 
model works, we selected students from a school and compared their 
actual results in fourth grade (panel A) with their projected results 
for seventh grade (panel B) (see fig. 7). 

Figure 7: Results for Selected Students in Mathematics from a School in 
Tennessee: 

[See PDF for image] 

Source: Tennessee Department of Education.  

Note: The same students are presented in both panels (for example, 
student A in panel A is the same student as student A in panel B). 
While these data reflect the scores of individual students, Tennessee 
provided data to GAO in such a way that student privacy and 
confidentiality were ensured. Data are illustrative and are not meant 
to be a statistical representation of the distribution of students in 
this school. 

[End of Figure] 
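
A minimal sketch of how such projections could feed an AYP 
determination follows; the statistical projection itself, which 
Tennessee derives from prior test data under the average schooling 
experience assumption, is represented here only as a precomputed flag 
for each student. 

def makes_ayp_under_projection(projected_proficient_flags, state_target_pct): 
    # Count students projected to reach proficiency within 3 years toward 
    # the current-year status model target. 
    pct = 100.0 * sum(projected_proficient_flags) / len(projected_proficient_flags) 
    return pct >= state_target_pct 

flags = [True] * 79 + [False] * 21  # 79 of 100 students projected to be proficient 
print(makes_ayp_under_projection(flags, 79))  # True: meets a 79 percent proficiency target 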

Tennessee's proposed model can also measure achievement gaps. Under 
NCLBA, a school makes AYP if all student groups meet the state 
proficiency target. In Tennessee's model, whether an achievement gap is 
closing would be determined by projecting whether students in each 
designated group will meet the state proficiency target. 

States Face Challenges in Implementing Growth Models: 

States generally face challenges in collecting and analyzing the data 
required to implement growth models, including models that would meet 
the law's goals. In addition, using growth models can present risks for 
states if schools are designated as making AYP while still needing 
assistance to progress. Education has initiatives that may help states 
address these challenges. 

States must meet certain additional data system requirements to 
implement growth models, including models that would meet NCLBA 
requirements. 

First, a state's ability to collect comparable data over at least 2 
years is a minimum requirement for any growth model. States must ensure 
that test results are comparable from one year to the next and possibly 
from one grade to the next, both of which are especially challenging 
when test questions and formats change. Second, the capacity to collect 
data across time and schools is also required to implement growth 
models that use student-level data. This capacity often requires a 
statewide system to assign unique numbers to identify individual 
students. Developing and implementing these systems is a complicated 
process that includes assigning numbers, setting up the system in all 
schools and districts, and correctly matching individual student data 
over time, among other steps. Third, states need to ensure that the 
data used in their performance calculations are free from errors. While 
ensuring data accuracy is important for status models, doing so is 
particularly important for growth models, because errors in multiple 
years can accumulate, leading to unreliable results. 
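
As a rough illustration of the kind of matching such data systems 
perform, the sketch below links hypothetical test records by a 
statewide unique student identifier; the record layout and identifiers 
are assumptions, not any state's actual design. 

from collections import defaultdict 

# Hypothetical records: (unique_student_id, school_year, test_score). 
records = [("0001", 2004, 512), ("0001", 2005, 531), ("0002", 2004, 498)] 

def link_scores_by_student(records): 
    # Group scores by unique student identifier so year-to-year growth can 
    # be computed even when a student changes schools. 
    history = defaultdict(dict) 
    for student_id, year, score in records: 
        history[student_id][year] = score 
    return history 

print(link_scores_by_student(records)["0001"])  # {2004: 512, 2005: 531} 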

States also need greater research and analysis expertise to use growth 
models, as well as support for the people who manage and communicate 
the models' results. For example, Tennessee officials told us that they 
have contracted with a software company for several years because of 
the complexity of the model and its underlying data system. Florida has 
a contract with a local university to assist it with assessing data 
accuracy, including unique student identifiers required for its model. 
In addition, states will incur training costs as they inform teachers, 
administrators, media, legislators, and the general public about the 
additional complexities that occur when using growth models. For 
example, administrators in one district in North Carolina told us that 
their district lacks enough specialists who can explain the state's 
growth model to all principals and teachers in need of guidance and 
additional training. 

Using growth models can present risks for states if schools are 
designated as making AYP while still needing assistance to progress. On 
the basis of growth model results, some schools would make AYP even 
though these schools may have relatively low-achieving students. As a 
result, some students in Title I schools may be disadvantaged by not 
receiving federally required services. 

In two Massachusetts districts that we analyzed, 23 of the 59 schools 
that made AYP did so based on the state's growth model, even though 
they did not reach the state's status model proficiency rate targets in 
2003-2004.[Footnote 11] Consequently, these schools may not be eligible 
to receive services required under NCLBA for schools in need of 
improvement, such as tutoring and school choice. Because these schools 
would need to sustain high growth rates in order to achieve universal 
proficiency by 2014, it is likely that their students would benefit 
from additional support. 

In Tennessee, 47 of the 353 schools that had not made AYP in the 2004- 
2005 school year would do so under the state's proposed projection 
model. One school that would be allowed to make AYP under the proposed 
model was located in a high-poverty, inner-city neighborhood. That 
school receives Title I funding, as two-thirds of its students are 
classified as economically disadvantaged. The school was already 
receiving services required under NCLBA to help its students. If the 
school continues to make AYP under the growth model, these services may 
no longer be provided. 

Education's initiatives, such as the growth model pilot project, may 
facilitate growth model implementation. In November 2005, Education 
announced a pilot project for states to submit proposals for using a 
growth model--one that meets criteria established by the department-- 
along with their status model, to determine AYP. While NCLBA does not 
specify the use of growth models for making AYP determinations, the 
department started the pilot to evaluate how growth models might help 
schools meet NCLBA proficiency goals and close achievement gaps. 

For the growth model pilot project, each state had to demonstrate how 
its growth model proposal met Education's criteria, many of which are 
consistent with the legal requirements of status models. In addition to 
those requirements, Education included criteria that the proposed 
models track student progress over time and have an assessment system 
with tests that are comparable over time. Of the 20 states that 
submitted proposals, Education approved 2--North Carolina and 
Tennessee--to use 
growth models to make AYP determinations in the 2005-2006 school year. 
States may submit proposals for the pilot again this fall. 

Education and peer reviewers noted that, in addition to meeting all of 
the criteria, Tennessee and North Carolina had many years of 
experience with data systems that support growth models. These states 
must report to Education the number of schools that made AYP on the 
basis of their status and growth models. Education expects to share the 
results with other states, Congress, and the public after it assesses 
the effects of the pilot. 

In addition to the growth model pilot project, Education awarded nearly 
$53 million in grants to 14 states for the design and implementation of 
statewide longitudinal data systems--systems that are essential for the 
development of student-level growth models. While the grants are 
independent of the pilot project, states with a longitudinal data 
system--one that gathers 
data such as test scores on the same student from year to year--will be 
better positioned to implement a growth model than they would have been 
without it. Education intended the grants to help states generate and 
use accurate and timely data to meet reporting requirements, support 
decision making, and aid education research, among other purposes. 
Education plans to disseminate lessons learned and solutions developed 
by states that received grants. 

Conclusion: 

While status models provide a snapshot of academic performance, growth 
models can provide states with more detailed information on how 
schools' and students' performance has changed from year to year. 
Growth models can recognize schools whose students are making 
significant gains on state tests but are still not proficient. 
Educators can use information about the academic growth of individual 
students to tailor interventions to the needs of particular students or 
groups. In this respect, models that measure individual students' 
growth provide the most in-depth and useful information, yet the 
majority of the models currently in use are not designed to do this. 

Through its approval of Massachusetts' model and the growth model pilot 
program, Education is proceeding prudently in its effort to allow 
states to use growth models to meet NCLBA requirements. Education is 
allowing only states with the most advanced models that can measure 
progress toward NCLBA goals to use the models to determine AYP. Under 
the pilot project, which has clear goals and criteria and requires 
states to compare results from their growth models with status model 
results, Education is poised to gain valuable information on whether 
growth models are overstating progress or appropriately giving credit 
to fast-improving schools. 

While growth models may be defined as tracking the same students over 
time, GAO used a definition that also includes tracking the performance 
of schools and groups of students. In comments on our report, Education 
expressed concern that this definition may confuse readers because it 
is very broad and includes models that compare changes in scores or 
proficiency levels of schools or groups of students. GAO used this 
definition of growth to reflect the variety of approaches states are 
taking to measure academic progress. 

Mr. Chairman, this concludes my prepared statement. I would be happy to 
respond to any questions that you or members of the committee may have. 

GAO Contact and Staff Acknowledgments: 

For more information on this testimony, please call Marnie S. Shaul at 
(202) 512-7215. Individuals making key contributions to this testimony 
include Blake Ainsworth, Karen Febey, Harriet Ganson, Shannon Groff, 
Andrew Huddleston, Jason Palmer, and Rachael Valliere. 

FOOTNOTES 

[1] GAO, No Child Left Behind Act: States Face Challenges Measuring 
Academic Growth That Education's Initiatives May Help Address, GAO-06-
661 (Washington, D.C.: July 17, 2006). 

[2] Pub. L. No. 107-110 (Jan. 8, 2002). 

[3] The law also requires content standards to be developed for science 
beginning in the 2005-2006 school year and science tests to be 
implemented in the 2007-2008 school year. 

[4] States determine whether schools and school districts make AYP or 
not. For this report, we will discuss AYP determinations in the context 
of schools. 

[5] States were able to map out different paths to universal 
proficiency subject to certain limitations. For example, states must 
increase the targets at least once every 3 years and those increases 
must lead to 100 percent proficiency by 2014. See GAO, No Child Left 
Behind Act: Improvements Needed in Education's Process for Tracking 
States' Implementation of Key Provisions, GAO-04-734, (Washington, 
D.C.: Sept. 30, 2004). 

[6] Students with disabilities are generally included in these 
calculations. The state is allowed to give different tests to students 
with significant cognitive impairments and to count them differently 
for calculating points awarded to schools. 

[7] Tennessee's growth model mentioned here is not used to make AYP 
determinations under NCLBA. However, Tennessee developed a different 
growth model to determine AYP for Education's growth model pilot 
project. That model is discussed later in this testimony. 

[8] The state calculates the unique contribution of schools and 
teachers by using a multivariate, longitudinal statistical method where 
results are estimated using data specific for students within each 
classroom or school. 

[9] Tennessee continues to use its original model to rate schools based 
in part on the unique contributions--or the value added--of schools to 
student achievement. 

[10] While Tennessee's model estimates future performance, other models 
are able to measure growth without these projections. For example, 
Florida uses a model that calculates results for individual students by 
comparing performance in the current year with performance in prior 
years. 

[11] Another 11 schools also met the growth target, but these 11 
schools made AYP under NCLBA's safe harbor provision. 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site (www.gao.gov). Each weekday, GAO posts 
newly released reports, testimony, and correspondence on its Web site. 
To have GAO e-mail you a list of newly posted products every afternoon, 
go to www.gao.gov and select "Subscribe to Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 441 G Street NW, Room LM 
Washington, D.C. 20548: 

To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202) 
512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Gloria Jarmon, Managing Director, JarmonG@gao.gov (202) 512-4400 U.S. 
Government Accountability Office, 441 G Street NW, Room 7125 
Washington, D.C. 20548: 

Public Affairs: 

Paul Anderson, Managing Director, AndersonP1@gao.gov (202) 512-4800 
U.S. Government Accountability Office, 441 G Street NW, Room 7149 
Washington, D.C. 20548: