Archived Information

Using Technology to Support Education Reform -- September 1993

Cost-Effectiveness Studies

One way in which research on educational technologies is growing in sophistication is by recasting the question within a cost-effectiveness framework. These studies seek to compare the relative cost of attaining a given level of improvement in student performance through various technologies. Such studies are appealing to policymakers because they deal with the ingredients and variables an administrator can manipulate (e.g., staff, facilities, instructional time). If carefully performed, such studies can be informative simply by making explicit all of the cost elements going into a particular instructional approach, some of which are often overlooked. For example, in an analysis of the economics of using CAI delivered through microcomputers, Levin (1989) points out that the ingredients needed may include (1) the time not only of teachers but also of teaching specialists, coordinators, and administrators; (2) physical space for the equipment and whatever security devices, air conditioning, or special wiring is needed; (3) not only the computers themselves but supporting equipment, such as printers, cooling fans, surge protectors, special furnishings, and paper for printers; (4) software and any associated instructional materials; and (5) miscellaneous costs, such as insurance, maintenance, and energy.
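The logic of this "ingredients" approach can be sketched as a simple calculation: sum the annualized cost of every ingredient, then divide by the measured improvement to obtain a cost-effectiveness ratio that can be compared across interventions. The sketch below is purely illustrative; all dollar figures, the effect size, and the enrollment are hypothetical placeholders, not data from Levin's studies.

```python
# Illustrative sketch of an ingredients-based cost-effectiveness
# calculation (in the spirit of Levin 1989). Every number here is a
# hypothetical placeholder, not data from the studies cited.

# Annualized cost of each ingredient category for a hypothetical CAI program.
ingredients = {
    "personnel (teachers, specialists, coordinators)": 30000.0,
    "facilities (space, wiring, air conditioning)": 5000.0,
    "equipment (computers, printers, furnishings)": 10000.0,
    "software and instructional materials": 4000.0,
    "miscellaneous (insurance, maintenance, energy)": 1000.0,
}

total_cost = sum(ingredients.values())  # 50,000 in this sketch

# Hypothetical measured effect: average test-score gain per student,
# expressed in standard-deviation units.
effect_size = 0.25
students_served = 100

# Cost per student per unit of effect -- the figure one would compare
# across alternatives (CAI, peer tutoring, smaller classes, ...).
cost_per_student = total_cost / students_served
ce_ratio = cost_per_student / effect_size

print(f"Total annual cost: ${total_cost:,.0f}")
print(f"Cost per student: ${cost_per_student:,.0f}")
print(f"Cost per 1-SD gain per student: ${ce_ratio:,.0f}")
```

Comparing this ratio across alternatives, rather than raw costs or raw effects alone, is what permits conclusions such as Levin's finding that peer tutoring delivered more achievement per dollar than CAI.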

Technology is widely believed to make instruction more efficient and hence more cost-effective. In part, this belief is probably an extrapolation from observed effects of technology on productivity in the workplace. There is empirical support for the position as well. A widely cited study by the Institute for Defense Analyses compared the cost per unit of achievement of computer-based instruction with that of conventional stand-up training in military training programs and found that CAI was, on average, 30 percent less expensive (Fletcher & Orlansky 1986). These cost savings stemmed from faster learning, with associated reductions in personnel and travel costs.

Levin (1989) analyzed cost and effectiveness data for eight mathematics and reading CAI programs implemented in the 1980s. To achieve a given increment in student test scores, CAI proved more cost-effective than reducing class size, extending the length of the school day, or using adult tutors, but considerably less so than peer tutoring. Levin noted that both effectiveness and costs varied markedly from site to site, even though the CAI under study was fairly structured, standardized drill and practice. He concluded that results depend heavily on how the CAI is implemented at a particular site, and that CAI could be much more cost-effective if it were not underutilized.

Delineating all cost elements and placing a value on them can help dispel misconceptions about the relative price of different instructional technologies. For instance, it is widely argued that because the cost of computers is dropping so rapidly, current cost comparisons that favor conventional approaches will soon be totally irrelevant. Levin (1989) points out that computer hardware and peripheral costs are too small a proportion of the total cost of CAI implementations for this to be true. Even if all hardware costs were reduced to zero, the total cost of the average intervention studied by Levin (i.e., fairly straightforward drill-and-practice CAI programs) would be reduced by only 11 percent.

Cost comparisons between distance-learning and face-to-face training within the corporate and military training sectors nearly always favor the former (Moore 1989). These findings should not be extrapolated to K-12 education without careful examination. The primary reason for the cost advantage of these technologies in corporate and military training is the large savings in travel expenses and travel time for personnel who must be trained (expense categories with less relevance for K-12 education). A national survey on the cost-effectiveness of distance-learning in schools found that the cost per student was lower with distance-learning than with a live teacher in only 15 of 34 classes (Ellertson, Wydra & Jolley 1987). Thus, distance-learning is not necessarily more cost-effective in a school setting (but it may be the only viable alternative if qualified instructors are not available locally).

A problem with these cost-effectiveness studies is that they depend on the kind of comparative study described above for their index of program efficacy. The measures of effectiveness used in the primary studies may capture the objectives of some programs better than others. For example, Slavin (1991) reanalyzed data from evaluations of the Writing to Read program and concluded that other kindergarten reading programs, costing on the order of 1 percent of the price of Writing to Read, are equally effective. However, Slavin used scores on standardized tests of reading achievement as the measure of effectiveness. The test items were more closely matched to the contents of the competing reading programs than to the Writing to Read activities. Slavin dismisses effects on expressive writing skills (which would be similar to Writing to Read activities but not to most of the competing programs) as irrelevant to the decision to adopt such a program. The measures used to estimate effect size need to be scrutinized; policymakers may or may not share the analyst's viewpoint concerning the outcomes that are relevant in comparing program efficacy.

More fundamentally, many in the education reform movement object to cost-effectiveness studies because these studies perpetuate the view that the reason for using technology is to do the same things faster. Those who regard technology as a tool for education reform--who see it as contributing to the adoption of a higher set of expectations for students, to more emphasis on complex tasks and collaborative learning, and to a change in the roles of students and teachers--contend that an analysis showing that computers can teach lower-level skills faster than worksheets can simply misses the point.



This page was last updated December 27, 2001 (jca)