Programs that Work
Children Succeeding in School
Students performing at grade level or meeting state curriculum standards
Students graduating from high school
Age of Child
Early Childhood (0-8)
Middle Childhood (9-12)
Type of Outcome Addressed
Cognitive Development / School Performance
Evidence Level: Promising
Direct Instruction (DI) is a highly structured approach to instruction designed to accelerate the learning of at-risk students. Originally developed as the DISTAR (Direct Instruction System for Teaching Arithmetic and Reading) program and evaluated in Project Follow Through, DI is based on the theory that learning is maximized when instructional presentations are clear, likely misinterpretations are eliminated, and generalizations are facilitated. Classroom teachers learn how to define tasks clearly, build toward more-complex concepts, use interactive lessons with large and small groups, use frequent praise for correct responses, and recognize and correct errors immediately. To maximize time spent on tasks, students are placed in instructional groups based on similar performance, and grouping may take place across classes and grades. DI can be used as a school-wide program, or the reading/language arts and math portions of the program can be implemented separately. Individual DI programs are currently used in more than 10,000 schools throughout the world, and more than 500 schools use DI on a school-wide basis.
Students in preschool through grade 12, with a focus on kindergarten through grade six (K-6)
Nearly 100 studies of the Direct Instruction program have been undertaken since the late 1960s, although the majority of the evaluations of the program did not utilize methodologies that were rigorous enough to meet Promising Practices Network (PPN) criteria for inclusion in this review. A total of 20 evaluations, representing 17 different study populations, were identified that met PPN criteria for methodological rigor, and characteristics of those studies are summarized below. Student outcomes were evaluated using a wide range of measures assessing skills in vocabulary, reading, language, mathematics, and general cognitive abilities. These instruments included validated measures such as the Wide Range Achievement Test, the Stanford Achievement Test, the Peabody Individual Achievement Test, the Illinois Test of Psycholinguistic Abilities, the Comprehensive Test of Basic Skills, and the Stanford-Binet Intelligence Scale. Other outcome measures included high school graduation rates, grade retention rates, highest grade level achieved, household income and employment, school misconduct, arrest records, and self-reported juvenile delinquency.
The 17 studies of the Direct Instruction program reviewed here employed rigorous study designs. Eight of the studies used a quasi-experimental design, while the other nine were randomized trials in which program participants were randomly assigned to receive the DI program or a different program. When creating the groups of students to be compared, the researchers attempted to group together students in the DI program and the comparison program who were similar in terms of academic ability. Several of the studies evaluated long-term effects following participation in the DI program, ranging from one to 20 years. The largest of the studies reviewed here included 245 students who participated in the DI program.
DI was evaluated as implemented with students over a wide range of grade levels, from preschool through grade 6. Some studies included only students in regular classrooms, while other studies focused on students in special-education programs. Students from different racial and ethnic groups were included in the studies, as were students from different household income levels. The studies were conducted throughout the country, in both urban and rural settings. Specifically, studies were conducted in Alaska, California, Illinois, Michigan, Mississippi, New York, and Washington State. Also included in this review is a study conducted in Great Britain.
In many studies, DI was compared with other educational programs, including the regular classroom curriculum and alternative educational programs. Examples of the alternative programs include Phonetic Keys to Reading (PKR) (Williamson, 1970); Johnny Right-to-Read (Summerell and Brannigan, 1977); Integrated Skills Method (Richardson et al., 1978); Palo Alto Reading Program (Stein and Goldman, 1980); Peabody Language Development Kit (Mosley, 1980); Ginn Language Development Program (Mosley and Plue, 1980); Open Court Language Development Program (Mosley and Plue, 1980); Action Reading (Rawl and O'Tuel, 1982); English Colour Code (Lewis, 1982); Mediated Learning (Dale and Cole, 1988; Cole, Mills, and Dale, 1989; Cole, Dale, and Mills, 1991; Cole et al., 1993; Mills et al., 1995; Mills et al., 2002); Harcourt Brace Jovanovich (HBJ) Basal Reading Program (Sexton, 1989); Integrated Reading-Writing program (Traweek and Berninger, 1997); and High/Scope (Schweinhart and Weikart, 1997).
Key Evaluation Findings
Studies Finding Positive Effects of Direct Instruction
Four studies reported statistically significant, positive findings for DI in contrast with a comparison group. Outcomes included reading, language ability, auditory language comprehension, math, and high school graduation rate.
Stein and Goldman's (1980) study of 53 students ages 6 through 8 reported that
- a statistically significant difference favoring the DI students over the Palo Alto Reading Program students was found for reading/recognition scores.
Meyer's (1984) long-term follow-up of students in the Direct Instruction Project Follow Through reported that
- significantly more students in the DI cohorts than in the control cohorts graduated from high school, with average graduation rates of 60 percent and 38 percent, respectively.
- results on the California Achievement Test (CAT) reading and math scores indicated that the average of all DI-group students was significantly higher than the average of the control-group students.
Sexton's (1989) study of first-graders comparing DI with the Harcourt Brace Jovanovich (HBJ) basal reading program reported that
- students in the DI group significantly outscored students in the HBJ group on a measure of language ability.
Benner et al.'s (2002) study of kindergarten children reported that
- statistically significant differences favoring the DI group were found for all four of the Test of Auditory Comprehension of Language subtests, with effect sizes ranging from small to moderate.
Eight studies reported mixed findings for the effects of DI, with DI students outscoring comparison students on some measures but showing no significant differences from comparison students on others. In three of the studies, students in comparison programs (the Peabody Language Development Kit and Action Reading) and/or control groups significantly outscored DI students on some study measures.
Williamson’s (1970) study comparing DI students with Phonetic Keys to Reading and control group students reported that
- on three types of achievement tests, DI students scored better than students in the PKR and control groups. The evidence of improvement was strongest for DI students with low initial reading ability and mixed for students of moderate initial reading ability.
Summerell and Brannigan's (1977) study comparing DI with the Johnny Right-to-Read program reported that
- student gains from pre-test to post-test on the Paragraph Meaning subtest of the Stanford Achievement Test (SAT) were significantly greater for the DI group than for the Johnny-Right-to-Read group.
- no significant differences were found between the groups on the Word Meaning subtest of the SAT.
Mosley and Plue's (1980) study of disadvantaged preschool children reported that
- analysis of student gains on the Illinois Test of Psycholinguistic Abilities between pre-test and post-test indicated that DI students scored significantly higher than the control group (regular reading curriculum with no prescribed manual) and Open Court students.
- no significant differences were found between the DI and Ginn Language Development Program groups.
- students in the Peabody Language Development Kit group (an alternative curriculum) demonstrated significantly higher gain scores than the DI students.
Rawl and O'Tuel's (1982) study of kindergarten students comparing DI with the Action Reading program reported that
- there were no significant differences between groups for Comprehensive Test of Basic Skills (CTBS) Alphabet skills.
- both the Action-Reading and the control group scored significantly higher than the DI group on the CTBS Visual Auditory Discrimination subtest.
- results for the CTBS Language subtest indicated no significant differences between the DI students and the control group, but significantly higher scores for the DI students than the Action-Reading students.
- similarly, for scores on the CTBS Total Pre-reading subtest, DI students did not differ significantly from control students but scored significantly higher than Action-Reading students.
- the CTBS Mathematics subtest scores indicated that the control group scored significantly higher than the DI group, but there were no significant differences between the DI group and the Action-Reading group.
Lewis's (1982) study of remedial readers in a British comprehensive school reported that
- students in the DI group outperformed students in both the English Colour Code program and control group in two of eight assessments of reading—specifically, assessments of reading accuracy and comprehension.
- no significant differences among the DI group, the English Colour Code program group, and the control group were observed for the other six assessments.
Kameenui et al.'s (1986) three studies of mathematics instruction reported the following:
- Study 1: DI students significantly outscored control students on numerical-problem training scores and maintenance tests and on picture problems.
- Study 2:
- Middle-ability Grade 1 students: Students in the DI group scored significantly higher than students in the control group on both the post-test and the transfer test.
- High-ability Grade 2 students: DI students scored significantly higher than control students on the post-test, but not on the transfer test.
- Study 3: Statistically significant differences favoring the DI group were found for the training-probe measures. No significant differences were noted between groups on the transfer test or on the maintenance measure.
Dale and Cole's (1988) study of young children in special education reported that
- DI students significantly outscored control students on some tests (Test of Early Language Development (TELD) raw score and TELD Language Quotient), while control students performed better on other outcomes (McCarthy Verbal and Memory Scales and McCarthy General Cognitive Index).
Yu and Rachor's (2000) evaluation of DI in an urban public school system reported that
- fourth-grade students in the control group significantly outperformed DI-group students in gains in reading proficiency scores between pre-test and post-test.
- no significant differences in gains in reading proficiency scores were found between the DI and control fifth-grade students.
- a statistically significant difference was found in gains in reading proficiency scores for sixth-graders, with DI students improving by a larger amount than control students.
Four studies found no significant results for students participating in the DI program compared with other students in control groups. Those studies include
- Richardson et al.'s (1978) study of 72 second- through sixth-grade students from New York City
- two-year results by Cole, Mills, and Dale (1989)
- Mills et al.'s (1995) age-9 follow up to Cole et al. (1993)
- Traweek and Berninger's (1997) study comparing first-grade students in the Integrated Reading-Writing program and DI.
Cole, Dale, and Mills’ (1991) study of 107 children in a special-education program in Seattle found
- no significant differences between groups for seven of eight measures of reading and language skills.
- in the case of the Peabody Picture Vocabulary Test Standard Score, Mediated Learning students outscored DI students.
- there were no significant differences between groups on any of the seven language-skills measures.
The study by Mills et al. (2002), an age-15 follow-up to Cole et al. (1993), found
- no significant differences between groups for any of the self-reported delinquency acts, including personal violence, property damage, stealing, drug abuse, and status offenses.
Schweinhart and Weikart's (1997) High/Scope Preschool Curriculum Comparison study, which followed participants through age 23, found the following:
- At age 23, the DI group had experienced significantly more felony arrests per person than did the other two groups, significantly higher rates of felony arrests at age 22 and older, and significantly higher rates of arrest for property crime.
- At age 23, no significant differences were found between groups for
- highest year of schooling attained, high school graduation/GED rates, or on-time high-school graduation
- rates of past and present pregnancies
- employment over the past five years, current employment, monthly earnings from work, monthly income from all sources, months on welfare in the past ten years
- number of times suspended or expelled from high school
- lifetime arrests (juvenile and adult), misdemeanors, arrests for drug-related crimes, violent-crime arrests, felony arrests at age 17 through 21, and felony convictions and prison sentences.
- Between the ages of 5 and 15, virtually no differences were found in intellectual or academic performance among the three groups. The only exception was at age 5, when the DI group scored a significantly higher average IQ on the Stanford-Binet Intelligence Test than did the Nursery School group.
Public and private preschools and public K–12 schools
Schools implementing the DI program frequently use Title I monies, or apply for grant funds. DI is also used in many Reading First schools.
DI can be used as a school-wide comprehensive reform program, or specific programs (the reading, language arts, and math curricula) can be implemented individually. Although DI’s original focus was on reading, language, and math, the program has since been expanded to include social and physical sciences, fact-learning, and handwriting. DI provides highly scripted and interactive lessons for small, homogeneous groups of students. Groupings may vary across subjects, accounting for the fact that students may be weak in one subject and strong in another.
Instruction using the DI curriculum is fast-paced and involves frequent interaction between teachers and students. A placement test is used to assign students to the appropriate group, and the performance level of each group dictates the pace of instruction. Frequent assessment of student progress ensures the continued appropriateness of achievement-level grouping. Teachers monitor student performance every five to ten days, using methods such as calculating reading speed and reading-error ratios. These data, along with weekly grades, are used to regroup students according to performance level.
When implementing DI, teachers use "presentation books," which are lesson plans that include instructions for monitoring and assessing student progress and for providing immediate feedback to students. Curricular materials, daily lessons, and teachers' guides are available for grades K-6 in reading, language arts, spelling, and math; grades 4-6 in expressive writing; grades 3-6 in science; grades 3-12 in corrective reading; and grades 4-12 in corrective math.
The most significant DI organizational requirement for schools is the recommendation that all teachers of reading and English-language arts (and other subject areas, if desired) be scheduled to teach that subject at the same time during the school day. This practice allows for cross-class grouping of students of similar achievement levels. Schools are encouraged to have a peer coach (facilitator) to help teachers implement the program. Classroom paraprofessionals can also be integrated into the program, working as instructional aides, one-on-one tutors, and small-group leaders.
Implementation Support/Professional Development
Implementation support and professional development can be contracted from various providers (see Available Resources below). DI’s training program begins with a one-week, on-site, pre-implementation session for all staff. Implementation managers visit each school approximately four days per month for three years and help to model techniques, observe classrooms, address problems that teachers are having, and assist in the grouping of students. Weekly one-hour in-service sessions for teachers are also recommended, during which teachers may learn and practice DI techniques.
As of 2005, the first-year cost of adopting school-wide DI was approximately $245,000 for an average school of 500 students. Schools implementing DI incur costs for training and technical assistance, personnel, and materials. For an average school of 500 students and 20 to 25 teachers:
- Technical assistance costs $65,000 to $75,000 a year for three to five years. This includes direct costs for teacher training at the start of and during the school year but does not include the cost of faculty time devoted to training.
- Training time amounts to five days of pre-implementation at the start of the school year for the entire faculty, plus at least one hour per week (approximately 4.5 days per year) for each teacher.
- Curricular materials, purchased separately, cost approximately $210 per child in the first year, $165 per child in the second year, and $65 per child in subsequent years.
Issues to Consider
This program received a "promising" rating. Many of the evaluations of Direct Instruction were experimental in design, and those that were quasi-experimental used reasonably convincing comparison groups. The results varied considerably from study to study, with ten of the studies reporting mixed results, six reporting no significant differences between groups, and four reporting solely positive and significant differences for DI students. This lack of consistency among studies suggests that the overall evidence of DI effectiveness is limited. However, when results are considered across all studies, the majority of the evaluations reported at least some significant benefit accruing to students who participated in the Direct Instruction program. These benefits were found on standardized tests of general cognitive skills, reading, and mathematics, and on high school graduation rates.
A pattern to note is that more-recent studies of the DI program, conducted in the 1990s and later, were less likely to report significant, positive program effects for students participating in Direct Instruction. It is unclear whether this is an artifact of study methodology, more-effective comparison group programs, changes in student learning patterns, or something else.
Two studies assessed the Direct Instruction program in special-education settings, and neither reported significant findings favoring DI. Furthermore, on one measure in the study by Cole, Dale, and Mills (1991), Mediated Learning students significantly outscored DI students.
Additionally, two studies assessed the effects of DI on juvenile and young-adult crime and arrest rates, and DI was not shown to be effective at reducing or preventing juvenile delinquency.
The Direct Instruction program has been implemented in thousands of sites around the United States and in other countries. Some examples include Fairbanks, Alaska; Champaign and East St. Louis, Illinois; Flint and Ypsilanti, Michigan; Tupelo, Mississippi; Brooklyn, New York; Dayton, Ohio; Williamsburg County, South Carolina; Smithville, Tennessee; Seattle, Washington; Great Britain; and Canada.
Association for Direct Instruction
P.O. Box 10252
Eugene, OR 97440
Tel (800) 995-2464
Fax (541) 868-1397
National Institute for Direct Instruction
PO Box 11248
Eugene, OR 97440
Tel (877) 485-1973
Fax (541) 683-7543
There are several key contacts for schools or districts interested in implementing the DI program. The Association for Direct Instruction (http://www.adihome.org/) is a nonprofit organization dedicated to promoting and supporting the use of DI programs. Support includes several annual regional DI conferences, an annual national conference, online networking and assistance, and two semiannual publications.
The National Institute for Direct Instruction (NIFDI) is a not-for-profit corporation dedicated to providing school districts with a solid training program and approach for the implementation of DI in districts, schools, and classrooms (http://www.nifdi.org/).
DI is a commercially published program, and curriculum materials are published by Science Research Associates (SRA), a division of McGraw-Hill (http://www.sraonline.com/index.php/home/curriculumsolutions/di/9). Materials may be purchased by individual grade and subject, or in a package suitable for school-wide implementations.
J/P Associates, Inc. provides access to Direct Instruction materials and staff development resources. They have also developed supplemental programs to support, extend, and enhance the Direct Instruction program. Information is available at www.jponline.com, or by contacting Robert Harris, Executive Director of Programming at email@example.com.
Benner, Gregory J., Alexandra Trout, Philip D. Nordness, J. Ron Nelson, Michael H. Epstein, Maria-Louisa Knobel, Alice Epstein, Ken Maguire, and Rodney Birdsell,
"The Effects of the Language for Learning Program on the Receptive Language Skills of Kindergarten Children," Journal of Direct Instruction, Vol. 2, No. 2, 2002, pp. 67–74.
Cole, Kevin N., Paulette E. Mills, and Philip S. Dale, “A Comparison of the Effects of Academic and Cognitive Curricula for Young Handicapped Children One and Two Years Postprogram,” Topics in Early Childhood Special Education, Vol. 9, No. 3, 1989, pp. 110–127.
Cole, Kevin N., Philip S. Dale, and Paulette E. Mills, "Individual Differences in Language Delayed Children's Responses to Direct and Interactive Preschool Instruction," Topics in Early Childhood Special Education, Vol. 11, No. 1, 1991, pp. 99–124.
Cole, Kevin N., Philip S. Dale, Paulette E. Mills, and Joseph R. Jenkins, “Interaction Between Early Intervention Curricula and Student Characteristics,” Exceptional Children, Vol. 60, No. 1, 1993, pp. 17–28.
Dale, Philip S., and Kevin N. Cole, “Comparison of Academic and Cognitive Programs for Young Handicapped Children,” Exceptional Children, Vol. 54, No. 5, 1988, pp. 439–447.
Kameenui, Edward J., Douglas W. Carnine, Craig B. Darch, and Marcy Stein, "Two Approaches to the Development Phase of Mathematics Instruction," The Elementary School Journal, Vol. 86, No. 5, 1986, pp. 633–650.
Lewis, Ann L., "An Experimental Evaluation of a Direct Instruction Programme (Corrective Reading) with Remedial Readers in a Comprehensive School," Educational Psychology, Vol. 2, No. 2, 1982, pp. 121–135.
Meyer, Linda A., "Long-Term Academic Effects of the Direct Instruction Project Follow Through," The Elementary School Journal, Vol. 84, No. 4, 1984, pp. 380–394.
Mills, Paulette E., Kevin N. Cole, Joseph R. Jenkins, and Philip S. Dale, "Early Exposure to Direct Instruction and Subsequent Juvenile Delinquency: A Prospective Examination," Exceptional Children, Vol. 69, No. 1, 2002, pp. 85–96.
Mills, Paulette E., Philip S. Dale, Kevin N. Cole, and Joseph R. Jenkins, "Follow-Up of Children from Academic and Cognitive Preschool Curricula at Age 9," Exceptional Children, Vol. 61, No. 4, 1995, pp. 378–393.
Mosley, Beatrice B., and W. V. Plue, “A Comparative Study of Four Curriculum Programs for Disadvantaged Preschool Children,” Hattiesburg, Miss.: University of Southern Mississippi, 1980.
Rawl, Ruth K., and Frances S. O'Tuel, "A Comparison of Three Prereading Approaches for Kindergarten Students," Reading Improvement, Vol. 19, No. 3, 1982, pp. 205–211.
Richardson, Ellis, Barbara DiBenedetto, Adolph Christ, Mark Press, and Bertrand G. Winsberg, “An Assessment of Two Methods for Remediating Reading Deficiencies,” Reading Improvement, Vol. 15, 1978, pp. 82–94.
Schweinhart, Lawrence J., and David P. Weikart, "The High/Scope Preschool Curriculum Comparison Study Through Age 23," Early Childhood Research Quarterly, Vol. 12, 1997, pp. 117–143.
Sexton, C. Waynel, "Effectiveness of the DISTAR Reading I Program in Developing First Graders' Language Skills," Journal of Educational Research, Vol. 82, No. 5, 1989, pp. 289–293.
Stein, Claudia L’E., and Jacquelin Goldman, “Beginning Reading Instruction for Children with Minimal Brain Dysfunction,” Journal of Learning Disabilities, Vol. 13, No. 4, 1980, pp. 52–55.
Summerell, Sally, and Gary G. Brannigan, "Comparison of Reading Programs for Children with Low Levels of Reading Readiness," Perceptual and Motor Skills, Vol. 44, No. 3, 1977, pp. 743–746.
Traweek, Dean, and Virginia Wise Berninger, "Comparisons of Beginning Literacy Programs: Alternative Paths to the Same Learning Outcome," Learning Disability Quarterly, Vol. 20, No. 2, 1997, pp. 160–168.
Williamson, Florence, "DISTAR Reading--Research and Experiment," Urbana, Ill.: University of Illinois, 1970.
Yu, Lei, and Robert Rachor, "The Two-Year Evaluation of the Three-Year Direct Instruction Program in an Urban Public School System," paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, La., 2000.