By Regina Waugh and Jennifer Slivka
This is the fifth module of NDTAC's Self-Study Toolkit. The Self-Study Toolkit is designed to help facilities measure how they are doing and determine what they can do to improve. All modules in the toolkit contain background information on the issue, a printable data collection table, and a list of additional resources. Links to modules on other topics, as well as links to other NDTAC toolkits, can be found on NDTAC's Toolkits Web page.
This module is divided into three parts:
Part I. Introduction
Part II. Data Collection: How Am I Doing?
Part III. Resources
Part I outlines why collecting and using data is important and provides tips on collecting quality data. Part II is designed to help you determine how your facility is doing in terms of collecting quality data that is accurate and complete and offers suggestions on how to improve collection practices. Part III provides additional resources on data collection, including further reading, step-by-step guides, and information on data systems currently in place throughout the country.
It's the Law
In 2001, President Bush’s No Child Left Behind (NCLB) education plan proclaimed that the Nation’s education programs would be held accountable for student outcomes across multiple domains, including areas that have largely been overlooked in the past. For those programs funded by Title I, Part D, the emphasis on data and accountability has manifested itself in the requirement that programs be evaluated at least once every 3 years. The goal of the evaluation, according to Title I, Part D, Subpart 3, Section 1431 (A), is to “determine Part D’s impact on the ability of participants to:
- Maintain and improve educational achievement;
- Accrue school credits that meet State requirements for grade promotion and secondary school graduation;
- Make the transition to a regular program or the education program operated by an LEA; and
- Complete secondary school (or secondary school equivalency requirements) and obtain employment after leaving the institution.”
Meeting the evaluation and reporting requirements of Title I, Part D necessitates effectively collecting and using student data.
It can help your students succeed
Fortunately, collecting the student demographic and achievement data required by State and Federal law can help improve the quality of the programming offered at your facility. Monitoring student achievement data provides States and facilities with greater information from which to make decisions about institutional priorities and efficiently allocate resources. At the classroom level, collecting and monitoring data allows teachers to track individual student progress, catch small problems before they become big problems, and effectively tailor instruction to individual student needs.
Data-driven decision making, that is, the process of using data to make instructional, managerial, and operational choices, is garnering increasing attention in the field of education. At its most basic level, data-driven decision making can be reduced to a few basic stages:
- Setting goals
- Planning implementation
- Monitoring progress
- Analyzing results
Ultimately, teachers and administrators are better informed by going through this process and can use the results to improve instructional and administrative practices.
Teachers and administrators already use data to make decisions on a regular basis, usually without even thinking about it. Nevertheless, formalizing the decision making process can seem overwhelming. A good place to start is by working with your staff to answer the following questions:
- Where are we now?
- Where do we want to go?
- What does the research say?
- What have we learned from our own experience?
Conducting a thorough assessment of students at entry, and reviewing records from previous educational placements, can form the baseline for measuring achievement over the course of their stay in the facility. At an individual level, this can help answer the question of “where are we now?” and form the front end of a student’s transition plan, providing a record of baseline test scores and documentation of any learning, mental, or emotional disabilities. In addition, a comprehensive assessment at entry, including a student interview, can help students, teachers, and administrators think about short- and long-term goals for each student (“where do we want to go?”).
At the aggregate facility or State level, reviewing student achievement data can help you get a handle on how your students are performing while enrolled in your education program. Tracking the average gains students make in reading and math, the average number of credits earned, and the number of students who transition from your program into community schools or job placements each year will not only allow you to accurately report the required data to State and Federal departments of education; it will also enable you to identify program strengths and weaknesses in meeting student needs and to set realistic goals for future student achievement.
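The aggregate measures described above amount to simple descriptive statistics computed over student records. As a minimal sketch, assuming a hypothetical record layout (the field names and values below are illustrative, not drawn from any particular data system):

```python
# Illustrative student records; field names are hypothetical.
records = [
    {"reading_entry": 5.2, "reading_exit": 6.1, "credits_earned": 3, "transitioned": True},
    {"reading_entry": 4.8, "reading_exit": 5.0, "credits_earned": 2, "transitioned": False},
    {"reading_entry": 6.0, "reading_exit": 7.4, "credits_earned": 4, "transitioned": True},
]

n = len(records)

# Average gain in reading (exit score minus entry score).
avg_reading_gain = sum(r["reading_exit"] - r["reading_entry"] for r in records) / n

# Average number of credits earned while in the program.
avg_credits = sum(r["credits_earned"] for r in records) / n

# Share of students who transitioned to community schools or jobs.
transition_rate = sum(r["transitioned"] for r in records) / n

print(f"Average reading gain: {avg_reading_gain:.2f} grade levels")
print(f"Average credits earned: {avg_credits:.1f}")
print(f"Transition rate: {transition_rate:.0%}")
```

A real facility would pull these records from its student data system rather than hard-coding them, but the calculations for reporting and goal-setting are this simple at heart.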
However, for the data you collect to be useful, it must be an accurate reflection of what is actually happening within your facility. “It is important to understand that ‘quality data’ is not something that just occurs when an office clerk hits the right number on a keyboard. It is a process. We need to pay attention to the process involved because the information derived from school data is vital.”
The collection of quality data does not occur by accident, nor does it occur in a vacuum. Your facility staff must be aware of the importance of collecting and using high quality data. The following four areas are of primary importance:
- Accuracy —The information must be correct and complete. Data entry procedures must be reliable across the facility to ensure that the data collected will be the same regardless of who completes the recording forms.
- Security —Data must be kept safe, and the confidentiality of students and staff must be protected.
- Utility —The data have to provide the information necessary to inform the decisions you need to make.
- Timeliness —Deadlines must be understood facilitywide, and data must be entered in a timely manner.
The four factors detailed above are important in any educational environment. Timeliness, however, is especially important in an institutional environment where the population is constantly changing. Data must be updated systematically and frequently to ensure that records are current when a student leaves a facility. You can increase a student’s chances of successful transition into his or her community schools by regularly updating student records and establishing a formal procedure for transferring them to the student’s next placement.
You can see the achievement of the students in your facility on a daily basis. Achievement data can help you to communicate your students’ improvement to people outside the facility and are a powerful tool when advocating for funding for your education program.
Consider this scenario:
Students leaving your juvenile justice institution experience higher school achievement, higher rates of school completion, lower rates of school dropout, and lower rates of juvenile arrest for both violent and nonviolent charges than students leaving other institutions in your State. Some of your grant funding is ending this year, and you need to demonstrate that your program’s benefits to society exceed its operating costs. A potential funder asks if you can put a dollar figure on program benefits after 3 years.
Luckily, you have been keeping careful track not only of where your money has been spent, but also of the specific changes that your students have made while enrolled in your education program. You conducted a thorough assessment of each student at entry and thus had baseline reading and math achievement scores; students were tested for learning disabilities and where appropriate an Individual Education Plan (IEP) was developed. In addition, you requested, received, and reviewed their records from previous educational placements which allowed you to compare the number of school credits they had at entry and at exit from your program. You have also been keeping track of the number of students who completed their high school diploma or GED both within your facility and in the six months following their release into the community. Putting together all of this information with the records of how your program has allocated funds allows you to provide an accurate account of the dollar value of the educational programming provided in your facility.
Cost-benefit analysis can be an effective approach to determining whether a particular program is worthwhile. In the juvenile justice setting, cost-benefit analysis involves adding up the value of the benefits of a course of action (e.g., a new vocational education program will provide job training and skills to participants), then subtracting the costs associated with that action over a specified period of time (e.g., buying tools for hands-on activities and paying instructor salaries). While benefits tend to accrue over time, costs can be either one time (constructing an additional building) or ongoing (gas costs for heating the new building). This technique can also be used to estimate the dollar value of savings to society created by a program that reduces recidivism.
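The arithmetic behind such an analysis is straightforward: sum benefits over the evaluation period, then subtract one-time and ongoing costs. A minimal sketch, using purely illustrative dollar figures (none of these numbers come from the toolkit or any study):

```python
# Evaluation horizon, matching the 3-year figure in the scenario above.
years = 3

# All amounts below are hypothetical placeholders.
one_time_costs = 50_000          # e.g., constructing an additional building
ongoing_costs_per_year = 20_000  # e.g., instructor salaries, heating
benefits_per_year = 45_000       # e.g., corrections costs avoided each year

# One-time costs are counted once; ongoing costs and benefits accrue yearly.
total_costs = one_time_costs + ongoing_costs_per_year * years
total_benefits = benefits_per_year * years
net_benefit = total_benefits - total_costs

print(f"Net benefit over {years} years: ${net_benefit:,}")
```

A fuller model would discount future dollars to present value and add estimates for opportunity costs and intangibles, but the basic structure stays the same.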
Simple cost-benefit analyses may take into account only financial costs and benefits. A more sophisticated model would put dollar values on the opportunity cost of a course of action (i.e., the value of the most valuable forgone alternative), as well as on intangible costs and benefits, such as public safety and life-course gains. For example, a study of approximately 3,000 ex-offenders 3 years after release found a 29 percent drop in recidivism for educational program participants. The study estimated that for every dollar spent on correctional education, two dollars in corrections costs are saved, before even counting police and court costs.
Collecting achievement data and conducting a cost-benefit analysis on the educational programming offered in the juvenile justice system provides a rare opportunity to demonstrate the potential for positive outcomes for juvenile justice students. We are already bombarded by data regarding youth crime and violence. Despite the fact that “there is virtually no change between the percentage of crimes committed by kids today, as compared to thirty years ago,” national concern with youth violence and the call for stricter punishments for juvenile offenders have resulted in a higher proportion of young people behind bars than ever before. Between 1985 and 1994, the arrest rate for juveniles increased by 67.2 percent. Harnessing data on the educational achievements of incarcerated youth not only provides an opportunity for continuous improvement, but also can be a powerful tool for communicating the value of quality educational programming.
 Title I, Part D, Subpart 3, Section 1431 (A) of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 6301 et seq.).
 Serim, F. (2004). Using data-driven decision-making to guide student learning. Washington, DC: Consortium for School Networking.
 Kinder, A. (2002, Summer). D3M: Helping schools distill data. NCREL’s Learning Point.
 North Central Regional Educational Laboratory. (2002). Summary of the steering committee meeting for the National Panel on Closing Achievement Gaps. Retrieved August 15, 2005, from http://www.ncrel.org/gap/summary.htm
 National Forum on Education Statistics. (2004). Forum guide to building a culture of quality data: A school & district resource (NFES 2005-801). Washington, DC: U.S. Department of Education, National Center for Education Statistics.
 Steurer, S., Smith, L., & Tracy, A. (2001). Three state recidivism study. Lanham, MD: Correctional Education Association.
 Chambliss, W. J. (1998). The myth of teenage “super-predators”. Washington, DC: The Justice Policy Institute.
 Federal Bureau of Investigation. (1995). Uniform crime reports for 1994. Washington, DC: Federal Bureau of Investigation.
Published August 2005