Texas Consolidated State Performance Report

Published by the Department of Education, Office of Inspector General on 2006-03-21.


                         UNITED STATES DEPARTMENT OF EDUCATION
                               OFFICE OF INSPECTOR GENERAL
                                  1999 BRYAN STREET, HARWOOD CENTER, SUITE 1440 

                                                DALLAS, TEXAS 75201-6817 

                                                   PHONE: (214) 661-9530 

                               AUDIT FAX: (214) 661-9531 INVESTIGATION FAX: (214) 661-9589




                                                     March 21, 2006

                                                                                           Control Number
                                                                                           ED-OIG/A06F0020

Dr. Shirley J. Neeley
Commissioner of Education
Texas Education Agency
1701 North Congress Avenue
Austin, TX 78701

Dear Dr. Neeley:

This Final Audit Report, entitled Data Quality Review of the Texas Consolidated State
Performance Report, presents the results of our audit. The purpose of the audit was to determine
whether the Texas Education Agency’s required reporting of dropout and graduation rates in the
2003-2004 Consolidated State Performance Report was supported by reliable data and met the
requirements of the Elementary and Secondary Education Act (ESEA). Our review covered the
reporting period of July 1, 2003 – June 30, 2004.



                                                 BACKGROUND 



Sections 9302 and 9303 of the ESEA, as amended by the No Child Left Behind Act of 2001
(NCLB), provide States the option of applying for and reporting on multiple ESEA programs
through a single consolidated application and report.

The Consolidated State Performance Report includes the following ESEA programs:

   •     Title I, Part A, Part B, Subpart 3, Part C, Part D, and Part F
   •     Title II, Part A and Part D
   •     Title III, Part A
   •     Title IV, Part A, Subparts 1 & 2 and Part B
   •     Title V, Part A
   •     Title VI, Section 6111 and Part B

The NCLB Consolidated State Performance Report (CSPR) consists of two information
collections. Part I of the CSPR must be submitted in January and provides information from the
prior school year related to the five ESEA Goals. Part II of the CSPR, due to the Department by
April 15, consists of information related to State activities and the outcomes of specific ESEA
programs. The five ESEA Goals established in the June 2002 Consolidated State Application
were as follows:

   • 	 Performance Goal 1: By 2013-2014, all students will reach high standards, at a
       minimum attaining proficiency or better in reading/language arts and mathematics.
   • 	 Performance Goal 2: All limited English proficient students will become proficient in
       English and reach high academic standards, at a minimum attaining proficiency or better
       in reading/language arts and mathematics.
   • 	 Performance Goal 3: By 2005-2006, all students will be taught by highly qualified
       teachers.
   • 	 Performance Goal 4: All students will be educated in learning environments that are
       safe, drug free, and conducive to learning.
   • 	 Performance Goal 5: All students will graduate from high school.

Texas has a comprehensive data collection system, the Public Education Information
Management System (PEIMS), to gather information on its students. The first PEIMS data collection took
place in the fall of 1987. Districts were responsible for reporting organizational, financial, and
staff information. The following year, dropout records became the first individual student data
records submitted through PEIMS. A Person Identification Database system was implemented
shortly thereafter, enabling records for an individual to be linked across collections by matching
identification information. With student-level data and a system for linking student records,
Texas Education Agency (TEA) could produce automated aggregations of campus-, district-, and
state-level information.

In the early 1990s, districts began submitting student-level enrollment and graduation records.
This information, combined with the dropout record, enabled TEA to analyze different statuses
attained by students on an annual basis. It also became possible to consider tracking student
progress across multiple years. According to the TEA Department of Accountability and Data
Quality, as PEIMS continued to evolve, refinements in data collection, processing, and reporting
helped meet the growing demand for reliable information about public education. By the late
1990s, districts provided information on all students who left the district, not just students who
dropped out or graduated. With this information, TEA had the means to calculate a longitudinal
graduation rate, or the rate of graduation for students as they progressed through Grades 9-12 of
high school. The group of students who progress through Grades 9-12 is called a cohort.

The following terms are used when discussing the cohort.

   • 	 Graduate – An individual who has completed high school and has received formal
       recognition from school authorities.
   • 	 Leaver – The status of a student who was enrolled or in attendance during a school year.
       Each fall, returning students are reported on enrollment records and students who left
       during the previous year or did not return are reported on "leaver records" describing the
       circumstances of a student’s departure. School leavers are categorized as dropouts, or
       students who withdraw to: (a) enroll in other public or private schools in the state; (b)
       enroll in schools outside the state; (c) enroll in colleges or GED preparation programs; or
       (d) enter home schooling.
   • 	 Cohort - Students who started high school (i.e., ninth grade) plus student transfers in,
       less student transfers out in year Y; plus student transfers in, less student transfers out in
       year Y+1; plus student transfers in, less student transfers out in year Y+2; plus student
       transfers in, less student transfers out in year Y+3.
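The cohort arithmetic above can be sketched as a running tally. This is a hypothetical illustration, not TEA's actual computation; the function name and all figures are our own:

```python
# Cohort per the definition above: students who started ninth grade,
# plus transfers in and less transfers out for each of the four
# high-school years (Y through Y+3). All figures are hypothetical.
def cohort_size(ninth_graders, transfers_in, transfers_out):
    """transfers_in and transfers_out hold per-year counts for years Y..Y+3."""
    size = ninth_graders
    for t_in, t_out in zip(transfers_in, transfers_out):
        size += t_in - t_out
    return size

# Example: 500 entering ninth graders with four years of transfer activity.
print(cohort_size(500, [20, 15, 10, 5], [30, 25, 20, 10]))  # -> 465
```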

For our review, we selected the three largest school districts in Texas: Dallas Independent
School District (DISD), Fort Worth Independent School District (FWISD), and Houston
Independent School District (HISD). We visited the three largest high schools within those
districts.



                                      AUDIT RESULTS 



TEA met the requirements of ESEA by reporting dropout and graduation rates. We determined
that for the nine schools reviewed, TEA collected generally reliable data to support the dropout
rates reported in the 2003-2004 CSPR. However, data used to compute graduation rates were
not as reliable. In its comments to the draft report, TEA concurred with our recommendation.
The full text of TEA’s comments on the draft report is included as an Attachment to the report.


FINDING – Graduation Rates Were Not Always Reliable

Although TEA data for dropout rates were generally reliable, graduation rates reported in the
2003-2004 CSPR were not always verifiably reliable. Specifically, we determined that the data in
the first three years of the cohort leavers reviewed were not sufficiently accurate to be reliable.
However, the data for the fourth year of the cohort leavers were sufficiently accurate and
reliable; we determined the low rate of deficient support in that year, as discussed below, to be
within acceptable levels and determined those data were generally reliable.

The annual dropout rate is calculated by dividing the number of grade 9-12 dropouts during the
school year by the total number of students served in those grades during the school year. To
review data factored into the annual dropout rate calculation, we selected a sample of 310
students from the 9-12 grade leavers for the reporting year. These leavers included transfers
between Texas districts, transfers out of Texas, and dropouts, among others. Of the leaver
records reviewed, 19 (6 percent) had inadequate or no supporting documentation. Without
sufficient supporting documentation, we could not determine the accuracy of the leaver codes.
However, we found the 6 percent error rate to be within acceptable levels and determined the
data were verifiably reliable.
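As a sketch, the annual dropout rate formula and the sample's error rate work out as follows. The district counts in the first example are hypothetical; the 19-of-310 figure is from the review:

```python
# Annual dropout rate: grade 9-12 dropouts during the school year,
# divided by the total students served in those grades that year.
def annual_dropout_rate(dropouts, students_served):
    return dropouts / students_served

# Hypothetical district: 400 dropouts among 10,000 students served.
print(f"{annual_dropout_rate(400, 10_000):.1%}")  # -> 4.0%

# Error rate in the leaver sample: 19 of 310 records lacked support.
print(f"{19 / 310:.0%}")  # -> 6%
```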

Graduation rates are calculated by dividing the number of graduates in the reporting year by the
cohort for that year. To review data used in the graduation rate calculation, we selected two
samples. The first sample was of the cohort graduates in school year 2002-2003 and the second
sample was of leavers over the four-year cohort period, school years 1999-2000 through 2002-
2003.


   • 	 The graduate sample data populate the numerator of the cohort graduation rate formula.
       All of the 364 graduate records reviewed were fully supported. We determined the data
       were verifiably reliable.
   • 	 The cohort leavers’ sample data populate the denominator of the cohort graduation rate
       formula. Of the 132 cohort leaver records reviewed, 41 (31 percent) had inadequate or
       no supporting documentation. Without sufficient supporting documentation we could not
       determine whether the leaver codes were accurate. Additional analysis showed that 90
       percent of the supporting document deficiencies occurred in the first three years of the
       cohort with only 10 percent occurring in the fourth year. The reduced supporting
       document deficiencies were the result of TEA and district implementation of improved
        leaver data procedures during 2000-2002. We determined the data in the first three years
        of the cohort were not sufficiently accurate to produce reliable data. However, the data
        for the fourth year of the cohort leavers were sufficiently accurate and verifiably reliable.
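The graduation rate formula and the sample deficiency rate above can be sketched as follows. The graduate and cohort counts in the first example are hypothetical; the 41-of-132 figure is from the review:

```python
# Cohort graduation rate: graduates in the reporting year divided by
# the cohort for that year (numerator and denominator described above).
def graduation_rate(graduates, cohort):
    return graduates / cohort

# Hypothetical school: 420 graduates out of a cohort of 500.
print(f"{graduation_rate(420, 500):.0%}")  # -> 84%

# Deficiency rate in the cohort-leaver sample: 41 of 132 records
# had inadequate or no supporting documentation.
print(f"{41 / 132:.0%}")  # -> 31%
```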

These deficiencies occurred because of inadequate guidance, training, and monitoring. TEA and
the districts we reviewed identified leaver data deficiencies. They responded with increased
oversight and implemented improved procedures for leaver data management through additional
training, more detailed guidance and procedural manuals, and increased monitoring. Examples
of increased oversight and improved procedures were:
    • 	 District-provided training that specifically addressed leaver data management issues, such
         as registrar training each fall for the coming school year.
    • 	 TEA updates of PEIMS Standards and the Student Attendance Accounting Handbook, in
        turn, generated updates in the corresponding district guides and procedure manuals.
    • 	 District- and campus-level data audits and monitoring of data quality through periodic
        reviews of data input, system reports, and comparisons to source documents.

Each of the three districts we reviewed implemented improved procedures in 2000-2001 and/or
2001-2002. Improved procedures resulted in better supporting documentation and increased data
accuracy and reliability in 2002-2003. Maintaining continual leaver data process improvement
should further increase the accuracy of data in subsequent years.

The No Child Left Behind Act of 2001, Public Law 107-110, enacted January 8, 2002, places
emphasis on and strengthens accountability for results. It also increases the importance of the
Department having reliable and valid data. Unreliable data cause graduation and dropout rates
to be inaccurate. It is important that the data are reliable because the graduation and dropout
rates can be used by the Department, the State, and the public in comparisons with other states’
performances. The information can also be used to assess school, district, and State
accountability.


Recommendation

We recommend that the Assistant Secretary for Elementary and Secondary Education require
TEA to maintain continual leaver data process improvements by increasing oversight of
reporting and reliability, and developing and implementing improved procedures and monitoring.

TEA concurred with our recommendation.



                  OBJECTIVES, SCOPE, AND METHODOLOGY 



The objective of our audit was to determine whether TEA’s required reporting of dropout and
graduation rates in the 2003-2004 CSPR was supported by reliable data and met the
requirements of the ESEA. Specifically, we determined whether the—

   • 	 Data for graduates were accurate and documented;
   • 	 Leaver data in the cohort were accurate and documented; and
   • 	 Leaver data in the reporting year for dropouts were accurate and documented.

To accomplish our objective, we—

   •	   Reviewed written policies and procedures for monitoring school supplied data;
   •	   Reviewed applicable laws, regulations, and other guidance;
   •	   Interviewed officials at TEA and selected school districts; and
   •	   Reviewed student files at the three largest high schools in the selected school districts.

We judgmentally selected DISD, FWISD, and HISD for review because they were the largest
districts in the State. For testing purposes, we had TEA extract from PEIMS a database of
freshman students in the 2000 academic year class from each of the selected high schools that
constituted the cohort group, plus additions and deletions to the cohort. From this extract, we
created a database of graduates and another database of leavers from the cohort.

   • 	 From the graduates, we drew a random 10 percent sample with a maximum of 50 and a
       minimum of 10 students for review at each high school from a universe of 4,345 students.
   • 	 From the cohort leavers, we drew a 10 percent random sample of students who left the
        cohort at each high school (with the same minimum and maximum as above), from a
        universe of 1,220 students, to ensure they were properly classified.

We also obtained another extract of students who left school during the 2003-2004 academic
reporting year. From this extract, we drew a 10 percent random sample, with a maximum of 50
and a minimum of 10 students at each high school, from a universe of 3,161 students, to ensure
they were properly classified.
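The sample-size rule described above (a 10 percent random sample, clamped to between 10 and 50 students per school) can be sketched as follows; the function names and the rounding choice are our own assumptions, not the auditors' documented procedure:

```python
import random

# 10 percent of the universe, clamped to a minimum of 10 and a
# maximum of 50 students per high school (per the rule above).
def sample_size(universe):
    return max(10, min(50, round(universe * 0.10)))

# Draw the actual random sample from a list of student IDs.
def draw_sample(student_ids, seed=None):
    rng = random.Random(seed)
    return rng.sample(student_ids, sample_size(len(student_ids)))

print(sample_size(300))  # 10% of 300 -> 30
print(sample_size(40))   # 10% of 40 is 4, below the minimum -> 10
print(sample_size(900))  # 10% of 900 is 90, above the maximum -> 50
```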

To achieve our audit objective, we relied, in part, on computer-processed data related to the
student information contained in TEA’s PEIMS database. We verified the completeness of the
data by comparing source records to computer-generated data, and verified the authenticity by
comparing computer-generated data to source documents. Based on these tests, we concluded
that the data were sufficiently reliable to be used in meeting the audit’s objective.

We conducted our fieldwork at TEA, DISD, FWISD, and HISD between June 13, 2005, and
September 21, 2005. An exit conference was held with TEA officials on December 15, 2005.

Our audit was performed in accordance with generally accepted government auditing standards
appropriate to the scope of the review described above.



                            ADMINISTRATIVE MATTERS



Statements that managerial practices need improvements, as well as other conclusions and
recommendations in this report, represent the opinions of the Office of Inspector General.
Determinations of corrective action to be taken will be made by the appropriate Department of
Education officials.

If you have any additional comments or information that you believe may have a bearing on the
resolution of this audit, you should send them directly to the following Education Department
official, who will consider them before taking final Departmental action on this audit:

                              Henry Johnson
                              Assistant Secretary
                              Office of Elementary and Secondary Education
                              U.S. Department of Education
                              400 Maryland Ave., SW
                              Washington, DC 20202

It is the policy of the U.S. Department of Education to expedite the resolution of audits by
initiating timely action on the findings and recommendations contained therein. Therefore,
receipt of your comments within 30 days would be appreciated.

In accordance with the Freedom of Information Act (5 U.S.C. §552), reports issued by the Office
of Inspector General are available to members of the press and general public to the extent
information contained therein is not subject to exemptions in the Act.


                                             Sincerely,


                                             /s/
                                             Sherri L. Demmel
                                             Regional Inspector General
                                               for Audit




Attachment

                  TEXAS EDUCATION AGENCY

                  1701 North Congress Ave., Austin, Texas 78701-1494 • 512/463-9734 • FAX: 512/463-9838 • http://www.tea.state.tx.us



Shirley J. Neeley, Ed.D.
    Commissioner


March 10, 2006



Sherri L. Demmel
1999 Bryan Street
Suite 1440
Dallas, Texas 75201

Dear Ms. Demmel:

This letter is in response to your letter of February 10, 2006, Control Number ED-OIG/A06F0020, regarding
the results of the draft audit report for the Texas Education Agency (TEA).

With regard to the recommendation:

"We recommend that the Assistant Secretary for Elementary and Secondary Education require TEA to
maintain continual leaver data process improvements by increasing oversight of reporting and reliability, and
developing and implementing improved procedures and monitoring."

TEA responds:

TEA concurs with the recommendation to maintain continual leaver data process improvements by increasing
oversight of reporting and reliability, and developing and implementing improved procedures and monitoring.

Actions include:
1.	 Monitor leaver data submissions and apply interventions
    Description: Under the data integrity component of a performance-based monitoring system, districts with
    serious and systematic leaver data reporting problems are subject to interventions based on the level of
    data integrity concern. Emphasis is on a continuous improvement process, in which districts undertake
    activities that promote improved data reporting, and the agency monitors their progress. Interventions can
    lead to corrective actions or sanctions.
    Target completion date: Ongoing

2.	 Audit of 2001-02 dropout data by independent auditors
    Description: Every district's 2001-02 leaver data were evaluated by independent auditors. Audit reports
    were submitted to the agency, and the results provided to the public by the agency.
    Target completion date: Complete




3. 	 Ensure reporting of all leavers
     Description: The underreported students rate was implemented and integrated into the accountability
     rating system. The rate measures the completeness of a district's reporting of students served in the prior
     year by providing current year enrollment or leaver information on each student. Standards on the
     measure continue to be made more rigorous. Districts failing to meet standards on the measure may
     receive sanctions and are subject to agency investigation.
     Target completion date: Ongoing

4. 	 Implement accountability ratings consequences for inaccurate reporting
     Description: A district with serious and systematic data reporting problems is subject to several
     accountability ratings consequences: a high accountability rating cannot be earned, the accountability
     rating can be lowered, and an accountability rating that indicates data integrity issues can be assigned.
     Target completion date: Ongoing

5. 	 Ensure accurate reporting of student identification information
     Description: Student records reported with incorrect identification information cannot be matched to other
     records for the same student, which compromises the ability to produce accurate longitudinal rates.
     Standards on this measure continue to be made more rigorous. Districts must meet standards for accuracy
     or submit improvement plans.
     Target completion date: Ongoing

6. 	 Provide Public Education Information Management System (PEIMS) resources for guidance and
     training
     Description: Published annually by TEA, the PEIMS Data Standards provide detailed reporting
     requirements, data element definitions, and TEA contact information. PEIMS coordinators in each
     regional education service center serve as consultants to the districts in preparing their data submissions,
     as well as provide training and technical assistance. TEA staff members conduct workshops for districts
     and regional education service center staff who work with PEIMS data. Software made available to
     districts shortly after the beginning of each school year enables them to identify data problems and correct
     data errors before the submission is due. The PEIMS web page provides on-line access to general
     information about PEIMS, the PEIMS Data Standards, other reporting instructions, and contact
     information for inquiries.
     Target completion date: Ongoing


Sincerely,




Criss Cloudt, Associate Commissioner
Accountability and Data Quality