
The Texas Education Agency's System of Internal Control Over Statewide Test Results

Published by the Department of Education, Office of Inspector General on 2013-09-26.


UNITED STATES DEPARTMENT OF EDUCATION
OFFICE OF INSPECTOR GENERAL
AUDIT SERVICES
Chicago/Kansas City Audit Region

Control Number ED-OIG/A05N0006


September 26, 2013


Michael L. Williams
Commissioner of Education
Texas Education Agency
1701 North Congress Avenue
Austin, TX 78701

Dear Mr. Williams:

This Final Audit Report, “The Texas Education Agency’s System of Internal Control Over
Statewide Test Results,” presents the results of our assessment of selected aspects of the systems
of internal control over statewide test results designed and implemented by the Texas Education
Agency (TEA) and three local educational agencies (LEAs). Our work in the State of Texas
(Texas) was part of a nationwide audit of the systems of internal control over statewide test
results put in place by the U.S. Department of Education (Department) and five State educational
agencies (SEAs). The purpose of the nationwide audit is to determine whether the Department
and the five SEAs have systems of internal control that prevent, detect, and require corrective
action if they find indicators of inaccurate, unreliable, or incomplete statewide test results. We
will issue an audit report that presents the results of the nationwide audit to the Department.

In Texas, we performed our audit work at TEA, La Joya Independent School District (La Joya),
Lufkin Independent School District (Lufkin), and Marion Independent School District (Marion).
Our audit covered statewide test results for school years 2007–2008 through 2009–2010.1 The
purpose of this report is to separately address internal control weaknesses so the Department,
TEA, and La Joya can take appropriate corrective action before the next round of statewide tests.

TEA and La Joya need to improve their systems of internal control designed to prevent, detect,
and require corrective action if they find indicators of inaccurate, unreliable, or incomplete
statewide test results. TEA could improve its system of internal control by (1) using reviews of
test results and analyses of erasure data to identify LEAs and schools to monitor, (2) identifying
ways in which LEAs and schools can improperly influence the State of Texas Assessments of
Academic Readiness (STAAR) test results and designing mitigating controls, and
(3) documenting the corrective actions that it recommends and verifying that the LEAs
implement the corrective actions.

1 In Texas, classes generally run from August to May or June.

La Joya could strengthen its system of internal control by (1) properly administering statewide
tests, (2) adequately documenting its reviews of potential test administration irregularities,
(3) retaining records of all reviews of potential test administration irregularities, and (4) reporting
all test administration irregularities to TEA. We did not identify any significant control
deficiencies at Lufkin or Marion.

In response to the draft of this report, both TEA and La Joya agreed with the findings and
recommendations. In addition, they described the corrective actions that they have already
initiated and those that they plan to initiate to address our recommendations. See Attachment 2
for the full text of the comments. Other than including TEA’s and La Joya’s comments, we did
not make any changes to the report.




                                        BACKGROUND



The Elementary and Secondary Education Act of 1965, as amended (ESEA), requires States to
establish a set of high-quality, yearly student academic tests. The tests must measure the
proficiency of students in mathematics, reading or language arts, and science. States must
establish a single minimum percentage of students who are required to meet or exceed the
proficient level on these tests. States use these tests to determine the yearly performance of the
SEA, each LEA, and each public school in the State. Section 1111(b)(3)(C)(iii) requires each
State’s tests to be valid, reliable, and consistent with relevant, nationally recognized professional
and technical standards.

From school years 2007–2008 through 2012–2013, TEA developed seven tests to measure SEA,
LEA, and school performance:

       1.	 The Texas Assessment of Knowledge and Skills (TAKS) is a general test, offered
           with or without accommodations, given to students in grades 3 through 11 and
           exit-level students during school years 2007–2008 through 2010–2011. TEA has
           been phasing out the TAKS during school years 2011–2012 through 2013–2014. 

       2.	 The TAKS-Modified is an alternative test that uses modified academic achievement
           standards and is given to students receiving special education services. TEA has been
           phasing out the TAKS-Modified during school years 2011–2012 through 2013–2014. 

       3.	 The STAAR is a test or set of tests given to students in different grades. The STAAR
           given to students in grades 3 through 8 assesses the same subjects as the TAKS. The
           STAAR given to high school students consists of 12 end-of-course tests rather than
           the grade-specific TAKS. The STAAR began replacing the TAKS in school year
           2011–2012.

        4.	 The STAAR-Modified is an alternative test that uses modified academic achievement
            standards and is given to students receiving special education services. The STAAR-
            Modified began replacing the TAKS-Modified in school year 2011–2012.

        5.	 The STAAR-Alternate is a test given to students in grades 3 through high school who
            have significant cognitive disabilities and are receiving special education services. 

        6.	 The STAAR-L is a version of the STAAR test given to English language learners
            who meet the linguistic accommodation requirements.

        7.	 The Texas English Language Proficiency Assessment System consists of tests given
            to students with limited English language proficiency to assess their progress in
            learning the English language.

Students take these tests at various times throughout the school year depending on their grade
levels and the specific test being administered.

TEA developed all the tests with its contractor. After TEA’s Committee of Texas Educators
approved the tests, the contractor printed the test materials and distributed them to LEAs and
schools. Schools administered the tests, and the LEA or schools returned completed tests and
unused test materials to the contractor. The contractor scored the tests and provided the results
to TEA for review and verification. After TEA verified the results, the contractor made the
results available to LEAs and schools via a secure Web site.

TEA uploaded required test results data to the Department’s Education Data Exchange Network
(EDEN) Submission System, part of the Department’s EDFacts initiative. EDFacts is a
Department initiative to make performance data available for policy, management, and budget
decisions for all K–12 educational programs. EDFacts centralizes K–12 performance data
supplied by SEAs with other data, such as financial grant information, within the Department to
enable better analysis and use in policy development, planning, and management. EDFacts
includes statewide test data on student proficiency, participation rates, and graduation rates at the
SEA, LEA, and school levels.

As part of our LEA and school selection methodology, we used the EDFacts data to identify
935 LEAs in Texas that had at least 1 school with total enrollment of more than 200 students
during school years 2007–2008, 2008–2009, and 2009–2010. These 935 LEAs had more than
4.3 million students attending 6,105 schools that each had more than 200 students. For each
grade tested in the subjects of math and reading at each of the schools, we calculated a risk score.
We calculated the risk scores to evaluate how anomalous increases and decreases in student
proficiency for a grade and subject were from one year to the next in relation to the change for
that grade and subject across the State.2 A school’s composite risk score was the average of the
school’s five highest risk scores across all years, grades, and subjects. We selected the three
LEAs with the highest composite risk scores (see Scope and Methodology for a further
description of our selection methodology). We then selected two schools from each of the three
LEAs with the largest annual test score fluctuations.

2 We used the risk score to select LEAs and schools for internal control reviews, not to determine whether
cheating occurred at a particular LEA or school.

Tables 1 and 2 show the number of schools and students for the three LEAs and six schools that
we selected for review.

Table 1. Schools and Students in the LEAs Reviewed
     LEA          Number of Schools     Number of Students
     La Joya              30                  26,401
     Lufkin                7                   5,878
     Marion                3                   1,053
     Totals               40                  33,332

Table 2. Students in the Schools Reviewed
     LEA          School Name                          Number of Students
     La Joya      La Joya High School                        2,186
     La Joya      Sam Fordyce Elementary School                517
     Lufkin       Brandon Elementary School                    449
     Lufkin       Lufkin High School                         2,257
     Marion       Marion High School                           409
     Marion       Marion Middle School                         361
     Total                                                   6,179




                                      AUDIT RESULTS



TEA could improve its system of internal control over preventing, detecting, and taking
corrective actions if it finds indicators of inaccurate, unreliable, or incomplete statewide test
results. Specifically, TEA could strengthen its risk assessment and monitoring processes by
(1) using reviews of test results and analyses of erasure data to identify LEAs and schools to
monitor, (2) identifying ways that LEAs and schools can improperly influence STAAR test
results and designing mitigating controls, and (3) documenting the corrective actions that it
recommends and verifying that the LEAs implement the corrective actions.

TEA performed some internal control activities related to the administration of statewide tests.
The control activities included providing LEA and school personnel with guidance manuals and
training documents that emphasized test security and proper handling of test administration
irregularities, as well as requirements for reporting improper conduct. TEA also required
campus coordinators and principals, LEA coordinators, and test administrators to sign a
statement acknowledging these guidelines. In addition, TEA generally established adequate
control over input, processing, and reporting of statewide test results.

We also identified weaknesses in La Joya’s system of internal control over preventing, detecting,
and taking corrective actions if it finds potential test administration irregularities that could call
into question the validity of the statewide test results. La Joya could strengthen its system of
internal control by (1) properly administering statewide tests, (2) adequately documenting its
reviews of potential test administration irregularities, (3) retaining records of all reviews of
potential test administration irregularities, and (4) reporting all test administration irregularities
to TEA.

We did not identify any significant control deficiencies at Lufkin or Marion. The two LEAs
generally were following the guidance that TEA provided.

In response to the draft of this report, both TEA and La Joya agreed with the findings and
recommendations. In addition, they described the corrective actions that they have already
initiated and those that they plan to initiate to address our recommendations. See Attachment 2
for the full text of the comments.

FINDING NO. 1 – TEA Could Strengthen Its Risk Assessment and Monitoring Processes

TEA could strengthen its risk assessment and monitoring processes by (1) using reviews of test
results and analyses of erasure data to identify LEAs and schools to monitor, (2) identifying
ways that LEAs and schools can improperly influence STAAR test results and designing
mitigating controls, and (3) documenting the corrective actions that it recommends and verifying
that the LEAs implement the corrective actions.

TEA Did Not Use Analyses of Test Results and Erasures to Identify LEAs or Schools to
Monitor
From school years 2007–2008 through 2010–2011, TEA conducted onsite monitoring visits to
schools during test administration. After school year 2010–2011, TEA relied on LEAs to
conduct their own onsite monitoring using guidelines that TEA provided to them. The LEAs
were supposed to report test administration irregularities to TEA so that TEA could consider
additional follow-up activities. However, TEA could identify more potential test administration
irregularities if it used the results of forensic analyses, rather than relying solely on LEA
reporting. TEA had access to test results data and erasure analyses from its testing contractor,
but it did not use these resources to proactively identify LEAs and schools with possible test
administration irregularities.

TEA received test results data from its statewide testing contractor and used the data to identify
test values outside an expected range, such as excessive absences during testing. However, TEA
did not analyze year-to-year test score changes to identify possible test administration
irregularities and determine potential follow-up activities. As of school year 2012–2013, TEA
was still reviewing the practice of using test score analyses to identify possible test
administration irregularities and had not implemented the practice.

In addition, since school year 2005–2006, TEA’s statewide testing contracts have contained
provisions for collecting and summarizing erasure data for all schools and students. TEA
required its statewide testing contractor to summarize the data by listing the total number of
erasures, wrong-to-right erasures, right-to-wrong erasures, and wrong-to-wrong erasures by
subject and student. In addition, when an LEA reported a test administration irregularity, TEA
had the option to require the testing contractor to report any class in the subject school with an
average erasure rate greater than three standard deviations above the statewide mean in
wrong-to-right erasures for each of the subjects within each grade tested. TEA had the option to use
these analyses to obtain additional information on the specific test administration irregularity.
TEA agreed that reviews of test results and analyses of erasure data could help identify LEAs
and schools to monitor. However, TEA informed us that its statewide testing contractor
cautioned in a report to the Texas Technical Advisory Committee (TTAC) that erasure analyses
alone provided no insight into the reason behind excessive erasures and stressed that these
statistical analyses serve only to identify an extreme number of light marks or erasures. TEA’s
statewide testing contractor further reported that these statistical analyses should serve only as a
screening device.
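
For illustration, the screening rule described above can be expressed as a short calculation.
The sketch below (in Python; the function and data names are hypothetical, and the rates shown
are invented) flags any class whose average wrong-to-right erasure rate is more than three
standard deviations above the statewide mean:

    import statistics

    def flag_classes(class_wr_rates, num_sd=3):
        """Flag classes whose average wrong-to-right erasure rate is more
        than num_sd standard deviations above the statewide mean."""
        rates = list(class_wr_rates.values())
        threshold = statistics.mean(rates) + num_sd * statistics.stdev(rates)
        return sorted(c for c, rate in class_wr_rates.items() if rate > threshold)

    # Invented rates for one grade and subject; a real analysis would cover
    # every class statewide.
    rates = {f"class_{i}": 1.0 + 0.01 * i for i in range(100)}
    rates["suspect class"] = 9.9
    print(flag_classes(rates))  # ['suspect class'] -- a screening flag, not proof of cheating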

As a result, TEA chose to couple the data from the erasure analyses with information that LEAs
provided through the incident reporting process and to use this data as supporting evidence of a
potential test administration irregularity. TEA added that it considered two new methods for
identifying possible test administration irregularities: residual analysis and pass rate analysis.
TEA presented the results from the pilot analyses for both statistical methods to the TTAC in
March 2012 and January 2013, respectively. The TTAC expressed concern about using data
derived from these methods as sole identifiers of test administration irregularities, noting that
TEA could use the same data to identify effective teaching.

The TTAC also agreed with the contractor’s recommendation that the use of residual and pass
rate analyses should always take place within a larger review process. The TTAC stated that
these two analyses should be supplemented with the collection of additional evidence, such as
locally maintained seating charts, reports of test administration irregularities, test participation
data, erasure analyses, and records of test security and administration training for campuses. The
TTAC concluded that statistical measures alone should not be used to trigger a review.

Because of its limited resources and the caution expressed by the TTAC, TEA concluded that it
was not feasible or appropriate to monitor schools based on outcomes of proposed statistical
analyses at this time. TEA added that it will continue to discuss with its statewide testing
contractor and TTAC members the use of statistical measures to determine appropriate and fair
methods by which TEA can use these data.

We agree that TEA should not use forensic analyses as the sole evidence of test administration
irregularities. There are a variety of possible explanations for anomalous wrong-to-right erasure
counts and changes in proficiency, and sanctions should not be based solely on forensic analysis.
However, TEA could use the results of these analyses to identify LEAs and schools for possible
monitoring. Improved monitoring procedures will help TEA ensure that LEAs and schools are
properly administering statewide tests.

TEA Did Not Ensure That LEAs Tested All Qualified 10th Grade Students and Has Not
Assessed How LEAs or Schools Could Influence Outcomes of New State Tests
TEA did not track students’ grade-level reclassifications or assess the potential impact of these
reclassifications on the accuracy, reliability, and completeness of test results and determinations
of adequate yearly progress (AYP). For school years 2007–2008 through 2012–2013, TEA used
the results of the prior year’s TAKS given to qualified 10th grade students and graduation rates
to determine whether a high school made AYP. High school students took the TAKS only if
their LEA classified them as 10th grade students at the time of the test. In Texas, LEAs are
responsible for assigning grade-level classifications to students. Therefore, an LEA could avoid
classifying students as 10th grade students so that lower performing students would never take
the 10th grade TAKS.

The three LEAs that we reviewed reclassified students to the next grade level only at the
beginning of each school year. The three LEAs did not classify a high school student as a
10th grade student until the student earned the number of credits that each LEA’s policy
required. However, some students might not have earned enough credits by the beginning of
their second year of high school for the LEA to classify them as 10th grade students. Because
the LEAs would classify them as 9th grade students for their entire second year of high school,
these students would not be given the 10th grade TAKS at any time during their second year of
high school. By the beginning of their third year of high school, the students might have earned
enough credits for the LEA to classify them as 11th grade students. The students never would
have been classified as 10th grade students at any time during high school and never would have
been given the 10th grade TAKS that TEA used to determine AYP.

Section 1111(b)(2)(A) of the ESEA requires States to implement a statewide accountability
system that ensures LEAs and public schools make AYP.3 The accountability system must be
based on academic standards and annual academic tests that demonstrate AYP towards academic
achievement standards. TEA’s accountability system requires LEAs and public schools to meet
AYP in three measures: reading/language arts, mathematics, and either graduation rate for LEAs
and high schools or attendance rate for elementary and middle schools. In addition, section
1111(b)(3)(C) of the ESEA requires States to use such annual academic tests for purposes for
which such tests are valid and reliable. States must administer the tests not less than once in
grades 10 through 12 and provide for the participation of all students, including limited English
proficient students.

TEA mitigated the risk that an LEA could avoid testing students by never classifying them as
10th grade students when it instituted the STAAR end-of-course tests. Starting with school year
2011–2012, high school students take 12 end-of-course tests instead of the grade-specific
TAKS. Students take STAAR end-of-course tests whenever they complete specific courses,
regardless of their grade levels. TEA includes the students’ results on the STAAR
end-of-course tests when determining whether high schools are making AYP.

Although TEA eliminated the potential impact of grade-level reclassifications on test results for
students who were 9th grade students beginning with school year 2011–2012, it had not assessed
the possibility that LEAs and schools might influence STAAR test results in other ways. For
example, a school potentially could improve its test results in a given year by either preventing
low-performing students from taking these courses during that year or encouraging high-
achieving students from multiple grades to take these courses during that year.

As Texas implements the STAAR end-of-course tests and phases out the 10th grade TAKS, the
AYP calculations will be modified to include the STAAR end-of-course tests. In addition,
TEA’s transition plan for school years 2010–2011 through 2013–2014 includes long-term goals
and targets for graduation rates. TEA also should take this opportunity to determine how best to
prevent or detect potential manipulation of test results, which could impact whether schools
make AYP.

3 AYP is a measurement used to determine how States, LEAs, and public schools are performing academically
according to students’ results on standardized tests.

TEA Neither Documented Corrective Actions That It Recommended for La Joya Nor
Verified That La Joya Implemented These Corrective Actions
For school years 2007–2008 through 2009–2010, TEA did not document its recommended
corrective actions to address 3 of 12 statewide test administration irregularities that La Joya
reported through incident reports. These irregularities included 13 students not taking a
statewide test, students completing answer documents that La Joya assigned to other students,
and a student receiving an incorrect test score because the campus testing coordinator coded the
answer document incorrectly.

TEA informed us that it provided La Joya with recommendations to resolve the three test
administration irregularities. TEA made the recommendations when La Joya called TEA to
report the three irregularities and seek guidance. However, TEA did not record these
recommendations in its online incident report system. Instead, TEA relied on La Joya to record
the recommendations that TEA made to La Joya in the online incident reports that La Joya later
submitted to TEA. If TEA relies on LEAs to record its recommendations, rather than recording
the recommendations itself, LEAs might incorrectly record TEA's recommendations. TEA
would have no record of the actual recommendations that it made.

In addition, TEA did not maintain records showing that it verified La Joya implemented the
corrective actions. Instead, TEA documented only that it notified La Joya that it was closing its
records of the incidents with no further action. If TEA does not maintain these records, then it
cannot ensure that the LEA implemented the corrective actions. TEA stated that it will start
documenting all communications and retaining support for recommendations that it provides to
LEAs regarding corrective actions needed to address test administration irregularities.

Federal Regulations and Guidance Require Monitoring for Compliance With Federal
Requirements
According to 34 Code of Federal Regulations § 80.40(a), grantees are responsible for monitoring
grant- and subgrant-supported activities to ensure compliance with applicable Federal
requirements and the achievement of performance goals. According to “Key Policy Letters from
the Secretary or Deputy Secretary,” June 24, 2011, States are urged to make test security a high
priority. States should review and, if necessary, strengthen efforts to protect student achievement
and accountability data, ensure the quality of those data, and enforce test security.

Recommendations

We recommend that the Assistant Secretary for Elementary and Secondary Education require
TEA to strengthen its risk assessment and monitoring processes by—

1.1	   Using the test results and erasure analyses completed by its testing contractor more
       effectively to identify LEAs and schools for possible monitoring.

1.2	   Identifying ways that LEAs and schools can improperly influence STAAR test results
       and designing mitigating controls. This could include reviewing enrollment and
       participation trends in end-of-course tests.

1.3	   Documenting the corrective actions that it recommends LEAs take in response to self–
       reported testing irregularities and verifying that the LEAs implement the corrective
       actions.

TEA Comments
TEA agreed with the finding and recommendations and stated that it already implemented, or has
plans to implement, corrective actions for all issues identified in the finding. TEA added that it
will implement the new policies and controls to coordinate with its upcoming test
administrations. However, it is likely the activities will be ongoing for the next several years.
Legislation passed during the most recent Texas legislative session will result in components
being added to the accountability system and changes being made to the State assessment
system.

FINDING NO. 2 – La Joya Could Strengthen Its System of Internal Control Over
                Administering Tests, Retaining Records, and Reporting Test
                Administration Irregularities

La Joya did not (1) properly administer statewide tests, (2) adequately document its reviews of
potential test administration irregularities, (3) provide complete records of all reviews of
potential test administration irregularities, or (4) report all test administration irregularities to
TEA in a timely manner. TEA needs to identify why the test administration irregularities
occurred and ensure that La Joya implements procedures to prevent them from happening in the
future.

La Joya Did Not Properly Administer Statewide Tests
For school years 2007–2008 through 2009–2010, of the 65 total test administration
irregularities, La Joya identified 25 in which it did not properly administer statewide tests. Of
these 25, La Joya:

       •   Did not administer the correct statewide test to students 16 times. Each irregularity
           affected at least one student and in one case affected an entire classroom. La Joya
           placed students in incorrect testing rooms, provided the wrong grade-level tests,
           provided tests in the wrong language, provided versions of tests with
           accommodations that students were not approved to receive, and provided versions of
           tests that did not allow students to have the accommodations that they were approved
           to receive according to their Individualized Education Programs.

       •   Provided answer documents to the wrong students four times. In each instance, the
           irregularity affected two to three students. Students with identical names received each
           other’s answer documents, students who took a test in April used versions of the
           answer documents from the March test, and an 11th grade student was provided an
           answer document intended for a 10th grade student.

       •   Coded answer documents with incorrect information in the “Test Taken Info” field or
           did not correct inaccurate information on pre-coded student labels on answer
           documents five times. In each instance, the irregularity affected one to three students.

According to pages 13–14 of its “Test Security Supplement 2007–2008,” TEA requires campus
testing coordinators to “ensure that students are being administered the appropriate tests and
have the correct corresponding answer document.” On pages 95–100 of its “2008 District and
Campus Coordinator Manual,” TEA explains how LEAs need to properly attach pre-coded
student labels to answer documents and correct inaccurate information displayed on the
labels. Pages 109–113 of the manual explain how LEAs need to properly fill out the “Test
Taken Info” field, because failure to do so might result in a student receiving an incorrect score.

Without identifying the underlying causes of these instances and developing mitigating controls,
La Joya does not have reasonable assurance that tests will be administered properly in the future.
Additionally, because it did not always properly administer the statewide tests, La Joya cannot
provide assurances about the validity of student test results.

La Joya Did Not Adequately Document Its Reviews of Potential Test Administration
Irregularities
For school years 2007–2008 through 2009–2010, La Joya did not adequately document its
review of 17 of the 65 potential test administration irregularities. The potential irregularities
included incidents of students caught using or possessing cell phones or other electronic devices
during test administration, a student distracting other students, a student taking a restroom break
and never returning, students caught cheating, and students receiving test materials with broken
seals. Although La Joya has its own form to document the required information, the
documentation that it maintained for each incident did not clearly describe (1) what happened,
(2) how the incident occurred, (3) the actions La Joya took to resolve the incident, or
(4) La Joya’s determination of whether the incident constituted a deviation from documented
testing practices that should be reported to TEA.

According to pages 25–26 of TEA’s “Test Security Supplement, 2007–2008,” TEA requires
LEAs to review each potential test administration irregularity and collect statements and reports
that clearly outline the sequence of events, explain exactly what happened and how it occurred,
include information about how the problem was resolved or remedied, and document the LEA’s
determination in the matter.

Without complete documentation of the potential test administration irregularity review process,
La Joya cannot show that it reviewed and resolved all potential test administration irregularities
fairly, appropriately, and consistently.

La Joya Did Not Provide Records of All Reviews of Potential Test Administration
Irregularities
La Joya was unable to provide documentation for all of its reviews of potential statewide testing
irregularities during school year 2007–2008. It could locate documentation for only nine of its
reviews. As a result, we could not determine the total number of reviews that La Joya conducted
during that school year.

Without records for all of its reviews of potential test administration irregularities, La Joya
cannot show that it thoroughly reviewed all of them and took corrective actions to address
internal control deficiencies identified as a result of these reviews.

According to page 75 of its “2008 District and Campus Coordinator Manual,” “Beginning with
the spring 2008 TAKS administration, districts will be required to maintain test security
materials for five years.”

La Joya Did Not Report All Test Administration Irregularities to TEA or Did Not Report
the Irregularities Timely
For school years 2007–2008 through 2009–2010, of the 28 incidents of test administration
irregularities from the two schools we visited, La Joya did not report 16 to TEA. La Joya also
provided us with records for 37 incidents that occurred during school year 2009–2010 at the
remaining La Joya schools, but it never reported the incidents to TEA. The irregularities
included giving students the wrong answer documents, misplacing testing booklets and answer
documents, and leaving students unmonitored during a testing session. Additionally, La Joya
gave students the wrong statewide tests, resulting in students (1) receiving accommodations that
they were not approved to receive, (2) not receiving accommodations that they were approved to
receive according to their Individualized Education Programs, or (3) taking tests that did not
include all required subject areas.

In addition, for school years 2009–2010 through 2011–2012, La Joya reported 13 statewide test
administration irregularities to TEA more than 6 months after the irregularities occurred. In 1 of
the 13 cases, La Joya reported an irregularity to TEA more than 2 years after the irregularity
occurred. To test for timely reporting, we used the date that the irregularity occurred instead of
the date that the district test coordinator became aware of the incident because we could not
determine when the district coordinator learned of the incident or actually received a complaint.
La Joya did not explain why it took more than 6 months to report the test administration
irregularities. See Table 3 for a breakdown of the number of test administration irregularities at
La Joya.

According to pages 21–24 of its “2008 District and Campus Coordinator Manual” and pages 19–
23 of its “Test Security Supplement, 2007–2008,” TEA requires LEAs to report all incidents
resulting in deviations from documented testing procedures, including eligibility errors, improper
accounting for secure test materials, monitoring errors, and procedural errors.

According to page 23 of its “2009 District and Campus Coordinator Manual,” and page 30 of its
“Test Security Supplement, 2009–2010,” TEA requires LEAs to submit all required
documentation for testing irregularities, including statements from individuals involved and an
incident report, “within 10 working days of the district testing coordinator being made aware of
the incident.”
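
To illustrate the timeliness test, the sketch below (in Python; the function is hypothetical and
approximates working days as weekdays) measures the interval from the occurrence date, the
proxy used in this audit, rather than from the date the district testing coordinator became aware
of the incident:

    import numpy as np

    def reported_timely(occurred, reported, limit=10):
        """Approximate TEA's 10-working-day reporting window; weekends are
        excluded, but school holidays are not."""
        return int(np.busday_count(occurred, reported)) <= limit

    print(reported_timely("2010-04-12", "2010-04-23"))  # True: 9 working days
    print(reported_timely("2010-04-12", "2010-11-01"))  # False: months late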

Because TEA was not aware of all test administration irregularities at La Joya, it did not have a
complete understanding of La Joya’s system of internal control over statewide testing. If La
Joya had notified TEA of all the test administration irregularities, TEA might have decided to
take corrective actions, such as assessing penalties or performing additional monitoring.

Table 3. Test Administration Irregularities: Number of La Joya Incidents Reviewed

    Column (1): School years 2007–2008 through 2009–2010, two schools that we visited
    Column (2): School year 2009–2010, all other schools (a)
    Column (3): School years 2007–2008 through 2009–2010, total
    Column (4): Incidents for which TEA provided records, school years 2010–2011 and 2011–2012
    Column (5): Total

    Category                            (1)      (2)      (3)      (4)      (5)
    Improper Test Administration         17     8(b)       25      (c)       25
    Inadequate Documentation              4       13       17      (c)       17
    Did Not Report to TEA                16       37       53      (c)       53
    Reported to TEA Untimely              1        0        1       12       13
    Reported to TEA Timely               11        0       11       15       26
    Total Number of Records
    of Incidents                         28       37    65(d)       27       92

    (a) La Joya supplied unreported incidents from other schools only for school year 2009–2010.
    (b) There were seven separate incidents, but one incident had two different improper test
        administration irregularities associated with it.
    (c) We did not review any records provided by TEA for incidents in this category.
    (d) We did not review the reported incidents for all other La Joya schools. La Joya could not
        determine the total number of incidents.

Recommendations

We recommend that the Assistant Secretary for Elementary and Secondary Education require
TEA to—

2.1 	   Require La Joya to identify why it did not administer the correct tests to all students and
        ensure that La Joya implements procedures to prevent it from administering incorrect
        tests to students in the future.

2.2 	   Require La Joya to ensure that it properly documents all reviews of potential test
        administration irregularities.

2.3 	   Require La Joya to retain records of its reviews of potential test administration
        irregularities for the minimum number of years that TEA requires.

2.4 	   Ensure that La Joya timely reports to TEA all test administration irregularities that TEA
        requires it to report.

La Joya Comments
La Joya agreed with the finding and recommendations and stated that it already implemented
corrective actions for all issues identified in the finding.




                            SCOPE AND METHODOLOGY



To achieve our objective, we—

       1.	 Reviewed and gained an understanding of Federal law, regulations, and guidance
           applicable to the audit objective.

       2.	 Reviewed prior (1) Single Audits for TEA and the three LEAs and (2) Department
           program monitoring reports to identify areas of potential internal control weaknesses
           related to our audit objective.

       3.	 Reviewed written policies and procedures, contracts for scoring tests, and test
           administration records at TEA, three LEAs, and six schools.

       4.	 Interviewed officials at TEA, three LEAs, and six schools.

       5.	 Gained an understanding and assessed the adequacy of TEA’s and the three LEAs’
           systems of internal control over statewide test results in the following areas: data
           collection, monitoring, and guidance.
 
Sampling Methodology
We judgmentally selected three LEAs from a universe of 935 LEAs. The 935 LEAs in our
universe consisted of all LEAs in Texas that had at least 1 school with more than 200 students
during school years 2007–2008, 2008–2009, and 2009–2010. Next, we calculated for each
school a risk score for each grade tested in the subjects of math and reading. We calculated a
risk score to determine how anomalous each increase or decrease in the percentage of students
who scored at a proficient level for a grade or subject was from one year to the next in relation to
the change for that grade or subject across the State. A school’s composite risk score was the
average of the school’s five highest risk scores across all years, grades, and subjects. We also
considered the number of schools in the LEA, income level, and annual fluctuations in whether
schools met AYP requirements. After selecting La Joya, Lufkin, and Marion, we selected the
two schools at each LEA that had significant annual test score fluctuations.
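
The report does not publish the exact risk score formula, so the sketch below (in Python with
pandas; all column names are hypothetical) is only an approximation of the approach described
above: it standardizes each school's year-to-year change in proficiency against the statewide
change for the same grade, subject, and year, and then averages each school's five highest
scores:

    import pandas as pd

    def composite_risk_scores(df):
        """df: one row per school, grade, subject, and year with the
        percentage of students scoring proficient (hypothetical layout)."""
        df = df.sort_values("year").copy()
        df["change"] = df.groupby(["school", "grade", "subject"])["pct_proficient"].diff()
        # Compare each change with the statewide change for the same
        # grade, subject, and year.
        state = df.groupby(["grade", "subject", "year"])["change"]
        df["risk"] = ((df["change"] - state.transform("mean"))
                      / state.transform("std")).abs()
        # Composite score: average of the school's five highest risk
        # scores across all years, grades, and subjects.
        return df.groupby("school")["risk"].apply(lambda s: s.nlargest(5).mean())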

To determine whether La Joya completely and accurately documented and reported potential test
administration irregularities, we reviewed documentation for 65 incidents of potential test
administration irregularities for school years 2007–2008, 2008–2009, and 2009–2010. The
documentation included information such as the date of each incident, the date each incident was
reported to TEA, the individuals involved, explanations, resolutions, whether each incident was
reported to TEA, and whether the resolution was satisfactory. These 65 incidents included the
following:

       •   28 incidents that occurred at the 2 La Joya schools in our sample during school years
           2007–2008 through 2009–2010. La Joya reported 12 of these incidents to TEA and
           did not report 16.

       •   37 unreported incidents that occurred at all other La Joya schools during school year
           2009–2010. We did not review the reported incidents for all the other La Joya
           schools.

To determine whether La Joya timely reported incidents of test administration irregularities to
TEA, we reviewed documentation that La Joya provided us for 65 incidents of potential test
administration irregularities that occurred during school years 2007–2008, 2008–2009, and
2009–2010. We also reviewed documentation that TEA provided us for 27 incidents that
occurred at La Joya during school years 2010–2011 and 2011–2012.

Data Reliability
To achieve our objectives, we relied on data from EDFacts. EDFacts includes data fields for
student proficiency on statewide tests, participation rates, and graduation rates at the SEA, LEA,
and school levels. We used this data to select LEAs and schools to visit as part of this audit. We
looked for patterns of changes in student proficiency and used enrollment data to limit the
universe of schools to those with more than 200 students during school years 2007–2008,
2008–2009, and 2009–2010.

To determine whether the EDFacts proficiency scores (for selected grades and subjects at
selected schools) were accurate and complete, we reconciled the scores that we obtained from
EDFacts with scores we calculated using data that TEA provided. We did not find any
discrepancies between the scores recorded in EDFacts and the scores that we calculated using
TEA’s data. Therefore, we determined that the data from EDFacts were sufficiently reliable for
our intended use.
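
A minimal sketch of that reconciliation, assuming both sources are loaded into pandas
DataFrames keyed by school, grade, and subject (all names are hypothetical):

    import pandas as pd

    def reconcile(edfacts: pd.DataFrame, tea: pd.DataFrame) -> pd.DataFrame:
        """Return rows where the proficiency score in EDFacts differs from
        the score recalculated from TEA's data."""
        merged = edfacts.merge(tea, on=["school", "grade", "subject"],
                               suffixes=("_edfacts", "_tea"))
        return merged[merged["pct_proficient_edfacts"] != merged["pct_proficient_tea"]]

    # An empty result, as in this audit, supports treating the EDFacts data
    # as sufficiently reliable for the intended use.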

We conducted this audit at TEA’s offices in Austin, Texas; La Joya’s offices in La Joya, Texas;
Lufkin’s offices in Lufkin, Texas; Marion’s offices in Marion, Texas; and at our offices from
August 2012 through May 2013. We discussed the results of our audit with TEA officials on
May 29, 2013. We provided TEA officials with a draft of this report on July 22, 2013.

We conducted this performance audit in accordance with generally accepted government
auditing standards (December 2011 revision). Those standards require that we plan and perform
the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings
and conclusions on our audit objective. We believe that the evidence obtained provides a
reasonable basis for our findings and conclusions on our audit objective.



                             ADMINISTRATIVE MATTERS



Statements that managerial practices need improvements, as well as other conclusions and
recommendations in this report, represent the opinions of the Office of Inspector General.
Determinations of corrective action to be taken will be made by the appropriate Department
officials.

This report incorporates the comments you provided in response to the draft report. If you have
any additional comments or information that you believe may have a bearing on the resolution of
this audit, you should send them directly to the following Department official, who will consider
them before taking final action on this audit:

                               Deborah S. Delisle
                               Assistant Secretary
                               Office of Elementary and Secondary Education
                               400 Maryland Avenue, SW
                               Washington, D.C. 20202

It is the policy of the Department to expedite the resolution of audits by initiating timely action
on the findings and recommendations contained therein. Therefore, receipt of any additional
comments within 30 days would be appreciated.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the
Office of Inspector General are available to members of the press and general public to the extent
information contained therein is not subject to exemptions in the Act.

Sincerely,

/s/

Gary D. Whitman
Regional Inspector General for Audit
                                                                      ATTACHMENT 1

             Acronyms, Abbreviations, and Short Forms Used in this Report

AYP                Adequate Yearly Progress

Department         U.S. Department of Education

ESEA               Elementary and Secondary Education Act of 1965, as amended

La Joya            La Joya Independent School District

LEA                Local Educational Agency

Lufkin             Lufkin Independent School District

Marion             Marion Independent School District

SEA                State Educational Agency

STAAR              State of Texas Assessments of Academic Readiness

TAKS               Texas Assessment of Knowledge and Skills

TEA                Texas Education Agency

Texas              State of Texas

TTAC               Texas Technical Advisory Committee
                                                                                 ATTACHMENT 2




August 26, 2013

Mr. Gary D. Whitman
Regional Inspector General for Audit
United States Department of Education
Office of Inspector General
Citigroup Center
500 West Madison Street, Suite 1414
Chicago, IL 60661

RE: 	   Draft Audit Report; Texas Education Agency’s System of Internal Control over Statewide
        Test Results; Control Number ED-OIG/A05N0006

Dear Mr. Whitman,

This letter is in response to the Draft Audit Report regarding TEA identified above. As noted
below, the Texas Education Agency (TEA) agrees with the findings and recommendations set
forth in the draft report.

Response to Recommendations for Finding No. 1 – TEA Could Strengthen Its Risk
Assessment and Monitoring Processes

With respect to specific recommendations, TEA has the following comments and anticipated
completion dates:

        Recommendation 1.1 – Using the test results and erasure analyses completed by
        its testing contractor more effectively to identify LEAs and schools for possible
        monitoring.

        TEA agrees with the recommendation, and will use information from annually conducted
        erasure analyses to supplement information collected by other divisions within the
        agency to identify campuses where monitoring or other follow-up might be warranted.
         Additionally, in light of the state’s new assessment and accountability systems, the
        agency will review and analyze its existing data validation analyses and internal controls
        in the fall of 2013 to determine whether they are appropriate for the new systems or
        whether other types of analyses and internal controls are needed. It is likely these
        activities will be ongoing for the next several years as additional components are added
        to the accountability system and as changes are made to the state assessment system
        as a result of legislation passed during the most recent Texas legislative session.
       Recommendation 1.2 – Identifying ways that LEAs and schools can improperly
       influence STAAR test results and design mitigating controls.

       TEA agrees with the recommendation and will continue to (a) identify ways that LEAs
       and schools can improperly influence STAAR test results; and (b) design mitigating
       controls. In light of the new accountability system implemented for the first time in
       August 2013, the agency will review and analyze, in fall 2013, its existing policies and
       internal controls to determine whether they are appropriate for the new accountability
       system or whether additional policies and internal controls are needed. While any
       additional policies or controls may be implemented as soon as winter 2013 or spring
       2014, it is likely these activities will be ongoing as other components are added to the
       accountability system over the next several years, and as changes are made to the state
       assessment system as a result of legislation passed during the most recent Texas
       legislative session.

       Recommendation 1.3 – Documenting the corrective actions that it recommends
       LEAs take in response to self-reported testing irregularities and verifying that the
       LEAs implement the corrective actions.

       TEA agrees with the recommendation and has developed a process to document
       corrective actions recommended to districts in response to self-reported serious
       irregularities and to verify that LEAs have implemented them accordingly.

       Serious testing irregularities that are reported to TEA on the telephone or through email
       are documented in the database as a “Staff Report.” Districts follow up by submitting
       online incident reports that document their investigation of the incident. This online form
       will contain a field for districts to specify corrective action that they will take in response
       to the testing irregularity.

       Beginning in fall 2014 after incident reports from the spring 2014 administration have
       been processed by staff in the Student Assessment Division, TEA plans to audit a
       random sample of districts to verify that they have implemented corrective actions in
       response to reported irregularity events. The number of districts selected for follow-up
       and the nature of the follow-up activities will be dependent on available agency
       resources. Results will be used to reinforce actions taken by the district and to improve
       training on test administration and the maintenance of test security.

Response to Recommendations for Finding No. 2 – La Joya Could Strengthen Its System
of Internal Control Over Administering Tests, Retaining Records, and Reporting Test
Administration Irregularities

With respect to specific recommendations, La Joya ISD has the following comments and
anticipated completion dates:
Recommendation 2.1 – Require La Joya to identify why it did not administer the
correct tests to all students and ensure that La Joya implements procedures to
prevent it from administering incorrect tests to students in the future.

La Joya ISD has investigated the errors committed by staff to identify the underlying
causes, which included failure to follow procedures and misunderstanding of procedures.
In response, La Joya ISD has developed more stringent controls to ensure the proper
administration of statewide tests as described below.

For example, in the case where 13 students were not administered the TELPAS test, we
found that the campus testing coordinator (CTC) did not understand that when an ARD
committee decides that a student should not be administered any of the components of
the TELPAS test, the CTC is still required to register the student for the online TELPAS
test and enter coding into the online system indicating that the student was not required
to take the test due to a decision by the student’s ARD committee. The clarification of
this procedure has been emphasized during training.

In 2010, La Joya ISD analyzed testing errors committed by staff and initiated procedures
intended to eliminate those errors. The most common type of errors involved the
implementation of special education IEPs. Therefore, written local procedures designed
to prevent the administration of incorrect tests to students were developed and continue
to be revised as needed. Local procedures addressing the testing of students receiving
special education services were developed in May of 2010 and revised in March of
2011. Other local procedures intended to eliminate errors include procedures for the use
of locally developed checklists, determining the eligibility for students to take retests,
establishing a secure storage location, distributing and collecting secure materials, the
verification of PEIMS coding on answer documents, the retention of testing records, the
retention of Alternate testing student records, TELPAS validity and reliability procedures,
a 30-minute duty-free lunch for teachers, the regrouping of students during State tests,
the verification of test results, and the processing of testing incidents and irregularities.
Campus principals and campus testing coordinators (CTCs) are trained annually on
these procedures as well as the procedures in the State’s testing manuals. CTCs
participate in several training sessions throughout the year including updates to
emphasize certain procedures in order to eliminate errors.

Curriculum and instruction staff members at the Central Office are trained to monitor
State test administrations at each campus. The Central Office test monitors are trained
to use written checklists to document their monitoring activities. Central Office test
monitors are trained to notify the campus testing coordinator and the principal
immediately when they discover a concern so that the concern is addressed as soon as
possible.

La Joya ISD provides frequent training for campus testing coordinators and encourages
campus testing coordinators to contact the district testing coordinator for additional
support. The participation of campus testing coordinators in training is monitored to
ensure that they attend the training and make-up sessions are offered for those who do
not attend the regularly scheduled training sessions. For the 2013-2014 school year, La
Joya ISD has increased the number of professional staff in the central office testing
department so that it may provide additional support to the campuses as needed. With
more stringent controls in place, La Joya ISD has a reasonable assurance that tests will
be administered properly in the future.
Recommendation 2.2 – Require La Joya to ensure that it properly documents all
reviews of potential test administration irregularities.

In April 2013, La Joya ISD established written local procedures to supplement State
procedures for the processing of irregularities and potential irregularities. Campus
testing coordinators (CTCs) will contact the district testing coordinator (DTC) as early as
possible after discovering a testing irregularity or potential testing irregularity. CTCs will
investigate to determine whether an irregularity or student cheating occurred. CTCs will
submit statements within 2 days of discovering a testing irregularity or potential
irregularity. Statements will include how any testing irregularities or potential
irregularities were resolved and a determination by the CTC of whether a testing
irregularity or a student attempt to cheat occurred. CTCs will submit the required online
reports to TEA within 6 working days of reporting it to the DTC. The written local
procedures were created in part to ensure the proper documentation of investigations of
potential test administration irregularities. These procedures will ensure that La Joya
ISD will review and resolve all potential test administration irregularities fairly,
appropriately, and consistently.

Recommendation 2.3 – Require La Joya to retain records of its reviews of
potential test administration irregularities for the minimum number of years that
TEA requires.

During 2007-2008 and 2008-2009, La Joya ISD retained testing records by submitting
them to a central office of Admissions and Records which followed the record retention
rules of the Texas State Library and Archives Commission. Testing records for those
years were destroyed by the Office of Admissions and Records in accordance with the
rules from the Texas State Library; the office did not realize that there were other rules in
the State’s testing manuals governing the retention of testing records.

In December 2010, La Joya ISD established a written local procedure for the retention of
testing records requiring that the records be kept at the campus with copies in the
central office testing office. The procedure supplemented the State’s requirements for
record retention, making it clear that the records were to be kept in the campus testing
coordinator’s office with copies in the district testing coordinator’s office.

On August 19, 2013, La Joya ISD revised its written local procedure for the retention of
testing records to include statements about irregularities and potential test administration
irregularities.

Recommendation 2.4 – Ensure that La Joya timely reports to TEA all test
administration irregularities that TEA requires it to report.

In April 2013, La Joya ISD established written local procedures for the processing of
irregularities and potential irregularities that changed how the district submitted reports
to TEA through the online system. The new procedures have decentralized the
processing of reports so that campus testing coordinators have greater control over the
submission of reports with oversight from the central office. Through TEA’s online
reporting system, the district testing coordinator reviews all reports submitted by
campuses. The new procedures did result in significant improvements in the timeliness
of submissions during the Spring 2013 test administrations. These procedures and the
results will be studied to ensure compliance with reporting requirements.
 
 
The Agency appreciates the insights the report provides regarding its system of internal control
over statewide test results, and would like to thank the USDE audit team for the courtesy
extended to Agency and district staff during the audit process.

Sincerely,

/s/
Michael Williams
Commissioner of Education

cc:	Criss Cloudt, Associate Commissioner, Texas Education Agency
	Gloria Zyskowski, Director, Texas Education Agency
	Aida Benavides, Superintendent, La Joya Independent School District