
U.S. Department of Education's Implementation and Oversight of Approved Elementary and Secondary Education Act Flexibility Requests

Published by the Department of Education, Office of Inspector General on 2015-01-15.

UNITED STATES DEPARTMENT OF EDUCATION
OFFICE OF INSPECTOR GENERAL
AUDIT SERVICES
Atlanta Audit Region

January 15, 2015

Control Number: ED-OIG/A04N0012

Deborah S. Delisle
Assistant Secretary
U.S. Department of Education
Office of Elementary and Secondary Education
400 Maryland Avenue, S.W.
Washington, DC 20202-4300

Dear Ms. Delisle:

This final audit report, “U.S. Department of Education’s Implementation and Oversight of Approved
Elementary and Secondary Education Act Flexibility Requests,” presents the results of our audit. The
purpose of the audit was to (1) assess the U.S. Department of Education’s (Department) monitoring
efforts of State educational agencies’ (SEA) compliance with approved Elementary and Secondary
Education Act, as amended by the No Child Left Behind Act of 2001 (ESEA) flexibility requests, and
(2) determine how the Department assessed the sufficiency and accuracy of information received from
SEAs to validate implementation of the approved ESEA flexibility requests. 1 Our review covered the
Department’s ESEA flexibility monitoring process from September 5, 2012, through April 14, 2014.

We performed our review at the Department’s Office of Student Achievement and School Accountability
(SASA) within the Office of Elementary and Secondary Education (OESE). We also reviewed policies
and procedures for ensuring the accuracy of ESEA flexibility data submitted to the Department for
monitoring at nine SEAs—Arizona Department of Education, Georgia Department of Education, Kansas
State Department of Education, Louisiana Department of Education, Minnesota Department of Education,
Oregon Department of Education, South Carolina State Department of Education, South Dakota
Department of Education, and Washington Office of the Superintendent of Public Instruction.

The Department established and implemented an extensive and effective process for assessing SEAs’
compliance with approved flexibility requests based on the information the SEAs submitted during
monitoring. However, we found that the Department could improve its oversight of SEAs by taking steps
to ensure the accuracy of the data submitted.




1 “Approved ESEA flexibility requests” are also referred to as ESEA waivers.

                                                    BACKGROUND


Pursuant to Section 9401 of the ESEA, the Secretary may waive, with certain exceptions, any statutory or
regulatory requirement of the ESEA for an SEA, Indian Tribe, local educational agency (LEA), or school
through an LEA that receives funds under an authorized ESEA program and requests a waiver. On
September 23, 2011, using the authority granted in Section 9401, Secretary of Education Arne Duncan
invited Chief State School Officers to request flexibility regarding specific requirements of the ESEA on
behalf of their State, LEA, and schools. Secretary Duncan explained that flexibility would be offered
because the ESEA inadvertently encouraged some States to set low academic standards, failed to
recognize or reward growth in student learning, and did little to elevate the teaching profession or
recognize the most effective teachers. Attachment 1 lists the ESEA requirements subject to flexibility.

To exercise flexibility, an SEA submitted an ESEA flexibility request to the Department identifying
specific requirements to be waived in exchange for a rigorous and comprehensive State-developed plan
for improving educational outcomes for all students, closing achievement gaps, increasing equity, and
improving the quality of instruction. The request also included the SEA’s plan for adhering to the
approved request and a list of documents the SEA would provide as evidence to demonstrate that it
adhered to its plan and followed the four required ESEA flexibility principles by the approved deadlines.
The four required principles are (1) college and career-ready expectations for all students; (2) State-
developed differentiated recognition, accountability, and support systems 2 for all LEAs in the State and
for all Title I schools in the LEAs; (3) supporting effective instruction and leadership; and (4) reducing
duplicative, unnecessary, and burdensome reporting requirements.

SEAs submitted ESEA flexibility requests to the Department in December 2011, March 2012, and
October 2012. External peer and SASA office reviewers evaluated the requests and provided comments
to OESE officials. OESE management considered the comments when providing recommendations to the
Secretary, who was ultimately responsible for approving the requests. If the Department did not grant a
request for flexibility, the SASA office and OESE provided feedback to the SEA about the components
that needed additional development for approval.

As of May 2014, 45 SEAs (43 States, the District of Columbia, and Puerto Rico) had submitted flexibility
requests and received approval. In addition, three SEAs (Iowa, Wyoming, and the Bureau of Indian
Education) had applications under review but had not been granted waivers. Five SEAs either did not
apply or withdrew their requests—California, 3 Montana, and Nebraska did not apply; North Dakota and
Vermont withdrew their requests.


2 Differentiated recognition is a system each SEA developed to identify its focus schools, priority schools, and other Title I
schools. Focus schools are Title I schools with the largest within-school gaps between the highest-achieving subgroup and the
lowest-achieving subgroup; the largest within-school gaps in graduation rates; a subgroup with low achievement; or low
graduation rates. Priority schools must be among the lowest 5 percent of Title I schools in the State based on both achievement
and lack of progress; a participating Title I or eligible high school with a graduation rate less than 60 percent; or a school that
currently receives School Improvement Grants.
3 California did not apply for ESEA flexibility. However, according to the Department’s response to the draft report, eight
LEAs in the California Office to Reform Education consortium of school districts that submitted bundled requests were granted
waivers under a separate process that included concepts parallel to ESEA flexibility. The SEA reviewed the requests prior to
the Department’s approval.

To monitor SEAs with approved ESEA flexibility requests, the Department developed a monitoring
process designed to identify areas in which SEAs need assistance and support to meet their goals and
address the Department’s responsibilities for continued fiscal and programmatic oversight. Table 1
describes the Department’s monitoring process, which was divided into three components.

Table 1: Department’s Monitoring of SEAs With Approved ESEA Flexibility Requests

Part A
  Purpose: To gain a deeper understanding of each SEA’s goals and approach to implementing ESEA flexibility and ensure that the SEA had the critical elements of ESEA flexibility in place to begin implementation of its plan in the 2012–2013 school year.
  Method of monitoring: Desk monitoring.
  Planned dates for monitoring: Summer/Fall 2012.

Part B
  Purpose: To take a deeper look at the SEA’s early implementation of ESEA flexibility and other Title I requirements that were not waived, as well as follow up on any outstanding issues or concerns from Part A.
  Method of monitoring: Desk monitoring and on-site monitoring.
  Planned dates for monitoring: Winter 2013.

Part C
  Purpose: To look at each SEA’s ongoing implementation of its approved ESEA flexibility request and other Title I requirements that were not waived.
  Method of monitoring: Desk monitoring, on-site monitoring, and progress checks.
  Planned dates for monitoring: Fall/Winter 2014.

As of May 2014, the Department had performed Part A monitoring on 35 SEAs with approved ESEA
flexibility requests and Part B monitoring on 34 SEAs. In its response to the draft report, the Department
stated that it no longer planned to perform Part C monitoring due to office and program restructuring, but
planned to continue comprehensive monitoring as part of its overall performance management plan.



                                         AUDIT RESULTS


The Department established and implemented an extensive and effective process for assessing SEAs’
compliance with approved flexibility requests based on the information the SEAs submitted during
monitoring. Specifically, the Department followed established protocols, assessed sufficiency of
information, and followed up on problem areas. Through desk reviews and on-site monitoring, the
Department identified compliance issues with all nine SEAs we reviewed. The Department also provided
technical support to the SEAs and provided input to the Office of Management and Budget (OMB) to
update the OMB Circular A-133 Compliance Supplement for 2013 to include guidance to external
auditors for ensuring compliance with approved ESEA flexibility requests. The Department’s monitoring
process provides reasonable assurance that the Department has sufficient information to properly assess
SEA compliance with waiver provisions.

However, we found that the Department could improve its oversight of SEAs by taking steps to ensure the
accuracy of the data submitted. Specifically, the Department relied on SEAs to ensure the accuracy of the
information but did not verify that the SEAs had policies and procedures to ensure accuracy. In addition,
the Department did not require SEAs to provide an assurance statement covering the accuracy of the data
submitted and did not have procedures requiring SEAs to disclose any limitations of the information, data,
or validation process. Although the Department lacked procedures for verifying accuracy, all nine SEAs
we reviewed followed their respective State policies and procedures for ensuring the accuracy of the data
submitted to the Department. Since we did not review all SEAs, there is a risk that the remaining SEAs
may not be taking steps to ensure data accuracy.

In its response to the draft audit report, the Department stated that it appreciated our recommendations
and would integrate them into its continuous improvement process. The Department concurred with
Finding No. 1 and proposed corrective action to address Recommendation 1.1. Although the Department
did not state whether it concurred with Finding No. 2, it proposed corrective action to address
Recommendation 2.1 and both concurred with and proposed corrective action to address
Recommendation 2.2. The Department’s comments are summarized at the end of each finding. The
Department also provided technical comments that we considered and addressed, as appropriate, in the
body of the report. The full text of the Department’s comments on the draft report is included as
Attachment 3 to the report.

FINDING No. 1 – The Department Established and Implemented an Extensive and Effective
Monitoring Process

We found that the Department established and implemented an extensive and effective monitoring
process for assessing SEAs’ compliance with approved ESEA flexibility requests. Specifically, the
Department conducted desk and on-site reviews, provided technical support to the SEAs, and provided
input to OMB to update the OMB Circular A-133 Compliance Supplement for 2013 to include guidance
to external auditors for ensuring compliance with approved ESEA flexibility requests.

Desk and On-site Monitoring Reviews

The Department’s monitoring reviews obtain sufficient information to reasonably assess SEA compliance
with waiver provisions. Specifically, the Department (1) developed monitoring protocols sufficient to
determine whether the SEAs complied with their approved flexibility requests; (2) followed the protocols,
which required the SEAs to provide documentation to support their responses to the protocol questions;
(3) assessed the sufficiency of SEA documentation to validate compliance; (4) documented the results of
the reviews and identified compliance issues through Next Steps; 4 (5) followed up on the compliance
issues; and (6) performed an adequate number of reviews.

We found that the Department’s monitoring protocols included questions that reflected the established
purpose of Part A and Part B monitoring. In addition, we found that the protocols contained sufficient
steps for the Department to determine whether the SEAs had adequately complied with approved ESEA
flexibility requests and associated plans. Specifically, the questions addressed the SEA’s progress in
implementing the first three principles of ESEA flexibility. For example, one principle of ESEA
flexibility is supporting effective instruction and leadership. In the Part B protocol, we found questions
asking about the status of the SEAs’ efforts in developing, adopting, piloting, and implementing teacher
evaluation and support systems.

4 The Department included “Next Steps” in both Part A and Part B monitoring reports to identify SEAs’ ESEA flexibility
compliance issues. In its Part B monitoring efforts, the Department followed up on the Next Steps identified in the Part A
monitoring reports to ensure that the SEAs implemented the components of ESEA flexibility consistent with the principles and
timelines in the ESEA flexibility guidance and the SEAs’ approved requests.

     •   For the nine SEAs we reviewed, we found that the Department required and obtained a response
         to every applicable question in the Part A and Part B protocols or issued a Next Step when SEAs
         did not provide the required documentation. 5 For example, one protocol question required SEAs
         to provide evidence of their progress in issuing State report cards. However, one of the SEAs we
         reviewed for Part B monitoring did not provide the required documentation and the Department
         issued a Next Step.

     •   For the Part A and Part B protocol questions we reviewed, we determined that Next Steps were
         included in the monitoring reports in cases where the Department determined that there was not
         sufficient documentation. The Department provided examples of documentation SEAs could
         submit that would be sufficient to demonstrate implementation of ESEA flexibility requirements
         consistent with approved requests. For example, in the Part A monitoring protocol, the
         Department suggested that the SEAs provide the final list of reward, priority, and focus schools
         to support that the SEAs had identified these schools in accordance with the ESEA flexibility
         request. In the Part B monitoring protocol, the Department suggested that the SEAs provide
         documentation such as training activities, guidance to LEAs and schools, a consortia letter or
         memorandum of understanding, copies of legislation, or State Board of Education minutes to
         support the SEAs’ explanation of the progress made in adopting English language proficiency
         standards. If the SEA did not receive a Next Step for any question reviewed, we confirmed that it
         provided the Department information consistent with the suggested documentation.

         In addition, we found that the Department used the information the SEAs submitted for ESEA
         flexibility monitoring to validate compliance with approved ESEA flexibility requests. Through
         team discussions, the Department examined the information to determine whether the SEAs met
         the requirements outlined in the approved ESEA flexibility requests. In addition, the Department
         held debriefings every Thursday during which the program monitors explained to other
         Department staff the rationale for determining whether the SEAs were meeting expectations for
         each ESEA flexibility requirement.

     •   For the nine SEAs we reviewed, we found that the Department documented the results of the
         reviews. In addition, the Department identified compliance issues (through Next Steps) for all
         nine of the SEAs. Specifically, the Department’s Part A and Part B 6 monitoring reports included
         its validation decisions for all ESEA flexibility requirements and its rationale for the identified
         compliance issues. For example, for one SEA, the Department’s monitoring team was not
         confident that the interventions in the SEA’s focus schools were aligned with the reason why the
          school was identified as a focus school. The Department included a Next Step in the monitoring
          report requiring the SEA to create and submit a plan to align its interventions with the
          requirements of its ESEA flexibility request.

5 For 9 of 16 questions in the Part A protocol, we tested the Department’s assessment of the sufficiency of documentation. We
performed the same test for 31 to 53 of 108 questions in the Part B protocol. The number of questions verified on the Part B
protocol varied depending on whether the Department selected the SEA to receive desk or on-site monitoring. On-site
monitoring included additional questions for the SEAs. See the “Objectives, Scope, and Methodology” section for more
details.

6 For Part B monitoring, the Department created after action reports, which were internal documents that contained the
monitoring teams’ validation decisions and rationale that were subsequently included in the Part B monitoring reports.

    •   For the four SEAs we reviewed for Part B monitoring, we found that the Department followed up
        to ensure that the SEAs addressed the issues identified in Part A monitoring, holding them
        accountable for not being in compliance with the approved ESEA flexibility requests. For
        example, for one of the SEAs we reviewed, the Department determined that the SEA’s priority
        schools would not implement interventions aligned with required turnaround principles in the
        appropriate time frame. The SEA’s Part A monitoring report included a Next Step for it to
        submit a plan within 60 days that detailed how it would comply with the requirement. During
        Part B monitoring, the Department followed up on this issue and determined that the SEA had
        aligned its interventions as required.

    •   We found that the Department performed an adequate number of ESEA flexibility monitoring
        reviews. Out of the 43 SEAs with approved ESEA flexibility requests as of May 2014, the
        Department performed Part A monitoring on 35 SEAs and Part B monitoring on 34 SEAs.

Technical Support to SEAs

The Department also provided technical support to SEAs with approved ESEA flexibility requests.
Specifically, the Department hosted a forum in 2011 that provided SEAs with an overview of ESEA
flexibility and an opportunity to learn from other SEAs and national experts about approaches to key
policy areas addressed in ESEA flexibility. Also, from 2011 through 2014, the Department provided
numerous general technical assistance webinars pertaining to various aspects of ESEA flexibility. For
example, in November 2011, the Department held a webinar discussion on how reward, priority, and
focus schools could be incorporated into SEA systems of differentiated recognition, accountability, and
support.

In addition to providing general technical assistance, the Department provided technical assistance to
individual SEAs. For example, in 2013, the Department held a workshop to provide guidance and
technical assistance to selected SEAs to help them design and implement comprehensive systems of
evaluation and support. The Department also provided technical assistance to individual SEAs during
Part B monitoring.

The Department’s monitoring process was enhanced by its collaborative efforts with other offices within
the Department and the training it provided to program monitors. For example, in developing the ESEA
flexibility monitoring process, the Department involved staff from OESE; the Office of Planning,
Evaluation, and Policy Development; the Risk Management Service; and the Council of Chief State
School Officers. In addition, SASA officials provided program monitors with continuous direction and
support in carrying out their monitoring responsibilities, including at least one mandatory training class,
several other training classes on performing monitoring, and a step-by-step monitoring guide.

Update of OMB Circular A-133

The Department provided input to OMB to update Part 4 of the OMB Circular A-133 Compliance
Supplement for 2013 to include areas pertaining to ESEA flexibility. The update covers areas that include
the awarding of School Improvement Grants funds to priority schools; transferring Title II, Part A funds;
earmarking school improvement funds; 7 and determining Title I, Part A eligibility for schools. The
Compliance Supplement includes suggested audit procedures the auditor can choose to perform and apply
ESEA flexibility requirements instead of the regular program requirements. The Compliance Supplement
was not available before the Department performed Part A and Part B monitoring. During our review, the
Department was in the process of planning for future ESEA flexibility monitoring but had not finalized its
plans. For its future monitoring efforts, A-133 audit reports should be available for the Department to use
to identify and follow up on ESEA flexibility issues.

Recommendation

We recommend that the Assistant Secretary for OESE encourage the SASA office to—

    1.1   Review SEAs’ A-133 audit reports to identify any ESEA flexibility issues to follow up on during
          future ESEA flexibility monitoring efforts.

Department Comments

The Department concurred with the finding and the recommendation and stated that, in the next ESEA
flexibility monitoring cycle, it will include a requirement for the assigned program officer to review the
SEA’s most recent A-133 audit to identify any issues potentially related to the implementation of the
SEA’s flexibility request.

OIG Response

The Department’s proposed corrective action sufficiently addresses the finding and recommendation.

FINDING No. 2 – The Department Can Improve Oversight to Ensure the Accuracy of the SEA
Information Submitted

Although the Department obtained sufficient documentation to assess SEA compliance with approved
flexibility requests, it did not assess the accuracy of the information. 8 The Department relied on the SEAs
to ensure the accuracy of the information submitted but did not obtain information on what the SEAs did
to validate data reliability. In addition, it did not require the SEAs to provide assurance that the
information submitted was accurate, reliable, and complete. Further, the Department did not require the
SEAs to disclose any limitations of the information, data, or validation process.

Although SEAs were not required to provide the Department an assurance of accuracy for submitted
information, the majority of the SEA information provided for Part A and Part B monitoring was actual
documentation that demonstrated the State’s progress. For example, States were required to submit
progress in transitioning to high-quality assessments as evidence that they planned to develop and
administer high-quality assessments. At the time of the first two monitoring efforts, SEAs were either
developing policies to implement provisions in approved flexibility requests or were in the first stages of
initiating procedures. As a result, much of the information reported back for monitoring was evidence of
the SEA’s approach and early implementation. However, future monitoring efforts will rely to a greater
extent on submitted data to assess the SEA’s progress and accomplishments. Consequently, data
accuracy will likely have more impact on the Department’s future assessments of SEA progress.

7 Earmarking refers to an SEA’s flexibility to allocate school improvement funds to an LEA to serve priority schools or focus
schools.

8 We did not assess whether the SEA documentation was sufficient; we assessed only whether the Department followed its
policies and procedures for determining that the documentation was sufficient.

For data submitted for Parts A or B monitoring, all nine SEAs we reviewed followed their respective
State-established policies and procedures to ensure data accuracy. In our review of documents supporting
selected protocol questions, we found that the nine SEAs followed the established policies and
procedures. For each protocol question selected, the SEA officials explained the SEA’s policies,
procedures, and process for ensuring the accuracy of the data used to create the documents and the
sources of the data. Also, each SEA provided documentation to support compliance with its established
policies and procedures over data reliability. For example, in support of a document on allocations
provided to the Department for Part B monitoring, an SEA team leader explained that the SEA used State
assessments, enrollment data, and graduation data to grade each school. The SEA gave each school a
grade from A to F. The schools that received a grade D or F became the “Other Title I Schools” for
ESEA flexibility purposes. To allocate funds to D or F schools, the SEA used poverty per-pupil
expenditure data. 9 The SEA team leader provided the poverty per-pupil expenditure data used to support
the SEA’s allocation of Title I funds, as well as the supporting documentation used to validate the
accuracy of the State assessments, enrollment data, and graduation data used to grade each school.
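
To make the arithmetic in this example concrete, the sketch below shows one way such a grading-and-allocation step could be computed. It is a minimal illustration only: the school names, grades, dollar amounts, and the proportional allocation rule are invented assumptions, not the SEA's actual data or methodology.

```python
# Illustrative sketch only: a hypothetical SEA's identification of "Other Title I
# Schools" (grades D or F) and a proportional allocation of school improvement funds
# based on poverty per-pupil expenditure data. Names and figures are invented.

def identify_other_title_i_schools(school_grades):
    """Return the schools graded D or F, which become 'Other Title I Schools'."""
    return {school: grade for school, grade in school_grades.items() if grade in ("D", "F")}

def allocate_funds(total_funds, poverty_ppe, eligible_schools):
    """Split total_funds across eligible schools in proportion to their poverty
    per-pupil expenditure data (here assumed proportional to the count of students
    receiving free or reduced lunch)."""
    weights = {s: poverty_ppe[s] for s in eligible_schools}
    total_weight = sum(weights.values())
    return {s: round(total_funds * w / total_weight, 2) for s, w in weights.items()}

if __name__ == "__main__":
    grades = {"Adams Elementary": "B", "Baker Middle": "D", "Cole High": "F"}
    poverty_ppe = {"Adams Elementary": 310, "Baker Middle": 455, "Cole High": 620}

    other_title_i = identify_other_title_i_schools(grades)
    allocations = allocate_funds(100_000, poverty_ppe, other_title_i)
    print(other_title_i)   # {'Baker Middle': 'D', 'Cole High': 'F'}
    print(allocations)     # proportional split of the hypothetical $100,000
```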

Although we did not identify issues in the nine SEAs we reviewed, we did not review the remaining
36 SEAs and 8 LEAs with approved ESEA flexibility requests. Because the Department relies on the
SEAs to provide reliable data and does not determine whether SEAs have and follow policies and
procedures, it cannot be sure whether or not SEAs are providing accurate information.

The Government Performance and Results Act of 1993 (GPRA), as amended by the GPRA
Modernization Act of 2010, requires agencies to clarify their missions, set strategic and annual
performance goals, and measure and report on performance towards achieving those goals in their Annual
Performance Report. The Department uses the ESEA data from monitoring reports in its Annual
Performance Report to discuss the Department’s established ESEA GPRA indicators. In its Annual
Performance Plan for fiscal year 2015, the Department included GPRA indicators related to ESEA
flexibility implementation, with ESEA flexibility monitoring listed as a source for reporting the related
performance data. In addition, when the SEAs first applied for ESEA funds under the current
authorization, Section 9304(a)(6)(A) required assurance that they would submit reports to the Secretary as
necessary to enable him to perform his duties under such program.

According to OMB Circular No. A-11, “Agencies should have in place verification and validation
techniques that will ensure the completeness and reliability of all performance measurement data
contained in their Annual Performance Plans…” The circular also directs agencies to have a data
validation plan for performance reporting and to include an assessment of the reliability and completeness
of the performance data included in the plan. Further, the circular requires the agency to describe how it
ensures the accuracy and reliability of the data it uses to measure progress in meeting performance goals.

The Department needs accurate information from SEAs to assess compliance with approved flexibility
requests that allow the SEAs to waive strict requirements of ESEA. The approved requests provide SEAs
flexibility in achieving the overall goals of ESEA, and each SEA must demonstrate progress in
implementing its plan to achieve those goals in order to receive continued waivers for those requirements.

9 Poverty per-pupil expenditure data is determined by the number of students receiving free or reduced lunch.

The Department relies on SEAs to ensure accuracy; however, it has neither assessed the SEAs’ processes
for doing so nor required SEAs to certify the accuracy of submitted information. If the Department does
not oversee the accuracy, reliability, and completeness of the SEAs’ reported data, it risks using
inaccurate, unreliable, or incomplete information to meet its program obligations and to report on ESEA
GPRA performance in its Annual Performance Report.

Recommendations

We recommend that the Assistant Secretary for OESE require the SASA office to—

2.1      Include in its monitoring reviews a step to determine how SEAs with approved ESEA flexibility
         requests ensure the accuracy of the information they submit to the Department for monitoring so
         the Department can determine the adequacy of their policies and procedures and whether the SEAs
         are following them.

2.2      Require all SEAs to provide certifications that the information they submit is accurate, reliable,
         and complete and disclose any limitations of the information, data, or validation process,
         especially for information used for GPRA reporting.

Department Comments

The Department did not state whether it concurred with Finding No. 2 and Recommendation 2.1.
According to the Department’s response, it expects SEAs to submit accurate data to support all
monitoring activities and added that our report noted that all nine States reviewed followed State policies
and procedures for ensuring the accuracy of the data submitted. However, the Department stated that it
will include a step in its monitoring reviews to determine how an SEA ensures the accuracy of the data
submitted.

The Department concurred with Recommendation 2.2 and stated that in each SEA request to renew
ESEA flexibility, due to the Department by the end of March 2015, the SEA will be required to assure
that it will provide to the Department, in a timely manner, all required reports, data, and evidence of the
SEA’s progress in implementing the plans detailed in the approved flexibility request. 10 The Department
will also require the SEA to ensure that all such reports, data, and evidence are accurate, reliable, and
complete and to disclose any issues related to the accuracy, reliability, or completeness of its reports,
data, or evidence. 11

OIG Response

The Department’s proposed corrective actions sufficiently address the finding and recommendations.

10 In November 2014, the Department invited each SEA with an approved request that will expire at the end of the 2014–2015
school year to request a 3-year or, in some cases, 4-year renewal of ESEA flexibility.
11 In response to a related audit regarding the Department and SEAs’ internal controls over assessment results, the Department
stated that it is also requiring SEAs to respond to all flagged comments in the data collections related to academic assessments
and accountability, and that it is revising the Consolidated State Performance Report (CSPR) to include an annual State
certification that the State has a system of internal controls for reviewing assessment data. The CSPR is a required annual
report that each State, the District of Columbia, and Puerto Rico has to submit to the Department. The CSPR collects
information related to State activities and outcomes of specific ESEA programs, which the Department uses to monitor States’
progress in implementing ESEA and to identify technical assistance needs and program management and policy needs.


                         OBJECTIVE, SCOPE, AND METHODOLOGY


The objective of our audit was to (1) assess the Department’s monitoring efforts of SEAs’ compliance
with approved ESEA flexibility requests and (2) determine how the Department assessed the sufficiency
and accuracy of information received from SEAs to validate implementation of the approved ESEA
flexibility requests. Our review covered the Department’s ESEA flexibility monitoring process from
September 5, 2012, through April 14, 2014. 12

We performed our on-site review at the Department’s SASA office in Washington, D.C., from
September 23, 2013, through September 27, 2013. In addition, between February 28, 2014, and
May 8, 2014, we reviewed policies and procedures for ensuring the accuracy of ESEA flexibility data
submitted to the Department for monitoring at nine SEAs—Arizona Department of Education, Georgia
Department of Education, Kansas State Department of Education, Louisiana Department of Education,
Minnesota Department of Education, Oregon Department of Education, South Carolina State Department
of Education, South Dakota Department of Education, and Washington Office of the Superintendent of
Public Instruction. We held our exit conference with the Department’s SASA office on June 27, 2014.

To gain an understanding of ESEA flexibility, we reviewed background information related to ESEA
flexibility requirements, flexibility requests, and applicable laws and guidance. We also reviewed
information on ESEA programs affected by ESEA flexibility and funding information for those programs.
We obtained background information on the Department’s SASA office, which oversees ESEA
flexibility, and reviewed ESEA flexibility monitoring reports the SASA office issued to SEAs with
approved ESEA flexibility requests.

We interviewed key officials and program specialists in the Department’s SASA office and the Office of
Planning, Evaluation, and Policy Development, and reviewed related documentation to gain an
understanding of the following:

     •   policies and procedures over the Department’s ESEA flexibility monitoring,
     •   assessments of the accuracy and sufficiency of SEA submitted information,
     •   training to the individuals conducting the monitoring, and
     •   technical support provided to SEAs with approved ESEA flexibility requests.

In addition to reviewing SEA policies and procedures in the nine States in our review, we interviewed key
officials from those SEAs to determine how they ensured the accuracy of the information submitted to the
Department for monitoring. We also obtained supporting information to assess whether the SEAs
followed their policies and procedures.




12 The Department conducted the first Part A monitoring on September 5, 2012, for the nine SEAs we reviewed.

SEA Selection

In selecting the 9 SEAs included in our review, we created a risk matrix with a universe of the 35 SEAs
that had an approved ESEA flexibility request in December 2011 or March 2012 (the first two
opportunities to apply) and had received Part A monitoring before September 13, 2013. We considered
various factors, such as whether the SEA (1) received Part A monitoring only or had also received Part B
monitoring; (2) provided limited documentation for monitoring based on the Department’s determination;
and (3) was on the Department’s high-risk list.

We judgmentally selected about 25 percent of the universe of 35 to review, resulting in 9 SEAs. Those
nine SEAs included the five that provided limited documentation for monitoring or were placed on the
high-risk list by the Department, and four others from the remaining SEAs with approved flexibility
requests. The results from the SEAs included in our review cannot be projected across all SEAs or LEAs.
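
The selection approach described above can be pictured with a small sketch. The risk factors below mirror the criteria listed in this section, but the flag values, the scoring, and the tie-breaking rule are hypothetical assumptions rather than the OIG's actual risk matrix.

```python
# Minimal illustration of a risk-matrix style selection resembling the approach
# described above. The risk factors mirror the report's stated criteria, but the
# flag values, target fraction, and tie-breaking are hypothetical.

from dataclasses import dataclass

@dataclass
class SEARisk:
    name: str
    limited_documentation: bool  # Department determined documentation was limited
    on_high_risk_list: bool      # SEA is on the Department's high-risk list
    part_b_received: bool        # SEA received Part B in addition to Part A monitoring

def select_seas(universe, target_fraction=0.25):
    """Always include SEAs that provided limited documentation or are on the
    high-risk list, then judgmentally fill the remainder (here, Part B-monitored
    SEAs first, then alphabetical) up to roughly target_fraction of the universe."""
    must_include = [s for s in universe if s.limited_documentation or s.on_high_risk_list]
    remaining = sorted(
        (s for s in universe if s not in must_include),
        key=lambda s: (not s.part_b_received, s.name),
    )
    target = round(len(universe) * target_fraction)
    return must_include + remaining[: max(0, target - len(must_include))]

if __name__ == "__main__":
    # 35 hypothetical SEAs; flags are invented so that 5 are "must include".
    universe = [
        SEARisk(f"SEA-{i:02d}",
                limited_documentation=i in (3, 12, 19),
                on_high_risk_list=i in (12, 27, 31),
                part_b_received=i % 2 == 0)
        for i in range(1, 36)
    ]
    selected = select_seas(universe)
    print(f"{len(selected)} of {len(universe)} SEAs selected")  # 9 of 35 SEAs selected
```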

Department’s Monitoring Assessment

We assessed the Department’s on-site and desk monitoring by performing seven tests of its Part A and
Part B monitoring efforts for the nine SEAs included in our review. We obtained the
following data and information to use in our testing:
   •   the Department’s Part A and Part B monitoring protocols,
   •   the supporting documentation the nine SEAs submitted to the Department for monitoring,
   •   the nine SEAs’ monitoring reports,
   •   the four SEAs’ Part B after action reports,
   •   the nine SEAs’ approved ESEA flexibility requests, and
   •   the Department’s Part B meeting minutes.

Table 2 describes tests performed, documentation and information reviewed, and the evaluation
methodology used to assess the Department’s monitoring efforts.

Table 2: Tests Performed on the Department’s Monitoring Efforts

Test: Were Department goals reflected in the monitoring protocol questions?
  Documentation and information reviewed: Department’s Part A and B monitoring protocols.
  Evaluation methodology: We reviewed the monitoring protocols to determine whether the questions in the protocols related to the goals of the monitoring.

Test: Were monitoring protocol questions sufficient to help determine SEAs’ compliance with approved ESEA flexibility requests?
  Documentation and information reviewed: Nine SEAs’ ESEA flexibility requests; Department’s Part A and B monitoring protocols.
  Evaluation methodology: We reviewed the requirements in the SEAs’ ESEA flexibility requests and determined whether questions in the protocols would provide the Department with enough evidence to determine whether the SEAs were in compliance with approved ESEA flexibility requests.

Test: Did the Department follow the monitoring protocols?
  Documentation and information reviewed: Department’s Part A and B monitoring protocols; documentation submitted by the nine SEAs.
  Evaluation methodology: We reviewed the questions in the monitoring protocols and verified whether the SEAs provided a response for each applicable question.

Test: Did the Department document results from on-site monitoring and desk reviews?
  Documentation and information reviewed: Part A and Part B monitoring reports; Part B after action reports.
  Evaluation methodology: We reviewed the reports and verified whether the Department included the results of the reviews.

Test: Did the Department monitor a sufficient number of SEAs across all SEAs with approved requests?
  Documentation and information reviewed: List of SEAs with approved ESEA flexibility requests (December 2011 and March 2012); Part A and Part B monitoring reports.
  Evaluation methodology: We compared the two lists to determine the number of SEAs with approved requests that had received Part A and Part B monitoring.

Test: Did the Department follow its established criteria, policies, and procedures for assessing the sufficiency of the information the SEAs submitted for monitoring?
  Documentation and information reviewed: Department’s Part A and B monitoring protocols; documentation submitted by the nine SEAs.
  Evaluation methodology: We verified whether the SEAs provided evidence that was consistent with the examples listed in the monitoring protocols.

Test: Did the Department follow its established policies and procedures for using the SEA-submitted information to validate the SEAs’ compliance with approved ESEA flexibility requests?
  Documentation and information reviewed: Documentation submitted by the nine SEAs; Department’s Part B meeting minutes; Part B after action reports; Part A and Part B monitoring reports.
  Evaluation methodology: We verified whether the SEAs’ monitoring documents supported the validation decisions and rationales included in the SEAs’ monitoring reports, after action reports, and the Department’s ESEA flexibility meeting minutes.

Because of the extensive number of questions in the monitoring protocols, we did not review all the
questions and related documentation for the nine SEAs included in our review. Instead, we selected a
sample of questions from each of the Part A and Part B monitoring protocols to determine whether the
Department followed its established criteria, policies, and procedures for assessing the sufficiency of the
information submitted and for using the information to validate the SEAs’ compliance with approved
ESEA flexibility requests. In selecting our sample of questions, we focused on the questions pertaining to
areas of ESEA flexibility that had related Next Steps in the nine SEAs’ Part A and Part B monitoring
reports. For Part A monitoring, we selected 9 out of 16 questions 13 from the Part A monitoring protocol.
For Part B, we selected ESEA flexibility elements instead of questions because the SEAs received
different questions based on whether they received a desk or on-site review. The Part B monitoring
protocol is arranged by ESEA flexibility elements. Desk reviews include foundational and technical
assistance questions for each element and an additional comprehensive set of questions for the SEA
systems and processes element. On-site reviews include foundational and technical assistance questions
for each element, an additional comprehensive set of questions for the SEA systems and processes
element, and additional comprehensive questions for three other ESEA flexibility elements in the
protocol. The Part B protocol contained a total of 18 elements, which included a total of 108 questions.
We selected 7 out of 18 elements, which included 31 to 53 questions, depending on the type of review the
SEA received.
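
As a rough illustration of why the Part B question counts varied with the review type, the sketch below tallies questions for a hypothetical sample of elements. The element names and per-element question counts are invented placeholders (chosen so the totals land on the 31 and 53 figures cited above); the actual protocol contents are not reproduced in this report.

```python
# Hypothetical tally of Part B protocol questions covered by a sample of elements.
# Only the structure (on-site reviews apply comprehensive question sets to more
# elements than desk reviews) follows the report's description; all counts are
# placeholders.

ELEMENTS = {
    # element: (foundational/technical assistance questions, comprehensive questions)
    "SEA systems and processes": (3, 7),
    "Priority schools": (4, 8),
    "Focus schools": (4, 6),
    "Teacher and leader evaluation": (5, 8),
    "Annual measurable objectives": (3, 0),
    "Report cards": (2, 0),
    "English language proficiency standards": (3, 0),
}

# Elements whose comprehensive question sets apply, by review type.
DESK_COMPREHENSIVE = {"SEA systems and processes"}
ONSITE_COMPREHENSIVE = {"SEA systems and processes", "Priority schools",
                        "Focus schools", "Teacher and leader evaluation"}

def questions_reviewed(selected_elements, review_type):
    """Count foundational questions for every selected element plus comprehensive
    questions for the elements that get them under the given review type."""
    comprehensive = DESK_COMPREHENSIVE if review_type == "desk" else ONSITE_COMPREHENSIVE
    total = 0
    for element in selected_elements:
        foundational, comp = ELEMENTS[element]
        total += foundational + (comp if element in comprehensive else 0)
    return total

if __name__ == "__main__":
    selected = list(ELEMENTS)  # the 7 sampled elements
    print("desk review:", questions_reviewed(selected, "desk"), "questions")      # 31
    print("on-site review:", questions_reviewed(selected, "on-site"), "questions")  # 53
```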

To determine whether the nine SEAs included in our review had policies and procedures for ensuring the
accuracy of the information submitted to the Department for monitoring, we started with a universe
containing all the questions we selected for our sufficiency tests, as described in the paragraph above.
However, not all of the questions required responses that could be tested for accuracy given the status of
ESEA flexibility implementation at the SEAs. For example, some questions required that the SEA submit
evidence of its implementation plans and procedures; in that case, the SEA either submitted its plans
and procedures or received a “Next Step” in its monitoring report. We selected all the questions that we
could test for accuracy, which resulted in six questions from Part A and five from Part B. 14 For the
selected questions, we asked the SEAs to explain their policies, procedures, and process for ensuring the
accuracy of the data used to create the documents, and the sources of the data. We also asked the SEAs to
submit documentation to support the policies and procedures described in our interviews with them.

Our review of the monitoring protocol was limited to the subsets of judgmentally selected questions we
described previously and only for the SEAs included in our review. As such, we did not review all of the
monitoring protocol questions or all SEAs with approved ESEA flexibility requests. Therefore, our
results from the protocol reviews are applicable only to the questions and the SEAs included in our
review.

Use of computer-processed data for the audit was limited to documentation provided by the nine SEAs we
reviewed as evidence that they were meeting ESEA flexibility requirements. We used data contained in
these reports to assess the Department’s monitoring controls. As such, we did not assess the reliability of
the computer-processed data.

Our review of internal controls is reflected in our test of the Department’s Part A and Part B monitoring
and our interviews with the nine SEAs regarding their controls for ensuring the accuracy of documents
sent to the Department for monitoring. We concluded that the Department does not have adequate
controls in place to ensure the accuracy of the information SEAs submit during ESEA flexibility
monitoring. In addition, we concluded that the nine SEAs we reviewed followed State-established
policies and procedures to ensure the accuracy of the documentation we selected for review.

13 The protocol had 22 questions, but we did not include 6 in our universe because 5 of the questions were framing questions
that did not appear in the monitoring reports, and the purpose of the sixth question was to provide outreach to the SEAs.

14 For the questions selected for Part B accuracy testing, not all of the questions selected resulted in the SEAs providing
documentation that we could test for accuracy.

We conducted this performance audit in accordance with generally accepted government auditing
standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate
evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We
believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on
our audit objectives.



                                ADMINISTRATIVE MATTERS


Corrective actions proposed (resolution phase) and implemented (closure phase) by your office will be
monitored and tracked through the Department’s Audit Accountability and Resolution Tracking System.
The Department’s policy requires that you develop a final corrective action plan (CAP) for our review in
the automated system within 30 calendar days of the issuance of this report. The CAP should set forth the
specific action items, and targeted completion dates, necessary to implement final corrective actions on
the findings and recommendations contained in this final audit report. An electronic copy of this report
has been provided to your Audit Liaison Officer.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector General is
required to report to Congress twice a year on the audits that remain unresolved after 6 months from the
date of issuance.

Statements that managerial practices need improvements, as well as other conclusions and
recommendations in this report, represent the opinions of the Office of Inspector General. Determinations
of corrective action to be taken will be made by the appropriate Department of Education officials.

In accordance with the Freedom of Information Act (5 U.S.C. §552), reports issued by the Office of
Inspector General are available to members of the press and general public to the extent information
contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given us during this review. If you have any questions, please call
Denise Wempe at (404) 974-9416.

                                             Sincerely,

                                             /s/
                                             Patrick J. Howard
                                             Assistant Inspector General for Audit


Attachments

                 Attachment 1: ESEA Requirements Subject to Flexibility
Determining Adequate Yearly Progress (AYP): SEAs are provided the flexibility to develop new ambitious but achievable annual measurable objectives in reading/language arts and mathematics.

School Improvement Requirements: LEAs are no longer required to identify for improvement, corrective action, or restructuring their Title I schools that fail, for two consecutive years or more, to make AYP. Currently required improvement actions are no longer required.

LEA Improvement Requirements: SEAs are no longer required to identify for improvement or corrective action an LEA that, for two consecutive years or more, fails to make AYP. Currently required improvement actions are no longer required.

Rural LEAs: LEAs are provided the flexibility to use Small, Rural School Achievement Program funds or Rural and Low-Income School Program funds for any authorized purpose, regardless of the LEA’s AYP status.

Schoolwide Programs: LEAs are provided the flexibility to operate a schoolwide program in a Title I school that does not meet the 40 percent poverty threshold, under certain conditions.

Support School Improvement: SEAs are provided the flexibility to allocate 1003(a) funds to an LEA to serve any priority or focus school if the SEA determines such schools are most in need of additional support.

Reward Schools: SEAs are provided the flexibility to use 1117(c)(2)(A) funds to provide financial rewards to any reward school if the SEA determines such schools are most appropriate for financial rewards.

Highly Qualified Teacher Improvement Plans: An LEA that fails to meet Highly Qualified Teacher targets no longer has to develop an improvement plan and has flexibility in how to use Title I and Title II funds.

Transfer of Certain Funds: SEAs and their LEAs have flexibility to transfer up to 100 percent of funds under ESEA section 6123 among those programs and into Title I, Part A.

School Improvement Grants Funds to Support Priority Schools: SEAs have the flexibility to award 1003(g) funds to an LEA to implement one of the School Improvement Grants models in any priority school.

Use of Twenty-First Century Community Learning Centers Program Funds: SEAs have the flexibility to allow community learning centers that receive funds under the Twenty-First Century Community Learning Centers program to use funds to support expanded learning time during the school day and activities during nonschool hours or periods when school is not in session.

Making AYP Determinations: SEAs and LEAs are no longer required to comply with 1116(a)(1)(A)-(B) and 1116(c)(1)(A) requirements to make AYP determinations for LEAs and schools. Instead, SEAs and their LEAs must report on their performance against the annual measurable objectives for 1111(b)(2)(C)(v) identified subgroups and use performance against the annual measurable objectives to support continuous improvement in Title I schools.

Within-District Title I Allocations: LEAs have the flexibility to serve a Title I-eligible high school with a graduation rate below 60 percent for identified priority schools even if that school does not rank sufficiently high to be served based on the school’s poverty rate.

  Attachment 2: Acronyms, Abbreviations, and Short Forms Used in This Report

AYP               Adequate Yearly Progress

CAP               Corrective Action Plan

CSPR              Consolidated State Performance Report

Department        U.S. Department of Education

ESEA              Elementary and Secondary Education Act of 1965, as amended by the
                  No Child Left Behind Act of 2001

GPRA              Government Performance and Results Act of 1993, as amended by the GPRA
                  Modernization Act of 2010

LEA               Local Educational Agency

OESE              Office of Elementary and Secondary Education

OMB               Office of Management and Budget

SASA              Student Achievement and School Accountability

SEA               State Educational Agency

Title I           Title I, Part A of the ESEA


              Attachment 3: Department’s Comments on the Draft Report


TO:           Patrick J. Howard
              Assistant Inspector General for Audit

FROM:         Deborah S. Delisle
              Assistant Secretary for Elementary and Secondary Education

SUBJECT: Draft Audit Report "U.S. Department of Education’s Implementation and Oversight of
              Approved Elementary and Secondary Education Act Flexibility Requests" Control Number
              ED-OIG/A04N0012

The Office of Elementary and Secondary Education (OESE) appreciates the opportunity to provide
written comments on the Draft Audit Report, "U.S. Department of Education’s Implementation and
Oversight of Approved Elementary and Secondary Education Act Flexibility Requests,"
ED-OIG/A04N0012, dated October 9, 2014 (Draft Audit Report).

ESEA flexibility is a relatively new initiative and an evolving process for the U.S. Department of
Education (ED). Accordingly, we appreciate the recommendations included in the draft report and will
integrate them into our continuous improvement process.

OESE's responses and our proposed Corrective Action Plan are detailed below, organized by finding and
recommendation. Any subsequent questions, comments, or concerns should be addressed to:

Deborah S. Delisle
Assistant Secretary
Office of Elementary and Secondary Education
U.S. Department of Education
400 Maryland Avenue, SW, Suite 3W315
Washington, DC 20202

Please note that ED's Office of the General Counsel has reviewed OESE's response and concurs in it.

FINDING NO. 1 - The Department Established and Implemented an Extensive and Effective
Monitoring Process

Recommendation 1.1 - Review State Educational Agencies' (SEAs) A-133 audit reports to identify
any ESEA flexibility issues to follow up on during future ESEA flexibility monitoring efforts.

Comments: OESE agrees with Finding 1 and with Recommendation 1.1.

Proposed Corrective Action: OESE will build into the next monitoring cycle for ESEA flexibility
a requirement that the Program Officer assigned to each State review the SEA’s most-recent A-133
audit to identify any issues potentially related to the implementation of the SEA’s ESEA flexibility
request.

FINDING NO. 2 - The Department Can Improve Oversight to Ensure the Accuracy of the
SEA Information Submitted

Recommendation 2.1 - Include in its monitoring reviews a step to determine how SEAs with
approved ESEA flexibility requests ensure the accuracy of the information they submit to ED for
monitoring so ED can determine the adequacy of their policies and procedures and whether the
SEAs are following them.

Comments: As noted in the OIG draft report, much of the information collected for Part A and
Part B monitoring was evidence of an SEA's approach and early implementation, rather than data
demonstrating progress and accomplishments. The majority of information provided to ED to
support implementation of ESEA flexibility was descriptive in nature. OESE presumes that an
SEA submits accurate data to support all monitoring activities and, as was noted in the OIG draft
report, all nine States reviewed by the OIG followed their respective State policies and procedures
for ensuring the accuracy of the data submitted to ED.

Proposed Corrective Action: OESE will include in its monitoring process a step to determine how
an SEA ensures the accuracy of the data it submits to ED.

In addition to this corrective action, please be aware that, in response to a related internal audit
regarding ED and SEA internal controls over assessment results, OESE will (1) require SEAs to
respond to all flagged comments in the Consolidated State Performance Report (CSPR) and EDFacts
data collections related to academic assessments and accountability and (2) revise the CSPR to
include an annual State certification that the State has in place a system of internal controls for
reviewing assessment data.

Recommendation 2.2 - Require all SEAs to provide certifications that the information they
submit is accurate, reliable, and complete and disclose any limitations of the information, data, or
validation process, especially for information used for GPRA reporting.

Comments: OESE agrees with Recommendation 2.2.

Proposed Corrective Action: In each SEA's request to renew ESEA flexibility, due to ED by the
end of March 2015, each SEA will be required to assure that the SEA will provide to the
Department, in a timely manner, all required reports, data, and evidence regarding its progress in
implementing the plans contained throughout this request, and will ensure that all such reports,
data, and evidence are accurate, reliable, and complete or, if it is aware of issues related to the
accuracy, reliability, or completeness of its reports, data, or evidence, it will disclose those
issues.

Other Matters

   •   At the bottom of page two, the draft OIG report indicates that eight LEAs in California
       submitted ESEA flexibility requests. This is not entirely accurate. The waivers to these
       eight LEAs were not granted under ESEA flexibility; rather, they were considered under a
       separate process that included parallel concepts to ESEA flexibility.

   •   At the top of page three, the draft OIG report indicates that "[f]our other SEAs either did not
       apply or withdrew their requests." This should be amended to include California as an
       SEA that did not apply for ESEA flexibility.

   •   In Table 1 on page three, the draft OIG report indicates that Part C monitoring will occur in
       Fall/Winter of 2014. This was true at the time of the review. However, as a result of
       office and program restructuring, Part C monitoring is no longer planned. OESE will
       continue to do comprehensive monitoring as a part of its overall performance management
       plan.