
The Department's Monitoring of Race to the Top Program Recipient Performance

Published by the Department of Education, Office of Inspector General on 2014-01-03.

                                    UNITED STATES DEPARTMENT OF EDUCATION
                                          OFFICE OF INSPECTOR GENERAL


                                                                                                                  Control Number
                                                                                                               ED-OIG/A19M0003
                                                          January 3, 2014
James H. Shelton
Acting Deputy Secretary
Office of the Deputy Secretary
U.S. Department of Education
400 Maryland Avenue, S.W.
Washington, DC 20202-4300

Dear Mr. Shelton:

This final audit report, titled The Department’s Monitoring of Race to the Top Program
Recipient Performance, presents the results of our audit. The objectives of our audit were to
(1) determine the extent to which Race to the Top (RTT) grantees have (a) adhered to timelines
established in their applications and related scopes of work, and (b) achieved project
performance measures and goals; and (2) evaluate the effectiveness of program oversight to
ensure that funds were used as intended and anticipated recipient performance was achieved in
support of overall programmatic goals.



                                                      BACKGROUND


The RTT program is a U.S. Department of Education (Department) discretionary grant program
authorized under the American Recovery and Reinvestment Act of 2009 (ARRA). It consists of
two separate grant programs: (1) RTT assessment grants, for which $350 million was set aside
to support consortia of States in the development of new assessments aligned to common sets of
standards, and (2) RTT competitive State grants for reform, valued cumulatively at $4 billion,
which are the focus of this audit. The purpose of the RTT program is to
encourage and reward States that are creating the conditions for education innovation and
reform; achieving significant improvement in student outcomes, including making substantial
gains in student achievement, closing achievement gaps, improving high school graduation rates,
and ensuring student preparation for success in college and careers; and implementing ambitious
plans in four core education reform areas:

    •     Adopting standards and assessments that prepare students to succeed in college and the
          workplace and to compete in the global economy;
    •     Building data systems that measure student growth and success, and inform teachers and
          principals about how they can improve instruction;

    •    Recruiting, developing, rewarding, and retaining effective teachers and principals,
         especially where they are needed most; and
    •    Turning around our lowest-achieving schools.

RTT State grants were awarded in three phases. Phase 1 winners, of which there were two, were
announced in March 2010. Phase 2 winners, of which there were 10, were announced in
August 2010. The Department decided on the size of each State's award based on a detailed
review of the budget the State requested, considering such factors as the size of the State, level of
planned local educational agency (LEA) participation, and the proposed activities. In
December 2011, the Department made Phase 3 awards to seven of the nine States that were
identified as finalists but did not receive funding in Phase 2 of the RTT grant competition. These
States received part of $200 million in fiscal year (FY) 2012 RTT funds that were set aside to
allow States that were Phase 2 finalists that did not receive grants an opportunity to implement
parts of their detailed education reform plans.

The table below shows the 19 grant recipients, ordered by the phase in which they received RTT
funding; the associated funding for each State; and the total amount of funding awarded to all
States under the RTT program. We limited our review to a sample of awards made during
Phases 1 and 2 of the RTT award process, as Phase 3 awards were made just 6 months prior to
the start of our audit. 1 Specifically, we selected for review all States that were awarded
$500 million or more in RTT funds or that the Department had designated as high-risk at the
time of our review based on concerns over performance relative to their RTT plans. This
resulted in a judgmental sample of 5 of the 12 (42 percent) Phase 1 and 2 States.




1 Section 14006(c) of the ARRA requires that at least 50 percent of RTT funding to States be subgranted to
participating LEAs according to their relative shares of funding under Part A of Title I of the Elementary and
Secondary Education Act (ESEA), as amended, for the most recent year. States have considerable flexibility in
awarding or allocating the remaining 50 percent of their RTT awards, which are available for State-level activities,
supplemental disbursements to LEAs, and other purposes as the State proposed in its plan. Participating LEAs are
those LEAs that chose to work with the State to implement all or significant portions of the State’s RTT plan, as
specified in each LEA’s memorandum of understanding (MOU) with the State.
                                    Table 1: RTT Grant Awards by Phase

                                        State              Phase      RTT Funding Awarded
                                Delaware                     1          $119,122,128
                                Tennessee                    1          $500,741,220
                                District of Columbia         2           $74,998,962
                                Florida                      2          $700,000,000
                                Georgia                      2          $399,952,650
                                Hawaii                       2           $74,934,761
                                Maryland                     2          $249,999,182
                                Massachusetts                2          $250,000,000
                                New York                     2          $696,646,000
                                North Carolina               2          $399,465,769
                                Ohio                         2          $400,000,000
                                Rhode Island                 2           $75,000,000
                                Arizona                      3           $25,080,554
                                Colorado                     3           $17,946,236
                                Illinois                     3           $42,818,707
                                Kentucky                     3           $17,037,544
                                Louisiana                    3           $17,442,972
                                New Jersey                   3           $37,847,648
                                Pennsylvania                 3           $41,326,299
                                Total Funding                          $4,140,360,632
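
As an illustrative aside (not part of the original report), the award amounts in Table 1 can be totaled programmatically to confirm the reported figure; the values below are copied directly from the table:

```python
# Award amounts copied from Table 1, keyed by State: (phase, dollars awarded).
awards = {
    "Delaware":             (1, 119_122_128),
    "Tennessee":            (1, 500_741_220),
    "District of Columbia": (2, 74_998_962),
    "Florida":              (2, 700_000_000),
    "Georgia":              (2, 399_952_650),
    "Hawaii":               (2, 74_934_761),
    "Maryland":             (2, 249_999_182),
    "Massachusetts":        (2, 250_000_000),
    "New York":             (2, 696_646_000),
    "North Carolina":       (2, 399_465_769),
    "Ohio":                 (2, 400_000_000),
    "Rhode Island":         (2, 75_000_000),
    "Arizona":              (3, 25_080_554),
    "Colorado":             (3, 17_946_236),
    "Illinois":             (3, 42_818_707),
    "Kentucky":             (3, 17_037_544),
    "Louisiana":            (3, 17_442_972),
    "New Jersey":           (3, 37_847_648),
    "Pennsylvania":         (3, 41_326_299),
}

# Overall total and per-phase subtotals.
total = sum(amount for _, amount in awards.values())
by_phase = {p: sum(a for ph, a in awards.values() if ph == p) for p in (1, 2, 3)}
print(f"Total: ${total:,}")  # Total: $4,140,360,632
```

The per-phase subtotals also reflect the award structure described in the text: two Phase 1 grants, ten Phase 2 grants, and seven Phase 3 grants drawn from the $200 million set-aside.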

The Implementation and Support Unit (ISU), in the Office of the Deputy Secretary, administers
the RTT program. In an effort to assist States as they implement what it describes as
“unprecedented and comprehensive reforms to improve student outcomes,” the Department
designed and implemented an RTT program review process that, according to the RTT Program
Review Guide, “not only addresses the Department’s responsibilities for fiscal and programmatic
oversight, but is designed to identify areas in which RTT grantees need assistance and support to
meet their goals.” The review process emphasizes outcomes and the quality of program
implementation by States, in addition to progress in meeting timelines and managing budgets,
and includes ongoing conversations between the Department and States, on-site program
reviews, grantee self-evaluations, and periodic reporting, both by the Department and in the form
of Annual Performance Reports (APR) that are submitted by States. The Department has also
partnered with the Reform Support Network (RSN) 2 to provide extensive individualized and
collective technical assistance to States in order to help them resolve barriers to implementation.

In order to track State progress toward goals established in their RTT applications, the
Department required each State to prepare a Scope of Work (SOW) that was consistent with its
application. SOWs, which are updated approximately twice per year, provide a detailed plan
containing steps that a State must take to achieve overall programmatic goals. These plans are
organized under four main criteria, which mirror the four core education reform areas noted

previously and are also referred to as assurance areas. 3 States also committed to work under a
fifth criterion that requires building strong Statewide capacity to implement, scale up, and sustain
proposed plans. Main criteria are further defined by subcriteria. Subcriteria include State-identified
projects and project goals tied to overall programmatic goals. Projects are further
divided into specific deliverables or milestone activities, which, if completed, should enable the
State to accomplish its project goals and, subsequently, programmatic goals. Each RTT
assurance area contains up to five subcriteria, with a total of 19 subcriteria comprising the RTT
application. 4 We found that subcriteria in States’ plans included, on average, three applicable
projects, and that projects included anywhere from one to sometimes hundreds of specific
deliverables, spanning the life of the grant.

2 The Department awarded a technical assistance contract valued at $43 million in September 2010. The
performance work statement (PWS) requires that the contractor conduct needs assessments and provide
individualized technical assistance to States; gather, share, and use knowledge to support continuous improvement
and scaling of effective practices; and assist States in developing subrecipient monitoring plans and processes – all
under the auspices of the “Reform Support Network.”

See below for an actual example of how State implementation plans were constructed:
[Also see Attachment 1 for a list of all RTT criteria and subcriteria.]

    Criterion/Assurance Area: A. State Success Factors
        Subcriterion: (A)(2) Building strong statewide capacity to implement, scale up,
        and sustain proposed plans
            Project 1 under (A)(2): RTT Performance Management Office
                •   Deliverable 1: Recruit and hire staff.
                •   Deliverable 2: Create RTT website.
                •   Deliverable 3: Develop guidance for LEA MOU submission.
            Project 2 under (A)(2): Office of Curriculum, Instruction, and Field Services,
            Network Teams to Support Implementation
                •   Deliverable 1: Administer baseline surveys on performance evaluation.
                •   Deliverable 2: Summer network team institute (5-day training sessions).
                •   Deliverable 3: Network team trained on State-approved rubrics.
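
The plan hierarchy the report describes (criterion, subcriterion, project, deliverable) can be modeled as a small nested data structure. The sketch below is illustrative only, populated with the (A)(2) example from the text:

```python
from dataclasses import dataclass, field


@dataclass
class Project:
    """A State-identified project with its deliverables or milestone activities."""
    name: str
    deliverables: list[str] = field(default_factory=list)


@dataclass
class Subcriterion:
    """A subcriterion of an RTT assurance area, holding its applicable projects."""
    code: str          # e.g., "(A)(2)"
    description: str
    projects: list[Project] = field(default_factory=list)


# Populated from the (A)(2) example in the report.
a2 = Subcriterion(
    code="(A)(2)",
    description="Building strong statewide capacity to implement, "
                "scale up, and sustain proposed plans",
    projects=[
        Project("RTT Performance Management Office",
                ["Recruit and hire staff.",
                 "Create RTT website.",
                 "Develop guidance for LEA MOU submission."]),
        Project("Office of Curriculum, Instruction, and Field Services, "
                "Network Teams to Support Implementation",
                ["Administer baseline surveys on performance evaluation.",
                 "Summer network team institute (5-day training sessions).",
                 "Network team trained on State-approved rubrics."]),
    ],
)

print(len(a2.projects), sum(len(p.deliverables) for p in a2.projects))  # 2 6
```

This mirrors the report's observation that subcriteria averaged about three applicable projects, while the deliverable count per project varied widely.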


As part of their application, States also provided baseline data and established annual as well as
4-year performance measure targets under certain student outcome categories, including student
achievement on State and Federal assessments, progress in closing achievement gaps, and
graduation rates and other postsecondary data. These targets were generally not required as part
of the RTT application, although States were required to provide targets in some areas, such as
teacher and leader effectiveness and turning around persistently low-achieving schools. For the
purposes of our audit, we reviewed performance measure baseline data, annual targets, and
actual data for a judgmental sample of student outcome measures tied to the RTT program
principles.

3 The four assurance areas are: (1) Standards and Assessments, (2) Data Systems to Support Instruction, (3) Great
Teachers and Leaders, and (4) Turning Around the Lowest-Achieving Schools. States also provided plans under a
“State Success Factors” criterion, and in areas such as emphasizing science, technology, engineering, and
mathematics (STEM) and ensuring successful conditions for high-performing charters and other innovative schools.
4 Not all RTT subcriteria may have been applicable to each State’s implementation plan. For example, a State may
have addressed the subcriterion as a reform condition (i.e., any preexisting laws, regulations, or policies favorable to
education reform or innovation), but not proposed any associated projects. Alternatively, a State’s plan might
include a project that is relevant to the subcriterion, but connected primarily to and discussed within the context of a
different subcriterion.
The Department has prepared an amendment submission process along with guidelines for States
in the event that adjustments are needed to a State’s approved SOW, timelines, or performance
measure targets. Amendment approvals and rejections are sent to States in letter form and made
available for public review on the Department's RTT website. The Department has also
identified enforcement actions if it is determined that a State is not meeting its goals, activities,
timelines, budget, or annual targets, or is not fulfilling other applicable requirements.



                                             AUDIT RESULTS


We found that the five RTT States that we reviewed as part of this audit have had varying
degrees of success in adhering to timelines established in their applications and related SOWs
and in achieving performance measures and goals. We also concluded that, while the
Department’s oversight of the RTT program has been robust, the Department could undertake
additional analysis of overall program implementation and could potentially benefit from
enhancements to its project management process.

With regard to objective one, we determined that all States in our sample experienced delays in
at least 45 percent of their RTT projects as of the States’ first progress reports. The percentage
of delayed projects ranged from 45 percent to 92 percent, with two States experiencing delays in
over 80 percent of their respective projects. 5 We found that each State improved its adherence to
timelines in Year 2 of the grant, with the percentage of delayed projects as of the States’ second
progress reports ranging from 13 percent to 54 percent. We found that the majority of delays or
changes to timelines for deliverables were 1 year or longer in length. In some cases, the length
of these project and deliverable delays currently reaches up to 3 years. However, we could not in
all cases determine the exact length of project delays because documentation reviewed did not
always provide details about the delays. Regarding performance measures, we found that results
varied in terms of States’ success in achieving annual Year 1 and Year 2 targets for the student
outcome measures included in our sample. However, we noted that in many cases, the trend
from baseline to Year 2 actual data was positive, regardless of whether or not performance
measure targets were met. 6

Our review of documentation maintained by the Department identified a number of common
causes for project timeline delays and potential obstacles to the achievement of performance
goals across the States in our sample. These included changes in high-level State leadership;
staffing and other organizational challenges at State education agencies (SEA); lengthy State-level
contract deliberations and/or vendor quality concerns; RTT project and other Federal and
State program interdependencies; and stakeholder support issues, particularly with regard to
teacher and leader evaluation systems. Overall, we noted that States faced significant challenges
in the area of project management early in the grant period, which likely resulted from having to
adjust to administering a grant unique in magnitude and complexity, with such far-reaching
reform goals and so many moving parts.

5 Conclusions on project status as of the first progress reports are based on events that occurred between the start of
the grant and the first progress report, while conclusions on project status as of the second progress reports are based
on events that occurred between the first and second progress reports.
6 During our audit, we also identified a few instances in which annual targets were set lower than the baseline, often
as a result of changes in State or Federal policy. For example, the requirement that all States use a new, uniform
graduation rate calculation led at least one State to conclude that graduation rates would likely be lower than
reported under its previous methodology. Targets were thus set accordingly. Other States had already implemented,
or were planning to implement, more rigorous student assessments, and concluded that scores would probably
decrease. In some cases, this led them to set lower – and what they viewed as more realistic – proficiency targets.
Another option for States was to reset “cut scores,” which are discussed in further detail in footnote 21.

As a result, many States were still in the planning phase for several reform areas when
implementation activities were supposed to be taking place. We noted that it is too early in the
grant period to conclude whether the timeline delays States experienced will affect the chances
of successful outcomes for grant projects and goals. We also found no specific evidence to
suggest that States with delayed timelines will not complete projects or will miss goals, and
Department officials maintain that planned outcomes can still be achieved. However, these
officials did acknowledge that, in a few cases, the possibility exists that States may not complete
all of their reform work within the grant period, or be able to meet all of the commitments
outlined in their applications and SOWs. As States progress further into the grant period, there is
increasing risk that projects with delayed or compressed timelines will not be completed on time
or will be implemented with poor quality, and that goals may become unattainable. Further, if
States miss annual targets for performance measures or experience negative performance trends,
the chances are increased that they also may miss their 4-year goals.

With regard to objective two, we found that the Department established and implemented an
extensive and effective process for monitoring RTT program recipients. Specifically, we noted
that the Department’s oversight mechanisms provide reasonable assurance that funds are being
used as intended, in accordance with all applicable criteria, and that either anticipated recipient
performance is achieved or actions are taken to improve States’ chances of success. However,
we determined that the Department has not yet issued a Comprehensive RTT Annual Report—
an overview of RTT efforts across all grantees, to include trends and statistics across all States,
successes and accomplishments, common challenges, and lessons learned, as discussed in its
RTT Program Review Guide. Currently, the Department considers this report to be a repository
of all of the State-specific Summary Reports for a given year. The State-specific Summary
Reports are annual assessments of each State’s RTT implementation, as judged against
individual State RTT plans, at a given point in time. Although individual States’ successes,
challenges, and lessons learned are made publicly available, the Department does not have any
mechanism by which it provides information on trends and statistics across States.

As a result of its effective monitoring process, the Department has been able to readily identify
issues related to the timeliness and quality of States’ implementation of their RTT plans. We
determined that the Department's monitoring process provides a comprehensive body of
evidence that allows the Department to be acutely aware of States’ plans and efforts and to assess
States’ progress toward meeting programmatic goals. The Department’s monitoring process also
enables it to identify areas where States might need additional attention, support them in
resolving any implementation issues, and hold them accountable by taking certain enforcement
actions in the event that improvements are slow to materialize or if the State is found to be in
noncompliance.
Department officials stated that, because it is relatively early in the grant period and States are in
different stages of implementation, releasing the Comprehensive RTT Annual Report may lead
to incorrect interpretations regarding a State’s progress or lack thereof. However, by not
producing a Comprehensive RTT Annual Report that highlights shared accomplishments and
common challenges, qualified however necessary, the Department is missing the opportunity to
provide valuable information, increase transparency, and offer stakeholders greater insight into
the RTT program, to include lessons learned from implementation.

Lastly, we noted that the Department may benefit from enhancing its process for maintaining
project management information. We found that the Department collects and maintains for its
use all of the information that it needs to effectively monitor States, in compliance with
applicable recordkeeping requirements, but does not maintain the information in the most readily
accessible and useful manner. A more formal and systematic way of maintaining project
management information could lead to more efficient decision-making. This is particularly true
for large-scale, evolving grants that contain many interconnected components. Such a system
would also ensure that significant project activities, amendments, and management decisions are
readily identifiable, which is particularly important in the event of any staff transitions or
turnovers of program officers, and might enhance the ability of the Department to summarize
trends and statistics across States.

In its response to the draft audit report, the Department stated that it generally concurred with our
finding on States’ timeliness and progress in achieving outcomes and noted its appreciation of
our recognition of its extensive and effective processes for monitoring RTT recipients. It agreed
with our recommendation that it continue to maintain its robust monitoring efforts. However, the
Department did not concur with our recommendation that it produce a Comprehensive RTT
Annual Report. The Department instead reiterated its commitment to creating what it describes
as an unprecedented level of transparency and stated that, at this point in time, it would be
imprudent to make broad comparisons across States in a report that may lead to incorrect
interpretations regarding a State’s progress, or lack thereof.

The Department’s comments are summarized at the end of each applicable finding. The full text
of the Department’s response is included as Attachment 3 to this report. No changes were made
to the report as a result of the response.

FINDING NO. 1 – Timeliness and Progress in Achieving Outcomes Have Varied
                Across RTT States

We found that the five RTT States that we reviewed as part of this audit have had varying
degrees of success in adhering to timelines established in their applications and related SOWs
and in achieving performance measures and goals. Specifically, we noted that in some cases,
certain activities and deliverables within projects were delayed, while in other cases, entire
projects were delayed. These delays ranged from months to years and their overall effect on
States’ plans varied. At times, multiple State projects within one or more of the four education
reform areas were delayed. We also noted that States’ adherence to timelines generally
improved in Year 2 of the grant, although some projects continued to experience significant
delays. Regarding performance measures, we found that results varied in terms of States’
success in achieving annual Year 1 and Year 2 targets. However, we noted that in many cases,
the trend from baseline to Year 2 actual data was positive, regardless of whether or not
performance measure targets were met. 7

Project Timeliness

Extent of Delays

We determined that all States in our sample experienced delays in at least 45 percent of their
RTT projects as of the States’ first progress reports. The percentage of delayed projects as of the
States’ first progress reports ranged from 45 percent to 92 percent, with two States experiencing
delays in over 80 percent of their respective projects. 8 We found that each State improved its
adherence to timelines in Year 2 of the grant, with the percentage of delayed projects as of the
States’ second progress reports ranging from 13 percent to 54 percent. For example, one State
had all projects in 5 of 10 applicable subcriteria on track as of its Year 2 progress report, a
significant improvement over Year 1, when delays were noted in projects in all subcriteria.
Another State had all projects in 7 of 11 applicable subcriteria on track as of its Year 2 progress
report. Just 1 year earlier, the State was identified as having experienced delays in eight
subcriteria. We further determined that the States in our sample experienced delays in 13 percent
to 54 percent of all projects for which a determination on status could be made as of both the first
and second progress reports.

Table 2 provides the number and percentage of projects that we determined to have experienced
delays as of the first and second progress reports for each State in our sample:

                                             Table 2: Project Delays

                        Number of Projects               Number of Projects               Number of Projects
       State                Delayed as of                  Delayed as of                     Delayed as of
                       First Progress Report 9        Second Progress Report 10         Both Progress Reports 11
         A                   34/54 (63%)                    11/56 (20%)                        9/53 (17%)
         B                   29/36 (81%)                    12/29 (41%)                       11/29 (38%)
         C                   24/26 (92%)                    14/26 (54%)                       14/26 (54%)
         D                   18/28 (64%)                    16/38 (42%)                       14/29 (48%)
         E                   15/33 (45%)                     4/31 (13%)                        4/30 (13%)


7 See footnote 6.
8 See footnote 5.
9 The projects included in the columns titled “Number of Projects Delayed as of First Progress Report” and
“Number of Projects Delayed as of Second Progress Report” were identified as follows: (1) the project was listed
and/or discussed in the applicable State progress report, or (2) the project was identified in an amendment approval
letter sent during the period covered by the applicable progress report.
10 We noted that the number of applicable projects reported under each subcriterion was not always the same from
the first progress report to the second progress report due to approved changes in State plans. For example, in some
cases, we found that the discrepancy was due to the division of one Year 1 project into multiple projects in the
Year 2 progress report. In other cases, two or more Year 1 projects were consolidated during Year 2.
11 The projects included in this column were identifiable in both the Year 1 and Year 2 progress reports, or in related
amendment approval letters. We noted that it was not always the case that the identical deliverables or activities
within a project were delayed in Year 1 and Year 2, but the project itself continued to experience delays.
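
As an illustrative aside (not part of the original report), the rounded percentages reported in Table 2 follow directly from the raw delayed/total counts:

```python
# (delayed, total) project counts copied from Table 2 for each sampled State,
# in column order: first progress report, second progress report, both reports.
counts = {
    "A": [(34, 54), (11, 56), (9, 53)],
    "B": [(29, 36), (12, 29), (11, 29)],
    "C": [(24, 26), (14, 26), (14, 26)],
    "D": [(18, 28), (16, 38), (14, 29)],
    "E": [(15, 33), (4, 31), (4, 30)],
}

# Percentage of delayed projects, rounded to the nearest whole percent.
percentages = {
    state: [round(100 * delayed / total) for delayed, total in cols]
    for state, cols in counts.items()
}
print(percentages["C"])  # [92, 54, 54]
```

Note that the denominators shift between columns because, as footnote 10 explains, approved amendments sometimes split or consolidated projects between the first and second progress reports.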
We based our analysis of States’ success in adhering to timelines established in their applications
and SOWs on a review of information maintained in applicable Department grant files, to include
amendment requests and approvals, monthly progress updates and APRs submitted by the States,
on-site review documentation, and progress reports and annual State-specific Summary Reports
prepared by the Department – all of which are discussed in further detail under Finding No. 2.
We determined that the best method for the Office of Inspector General (OIG) to assess progress
on a project-level basis was to refer primarily to information included in the progress reports and
then to take into account relevant information included in the other sources of evidence noted
above.

Progress reports, of which the Department had issued two for each State at the time of our
review, are dynamic documents that summarize a State’s outcomes to date, progress in meeting
benchmarks and timelines, features and characteristics of implementation, and next steps for the
State and the Department. 12 These documents are highly detailed and are not released publicly,
but do provide the basis for the Department’s annual State-specific Summary Reports, which are
posted on the Department’s website and highlight successes and accomplishments as of a given
point in time, identify challenges, and also provide lessons learned.

For the purposes of this finding, we considered a project to have experienced delays under the
following circumstances: (1) the entire project or deliverables and activities within the project
were specifically identified as being delayed in a progress report prepared by the Department, or
(2) a Department-approved amendment pushed back timelines for the entire project or any of its
deliverables and activities. 13 The status of projects as of the second progress report was kept as
current as possible by including recent amendment information along with information from the
Year 2 State-specific Summary Reports and from meetings with Department officials.

Length of Delays

We found, based on the amendments reviewed for the sample of States we selected, that the
majority of delays or changes to timelines for deliverables were 1 year or longer. In some cases,
these project and deliverable delays currently reach up to 3 years. However, we
could not in all cases determine the exact length of project delays from progress report and
amendment information because these documents did not always provide details about these
delays. Delays were also not always explicitly documented in other monitoring documentation
that we reviewed. We noted that the lengths of delays might sometimes be unknown or not yet
fully realized due to continuing issues with projects that remained unresolved as of the writing of
the Department’s progress reports.

12 The Department issued final versions of States’ first progress reports in January and February 2012, based on
information it obtained from States throughout 2011. The Department issued final versions of second progress
reports between April and December 2012, based on information it obtained from States throughout 2012. Although
the first and second progress reports did not always correlate directly with Year 1 or Year 2 of the grant, there was
considerable overlap. As a result, we use the terms first and second progress report interchangeably with the terms
Year 1 and Year 2 report throughout.
13 In many cases, projects with Department-approved amendments were also identified in the progress reports as
having experienced delays – often with a reference to the applicable amendment(s) and information on the lengths of
any such delays. In some cases, whether or not a specific project experienced delays and for how long was not
entirely clear based on information contained in the progress reports alone. As a result, we also took into account
relevant information contained in Department-approved amendments.

Common Challenges

We noted that, based on the documents described above, all five of the States in our sample
encountered delays in each of the four education assurance areas, and also in staffing program
management and oversight offices and instituting accountability systems and processes – what
the Department refers to as building Statewide capacity to implement and sustain their RTT
plans. A number of these delays were noted as having continued from Year 1 into Year 2 of the
RTT program. This was particularly true for projects within the following subcriteria, with each
noted as having at least three States experiencing persistent delays:

     •   Supporting the transition to enhanced standards and high-quality assessments;
     •   Accessing and using State data;
     •   Using data to improve instruction;
     •   Improving teacher and principal effectiveness based on performance; and
     •   Providing effective support to teachers and principals.

With regard to capacity, we noted that all States, at least initially, faced significant organizational
challenges, from restructuring to staffing to establishing processes for implementation and
oversight. Two States also encountered issues with project evaluations and in developing
appropriate management tools for LEAs.

Across States, and within the subcriteria listed above, we noted certain common challenges.
Four of the five States that we reviewed faced continuing delays in implementing activities
related to the Common Core State Standards (CCSS) and transitioning to new student
assessments. One State, for example, has been delayed in developing an interim assessment item
bank and test platform and common core mathematics formative assessment, while another State
has been delayed in developing benchmark assessments and providing training to educators.
Four of the five States have also faced continuing delays with regard to the use of data systems to
improve instruction. One State’s longitudinal data system and related projects have had
timelines extended from the State’s initial SOW and another State has encountered delays
pertaining to the development and roll-out of its education data portal, instructional reporting and
improvement system, and related applications.

Most States have also experienced continuing delays in projects concerning teacher and leader
effectiveness, such as the development and implementation of educator evaluation and
performance-based compensation systems and induction and professional development activities.
The Department identified two States, in particular, as having encountered significant challenges
in these areas and subsequently placed them on high-risk status, as discussed in further detail in
Finding No. 2. One State has faced continuing delays in creating leadership academies for
school principals and a model teacher induction program, and another State has dealt with
similar issues related to its school leadership program.
Common Accomplishments

Despite the common challenges noted above, we found in our review of the progress reports and
other related documentation that there were several areas, in particular, where a number of States
in our sample made significant progress from Year 1 to Year 2 as evidenced by the relative lack
of projects with persistent delays. One area in which most States in our sample appeared to have
largely resolved issues that they experienced in Year 1 was in building Statewide capacity to
implement and sustain their RTT plans. We noted that States faced significant organizational
challenges early in the grant period, but have generally been successful in implementing
processes and procedures related to administration and oversight in Year 2. For example, one
State struggled to define roles and responsibilities and fully staff its project management office in
Year 1 and was also delayed in providing a final subrecipient monitoring plan for Department
approval. By Year 2, however, the State had hired several experienced staff, enhanced internal
and external communication and support processes, and created a data collection protocol and
other ongoing monitoring procedures. Another State struggled with implementing structures to
support LEAs and hold them accountable for results achieved in Year 1, in addition to facing
high levels of turnover of staff working on the State’s RTT projects. In Year 2, the Department
noted that oversight and evaluation activities were functioning as intended and commended the
State’s efforts to build capacity. In both of these States, the number of delayed projects
associated with building Statewide capacity to implement and sustain their RTT plans dropped to
zero.

We noted that a number of States in our sample also appeared to have experienced successes in
Year 2 within the following subcriteria:

   •   Ensuring equitable distribution of effective teachers and principals;
   •   Improving the effectiveness of teacher and principal preparation programs; and
   •   Turning around the lowest-achieving schools.

We found that States made major improvements between Year 1 and Year 2 in implementing
projects related to ensuring the equitable distribution of effective teachers and principals and
improving the effectiveness of teacher and principal preparation programs. Overall, the five
States in our sample experienced decreases of 60 and 40 percentage points, respectively, in the
share of projects delayed within these two subcriteria. One State went from having 63 percent
of its projects in these subcriteria delayed in Year 1 to having no delays in Year 2. In particular,
this State was able to mitigate procurement delays that it faced during the initial year of the grant
and launch teacher and principal preparation programs and recruitment programs for minority
teachers. Another State went from having 50 percent of its projects in these subcriteria delayed
in Year 1 to having no delays in Year 2. Communication challenges related to implementation
of a Statewide evaluation system and delays in executing grants and contracts for a professional
education institute and school leader supply and demand study were largely overcome in Year 2,
and enhanced report cards for all teacher preparation programs in the State were publicly
released. A third State struggled with projects related to improving the effectiveness of teacher
and principal preparation programs, but was successful in meeting timelines under the equitable
distribution subcriterion, largely because it implemented various competitive grant awards and
partnered with nonprofit entities.
States also appear to have made considerable progress in projects associated with turning around
their lowest-achieving schools, as exhibited by the nearly 40-percentage point decrease in delays
across the five States in our sample. One State faced delays in 89 percent of its Year 1 projects
in this area, but managed to reduce this number to 33 percent in Year 2 by executing contracts to
build LEA leaders’ capacity to support low-performing schools in rural areas and working with
community partners to support efforts related to family literacy, among other things. Two other
States encountered issues in Year 1 pertaining to governance, staffing, and stakeholder support
within specially designated entities for school turnaround, but have since been successful in
implementing a number of projects. Among these projects are extended learning time, school
participation in the pilot of a teacher evaluation system, establishment of offices and systems
related to managing districts of low-performing schools, and charter school contracting.

Performance Measures and Goals

We found that the States included in our review have had varying degrees of success in
achieving targets established for performance measures. We also noted mixed results with
regard to whether actual results have trended positively or negatively over established baselines.

We based our analysis of States’ success in achieving performance measures and goals on
information contained primarily in Year 1 and Year 2 APRs. These documents are submitted by
States and contain information on outcomes to date, performance against measures established in
applications, and other relevant data. According to Department officials, the performance
measures that States included in their applications are leading indicators of their success towards
improving student outcomes. Therefore, the APR is one mechanism for holding States
accountable for meeting any established targets or making significant progress towards them.

The RTT Notices Inviting Applications (NIA) for Phases 1 and 2 required States to establish
performance measure targets only for specified subcriteria, not as a condition of eligibility or as
an absolute priority. These included targets under (D)(2) improving teacher and
principal effectiveness based on performance, (D)(3) ensuring equitable distribution of effective
teachers and principals, (D)(4) improving the effectiveness of teacher and principal preparation
programs, and (E)(2) turning around the lowest-achieving schools. We noted that a number of
States nevertheless established targets in other RTT assurance areas, as well as targets under
optional performance measures of their own choosing. The Department monitors States’ success
in achieving such targets.

We noted limited State data in APRs concerning the required performance measures from the
RTT application. Generally, States did not plan to have projects involving these measures fully
implemented until after Year 2 of the grant. As such, many Year 1 and Year 2 measures in these
sections of the APR are marked as not applicable. We also found that States indicated that they
would not be able to calculate or publish certain measures as planned in Year 1 and Year 2 due
to delays in implementation of certain projects within these reform areas.

In light of the above, we decided to focus on a judgmental sample of student outcome measures
tied to the RTT program principles and to determine not only whether a performance measure
had met its annual target, but also whether it showed progress from baseline data. We reasoned
that this information, taken together, would provide a more comprehensive and accurate picture
as to whether RTT implementation results are trending positively or negatively. 14 Among the
student outcome measures we reviewed were results on annual State assessments and the
biennial, Federally administered National Assessment of Educational Progress (NAEP); changes
in the achievement gap between certain subgroups; and changes in high school graduation,
college enrollment, and college course completion rates.

A summary of our review is presented in Table 3 below and the related narrative that follows.




14 The RTT principles entail accountability for adhering to promises made in State plans across all four education
assurance areas; ensuring fiscal responsibility and appropriate use of funds; meeting performance measure targets or
making significant progress towards them; and, ultimately, achieving increases in student outcomes.

                    Table 3: States’ Success in Achieving Selected Performance Measures

Measure | States that Set and Achieved Target in Year 1 15 | States that Set and Achieved Target in Year 2 (Preliminary) 16 | States with Actual Data that Show a Positive Trend With or Without Set Targets 17

State Assessment Results
   • English Language Arts (ELA) All Students | 0 of 1 | 0 of 1 | 2 of 5
   • ELA Grade 4 | 0 of 1 | 0 of 1 | 3 of 5
   • ELA Grade 8 | 1 of 2 | 1 of 2 | 2 of 4
   • Math All Students | 0 of 1 | 0 of 1 | 2 of 5
   • Math Grade 4 | 1 of 1 | 1 of 1 | 3 of 5
   • Math Grade 8 | 2 of 2 | 1 of 2 | 2 of 5
NAEP Results
   • Reading Grade 4 | 1 of 4 | N/A 18 | N/A
   • Reading Grade 8 | 1 of 4 | N/A | N/A
   • Math Grade 4 | 1 of 4 | N/A | N/A
   • Math Grade 8 | 0 of 4 | N/A | N/A
Closing Achievement Gaps
   • ELA White/Black (All Students) | 0 of 0 | 0 of 0 | 2 of 5
   • ELA Not Low Income/Low Income (All Students) | 0 of 1 | 0 of 1 | 2 of 5
   • Math White/Black (All Students) | 0 of 0 | 0 of 0 | 2 of 5
   • Math Not Low Income/Low Income (All Students) | 0 of 1 | 0 of 1 | 2 of 5
Graduation Rates and Postsecondary Data
   • High School Graduation Rate (All Students) | 4 of 5 | 3 of 5 | 2 of 4
   • College Enrollment Rate (All Students) | 3 of 5 | 2 of 3 | 1 of 3
   • College Course Completion Rate (All Students) | 1 of 3 | 1 of 2 | 2 of 2




15 States were only included in the counts for Year 1 and Year 2 if they established targets in their applications and
SOWs, or through subsequent amendments, and provided actual data in their APRs. In some cases, this led to a
decrease in the number of States reported for a particular performance measure in Year 1 and Year 2. In cases
where actual data were unavailable, we noted that States provided explanations for the delay in reporting to the
Department.
16 Year 1 APR data were finalized, but Year 2 data are still subject to change. We found that the changes from
preliminary to final data in the Year 1 APR were insignificant and that this was likely to be the same for Year 2 data.
17 We considered a performance measure trend not applicable (N/A) if it did not have baseline or Year 2 data. We
also did not include States in performance measure trend counts if there was no change in actual data from the
baseline to Year 2.
18 The NAEP was most recently administered in school year (SY) 2010-2011, or Year 1 of the RTT grant. State
results on the 2012-2013 NAEP will be provided in the Year 3 APR.

We found that results varied in terms of States’ success in achieving annual Year 1 and Year 2
targets. However, we noted that in many cases, the trend from baseline to Year 2 actual data was
positive, regardless of whether or not performance measure targets were met. 19

We noted that the Department allowed States to amend annual performance measure targets, and
that not all approved amendments decreased targets. We found that the Department allowed
amendments to performance measure targets for the following reasons: (1) there was a lack of
data generated for the measure as a result of delays in implementation; (2) States clarified their
reasons for established targets or targets were not initially calculated in alignment with the
definitions provided in the RTT NIA; (3) new, more accurate baseline data were collected or
issues were found in previously reported baseline data; (4) States wanted to amend targets to
align with those found in other grant programs, including ESEA flexibility; 20 and (5) States
implemented more rigorous assessments after application targets were established and reset cut
scores 21 to provide a more accurate picture of student achievement.

The RTT NIAs for Phases 1 and 2 state that each recipient is accountable for meeting the goals,
timelines, budget, and annual targets established in its application; adhering to an annual fund
drawdown schedule that is tied to meeting these goals, timelines, budget, and annual targets; and
fulfilling and maintaining all other conditions for the conduct of the project. We noted, however,
that the Department has allowed States to request revisions to their RTT projects, provided that
the following conditions are met:

     •   Such revisions do not result in the grantee’s failure to comply with the terms and
         conditions of the award and the program’s statutory and regulatory provisions;
     •   The revisions do not change the overall scope and objectives of the approved proposal;
         and
     •   The Department and the grantee mutually agree in writing to such revisions.

Further, according to the Department’s grant amendment submission process guidelines, a State
must justify any revisions to activities in its approved RTT plan that substantially diverge from
what was proposed in its initial plan and must provide compelling evidence of how such a
change will help it meet its performance measures and achieve increases in student outcomes.

Causes of Timeline Delays and Potential Obstacles to Achieving Performance Goals

Our review of progress reports and other related documentation identified a number of common
causes for project timeline delays and potential obstacles to the achievement of performance
goals across the States in our sample. These included changes in high-level State leadership;
staffing and other organizational challenges at SEAs; lengthy State-level contract deliberations
and/or vendor quality concerns; RTT project and other Federal and State program
interdependencies; and stakeholder support issues, particularly with regard to teacher and leader
evaluation systems. Overall, we noted that States faced significant challenges in the area of
project management early in the grant period, which likely resulted from having to adjust to
administering a grant unique in magnitude and complexity, with such far-reaching reform goals,
and so many moving parts. Additionally, we noted that success in achieving some performance
measure targets may be affected by required changes to methodologies for rate calculations or
implementation of more rigorous student assessments. 22

19 See footnote 6 on page 5.
20 On September 23, 2011, the Department offered each interested SEA the opportunity to request flexibility on
behalf of itself, its LEAs, and its schools regarding specific requirements of the ESEA, as amended. States whose
requests were approved were offered this flexibility in exchange for committing to certain education principles,
which the Department identified in its policy statement on “ESEA Flexibility,” last updated on June 7, 2012.
21 Cut scores are intended to denote proficiency on an assessment (i.e., the pass/fail divide). If a State implements
more rigorous assessments, but maintains the same cut scores that existed before, it will generally be the case that
fewer students will be deemed proficient. As a result, States have proposed resetting cut scores to more accurately
reflect their definition of proficiency.

Changes in State Leadership

Leadership turnover was cited in the Department’s progress reports as contributing to delays in a
number of States in our sample. We noted that all five States changed governors in either
December 2010 or January 2011, after RTT grants were awarded, and that four of the five also
have a different Chief State School Officer than the one that they had at the time of their
application for RTT funding. We learned that one State faced delays in its commissioner’s
leadership academy project due to a leadership transition within the State department of
education, and that the transition to a new governor and State superintendent of education shortly
after another State was awarded its RTT grant resulted in the need for the State to revise most of
its timeframes. Another State’s transition from an elected to appointed board of education led to
a delay in approval of the college- and career-ready diploma, among other things.

State Capacity

All of the States in our sample faced staffing and other organizational challenges, which in turn
resulted in implementation delays throughout their RTT plans. One State, for example, initially
planned to build an electronic platform for its teacher and leader evaluation system in-house, but
later hired an external contractor to build the system because of competing demands on its
information technology staff. LEAs in this State also reported that the State did not provide, in a
timely manner, some resources pertaining to State assessments, which prevented the LEAs from creating
instructional materials on schedule. Another State struggled to develop project management
processes and build the necessary systems and capacity to support such a large grant. As a
result, the State faced challenges coordinating and collaborating with its large number of
participating LEAs and all of the various stakeholders involved. This State was also delayed in
its STEM projects because of limited staff capacity and the need to prioritize other work.

Contracting

All of the States in our sample encountered significant challenges with regard to contracts.
These included difficulties navigating through complex legal and regulatory requirements,
technical systems issues, and vendor quality concerns. The overwhelming majority of one
State’s State-level RTT funds are in the contractual category, making contracts vital to grant
implementation and delays in executing contracts a significant concern. This State experienced
delays in issuing many of its contracts in Year 1 because it was not satisfied with initial
contractor proposals and took time to evaluate its options. This was the case, in particular, for
projects related to implementation of the CCSS; teacher and principal preparation programs, with
an emphasis on STEM instruction; and professional development activities geared toward
educators in persistently low-achieving schools. This State has also not always been satisfied
with the quality of deliverables received, which, although a justifiable concern, has led to delays
in associated activities. Delays in issuing requests for proposals (RFP) for another State’s
benchmark assessment project were due, in part, to the conversion of the State's contracts
database. This State also experienced delays in selecting a vendor to work with to build its
value-added/growth model, and faced continuing delays with its STEM work due to repeated
requests from its legal department for clarification on deliverables.

22 See also footnote 6 on page 5.

RFPs in two States were delayed due to difficulties in clarifying resource needs and technical
requirements. One of these States also experienced issues with its financial systems and was
further delayed in executing contracts due to its lengthy procurement process, the large number
of RTT projects that relied on external assistance, and the lack of initial high-quality responses
from interested vendors. In another State, an RFP pertaining to the State’s school leadership
program was delayed as a result of the internal review process taking longer than expected.
Contract negotiations also caused delays.

Program and Project Interdependencies

During discussions with Department officials, and through our review of monitoring
documentation, we determined that States did not always consider interdependencies between
RTT projects and other State and Federal programs, and the effect that delays in one area might
have on their ability to perform work in another area. For example, one State’s reliance on
teacher evaluation data that were not available until Summer 2012 led to delays in many of its
Year 2 activities related to teacher and leader effectiveness. The State also needed to align its
RTT and State Longitudinal Data Systems (SLDS) grant work, another Department grant
program, so as to prevent any inconsistencies in implementation. Another State decided to delay
issuing an RFP for its benchmark assessments project because it wanted to avoid duplicating the
efforts of the Partnership for Assessment of Readiness for College and Careers, a consortium of
States that is working to develop assessments aligned to the CCSS. This State also experienced
challenges collecting data from LEAs that participated in a pilot of its teacher and leader
evaluation system, which then delayed its analysis of pilot results. This analysis was necessary
prior to making decisions about the design of the system, training materials, and roll-out for the
next school year.

Another State faced delays in the development of administrative rules, which resulted in a delay
in the development of a new residency-based alternative certification program since these rules
were needed to define program parameters and requirements. This State also encountered delays
due to a change in its strategy related to interim, end-of-course, and formative assessments at the
end of Year 1 out of concern for duplicating work that was already planned to be conducted.
Two States faced delays in projects associated with their strategies for turning around the lowest-
achieving schools due to the need to align the work with their approved ESEA flexibility
requests.
Stakeholder Support

Ongoing delays in finalizing a master contract and supplemental agreement with the State
teachers association affected one State’s ability to pursue activities related to providing
incentives to ensure the equitable distribution of teachers and leaders, planning for compensation
based on educator effectiveness, and incentives and compensation for teachers in high-demand
fields. The State was also delayed in piloting its teacher evaluation system. Another State
experienced similar delays because of a lawsuit that its teachers union filed challenging
implementation of the State’s evaluation systems. This has, in turn, led to stakeholder
communication issues.

Overall Effect on RTT Grant Program

The RTT program is a new type of grant, unique in magnitude and complexity. States planned
for sweeping Statewide reforms using many interconnected components that have the potential to
make or break success. Executing the planned work presented challenges that required States to
learn the resource management techniques necessary for this type of grant and to make
appropriate adjustments to how those resources are deployed.

As a result of initial capacity issues and other challenges, many States were still in the planning
phase for several reform areas when implementation activities were already supposed to be
taking place. The Department noted that, because of the delays faced, some projects within
State plans are now working on very aggressive timelines. To compensate for falling behind on
timelines, States need to increase resources and attention to those particular projects or pieces of
projects, or change the strategy for how project goals will be accomplished within the 4-year
grant period.

We noted that it is too early in the grant period to conclude whether the timeline delays
experienced by States will affect the chances of successful outcomes for grant projects and goals.
We also found no specific evidence to suggest that States with delayed timelines will not
complete projects or will miss goals. Department officials maintain that planned outcomes
remain achievable at this point in time, and emphasized that the amendment review process, as
discussed further under Finding No. 2, is designed to ensure that the impacts of any changes are
fully understood and mitigated to the extent possible. However, these officials did acknowledge
that, in a few cases, the possibility exists that States may not complete all of their reform work
within the grant period, or be able to meet all of the commitments outlined in their applications
and SOWs. In that regard, in March 2013, the Department issued guidance to grantees on
amendment requests for no cost extensions. The purpose of a no cost extension is to provide
additional time beyond the final project year for a grantee to accomplish the goals and
deliverables it committed to in its RTT application and SOW. The guidance states that, similar
to any other amendment, the Department will consider each request on a case-by-case basis in
the context of a State’s plan and SOW (i.e., implications to the project(s), scope and overall
objective(s), and outcomes for which the amendment is being sought). The Department will also
consider a State’s status and standing with the Department, to include whether the State has been
placed on high-risk status at any point during the grant period, SOW implementation progress,
and performance in other related Federal grant programs.
Final Audit Report
ED-OIG/A19M0003                                                                       Page 19 of 36
Project timelines are becoming more important as States progress further into the grant period.
Although we agree that the ability to make changes is necessary to address certain issues that
States encounter during implementation and can, in fact, be beneficial to achieving outcomes,
there is increasing risk that projects with delayed or compressed timelines will not be completed
within the grant period or will be implemented with poor quality, and that goals may become
unattainable. Further, if annual targets for performance measures are missed or if there are
negative performance trends, the chances are increased that 4-year goals may also be missed.
Although only limited conclusions can be drawn at this point in time, the effectiveness of the
RTT program overall relies in no small part on States’ success in raising student achievement,
closing gaps between subgroups, and increasing high school graduation and college enrollment
rates.

Department Comments

The Department generally concurred with our finding that the five RTT States in our sample had
varying degrees of success in adhering to timelines and achieving performance measures and
goals. The Department noted that it has allowed amendments to timelines in light of its focus on
States’ progress in meeting the goals of their grants, but has addressed this issue with its program
review processes. Lastly, the Department stated that it believes it is imperative to rigorously
monitor implementation of the RTT program and that it continually looks for ways to improve its
program review process.

FINDING NO. 2 – Department Oversight Has Been Robust, but Additional
                Reporting Could Increase Transparency

We found that the Department established and implemented an extensive and effective process
for monitoring RTT program recipients. Specifically, we noted that the Department’s oversight
mechanisms provide reasonable assurance that funds are being used as intended, in accordance
with applicable criteria, and that either anticipated recipient performance is achieved or actions
are taken to improve States’ chances of success. We noted, however, that the Department could
further its efforts to be transparent and to provide the public with greater insight into overall RTT
program implementation by preparing and issuing the Comprehensive RTT Annual Report
discussed in its RTT Program Review Guide.

Department Oversight Activities

We determined that the Department is fulfilling monitoring requirements as discussed in the
“Handbook for the Discretionary Grant Process” (OS-01, dated January 26, 2009) (Handbook).
Section 5.3 of the Handbook requires the Department to develop suitable monitoring tools that
are designed to assess the extent to which projects are meeting established program goals,
objectives, and performance measures in accordance with a grantee’s approved application and
any approved revisions. It also requires the development of a monitoring and technical
assistance plan that should serve as a standard and guide for monitoring grants under the
program and discuss activities pertaining to both fiscal and programmatic (or performance)
monitoring.
SOW and Amendment Process

We learned that a State’s approved SOW is the basis against which the Department performs its
monitoring activities, in conjunction with information contained in the State’s application and
budget, to determine progress towards goals and adherence to timelines. According to
Department officials, the SOW is the vehicle by which an application is translated into
implementation. Applications were high-level plans that laid out States’ overall goals and
strategies, while SOWs contained the detailed steps necessary to achieve these goals. The
Department provided guidance to States on the information they should include in SOWs, but
each document differs because the Department allowed flexibility in format, given the
complexity of projects involving RTT funding.

Department officials explained that, during the process of approving initial SOWs submitted by
States, Department staff ensured that each State’s SOW reflected the goals and performance
measures established in their application. We found significant communication between State
representatives and Department staff during this time, along with other review activities that
supported the described “application to SOW” transition process. We found no evidence to suggest any
substantial discrepancies between the initially approved SOWs and the original applications.
Department officials and program officers further explained that they held States accountable for
milestones discussed in their applications before approving their SOWs.

Department officials also explained on multiple occasions that, because RTT States are
monitored to ensure that they pursue the opportunities most likely to bring about successful
program outcomes, the Department decided to allow flexibility in States’ plans in the form
of amendments. To this end, the Department approved amendments to RTT projects that
affected timelines and performance measures if it was determined that these changes would not
affect the overall scope of the State’s education reform proposal, or raise serious questions
regarding a State’s chances of accomplishing project goals within the grant period. The
Department also encouraged States to submit amendment requests if it was found that a State’s
RTT plan could benefit from a revised approach.

We noted that Department officials required States to provide the following information when
requesting activity or major budgetary changes, along with appropriate supporting
documentation: (1) the grant project area that would be affected by the change, (2) a description
of the requested change, (3) the State’s rationale for why the change is warranted, (4) an impact
statement regarding RTT goals, and (5) budget documentation. These amendment requests are
then reviewed with the RTT program principles in mind. It appears that the Department fully
considered how amendments would affect State plans, including whether there were any
interdependencies between the projects being amended. Communications between Department
officials and State representatives prior to amendment approvals provide evidence that the
Department sought to ensure that requests were formulated in a way that was acceptable and
consistent with the RTT program principles. We also found cases where requested amendments
were not approved. The Department explained in its approval letters to States the effect of the
approved amendments on implementation plans and performance measures.
Department officials said that they would not approve an amendment's requested timeline shifts
if they believed that the State could not meet the timelines. Department officials further
explained that during the internal Department amendment approval process, program officers
and ISU leadership took into account prior amendments and the State’s original application
promises. The Department placed conditions on amendments that were acceptable but raised
concerns, and in some cases cited amendment requests as part of the reason for placing a State
on high-risk status.

Program Monitoring Plan

The RTT Program Review Guide is the Department’s monitoring and technical assistance plan
for the RTT program. We noted that the Department was generally conducting the activities
noted in the RTT Program Review Guide. We also determined that it exceeds the requirements
of the Handbook in terms of the frequency with which States must report on the status of their
RTT plans and associated fiscal matters, as well as the types of documentation that must be
provided to the Department. It addresses the Department’s responsibilities for fiscal and
programmatic oversight, and is also designed to identify areas in which RTT grantees need
assistance and support to meet their goals. It includes various components, including ongoing
conversations between the Department and grantees, on-site program reviews, grantee self-
evaluations, and stocktake meetings with the Secretary of Education and ISU leadership. In
general, we found significant ongoing communication, weekly in most cases, between the
Department and States that showed evidence of substantive monitoring that generally ensured
that any implementation or compliance issues were followed up on. Other components of the
RTT Program Review Guide are discussed in detail below.

       Monthly Progress Updates

       We found that the Department generally received progress updates from States and
       conducted official progress update calls each month. We noted some cases in which an
       update protocol document or an official progress update call was determined not to be
       necessary for a State in a particular month. This occurred when the Department held an
       on-site review in that month, meaning the progress report was under development and
       related followup was already occurring frequently, or when the Department had already
       obtained the information needed for the monthly update through other means (e.g.,
       emails, phone calls, other correspondence).

       We found that each subcriterion applicable to a State’s plan is discussed at least twice a
       year, whether as part of the monthly progress update and call or annual on-site review, as
       discussed below. We also found that notes of these calls taken by program officers show
       issues discussed, issues requiring followup, next steps to help States with
       implementation, and instructions to ensure that States comply with program laws and
       regulations.

       The Department created a two-part template for States to complete and submit as part of
       the monthly progress update process: (1) Part A, which asks for a general update on the
       status of implementation of the State’s entire RTT plan, to include key accomplishments
         and challenges, whether or not the State is on track to meet activity goals and timelines,
         and how the Department can help the State meet its goals; and (2) Part B, which asks for
         detailed progress updates for two subcriteria and requires the State to evaluate its
         performance using a rating system that the Department devised and which is discussed in
         further detail under the Progress Reports subsection later in this finding. 23 Among the
         questions that States must answer in Part B are the extent to which progress has been
         made toward meeting goals and performance measures; the State’s assessment of the
         quality of implementation; the methods, tools, and processes that the State uses to make
         such determinations; whether certain projects are off track and, if so, what strategies are
         being employed to get back on track; and potential obstacles or risks to implementation.

         On-site Program Reviews

         We found that the Department completed the required components of the on-site review
         for Year 1 and Year 2, and completed its Year 3 on-site reviews in June 2013. 24
         According to the RTT Program Review Guide, each State will receive an annual on-site
         visit for the duration of the grant. Phase 1 States had full on-site reviews in Year 1. The
         Department conducted limited on-site reviews in Year 1 for Phase 2 States, since the
         States had only been implementing their plans for 7 to 9 months at that time. These
         limited reviews included an analysis of documentation pertaining to two subcriteria –
         building strong Statewide capacity to implement, scale up, and sustain proposed plans
         and improving teacher and principal effectiveness based on performance – as well as
         discussions between Department and State officials. In Year 2, all States underwent a
         full on-site review, which included Department site visits to three LEAs and a review of
         all subcriteria applicable to the State’s RTT plan.

         Information from these reviews is incorporated into the Department’s State progress
         reports. States are required to provide project implementation information to the
         Department, in addition to fiscal and other compliance accountability documentation,
         several weeks before the on-site review. This includes answers to template questions on
         key accomplishments, significant challenges, and the status of all RTT projects. States
         are also required to collect and submit related information from at least three
         participating LEAs, selected in consultation with the Department. 25 A Department
         official stated that this allows program officers to corroborate information that the States
         have provided and to note any contradictions or red flags with regard to performance or
         financial data.

23 We noted that monthly progress updates that included more detailed updates on specific subcriteria did not begin
until July 2011. The Department and States have worked in consultation to develop a schedule for specific
subcriteria discussion, ensuring all subcriteria are discussed during the course of the year. We also noted that, in
some cases, the Department will request that States provide information on only one subcriterion if it is very
complex and requires extensive discussion.
24 Fieldwork for our audit was completed by the time the Department finished its Year 3 on-site program reviews.
As a result, we did not review documentation that would have allowed us to determine whether the Department
completed the required components of the Year 3 on-site review, as we did for Years 1 and 2.
25 The Department asks that States include a range of LEAs in the on-site review: small and large, rural and urban,
LEAs with the lowest achieving schools, and LEAs that contain schools implementing different types of RTT
projects.
        During the on-site review, which lasts 3 to 5 days, the Department holds meetings with
        State officials to review their responses to the questions noted above, analyze progress
        against performance measures, discuss quality of implementation, and identify areas for
        Department support and technical assistance. The Department also holds meetings with
        LEA officials within the State to discuss the quality of local implementation and learn
        more about LEAs’ relationships with the State, to include an overall analysis of State
        implementation and oversight. On-site program reviews in Year 1 were generally
        conducted by three to four Department officials, including a member of the ISU
        leadership team and the assigned lead and backup program officers for the State. In
        Year 2, only the assigned lead and backup program officers participated. A contractor 26
        also assists Department officials on these visits to review documentation concerning
        allocations to LEAs; fiscal management (i.e., policies and procedures, tracking, allowable
        uses of funds, cash management); ARRA reporting; and subrecipient monitoring. We
        noted that the on-site review is the primary means by which the Department assesses
        States’ compliance with fiscal accountability requirements, in addition to ongoing
        monitoring of budget and drawdown information.

        Stocktake Meetings

        We found that four of the five States in our sample had participated in one stocktake
        meeting as of the time of our review, and that all meetings occurred in the second year of
        the grant. Stocktake meetings are described in the RTT Program Review Guide as
        periodic data-based conversations, held generally twice per year, between RTT recipients
        and Department leadership. According to a Department official, discussion topics
        include potential red flags, as well as areas where the States are performing well, to help
        them figure out how to get back on track or maintain the pace of RTT implementation.

        Department staff explained that some stocktakes have been postponed, as was the case
        with the one State for which a meeting was not held, to allow States the opportunity to
        address certain issues and to enable the Department to have the most current information
        prior to the meeting. Department staff also noted that, in certain cases, Department
        officials have met with State leadership to discuss progress in place of an official
        stocktake meeting, a statement supported by our review of available documentation.

        Annual Performance Reports

        We found that each State included in our review submitted an APR to the Department for
        both Year 1 and Year 2 of the grant noting outcomes to date, performance against the
        measures established in its application, and other relevant data. We also noted that APR


26 The Department awarded a monitoring contract valued at $5 million in September 2010. The PWS, which also
governs monitoring work under the Department’s State Fiscal Stabilization Fund (SFSF), requires that the contractor
meet and communicate regularly with Department representatives; prepare a monitoring and staffing plan; prepare
fiscal monitoring reports for SFSF and RTT grantees; finalize format and graphics for RTT State-specific reports;
collect and analyze APR data from both programs; and prepare a report at the end of each grant for each program.
      performance measure targets either matched application targets or those found in
      amendments, and were appropriately discussed in comment sections of the APR.

      The Department uses the APR to track State progress against the performance measures
      that were established in applications or via post-award amendments. The Department
      makes APR information available on its website. The Department automatically fills in
      certain APR data based on information provided in States’ applications and SOWs, or
      maintained by the National Center for Education Statistics, as is the case with NAEP
      scores. States are then required to submit actual performance data each year. Governors
      or their authorized representatives must also certify the validity of the State’s APR data,
      both within each assurance area and overall, and are also encouraged to disclose any
      weaknesses. After States have certified APR information, the Department performs a
      series of automated and manual reviews to ensure that the data are accurate and do not
      conflict with other data in the report. The automated review involves system checks to
      ensure that States have submitted all required information and that budget and other
      calculations are accurate. A standardized checklist facilitates the manual review of APR
      data. We noted that the Department’s comments in these checklists show substantive
      monitoring of State-submitted APR data in both Year 1 and Year 2. If any issues are
      found, the Department discusses discrepancies with the State, unlocks the applicable
      APR section, and asks the State to edit the data and recertify its information. The
      Department itself does not alter any APR data. States are required to perform any edits.
      The Department also provides States the opportunity to discuss and adjust pre-populated
      data in the APR if they find discrepancies.

       The Department requests, or gives States the opportunity to provide, context for certain
      data in the comment sections of the APR. We found that comment sections included
      explanations of issues that States encountered while producing APR data, such as why
      some measures were not applicable in Year 1 and Year 2 of the grant or any intricacies
      about the data that made it difficult to assess performance relative to established targets.
      Comments also discuss APR performance measure amendments and any related changes
      to applicable targets.

      Progress Reports

      The Department had provided two progress reports to each of the States in our sample as
      of the time of our review. Overall, we found that issues noted in progress reports
      generally corresponded with issues noted in other Department monitoring documentation.
      We also noted that the Department included a “Summary of Monitoring Indicators”
       section in each State’s second progress report, which listed critical fiscal accountability
      and oversight monitoring elements that the Department reviewed during the program
      review process, and whether applicable program requirements were met. The
      Department noted issues pending with three of the States in our sample related to
      subrecipient monitoring and fiscal oversight of RTT funds. The Department required or
      recommended actions to be completed by these States. We found that the States
      complied with the required and recommended actions within the timeframes that the
      Department provided.
       Progress reports summarize a State’s grant outcomes as of a specific point in time,
       including progress in meeting timelines and characteristics of implementation. These
       reports are based on information collected through monthly progress updates, the on-site
       program review, the APR, and other relevant qualitative and quantitative data, and are
       considered “living documents.” They are also highly specific and very technical in
       nature, serving primarily as a means of providing formative feedback to the States. We
       noted that the Department included color-coded ratings of States’ progress on applicable
       subcriteria goals and objectives, to include consideration of both timeliness and quality of
       implementation, in the Year 1 and Year 2 progress reports. The Department's ratings, as
       noted in the introduction to each report, are based on an analysis of the information that it
       has on each subcriterion as of the writing of the report.

       These reports generally reflected Year 1 and Year 2 information, but were not always
       drafted at the same time for each State and, because of the timing of the program review
       schedule, were not always provided to States a year apart. The Department noted that the
       second progress reports provide ISU’s feedback to States on progress since the grant’s
       award date, with a focus on developments and updates since the first progress report, to
       show areas of success and areas needing additional attention and support.

       State-specific Summary Reports

       We noted that the Department prepared a State-specific Summary Report for each of the
       States in our sample for Year 1 and Year 2. We found that issues noted in State-specific
       Summary Reports generally corresponded with issues noted in progress reports and other
       monitoring documentation.

       The State-specific Summary Report is an annual assessment of a State’s RTT
       implementation. These reports, which were published in January 2012 and February
       2013, respectively, are publicly available documents that highlight successes and
       accomplishments, identify challenges, and provide lessons learned and key next steps for
       each State’s implementation plan. According to a Department official, the State-specific
       Summary Reports provide narrative feedback for both States and stakeholders to review.

Reasons for Effective Monitoring

We found that effective monitoring of RTT program recipients occurred due to a combination of
management focus and related internal control structures, as well as sufficient planning, training,
and communication. Specifically, we found that, from the outset of the program, leadership was
actively engaged in developing and implementing an effective monitoring plan and related
processes. We noted that administrative decisions (i.e., amendment approvals, enforcement
actions) involving information collected through monitoring efforts are made by program
leadership in consultation with program officers. We also noted that monitoring activities for
each State are being conducted by at least two program officers, usually a lead program officer
and one or more backup program officers.
We determined that the Department’s RTT Program Review Guide provides an adequate plan
with sufficient detail to enable program officers to effectively monitor States’ progress, as well
as their compliance with fiscal requirements. The plan includes guidance on collecting and
reviewing information that is key to effective monitoring and has been further supplemented by
other documents concerning program oversight. A Department official explained that, because
RTT was a new program in 2010, the Department piloted certain monitoring activities early on in
the grant period and made revisions as necessary, and when deemed appropriate by ISU
leadership. To that end, we noted that the Department prepared a self-assessment in
November 2011 of monitoring practices conducted during the first year of the grant. The
assessment noted that its purpose was to identify progress, successes, and areas of potential
weakness and to inform improvements for Year 2 and beyond. A Department official explained
that this particular self-assessment was also used to spark internal discussions and both formal
and informal conversations with States about how the Department could improve routines,
practices, and protocols.

We found that program leadership provided program officers with adequate training and
continuous direction and support in carrying out their monitoring responsibilities. Department
officials explained that, along with individual and group training sessions, program leadership
provided program officers with guidance documents describing when and how monitoring
activities should be conducted. These guidance documents were intended to help program
officers appropriately perform, coordinate, and track work and included approximate deadlines
and specific steps for how all aspects of the program review process were to be completed. We
noted that some program officers were also heavily involved in the development of the program
review process and the monitoring activities within it.

As a result of its effective monitoring process, the Department has been able to readily identify
timeliness and quality issues with regard to States’ implementation of their RTT plans and take
appropriate action as needed. Specifically, we found that the Department has noted project
timeline delays, as described in Finding No. 1, and quality issues, to include concerns regarding
overall strategies for success, the scalability and sustainability of RTT projects, data validation,
the effectiveness of subrecipient monitoring procedures, and communications between States and
participating LEAs. We determined that the Department's monitoring process provides a
comprehensive body of evidence that allows the Department to be acutely aware of States’ plans
and efforts, and allows for assessments of progress toward programmatic goals. The
Department’s monitoring process also enables it to identify areas where States might need
additional attention and to support them in resolving any implementation issues.

When the Department identified areas needing additional attention, and improvements were
either slow to materialize or were deemed essential to the success of the grant, it held
States accountable by taking certain enforcement actions. These included placing conditions on
a grant or placing all or a portion of a grant on high-risk status. We determined that these actions
were taken at various stages of implementation in an attempt to either keep States from
encountering additional issues, or to get them back on track with regard to progress or
compliance, as further explained below.
High-Risk Status

In December 2011, the Department placed the entire RTT grant of one State in our sample on
high-risk status due to unsatisfactory performance during the first 14 months of the grant. Information
from an amendment update noted that, based on the Department’s Year 1 on-site program review
and monthly calls, the Department determined that the State had experienced major delays and
made inadequate progress across its plan. In addition, the Department noted that the scope and
breadth of the State’s amendment requests indicated a potentially significant shift in the State’s
approved plans. As a condition of high-risk status, the State was placed on cost reimbursement
status, which required it to submit receipts for expenditures to the Department before drawing
down grant funds. In addition, the State was required to submit documentation before obligating
funds to ensure funds were spent in alignment with the approved SOW. Finally, the State was
required to submit a revised SOW and budget in January 2012 to reflect amendments.

In June 2012, the Department removed this State from cost-reimbursement status because it had
met certain conditions. In October 2012, the Department’s lead program officer for this State
explained that the State would remain on high-risk until the Department was satisfied with its
grant progress as seen through program review process evidence, and that the Department was
not yet at the point where it felt that it must withhold funding from the State’s RTT grant, due to
the early stage of grant implementation. On February 8, 2013, the Department determined that
the evidence the State provided was sufficient to meet the expectation of clear and compelling
evidence of substantial progress in two assurance areas and that, as a result, it would remove
high-risk status for the State’s RTT grant in these two assurance areas. On July 29, 2013, the
remaining areas of the State’s grant were taken off high-risk status, with the Department
noting that the State was executing all of its projects consistent with amended timelines and had
demonstrated that it was able to meet critical milestones in the last 6 months in a way it was
unable to do before December 2011.

In July 2012, the Department placed a portion of another State’s RTT plan on high-risk status.
Amendment information notes that this action was deemed necessary after review of the
June 2012 on-site review documentation, monthly calls, amendment requests, and follow-up
conversations. Department documentation shows that the State had been provided ample
opportunities to demonstrate a comprehensive decision-making process, including consideration
of dependent deliverables, a structured process for evaluating and incorporating formative
feedback, and a communications strategy including all relevant stakeholders. However, the
Department determined that the State had not provided sufficient evidence that it was designing
and implementing its teacher and leader evaluation system with a comprehensive and deliberate
approach. The Department noted that it would work with the State to identify appropriate
technical assistance to support this work.

As a condition of high-risk status, this State was required to submit a revised SY 2012-2013 plan
for its work on teacher and leader evaluation systems. The plan was required to include clear
timelines, activities, and deliverables. In addition, the State was required to submit monthly
updates in accordance with the revised work plan so that the Department could determine whether
the State was making adequate progress. Finally, the State was required to submit a report no later
than July 2013 summarizing the analysis and findings related to validation of all components of
its educator evaluation system. The Department informed the State that its high-risk status would
be reassessed if the State demonstrated substantial progress implementing the work plan outlined
above. The State was also notified that if it did not substantially comply with the conditions, the
Department would take appropriate enforcement action, which could include initiating procedures
to withhold up to $33 million in associated funding. We found that the State submitted a revised
work plan for this area, but that this piece of the grant remained on high-risk status due to
continued Department concerns. On July 30, 2013, the Department notified this State that it was
planning to withhold $9.9 million in RTT funds based on the State’s decision not to implement
the performance-based compensation system described in its approved application and
referenced in its approved SOW. The Department noted that this change in scope to the State’s
plan significantly decreased or eliminated reform in one of the areas and resulted in the grantee’s
failure to comply substantially with the terms related to this portion of its RTT award.

Other Conditions

We found that the Department placed certain conditions on States when it determined that such
actions would help keep implementation efforts on track or bring them back on course. These
conditions subjected States to additional monitoring procedures, which
included submitting monthly updates in addition to those required on specific projects or
activities, submitting assurances from State representatives on certain State practices, and
submitting evidence to support that an activity had been completed.

The Department generally communicated the conditions it placed on States through amendment
approval letters because many amendments were approved subject to conditions, as described in
more detail below:

   •   The Department placed a condition on one State that required it to provide timely updates
       on RTT project timelines as related contracts were awarded. This included providing the
       Department with a monthly list of the status and expected award date for all pending
       contracts related to RTT and, after a contract was awarded, submitting a detailed timeline
       that included, at a minimum, quarterly benchmarks outlining the State’s plan for
       accomplishing the related projects outlined in the State’s approved RTT application.
   •   The Department approved another State’s request to delay a pilot of the reduction of the
       student achievement gap component in its teacher evaluation system under the condition
       that the State submit a report summarizing the steps taken to investigate other methods to
       incorporate the reduction of the student achievement gap at the classroom level, the
       results of that analysis, and a proposal for implementation of the component in the
       following school year.
   •   The Department approved an amendment request from one State with the condition that it
       submit to the Department an assurance from the State superintendent affirming that its
       end-of-course tests would not be used for accountability purposes under the ESEA, an
       activity prohibited by the RTT regulations because of the existence of a separate program
       designed for such purposes. This State also received amendment approval from the
       Department conditioned upon the State providing quarterly updates on the status of
       implementation of a new induction and mentoring program.
   •   The Department approved a request from another State to delay and revise its approach
       on LEA submission of final MOUs involving turning around the lowest-achieving
       schools under the condition that the State include updates about LEA progress toward
       finalizing MOUs in the monthly reports submitted to the Department.
   •   The Department approved one State’s request to amend its early warning data system
       project under certain conditions. These conditions required the State to submit an
       updated SOW that contained: (1) the sequence of major activities and significant
       milestones to ensure that the State would be able to design and release the system,
       maintain and improve an educator evaluation system, and establish the foundation for an
       enhanced overall data systems platform during the grant period; and (2) mechanisms that
       the State would use to build awareness and determine quality and usability of the early
       warning and educator evaluation data systems among LEAs. Additionally, the State was
       required to include updates about progress of the activities in this project in each of the
       monthly reports that the State submitted to the Department.

We found that all of the conditions that the Department has imposed on States have been met, are
currently being complied with, or are based on events that have not yet occurred.

Comprehensive RTT Annual Report

We determined that the Department has not yet issued a Comprehensive RTT Annual Report, as
discussed in its RTT Program Review Guide. Currently, the Department considers this report to
be a repository of all of the State-specific Summary Reports for a given year. We noted that
State progress against RTT plans is discussed in detail in the individual State-specific Summary
Reports and that student outcome data and other performance measures are discussed in the
APRs. However, although individual States’ successes, challenges, and lessons learned are made
publicly available, the Department does not have any mechanism by which it provides
information on trends and statistics across States.

The RTT Program Review Guide states that the Comprehensive RTT Annual Report is an
overview of RTT reform efforts across all grantees, and is used to inform Congress and other
stakeholders about the progress of the RTT grantees, summarize trends and statistics across
grantee States, highlight successes and accomplishments, identify common challenges, and
provide lessons learned from implementation. The State-specific Summary Reports, on the other
hand, are annual comprehensive assessments of each State’s RTT implementation, as judged
against individual State RTT plans, at a given point in time.

Department officials stated that, because it is relatively early in the grant period and States are in
different stages of implementation, releasing a Comprehensive RTT Annual Report may lead to
incorrect interpretations regarding a State’s progress, or lack thereof. These officials added that
if several States have completed a project but one State has not, it does not necessarily mean that
the one State is behind schedule. Reporting requirements differ among States, and program outcomes
have different timelines in each State. We learned that the Department does not plan to release a
Comprehensive RTT Annual Report until after Year 4 of the grant, and that, until then, the State-
specific Summary Reports are what the Department considers to be its Comprehensive RTT
Annual Report. The RTT Program Review Guide, however, identifies the Comprehensive RTT
Annual Report and the State-specific Summary Reports as separate components of the program
review process.

We found that what is currently considered to be the Comprehensive RTT Annual Report does
not illustrate implementation progress across States and across the four education assurance
areas. At this time, the only way to discern this information is to read all of the
State-specific Summary Reports in search of any overall conclusions on the status of the RTT
program. By not producing a Comprehensive RTT Annual Report that
highlights shared accomplishments and common challenges, qualified however necessary in light
of the concerns noted above, the Department is missing the opportunity to provide valuable
information, increase transparency, and offer stakeholders greater insight into the RTT program,
to include lessons learned from implementation.

Recommendations

We recommend that the Deputy Secretary ensure that ISU officials and RTT program officers:

2.1    Maintain the robust monitoring efforts implemented to date, to include taking appropriate
       actions if States continually fail to meet project timelines and/or performance measures
       and goals.

2.2    Produce a Comprehensive RTT Annual Report that shows trends and statistics across
       States and identifies accomplishments and common challenges within the education
       reform areas.

Department Comments

The Department agreed with recommendation 2.1, stating that it will continue its oversight of
RTT grantees with the same rigor and quality it has demonstrated to date. The
Department did not concur with recommendation 2.2. Rather, the Department reiterated the
various elements of its program review process, including an emphasis on transparency, and
expressed its primary reason for not producing a Comprehensive RTT Annual Report.
Specifically, the Department again noted that each State has a unique plan, which means that
States are in different stages of implementation. As a result, the Department believes that it
would be imprudent to make broad comparisons across States in a report that may lead to
incorrect interpretations regarding a State’s progress, or lack thereof, and will instead produce a
final comprehensive report at the end of the grant.

The Department noted that, beginning in Year 3, the APR will include a comparison page that
will allow users to select from a menu of States and compare common metrics across the chosen
grantees. The Department also stated that it expects to refresh the RTT Program Review Guide
in the near future to include information on the large amount of annual data that are being
provided by the Department and to clarify that it will issue a comprehensive RTT report at the
end of the grant. The Department concluded its comments by stating that it believes it has
already fully addressed recommendation 2.2 and that there is not a need to produce the annual
report recommended by OIG.
OIG Response

We considered the Department’s comments but did not make any changes to the finding or
related recommendations. We recognize the potential challenges involved in producing a
Comprehensive RTT Annual Report. We also appreciate the enhancements that the Department
has made to its APR system to allow for the comparison of common metrics across States, which
will presumably include student achievement on State and Federal assessments, progress in
closing achievement gaps, and graduation rates and other postsecondary data. However, our
position remains that stakeholders, including Congress and the American public, would further
benefit from periodic reporting on common accomplishments and challenges across States and
within each of the four education reform areas, as discussed in the RTT Program Review Guide.
We also maintain that these reports could be qualified however necessary to help guard against
incorrect interpretations by readers and would emphasize that they be provided in whatever form
and manner deemed appropriate by the Department.

The Department identifies the RTT program as the largest-ever Federal competitive investment
in school reform. As such, it has a responsibility to provide taxpayers with unprecedented
insight regarding overall program successes and failures, as well as lessons learned from
implementation. We found that what is currently considered to be the Comprehensive RTT
Annual Report does not accomplish this task and continue to recommend that the Department
improve upon its efforts in this area.



                                     OTHER MATTER


The Department May Benefit from Enhancing its Process for Maintaining Project
Management Information

We noted during the course of our audit that the Department does not use a consolidated project
management system for the purpose of tracking performance information, metrics, and other
monitoring data for States that received funding under RTT Phases 1 and 2. Specifically, we
found that while all information necessary to effectively monitor States is collected and available
for Department use, in compliance with applicable recordkeeping requirements, it may not be
maintained in the most readily accessible and useful manner.

RTT grant information is currently maintained electronically in file folders on individual
program officers’ computers, or in hardcopy binders containing hundreds of documents that were
produced at different points in time. Program officers did not express any concerns regarding
their ability to conduct effective monitoring due to their intimate familiarity with State
implementation plans and issues, given the level of communication that occurs. Program
officers further explained that they can review any of the program review process documentation
at any time to find whatever information is necessary to support their monitoring efforts.
Despite these assurances, we believe that the opportunity exists to improve the monitoring
process by capturing and maintaining information in an accessible database, similar to what the
Department encourages RTT recipients to use and similar to what it uses in managing other grant
programs. A more formal and systematic way of maintaining project management information
could lead to more efficient decision-making. This is particularly true for large-scale, evolving
grants that contain many interconnected components. For example, being able to see whether
projects and deliverables have been amended previously, without combing through all prior
approval letters and other monitoring documentation, could decrease turnaround time on
amendment decisions and help the Department identify areas that present ongoing challenges for a
State. Such a system could maintain all of the most recent information and show due dates and
actual completion dates for project deliverables and milestones, a history of timeline changes to
specific projects, resolved and pending issues, and any areas requiring immediate attention.
Connections and dependencies among projects in the same and other subcriteria may also be
more easily viewed, which could further facilitate program oversight.

Organizing documentation in this manner may also allow new program officers who assume lead
monitoring responsibility over a State to become familiar with implementation plans, progress,
and issues more quickly and easily than they would if they were required to review all of the
monitoring documentation collected to date and conduct discussions with the previous lead
program officer and State RTT teams. During our review, both the lead and backup program
officers in charge of monitoring one of the States in our sample left the Department. The new
program officer explained that she reviewed the State's program review process documentation
and amendment approval letters, and also met with the previous lead program officer and State
RTT teams to familiarize herself with the grant. In the event a program officer suddenly leaves
the Department and is not available for consultation, the new program officer may be at a
disadvantage in obtaining historical context and other key information pertaining to the
grant.

During our discussions with ISU leadership, we learned that the Department is piloting certain
aspects of an interactive project management system known as GRADS360 (Grantee Records
and Assistance Database System) with RTT Phase 3 States, but that the system was not ready for
Phase 1 and Phase 2 States and also lacked certain desired functions. GRADS360 was initially
developed for the Department’s SLDS program. The system, if implemented for the RTT
program, would allow the Department to upload SOWs directly into a shared database, along
with budgets and other monitoring documentation. States would also be able to submit
amendment requests for Department approval and request technical assistance. Officials noted
that GRADS360, in its current form, is overly prescriptive and that implementing it now would
require that States rewrite and reformat their SOWs to meet a specific template. GRADS360 is
designed to be a joint project management system that would require State participation.
Adopting GRADS360 would not allow States discretion in determining the best tool or system to
use in managing their grant, and States have communicated to the Department that such
discretion is valuable and important to them.

While we understand the Department’s concerns with GRADS360, we continue to believe that it
may benefit from enhancing its process for maintaining project management information. We
suggest that the Department strongly consider using an interactive, consolidated project
management system during the remaining years of the RTT State grants and when planning for
the administration of future, similarly scaled grants. Such a system could ensure that significant
project activities, amendments, and management decisions are readily identifiable and trackable,
which is particularly important in the event of any staff transitions or turnovers of program
officers, and might also enhance the ability of the Department to summarize trends and statistics
across States.


                 OBJECTIVES, SCOPE, AND METHODOLOGY


The objectives of our audit were to (1) determine the extent to which RTT grantees have
(a) adhered to timelines established in their applications and related scopes of work, and
(b) achieved project performance measures and goals; and (2) evaluate the effectiveness of
program oversight to ensure that funds were used as intended and anticipated recipient
performance was achieved in support of overall programmatic goals.

To accomplish our objectives, we gained an understanding of internal control applicable to the
Department’s administration and oversight of discretionary grant programs. We reviewed
Department policies and procedures, Office of Management and Budget Circular A-123
“Management’s Responsibility for Internal Control,” and the Government Accountability
Office’s (GAO) “Standards for Internal Control in the Federal Government.” We reviewed
legislation, regulations, and Department guidance pertaining to the RTT program, in general, and
Phases 1 and 2 of the RTT competition, in particular. In addition, to identify potential
vulnerabilities, we reviewed prior OIG and GAO audit reports with relevance to our audit
objectives.

We conducted discussions with ISU officials and RTT program officers to obtain a more
complete understanding of the RTT program. These discussions focused primarily on the RTT
program review process, to include the roles and responsibilities of technical assistance and
monitoring contractors, and grantee accomplishments and challenges during Years 1 and 2 of
program implementation. We also conducted discussions with officials in the Department’s
Institute of Education Sciences involved in ongoing evaluations of the RTT program.

The scope of our review was limited to the Department’s post-award activities and applicable
documentation, through February 2013, for grants made under Phases 1 and 2 of the RTT
competition. We also reviewed information regarding enforcement actions taken in July 2013.
We specifically selected for review all States that were awarded $500 million or more in RTT
funds or that the Department had designated as high-risk based on concerns over performance
relative to their RTT plans. This resulted in a judgmental sample of 5 of the 12 (42 percent)
Phase 1 and 2 States, 1 of which was awarded a Phase 1 RTT grant and the remaining 4 of which
were awarded Phase 2 RTT grants. Our sample included $2.37 billion of the $3.94 billion
(60 percent) awarded to States under Phases 1 and 2 of the RTT competition. Because there is
no assurance that the judgmental sample used in this audit is representative of the respective
universe, the results should not be projected over the unsampled grant awards.
Adherence to Timelines

To fulfill part (a) of our first objective, we reviewed applications and SOWs, amendment
requests and approvals, monthly progress updates, on-site review documentation, APRs, progress
reports, and annual State-specific Summary Reports for each of the States in our sample. To
identify the universe of projects applicable to each State’s RTT plan, we relied initially on
information contained in State-submitted applications and SOWs. However, we determined that
differences in the structure and content of these documents across States would lead to
inconsistencies in our analysis; we therefore relied primarily on information included in
Department-created progress reports to identify all applicable projects. Information in progress
reports was corroborated to the extent possible with information in applications and SOWs in
order to provide reasonable assurance that the Department provided a complete and accurate
accounting of applicable projects. We identified projects as of each progress report based on the
following criteria: (1) the project was listed and/or discussed in the applicable State progress
report, or (2) the project was noted in an amendment approval letter sent during the period
covered by the applicable progress report.

To determine the status of individual projects comprising States’ RTT plans, we assessed whether
projects were on track or had experienced delays as of the first progress report and as of the
second progress report. We considered a project to be on track or to have experienced no delays
as of a progress report if there were no indications of delays at that point in time, either per the
progress report itself or per amendments applicable to the period covered by the progress report.
Conversely, we considered a project to have experienced delays as of a progress report under the
following circumstances: (1) the entire project or deliverables and activities within the project
were specifically noted in a progress report as being delayed, or (2) an amendment applicable to
the period covered by the progress report pushed back timelines for the entire project or any of
its deliverables and activities. We subsequently determined whether projects that were
identifiable in both the first and second progress reports or in related amendments
(1) experienced no delays as of both progress reports, (2) experienced delays as of the first
progress report but were on track as of the second progress report, (3) were on track as of the
first progress report but experienced delays as of the second progress report, or (4) experienced
delays as of both progress reports.
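The four-way classification described above can be expressed as a short sketch. The inputs are assumed to be delay determinations already derived from each progress report and its related amendments; the function name and category labels are ours, for illustration only:

```python
def classify_project(delayed_report1: bool, delayed_report2: bool) -> str:
    """Categorize a project identifiable in both progress reports,
    based on whether delays were noted as of each report."""
    if not delayed_report1 and not delayed_report2:
        return "no delays as of either report"
    if delayed_report1 and not delayed_report2:
        return "delayed as of first report, on track as of second"
    if not delayed_report1 and delayed_report2:
        return "on track as of first report, delayed as of second"
    return "delayed as of both reports"
```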

Achievement of Performance Measures and Goals

To fulfill part (b) of our first objective, we reviewed Year 1 and Year 2 APRs for each of the
States in our sample. We determined that States were not required to establish targets for
performance measures associated with most subcriteria, although many opted to do so. We
further determined that States generally did not plan to have projects involving required
performance measures fully implemented until after Year 2 of the grant. As a result, we decided
to review performance measure baseline data, annual targets, and actual data for a judgmental
sample of student outcome measures tied to the RTT program principles as noted in Table 3 on
page 13. These included results on annual State assessments and the biennial, federally
administered NAEP; changes in the achievement gap between certain subgroups; and changes in
high school graduation, college enrollment, and college course completion rates. For each
performance measure selected for review, we determined whether the State provided baseline
data, established a target for Year 1 and Year 2, and achieved the target in Year 1 and Year 2.
We also determined whether the data for the performance measure exhibited a positive or
negative trend from the baseline to Year 2.

Program Oversight

To fulfill our second audit objective, we reviewed information specific to the RTT monitoring
process, such as the RTT Program Review Guide, supplemental guidance developed by and for
program officers, the PWS for both the technical assistance and monitoring contracts, and
documentation maintained in the official grant files. This included the materials described
above, which enabled our analysis of State progress against timelines and performance measures
and goals, as well as risk assessments, internal review checklists, fiscal monitoring
documentation, meeting agendas and notes, and miscellaneous correspondence between the
Department and States. We also compared the Department’s monitoring activities for the RTT
program with monitoring requirements as described in the Handbook. Our conclusions
regarding the effectiveness of monitoring to ensure that funds were used as intended and that
anticipated recipient performance was achieved in support of overall programmatic goals were
based on both quantitative and qualitative factors, including the timeliness, frequency, amount,
and focus of communications between program officers and States.

We relied on computer-processed data from G5, the Department’s grants management system,
and from States’ Year 1 and Year 2 APRs. We used G5 for the purpose of identifying the
universe of RTT grants and related obligation amounts. Because G5 is the Department’s system
of record for such information and the data were used primarily for informational purposes and
did not materially affect our findings and resulting conclusions, we did not assess its reliability.
We used States’ Year 1 and Year 2 APRs to answer part (b) of our first objective. In this case,
we determined that it was necessary to assess the reliability of performance measure baseline
data, annual targets, and actual data contained in the APRs. We reviewed States’ RTT
applications and approved amendments to ensure that performance measure targets had not been
changed between the application and either APR without approval from the Department. We
also reviewed the Department’s APR checklists for information on controls over data reliability
and evidence of substantive monitoring. Lastly, we compared available data from EDFacts, the
Department’s centralized data collection, analysis, and reporting initiative, with baseline and
actual data in the APRs. Based on our analysis, we concluded that the computer-processed data
were sufficiently reliable for the purposes of our audit.
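The cross-source comparison described above can be sketched as follows. The tolerance and measure names are assumptions for illustration, not the actual criteria we applied:

```python
def consistent(apr_value: float, edfacts_value: float,
               tolerance: float = 0.5) -> bool:
    """Whether two reported values for the same measure agree within a
    tolerance (the 0.5 default is an assumption for illustration)."""
    return abs(apr_value - edfacts_value) <= tolerance

def flag_discrepancies(apr: dict[str, float], edfacts: dict[str, float],
                       tolerance: float = 0.5) -> list[str]:
    """Measures present in both sources whose reported values disagree."""
    return [measure for measure in apr
            if measure in edfacts
            and not consistent(apr[measure], edfacts[measure], tolerance)]
```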

We conducted fieldwork at Department offices in Washington, D.C., from June 2012 through
June 2013. We provided our audit results to Department officials during an exit conference
conducted on June 25, 2013.

We conducted this performance audit in accordance with generally accepted government
auditing standards. Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions
based on our audit objectives. We believe that the evidence obtained provides a reasonable basis
for our findings and conclusions based on our audit objectives.


                            ADMINISTRATIVE MATTERS


Corrective actions proposed (resolution phase) and implemented (closure phase) by your office
will be monitored and tracked through the Department’s Audit Accountability and Resolution
Tracking System (AARTS). Department policy requires that you develop a final Corrective
Action Plan (CAP) for our review in the automated system within 30 days of the issuance of this
report. The CAP should set forth the specific action items and targeted completion dates
necessary to implement final corrective actions on the findings and recommendations contained
in this final audit report.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector
General is required to report to Congress twice a year on the audits that remain unresolved after
6 months from the date of issuance.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the Office
of Inspector General are available to members of the press and general public to the extent
information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given us during this review. If you have any questions, please
call Michele Weaver-Dugan at (202) 245-6941.




                                             Sincerely,


                                             Patrick J. Howard /s/
                                             Assistant Inspector General for Audit
                                                                                                  Attachment 1


                                       RTT Criteria and Subcriteria

A. State Success Factors
(A)(1) Articulating State’s education reform agenda and LEAs’ participation in it
(A)(2) Building strong statewide capacity to implement, scale up, and sustain proposed plans
(A)(3) Demonstrating significant progress in raising achievement and closing gaps

B. Standards and Assessments
(B)(1) Developing and adopting common standards
(B)(2) Developing and implementing common, high-quality assessments
(B)(3) Supporting the transition to enhanced standards and high-quality assessments

C. Data Systems to Support Instruction
(C)(1) Fully implementing a statewide longitudinal data system
(C)(2) Accessing and using State data
(C)(3) Using data to improve instruction

D. Great Teachers and Leaders
(D)(1) Providing high-quality pathways for aspiring teachers and principals
(D)(2) Improving teacher and principal effectiveness based on performance
(D)(3) Ensuring equitable distribution of effective teachers and principals
(D)(4) Improving the effectiveness of teacher and principal preparation programs
(D)(5) Providing effective support to teachers and principals

E. Turning Around the Lowest-Achieving Schools
(E)(1) Intervening in the lowest-achieving schools and LEAs
(E)(2) Turning around the lowest-achieving schools

F. General Selection Criteria
(F)(1) Making education funding a priority
(F)(2) Ensuring successful conditions for high-performing charters and other innovative schools
(F)(3) Demonstrating other significant reform conditions

Priorities
Priority 1: Absolute Priority – Comprehensive Approach to Education Reform
Priority 2: Competitive Preference Priority – Emphasis on Science, Technology, Engineering, and Mathematics
Priority 3: Invitational Priority – Innovations for Improving Early Learning Outcomes
Priority 4: Invitational Priority – Expansion and Adaptation of Statewide Longitudinal Data Systems
Priority 5: Invitational Priority – P-20 Coordination, Vertical and Horizontal Alignment
Priority 6: Invitational Priority – School-Level Conditions for Reform, Innovation, and Learning
                                                                   Attachment 2


               Abbreviations, Acronyms, and Short Forms
                          Used in this Report

AARTS        Audit Accountability and Resolution Tracking System
APR          Annual Performance Report
ARRA         American Recovery and Reinvestment Act of 2009
CAP          Corrective Action Plan
CCSS         Common Core State Standards
Department   U.S. Department of Education
ELA          English Language Arts
ESEA         Elementary and Secondary Education Act
FY           Fiscal Year
GAO          Government Accountability Office
GRADS360     Grantee Records and Assistance Database System
Handbook     Handbook for the Discretionary Grant Process
ISU          Implementation and Support Unit
LEA          Local Educational Agency
MOU          Memorandum of Understanding
N/A          Not Applicable
NAEP         National Assessment of Educational Progress
NIA          Notice Inviting Applications
OIG          Office of Inspector General
PWS          Performance Work Statement
RFP          Request for Proposal
RSN          Reform Support Network
RTT          Race to the Top
SEA          State Education Agency
SFSF         State Fiscal Stabilization Fund
SLDS         State Longitudinal Data Systems
SOW          Scope of Work
STEM         Science, Technology, Engineering, and Mathematics
SY           School Year
                                                                                                   Attachment 3


                   Department Response to Draft Audit Report



                          UNITED STATES DEPARTMENT OF EDUCATION
                                                                                THE DEPUTY SECRETARY
                                         December 3, 2013


TO:            Ms. Michele Weaver-Dugan, Director
               Operations Internal Audit Team

FROM:          James H. Shelton
               Acting Deputy Secretary

SUBJECT:       Draft Audit Report: The Department's Monitoring of Race to the Top Program
               Recipient Performance, A19M0003

We appreciate the opportunity your office provided for a meeting to discuss the possible findings
from the audit work on "The Department's Monitoring of Race to the Top Program Recipient
Performance," and the opportunity to provide comments on the draft report. We found the audit
team working on this matter to be open, cooperative, and highly professional. We also
appreciate the report's acknowledgement that the Department has established and implemented
an extensive and effective process for monitoring Race to the Top program recipients and that
the oversight mechanisms provide reasonable assurances that, among other things, funds are
being used as intended and either anticipated recipient performance is achieved or actions are
taken to improve a State's chances of success.

Draft Finding No. 1 - Timeliness and progress in achieving outcomes has varied across Race to
the Top States

We generally concur with your first finding that the five Race to the Top States that the Office of
the Inspector General (OIG) reviewed for this report had varying degrees of success in adhering
to timelines and achieving performance measures and goals. We would like to note that as part
of the Race to the Top program, the Implementation and Support Unit (ISU) is focusing on
States' progress in meeting the goals of their grants, and ISU has allowed amendments to
timelines. Throughout the life of the Race to the Top grants, we have addressed this issue with
our program review processes. OIG notes in the report that ISU has implemented robust
monitoring efforts which include taking appropriate actions if a State continually fails to meet
project timelines and/or performance measures and goals. As noted in the draft audit report, the
Department is fully meeting the requirements under the Discretionary Grant Handbook
regarding its monitoring and technical assistance. We believe that it is imperative that we
continue to rigorously monitor the implementation of the Race to the Top program. In addition,
the ISU continually looks for ways to improve its program review process to make the already
robust process even stronger.

Draft Finding No. 2 - Department oversight has been robust, but additional reporting could
increase transparency

We appreciate the OIG's acknowledgement that our oversight of Race to the Top grants has been
robust and appreciate the OIG's recognition of the extensive and effective processes we have
established for monitoring Race to the Top recipients. As noted in the draft report, as a result of
its effective monitoring process, the Department has been able to readily identify timeliness and
quality issues with regard to States' implementation of their Race to the Top plans and take
appropriate action as needed. We strongly agree with recommendation 2.1 that the ISU continue
to maintain the robust monitoring efforts implemented to date, including taking appropriate
actions if States continually fail to meet project timelines and/or performance measures and
goals. I can assure you that the ISU will continue its oversight of Race to the Top grantees with
the same rigor and quality that it has exemplified to this time. Indeed, as noted earlier, the ISU
continually looks for ways to improve its program review process.

We do not concur with recommendation 2.2 that ISU officials and program officers produce a
Comprehensive Race to the Top Annual Report that shows trends and statistics across States and
identifies accomplishments and common challenges within the education reform areas. The ISU
annually provides comprehensive reporting on Race to the Top grantees, creating an
unprecedented level of transparency regarding Race to the Top accomplishments and challenges.
The ISU posts all Race to the Top grant applications, Scopes of Work, letters approving
amendments to Race to the Top plans, and letters regarding actions taken by the Department
(e.g., placing a grantee on high-risk status). In addition, the ISU disseminates State-specific
summary reports for each Race to the Top grant, as well as making performance information for
each State available publicly through the Annual Performance Reports (APR) public data
display. With the addition in Year 3 of the APR data display system's comparison capabilities
(discussed in more detail below), accomplishments, challenges, trends, and statistics across
States are evident.

The State-specific summary report is an annual comprehensive assessment of a State's Race to
the Top implementation at a given point in time. The report highlights successes and
accomplishments, identifies challenges, and provides lessons learned from implementation. The
Department drafts this report incorporating information available from Program Reviews,
Progress Reports, and State APRs. The Department provides the State with a draft for review
and response. Once finalized, the Department posts the final State-specific Summary Reports on
its Web site, found at http://www2.ed.gov/programs/racetothetop/performance.html.

The Race to the Top APR documents grantees' progress toward the annual or four-year targets
set forth by the grantees in their Race to the Top applications. The performance measures States
included in their applications are leading indicators of their success toward improving student
outcomes. Therefore, the APR is one mechanism for holding States accountable for meeting
their targets or making significant progress toward them. Additionally, the APR includes State-
reported updates on the laws, statutes, regulations, or guidelines that affect key elements of their
Race to the Top plans, and progress in meeting the absolute priority, competitive preference
priority, and invitational priorities. Beginning in Year 3 of the grant (school year 2012-2013),
the APR also includes a progress page and a comparison page. The progress page shows
performance data across grant years for individual State grantees. The comparison page allows
the user to select from a menu of States and compare common metrics across the chosen
grantees. The final State APRs are posted on the Department's Web site through a public data
display (https://www.rtt-apr.us/about-apr).
Each State has a unique plan with timelines and scope of work, and therefore States are in
different stages of implementation. At this point in time, it would be imprudent to make broad
comparisons across States in a report that may lead to incorrect interpretations regarding a
State's progress, or lack thereof. However, the Department has made, and will continue to make,
available comprehensive information about the Race to the Top grants through the State-specific
reports and APR data display system. The Department will continue to provide a high level of
transparency around State implementation of the Race to the Top program, as well as produce a
final comprehensive report at the end of the grant. We expect to refresh the Race to the Top
Program Review Guide in the near future to update it in several ways, including to indicate the
large amount of annual data that are being provided by the Department, and clarify that the
Comprehensive Race to the Top Report will be issued by the Department at the end of the grant.

For these reasons, we believe the Department already fully addresses recommendation 2.2 by
reporting comprehensively each year on trends and statistics across States and identifying
accomplishments and common challenges within the education reform areas, and that there is no
need to produce the recommended annual report.

We appreciate the opportunity to submit these comments on the draft audit report. Please let us
know if you have questions or want additional information about our comments.