The Department's Monitoring of Investing in Innovation Program (i3) Grant Recipients (ED-OIG/I13M0001) - Date Issued: 02/21/2013

Published by the Department of Education, Office of Inspector General on 2013-02-21.

              U.S. Department of Education
              Office of the Inspector General


        American Recovery and
          Reinvestment Act
The Department’s Monitoring of Investing in Innovation Program Grant
                            Recipients

                      Final Inspection Report




  ED-OIG/I13M0001                               February 2013
                              UNITED STATES DEPARTMENT OF EDUCATION
                                   OFFICE OF INSPECTOR GENERAL

                                                                  Evaluation and Inspection Services




                                        February 21, 2013



James H. Shelton, III
Assistant Deputy Secretary
Office of Innovation and Improvement
U.S. Department of Education
400 Maryland Avenue, S.W.
Washington, DC 20202


Dear Mr. Shelton:


This final inspection report presents the results of our inspection of the U.S. Department of
Education’s monitoring of Investing in Innovation program grant recipients. We received the
Office of Innovation and Improvement’s comments on the contents of our draft report. The
comments are summarized in the Department Comments section of this report and are attached
in their entirety.

Corrective actions proposed (resolution phase) and implemented (closure phase) by your office
will be monitored and tracked through the Department’s Audit Accountability and Resolution
Tracking System (AARTS). Department policy requires that you develop a final corrective
action plan (CAP) for our review in the automated system within 30 days of the issuance of this
report. The CAP should set forth the specific action items, and targeted completion dates,
necessary to implement final corrective actions on the findings and recommendations contained
in this final audit report.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector
General is required to report to Congress twice a year on the audits that remain unresolved after
6 months from the date of issuance.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the
Office of Inspector General are available to members of the press and general public to the extent
information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given to us during this review. If you have any questions, please
call Christopher Wilson at (202) 245-7061.


                                            Respectfully,

                                            /s/

                                            Wanda A. Scott
                                            Assistant Inspector General
                                            Evaluation, Inspection, and Management Services




Electronic cc:        Margo Anderson, Associate Assistant Deputy Secretary, OII
                      Nadya Dabby, Associate Assistant Deputy Secretary, OII
                      Carol Lyons, Director, Oi3
                      Ayesha Edwards, Audit Liaison, OII

                                             RESULTS IN BRIEF

The objective of our inspection was to evaluate the U.S. Department of Education’s
(Department) monitoring of Investing in Innovation (i3) program grant recipients. We found that
program officers regularly engaged with i3 grantees and provided substantive monitoring;
however, the Department did not hold i3 grantees accountable when they did not respond or did
not respond timely to Department requests. 1 We also identified two potential risks to the
Department’s ability to adequately monitor i3 grantees in the future. Specifically, if the
workload of program officers increases or if the technical assistance for the evaluation
component is no longer available, there could be a negative impact on the Department’s ability to
monitor i3 grantees.

                                               BACKGROUND

The i3 program was created and initially funded under the American Recovery and Reinvestment
Act of 2009 (ARRA). ARRA provided funding for the first i3 grant competition; subsequent
competitions were funded through additional Congressional appropriations. The purpose of the i3
program is to provide competitive grants to entities with a record of improving student
achievement and attainment in order to expand the implementation of, and investment in,
innovative practices that are demonstrated to have an impact on:
    • improving student achievement or student growth,
    • closing achievement gaps,
    • decreasing drop-out rates,
    • increasing high school graduation rates, or
    • increasing college enrollment and completion rates.

The i3 program is administered by the Office of Investing in Innovation (Oi3), within the
Department’s Office of Innovation and Improvement (OII).

Program Design and Awards
The i3 program has three types of grants: “Scale-up” grants, “Validation” grants, and
“Development” grants.

    •    Scale-up grants provide funding for scaling up practices, strategies, or programs for
         which there is strong evidence that the proposed projects have a statistically significant
         effect on student achievement and attainment.

    •    Validation grants provide funding for practices, strategies, or programs for which there is
         moderate evidence that the proposed projects have the potential to have a substantial and
         important effect on student achievement and attainment.

    •    Development grants provide funding for high potential and relatively untested practices,
         strategies, or programs whose efficacy should be systematically studied. Development
         grant applicants must provide a rationale for the proposed practices, strategies, or
         programs based on research or a reasonable hypothesis that the proposed projects could
         have an effect on student achievement and attainment.

1
 For purposes of this report, “grantees” refers to recipients of i3 discretionary grants or cooperative agreements in
either cohort, unless otherwise noted.

The project period for i3 grants varies in length from 3 to 5 years. At the start of our review, the
Department had awarded i3 grants to two cohorts: 2
                                2010 Cohort                                      2011 Cohort
  Type of         No. of        Dollars     Average per            No. of        Dollars         Average per
  Grant          Awards        Awarded         Award              Awards        Awarded             Award
 Scale-up           4        $194,878,659   $48,719,665              1         $24,995,690       $24,995,690
 Validation        15        $310,699,851   $20,713,323              5         $72,774,304       $14,554,861
Development        30        $140,399,885    $4,679,996             17         $50,294,461        $2,958,498
   Total           49        $645,978,395                           23        $148,064,455

The Department awarded all 4 Scale-up grants in the 2010 cohort as cooperative agreements and
awarded the 45 Validation and Development grants as discretionary grants. According to the
Department’s Handbook for the Discretionary Grant Process, a cooperative agreement can be
used whenever the Department anticipates substantial Federal involvement during the
performance of the grant. 3 The Department awarded all 23 grants in the 2011 cohort as
cooperative agreements to signal a higher level of Department involvement.

Evaluation Component
The Department requires all i3 grantees to use part of their budget to conduct an independent
evaluation of their project. The purpose of the evaluation requirement is to ensure that projects
funded under the i3 program contribute significantly to improving practitioner and policymaker
knowledge of which practices work, for which types of students, and in which contexts. This
evaluation requirement differentiates the i3 program from most discretionary grant programs at
the Department.

The evaluation component has four requirements:
    • Independent Evaluation – All grantees must conduct an independent evaluation of their i3
        projects. 4 The grantee’s independent evaluator is responsible for the design and execution
       of the required independent evaluation.

    •   Cooperation with Technical Assistance – All grantees must agree to cooperate with any
        technical assistance provided by the Department and its contractors.

    •   Sharing Results – All grantees must share the results of their i3 evaluations broadly.

    •   Sharing Data – All Scale-up and Validation grantees must make the data collected for
        their evaluations available to third-party researchers.

Although OII has primary oversight responsibility for the i3 program, the Department’s Institute
of Education Sciences (IES) is also involved in oversight of the evaluation component for the
2010 and 2011 cohorts of grantees through its contract with Abt Associates (Abt). Abt’s primary
responsibilities are to provide technical assistance to i3 grantees’ independent evaluators on the

2
  The 2010 i3 grant cohort was awarded on a Federal fiscal year basis. The 2011 cohort was awarded on a calendar
year basis.
3
  The primary difference between a cooperative agreement and a discretionary grant is that the cooperative
agreement provides the Department with more information on a grantee’s project and allows the Department to plan
its involvement in the grant.
4
  For the i3 program, an independent evaluation is one that is designed and carried out independent of, but in
coordination with, the grantee that developed the practice, strategy, or program it is implementing.

design and implementation of their evaluations; to prepare four interim reports discussing the
quality, progress, and preliminary results of the independent evaluations; and to prepare one final
report providing summaries of the quality and findings of the evaluations.

The Department’s contract with Abt to provide technical assistance on the evaluation component
includes provisions to help ensure the integrity of the evaluations. Abt assigns technical
assistance liaisons to each grantee to work directly with the grantee’s independent evaluator to
provide feedback in support of the design and implementation of the evaluations. The
independent evaluators participate in regularly scheduled telephone calls with the technical
assistance liaisons to discuss the progress of the individual evaluations and to discuss the plans
for collecting data, assigning subjects to treatment and comparison conditions, measuring and
documenting the implementation of the i3 interventions, and analyzing and reporting the results.
Program officers are expected to review summaries of the call discussions and follow up with
questions or points of clarification, but they are not expected to be the primary resource to
grantees or independent evaluators on evaluation issues.

IES also assists OII by providing technical assistance to applicants and reviewing applicants’
evidence against the What Works Clearinghouse evidence standards. 5

OII Monitoring of the i3 Program
The primary objectives of monitoring in the i3 program are to track the progress of i3 grantees
toward achieving project-specific goals, to ensure proper use of Federal funds, and to ensure
compliance with the Department’s grant regulations and program requirements. Program
officers in OII serve as the initial point of contact for grantees in providing customized technical
assistance, appropriate feedback, and followup to help grantees achieve successful grant
outcomes. To provide i3 grantees with useful information early in the grant cycle, program
officers and other Department staff developed and organized web-based training on post-award
activities, fiscal management, evidence and evaluation, and management plans.

                                                  RESULTS

The objective of our inspection was to evaluate the Department’s monitoring of i3 program grant
recipients. We found that program officers regularly engaged with i3 grantees and provided
substantive monitoring; however, the Department did not hold i3 grantees accountable when they
did not respond or did not respond timely to Department requests. We also identified two
potential risks to the Department’s ability to adequately monitor i3 grantees in the future.
Specifically, if the workload of program officers increases or if the technical assistance for the
evaluation component is no longer available, there could be a negative impact on the
Department’s ability to monitor i3 grantees.

Program Officers Regularly Engaged with i3 Grantees and Provided Substantive
Monitoring
We determined that program officers tracked the progress of i3 grantees toward achieving
project-specific goals and were actively engaged in monitoring grantees to ensure the proper use
of Federal funds and compliance with the Department’s program regulations and requirements.

5
 Reviewers trained in the What Works Clearinghouse evidence standards review each study that passes eligibility
screens to determine whether the study provides strong evidence (Meets Evidence Standards), weaker evidence
(Meets Evidence Standards with Reservations), or insufficient evidence (Does Not Meet Evidence Standards) for a
project’s effectiveness.

Program officers regularly communicated with grantees through e-mail and monthly phone calls
to discuss grantee progress, budget requests, requests for technical assistance, and project-
specific issues. Program officers also regularly identified grantee issues needing followup and
consulted appropriate subject matter experts in the Department’s Office of the Chief Financial
Officer, Office of the General Counsel, and the technical assistance contractor to address the
issues.

Program officers were able to regularly engage with grantees and provide substantive monitoring
due in part to their manageable workload, OII leadership’s commitment to professional
development, and OII’s decision to award grants as cooperative agreements to signal a higher
level of Department involvement.

Manageable Workload
The nine program officers who made up the i3 monitoring staff at the time of our review were
responsible for an average of eight grants each, which allowed each program officer to dedicate a
reasonable amount of time to monitor individual grantees. 6 Additionally, the Director of Oi3
assigned grants to program officers based on the size and complexity of the grant. Scale-up
grants, which are large and complex, were assigned to senior program officers, and Validation
and Development grants were assigned to all levels of staff based on subject matter expertise and
the interests of the program officers.

Commitment to Professional Development
Another factor contributing to the level of engagement in monitoring i3 grants was OII
leadership’s commitment to professional development and regular communication with program
officers. OII leadership identified training opportunities and seminars for program officers to
learn about best practices and efficiencies to help them perform their duties better. The Director
of Oi3 also assigned senior program officers as mentors to help newer program officers and held
weekly staff meetings to discuss grant monitoring issues.

More Department Involvement through Cooperative Agreements
Cooperative agreements require grantees to outline more information about their projects at the
beginning of the grant, which helps the Department to better plan its involvement in each grant.
After awarding the 2010 cohort as both cooperative agreements and discretionary grants, OII
decided to award the 2011 cohort as cooperative agreements. According to OII senior staff, this
shift was intended to signal to grantees that there would be a higher level of OII involvement.
Because OII awarded the 2011 cohort as cooperative agreements, OII was able to require all
grantees to submit management plans. Management plans provide measurable steps that
grantees will take to accomplish grant objectives. The measurable steps in management plans
allow program officers to more easily track grantees’ progress toward accomplishing grant
objectives and more quickly identify issues that need to be addressed. According to OII senior
staff, all future i3 grants will be awarded as cooperative agreements.

The Department Did Not Hold i3 Grantees Accountable When They Did Not Respond or
Did Not Respond Timely to Department Requests
We determined that the Department did not hold i3 grantees accountable when they did not
respond or did not respond timely to Department requests. During our review of the 2010 and

6
  By contrast, in our review of Congressional earmarks (I13H0004), issued in 2007, we found that program officers
in the Fund for the Improvement of Education were responsible for monitoring over 100 earmark projects. As a
result, program officers were unable to dedicate significant time to each grantee.

2011 cohorts’ grant files, we found that grantees did not always respond or respond timely to
Department requests to resolve identified issues. In these instances, the Department did not
impose any consequences on the grantees.

Requests for Documentation
We identified 3 grantees out of our sample of 25 (one Scale-up grantee and two Validation
grantees) that did not respond or did not respond timely to Department requests for the grantees
to revise or submit documentation.

    •    The program officers responsible for monitoring the Scale-up grant awarded to Teach for
          America (awarded $50,000,000 in the 2010 cohort) identified many weaknesses in the
         grantee’s 2010-2011 management plan. 7 For example, the program officers noted that
         information provided in the “Expected Results” section of the management plan appeared
         to be activities rather than benchmarks that could be used to determine when Teach for
         America was achieving outcomes. While the program officers informed Teach for
         America that it should be more specific about the milestones and intermediate indicators
         of grantee performance, the Department did not require any Scale-up grantees to correct
         management plan weaknesses in the first year of the i3 program. Teach for America’s
         management plan for year two was submitted more than 4 months after it was due. The
         program officers identified similar weaknesses in the year two management plan that had
         not been addressed at the time of our file review. Teach for America was also not timely
         in resolving issues related to its budget that were identified by the program officers.

    •    We also found that two Validation grantees, Johns Hopkins University (awarded
         $30,000,000 in the 2010 cohort) and North Carolina New Schools Project (awarded
         $14,999,802 in the 2011 cohort), did not respond or did not respond timely to Department
         requests for documentation related to their evaluations. In August 2011 the program
         officer responsible for monitoring Johns Hopkins University expressed concern that the
         grantee had not submitted the evaluation plan despite requests from the Abt technical
         assistance liaison. Johns Hopkins University did not provide the documentation until
         almost 2 months after the Department expressed concern about the ongoing delays. At
          the time of our review, the North Carolina New Schools Project still had not submitted
          its evaluation plan or design, a delay of over 2 months. Given the significance of the
          evaluation in the i3 program, being non-responsive in this area could cause significant
          problems with developing an effective evaluation design and with achieving the grant
          outcomes.

Annual Performance Report Deficiencies
We identified 2 grantees out of our sample of 25 (one Scale-up grantee and one Development
grantee) that were not timely in responding to program officers’ requests to resolve significant
issues identified through the program officers’ reviews of the grantees’ Annual Performance


7
  A 2008 OIG audit (A02H0003) found that Teach for America could not provide adequate supporting
documentation for $774,944 of its discretionary grant expenditures because it lacked sound fiscal accountability
controls. Based on this audit and subsequent Department site visits, OII concluded that special conditions would be
imposed on Teach for America’s i3 award. Special conditions are imposed on a grant award by the Department if a
program official determines that, without the special conditions, the grantee might not be successful in implementing
its project or projects. Teach for America’s special conditions include limitations on its ability to draw funds and a
requirement that it make significant progress in addressing the weaknesses in its financial management system and
related internal controls.

Reports (APR). Delays in resolving issues identified in the APR can negatively impact the
grantees’ ability to achieve the agreed-upon grant results.

   •   In January 2012, the program officers responsible for monitoring the same Scale-up grant
       awarded to Teach for America requested that Teach for America clarify information
       provided in the APR’s budget section and executive summary, as well as in a project
       status chart included with the APR. Teach for America did not begin providing
       information in response to the program officers’ January 2012 request until April 2012
       and did not provide answers to all of the program officers’ questions until June 2012.

   •   In January 2012, the program officer responsible for monitoring the Development grant
       awarded to Bay State Reading Institute (awarded $4,997,492 in the 2010 cohort)
       identified and began seeking information on budget issues and project results. Bay State
       Reading Institute did not provide the additional information until May 2012.

While the Department can impose special conditions on grantees to address administrative and
programmatic issues, special conditions are generally reserved for what the Department
considers severe issues. When program officers need additional documentation, identify
problems with a grant, or question what a grant is funding, they need timely, responsive answers
from grantees. Additionally, a lack of grantee responsiveness results in program officers spending
unnecessary time following up on the issues they have raised.

Recommendation 1.1
We recommend that the Assistant Deputy Secretary for Innovation and Improvement develop
appropriate requirements or consequences for i3 grantees that do not respond or do not respond
timely to Department requests.

Potential Risks to the Department’s Ability to Adequately Monitor i3 Grantees in the
Future
We identified two potential risks to the Department’s ability to adequately monitor i3 grantees in
the future. Specifically, if the workload of program officers increases or if the technical
assistance for the evaluation component is no longer available, there could be a negative impact
on the Department’s ability to monitor i3 grantees.

Increasing Workload
Should funding for the i3 program continue, the workload of program officers could increase as
new grant competitions and awards are added to their existing workload. An OII senior official
stated that the current workload of eight grants on average per program officer is close to the
limit of what they can handle. OII estimates that most program officers will be assigned one new
grant as a result of the Fiscal Year (FY) 2012 competition, though a few program officers with
smaller workloads will be assigned four additional grants. If the number of grants awarded for
the FY 2013 competition is similar to the FY 2012 competition, the workload for program
officers will increase further.

Though senior officials in OII are aware of the workload issue, they do not have a clear plan to
address the potential increase in program officers’ workload. OII is, however, taking steps that
may help program officers accomplish their monitoring duties more efficiently. These efforts
include: implementing the Grantee Records and Assistance Database System (GRADS 360),
participating in discussions with program offices using the Grant Electronic Monitoring System
(GEMS) to identify functions from GRADS 360 and GEMS for future inclusion in the G5 grants

database, and hiring a technical assistance contractor to assist grantees with programmatic
issues. 8 OII informed us that it conducts an assessment of each program officer’s workload after
each competition and also examines the competition process to identify potential efficiencies and
improvements. Since the Department is currently implementing these steps to assist program
officers, we could not draw any conclusions on their effectiveness.

Recommendation 2.1
We recommend that the Assistant Deputy Secretary for Innovation and Improvement continue to
monitor any increase in program officers’ workload to ensure adequate monitoring.

Loss of Technical Assistance for the Evaluation Component
Technical assistance for the evaluation component is an important part of the Department’s
ability to ensure the success of the i3 program. OII program officers do not have the expertise to
appropriately review and approve evaluation designs or to address evaluation problems, and they
may soon experience an increased workload. Because program officers lack technical expertise
in evaluation methods, the Department relies heavily on a technical assistance contractor to work
with grantees and their independent evaluators to revise evaluation designs and address
evaluation problems as they occur.

The evaluation component is an additional tool for OII’s monitoring of the i3 grantees because
the data resulting from the project evaluations can be used to determine whether the project
interventions (and the i3 program in its entirety) are producing measurable and desirable outcomes,
and whether they are worth the financial investment on the part of taxpayers. OII structured the
i3 grants to emphasize the importance of evidence and evaluation. The evaluation component of
the i3 program is complex and requires collaboration between OII, IES, i3 grantees, i3 grantees’
independent evaluators, and the technical assistance contractor. The evaluation component also
requires significant expenditures by grantees to contract with independent evaluators, and by the
Department to contract with a technical assistance contractor.

The funding of the technical assistance contract for the i3 evaluation component has a history of
uncertainty. The Department was able to fund the technical assistance contract for the 2010
cohort of grantees, but experienced delays in obtaining contractor coverage for the 2011 cohort
of grantees. For the 2010 cohort, IES funds were used to cover the cost of the technical
assistance contract; however, IES did not have the funds to cover technical assistance for the
2011 cohort. The FY 2012 appropriation for the i3 program designated the program as eligible
for national activities funds, which OII can use to support technical assistance and evaluation
activities. 9 The Department used these funds to add the 2011 cohort to the existing technical
assistance contract. Because of the delay in finding available funds, the 2011 cohort did not have
coverage from the contract during the first six months of the projects. OII staff informed us that
the 2011 grantees were at a disadvantage compared to the 2010 grantees that were able to receive
technical assistance during the early stages of their evaluations. The Department plans to use
national activities funds to support the technical assistance contract in the future, but the
availability of these funds is not guaranteed.

8
  GRADS 360 is a project management system used by OII and IES. GEMS is a project management system used
by the Department’s Office of Postsecondary Education and Office of Elementary and Secondary Education. G5 is
the Department’s official central grants management database.
9
  National activities funds are funds designated in a program’s budget that are reserved from a program's annual
appropriation for certain general purposes, as authorized by Congress. For i3, the Secretary was authorized to
reserve up to five percent for technical assistance and evaluation until the expiration of the authority on December
31, 2012.

Delays in technical assistance for future cohorts of grantees may cause delays in evaluation
implementation if grantees and their independent evaluators must wait several months into the
project period to receive technical assistance. Without technical assistance there is an increased
likelihood that flaws in evaluation designs and implementation problems may not be identified
by OII staff because program officers lack technical expertise in evaluation methods.

Recommendation 2.2
We recommend that the Assistant Deputy Secretary for Innovation and Improvement ensure that
an evaluation technical assistance contractor is available for future cohorts of grantees for the full
project period, or find an equivalent alternative for technical assistance.

                                       OTHER MATTERS

In addition to the issues identified above, we found that the i3 grant files did not always include
documentation showing resolution of issues identified by program officers. Program officers
stated that resolution may not always be documented in the file when issues are resolved over the
phone and noted that they do not file every e-mail.

During our file review, we contacted the program officers with specific questions regarding
missing documentation in their assigned grant files. Program officers were generally able to
provide appropriate documentation and/or reasonable justification. Maintaining up-to-date grant
files with documentation indicating when and how problems were resolved is important to
ensure that files are complete and program officers are following all issues to resolution.

                                 DEPARTMENT COMMENTS

On December 3, 2012, we provided OII with a copy of our draft inspection report for comment.
We received OII’s comments on December 21, 2012. OII concurred in part with our finding that
the Department did not hold i3 grantees accountable when they did not respond or did not
respond timely to Department requests. We have summarized OII’s comments on this issue and
provided our responses below.

OII fully concurred with our finding and recommendations related to potential risks to the
Department’s ability to adequately monitor i3 grantees in the future. OII also provided
comments regarding technical inaccuracies and areas in need of additional clarification. We
have reviewed those comments and made changes to the report for clarity. OII’s response, in its
entirety, is attached.

OII Comments

OII agreed that in some instances i3 grantees may not have responded in a timely manner, but
did not concur with the draft inspection report’s assessment of the significance and extent of the
delay. OII stated that i3 program officers are in frequent contact with their grantees to obtain
timely and comprehensive responses. OII also stated that it had previously provided OIG with
information on the examples cited under this finding that gave appropriate context for the
significance and extent of the delays in program officers receiving documentation from i3
grantees.

OII stated that although Teach for America did not provide all of the responses to the program
officers’ follow-up questions on the APR until June 2012, Teach for America provided
information periodically and the open questions were not considered significant issues. OII also
stated that because Teach for America is on cost reimbursement, OII disagrees that Teach for
America is not held accountable for delays in its responses.

However, OII stated that it appreciates the recommendation and agrees that the i3 program’s
monitoring can be improved with a more consistent and standard approach for addressing delays
in grantees’ responses to program officers. OII also provided information on corrective actions
that will include a policy for how program officers will respond to delays in communication with
grantees.

OIG Response

For the examples cited in this report, we determined that the significance of the issues combined
with the extent of the delays could negatively impact those grants. In the case of Johns Hopkins
University, the grantee did not provide the requested documentation until nearly two months
after the program officer expressed concern about the ongoing delay. In the case of Teach for
America’s APR, the grantee did not begin providing information in response to the program
officers’ January 2012 request until April 2012. We have included this information in the
finding to provide additional clarity. As for the significance of the issues, the program officers
questioned Teach for America about changes to its approved budget, changes to its performance
targets without prior Department approval, and its risk mitigation strategies to reach targets it
had not met during the first year.

Additionally, cost reimbursement is one of the limitations the Department has placed on Teach
for America as part of the special conditions that were based on a 2008 OIG audit (A02H0003)
and subsequent Department site visits. We found no indication in the grant file that there is a
connection between Teach for America’s cost reimbursement status and the delays cited in this
report.

We agree that developing a more consistent and standard approach for addressing delays in
grantees’ responses to program officers would help OII improve its monitoring of the i3
program.

                     OBJECTIVE, SCOPE, AND METHODOLOGY

The objective of our inspection was to evaluate the Department’s monitoring of i3 program grant
recipients.

We began our fieldwork on April 16, 2012, and conducted an exit conference on October 18,
2012.

The scope of our review included all 72 i3 grants awarded in FY 2010 and FY 2011. To evaluate
the Department’s monitoring of i3 program grant recipients, we reviewed a stratified random
sample of 25 i3 recipient grant files. To develop our sample, first we stratified recipients based
on characteristics of their i3 grant. Each recipient had exactly one i3 grant which fell into one of
the following categories: FY 2010 Scale-up grants, FY 2010 Validation grants, FY 2010
Development grants, FY 2011 Scale-up grants, FY 2011 Validation grants, and FY 2011

Development grants. Next, we randomly selected up to 5 grants from each of the 6 categories to
review a total of 25 recipient grant files. The random sample included:

   •   5 Scale-up grants (all cooperative agreements, representing all Scale-up grants awarded
       in FY 2010 and FY 2011);
   •   10 Validation grants (5 cooperative agreements, 5 discretionary grants); and
   •   10 Development grants (5 cooperative agreements, 5 discretionary grants).
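
To illustrate the selection approach described above, the following Python sketch is a hypothetical example (not a tool used in this inspection) of drawing up to 5 grants at random from each of the 6 categories, using the award counts from the Background section. Because only 4 Scale-up grants were awarded in FY 2010 and 1 in FY 2011, every Scale-up grant is necessarily selected, which is why the sample includes 100 percent of the Scale-up grants.

    import random

    # Hypothetical illustration of the stratified selection described above.
    # Award counts per category are taken from the table in the Background section.
    strata = {
        "FY 2010 Scale-up": 4,
        "FY 2010 Validation": 15,
        "FY 2010 Development": 30,
        "FY 2011 Scale-up": 1,
        "FY 2011 Validation": 5,
        "FY 2011 Development": 17,
    }

    def select_sample(strata, per_stratum=5, seed=None):
        """Randomly select up to `per_stratum` grants from each category."""
        rng = random.Random(seed)
        sample = {}
        for category, count in strata.items():
            grants = [f"{category} grant {i + 1}" for i in range(count)]
            sample[category] = rng.sample(grants, min(per_stratum, count))
        return sample

    if __name__ == "__main__":
        sample = select_sample(strata, seed=2012)
        print(sum(len(g) for g in sample.values()), "grant files selected")  # prints 25
        for category, grants in sample.items():
            print(f"{category}: {len(grants)} of {strata[category]} selected")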

We reviewed the complete FY 2010 and FY 2011 i3 grant files for these 25 grant recipients. We
specifically reviewed for evidence of the Department’s monitoring, including correspondence
between program officers and grant recipients, grantee performance reports, and financial
information. We also reviewed Abt’s Performance Work Statement, OII and Oi3 Monitoring
Plans, the Department’s Handbook for the Discretionary Grant Process, the 2010 Recovery Plan,
and applicable laws and regulations.

We interviewed OII senior officials, the Director of Oi3, program officers responsible for
monitoring i3 program grant recipients, and other Department staff.

Our inspection was performed in accordance with the Council of the Inspectors General on
Integrity and Efficiency’s “Quality Standards for Inspection and Evaluation” as appropriate to
the scope of the inspection described above.

     Anyone knowing of fraud, waste, or abuse involving
      U.S. Department of Education funds or programs
  should call, write, or e-mail the Office of Inspector General.

                           Call toll-free:
                    The Inspector General Hotline
                 1-800-MISUSED (1-800-647-8733)

                              Or write:
                      Inspector General Hotline
                    U.S. Department of Education
                     Office of Inspector General
                          550 12th St. S.W.
                       Washington, DC 20024

                              Or e-mail:
                          oig.hotline@ed.gov

    Your report may be made anonymously or in confidence.

For information on identity theft prevention for students and schools,
  visit the Office of Inspector General Identity Theft Web site at:
                        www.ed.gov/misused




      The Department of Education’s mission is to promote
student achievement and preparation for global competitiveness
 by fostering educational excellence and ensuring equal access.

                         www.ed.gov
                                                              Attachment 1




               Acronyms/Abbreviations Used in this Report

Abt          Abt Associates

APR          Annual Performance Report

ARRA         American Recovery and Reinvestment Act of 2009

Department   U.S. Department of Education

FY           Fiscal Year

GEMS         Grant Electronic Monitoring System

GRADS 360    Grantee Records and Assistance Database System

i3           Investing in Innovation Fund

IES          Institute of Education Sciences

Oi3          Office of Investing in Innovation

OII          Office of Innovation and Improvement

OIG          Office of Inspector General
                                                                                     Attachment 2


                      UNITED STATES DEPARTMENT OF EDUCATION
                               OFFICE OF INNOVATION AND IMPROVEMENT




                                               December 20, 2012



Chris Vierling, Director
Evaluations and Inspections
U.S. Department of Education
Office of Inspector General
400 Maryland Avenue, SW
Washington, DC 20202-1500

Subject: Written Comments to the Draft Inspection Report, “The Department’s Monitoring of
Investing in Innovation Program Grant Recipients” (ED-OIG/I13M0001, December 2012).

Dear Mr. Vierling:

The Office of Innovation and Improvement (OII) appreciates the opportunity to provide written
comments and proposed corrective actions to the Draft Inspection Report, ED-OIG/I13M0001,
received on December 3, 2012 (Draft Inspection Report).

As with all the programs we administer, OII strives to continuously improve the operations of the
Investing in Innovation (i3) Fund. Since its inception, the i3 program has used tools such as the
cooperative agreement and monthly calls to support our providing quality monitoring and
technical assistance to i3 grantees. However, OII recognizes that these monitoring techniques
are resource-intensive and agrees that it is important for the i3 program to continue to seek out
opportunities to improve monitoring efficiency and effectiveness.

OII’s responses to this draft report are detailed below, organized by finding and
recommendation, and include Corrective Actions to address the findings and recommendations.


Sincerely,

/s/

James H. Shelton, III
Assistant Deputy Secretary for Innovation and Improvement


FINDING 1 – We determined that the Department did not hold i3 grantees accountable when
they did not respond or did not respond timely to Department requests.

RECOMMENDATION 1.1 – We recommend that the Assistant Deputy Secretary for
Innovation and Improvement develop appropriate requirements or consequences for i3 grantees
that do not respond or do not respond timely to Department requests.

OII Comments and Corrective Actions:

OII Comments – OII concurs with these findings and recommendations in part. That is, OII
agrees that in some instances i3 grantees may not have responded in a timely manner, but does
not concur with the Draft Inspection Report’s assessment of the significance and extent of the
delay.

As noted in the responses OII provided previously (e-mail from Ayesha Edwards to K.C. Jones
and Melanie Wilmer on October 26, 2012), the i3 program officers are in frequent contact with
their grantees in order to obtain timely and comprehensive responses. There are situations in
which grantees may not respond according to established timeframes, but that does not mean that
they have not communicated with their program officer about the delay or are not trying to
provide the requested materials. For example, although it is correct that Teach for America did
not provide all of the responses to the program officers’ follow-up questions on the annual
performance report until June 2012 (page 6 of the Draft Inspection Report), Teach for America
provided information periodically (as documented in the official grant file) and the open
questions were not considered significant issues that would impact the performance of the grant.
Further, as Teach for America is on cost reimbursement, OII disagrees with the statement that
this particular grantee is not held accountable for delays with their responses. OII’s October 26,
2012 e-mail included information on other examples cited under this finding in the Draft
Inspection Report that provided appropriate context for the significance and extent of the delays
in program officers receiving documentation from i3 grantees.

While OII is willing to allow additional time to grantees in order to ultimately obtain better work
products from the grantees and considers this approach to be reasonable, we appreciate the
recommendation provided in the Draft Inspection Report and agree that the i3 program’s
monitoring can be improved with a more consistent and standard approach for addressing delays
in grantees’ responses to program officers.

OII Corrective Actions – The i3 program will create a policy for how program officers will
respond to any delays in communication with the grantees. The policy will include different
stages based on the length of the delay. OII will complete a draft of the policy by March 1,
2013; however, some initial thoughts are set out below.

   •   Program officers will include a specific deadline or timeline for follow-up in their initial
       communications with the grantee.
   •   If a complete response is not received by the deadline, the program officer will e-mail the
       grantee on that day to remind the grantee of the deadline and ask whether the grantee
       would like to submit a request and justification for an extension.
   •   If the grantee does not respond to the follow-up communication within five business
       days, the program officer will call the grantee to ask that the response be provided within
        two business days. The program officer will follow up on this call with an e-mail.
   •   If the grantee does not provide a response, the program officer will meet with the i3
       Director to discuss the significance of the delay on the monitoring and implementation of
       the project and the appropriate consequence, including a determination of whether a
       corrective action plan will be required.
   •   When the situation involves significant financial issues, OII will place grantees on route
       payment to ensure their draw downs are in keeping with their activities and progress.
   •   If delays in communication with the Department continue to be an issue with a grantee,
       OII will discuss the option of placing special conditions on the grant with the i3 program
       attorney, such as additional reporting requirements or putting the grant on cost
       reimbursement.

FINDING 2 – We identified two potential risks to the Department's ability to adequately
monitor i3 grantees in the future. Specifically, if the workload of program officers increases or if
the technical assistance for the evaluation component is no longer available, there could be a
negative impact on the Department's ability to monitor i3 grantees.

RECOMMENDATION 2.1 – We recommend that the Assistant Deputy Secretary for
Innovation and Improvement continue to monitor any increase in program officers' workload to
ensure adequate monitoring.

RECOMMENDATION 2.2 – We recommend that the Assistant Deputy Secretary for
Innovation and Improvement ensure that an evaluation technical assistance contractor is
available for future cohorts of grantees for the full project period, or find an equivalent
alternative for technical assistance.

OII Comments and Corrective Actions:

OII Comments – OII concurs with these findings and recommendations. With regards to
program officers’ workloads, OII is committed to ensuring adequate monitoring of the i3 grants
and conducts an assessment of the i3 competition process and each program officer’s workload
after each competition is completed. We examine the competition process to determine how
efficient and effective the process was, and how it could be improved. OII conducts similar
discussions regarding grant monitoring and oversight, and strives to improve internal systems
that will assuage program officers’ workloads. OII will continue this process to ensure staff
workload is monitored regularly. On a related topic, the degree to which i3 funds are used to
support activities or initiatives other than funding new awards will also influence the number of
grants. As the i3 program anticipates Congress appropriating funds for national activities and
Advanced Research Projects Agency-Education (ARPA-ED) in FY 2013, OII expects i3
program officers will only be assigned one or two additional grants in the coming year.

With regards to the evaluation technical assistance, OII agrees that evaluation is an essential
component of i3 grants and plans to continue providing technical assistance to grantees on their
evaluations. OII has included the continuation costs for this contract in its FY 2013 spending
plan and will continue to request national activities funding to support it. As noted in OII’s
October 26, 2012 e-mail, we do not know the extent to which the delay in providing evaluation
technical assistance disadvantaged the 2011 i3 cohort. Although it is possible that the delay may
have created some challenges, the i3 program took steps to mitigate such risks by providing the
evaluation technical assistance tools to the 2011 i3 cohort and created a tool for program officers
to use when reviewing the revised design plans. Moreover, the most recent information provided
by the evaluation technical assistance contractor indicates that 78 percent of the 2011 i3 cohort’s
evaluation designs have the potential to meet What Works Clearinghouse Evidence Standards.

OII Corrective Actions – As noted above, the i3 program will assess staff workload and the
resources available for evaluation technical assistance annually.

Other Comments:

OII appreciates the opportunity to review the Draft Inspection Report and provides the following
comments regarding technical inaccuracies or areas that may need additional clarification:

   •   Footnote 2 (page 2) states that both the 2010 and 2011 i3 cohorts were awarded on a
       Federal fiscal year basis. Although this is correct for the 2010 i3 grants, it is not correct
       for the 2011 or 2012 i3 grants as they were awarded on a calendar year basis.
   •   Page 6 refers to OII’s efforts to coordinate with the Office of the Chief Financial Officer
       (OCFO) and other program offices using the Grant Electronic Monitoring System
       (GEMS) to identify functions that might be included in the Department’s G5 system. It is
       correct that OII is participating in such discussions. However, while the i3 program will
       likely benefit from these discussions, OII wants to clarify that the i3 program is not
       leading these efforts and is not currently using GEMS.
   •   Page 8 indicates that 25 grant files were selected randomly. OII questions whether the
       sample was entirely random as it included 100 percent of the Scale-up grants. The higher
       percentage of Scale-up and Validation grant files (as compared to the percentage of
       Development grant files) included in the review suggests that grant size may have been
       taken into consideration when selecting which files to review. If any judgment was used
        in the selection, OII recommends that the basis for that judgment be addressed in the final
       report so that the approach and scope of the review are clearly explained.