
Teacher Incentive Fund Stakeholder Support and Planning Period Oversight

Published by the Department of Education, Office of Inspector General on 2013-02-08.


UNITED STATES DEPARTMENT OF EDUCATION
OFFICE OF INSPECTOR GENERAL

Control Number: ED-OIG/A19L0005

February 8, 2013

Deborah S. Delisle
Assistant Secretary
Office of Elementary and Secondary Education
U.S. Department of Education
400 Maryland Avenue, S.W.
Washington, DC 20202-4300

Dear Ms. Delisle:

This final audit report, titled Teacher Incentive Fund Stakeholder Support and Planning Period
Oversight, presents the results of our audit. The objectives of our audit were to
(1) review and assess the adequacy of the Department of Education’s (Department) review and
evaluation processes to ensure that funded applications demonstrated the involvement and
support of teachers, principals, other personnel, and unions necessary to carry out program
activities; and (2) review and assess the Department’s monitoring plans for funded applicants
proposing a planning period, and determine if the Department’s monitoring efforts ensured that
applicants developed the lacking core element(s) and mitigated related performance risk.




                                                      BACKGROUND



The Teacher Incentive Fund (TIF) grant program was established to support efforts to develop
and implement performance-based compensation systems (PBCS) for teachers and principals in
high-need schools. Its goals include:

•   Improving student achievement by increasing teacher and principal effectiveness;
•   Reforming teacher and principal compensation systems so that teachers and principals are
    rewarded for increases in student achievement;
•   Increasing the number of effective teachers teaching poor, minority, and disadvantaged
    students in hard-to-staff subjects; and
•   Creating sustainable performance-based compensation systems.



The Department of Education's mission is to promote student achievement and preparation for global
competitiveness by fostering educational excellence and ensuring equal access.
Final Audit Report
ED-OIG/A19L0005                                                                                     Page 2 of 21

The program was created by Congress as part of the Appropriations Law for the Departments of
Labor, Health and Human Services, Education, and Related Agencies for the fiscal year (FY)
ending September 30, 2006, and was initially funded at $100 million. An additional
$395 million was made available for the TIF program in FYs 2008 and 2009, including
$200 million in funding through the American Recovery and Reinvestment Act of 2009
(ARRA). These funds were intended to be used to provide continuation awards for grants
awarded in FYs 2006 and 2007 and, in the case of ARRA funding, also combined with
$400 million in FY 2010 appropriations to conduct a new grant competition. This new
competition resulted in the Department awarding 62 TIF grants in September 2010, with award
amounts ranging from $190,000 to $21 million.1

The FY 2010 TIF competition differed from previous TIF competitions in a few notable ways.
Among these was a requirement that each applicant demonstrate that it had in place five “core
elements,” including:

       a. a communications plan describing the components of its PBCS;
       b. the involvement and support of teachers, principals, and other personnel and the
          involvement and support of unions, when and where applicable;
       c. rigorous, transparent, and fair evaluation systems for teachers and principals that
          differentiate effectiveness using multiple rating categories and that take into account data
          on student growth as a significant evaluation factor;
       d. a data-management system linking student achievement data to teacher and principal
          payroll and human resources systems; and
       e. a plan for ensuring that teachers and principals understand the specific measures of
          effectiveness included in the PBCS and receive related professional development.

Applicants that did not have all five core elements in place could request a planning period of up
to 1 year, during which time they would be permitted to use TIF funds to develop any lacking
core elements, but were prohibited from making incentive payments to teachers, principals, and
other personnel. Determining whether applicants demonstrated that they had the core
elements in place was the responsibility of non-Federal reviewers, with panels of three reviewing
each application, and Department staff, who conducted summary assessments of reviewer
comments. Reviewers were also required to assign an overall “Project Design” score and judge
fiscal sustainability. At the time of award, 8 of the 62 FY 2010 TIF grantees (13 percent) were
determined to have demonstrated that all five core elements were in place and were subsequently
deemed ready for implementation, meaning that they could begin making incentive payments
during year 1. The remaining 54 grantees (87 percent) were placed into a planning period of up
to 1 year to allow for development of lacking core elements.


                                                            
1 The FY 2010 TIF competition consisted of two separate competitions: a “Main” competition and an
“Evaluation” competition. The competition was structured in this manner to enable the Department to meet an
ARRA requirement that it conduct a “rigorous national evaluation... using randomized controlled methodology to
the extent feasible, that assesses the impact of performance-based teacher and principal compensation systems...
on teacher and principal recruitment and retention in high-need schools and subjects.” Of the 62 TIF grants
awarded in FY 2010, 50 (81 percent) were for Main competition grantees and 12 (19 percent) were for Evaluation
competition grantees.

The Department developed and implemented a monitoring plan as a framework for the
evaluation of TIF recipient performance, to include the use of numerous monitoring tools such as
annual performance reports, site visits, periodic telephone and email contact, and review of
relevant data in G5, the Department’s grants management system. The Department also awarded
two contracts for technical assistance and monitoring for FY 2010 TIF grantees.2




                                                               AUDIT RESULTS



Overall, we found weaknesses in the Department’s processes for reviewing and evaluating
applications with regard to the involvement and support of stakeholders. We also concluded that
improvements are needed in its process for monitoring TIF planning period grantees.

With regard to objective one, we found that the Department’s application review process for the
FY 2010 TIF competition did not ensure that funded applications demonstrated the involvement
and support of stakeholders. Specifically, we found that non-Federal reviewers accepted as
adequate varying levels of quantitative and qualitative evidence of the support of teachers,
principals, other personnel, and unions and lacked clarity in their overall determinations. We
also found that Department staff interpreted reviewer comments to indicate that applications
demonstrated adequate stakeholder support when in fact such conclusions did not appear to be
clearly supported by the reviewers’ comments. As a result, the Department increased its risk of
providing funding to grantees that did not adequately demonstrate the involvement and support
of stakeholders, which increases the risk that a grantee will face significant challenges in meeting
its project objectives.

With regard to objective two, we found that improvements are needed in the Department’s
monitoring process for TIF planning period grantees. Specifically, we found that monitoring
activities related to the development of core elements were inadequate for 13 of 14 TIF planning
period grantees (93 percent) randomly selected for review. We noted that the Department did
not begin to monitor grantees’ progress toward the development of lacking core elements until
almost 6 months after awards were made, and that activities subsequent to this time were both
insufficient and inconsistent, to include ensuring that grantees’ projects progressed in relation to
established timelines. We found that 7 of the 14 planning period grantees (50 percent) sampled
during this audit had still not fully developed one or more core elements at the end of the
planning period. Overall, we noted that 28 of the 54 planning period grantees (52 percent) from
the FY 2010 TIF competition were not ready for implementation after year 1. These grantees
were subsequently placed into “implementation with conditions” status, where they were not
able to make any incentive compensation payouts until all core elements were successfully
developed, similar to the terms under which they operated during the 1-year planning period.
                                                            
2 One contractor was tasked with providing technical assistance and monitoring for all FY 2010 TIF grantees,
while the other was tasked with focusing solely on Evaluation grantees. The first contract is administered by the
Office of Elementary and Secondary Education (OESE) and the second contract is administered by the Institute of
Education Sciences (IES).

These 28 grantees received approximately $177 million of the $364 million (49 percent) initially
awarded to planning period grantees.      

Lastly, we followed up on an issue noted in a prior TIF audit report3 in which we observed that
the allowance of a planning period could create inconsistencies in standards applied to
applicants. During our current audit, we learned that non-Federal reviewers were instructed not
to deduct points for applicants requiring a planning period, but, rather, to evaluate all
applications against the same selection criteria, regardless of whether the applicant indicated that
it was ready for implementation or requested a planning period. Our analysis of reviewer
comment forms for 24 TIF grantees found that there was no correlation between an applicant’s
Project Design score and its status as either an implementation or planning period grantee, with
many applicants that still needed to develop core elements scoring the same or higher than those
that already had them in place.

In its response to the draft audit report, the Department disagreed with both of the findings and
one of the four recommendations. The Department stated that it did not agree that one standard
for demonstrating stakeholder support should be set and applied to all applications. It cited two
major concerns with this contention, specifically that it fails to consider the different stages of
readiness between applicants and that it runs counter to the purpose of the peer review process,
which is to provide an objective evaluation of the grant application. The Department also noted
that it disagreed that monitoring of planning period grantees was inadequate, stating it believed
there was a high level of interaction with grantees in the early implementation stages of the
grants.

The Department expressed that, as with all of the programs it administers, it strives to
continuously improve the operation of the TIF program. It stated that it has used the information
obtained from the implementation of TIF Cohorts 1 and 2 to inform the competition held in
FY 2010 and the implementation of Cohort 3 (the focus of this audit report). Building upon the
lessons learned in Cohort 3, the Department noted that it made more refinements to the program
as illustrated in the TIF competition held in FY 2012 and the implementation of the Cohort 4
grants.

In response to the Department’s comments to the draft audit report, we modified
recommendation 1.1 to recognize that requirements for demonstrating stakeholder support and
other essential elements need not be limited to the Notice of Final Priorities (NFP) and Notice
Inviting Applications (NIA), but might also be conveyed via nonregulatory guidance. We also
slightly modified Finding No. 1 regarding the types of applications that we reviewed. We did
not make any other changes to the findings or related recommendations. The Department’s
comments are summarized at the end of each applicable finding. The full text of the
Department’s response is included as Attachment 2 to this report.




                                                            
3 “Department’s Implementation of the Teacher Incentive Fund Grant Program” (ED-OIG/A19I0007), issued
December 30, 2011.

FINDING NO. 1 – The Department’s Application Review Process Did Not
                Adequately Ensure that Funded Applicants Demonstrated the
                Involvement and Support of Stakeholders

We found that the Department’s application review process for the FY 2010 TIF competition did
not adequately ensure that funded applicants demonstrated the involvement and support of
stakeholders. Specifically, we found that non-Federal reviewers accepted as adequate varying
levels of quantitative and qualitative evidence of the support of teachers, principals, other
personnel, and unions and lacked clarity in their overall determinations. We also found that, in
some cases, Department staff interpreted reviewer comments to indicate that applicants had
demonstrated adequate stakeholder support when, in fact, such conclusions did not appear to be
clearly supported by the reviewers’ comments.

As part of the review process, each non-Federal reviewer was tasked with providing comments
on the evidence provided by the applicant to demonstrate stakeholder support, or “Core Element
B,” as defined by the Department in its May 21, 2010, NFP for the TIF program. Reviewers
were also required to comment on the strengths and weaknesses of the applicant’s overall Project
Design, part of which entailed consideration of the extent to which the proposed project had the
involvement and support of stakeholders. The application review process also included
completion, by Department staff, of a checklist that was intended to summarize the non-Federal
reviewers’ conclusions and provide support for the Department’s decision on whether a grantee
was categorized as being ready for implementation or placed into a planning period.

For the purposes of this finding, we only reviewed the applications submitted by the eight
grantees that were identified as being ready for implementation. Based on our review of these
applications, we found that non-Federal reviewers accepted differing levels of evidence of
stakeholder support as adequate. For example, we noted that one funded applicant provided a
signed letter from principals at all of the schools where it planned to implement a PBCS, as well
as survey results showing teachers’ agreement with the proposed PBCS. However, another
funded applicant provided only a general statement in its application that the local education
agency (LEA) had the support of teachers, principals, and other personnel for the project with no
further evidence in the grantee’s application to support this statement. For three of the eight
implementation grantees (38 percent), we found that the reviewers varied in their feedback on
applicants’ demonstration of this particular core element, at times contradicting one another.
Further, we noted that reviewer comments sometimes lacked clarity with regard to an overall
determination of adequacy. Specifically:

•   In one case, two of three reviewers agreed that the application contained adequate
    stakeholder support; however, one reviewer clearly stated in his comments that the
    applicant did not provide such support. Specifically, this reviewer noted that a weakness
    of the project design was that no evidence was provided indicating the support of teachers
    and principals.
•   In another case, none of the three reviewers reached a clear conclusion regarding the
    evidence of stakeholder support. In their analysis of the project design, each of the
    reviewers noted weaknesses regarding the evidence of stakeholder support, but did not
    specifically state that support was inadequate.
•   In the third case, one of three reviewers noted that the application demonstrated adequate
    stakeholder support; however, the remaining two reviewers did not reach a clear
    conclusion in this area, and appeared to indicate that the applicant still needed to obtain
    support. For example, one of these two reviewers stated, “As the district forges ahead
    with this initiative, engagement and support from teachers, principals, and the unions in
    participating district will be sought out.” This would indicate that the grantee did not
    demonstrate stakeholder support in its application.

We also noted that the Department’s assessment of the applications for two of the eight grantees
(25 percent) did not always directly correlate with the reviewer assessments. In both cases,
while at least two of the three non-Federal reviewers were unclear in their assessments of the
adequacy of stakeholder support, describing both strengths and weaknesses, the Department
interpreted their comments to indicate that the core element had been met. We subsequently
noted that one of the grantees was moved from implementation to planning in the months
immediately following award.

We determined that several weaknesses in the FY 2010 TIF award process contributed to the
issues noted above. Specifically, we noted that the Department did not define for applicants
minimum supporting documentation requirements relating to stakeholder support prior to award,
or provide non-Federal reviewers with explicit standards against which to assess the level of
support provided. The TIF Peer Reviewer Training presentation, dated July 19, 2010, instructed
reviewers as to the program’s requirements, but we found nothing to indicate that the
Department provided guidance to reviewers specifically with regard to evidence that was
necessary to demonstrate adequate stakeholder support. Rather, we were told that reviewers
were expected to use their professional judgment. The Department’s Handbook for the
Discretionary Grant Process (Handbook), Section 1.3, requires that the Department establish
clear policies that enable consistent policy interpretation and implementation on grant
administration issues.

According to Department officials, the Department was unable to define supporting
documentation requirements for stakeholder support. Department officials noted that because
the competition included both implementation and planning period awards, they could not
require a minimum level of evidence for stakeholder support since those that were requesting
planning periods would not be able to fulfill the minimum requirements. Further, Department
officials noted that they could not provide the application reviewers with minimum standards for
evidence of stakeholder support since the NFP and NIA did not establish such requirements.
According to the same officials, this would not have been fair to the applicants, as the reviewers
would have been required to assess their applications against a standard that was not conveyed in
the applicable guidance documents for prospective applicants.

We also noted that while the Department required reviewers to provide comments pertaining to
each of the core elements, it did not require that reviewers assign a numeric score to each or
otherwise state, explicitly, whether the core element was met. Applicants could receive up to 60
points on overall Project Design, a separate part of the application; however, reviewers were not
required to score the selection criteria individually, which included the extent to which the
proposed PBCS met the required core elements.

In May 2011, the Department completed and disseminated guidance that clearly described what
would qualify as adequate stakeholder support for annual performance reporting (APR)
purposes. According to Department officials, the guidance was developed to provide grantees
with examples of acceptable evidence to demonstrate stakeholder support, in addition to the
other four TIF core elements, and to ensure that program officers were consistent in their reviews
of grantees’ Core Element Submissions.4 The results of these reviews were used by the
Department in July 2011 to determine whether planning period grantees were ready for
implementation.

We applied the guidance to the applications of implementation grantees to determine whether the
grantees would have adequately demonstrated the involvement and support of stakeholders at the
time of their initial application for funding under the TIF program if the guidance had been in
effect at that time. Based on our review, we found that five of the eight grantees (63 percent) did
not appear to provide acceptable evidence of stakeholder support.

Weaknesses identified with regard to the TIF application review process increased the risk of the
Department providing funding to grantees that did not adequately demonstrate the involvement
and support of stakeholders, to include teachers, principals, and unions. Such a situation also
increases the risk that a grantee will face significant challenges in meeting its project objectives.
The Department acknowledged as much in the TIF NFP, wherein it stated that, “… because the
creation of a PBCS directly affects employee compensation… the Department believes that
cooperation from and agreement with local union representatives, where a union is a
representative in collective bargaining, is essential to successful implementation of a PBCS.”
We noted that two of the eight implementation grantees (25 percent) were changed to planning
period grantees in the months immediately following award because the Department determined
that the grantees were lacking stakeholder support and were not ready for implementation. We
found that neither of these grantees had resolved issues pertaining to stakeholder support by the
end of year 1 and, consequently, had conditions placed on their grants in Fall 2011.

In comments provided in response to the preliminary findings presented at our exit conference,
the Department stated that it is committed to continuous improvement, but is also confident that
the TIF application review process was appropriate and properly implemented the purposes of an
objective peer review process consistent with the published requirements and criteria. The
Department noted that qualified non-Federal reviewers were encouraged to use professional
judgment so that the Department had the benefit of their individual, independent, and objective
judgment in reviewing applications. The Department further noted that the use of reviewers in
this manner is especially important in complex and innovative programs like the TIF program
and that while it provides some training to reviewers, the intent is not to substitute Department
judgment for that of the reviewers. The Department also recognized that there can be
disagreements and differences in the views of reviewers and that the review process is not a
process in which the reviewers reach consensus. Specifically, the Department stated that it
generally directs reviewers not to seek consensus, but, rather, to give the Department their best
individual professional judgment as informed by the requirements, the criteria, and discussion

                                                            
4 Core Element Submissions were required to be submitted with the APR by all planning period grantees to
demonstrate that they met all five core elements and that their projects were ready for full implementation.

among the reviewers. As a result, the Department expected that there might be variations in
these individuals’ views.

We recognize that professional judgment is an integral part of the grant application review and
evaluation processes and, as such, did not find issue with the Department’s reliance on
non-Federal reviewers’ professional judgment as a general matter of practice in the FY 2010 TIF
competition. We are also aware that reviewers were encouraged to discuss areas where there
were substantial scoring differences; however, stakeholder support was not an area that received
an individual score. Instead, as previously noted, it factored into an overall Project Design score.
Differences in opinions on stakeholder support would most likely not have made a significant
difference in that score and therefore would not have generated any discussion on the noted
differences in reviewer comments. We note the importance of requiring that reviewer comments
and conclusions be clearly noted, and contradictions and ambiguities in reviewer judgment
resolved, particularly with regard to elements that are considered key to the success of a grant,
and in the case of the FY 2010 TIF applicants, determination of readiness for implementation.

During our review, we noted that the Department has taken steps that may help to improve some
of the weaknesses identified with regard to its TIF application review and evaluation processes.
Specifically, the NIA for the FY 2012 TIF competition, dated June 14, 2012, indicates that non-
Federal reviewers will now be required to provide comments and itemized scores pertaining to
selection criteria that closely mirror the core elements from the FY 2010 TIF competition. This
includes an assessment of the quality of educator involvement in the development and
implementation of, and support for, the proposed PBCS and educator evaluation systems
described in each application.   

Recommendations

We recommend that the Assistant Secretary for OESE

1.1     Ensure that requirements for demonstrating stakeholder support and other essential
        elements are adequately defined during the grant competition planning process; included
        in the NFP, NIA, or, at a minimum, addressed in the Department’s TIF Frequently
        Asked Questions (FAQs) or similar nonregulatory guidance; and made available to all
        applicants.

1.2     Develop specific guidance on the review of stakeholder support and other essential
        elements for both non-Federal reviewers and Department staff and communicate
        expectations in advance of the application review process so that resulting conclusions
        are clearly stated and supported by the evidence.

Department Comments

The Department disagreed with Finding No. 1, stating that the draft report implies that one
standard for demonstrating stakeholder support and other essential elements that is clearly
defined should be set and applied to all applications, and that this should be clearly stated and
communicated in the NFP and NIA. The Department specifically cited two major concerns

with this contention. First, it fails to consider the different stages of readiness between
applicants. The Department noted that because applicants could propose a 1-year planning
period, they were in varying stages of implementation and some were further along than others
in securing stakeholder support. It noted that it is therefore reasonable to expect and allow for
varying levels of evidence of stakeholder support. Second, the implications of a set standard for
stakeholder support run counter to the purpose of the peer review process, which is to provide
an objective evaluation of the grant application. The Department stated that to help ensure
objectivity, it utilizes peer reviewers with a wide range of backgrounds, viewpoints, and
expertise to read the TIF applications and to apply their professional judgment in reviewing the
content and quality of applications. The Department further noted that when reviewers disagree
as to what is or is not sufficient quality, this inevitably leads to in-depth conversations about an
application and its strengths and weaknesses and results in valuable objective review and
feedback to the Department and the applicant on the merits of its proposal.

The Department disagreed with recommendation 1.1 since it did not agree that one standard
should be set and applied to all applications; however, it did agree that more examples and
guidance could be given to applicants and reviewers through FAQs and technical assistance
workshops. For the FY 2012 competition, the Department stated it issued several FAQs to
assist applicants in understanding the type of evidence that could be submitted to respond to
selection criterion. The Department agreed with recommendation 1.2, noting that the FAQs
referenced above were shared with non-Federal peer reviewers and incorporated into the peer
review training.
 
OIG Response

We disagree that the finding implies that one standard should be set and applied to all
applications. The finding specifically states that we found that the Department did not define for
applicants minimum supporting documentation requirements prior to award, similar to what it
subsequently provided to grantees for guidance and used in its review of Core Element
Submissions to determine whether core elements were met. With regard to the specific concerns
cited by the Department, a key point to note is that this finding is based on work that was focused
only on those grantees that were deemed by the Department to be ready for implementation,
meaning all core elements, to include stakeholder support, were supposed to be already
developed and in place. In response to the Department’s concern that the finding fails to
consider the different stages of readiness between applicants (i.e., planning period versus
implementation), we slightly modified the report to emphasize the fact that this finding is based
solely on the review of the eight applications deemed ready for implementation.

With regard to the Department’s other major concern surrounding the purpose of the peer review
process, as previously stated in the finding, we recognize that professional judgment is an
integral part of the grant application review and evaluation processes and, as such, did not find
issue with the Department’s reliance on non-Federal reviewers’ professional judgment as a
general matter of practice in the FY 2010 TIF competition. We are also aware that reviewers
were encouraged to discuss areas where there were substantial scoring differences; however,
stakeholder support was not an area that received an individual score. Instead, it factored into an
overall Project Design score. Differences in opinions on stakeholder support would most likely
not have made a significant difference in that score and therefore would not have generated any
discussion on the noted differences in reviewer comments. This was of particular concern in
instances where we noted contradictions in reviewer feedback or a lack of clarity with regard to
overall determination of adequacy on an element considered key to the success of a grant
subsequently selected for funding.

After considering the Department’s comments, we modified recommendation 1.1 to recognize
that requirements for demonstrating stakeholder support and other essential elements need not be
limited to the NFP and NIA, but might also be conveyed via nonregulatory guidance. As
previously noted, we also slightly modified the finding regarding the types of applications that
we reviewed. We did not make any other changes to the finding or related recommendations.

FINDING NO. 2 – Improvements are Needed in the Department’s Monitoring
                Process for TIF Planning Period Grantees

We determined that improvements are needed in the Department’s monitoring process for TIF
planning period grantees. Specifically, we found that monitoring activities related to the
development of core elements were inadequate for 13 of 14 TIF planning period grantees
(93 percent) selected for review. This determination was based on the lack of documentation of
such activities in the months immediately following award, as well as the overall quality of
monitoring activities and communications between the Department and grantees. We noted that
the Department did not begin to monitor grantees’ progress toward the development of lacking
core elements until almost 6 months after awards were made, and that activities subsequent to
this time were both insufficient and inconsistent, to include ensuring that grantees’ projects
progressed in relation to established timelines.

Documentation of Monitoring Activities

Based on our review of documentation in the Department’s Grants Electronic Monitoring System
(GEMS), we found evidence that the Department held post award performance conferences,
within 1 to 2 months of award, with only 8 of the 14 selected planning period grantees
(57 percent). Section 5.2 of the Handbook states that this is the point at which monitoring
begins, with the purpose of the conference being to establish a mutual understanding of expected
outcomes, measures for assessing progress and results, the monitoring process, statutory or
regulatory requirements, and potential budget issues or concerns. We followed up with program
officers for the six grantees where documentation of a post award performance conference was
lacking, but did not receive any additional information to substantiate that such activities ever
occurred. According to program officers, when conferences were held, they focused primarily
on grantees’ budgets and did not always include a discussion related to lacking core elements.

We found that monitoring of grantees’ progress toward the development of core elements did not
begin in earnest until March 2011, approximately 6 months after the start of the planning period.
During this time, we noted an increase in the number and frequency of core element-related
emails and meeting notes between the Department and grantees. When we discussed the
monitoring process with TIF program officers, they acknowledged that monitoring activities
related specifically to lacking core elements did not begin until this time. We found that the
Department’s monitoring efforts increased further with the issuance of Core Element Submission
Guidance and grantees’ submission of APRs, which occurred between May and July 2011,
8 to 10 months into the planning period.5 It was not until this time that the Department appeared to
clearly identify lacking core elements for each grantee. We noted that correspondence related to
lacking core elements increased overall following the Core Element Submissions in July 2011.
However, we found that 2 of the 14 grants (14 percent) still had very few or no documented
instances of communication related to any lacking core elements, and 7 of the 14 grants
(50 percent) had several documented instances of communication related to some lacking core
elements but very few or no documented instances related to other lacking core elements.

Quality of Monitoring Activities

While reviewing documentation of monitoring activities, we assessed the quality of program
officer communications with grantees. In general, for the 14 grants included in our review, we
noted that the communications did not show evidence of substantive monitoring, to include
discussion of needed technical assistance or status updates on lacking core elements. We noted
only one instance of a program officer requesting an update on a grantee’s project during the
planning period. Grantees that requested a planning period were required to include as part of
their application a plan with related milestones for implementing the lacking core elements
during the planning period. However, it does not appear that these timelines were used to
monitor the progress of grantee activities. While we found that all of the program officers were
aware of the timelines, just two stated that they used them in monitoring grantee progress.
According to another program officer, she did not use the timelines because they changed
throughout the year.

Office of Management and Budget (OMB) Circular A-123 “Management’s Responsibility for
Internal Control,” states that management has a fundamental responsibility to develop and
maintain effective internal control. This includes ensuring that Federal programs operate and
Federal resources are used efficiently and effectively to achieve desired objectives, with minimal
potential for waste, fraud and mismanagement. Section 5.3 of the Handbook states that regular
monitoring enables the Department to provide customized technical assistance, appropriate
feedback, and follow-up to help grantees improve areas of need, identify project strengths, and
recognize significant achievements. The Handbook also states that a grantee’s project must
progress against previously established performance measures, and requires the Department to
develop suitable monitoring tools that are designed to assess the extent to which projects are
meeting established program goals, objectives, and performance measures.

We found that issues with regard to monitoring likely occurred due to a combination of
management turnover and insufficient planning and communication both prior to and early in the
planning period. Specifically, we found that there were significant changes within OESE’s
Academic Improvement and Teacher Quality (AITQ) group. The reassignment of the AITQ
Director and Assistant Director and departure of the TIF Team Lead resulted in a lack of
management oversight of monitoring activities for planning period grantees at the time of award
and at least through January 2011, when the positions were filled.6 According to four program
officers with whom we spoke, expectations with regard to monitoring the grantees’ progress in
developing lacking core elements were not clearly communicated or defined by AITQ
management at the onset of the planning period. Further, two program officers acknowledged
that, at the time of award, they were not aware of which core elements were lacking for grantees
that required a planning period. This made it difficult, if not impossible, to focus monitoring
efforts on lacking core elements during the planning period.

5 As noted in Finding No. 1, the Core Element Submission Guidance specified suitable evidence for demonstrating completion of each core element.

We found that issues with planning period monitoring activities were further exacerbated by
inadequate planning prior to the award of grants under the FY 2010 TIF competition.
Specifically, the Department’s FY 2011 TIF Monitoring Plan did not provide any detailed
guidance on monitoring activities, to include distinguishing between monitoring activities for
planning period and implementation grantees. For example, Section IV(B) – Post-Award
Technical Assistance states that, “…staff will concentrate monitoring efforts on Cohort 3,
especially those grantees with a planning year.” Section III(B) – Performance Problems states

              The Cohort 3 grantees with planning years will need to be closely evaluated to
              determine whether they are ready to begin implementing incentives at the end of
              their planning period. Program staff is working to create appropriate criteria for
              making such determinations.

As previously mentioned in Finding No. 1, detailed guidance to assist program staff with
assessment of grantee core element progress was not completed until May 2011.

We compared the TIF Monitoring Plan for Cohort 3 to the TIF Monitoring Plan for Cohorts 1
and 2. We noted that the two monitoring plans were very similar even though it would have
been reasonable to anticipate an increase in program risk with the inclusion of a planning period.
According to a Department official, the Department typically uses the prior year’s monitoring
plan as a starting point for the development of the current year’s monitoring plan. The
Department official stated that it is not uncommon for monitoring plans to lack detail, especially
in a program with new requirements, such as a planning period. In this case, the lack of detail
within the monitoring plan appeared to hinder the Department’s ability to monitor planning
period grantees.

Based on our review of grant documentation for 14 selected planning period grantees, it appears
that actual monitoring of grantees and development of core element guidance occurred too far
into the planning period to have any noticeable impact on grantee performance, leaving the
grantees themselves largely responsible for driving whether they were successful in developing
lacking core elements and reaching implementation at the end of the planning period.

                                                            
6 We noted in a prior audit of the Department’s implementation of the State Fiscal Stabilization Fund program (ED-OIG/A19J0001) that, during this time period, AITQ staff were devoting a significant amount of time to work on programs funded under the American Recovery and Reinvestment Act of 2009 – a situation that we reported at the time could impact the Department’s ability to effectively manage existing programs if not carefully monitored.

We found that 7 of the 14 planning period grantees (50 percent) reviewed during this audit had
still not fully developed one or more core elements at the end of the planning period.

Overall, we noted that 28 of the 54 planning period grantees (52 percent) from the FY 2010 TIF
competition were not ready for implementation after year 1. These grantees received
approximately $177 million of the $364 million (49 percent) initially awarded to planning period
grantees. Of these 28 grantees, 27 (96 percent) lacked a teacher and/or principal evaluation
system (Core Element C); 22 (79 percent) still lacked stakeholder support (Core Element B); and
18 (64 percent) lacked a plan for communicating effectiveness measures to affected employees
and providing professional development (Core Element E). Communication plans describing the
planned PBCS (Core Element A) as well as related data management systems (Core Element D)
were also still not fully developed by 25 and 29 percent of the grantees, respectively. These 28
grantees were subsequently placed into “implementation with conditions” status, where they
were not able to make any incentive compensation payouts until all core elements were
successfully developed, similar to the terms under which they operated during the 1-year
planning period.

We noted that the Department’s NFP for the 2012 TIF competition does not include a provision
for a planning period. When we discussed this change with Department officials, they did not
provide a specific reason for discontinuing use of the planning period. However, in a response
provided to the House Committee on Education and the Workforce in July 2012, the Department
acknowledged that a number of planning period grantees were not able to meet all core element
requirements, and that implementation was delayed. The Department further stated that for the
FY 2012 competition, applicants will be required to provide additional information regarding
their level of readiness, educator involvement, and support that will allow reviewers to evaluate
the strength of their applications.

In comments provided in response to the preliminary findings presented at our exit conference,
the Department stated that it is committed to continuous improvement, but believes that a
reasonable level of monitoring and technical assistance was provided, and that there may be
differences in views about what constitutes “monitoring.” The Department noted that the
program office, in conjunction with a contractor, had a significant amount of involvement with
planning period grantees in the first 6 months following award, which the Department asserts
continued throughout all stages of the planning period. The Department also noted that limited
resources constrained on-site visits but stated that program officers were able to monitor
projects successfully and effectively given the enhanced capabilities and role of technology and
the use of the technical assistance contractor.

We considered the Department’s use of a technical assistance and monitoring contractor and its
activities pertaining to FY 2010 TIF grantees as part of our audit. We found that the
Performance Work Statement (PWS) for the primary technical assistance contractor, dated
April 7, 2010, required that the contractor submit a monthly report of technical assistance
provided and monitoring activities conducted during the prior month. According to most TIF
program officers with whom we spoke, however, these reports were not used for the purpose of
monitoring since they were outdated by the time they were received. The Contracting Officer’s
Representative also noted that the reports were for the sole purpose of invoicing and were not
intended to assist with monitoring planning period grantees. Further, the monitoring activities
conducted by the contractor were generally administrative in nature, such as taking meeting
notes and scheduling meetings, rather than monitoring grantees’ progress toward developing
lacking core elements. Lastly, the PWS does not place responsibility with the contractor for
ensuring that grantees are held accountable for progress toward developing any lacking core
elements; this function ultimately lies with the Department.

We recognize that technical assistance is integral to ensuring that grantees have the resources
necessary to work through challenges associated with planning and implementation and achieve
program results. We also agree that the contractor did provide such assistance, along with TIF
staff, in an effort to encourage project success. However, we view technical assistance and
monitoring as distinct activities along a continuum and maintain that systematic and regular
emphasis on the latter is invaluable to ensuring that grantees are held accountable for meeting
established timelines, performance measures, and goals. This is especially important when
grantees are operating within constraints, such as a planning period limited to 1 year. The
Handbook recognizes the importance of monitoring as well, stating that while it is the
Department’s policy to treat grantees as partners and provide both programmatic and financial
technical assistance, this partnership should not interfere with the Department’s and grantees’
roles in ensuring that there is appropriate fiscal accountability and program administration and
implementation.

Recommendations

We recommend that the Assistant Secretary for OESE

2.1 	         Develop a formal monitoring plan for the TIF program that includes specific monitoring
              tools and processes related to the unique programmatic risks associated with grantees that
              have not yet successfully developed required core elements.

2.2 	         Ensure that program officers have an understanding of the program’s monitoring plan,
              expectations for monitoring, and all available tools.

Department Comments

The Department disagreed with Finding No. 2, stating that the draft report implied that no
monitoring occurred between the awarding of the grants and April 2011. The Department noted
that TIF staff and its technical assistance providers were in regular contact with grantees from
the earliest stages of the grant, to include working together on budget revisions in the fall of
2010 and discussing the results of needs assessments in February 2011. The Department also
noted that grantees that were participating in the National Evaluation7 received extensive
technical assistance and monitoring to ensure they were making progress on meeting the core
elements. This information was provided to IES by the evaluation contractor, who in turn
shared it with the program officer in OESE.


                                                            
7 See footnote 1 on page 2.

Despite its disagreement with the finding, the Department agreed with both recommendations.
The Department stated that it supports the development of a monitoring plan for FY 2012 TIF
grantees that is focused on addressing potential implementation issues or concerns, particularly
for grantees that are not yet in their full implementation stages. The Department noted that
there was no formal planning period for these grantees; however, they have the option of
phasing in their performance-based compensation system. Many grantees are spending the first
year of their grants involved in planning activities. To ensure they are ready to implement in at
least a subset of schools in Year 2, the Department noted it has intensified the technical
assistance and monitoring of these grantees during this period. Further, the Department stated
that program officers for FY 2010 TIF grantees are trained on the monitoring protocol and
processes and that staff are in the process of developing the monitoring protocol for FY 2012
TIF grantees. According to the Department, the program officers for FY 2012 TIF grantees
have collected and are using the submitted timelines to gauge progress in implementation and
are also reviewing technical assistance visit reports and call notes submitted by the contractor.
The Department stated it has set expectations for monitoring that all staff understand and
observe.
 
OIG Response

The Department has not presented any additional information beyond what was already
considered by the OIG in developing the draft report. Our position remains that monitoring
specifically related to planning period grantees’ development of core elements was not adequate
and did not begin in earnest until April 2011. We agree that program officers conducted budget
discussions during the early part of the planning period, which we noted in the draft report;
however, we did not find evidence that core elements were discussed, which is the focus of this
finding due to the importance of the core elements to the success of the overall grant. Further,
while the needs assessments had various domains that overlapped with the core elements, they
were not completed until February 2011, nearly 4 months into the planning year. Two program
officers noted that needs assessments were not used for monitoring, and two other program
officers noted that needs assessments were used solely for informational purposes and not for
tracking or following up on noted weaknesses.

With regard to the National Evaluation, the Contracting Officer’s Representative for the IES
evaluation contract stated that she did not monitor these grantees, noting that this activity should
have been the responsibility of OESE. We noted that the OESE grant files for the National
Evaluation grantees contained the least amount of formal monitoring activity, including but not
limited to, communications with IES on the progress of the planning period grantees in
developing lacking core elements.

After considering the Department’s comments, we did not make any changes to the finding or
related recommendations.



                                                               OTHER MATTER



The Department’s Implementation of the FY 2010 TIF Competition Created
Inconsistencies in Evaluation Standards

In a prior audit report pertaining to the Department’s implementation of the TIF program,8 we
noted that the allowance of a planning period creates inconsistencies in standards applied to
applicants. During our current audit, we found evidence that substantiated our previous
concerns. Under the FY 2010 TIF competition, applicants were to demonstrate in their
application that they had all of the required core elements for the program in place. In the event
that all core elements had not been developed at the time of application, applicants were required
to provide a plan for how they would establish the lacking core elements during a planning
period of up to 1 year.

As previously noted, Project Design represented 60 of 100 total possible points (60 percent) that
an applicant could earn on its application under the FY 2010 TIF competition. It was scored
based on the quality of the design of the proposed project, which included reviewers’
consideration of the extent to which the proposed PBCS met the required core elements. Of the
remaining 40 points that an applicant could earn, 10 were based on the need for the project, 25
were based on the adequacy of support for the project, and 5 were based on the quality of the
local project evaluation. We reasoned, at least initially, that applicants that were ready for
implementation could be expected to score higher on the Project Design component of the
competition than those that still needed time to develop lacking core elements.

However, we learned that non-Federal reviewers were instructed not to deduct points for
applicants requiring a planning period. Reviewers were required to evaluate all applications
against the same selection criteria, regardless of whether the applicant indicated that it was ready
for implementation or requested a planning period. Our analysis of reviewer comment forms for
24 TIF grantees found that there was no correlation between an applicant’s Project Design score
and its status as either an implementation or planning period grantee. In essence, it was possible
for applicants that still needed to develop core elements for their project to score the same or
higher than those that already had them in place. Specifically, we found that three of the five
highest-scoring applicants (60 percent) were classified as planning period grantees while the
other two (40 percent) were placed directly into implementation. See Table 1 below.




                                                            
8 See footnote 3 on page 4.

Table 1: Project Design Scores of Top-Rated TIF Grantees

Grantee   Status           Reviewer 1   Reviewer 2   Reviewer 3   Average Project Design Score
A         Implementation       60           58           59                  59
B         Planning             60           55           60                  58
C         Planning             55           60           60                  58
D         Planning             58           58           58                  58
E         Implementation       55           57           55                  56


During our review, we noted that the Department administers another grant program, Promise
Neighborhoods, that is similar to the TIF program with respect to the allowance of a planning
period. We determined that Promise Neighborhoods differs from TIF, however, in that
applications are submitted and evaluated under two separate competitions: one for
implementation grantees and another for planning grantees. Planning grants are for 1 year only
and are meant to support eligible organizations in conducting needs assessments, establishing
partnerships, and developing feasible plans with the potential to significantly improve the
educational and developmental outcomes of children and youth in a neighborhood.
Implementation grants are for 3 to 5 years and provide Federal support to eligible organizations
that have demonstrated both a sound strategy and the capacity to implement a continuum of
solutions with the potential to “scale up” what works. Promise Neighborhoods applicants must
decide which competition they wish to participate in and structure their application accordingly.
Successful planning grantees are eligible to compete for implementation grants in subsequent
years, but there is no guarantee of award.

We noted that the Department’s recently posted NFP for the FY 2012 TIF competition does not
include a provision for a planning period. However, if such a provision is reincorporated in the
future, or discussed as a potential feature of another grant program, we suggest that the
Department consider holding separate competitions for planning and implementation grants.
Doing so would help prevent inconsistencies in the evaluation of substantively different types of
applications. It would also end Federal funding for those planning period grantees that have still
not successfully developed a program at the end of the 1-year grant period. [See Finding No. 2
for details on TIF grantees that did not successfully complete their planning period.]

In comments provided in response to the preliminary findings presented at our exit conference,
the Department noted its appreciation of our discussion regarding the structuring of grant
programs with planning periods, such as TIF and Promise Neighborhoods, but submitted that
reasonable judgments were made in designing and implementing what it views as an innovative
and dynamic program. The Department stated that it has a great deal of discretion, consistent
with applicable requirements, to design competitions in the best manner possible to ensure an
objective and fair competition. It also noted that there are many reasonable considerations that
can be taken into account in planning a program, in evaluating the effectiveness of a program,
and in considering ways to maximize the positive results of the investments of grants in a
promising and successful program. The Department further noted that the competitions in the
Promise Neighborhoods program and in the FY 2010 TIF competition were based on applicable
requirements and reasonable policy decisions and took into account such factors as the type of
competition, the type of criteria, the nature of the program, and the relative experience of the
potential applicants in planning and implementing this type of program. Lastly, the Department
stated that these determinations must be made on a case-by-case basis, and that what works for
one program may not work for another.

While we acknowledge that the Department has significant discretion in designing grant program
competitions, and that each program is unique and should be viewed as such, we continue to
believe that our suggestion remains valid, and that careful consideration should be given to the
effects of combining fundamentally different types of applicants for evaluation against the same
selection criteria. Holding separate competitions for planning grantees and implementation
grantees can help ensure greater equity in program administration. It can also mitigate the
potential that certain grantees will continue to receive funding without having completed the
development of an implementable program within the timeframes dictated under the terms of the
grant award, as was the case with more than half of the planning period grantees under the
FY 2010 TIF competition.

Department Comments

The Department disagreed with the Other Matter, but noted its appreciation of our suggestion
that it consider holding separate competitions for planning and implementation grants. The
Department stated that it does not believe it has the authority to implement this approach under
the current statute for the TIF program. Specifically, the language for the TIF statute requires
that funds be used, “… to develop and implement performance-based compensation systems for
teachers, principals, and other personnel in high need schools.” The Department stated that a
separate, stand-alone, planning grant cannot be used to plan a performance-based compensation
system without a scope of work that also includes the implementation of the system.

The Department also discussed advantages and disadvantages of conducting separate
competitions for planning grants. The Department stated that one of the considerations with
planning grants is the extent to which potential applicants will invest time and resources to
submit a robust planning grant without knowing that funds will be available for implementing
the planned activities. It added that planning to implement a comprehensive evaluation and
performance-based compensation system is not an easy feat. To take on that work without the
clear possibility that the applicant will have the opportunity for implementation is a risk that
many applicants may not wish to take.

OIG Response

We agree that the TIF statute, unlike the statute under which Promise Neighborhoods operates,
does not appear to allow for separate competitions for planning and implementation grants.
However, we would note that Senate Committee Reports on appropriations bills for FYs 2010
and 2011 urged the Department to award grants for short-term planning for the development of
performance-based compensation systems as well as for implementation. Although such reports
are nonbinding, particularly in cases when the bills themselves fail to pass, it does appear that
some consideration has been given to holding separate competitions. As a result, we would
encourage the Department to consider working with Congress to enact changes to legislation that
would allow for separate competitions, if deemed necessary, to help prevent inconsistencies in
the evaluation of substantively different types of applications.

While we acknowledge the Department’s concern that potential grantees may not want to submit
applications for planning grants without assurance of a subsequent award for implementation, we
believe that entities truly interested in, and committed to, implementing a performance-based
compensation system would not be discouraged from applying. In such cases, planning and
implementation would occur regardless of the funding source.




                 OBJECTIVES, SCOPE, AND METHODOLOGY 



The objectives of our audit were to (1) review and assess the adequacy of the Department’s
review and evaluation processes to ensure that funded applications demonstrated the
involvement and support of teachers, principals, other personnel, and unions necessary to carry
out program activities; and (2) review and assess the Department’s monitoring plans for funded
applicants proposing a planning period, and determine if the Department’s monitoring efforts
ensured that applicants developed the lacking core element(s) and mitigated related performance
risk.

To accomplish our objectives, we gained an understanding of internal control applicable to the
Department’s administration and oversight of discretionary grant programs. We reviewed
Department policies and procedures, OMB Circular A-123 “Management’s Responsibility for
Internal Control,” and the Government Accountability Office (GAO) “Standards for Internal
Control in the Federal Government.” We reviewed legislation, regulations, and Department
guidance pertaining to the TIF program, in general, and the FY 2010 TIF competition, in
particular. We also reviewed similar information pertaining to the FY 2012 TIF competition. In
addition, to identify potential vulnerabilities, we reviewed prior OIG and GAO audit reports with
relevance to our audit objectives.

We conducted discussions with OESE officials and TIF program officers to obtain an
understanding of the application review and evaluation processes and the Department’s
monitoring of planning period grantees, including the role of its two technical assistance and
monitoring contractors. We also conducted discussions with IES officials to obtain an
understanding of their role in the TIF grant monitoring process.

The scope of our review was limited to the Department’s pre-award, award, and post-award
activities for grants made under the FY 2010 TIF competition. As previously noted, the
Department awarded 62 such grants: 8 (13 percent) were classified as implementation grantees,
receiving a total of $83 million, and the remaining 54 (87 percent) were classified as planning
period grantees, receiving a total of $364 million.

To fulfill our first objective, we reviewed information specific to the TIF application review and
evaluation processes, such as the NIA, and training provided to non-Federal reviewers. We then
reviewed and compared applications and related documentation for all eight implementation
grantees. This included Technical Review Forms completed by non-Federal reviewers, which
contained comments and, in some cases, conclusions regarding the extent to which applicants
met program criteria. It also included TIF Application Checklists completed by OESE staff,
which were intended to summarize reviewer comments and conclusions and to support the
Department’s determination as to whether a grantee was ready for implementation or required a
planning period. Further, during our review, we learned that the Department completed guidance
in May 2011 that was designed to assist TIF recipients in preparing their APR/Core Element
Submissions, as well as inform reviews by OESE staff. This guidance provided examples of
evidence that recipients could submit to show that any previously lacking core elements had been
developed. To determine whether all implementation grantees would have been recognized as
having adequately demonstrated the involvement and support of stakeholders at the time of
award, we retroactively applied this guidance to their applications. Lastly, we reviewed
information on whether these grantees remained in implementation throughout year 1 or were
subsequently placed into planning.

To fulfill our second objective, we reviewed information specific to the TIF monitoring process,
such as the FY 2011 Monitoring Plan, the PWS for the primary technical assistance and
monitoring contractor, and the above-referenced APR/Core Element Submission Guidance. We
randomly selected a sample of 17 of the 54 grantees (32 percent) that were placed into a planning
period. Our sample included $107 million of the $364 million (29 percent) initially awarded to
planning period grantees. For each of these grantees, we reviewed GEMS and other repositories
for documentation pertaining to the Department’s monitoring efforts for planning period
grantees.9 This included applications, Technical Review Forms, TIF Application Checklists,
contractor-completed needs assessments and monthly reports, emails, meeting notes, APRs/Core
Element Submissions, and associated condition and release letters. Our conclusions regarding
the adequacy of monitoring to ensure the development of lacking core elements were based on
both quantitative and qualitative factors, including the timeliness, frequency, amount, and focus
of communications between program officers and grantees.

Use of computer-processed data for the audit was limited to obtaining obligation amounts from
G5, the Department’s grants management system, and establishing the universe of
implementation and planning period grantees for the FY 2010 TIF competition. Because G5 is
the Department’s system of record for grant obligations and the data was used primarily for
informational purposes and did not materially affect our audit findings and resulting conclusions,
we did not assess its reliability.

We conducted fieldwork at Department offices in Washington, D.C., from August 2011 through
August 2012. We provided our audit results to OESE officials during an exit conference
conducted on August 29, 2012.

                                                            
9
  During our review, we determined that 3 of the 17 planning period grantees in our sample were moved into
implementation subsequent to award. Because there would have been no expectation of monitoring with regard to
the development of lacking core elements for these grantees, we removed them from our sample.

We conducted this performance audit in accordance with generally accepted government
auditing standards. Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions
based on our audit objectives. We believe that the evidence obtained provides a reasonable basis
for our findings and conclusions based on our audit objectives.




                            ADMINISTRATIVE MATTERS



Corrective actions proposed (resolution phase) and implemented (closure phase) by your office
will be monitored and tracked through the Department’s Audit Accountability and Resolution
Tracking System (AARTS). Department policy requires that you develop a final Corrective
Action Plan (CAP) for our review in the automated system within 30 days of the issuance of this
report. The CAP should set forth the specific action items and targeted completion dates
necessary to implement final corrective actions on the findings and recommendations contained
in this final audit report.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector
General is required to report to Congress twice a year on the audits that remain unresolved after
6 months from the date of issuance.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the Office
of Inspector General are available to members of the press and general public to the extent
information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given us during this review. If you have any questions, please
call Michele Weaver-Dugan at (202) 245-6941.

                                             Sincerely, 



                                             Patrick J. Howard /s/ 

                                             Assistant Inspector General for Audit
 
                                              
                                                                       Attachment 1

             Acronyms/Abbreviations/Short Forms Used in this Report

AARTS            Audit Accountability and Resolution Tracking System

AITQ             Academic Improvement and Teacher Quality

APR              Annual Performance Report

CAP              Corrective Action Plan

Department       U.S. Department of Education

FAQ              Frequently Asked Questions

FY               Fiscal Year

GAO              Government Accountability Office

GEMS             Grants Electronic Monitoring System

Handbook         Handbook for the Discretionary Grant Process

IES              Institute of Education Sciences

LEA              Local Education Agency

NFP              Notice of Final Priorities

NIA              Notice Inviting Applications

OESE             Office of Elementary and Secondary Education

OIG              Office of Inspector General

OMB              Office of Management and Budget

PBCS             Performance-Based Compensation System

PWS              Performance Work Statement

TIF              Teacher Incentive Fund

                         
                                                                                   Attachment 2

                       WRITTEN COMMENTS

    FROM THE OFFICE OF ELEMENTARY AND SECONDARY EDUCATION 

             IN RESPONSE TO THE DRAFT AUDIT REPORT, 

  TEACHER INCENTIVE FUND STAKEHOLDER SUPPORT AND PLANNING, ED-
                           OIG/A19L0005 

                              [Date] 


The Office of Elementary and Secondary Education (OESE) appreciates the opportunity to
provide written comments to the Draft Audit Report, Teacher Incentive Fund Stakeholder
Support and Planning Period Oversight, ED-OIG/A19L0005, provided on November 13, 2012
(Draft Audit Report).

As with all of the programs we administer, OESE strives to continuously improve the operation
of the Teacher Incentive Fund (TIF). We have used the information obtained from the
implementation of TIF Cohorts 1 and 2 to inform the competition held in Fiscal Year (FY) 2010
and the implementation of Cohort 3 (the focus of this draft audit report). Building upon the
lessons learned in Cohort 3, we made more refinements to the program as illustrated in the TIF
competition held in FY 2012 and the implementation of the Cohort 4 grants.

Our comments should help to clarify our ongoing efforts as we strive to ensure that the funds
Congress appropriates to the Department for this program will significantly improve the quality
of teaching and student academic growth.

OESE’s responses to this Draft Audit Report are detailed below and in the Corrective Action
Plan, organized by finding and recommendation. Any subsequent questions, comments, or
concerns should be addressed to:


Sylvia E. Lyles, Ph.D. 

Director, Academic Improvement and Teacher Quality Programs 

U.S. Department of Education
Office of Elementary and Secondary Education
400 Maryland Avenue, SW
Room 3E314
Washington, DC 20202
FINDING NO. 1 – The Department’s Application Review Process Did Not Adequately
Ensure that Funded Applicants Demonstrated the Involvement and Support of
Stakeholders

Recommendation 1.1 – Ensure that requirements for demonstrating stakeholder support
and other essential elements are adequately defined during the grant competition planning
process and included in the NFP and NIA.

Comments. OESE disagrees with Finding 1, and disagrees with Recommendation 1.1. The
Draft Audit Report noted that “… non-Federal reviewers accepted differing levels of quantitative
and qualitative evidence of the support of teachers, principals, other personnel, and unions.” The
report implies that a single, clearly defined standard for demonstrating stakeholder support and
other essential elements should be set and applied to all applications, and that this standard
should be clearly stated and communicated in the Notice of Final Priorities (NFP) and the Notice
Inviting Applications (NIA).

OESE has two major concerns with this contention. First, it fails to consider that applicants to
the FY 2010 TIF competition (TIF 3) were in different stages of readiness. The competition was
intentionally designed to allow applicants at different stages of readiness to apply for TIF
funding. We view this as a positive aspect of the competition because this approach did not limit the
competition to only those applicants that were in the advanced planning stages of reforming
teacher and principal compensation systems. Allowing applicants to apply that were just
beginning to tackle this complex reform provided these applicants the opportunity to move to the
next level.

Because applicants could propose a one-year planning period, they were in varying stages of
implementation and some were further along than others in securing stakeholder support.
Additionally, some applicants had preliminary support (e.g., a letter of support from the teachers’
union), but because the evaluation system and performance-based compensation system had not
been developed, and negotiations had not occurred, the support was essentially premature and
not a real measure of support. Given the varying levels of readiness in creating and
implementing the complex systems and infrastructure required under TIF and the innovative
nature of this reform effort, it is reasonable to expect and allow for varying levels of evidence of
stakeholder support.

Second, the implication of a set standard for stakeholder support runs counter to the purpose of the
peer review process. To help ensure objectivity, we utilize peer reviewers with a wide range of
backgrounds, viewpoints, and expertise to read the TIF applications and to apply their
professional judgment in reviewing the content and quality of applications. We provide standard
training and guidance to the peer reviewers on the scoring rubric used to assess the quality of the
applications that they are assigned to score. We view these differing opinions of quality not as a
weakness or a flaw of the review process, but as a strength. When reviewers disagree as to what
is or is not sufficient quality, this inevitably leads to in-depth conversations about an application
and its strengths and weaknesses, and results in valuable objective review and feedback to the
Department and the applicant on the merits of its proposal.
As a point of clarification, the Draft Audit Report indicates that, “… the Department stated that it
generally directs reviewers not to seek consensus, but, rather to give the Department their best
individual professional judgment as informed by the requirements, the criteria, and discussion
among the reviewers.” This is not accurate. Consistent with the grants policy of the Department,
reviewers are informed that they do not need to reach consensus, but that their comments must
support their scores.

While we do not agree that one standard should be set and applied to all applications and
communicated in the NFP and NIA, we do agree that more examples and guidance could be
given to applicants and reviewers through Frequently Asked Questions (FAQs) and technical
assistance workshops. See our comments under Recommendation 1.2 for more on this issue.


Recommendation 1.2 – Develop specific guidance on the review of stakeholder support and
other essential elements for both non-Federal reviewers and Department staff and
communicate expectations in advance of the application review process so that resulting
conclusions are clearly stated and supported by the evidence.

Comments. OESE disagrees with Finding 1, but agrees with Recommendation 1.2. While we do
not agree that a single standard should be set and articulated to non-Federal reviewers, we do
agree that there is a benefit to providing applicants and reviewers with clear examples and
guidance on addressing some of the application requirements, including stakeholder support.

For the FY 2012 TIF (TIF 4) competition, applicants were required to submit a description of the
extent to which they had secured stakeholder support for their projects, but applicants were not
required to have secured this support prior to submitting their applications or before receiving a
grant award. However, the quality of stakeholder support was a heavily weighted factor in the
selection criteria (25 points). To assist applicants in understanding the type of evidence that
could be submitted to respond to this selection criterion, we issued several FAQs on this issue;
these FAQs were also shared with non-Federal peer reviewers and incorporated into the peer
review training.


FINDING 2 – Improvements are Needed in the Department’s Monitoring Process for TIF
Planning Period Grantees

Recommendation 2.1 – Develop a formal monitoring plan for the TIF program that includes
specific monitoring tools and processes related to the unique programmatic risks associated
with grantees that have not yet successfully developed required core elements.

Comments. OESE disagrees with Finding 2, but agrees with Recommendation 2.1. The Draft
Audit Report notes that monitoring activities for the development of core elements were
inadequate and that this determination was made based on a lack of documentation and the
quality of the monitoring activities and communications with grantees. The report also indicates
that the TIF program officers did not begin monitoring grantees until 6 months into the planning
year. The report implies that no monitoring occurred between the awarding of the grants and
April 2011.

In actuality, and as stated in the Exit Conference, TIF staff and the Department’s technical
assistance providers were in regular contact with the grantees from the earliest stages of the
grant. TIF staff worked with grantees on budget revisions all through the fall while the grantees
were hiring their staffs and operationalizing their projects. In February 2011, the grantees came
to Washington, DC, for a project directors’ meeting, where they formally met one-on-one with
their TA provider and TIF staff to go over the results of the needs assessment that was performed
by the contractor and to discuss other related issues in the early stages of the implementation of
the grant. The needs assessment covered various domains that overlapped with the core
elements. The feedback that grantees received at the project directors’ meeting was extremely
valuable to the grantees in the development of project activities and tasks, and gave the TIF staff
a good sense of the potential implementation issues/challenges with each grantee.

Additionally, the grantees that are participating in the National Evaluation received extensive
technical assistance and support from the contractor managed by the Institute of Education
Sciences (IES). These grantees received specific technical assistance on the core elements and
this information was reported regularly to IES, which, in turn, shared this information with the
program officer. These evaluation grantees, in particular, received intensive technical assistance
and monitoring to ensure that they were making progress on meeting the core elements.

While we believe that there was a high level of interaction with the grantees in the early
implementation stages of the grants, and do not agree with the Finding, OESE does support the
development of a monitoring plan for TIF 4 that is focused on addressing potential
implementation issues or concerns, particularly for grantees that are not yet in their full
implementation stages.

All of the TIF 3 grantees are now implementing their projects and have met the five core
elements. Under the TIF 4 competition, there was no formal planning period and no requirement
that various core elements be in place before grantees could make payouts. However, TIF 4
grantees have the option of phasing in their evaluation system and performance-based
compensation system. Many grantees are spending the first year of their grants involved in
planning activities. To ensure that these grantees are ready to implement in at least a subset of
schools (or for a subset of staff) in Year 2, we have intensified our technical assistance and
monitoring of these grantees during this period.


Recommendation 2.2 – Ensure that program officers have an understanding of the
program’s monitoring plan, expectations for monitoring, and all available tools.

Comments. As noted above, OESE disagrees with Finding 2, but agrees with Recommendation
2.2. We are unsure how the finding supports this recommendation as the Draft Audit Report did
not indicate that program officers were unaware of the TIF monitoring plan or did not have
access to certain tools. We assume that this Recommendation may be related to how program
officers were or were not using the monthly reports submitted by the contractor and the timelines
submitted by the grantees.

While we are unclear how this Recommendation flows from the Finding, we do agree that it is
essential that program officers are involved in and understand the monitoring process. For TIF
3, the monitoring protocol used in Years 1 and 2 – and now Year 3 – was a team effort and is
designed around the five core elements. TIF 3 team members are trained on the monitoring
protocol and process, and in turn are required to make presentations to their grantees at the
Annual Project Directors’ Meeting on the current protocol in use.

The staff is in the process of developing the monitoring protocol and tools for TIF 4 – once
again, it is a team effort. Additionally, the staff is aggressively monitoring the TIF 4 grantees
that are not in full implementation. The team has collected and is using the submitted timelines
to gauge progress in implementation. The team is also reviewing the technical assistance visit
reports and call notes submitted by the contractor. We have set expectations for monitoring that
all staff understand and observe.


OTHER MATTER – The Department’s Implementation of the FY 2010 TIF Competition
Created Inconsistencies in Evaluation Standards

Comments. While OESE appreciates the suggestion that in the future we consider holding
separate competitions for planning and implementation grants, as we discussed in the Exit
Conference, we do not believe that we have the authority to implement this approach under the
current statute for the TIF program. The language of the TIF statute requires that funds be used,
“to develop and implement performance-based compensation systems for teachers, principals,
and other personnel in high-need schools.” A separate, stand-alone planning grant cannot be
used to plan a performance-based compensation system without a scope of work that also
includes the implementation of the system. Because of this, it is not appropriate to consider this
approach.

As we discussed at the Exit Conference, there are advantages and disadvantages of conducting a
separate competition for planning grants. For each competition, the Department considers how
best to structure the grant competition to meet the purposes of the program, obtain applications
of the highest quality, fund applications that may have the best chance of being successful in
accordance with the published criteria, and make the most effective use of taxpayer dollars. As
mentioned in the Exit Conference, one of the considerations with planning grants is the extent to
which potential applicants will invest time and resources to submit a robust planning grant
without knowing that funds will be available for implementing the planned activities. Planning
to implement a comprehensive evaluation and performance-based compensation system is no
easy feat. To take on that work without the clear possibility that the applicant will have the
opportunity for implementation is a risk that many applicants may not wish to take.