
The Department's Implementation of the State Fiscal Stabilization Fund Program.

Published by the Department of Education, Office of Inspector General on 2010-09-24.

Below is a raw (and likely hideous) rendition of the original report. (PDF)

U.S. Department of Education
Office of Inspector General

American Recovery and Reinvestment Act of 2009
The Department’s Implementation of the State Fiscal Stabilization Fund Program

Final Audit Report

ED-OIG/A19J0001                                                        September 2010


                                         UNITED STATES DEPARTMENT OF EDUCATION

                                               OFFICE OF INSPECTOR GENERAL



                                                                      September 24, 2010


Thelma Meléndez de Santa Ana, Ph.D.
Assistant Secretary
Office of Elementary and Secondary Education
U.S. Department of Education
400 Maryland Avenue, S.W.
Washington, DC 20202

Dear Dr. Meléndez de Santa Ana:

This final audit report presents the results of our audit of the U.S. Department of Education’s
implementation of the State Fiscal Stabilization Fund program. We received the Office of
Elementary and Secondary Education’s comments on the contents of our draft report. The
comments are summarized within the Results section of this report.

Corrective actions proposed (resolution phase) and implemented (closure phase) by your office
will be monitored and tracked through the Department’s Audit Accountability and Resolution
Tracking System. Department policy requires that you develop a final corrective
action plan (CAP) for our review in the automated system within 30 days of the issuance of this
report. The CAP should set forth the specific action items, and targeted completion dates,
necessary to implement final corrective actions on the findings and recommendations contained
in this final audit report.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the
Office of Inspector General are available to members of the press and general public to the
extent information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given to us during this review. If you have any questions, please
call Michele Weaver-Dugan at (202) 245-6941.



                                                                            Sincerely,               



                                                                            Keith West /s/       

                                                                            Assistant Inspector General for Audit                             






     The Department’s Implementation of the State Fiscal Stabilization Fund Program
                          Control Number ED-OIG/A19J0001

                                                  PURPOSE

This final report provides the results of our audit of the U.S. Department of Education’s
(Department) implementation of the State Fiscal Stabilization Fund (SFSF) program, a new, one-
time appropriation of $53.6 billion in both formula and discretionary grant funds under the
American Recovery and Reinvestment Act of 2009 (ARRA). In this audit, we focused
exclusively on the formula grant portion of the program, which comprises over 90 percent of the
funding and is referred to as the SFSF or Stabilization program. These funds have the dual goals
of (1) helping to stabilize State and local government budgets in order to minimize and avoid
reductions in education and other essential public services and (2) improving student
achievement by encouraging investments in school improvement and reform.

The objectives of our audit were to:

    1.	 Validate that State allocations were calculated in accordance with statutory requirements;
    2.	 Determine whether applications for initial funding and State plans1 included all required
        information and were appropriately reviewed; and
    3.	 Evaluate the Department’s program staffing and monitoring plans.

                                                  RESULTS

We found that the Department’s initial implementation of the SFSF program was generally
appropriate, as related to our objectives. With respect to the first objective, we determined that
the Department calculated State allocations in accordance with statutory requirements.
Regarding the second objective, we found that sampled funding applications included all
required information and underwent multiple levels of review. However, although the
Department’s process indicates that reviewers verified that all required data and related
information were provided, it does not provide assurance that steps were taken to assess whether
the data were reasonably supported, which could impact the Department’s ability to determine
whether States are complying with maintenance-of-effort (MOE) requirements.

In addressing the third objective, we found that Department officials believed that current staff,
plus planned contractor assistance, would be adequate to manage the SFSF program and monitor
recipients. While it appears that staffing has been adequate during the initial implementation of
the program, we noted that the time required to implement and monitor the SFSF program could
impact the Department’s ability to effectively manage existing programs. With regard to
monitoring, we noted that the Department had completed some efforts intended to ensure
compliance with the law, applicable regulations, and Department guidance on the SFSF program.
We also noted that the Department had completed a formal monitoring plan, our review of which
found it to be adequate, if implemented as described. However, we noted the Department will
rely heavily on contractor support to ensure that States and their subrecipients comply with
applicable Federal requirements and are meeting program goals. Reliance on contractor support
will require effective contract monitoring practices to reduce related performance risk.

1 Because State plans were part of Phase II of the SFSF program and Phase II began after our audit fieldwork was
substantially completed, we did not review State plans as part of this audit.

In its response to the draft audit report, the Office of Elementary and Secondary Education
(OESE) stated that it did not concur with the finding or the recommendations. OESE believed
that the documentation that each State provided in its application was appropriate and sufficient
to address initially the statutory MOE requirement, and that the Department’s process for
reviewing the applications and its oversight provided sufficient safeguards to justify the SFSF
awards. OESE further believed that it would have been impractical to have staff assess the
validity of MOE data in greater detail prior to making SFSF Phase I awards and noted that there
was no requirement in statute, regulation, or policy that program staff verify MOE data prior to
making a grant award.

The Department’s response did not warrant a change to our conclusion that the SFSF Phase I
application review process did not provide assurance that steps were taken to assess whether
MOE data was reasonably supported or to any of the related recommendations. As noted in our
finding, we found that 3 of 16 States (19 percent) included in our sample appeared to have
insufficient or questionable supporting data. As such, we contend that requiring the submission
of supporting documentation for key data fields, especially data that States are already required
to maintain, would provide a greater level of accountability and assist program staff with their
review of applicable information. It would also provide for more focused monitoring efforts and
reduce the need to address issues noted after the funds have already been awarded and likely
spent. OESE’s comments are summarized at the end of the finding. The full text of OESE’s
response is included as Attachment 2 to this report.

                                       BACKGROUND

The newly-created SFSF program received a fiscal year (FY) 2009 appropriation of $53.6 billion
under the ARRA. Of this amount, $48.6 billion was reserved for States in the form of formula
grant funds. ARRA stipulated that 61 percent of a State’s SFSF allocation be based on its
relative share of the population of individuals aged 5 to 24, and 39 percent be based on its
relative share of the total population. The remaining $5 billion is being awarded competitively
under the Race to the Top (RTT) and Investing in What Works and Innovation Fund programs
beginning in mid FY 2010.

The SFSF program is itself composed of two separate “funds”: (1) the Education Fund
(81.8 percent of a State’s total allocation) and (2) the Government Services Fund (the remaining
18.2 percent). Funds provided under the Education Fund must be used to help restore FYs 2009-
2011 support for public elementary, secondary, and postsecondary education to the greater of the
FY 2008 or FY 2009 levels, with any remaining funds awarded directly to local education
agencies (LEAs) on the basis of their relative Title I shares. Funds provided under the
Government Services Fund are less restricted in terms of their allowable uses, but must
nevertheless be used for education, public safety, and other government services.
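
To make the grant arithmetic above concrete, the following is a minimal illustrative sketch (not the Department's allocation methodology) of the statutory 61/39 population weighting and the 81.8/18.2 split between the Education Fund and the Government Services Fund. The population figures are hypothetical placeholders, and statutory reservations (for example, for outlying areas) are not modeled.

    # Illustrative sketch only: SFSF formula-grant arithmetic described above.
    # Population figures are hypothetical placeholders, not actual Census data.

    FORMULA_FUNDS = 48_600_000_000  # $48.6 billion reserved for formula grants

    def state_allocation(pop_5_24, pop_total, us_pop_5_24, us_pop_total):
        """61 percent weighted by share of ages 5-24; 39 percent by share of total population."""
        share = 0.61 * (pop_5_24 / us_pop_5_24) + 0.39 * (pop_total / us_pop_total)
        return FORMULA_FUNDS * share

    def split_allocation(allocation):
        """Split an allocation: 81.8 percent Education Fund, 18.2 percent Government Services Fund."""
        return 0.818 * allocation, 0.182 * allocation

    alloc = state_allocation(pop_5_24=1_500_000, pop_total=5_000_000,
                             us_pop_5_24=84_000_000, us_pop_total=300_000_000)
    education_fund, government_services_fund = split_allocation(alloc)
    print(f"Total: ${alloc:,.0f}  Education Fund: ${education_fund:,.0f}  "
          f"Government Services Fund: ${government_services_fund:,.0f}")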


To receive these funds, Governors were required to assure that their State would maintain
funding for public elementary, secondary, and postsecondary education at least at the FY 2006
levels in each of FYs 2009-2011.2 Governors were also required to assure that they would take
actions to: (a) increase teacher effectiveness and address inequities in the distribution of highly
qualified teachers; (b) establish and use pre-K-through-college and career data systems to track
progress and foster continuous improvement; (c) make progress toward rigorous college- and
career-ready standards and high-quality assessments; and (d) support targeted, intensive support
and effective interventions to turn around schools identified for corrective action and
restructuring.

The Department chose to award funds in two phases as a way of holding States accountable for
their early decisions. To receive its initial SFSF allocation (Phase I), each State was required to
submit to the Department an application that provided (1) the assurances described above;
(2) baseline data that demonstrated the State's current status in each of the four education reform
areas (or agree to accept baseline data already compiled by the Department); and
(3) a description of how the State intended to use its SFSF allocation. The Department stated
that it would provide a State with at least 67 percent of its SFSF allocation within two weeks of
receipt of an approvable application. Each State receives the remaining portion of its SFSF
allocation (Phase II) after the Department approves the State's plan detailing its strategies for
addressing the education reform objectives described in the assurances. This plan must also
describe how the State is implementing the recordkeeping and reporting requirements under
ARRA and how SFSF and other funding will be used in a fiscally prudent way (for example,
avoiding the “funding cliff” beyond FY 2011) that substantially improves teaching and learning.

                                                FINDING

The Department Should Strengthen Its Efforts to Ensure That Key Application Data
Are Reasonably Supported Prior to Grant Award

We determined that the Department’s SFSF Phase I application review process was generally
reasonable and effective with regard to ensuring that all required information was provided.
However, although the Department’s process indicates that reviewers verified that all required
data and related information were provided, it does not provide assurance that steps were taken
to assess whether the data was reasonably supported, especially with regard to reported levels of
State support for elementary and secondary education and public institutions of higher education
(IHEs). Such a procedure would have been particularly valuable in helping the Department
confirm the legitimacy of FY 2006 MOE data, which serves as the baseline for State public
education funding.

The ARRA contains MOE requirements for the SFSF program that apply to State support for
elementary and secondary education and IHEs, stating that, “In each of fiscal years (FYs) 2009,
2010, and 2011, the State will maintain State support… at least at the level of such support in FY
2006.” ARRA also authorizes the Secretary of Education (Secretary) to waive or modify these
requirements should a State meet a specific statutory criterion.3 As part of its Application for
Initial Funding Under the State Fiscal Stabilization Fund Program, each State was required to
(1) assure that it would comply with the SFSF MOE requirements; (2) submit an additional MOE
waiver assurance if it anticipated being unable to meet the MOE requirements for one or more of
the relevant fiscal years; and (3) provide baseline MOE data for FYs 2006, 2009, 2010, and 2011
(to the extent that data was currently available). States were required to identify and describe the
data sources used in determining the levels of State support for elementary and secondary
education and public IHEs. States were given some flexibility in determining the levels of State
support for MOE purposes, provided that they did so in a manner consistent with their governing
statutes and regulations concerning primary education funding formulae.

2 As noted previously, ARRA provides the Secretary the authority to waive the SFSF MOE requirement under
certain conditions.
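
For illustration only (this does not describe the Department's actual review procedure), the sketch below expresses the MOE test and the statutory waiver criterion quoted in footnote 3 in simple terms; all dollar amounts and shares shown are hypothetical placeholders.

    # Illustrative sketch only; not the Department's review procedure.
    # All dollar amounts and percentages are hypothetical placeholders.

    FY2006_BASELINE = 6_200_000_000  # reported FY 2006 level of State support (the MOE baseline)

    def meets_moe(reported_support):
        """MOE test: FY 2009, 2010, and 2011 support must be at least the FY 2006 level."""
        return reported_support >= FY2006_BASELINE

    def eligible_for_waiver(education_share_current, education_share_prior):
        """Waiver criterion (footnote 3): the State devotes no smaller a percentage of its total
        available revenues to public education than it did in the preceding fiscal year."""
        return education_share_current >= education_share_prior

    for fy, support in {2009: 6_350_000_000, 2010: 6_100_000_000, 2011: 6_250_000_000}.items():
        status = "meets MOE" if meets_moe(support) else "below the FY 2006 baseline"
        print(fy, status)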

Program staff reviewed the applications using a standardized application screening checklist.4
This review involved (a) ensuring that all requested information was submitted and that
assurances were confirmed and signed (and/or additional or substitute data provided),
(b) ensuring that budget data was supported (if available online and capable of interpretation)
and verifying shortfall/restoration calculations (using a worksheet developed by program staff),
and (c) noting any issues of concern for discussion with the State in followup conference calls.
All applications underwent four levels of review – two program staff, the group leader and the
program director.

Unlike the required baseline data for education reform assurances, the Department did not have a
readily available data source that could be used to corroborate MOE information. Staff stated
that in instances where data sources noted on the applications were easily accessible (for
example, State appropriations bills or other relevant budget documents posted online), an effort
was made to confirm that reported numbers matched to source documentation. In instances
where they were not, or if questions remained, the Department requested clarification and/or
additional information during the followup conference calls noted above. Officials stated that a
concerted effort was made to ensure that data were accurate. However, what exactly this
entailed – particularly as it relates to the verification of support for reported MOE data – is
largely undocumented.

To determine whether States did, in fact, submit data that were reasonably supported, we
performed our own limited analysis. This involved online searches based on the FY 2006
elementary and secondary education MOE data sources cited in State applications.5 We found
corroborating evidence for the reported levels of such State support for 13 of the 16 (81 percent)
States in our sample. Three States appeared to have insufficient or questionable supporting data.
For two States, we found larger amounts noted than what was reported as State support in their
applications. We were unable to identify evidence of support levels for one State/territory.

3 The MOE waiver criterion is as follows: “A State is eligible for a waiver of the elementary and secondary
education MOE requirement or the higher education MOE requirement for a given fiscal year if the Secretary
determines that the State will not provide for elementary, secondary, and public higher education, for the fiscal year
under consideration, a smaller percentage of the total revenues available to the State than the percentage provided
for such purpose in the preceding fiscal year.” We determined that no such waivers had been granted as of the time
of this audit.
4 An identical checklist was used if a State amended its application due to (1) changes to the reported levels of State
support that were used to determine MOE or to calculate restoration amounts or (2) significant or relevant changes
to other key features of its application.
5 The decision was made to focus on only FY 2006 State levels of support for elementary and secondary education
because these numbers serve as the baselines to which all subsequent data are compared for MOE purposes. We
also noted that the SFSF program is focused primarily on improvements and reforms in the K-12 arena and that the
amount of funding involved for public IHEs is, in general, significantly lower.

Office of Management and Budget (OMB) Circular A-123 states

         Management has a fundamental responsibility to develop and maintain effective internal
         control. The proper stewardship of Federal resources is an essential responsibility of
         agency managers and staff. Federal employees must ensure that Federal programs
         operate and Federal resources are used efficiently and effectively to achieve desired
         objectives. Programs must operate and resources must be used consistent with agency
         missions, in compliance with laws and regulations, and with minimal potential for waste,
         fraud, and mismanagement.

         Management is responsible for developing and maintaining effective internal control.
         Effective internal control provides assurance that significant weaknesses in the design or
         operation of internal control, that could adversely affect the agency’s ability to meet its
         objectives, would be prevented or detected in a timely manner.

Government Accountability Office (GAO) Standards for Internal Control states

         Internal control and all transactions and other significant events need to be clearly
         documented, and the documentation should be readily available for examination. … All
         documentation and records should be properly managed and maintained.

There was no policy in effect specific to the formula grant process during the period of our
review,6 but the Department has long relied on its Handbook for the Discretionary Grant
Process, most recently updated in January 2009. Section 4.3.1 discusses general guidelines in
evaluating the budget of a [discretionary] grant application. Among these are that program staff
should determine whether costs are adequately documented and justified, and treated consistently
with costs used for the same purpose in similar circumstances.

We determined that Department staff did not assess whether reported MOE data was reasonably
supported prior to awarding SFSF Phase I funds because of both established time constraints and
insufficient Departmental guidance, which in turn led to disparities in the information provided
by States and decreased the ability of staff to confirm support in a timely manner. The
Department, responding to a request from the Secretary and also taking into account the ARRA
goal of helping to stabilize State and local government budgets in order to minimize and avoid
reductions in education and other essential public services, put an emphasis on awarding funds
within two weeks of receipt of an approvable Application for Initial Funding Under the SFSF
Program. Thus, SFSF staff, pulled largely from other programs within OESE’s Academic

6
  The Department issued a directive entitled Guide for Managing State Administered Programs in February 2010.
This guidance was not in effect during our audit fieldwork, but does include a section detailing steps to be followed
in awarding formula grants. Among these are that the program manager must establish review procedures for State
plans and ensure that grant files contain documentation that assigned program staff reviewed State plans and
conducted budget analyses.
Final Audit Report                                                                    Page 8 of 15
ED-OIG/A19J0001

Improvement and Teacher Quality (AITQ) group, may not have had sufficient time to conduct
the necessary research on State levels of support for elementary and secondary education and
public IHEs. As a result, too heavy a reliance may have been placed on assurances and other
statements from the Governors and Chief State School Officers, with too little focus on
independent confirmation of support for the reported funding levels.

We further noted that States appear to have interpreted differently the application’s requirement
to “identify and describe the data sources used in determining the levels of State support…” –
and, for its part, the Department appears to have accepted submissions of varying sufficiency.
Although some States provided detailed information on their primary funding formulae and
clearly identified the document(s) from which the MOE data were determined, others offered
broader, sometimes vague, descriptions, such that any attempt at confirming support would
require both considerable knowledge of the intricacies of individual States’ education financing
methods and a substantial amount of reviewer time. Had the Department been more prescriptive
in terms of what would qualify as adequate support for reported data, it is likely that SFSF
program staff would have been able to more easily assess the legitimacy of such data and
document their efforts.

Confirming that reasonable support exists for key data reported in grant applications is necessary
to ensure that States are submitting the appropriate information and thereby committing to
providing the appropriate levels of funding for public education. Documenting any actions taken
to corroborate reported data and source data is a critical step as it provides evidence of due
diligence on the part of the Department, lends credibility to its decisions, and provides a
definitive record of events should questions or conflict arise between the Department and
grantee. It also increases the level of transparency in government operations, a key ARRA goal.

During the exit conference, Department officials stated that they believed their initial review was
adequate and reasonable, adding that the confirmation of support for, and subsequent verification
of, MOE data is an ongoing process that will be fulfilled through program monitoring and future
audits. Officials also stated that the Governors made assurances regarding the accuracy of MOE
data under penalty of law, which they feel added a level of accountability to the process. The
transparency achieved by posting State applications and MOE data online was described as
another control over verification.

Department officials also expressed concerns during the exit conference about the potential
burden on States with regard to the submission of supporting documentation. It was noted that a
requirement that applicants submit detailed budget data might be disapproved by OMB, on the
basis that such a request would be contrary to the intent of the Paperwork Reduction Act.
Officials also reiterated their understanding that this was a situation that required expeditious
action on the part of the Federal government and that an appropriate balance must always be
sought with regard to internal control. In this case, the need for application review and data
verification – and degree to which this occurred – was weighed against the need to distribute
funds as quickly as possible in order to help create and/or save jobs.

The Office of Inspector General (OIG) previously noted potential issues with MOE in a
September 2009 alert memorandum issued to Department officials. At the time, OESE noted
that OIG’s recommendation that a process be established and implemented to ensure that States
have met the MOE requirements and assurances prior to awarding additional SFSF funding was
reasonable. However, the Department has not yet proposed a corrective action plan with regard
to this matter and, based on our discussion, it is unclear what additional steps may have been
followed when awarding SFSF Phase II funds.

While we agree that an appropriate balance must be sought between controls and risk, we
maintain that, whenever possible, steps should be taken to ensure that States have provided
supporting data for key information contained in grant applications prior to awarding any funds,
not after the fact when the funds have already likely been expended. This is especially important
given the size of the SFSF grants and the requirement that States fund public education –
elementary, secondary, and postsecondary – at least at the FY 2006 levels in each of FYs 2009,
2010, and 2011. In addition, we note that States were already required to maintain adequate
documentation substantiating the levels of State support used in making MOE calculations per
the Department’s SFSF MOE guidance. As a result, there should have been no additional burden
attributed to a requirement to submit supporting MOE documentation with the applications.

Recommendations:

We recommend that the Assistant Secretary for OESE ensure:

1.1	   Staff have adequate time and resources to effectively confirm that reasonable support
       exists for key data in applicant submissions for applicable Department grant programs
       prior to the awarding of funds, and that any such efforts are appropriately documented in
       the official grant files.

1.2	   Applications for Department grant programs require the submission of supporting
       documentation for key data fields and also provide applicants with an explanation of
       what would be considered adequate documentation, to ensure accountability and assist
       program staff with their review of applicable information.

1.3	   Supporting documentation is requested and reviewed during planned SFSF on-site
       grantee monitoring or desk reviews and, where applicable, adjustments are made and/or
       funds are requested to be returned.

1.4	   A corrective action plan is prepared in response to recommendations made in the OIG
       alert memorandum noted in this report concerning SFSF MOE requirements.

Department Comments

OESE stated that it did not concur with the draft finding or the recommendations. OESE
believed the documentation submitted by each State was appropriate and sufficient to address
initially the statutory MOE requirement and the SFSF Phase I application review process
provided sufficient safeguards to justify the awarding of State SFSF awards. Specifically, OESE
stated that every State provided the required MOE data, Department staff discussed and inquired
about the data, and more information was requested from State officials if necessary. OESE
further expressed its belief that given the extraordinary circumstances surrounding the program,
it would have been impractical to have staff assess, in greater detail, the validity of MOE data
prior to making SFSF Phase I awards. OESE believed the level of verification done prior to
making SFSF awards exceeded the level of verification that the Department normally does prior
to making formula grant awards and referenced the online posting of applications as evidence of
its commitment to ensuring accountability and transparency.

OESE noted that there was no requirement in statute, regulation, or policy that program staff
verify MOE data prior to making a grant award, and noted that during the monitoring process
States must provide documentation substantiating the level of support for MOE purposes.
Lastly, OESE asserted its belief that the Department had fully addressed the recommendation
included in the OIG’s September 2009 MOE alert memorandum, citing as evidence a
requirement included in its Phase II application that each State submit updated MOE data, as
appropriate, along with an assurance from the Governor that the updated data is “true and
correct.”

OIG Response

We recognize the level of effort put forth by OESE in its initial implementation of the SFSF
program. However, as previously noted in this report, we concluded that 3 of 16 States within
our sample (19 percent) appeared to have insufficient or questionable supporting MOE data.
This data is key to ensuring that the appropriate level of funding is committed by each State for
public education, especially given these challenging economic times. While we acknowledge
that there is no specific requirement in statute, regulation, or policy that program staff verify
MOE data prior to making a grant award, we maintain that requiring the submission of
supporting documentation for key data fields would provide a preventive internal control and
greater level of accountability, assist program staff with their review of applicable information,
and allow for more focused monitoring efforts. Reviewing key supporting data after funds are
already awarded may place OESE in a position of having to consider appropriate corrective
actions after the funds have likely already been spent.

We noted that the application for the new Education Jobs Fund Program (Ed Jobs) requires that
SFSF MOE requirements be met in order for States to receive Ed Jobs funding, further
supporting the need for key data to be verified prior to awarding the substantial amounts of
funding that are involved under these programs. The Department included a requirement in the
Race to the Top grant program application for States to provide financial data to support the
percentage of State revenues used to support education, suggesting that similar measures can be
taken for future grant programs.

In response to the Department’s comment regarding the OIG alert memorandum, we noted that
the related recommendations are all still unresolved as of August 30, 2010 according to the
Department’s Audit Accountability and Resolution Tracking System (AARTS). We also noted
there has been no substantive resolution activity documented in AARTS since the date the
memorandum was issued. In addition, requiring another MOE assurance in Phase II applications
would not seem to be an effective corrective action with regard to the recommendation to ensure
that States have actually met MOE requirements prior to awarding additional funding.

                                      OTHER MATTERS

Grant File Maintenance

During the course of our audit, we located official correspondence that was not maintained in the
official grant files. This included communications between Governors, State legislators and/or
education officials, Congressional representatives, and the Department regarding aspects of the
SFSF program and issues pertaining to their respective States. In one case, a Governor wrote the
Secretary seeking assistance in helping ensure that the public and State legislature fully
understood the consequences of reducing State funding for public education. In another case,
members of a State’s congressional delegation requested more detailed guidance regarding what
qualified as primary elementary and secondary education formulae. During the exit conference,
Department officials stated that any correspondence not maintained in the official grant files was
likely determined as having no impact on the grants in question, and that only relevant
correspondence was included in the files. All other correspondence would be maintained by the
Executive Secretariat in the Office of the Secretary.

As previously noted, there was no policy in effect specific to the formula grant process during
the period of our review, but the Department has long relied on its Handbook for the
Discretionary Grant Process. Section 4.10 addresses documentation requirements for the
official grant file folder. The file must hold the (1) original application and reviewer’s
comments; (2) required forms; (3) grant award notifications; (4) Annual Grant Performance
Reports; (5) correspondence; (6) decisions; and (7) any other documentation relevant to the grant
throughout its life cycle.

Maintaining complete and adequate documentation of correspondence with States in the official
grant file is necessary to ensure that all relevant matters are considered. It also lessens the
chance of any miscommunication and provides a definitive record of events should conflict arise
between the Department and grantee. We suggest that the Department consider amending its
recently issued Guide for Managing State-Administered Programs to include information on the
types of correspondence that must be included in the official formula grant file and procedures
detailing any coordination between the various offices that handle such matters.

Program Staffing

Department officials stated that there are approximately 12 staff actively involved in SFSF
program administration, with others – including program attorneys in the Office of the General
Counsel – providing assistance as needed.7 We noted, however, that OESE staff are devoting a
significant amount of time to ARRA work – a situation that could impact the Department’s
ability to effectively manage existing programs if not carefully monitored.8 We reviewed
timesheets for 10 of the 12 individuals, all of whom work on other grant programs within OESE-
AITQ.9 We determined that during the September-October 2009 timeframe, staff spent on
average 77 percent of their time on ARRA work. This included work on the RTT program,
SFSF program, and Teacher Incentive Fund. The SFSF program has proven to be the most time-
consuming, accounting for 47 percent of the average workweek.

Although officials expressed confidence that the current Department staff, plus staff from a to-
be-determined contractor, will be sufficient going forward, we suggest that consideration be
given to evaluating staffing levels on a formal, recurring basis. Such a task may help ensure that
all programs – whether longstanding, recently implemented, or newly authorized – are properly
resourced and executed.

Program Monitoring

In its September 2009 bimonthly ARRA report, GAO noted that the Department had yet to
finalize SFSF monitoring plans and processes, but anticipated doing so in the near future. In the
interim, officials informed GAO that they were taking several steps to monitor information
received from States and focusing on technical assistance. Efforts included: (1) providing
written guidance and conducting webinars designed to improve State education agencies’ (SEA)
and LEAs’ awareness of the appropriate uses of SFSF funds and related subrecipient monitoring
and reporting requirements;10 (2) monitoring drawdowns and following up with States on any
reports about questionable uses of SFSF funds; and (3) reviewing information reported by States
on SFSF funds in their required quarterly ARRA reports. Department officials described these
same activities when we met, adding only that they are in constant communication with the
States and review all OIG and GAO audit reports and alert memoranda to identify potential
weaknesses in program implementation and administration on both the Federal and State levels.

7 During the exit conference, Department officials stated that three additional staff had been hired to work on both
the SFSF and RTT programs, with three more staff expected to be hired in the near future.
8 Similar concerns were noted in a March 2010 report entitled Review of Contracts and Grants Workforce Staffing
and Qualifications in Agencies Overseeing Recovery Act Funds, compiled by the Department of Commerce OIG
and issued on behalf of the Recovery Accountability and Transparency Board. The report presented the results of a
survey of subagencies throughout 26 Federal agencies that are responsible for awarding and administering
ARRA-funded contracts and grants.
9 OESE-AITQ staff are responsible for administering almost 30 grant programs, including: Reading First and Early
Reading First, Improving Teacher Quality State Formula Grants, Mathematics and Science Partnerships, the Teacher
Incentive Fund, and 21st Century Community Learning Centers.
10 As part of this effort, the Department posted an advisory notice to States on its SFSF website and listserv in late
August 2009. States were reminded of their responsibility to monitor subrecipients under the SFSF program to
ensure compliance with all applicable Federal requirements. The Department further stated that to comply with
these requirements, each State must have a comprehensive monitoring plan and protocol to review grant- and
subgrant-supported activities.

In late December 2009, OESE posted a Request for Information (RFI) on a draft Statement of
Work (SOW) on the Federal Business Opportunities website. The draft SOW discussed OESE’s
plans to enter into a contract for the monitoring of States’ and subrecipients’ implementation of
the SFSF and RTT programs. In February 2010, officials stated that they had received
comments from 16 vendors in response to the RFI and planned to put out the final SOW for bid
in late spring 2010, with the expectation of an award by June 2010. OESE officials recently
stated that the contract was actually awarded in September 2010.

Also in February 2010, OESE provided the audit team a copy of its SFSF program monitoring
plan, as well as monitoring protocols for SEAs, LEAs, public IHEs, and the Government
Services Fund. OESE also provided a preliminary State monitoring schedule.11 We found that
the Department plans to use OESE/AITQ staff and added monitoring consultants, as needed, to
conduct desk reviews and on-site visits with States annually.12 States will be required to submit
documentation regarding the: (1) allocation and uses of SFSF funds, (2) fiscal oversight
procedures, (3) MOE, (4) progress in the four ARRA reform areas,13 (5) subrecipient monitoring,
and (6) reporting. As part of the reviews, OESE will interview appropriate staff of LEAs, public
IHEs, and other entities receiving SFSF funds.

Our review of the SFSF program monitoring plan found it to be adequate, if implemented as
described. However, as noted, OESE will rely heavily on contractor support to ensure that States
and their subrecipients comply with applicable Federal requirements and are meeting program
goals. Reliance on contractor support will require effective contract monitoring practices to
reduce related performance risk.

Guidance for Future Grant Programs

Lastly, and with regard to all of the above, if history is any indicator, the Department will likely
be tasked with implementing additional grant programs in the future in a similarly expedited
timeframe. We have reviewed the implementation and early administration of a number of new
grant programs over the past five years and although the results have been generally positive, we
suggest that the Department consider establishing some sort of protocol for this process to ensure
the trend continues – a “how-to” guide for program officials that touches on all of the key steps
and phases of program implementation. The development of a document like this would be
especially timely given the Partnership for Public Service’s May 2008 projections on retirement
in the Federal government, which estimate that the Department will shed 22 percent of its
workforce by 2012.14 It may also expose the Department to best practices at other agencies and
yield improvements further down the line.




11 All of these documents were also made available to the public on the Department’s SFSF website.
12 On-site visits will be scheduled with half of the States in a given year; the other half will have a desk review
performed.
13 These are: (1) achieving equity in the distribution of qualified teachers, (2) improving collection and use of data,
(3) enhancing the quality of standards and assessment, and (4) supporting struggling schools.
14 The issue brief notes that all retirement projections are from the Office of Personnel Management, based on
permanent full-time employees on board as of October 1, 2006.

                                     SCOPE AND METHODOLOGY

To accomplish our objectives, we performed a review of internal control applicable to the
Department’s implementation and administration of the SFSF program. We reviewed the
authorizing legislation; applicable sections of the Elementary and Secondary Education Act of
1965, as amended, Individuals with Disabilities Education Act, and America COMPETES
[Creating Opportunities to Meaningfully Promote Excellence in Technology, Education, and
Science] Act; and relevant OMB memoranda. We reviewed Department guidance,
communications, and other materials provided by officials or posted on the Department’s ARRA
website and conducted interviews with Department officials to obtain information on the SFSF
program. We also reviewed GAO’s Standards for Internal Control in the Federal Government
and a series of ongoing GAO reports on Federal agencies’ implementation of ARRA programs.

To achieve our first objective, we reviewed the Department’s SFSF program allocation
methodology for reasonableness and compliance with the relevant section of the legislation. We
also verified individual State and outlying area allocation calculations using available U.S.
Census Bureau population data.

To achieve our second objective, we reviewed the Department’s Application for Initial Funding
Under the SFSF Program. This was done to determine what assurances and other submission
requirements were necessary in order for States to receive Phase I funds. We also discussed with
Department officials the application submission, review, and approval processes. Next, we
reviewed a sample of State applications and the associated application screening checklists to
determine whether they included all required information and were appropriately reviewed. Our
overall focus with regard to the SFSF Phase I application review process was on the
reasonableness and appropriateness of the Department’s process and completeness and adequacy
of documentation maintained.

To achieve our third objective, we discussed with Department officials their plans for staffing
and monitoring recipients of funds under the SFSF program and reviewed related documents.
Our work regarding staffing considerations included a review of staff reporting assignments and
analysis of timesheets for three pay periods for a number of OESE employees involved with the
SFSF program. We also reviewed a “Contract and Grant Staffing and Qualification Survey”
completed by OESE in response to a request from the Recovery Accountability and
Transparency Board. Our work regarding monitoring efforts included discussions with both
OESE and the Department’s Risk Management Service (RMS), reviews of the Department’s
ARRA State risk assessment framework and overall agency risk assessment and mitigation plan,
and observation of an SFSF grantee database maintained by OESE. We also reviewed the first
quarterly recipient report review checklists for States in our sample, a draft SOW for SFSF
program monitoring, and the SFSF monitoring plan and protocols.15




15 States/territories included in our sample were: California, the District of Columbia, Florida, Georgia, Hawaii,
Illinois, Indiana, Louisiana, Maine, Michigan, New York, Pennsylvania, Puerto Rico, South Carolina, Tennessee,
and Texas.

We employed judgmental sampling to determine which State applications to include in our
review. To maintain organization-wide consistency, we selected the nine States being reviewed
by OIG regional auditors at the time of our audit fieldwork. We also included any States
designated by RMS as either high-risk or at-risk, but not already under review. We further
decided to include at least one State from each of the following categories (if not already
included in the sample): (1) States that received greater than 67 percent of their total SFSF
allocation in Phase I; (2) States that requested and may have been granted waivers from the SFSF
MOE requirement; and (3) States that proposed the use of an alternative initial baseline data
source for any of the four education reform assurances. Finally, we selected for review the two
outlying areas receiving the highest dollar allocations. In total, our initial sample included 18 out
of the 56 (32 percent) States and outlying areas that were eligible to receive funding under the
SFSF program. However, we subsequently learned that applications for the outlying areas would
likely not be approved in such time as to allow for review by the audit team. As a result, our
final sample included 16 out of the 56 (29 percent) applicants.

We relied on computer-processed data obtained from the U.S. Census Bureau website and the
Department’s Grant Administration and Payments System (GAPS). The U.S. Census Bureau is
generally recognized as the authoritative and optimal source for population data; as such, we
determined an assessment of data reliability was unnecessary. However, we did perform a
limited data reliability test on the other two sources of data. This involved validating obligation
amounts in GAPS against the projected State and outlying area allocations determined by
program officials and Budget Services to ensure that they were consistent. We subsequently
determined that although there were minor differences between the two sources of data, none
were significant enough to have a material impact on any findings related to the Department’s
implementation of the SFSF program. Based on our analysis, we concluded that the computer-
processed data were sufficiently reliable for the purposes of our audit.
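
The limited data reliability test described above is essentially a reconciliation of two data sources. A minimal sketch of such a comparison follows; the State names, amounts, and materiality threshold are hypothetical placeholders, not figures from our audit.

    # Minimal sketch of the data reliability test described above: compare GAPS
    # obligation amounts with projected allocations and flag material differences.
    # States, amounts, and the 0.1 percent threshold are hypothetical placeholders.

    THRESHOLD = 0.001

    gaps_obligations = {"State A": 4_875_000_000, "State B": 1_237_500_000}
    projected_allocations = {"State A": 4_875_000_000, "State B": 1_237_000_000}

    for state, obligated in gaps_obligations.items():
        projected = projected_allocations[state]
        difference = abs(obligated - projected) / projected
        status = "material difference" if difference > THRESHOLD else "consistent"
        print(f"{state}: {status} ({difference:.3%})")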

The scope of our review was limited to the Department’s implementation and administration of
the SFSF program in FY 2009 and early FY 2010. We conducted fieldwork at Department
offices in Washington, D.C., during the period June 2009 through February 2010. We provided
our audit results to Department officials during an exit conference held on April 28, 2010.

Our audit was performed in accordance with generally accepted government auditing standards
appropriate to the scope of the review. The standards require that we plan and perform the audit
to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and
conclusions based on our audit objectives. We believe that the evidence obtained provides a
reasonable basis for our findings and conclusions based on the audit objectives.
               Anyone knowing of fraud, waste, or abuse involving
                U.S. Department of Education funds or programs
            should call, write, or e-mail the Office of Inspector General.

                                    Call toll-free:
                             The Inspector General Hotline
                          1-800-MISUSED (1-800-647-8733)

                                       Or write:
                               Inspector General Hotline
                             U.S. Department of Education
                              Office of Inspector General
                               400 Maryland Ave. S.W.
                                Washington, DC 20202

                                       Or e-mail:
                                   oig.hotline@ed.gov

             Your report may be made anonymously or in confidence.

For information on identity theft prevention for students and schools, visit the Office of 

                     Inspector General Identity Theft Web site at:

                                  www.ed.gov/misused



           The Department of Education’s mission is to promote 

     student achievement and preparation for global competitiveness 

      by fostering educational excellence and ensuring equal access. 


                                     www.ed.gov
                                                                   Attachment 1
                      Acronyms/Abbreviations Used in this Report

AARTS        Audit Accountability and Resolution Tracking System

AITQ         Academic Improvement and Teacher Quality

ARRA         American Recovery and Reinvestment Act of 2009

CAP          Corrective Action Plan

Department   U.S. Department of Education

Ed Jobs      Education Jobs Fund Program

FY           Fiscal Year

GAO          Government Accountability Office

GAPS         Grant Administration and Payments System

IHE          Institution of Higher Education

LEA          Local Educational Agency

MOE          Maintenance-of-Effort

OESE         Office of Elementary and Secondary Education

OIG          Office of Inspector General

OMB          Office of Management and Budget

RFI          Request for Information

RMS          Risk Management Service

RTT          Race to the Top

SEA          State Education Agency

Secretary    U.S. Secretary of Education

SFSF         State Fiscal Stabilization Fund

SOW          Statement of Work
                                                                                                                Attachment 2
                                 UNITED STATES DEPARTMENT OF EDUCATION
                                  OFFICE OF ELEMENTARY AND SECONDARY EDUCATION


MEMORANDUM

DATE:               July 27, 2010

TO:                 Keith West
                    Assistant Inspector General for Audit

FROM:               Thelma Meléndez de Santa Ana, Ph.D.
                    Assistant Secretary

SUBJECT:            The Department's Implementation of the State Fiscal Stabilization Fund Program
                    Control Number ED-OIG/A19J0001




This memorandum provides our initial response on the findings and recommendations identified
in the Office of Inspector General's Draft Audit Report ED-OIG/A19J0001 entitled The
Department's Implementation of the State Fiscal Stabilization Fund Program. We appreciate
the work that went into this audit, and appreciate the opportunity to comment.


Draft Finding - The Department Should Strengthen Its Efforts to Ensure That Key
              Application Data Are Reasonably Supported Prior to Grant Award


We do not concur with the draft finding or the recommendations. We believe that
documentation that each State provided in its Application for Initial Funding was appropriate
and sufficient to address initially the statutory maintenance-of-effort (MOE) requirement, and
that the Department's process for reviewing the applications and its oversight provided sufficient
safeguards to justify the awarding of State Fiscal Stabilization Fund (SFSF) awards.


In its initial application, each State was required to submit to the Department the following:
(1) data on its levels of support for elementary and secondary education and public institutions of
higher education; (2) identification of the source documentation supporting the reported levels of
support; and (3) assurances from the Governor that those data were true and correct and that the
State would either meet the MOE requirement or the criterion for an MOE waiver. To help
ensure transparency, the Department made the initial and approved State applications available
on its website.


The program office's review included the use of a standardized application screening form. As
part of the review, program staff ensured that the State submitted all required information and
checked the source documentation provided or identified in the application. During the review
process, program staff held at least one conference call with each State to discuss in detail its
application. Every State provided the required MOE data and assurances as part of its
application, and Department staff discussed and inquired about this data, and requested more
information from State officials, if necessary. The Department instructed States to amend the
initial application if its levels of State support changed in any of the relevant fiscal years.


To further ensure accountability and transparency, the Department required each State to submit
updated MOE data, as appropriate, in its SFSF Phase II application. Furthermore, the
Department required each Governor to once again assure that the State would either meet the
MOE requirement or the criterion for an MOE waiver. As part of the Phase II review process,
program staff carefully and thoroughly reviewed the revised MOE data and source
documentation, and once again, asked for further information if necessary.


As part of its monitoring of the SFSF program, the Department is further verifying the MOE data
provided by States. During the monitoring process a State must provide documentation
substantiating its level of support for MOE purposes. Further, the Department intends to contract
for additional assistance in reviewing and verifying the data. We will also utilize audits to
further review and monitor the reliability and accuracy of this data.


The level of verification done prior to making SFSF awards exceeds the level of verification that
the Department normally does prior to making a formula grant award. It is not practical to have
program staff assess the validity of MOE data in greater detail than was done during the SFSF
application review prior to making grant awards. This is especially true for the SFSF program
under which funds needed to be awarded quickly to avert layoffs and maintain essential
government services. Given these facts, we do not believe that the recommendations included in
the report are appropriate.


We note that there is no requirement in statute, regulation, or policy that program staff verify
MOE data prior to making a grant award. To our knowledge, no program administered by the
Department has verified MOE data prior to making an award, and the Department's OIG has
never previously issued a finding because MOE data was not verified prior to making an award.
It would be unusual if the OIG for the first time questioned the efforts of a program office for
verifying the appropriateness of MOE data under a program in which funds needed to be
awarded on an expeditious basis to create and retain jobs and maintain essential services in
challenging economic times.


In addition, we believe that the Department has fully addressed the recommendation included in
the alert memorandum that it establish and implement a process to ensure that States have met
the MOE requirement and assurances prior to awarding additional SFSF funding. As described
above, we required each State to submit, as part of its SFSF Phase II application, updated MOE
data (as appropriate) and an assurance from the Governor that the updated data is true and
correct.


We once again appreciate the opportunity to submit these comments on the draft audit report.
Please let us know if you have questions or want additional information about our comments.



