
The Effectiveness of the Department's Data Quality Review Processes

Published by the Department of Education, Office of Inspector General on 2011-08-22.


U.S. Department of Education
Office of Inspector General

American Recovery and Reinvestment Act:
The Effectiveness of the Department’s Data Quality Review Processes

Final Audit Report

ED-OIG/A19K0010                                                      August 2011

                     UNITED STATES DEPARTMENT OF EDUCATION
                           OFFICE OF INSPECTOR GENERAL
                                  AUDIT SERVICES

                                  August 22, 2011
Tony Miller
Deputy Secretary
U.S. Department of Education
400 Maryland Avenue, SW
Washington, DC 20202

Dear Mr. Miller:

This final audit report presents the results of our review of the effectiveness of the
U.S. Department of Education’s data quality review processes. We received the Office of the
Deputy Secretary’s comments on the contents of our draft report. The comments are
summarized within the Results section of this report.

Corrective actions proposed (resolution phase) and implemented (closure phase) by your office
will be monitored and tracked through the Department’s Audit Accountability and Resolution
Tracking System (AARTS). Department policy requires that you develop a final corrective
action plan (CAP) for our review in the automated system within 30 days of the issuance of this
report. The CAP should set forth the specific action items, and targeted completion dates,
necessary to implement final corrective actions on the findings and recommendations contained
in this final audit report.


In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector
General is required to report to Congress twice a year on the audits that remain unresolved after
6 months from the date of issuance.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the
Office of Inspector General are available to members of the press and general public to the
extent information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given to us during this review. If you have any questions, please
call Michele Weaver-Dugan at (202) 245-6941.


Sincerely, 



Keith West /s/ 

Assistant Inspector General for Audit 






        The Effectiveness of the Department’s Data Quality Review Processes
                               Control Number ED-OIG/A19K0010

                                            PURPOSE
The American Recovery and Reinvestment Act of 2009 (Recovery Act) places a heavy emphasis
on accountability and transparency, including reporting requirements related to the awarding and
use of funds. Challenges associated with the reporting requirements include ensuring that
recipients of Recovery Act funds meet the reporting obligations, assessing the quality of the
reported information, and using the collected information effectively to monitor and oversee
Recovery Act programs and performance. As of the period ended March 31, 2010, the
U.S. Department of Education (Department) had made 1,688 Recovery Act related grant awards
totaling more than $63 billion. These included awards made to augment existing programs such
as Federal Work Study (FWS) and to newly established Recovery Act related programs such as
the State Fiscal Stabilization Fund. In addition, the Department awarded 28 Recovery Act
related contracts, totaling $60.9 million.

This final report presents the results of our audit of the effectiveness of the Department’s
processes to ensure the accuracy and completeness of recipient-reported data.

                                       RESULTS IN BRIEF
We found that the Department’s processes to ensure the accuracy and completeness of recipient-
reported data were generally effective. The Department established an internal control structure
that included formal policies and procedures specifying Department-wide responsibilities, to
include those of Principal Offices (PO), in performing data quality reviews. These procedures
included automated data checks to validate selected recipient-reported data elements against data
in the Department’s financial system and manual reviews of reported data against specific grant
program or contract criteria to identify outliers in certain data elements. The Department also
formulated and distributed reporting guidance to Recovery Act recipients that specified the
recipients’ responsibilities and reporting requirements.

However, we found instances of recipient-reported data that were inconsistent with data in the
Grants Administration and Payment System (GAPS), contract file documentation, or other data
elements within the recipient reports. These anomalies still existed after the Department had
completed its formal data quality review processes and after the related recipient correction
period. Overall, we identified 2,043 anomalies (4 percent) out of the 49,150 data quality tests we
performed for grant awards and 1 anomaly (1 percent) out of the 110 tests we performed for
contract awards. We also noted that the Department had not established a formal process to
identify and remediate instances in which Recovery Act recipients demonstrated systemic or
chronic reporting problems and/or otherwise failed to correct such problems. Recipient reports
are subject to public scrutiny and are intended in part to help drive accountability for the
spending of Recovery Act dollars. As such, agencies must have an effective review process to
ensure that recipient reports contain accurate and complete data. Incorrect data may lead to
mistaken conclusions about Recovery Act funding and may obscure the transparency that these
reports were designed to provide.
In its response to the draft audit report, the Office of the Deputy Secretary (ODS) agreed with the
recommendations and described the corrective actions already taken or planned. ODS stated it
was encouraged that the Office of Inspector General determined that the Department’s processes
to ensure the accuracy and completeness of recipient reported data were generally effective,
notwithstanding the opportunities for improvement. ODS further stated that it would use the
recommendations to support its ongoing efforts to continuously improve the quality of recipient-
reported data. The full text of ODS’ response is included as Attachment 4 to this report.

                                        BACKGROUND

The Recovery Act was signed into law on February 17, 2009, and had three immediate goals:
(1) create new jobs and save existing ones, (2) spur economic activity and invest in long-term
growth, and (3) foster accountability and transparency in government spending. To ensure
transparency and accountability of Recovery Act spending, recipients are required to submit
quarterly reports on Recovery Act awards, spending, and jobs impact (§ 1512 of the Recovery
Act).

No later than 10 days after the end of each calendar quarter, recipients must submit Recovery
Act data to FederalReporting.gov, the nationwide data collection system, in order to fulfill their
Section 1512 reporting obligations. Recipient reports are required to include various data
elements, such as the type, date, and amount of award; project description and status; the number
of jobs created or retained; and the amount of Recovery Act funds received and spent.
Following submission of the data reports, the relevant Federal agency is required to perform a
limited data quality review that is intended to identify material omissions and/or significant
errors in the recipient-reported data. When an agency identifies a data quality issue, it is required
to notify the applicable recipient of the nature of the problem and the need to make appropriate
and timely changes through FederalReporting.gov. Federal agencies must make the reports
publicly available on the Recovery.gov website no later than 30 days after the end of each
calendar quarter.

In January 2010, the Recovery Accountability and Transparency Board (Recovery Board)
modified the process for correcting data in FederalReporting.gov by establishing a continuous
corrections period. During this period, recipients can correct reported data from the preceding
reporting quarter once that reporting quarter has ended and after the data is published on
FederalReporting.gov. Federal agencies are required, at a minimum, to conduct a final review of
the reported data upon the close of the continuous corrections period emphasizing the
identification of significant report errors, material omissions, and administrative/technical
problems. The continuous corrections period closes 90 days after the end of the reporting period.

ODS is responsible for providing primary oversight of the Department’s Recovery Act policies,
implementation, reviews, and reporting. ODS monitors the progress of the data quality reviews
and provides external reports, as required, on the status of recipient reporting efforts while
identifying and troubleshooting potential obstacles. ODS also leads daily meetings with the
Department’s POs, which are responsible for conducting the limited data quality reviews and
providing advice and programmatic assistance to recipients.
                              FINDING AND RECOMMENDATIONS

FINDING – Department Processes to Ensure the Accuracy and Completeness of Recipient
          Reported Data Were Generally Effective

The Department’s processes to ensure the accuracy and completeness of recipient-reported data
were generally effective. Overall, we found that the Department had established an internal
control structure that included formal policies and procedures specifying Department-wide
responsibilities, to include those of POs, in performing data quality reviews. These procedures
included automated data checks to validate selected recipient-reported data elements against data
in the Department’s financial system. The procedures also included manual reviews of the
recipient-reported data against specific grant program or contract criteria to identify outliers in
certain data elements. In addition, the Department formulated and distributed reporting guidance
to Recovery Act recipients that specified the recipients’ responsibilities and reporting
requirements. Department officials stated that they also conducted outreach efforts to recipients
that included providing technical assistance through telephone and email contact as needed.

However, we found instances of recipient-reported data that were inconsistent with data in
GAPS, contract file documentation, or other data elements within the recipient reports. We
reviewed and analyzed final recipient-reported data for the period ended March 31, 2010,
associated with 1,688 of the Department’s Recovery Act grant awards, including 745 FWS
awards. Specifically, our analysis sought to identify potential discrepancies and omissions that
still existed after the Department had completed its formal data quality review processes for this
reporting period and after the related recipient correction period. We did not review whether the
recipient-reported data were accurate; rather, our primary focus was to determine whether the
Department’s processes would identify potential problems with recipient-reported data.

To perform our review, we compared the recipient-reported data to GAPS data and to other
logically related data elements within the recipient reports. Of the 49,150 data quality tests we
performed, 47,107 (96 percent) identified no anomalies, which indicates that the Department’s
data quality review processes were generally effective. However, we did identify 2,043
anomalies (4 percent) between the recipient-reported data and GAPS data or other logically
related data elements within the recipient reports. The areas with the highest anomaly rates were
as follows: [1]

•  Award Date: 418 (44 percent) of the 943 non-FWS awards had inconsistencies between
   the reported award dates and the award dates in GAPS. [2]
•  Amount of Recovery Act Funds Received Compared to the Cumulative Amount of
   Drawdowns: 226 (24 percent) of the 943 non-FWS awards had variances between the
   reported amount received and the cumulative amount of drawdowns listed in GAPS as of
   the end of the reporting quarter. Of these, 40 awards (18 percent) had variances of
   $500,000 or more.

•  Congressional District Identifier: 267 (16 percent) of 1,688 awards had inconsistencies
   between the reported district numbers and the district numbers reported in GAPS.
•  Amount of Recovery Act Funds Expended Greater Than Amount of Recovery Act Funds
   Received: 253 (15 percent) of 1,688 awards had an amount reported as expended that
   exceeded the amount of Recovery Act funds received. Of these, 41 awards (16 percent)
   had an amount reported as expended that exceeded the amount of funds received by
   $1 million or greater.
•  Award Number: 394 (23 percent) of 1,688 awards had inconsistencies between the
   reported award numbers and the award numbers in GAPS.

[1] Attachment 1 contains a listing of data quality tests performed on the grant awards and
corresponding error rates.
[2] We did not perform tests relating to award date and amount of Recovery Act funds received
compared to the cumulative amount of drawdowns for FWS awards. This was because the
Department did not separately identify Recovery Act FWS awards from non-Recovery Act FWS
awards in GAPS.

In addition to the data quality tests, we performed 110 data and logic checks on a random sample
of 5 (18 percent) of the Department’s 28 Recovery Act related contracts as of the
March 31, 2010, reporting period. When selecting our random sample, we noted that 3 contracts
(11 percent) contained duplicate data in the Recovery Act Prime Recipient Report. Overall, we
noted 1 anomaly (1 percent) in the 110 tests performed. [3] This anomaly related to the recipient
using an incorrect award number.

[3] Attachment 2 contains a listing of all data quality tests performed on the sample of contract
awards and the corresponding error rates.

During our review, we also noted that the Department had not established a formal, consistent,
and centralized process to meet Office of Management and Budget (OMB) requirements that it
identify and remediate instances in which Recovery Act recipients demonstrate systemic or
chronic reporting problems and/or otherwise fail to correct such problems. Specifically,
automated reports did not flag anomalies that had also been identified during prior reporting
periods, which would have allowed PO staff to easily identify recipients with recurring reporting
problems. Instead, the
Department relied on POs to perform this function. Officials from the three POs that administer
the largest number of Recovery Act awards stated they were aware of the OMB requirements.
However, processes for identifying these recipients varied among the offices and among
suboffices.

Finally, we reviewed the Department’s Master List for the period ended June 30, 2010, to
determine whether all Recovery Act grants and contracts with recipient reporting requirements
were included on this list. We found that the Master List accurately included all 1,404 grants
subject to Recovery Act reporting requirements, but only 26 (96 percent) of the 27 contracts
subject to those requirements.

OMB Memorandum M-09-21, “Implementing Guidance for the Reports on Use of Funds
Pursuant to the American Recovery and Reinvestment Act of 2009,” dated June 22, 2009,
requires Federal agencies to provide programmatic assistance to recipients. It also states Federal
agencies should “perform limited data quality reviews intended to identify material omissions
and/or significant reporting errors, and notify the recipients of the need to make appropriate and
timely changes.”

OMB Memorandum M-10-08, “Updated Guidance on the American Recovery and Reinvestment
Act – Data Quality, Non-Reporting Recipients, and Reporting of Job Estimates,” dated
December 18, 2009, provides guidance to Federal agencies intended to improve the quality of
data reported and further outlines important steps Federal agencies must take during their data
quality reviews. Specifically, it states that improving data quality requires a focus on possible
data anomalies. Further, it states that in instances where agencies identify such anomalies in
recipient reports, they are to:

       1. Assess the highest priority corrections necessary to reduce the likelihood of
          significant error;
       2. Assess other corrections that would improve recipient data quality; and
       3. Encourage recipients to make corrections that ensure accurate data reporting.

OMB Memorandum M-10-08 also requires Federal agencies to continuously evaluate recipient
efforts to meet Recovery Act recipient reporting requirements as well as the requirements of
OMB implementing guidance. It states Federal agencies will work to identify and remediate
instances in which recipients demonstrate systemic or chronic reporting problems and/or
otherwise fail to correct such problems.

OMB Memorandum M-10-14, “Updated Guidance on the American Recovery and Reinvestment
Act,” dated March 22, 2010, requires Federal agencies to compile a comprehensive list, for each
reporting period, of all awards that have Recovery Act recipient reporting requirements.

Grant Related Anomalies

As part of the Department’s internal procedures for data quality reviews, the Office of the Chief
Information Officer (OCIO) generated two automated reports: the error exception report and
analysis/anomalies report. The error exception report validated selected recipient-reported data
elements against data in GAPS and flagged data that may have been inaccurate. The
analysis/anomalies report identified potential inconsistencies between selected recipient-reported
data elements. Each day during the reporting period, the OCIO provided these two reports to PO
staff to assist them in identifying material omissions, significant reporting errors, and possible
data anomalies. In addition to the OCIO automated reports, PO staff could use data in GAPS,
official grant files, and other recipient-reported data submitted to perform their data quality
reviews. As further described below, we found flaws with the programming source codes that
generated portions of OCIO’s error exception report. As a result, some data discrepancies were
not flagged and some information provided on the report was deemed of no value by PO staff
and not used. We also noted that some staff performing the reviews did not have access to
GAPS and therefore had limited ability to perform reviews of some of the reported data
elements. In some instances, Department-developed reporting guidance allowed recipients to
enter data in certain fields that differed from data in GAPS, hindering reconciliation efforts.
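
For illustration, the logic of such a field-by-field check can be sketched in Python as follows.
This is a sketch only, not the Department’s actual source code, and every function and field
name in it (check_award, award_date, and so on) is hypothetical.

    # Illustrative sketch only; not the Department's actual source code.
    # All function and field names are hypothetical.

    def check_award(reported: dict, gaps: dict) -> list[str]:
        """Flag recipient-reported values that do not match GAPS."""
        flags = []
        # The comparison must read the true GAPS award date field. The flaw
        # described below amounted to reading a different field (the date the
        # grant application was scanned), which caused every award to be flagged.
        if reported["award_date"] != gaps["award_date"]:
            flags.append("reported award date does not match GAPS")
        if reported["award_number"] != gaps["award_number"]:
            flags.append("reported award number does not match GAPS")
        return flags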

   Award Date

   To compare the recipient-reported award date with the award date in GAPS, the error
   exception report pulled data from a field in GAPS that identified the date the grant
   application was scanned into the system for processing as opposed to the actual grant award
   date field. OCIO staff stated neither they, nor the contracted report developer, knew why the
   grant application scan date field was used on the automated report as opposed to the actual
   award date field. OCIO staff said they validated the error exception report with the
   assistance of the PO staff that used the report when it was first developed. However, we
   found that the information provided to PO staff noted only the data elements from the
   recipient reports that would be compared to data elements in GAPS, not which GAPS fields
   the data were actually being pulled from. As a result of the programming error, the report
   was flagging all awards as having award date exceptions. We found that staff in at least one
   PO had noted problems with this part of the exception report and no longer relied on it;
   however, they did not communicate their concerns to OCIO. As a result of our review,
   OCIO staff stated that they had instructed the report developer to correct the source code and
   review all remaining data fields.

   To determine whether the POs employed other processes to review this field and took action
   to have identified anomalies corrected, we selected a random sample of 42 (10 percent) of the
   418 awards identified as having an award date anomaly from our testing. We reviewed
   FederalReporting.gov to determine whether the Department provided applicable comments to
   these recipients, as this is the required means of communication with recipients for noted
   problems with submitted reports. None of the 42 awards sampled contained comments
   regarding discrepancies noted in the award date field. We subsequently found that none of
   the 42 awards were actually identified by PO staff as having award date anomalies. Staff in
   one PO stated that recipients were given the discretion to use dates other than the date the
   grant was awarded, such as the date the Recovery Act was enacted and the date Recovery
   Act funds became available to states. Staff in another PO said their agency reviewers do not
   have access to GAPS and could not verify if the reported award dates matched the dates in
   the system.

   Amount of Recovery Act Funds Received Compared to the Cumulative Amount of
   Drawdowns

   We found that the source code for the error exception report did not correctly report the
   difference between the amount of Recovery Act funds received and the cumulative amount
   of drawdowns from GAPS as of the end of the reporting quarter. The report provided a value
   of zero for all recipients regardless of what the real difference was. PO staff may have
   interpreted the zero as meaning that reported amounts did not differ from GAPS data and
   therefore initiated no follow-up. OCIO staff said the calculation was corrected in the source
   code for the reporting period ended December 31, 2010, in response to our inquiry.
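
   The flaw and its correction can be illustrated with a minimal Python sketch; the names are
   hypothetical, and this is not the actual report code.

       # Illustrative sketch only; names are hypothetical.

       def drawdown_variance_flawed(reported_received: float, gaps_drawdowns: float) -> float:
           # The flawed logic behaved as if every recipient matched GAPS exactly.
           return 0.0

       def drawdown_variance_corrected(reported_received: float, gaps_drawdowns: float) -> float:
           # The corrected logic reports the real difference, so reviewers can
           # follow up on material variances (e.g., $500,000 or more).
           return reported_received - gaps_drawdowns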

   We subsequently reviewed a sample of 40 (18 percent) of the 226 awards identified as
   having a related anomaly from our testing. The sampled awards had variances between the
   amount of Recovery Act funds received and cumulative amount of GAPS drawdowns that
   were greater than $500,000. We specifically reviewed the Department’s comments to these
   recipients made through FederalReporting.gov and found 6 (15 percent) of the 40 awards
   sampled had comments regarding the relationship between GAPS drawdowns and the
   amount reported as received. For 12 of the 34 awards without comments, POs did identify
   the anomaly but did not enter related comments in FederalReporting.gov. PO staff stated
   they held conversations with the recipients of 10 of these 12 awards to discuss the
   discrepancies. For the remaining two awards, PO staff stated they performed additional
   research into the GAPS drawdowns and found that timing issues from when funds were
   drawn down and actually received by the recipient caused the discrepancies.

   PO staff did not identify this anomaly for 22 of the 34 awards that did not have related
   comments in FederalReporting.gov. One PO stated that these anomalies were not identified
   because staff did not review these two data fields. Staff in another PO said that not all of
   their reviewers have access to GAPS to verify the drawdown amounts.

   Congressional District Identifier

   We also found that the error exception report’s source code did not identify anomalies
   between the reported data and GAPS data for the congressional district identifiers. The
   source code ensured only that the reported data contained an acceptable value. OCIO staff
   stated that if the reported congressional district number was lower than the maximum district
   number possible, then the source code would not compare the reported identifier to the
   identifier in GAPS. OCIO further noted that recipient addresses often change and may not be
   updated in GAPS; therefore, matching against the data in GAPS would likely result in an
   exception. We noted that on December 15, 2009, the Chairman of the Recovery Board
   announced that internal data checks were being incorporated into FederalReporting.gov that
   would prevent a recipient from entering a congressional district that did not match its zip
   code. Recipients were notified that the address on file in the Central Contractor Registration
   database would be the authoritative source for determining the appropriate congressional
   district.
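
   The difference between the range-only check described above and a full cross-check can be
   sketched as follows; the names are hypothetical, and the maximum district number is an
   assumed placeholder.

       # Illustrative sketch only; names and MAX_DISTRICT are hypothetical.
       MAX_DISTRICT = 53

       def district_check_range_only(reported: int) -> bool:
           # Mirrors the behavior described above: any in-range value passes,
           # so a plausible but wrong district is never compared to GAPS.
           return 1 <= reported <= MAX_DISTRICT

       def district_check_full(reported: int, gaps: int) -> bool:
           # A full cross-check also requires agreement with GAPS, subject to
           # the address-currency caveat that OCIO noted.
           return 1 <= reported <= MAX_DISTRICT and reported == gaps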

   Amount of Recovery Act Funds Expended Greater Than Amount of Recovery Act Funds
   Received

   We identified 253 anomalies in which the amount of Recovery Act funds expended was
   greater than the amount of funds received. We reviewed a sample of 41 anomalies
   (16 percent) that had an amount expended that exceeded the amount of funds received by
   $1 million or greater. We noted that all of the exceptions sampled were flagged on the
   Department’s analysis/anomalies report. Of the 41 sampled awards, 5 (12 percent) had a
   related comment to the recipient in FederalReporting.gov and 36 did not have related
   comments. For 2 of the 36 awards without comments, PO staff stated they made verbal
   contact with the recipients to discuss the anomaly and the recipients provided explanations
   for the differences between the reported amounts. PO staff did not follow up on the noted
   anomalies in 34 of the 36 awards. Staff in one PO stated that funds expended greater than
   funds received were allowable if the total amount expended did not exceed the award
   amount. Staff in another PO said many recipients use a reimbursement method for
   drawdowns; therefore, it may be correct to have an amount of funds expended that was
   higher than the amount of funds received. We agree with the statements made by PO staff
   with regard to this type of anomaly.
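
   Given the PO staff explanations, a more meaningful automated flag for this data element
   would compare expenditures to the total award amount rather than to funds received. A
   minimal sketch, with hypothetical names:

       # Illustrative sketch only; names are hypothetical.

       def expenditure_flag(expended: float, award_amount: float) -> bool:
           # Expended greater than received can be legitimate under a
           # reimbursement method, so flag only expenditures that exceed
           # the award itself.
           return expended > award_amount
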
   Award Numbers

   Of the 394 awards that we found with inconsistent award numbers between reported data and
   GAPS data, 372 awards (94 percent) were from the FWS program. We noted that the
   Department’s Tip Sheet for FWS awards instructed recipients to enter an award number that
   included two additional alphabetic characters beyond what was recorded in GAPS. Federal
   Student Aid staff stated that this was the designation for older FWS awards and that
   recipients were permitted to use either format for the award number.

Contract Related Anomalies

Duplicate data for individual contracts occurred when Contracts and Acquisitions Management
(CAM) staff requested that recipients correct their reported data. It appeared that instead of
correcting the original data entered, additional records were created. The one anomaly that we
identified occurred because the recipient placed data in an incorrect field. With regard to the
missing contract award on the Department’s Master List, CAM staff stated that the recipient had
incorrectly marked a previous report as its final report and therefore it was no longer included on
the Master List. CAM staff stated they would contact the recipient and have it submit a final
report in the next reporting cycle, showing the correct information, and ensure it was added back
to the Master List. CAM staff stated that according to the Recovery Board, this would be an
appropriate corrective action for this situation.

Recipient reports are designed to provide the public with transparency as to how Recovery Act
funds are being spent in their communities. In addition, other reports are generated from these
data that are subject to public scrutiny and are intended in part to help drive accountability for
the spending of Recovery Act dollars. Therefore, it is essential that agencies have an effective
review process to ensure that recipient reports contain accurate and complete data. Although we
noted a low anomaly rate in the Department’s recipient reports, incorrect data may lead to
mistaken conclusions about the funding and may obscure the transparency that these reports
were designed to provide to the public, Congress, and the Recovery Board.

RECOMMENDATIONS

We recommend that the Deputy Secretary:

1.1  In coordination with the Chief Information Officer, ensure that the programming source
     codes for automated Recovery Act reports currently in use and those developed in the
     future are appropriately validated.

1.2  To facilitate validation efforts, ensure current and future recipient reporting guidance
     requires data to be reported that is consistent with data stored in official Department
     systems.

1.3  Ensure applicable staff review all required reporting elements and have access to GAPS
     and other data sources necessary for data validation.

1.4  In any future related efforts, ensure automated reports are effectively used to enable an
     efficient means of tracking recipients with chronic problems to better focus technical
     assistance efforts.

ODS Comments

ODS stated that it agreed with the draft report recommendations and will use the
recommendations to support its ongoing efforts to continuously improve the quality of recipient-
reported data. In its comments, ODS outlined actions that it had already taken to address the
draft report recommendations. This included testing and correcting automated report source
codes and using the newly available Automated Data Correction tool to fix errors and address
chronic recipient errors from prior quarters. ODS further identified additional planned corrective
actions such as the implementation of a validation process for automated reports to be completed
prior to each reporting period, the review and update of existing recipient reporting guidance,
and the issuance of a reminder to program offices to ensure appropriate staff complete the
approval process for access to the Department’s grants management system.

                                      OTHER MATTERS
During our audit, we reviewed FedBizOpps (FBO) for contract awards that were funded by the
Recovery Act to determine the completeness of the Department’s Master List for the period
ended June 30, 2010. As part of this review, we identified 2 (8 percent) of 26 contract awards
that were on the Department’s Master List, but did not have presolicitation and award notices
posted on FBO as required. One of the contracts was issued on March 25, 2010, as a
modification against a General Services Administration Schedule Delivery Order for additional
services in the amount of $50,000. The other contract was awarded on June 8, 2010, as a
purchase order for services totaling approximately $31,000. CAM staff stated that notices for
these awards were not posted on FBO due to an oversight by staff. Subsequent to our review,
award notices for both contract awards were publicized by CAM on FBO.

Federal Acquisition Regulation (FAR) 5.7, “Publicizing Requirements Under the American
Recovery and Reinvestment Act of 2009,” effective March 31, 2009, requires presolicitation and
award notices to be posted on FBO for actions expected to exceed $25,000, funded in whole or
in part by the Recovery Act.

Publicizing contract opportunities and award information that were funded in whole or in part by
the Recovery Act enhances transparency to the public. We suggest that the Department ensure
all pre-award and post-award notices for future Recovery Act contract actions are publicized as
required by the FAR.
                             SCOPE AND METHODOLOGY

To accomplish our objective, we performed a review of internal control applicable to the
Department’s processes to ensure the accuracy and completeness of recipient-reported data. This
included reviews of applicable Federal laws and regulations; OMB memoranda; prior audit
reports from OIG and other agencies, through which we sought to identify any potential
vulnerabilities in this area; and the Government Accountability Office’s “Standards for Internal
Control in the Federal Government.” We reviewed Department policies and procedures, as well
as Department guidance made available to Recovery Act funding recipients. We conducted
interviews with appropriate Department officials to gain an understanding of the data quality
review and recipient notification processes.

To perform our audit, we extracted the March 31, 2010, Prime Recipient Report from
FederalReporting.gov. This file contained data for 1,723 unique Recovery Act awards, which
included 1,688 grants and 35 contracts. The file contained 83 data elements that could contain
responses for each award. In addition, we obtained the Department’s Master List of Recovery
Act awards for the period ended June 30, 2010, from OCIO staff. The Master List contained
1,430 unique Recovery Act awards, which included 1,404 grants and 26 contracts.

Prime Recipient Report for the Period Ended March 31, 2010

To evaluate the effectiveness of the Department’s data quality review processes, we
judgmentally identified data quality checks based on our review of OMB guidance, the
Department’s internal policies, and analysis of the relationship between data within the Prime
Recipient Report and the Department’s GAPS. We performed these data quality checks for all
1,688 grant awards, as applicable, to determine whether recipient-reported data were consistent
with grant data maintained in GAPS, data provided to recipients in the Tip Sheets, and other data
elements.

To determine whether PO staff were reviewing the areas with the highest discrepancy rates and
taking action to have the anomalies corrected, we selected samples of awards to review as
follows:
        •  Award Date: We selected a random sample of 42 (10 percent) of the 418 awards
           that we identified as having this anomaly.
        •  Amount of Recovery Act Funds Received Compared to the Cumulative Amount of
           Drawdowns: We judgmentally selected a sample of 40 (18 percent) of the 226
           awards that we identified as having a related anomaly from our testing. These
           sampled awards had variances between the amount of Recovery Act funds received
           and cumulative amount of GAPS drawdowns that were greater than $500,000.
        •  Amount of Recovery Act Funds Expended Greater Than the Amount of Recovery Act
           Funds Received: We judgmentally selected a sample of 41 (16 percent) of the 253
           awards that we identified as having a related anomaly from our testing. These awards
           had an amount reported as expended that exceeded the amount of funds received by
           $1 million or greater.
For each of the 123 sampled awards, we reviewed the Department’s automated reports to
determine whether the anomalies were flagged. We also reviewed comments to these recipients
made through FederalReporting.gov and determined whether comments were made regarding the
anomalies in each of the specific areas we identified. If we did not identify related comments,
we then determined whether the Department identified the discrepancies at all and had
communicated them to the recipients in another manner.

To evaluate the effectiveness of the Department’s data quality review processes for contract
awards, we judgmentally identified data quality checks based on our review of OMB guidance,
the Department’s internal policies, and analysis of the relationship between data within the Prime
Recipient Report and the official contract file maintained by the Department. We selected a
random sample of 5 (18 percent) of the 28 contracts [4] from the Prime Recipient Report and
performed data checks to determine whether reported data were consistent with the contract file
documentation and other data elements.

To evaluate the completeness of recipient-reported data for the 1,688 grants and 5 randomly
sampled contracts, we examined all 83 data elements on the Prime Recipient Report, determined
which elements would be expected to have reported values for all awards, and reviewed these
categories for incomplete data.

Our work was limited to an assessment of the Department’s processes for reviewing recipient
reports and identifying data discrepancies. We did not perform work to assess the actual quality
of the data reported by recipients.

Master List for the Period Ended June 30, 2010

To determine whether the Department’s Master List was complete for grant awards, we extracted
awards from GAPS that had Recovery Act funded Catalog of Federal Domestic Assistance
numbers, award dates prior to July 1, 2010, award amounts greater than $25,000, and had not
submitted a final recipient report in a prior reporting period. We compared these awards to the
1,404 grants on the Department’s Master List to identify any awards that were not contained on
the Master List, but had recipient reporting requirements.
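
A minimal sketch of this completeness test, with a hypothetical record layout and field names:

    # Illustrative sketch only; the record layout and field names are hypothetical.
    from datetime import date

    def expected_reporters(gaps_awards, recovery_cfda_numbers):
        """Award numbers in GAPS that carry Recovery Act reporting requirements."""
        return {
            a["award_number"]
            for a in gaps_awards
            if a["cfda"] in recovery_cfda_numbers
            and a["award_date"] < date(2010, 7, 1)
            and a["award_amount"] > 25_000
            and not a["final_report_submitted"]
        }

    def missing_from_master_list(gaps_awards, recovery_cfda_numbers, master_list):
        # Awards expected to report but absent from the Master List.
        return expected_reporters(gaps_awards, recovery_cfda_numbers) - set(master_list)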

To determine whether the Department’s Master List was complete for contract awards, we
compared the 26 contracts on the Master List to the Recovery Act funded contracts that were
awarded prior to July 1, 2010, and posted on FBO. We also compared the 26 contracts to the
recipient reports for the periods ended March 31, 2010, and June 30, 2010, to identify contracts
that were not completed and required to report.

This audit did not include a review of non-reporting recipients due to previous related work that
we performed in this area as part of a Recovery Board request in February 2010. [5]

[4] The Department’s Recovery Act Prime Recipient Report for the period ended March 31, 2010,
included 35 contracts. During our review, we noted that three contracts each had two duplicative
reports. One contract was not a contract awarded by this Department.
[5] “Recovery Act Data Quality: Errors in Recipients’ Reports Obscure Transparency,” issued
February 23, 2010.
The audit itself was a test of the reliability of computer-processed data in FederalReporting.gov
for the Department’s Recovery Act awards. As part of this audit, we compared the recipient-
reported data to data in the Department’s GAPS. GAPS is the official system of record for grant
awards, widely used by Department officials, and considered the best available data for the
purpose of this audit.

We conducted fieldwork at Department offices in Washington, D.C., from July 2010 through
April 2011. We provided our audit results to Department officials during an exit conference
conducted on May 2, 2011.

We conducted this performance audit in accordance with generally accepted government
auditing standards. Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions
based on our audit objectives. We believe that the evidence obtained provides a reasonable basis
for our findings and conclusions based on the audit objectives.
                                                                            Attachment 1

                   Results of Data Quality Tests for Grant Awards

                      Data Quality Test                       Percentage of Awards
                                                                 with Anomalies
1    Award type                                                        2
2    Award number                                                     23
3    Funding agency code                                               2
4    Awarding agency code                                             2
5    Award date (excludes FWS awards)                                 44
6    Award amount                                                     3
7    Catalog of Federal Domestic Assistance (CFDA)                     2
     number
8    Program source/Treasury Account Symbol (TAS)                      2
9    Recipient Data Universal Numbering System (DUNS)                  5
     vs. DUNS in GAPS
10   Recipient state                                                  0.3
11   Congressional district identifier                                16
12   Agency code vs. TAS                                              0.2
13   CFDA vs. awarding/funding agency                                 0.1
14   Award type vs. agency code                                        0
15   Final report vs. project status                                   1
16   Final report vs. ARRA funds received                             0.5
17   Project status vs. ARRA funds received                           0.4
18   Project status vs. amount of ARRA expenditure (project            3
     completed, but all funds not expended)
19   Project status vs. amount of ARRA expenditure (project            1
     not initiated, but funds expended)
20   Award date vs. jobs created/retained                             0.2
21   Award date vs. projects completed                                0.2
22   Amount of ARRA expenditure vs. award amount                       0
23   Amount of ARRA expenditure vs. ARRA funds                        15
     received
24   Amount of ARRA expenditure vs. number of jobs                     1
     created/retained
25   Amount of ARRA funds received vs. total expenditure              24
     in GAPS (excludes FWS awards)
26   Amount of ARRA funds received vs. award amount                    0
27   Amount of ARRA infrastructure expenditure vs. CFDA                0
28   Amount of ARRA infrastructure expenditure vs. total               0
     ARRA expenditure
29   Total amount of sub-awards less than $25,000/award               0.2
     vs. total number of sub-awards less than $25,000/award
30   Recipient highly compensated officers                             2
                                                                           Attachment 2

            Results of Data Quality Tests for Sampled Contract Awards

                       Data Quality Test                      Percentage of Sampled
                                                             Contracts with Anomalies
 1   Award type                                                         0
 2   Award number                                                      20
 3   Funding agency code                                                0
 4   Awarding agency code                                               0
 5   Award date                                                         0
 6   Award amount                                                       0
 7   TAS code                                                           0
 8   Recipient DUNS vs. DUNS in GAPS                                    0
 9   Recipient name                                                     0
10   Recipient state                                                    0
11   Congressional district identifier                                  0
12   Order number                                                       0
13   Government contracting office code                                 0
14   Agency code vs. TAS                                                0
15   Award type vs. agency code                                         0
16   Final report vs. project status                                    0
17   Final report vs. ARRA funds invoiced                               0
18   Project status vs. ARRA funds invoiced                             0
19   Award date vs. jobs created/retained                               0
20   Award date vs. projects completed                                  0
21   Amount of ARRA funds received vs. award amount                     0
22   Total amount of sub-awards less than $25,000/award                 0
     vs. total number of sub-awards less than $25,000/award
                                                                              Attachment 3

               Abbreviations, Acronyms, and Short Forms Used in this Report


CAM                  Contracts and Acquisitions Management

CFDA                 Catalog of Federal Domestic Assistance

Department           U.S. Department of Education

DUNS                 Data Universal Numbering System

FAR                  Federal Acquisition Regulation

FBO                  FedBizOpps

FWS                  Federal Work Study

GAPS                 Grants Administration and Payment System

OCIO                 Office of the Chief Information Officer

ODS                  Office of the Deputy Secretary

OIG                  Office of Inspector General

OMB                  Office of Management and Budget

PO                   Principal Office

Recovery Act         American Recovery and Reinvestment Act of 2009

Recovery Board       Recovery Accountability and Transparency Board

TAS                  Treasury Account Symbol
                                                                                          Attachment 4


                      UNITED STATES DEPARTMENT OF EDUCATION

                                                                THE DEPUTY SECRETARY

                                           August 1, 2011



MEMORANDUM

TO:             Michele Weaver-Dugan
                Director, Operations Internal Audit Team
                Office of Inspector General

FROM:           Tony Miller /s/

SUBJECT:        Draft Audit Report, American Recovery and Reinvestment Act: The
                Effectiveness of the Department’s Data Quality Review Processes
                (ED-OIG/A19K0010)

Thank you for the opportunity to comment on the draft audit report, “American Recovery and
Reinvestment Act: The Effectiveness of the Department’s Data Quality Review Processes.” We
are encouraged that the Office of Inspector General (OIG) determined that the Department’s
processes to ensure the accuracy and completeness of recipient reported data were generally
effective, notwithstanding the opportunities for improvement. As your report notes, the
Department established and implemented policies and procedures for performing data quality
reviews, which included, for example, automated data checks, manual reviews of recipient-
reported data against specific grant program or contract criteria, and written guidance and
technical assistance to recipients of American Recovery and Reinvestment Act (Recovery Act)
funds. We are encouraged that 96 percent (47,107 of 49,150) of the data quality tests that OIG
performed identified no anomalies and that the anomalies identified were not in areas of great
significance (e.g., award numbers, award dates). We fully appreciate the importance of
providing the public with accurate and complete data about how Recovery Act funds are being
spent and that an effective review process is essential to that effort. Therefore, we appreciate and
agree with your recommendations and will use them to support our ongoing efforts to
continuously improve the quality of recipient-reported data. Following is the Department’s
response to each recommendation.

The Deputy Secretary:

1. In coordination with the Chief Information Officer, ensure that the programming
   source codes for automated Recovery Act reports currently in use and those developed
   in the future are appropriately validated.

   As noted in the report, the Department already has corrected source code errors for
   automated Recovery Act reports and conducted a technical walk-through of all remaining
   data fields. With the implementation of the Department’s new grants management system
   (G5), codes were converted and retested with G5. We also note that some data fields
   identified as anomalies in this report (e.g., award dates) would no longer be considered
   anomalies based on updated guidance from the Office of Management and Budget (OMB).

   Corrective Action: To ensure that the programming source codes developed in the future are
   appropriately validated, the Department’s Office of the Chief Information Officer (OCIO)
   will update the internal Data Quality Review procedures to require a validation process and
   test report be completed prior to each new reporting period. The update to the procedures
   will be completed by August 30, 2011.

2. To facilitate validation efforts, ensure current and future recipient reporting guidance
   requires data to be reported that is consistent with data stored in official Department
   systems.

   Corrective Action: To facilitate validation efforts, the Department’s Metrics and Monitoring
   Team will review existing recipient reporting guidance and, where applicable, update the
   information to require data to be reported that are consistent with data stored in the official
   Department systems. Updated guidance will be communicated to recipients by
   September 30, 2011.

3. Ensure applicable staff review all required reporting elements and have access to GAPS
   and other data sources necessary for data validation.

   The Department’s data quality processes are aligned with OMB’s guidance, and the
   Department already requires program offices to ensure that all applicable staff conduct
   thorough reviews of their recipients’ reports. OCIO currently has procedures and forms in
   place for all appropriate Department personnel to obtain access to the G5 system.

   Corrective Actions: OCIO will develop and implement a process to regularly monitor and
   ensure that program offices are reviewing their recipient reports. The process will be
   developed and implemented by August 30, 2011. In addition, the link to the most current
   User Access Request Form will be distributed by OCIO to applicable program offices by
   August 15, 2011, with a reminder for the program office to ensure that all appropriate staff
   complete the approval process to access the G5 system.

4. In any future related efforts, ensure automated reports are effectively used to enable an
   efficient means of tracking recipients with chronic problems to better focus technical
   assistance efforts.

   The Department has been tracking multiple-time non-reporters for several reporting cycles.
   The Department began tracking recipients with incorrect award numbers across multiple
   cycles during the Continuous Correction portion of the 2011 Q1 cycle. In addition, during
   the April 2011 reporting period, the Department began using the newly available Automated
   Data Correction tool to fix errors and address chronic errors by recipients in previous
   quarters.

   Corrective Action: The Department will provide OIG with a report of this tracking activity
   by August 30, 2011.

     Anyone knowing of fraud, waste, or abuse involving
      U.S. Department of Education funds or programs
  should call, write, or e-mail the Office of Inspector General.

                          Call toll-free:
                   The Inspector General Hotline
                1-800-MISUSED (1-800-647-8733)

                             Or write:
                     Inspector General Hotline
                   U.S. Department of Education
                    Office of Inspector General
                         550 12th St. S.W.
                      Washington, DC 20024

                             Or e-mail:
                         oig.hotline@ed.gov

    Your report may be made anonymously or in confidence.

For information on identity theft prevention for students and schools,
  visit the Office of Inspector General Identity Theft Web site at:
                        www.ed.gov/misused




     The Department of Education’s mission is to promote student achievement and
      preparation for global competitiveness by fostering educational excellence
                             and ensuring equal access.

                                  www.ed.gov