
HUD's Proposed HOME Regulations Generally Addressed Systemic Deficiencies, but Field Office Monitoring and Data Validation Need Improvement

Published by the Department of Housing and Urban Development, Office of Inspector General on 2013-02-12.

Below is a raw (and likely hideous) rendition of the original report. (PDF)

OFFICE OF AUDIT
REGION 1
BOSTON, MA




          The U.S. Department of Housing and Urban Development

           HOME Investment Partnerships Program




2013-BO-0001                             FEBRUARY 12, 2013
                                                        Issue Date: February 12, 2013

                                                        Audit Report Number: 2013-BO-0001




TO:            Mark Johnston
               Acting Assistant Secretary, Community Planning and Development, D



FROM:          Edgar Moore,
               Regional Inspector General for Audit, Boston Region, 1AGA


SUBJECT:       HUD’s Proposed HOME Regulations Generally Addressed Systemic
               Deficiencies, but Field Office Monitoring and Data Validation Need Improvement


    Enclosed is the U.S. Department of Housing and Urban Development (HUD), Office of
Inspector General’s (OIG) final results of our review of HUD’s HOME Investment Partnerships
Program and systemic deficiencies identified within prior Office of Inspector General audit
reports.

    HUD Handbook 2000.06, REV-4, sets specific timeframes for management decisions on
recommended corrective actions. For each recommendation without a management decision,
please respond and provide status reports in accordance with the HUD Handbook. Please furnish
us copies of any correspondence or directives issued because of the audit.

    The Inspector General Act, Title 5 United States Code, section 8L, requires that OIG post its
publicly available reports on the OIG Web site. Accordingly, this report will be posted at
http://www.hudoig.gov.

If you have any questions or comments about this report, please do not hesitate to call me at 212-
264-4174.



cc: Yolanda Chavez, Deputy Assistant Secretary for Grant Programs, DG




Highlights
Audit Report Number 2013-BO-0001


 What We Audited and Why

We reviewed the U.S. Department of Housing and Urban Development’s (HUD) HOME Investment Partnerships Program (HOME) as part of an Office of Inspector General (OIG) plan to improve HUD’s execution and accountability of fiscal responsibility. Our objective was to determine whether HUD’s proposed regulation changes and controls would mitigate the systemic deficiencies identified in prior OIG audit reports.

 What We Found

If properly implemented, HUD’s proposed changes to HOME regulations and controls should mitigate the systemic deficiencies identified in prior HUD OIG audit reports with the exception of (1) the program office’s oversight of grantee monitoring and (2) validating the reliability of HOME data.1

Office of Community Planning and Development (CPD) program officials’ oversight of field office monitoring and grantee compliance required improvement because the quality management review process they relied on failed to identify systemic monitoring flaws and officials did not use onsite monitoring data to assess monitoring efforts. As a result, officials could not ensure that monitoring was complete and effective and may have missed opportunities to identify systemic issues requiring corrective action, such as seldom- or never-monitored grantees and longstanding noncompliant grantees.

Although CPD officials had improved controls over HOME data in the Integrated Disbursement and Information System, they lacked a complete process for validating the data. They focused their efforts on training, moving the database to a Web-based system, and implementing system controls to improve grantee compliance and data reliability. However, the HOME data were not fully validated, and the reliability of the data as a whole was unknown. With hundreds of grantees and thousands of subgrantees, reliable data are critical in overseeing the program, identifying high-risk grantees to monitor, and responding to public and congressional requests regarding the program.

 What We Recommend

We recommend that the Acting Assistant Secretary for Community Planning and Development (1) develop and implement procedures to oversee and assess the effectiveness of field offices’ monitoring efforts and (2) develop and implement a quality control system to validate the accuracy and reliability of HOME data in the Integrated Disbursement and Information System.

1 See appendix C for our detailed conclusions.
                           TABLE OF CONTENTS

Background and Objective

Results of Audit

      Finding 1:    HUD Officials’ Oversight of Field Office Monitoring Efforts and
                    Grantee Compliance Had Weaknesses
      Finding 2:    HUD Officials Had Improved Controls over HOME Data, but Data
                    Reliability Was Insufficient

Scope and Methodology

Internal Controls

Follow-up on Prior Audits

Appendixes

A.    Details of Open Internal Audits and Recommendations
B.    Auditee Comments and OIG’s Evaluation
C.    Conclusions Regarding Systemic Deficiencies



                           BACKGROUND AND OBJECTIVE

The HOME Investment Partnerships Program, established in 1992, provides between $1 and $2
billion in formula grants each year to States and local jurisdictions (grantees). Grantees use and
distribute the funds to communities and nonprofit groups to build, buy, or rehabilitate affordable
housing for rent or home ownership, or to provide direct rental assistance to low-income people.

HOME is a large program with approximately 642 grantees, thousands of subrecipients, and
more than 15,000 open activities at any one time. The U.S. Department of Housing and Urban
Development’s (HUD) Assistant Secretary for Community Planning and Development (CPD) is
responsible for the program, and the Office of Affordable Housing Programs directly administers
and oversees the program.

Monitoring at the grantee level is achieved primarily through onsite performance and compliance
reviews conducted by HUD’s 42 local field offices. Due to the large number of participants and
its inability to monitor all grantees onsite, HUD also relies on its automated Integrated
Disbursement and Information System to electronically monitor grantees. Grantees, in turn, are
responsible for monitoring their subgrantees.

HUD maintains two information systems to manage the program. The Integrated Disbursement
and Information System (IDIS) reports program performance and is used for oversight and
grantee compliance. This system was moved to a Web-based platform in 2009, enabling
substantial improvements including new input controls, flags, and system reports to enhance
reporting and compliance. The Grants Management Process (GMP) database is used to record
monitoring efforts and results and to facilitate the selection of high-risk grantees for monitoring.

The HOME regulations were last substantively revised in September 1996, and the Office of
Affordable Housing Programs is in the process of updating the regulations to address known
issues.2 We expect the revised regulations to be published after the issuance of this report.

The HOME program and HUD’s oversight received considerable public scrutiny in Washington
Post articles and congressional hearings. Congress expressed its concern when it reduced the
2012 HOME budget to $1 billion and, as a condition of funding, required that HUD report within
120 days on how CPD was improving its program’s data quality, data management, and grantee
oversight and accountability, including addressing problems identified in Office of Inspector
General reports since 2006 and ongoing audits.3

Due to our longstanding concerns and congressional requests, we performed this audit with the
objective of determining whether HUD’s proposed regulations and other controls, if properly
implemented, would mitigate the systemic findings in prior Office of Inspector General (OIG)
audit reports.

2 24 CFR (Code of Federal Regulations) Part 92, The HOME Investment Partnerships Program Final Rule
3 Section 232 of the Consolidated and Further Continuing Appropriations Act, 2012 (P.L. 112-55)


                                       RESULTS OF AUDIT


Finding 1: HUD Officials’ Oversight of Field Office Monitoring Efforts
           and Grantee Compliance Had Weaknesses
CPD program officials’ oversight of field office monitoring of grantee compliance was not
sufficient to ensure that monitoring was effective and complete. HUD headquarters officials did
not determine whether field office monitoring efforts were effective, identify systemic
deficiencies, or oversee the monitoring of non-high-risk grantees.4 This condition occurred
because the quality management review process they relied on failed to identify systemic
monitoring flaws and program officials did not use the information derived during onsite
performance reviews to assess monitoring efforts. As a result, program officials could not ensure
that monitoring was complete and effective and may have missed opportunities to identify
systemic issues requiring corrective action, such as seldom- or never-monitored grantees and
longstanding noncompliant grantees. In addition, program officials did not assess the monitoring
of grantees that field offices determined were not high risk to ensure the soundness of risk
assessments and obtain early warnings of potential deficiencies.


    Program Officials Did Not Determine Whether Monitoring Was Effective and Complete

                 HUD’s policy is for program officials to continually assess the effectiveness of
                 grantee monitoring.5 Therefore, officials maintain several systems and processes
                 to facilitate the policy, including (1) quality management reviews that evaluate
                 field office monitoring efforts and (2) field office onsite monitoring results that
                 provide grantee performance and compliance data in the GMP database.
                 However, the quality management reviews were not adequate to ensure that
                 monitoring was effective, and officials did not evaluate field offices’ monitoring
                 results to determine whether monitoring was effective and complete.6

                 When Congress asked for program details, HUD queried the GMP database and
                 reported that 238 HOME reviews were completed during 2009 and 2010 and
                 identified 591 compliance and performance findings. However, program officials
                 did not routinely use the database to determine whether monitoring was effective
                 or complete. Rather, they relied on field offices to oversee monitoring and the
                 resolution of findings. As a result, officials did not know and could not readily
                 show whether the field offices’ monitoring efforts were effective and how many
                 of the 591 findings and any later findings had been resolved.

4 HUD Monitoring Desk Guide and CPD Notice 12-02
5 HUD Monitoring Desk Guide
6 According to the U.S. Government Accountability Office, monitoring is complete only when deficiencies are corrected, the corrective action produces improvements, and it is decided that further management action is not needed.

    Quality Management Reviews Did Not Identify Deficiencies

                The program office relied on its quality management review process to assess
                monitoring efforts. The most recent review reported that field offices followed
                the monitoring handbook and effectively carried out monitoring.7 However, an
                audit completed 3 months later revealed that the monitoring handbook was not
                always followed and findings were not followed up on in a timely manner.8 For
                example, OIG reported that field offices failed to use required monitoring
                handbook exhibits and to document follow-up with grantees that failed to meet
                target dates. Thus, the reviews were not an effective tool for identifying
                monitoring deficiencies and should not be relied on as a sole source for assessing
                and overseeing monitoring.

    The Program Office Did Not Oversee Monitoring of Non-High-Risk Grantees

                HUD’s policy is that field offices should monitor a limited number of grantees
                that they determine to be non-high risk to validate the soundness of the risk
                assessment rating criteria and obtain early warnings of potentially serious
                problems.9 Program officials said that some reviews were conducted; however,
                the number completed and results were unknown. Therefore, without overseeing,
                documenting, and evaluating non-high-risk grantee monitoring results, field
                offices may not have tested a sufficient number of non-high-risk grantees, their
                risk assessments may not have been sound, and the highest risk grantees may not
                have been selected for monitoring. In addition, the program office may have lost
                opportunities to obtain early warnings of potentially serious problems.

    Data Were Available To Assess Monitoring

                During our review, we determined that HUD program officials can assess the
                effectiveness of field monitoring by using data in the GMP database. The
                database can identify metrics such as

                                 Grantees monitored and not monitored,
                                 Areas tested and not tested,
                                 Types of findings and concerns,
                                 Continually noncompliant grantees, and
                                 The resolution of findings and concerns.

7 HUD’s Fiscal Year 2011 Quality Management Review Report
8 OIG Audit Report 2012-FO-0003, Additional Details To Supplement Our Report on HUD’s Fiscal Years 2011 and 2010 Financial Statements
9 If travel resources permit, according to CPD Notice 12-02

                   Thus, the program office could and should use the data to assess monitoring
                   efforts using the above metrics. The procedures should ensure that (1) seldom- or
                   never-monitored grantees are identified and minimized to reduce the fraud risk,
                   (2) monitoring provides adequate program coverage of known systemic
                   deficiencies,10 (3) findings and results are analyzed to identify systemic
                   deficiencies requiring additional management emphasis, (4) continually
                   noncompliant grantees are identified and appropriate corrective action is taken,
                   and (5) monitoring is complete or appropriate action is taken for grantees that
                   have not resolved a noncompliance in a timely manner.11
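
                   As an illustration only (not HUD code or an existing HUD procedure), the
                   sketch below shows one way such metrics could be rolled up from an export of
                   GMP monitoring records. The file layout and the column names (grantee_id,
                   review_date, finding_id, finding_status) are hypothetical placeholders, not
                   the actual GMP schema, and the 3-year threshold is an illustrative choice.

# Illustrative sketch only -- not HUD code. Assumes a hypothetical CSV export of
# GMP monitoring records; the column names used here are placeholders, not the
# actual GMP schema.
import csv
from collections import defaultdict
from datetime import datetime

def summarize_monitoring(gmp_export_path, all_grantee_ids, as_of):
    """Roll up hypothetical GMP records into oversight metrics like those listed above."""
    last_review = {}                  # grantee_id -> most recent review date
    open_findings = defaultdict(int)  # grantee_id -> count of unresolved findings
    with open(gmp_export_path, newline="") as fh:
        for row in csv.DictReader(fh):
            gid = row["grantee_id"]
            review = datetime.strptime(row["review_date"], "%Y-%m-%d")
            if gid not in last_review or review > last_review[gid]:
                last_review[gid] = review
            if row["finding_id"] and row["finding_status"].lower() != "resolved":
                open_findings[gid] += 1
    return {
        # Grantees with no onsite review on record (never monitored).
        "never_monitored": [g for g in all_grantee_ids if g not in last_review],
        # Grantees not reviewed in more than 3 years (seldom monitored).
        "not_recently_monitored": [g for g, d in last_review.items()
                                   if (as_of - d).days > 3 * 365],
        # Grantees with findings still open (candidates for corrective action).
        "open_findings_by_grantee": dict(open_findings),
    }

                   Run against a complete export, a summary of this kind would let program
                   officials see at a glance which grantees have never been visited, which were
                   last visited years ago, and which findings remain unresolved.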

     Conclusion

                   Program officials could not show that monitoring efforts were effective and
                   complete. As a result, the fraud risk for grantees seldom or never monitored was
                   not known and may not have been mitigated; systemic deficiencies may not have
                   been tested, identified, and mitigated; findings may not have been resolved in a
                   timely manner; and continually noncompliant grantees may not have been
                   identified and appropriate corrective action not taken to preserve the integrity of
                   the program and conserve HUD resources. Consequently, program officials’
                   oversight of field office monitoring efforts was insufficient. We attributed this
                   condition to reliance on ineffective quality management reviews12 and the lack of
                   procedures to evaluate monitoring results in the GMP database.

     Recommendations

                   We recommend that the Acting Assistant Secretary for Community Planning and
                   Development

                   1A. Develop and implement comprehensive procedures to assess the effectiveness
                      and completeness of monitoring efforts using metric or query data in the GMP
                      database as detailed in this finding.

                   1B. Develop and implement procedures to evaluate the field office testing of non-
                      high-risk grantees to ensure the soundness of risk assessments and obtain
                      early warning of potential deficiencies as provided for in HUD CPD Notice
                      12-02.
10 To include testing systemic issues such as program income (see related finding 2)
11 As provided for in 24 CFR 92.551
12 We expect this to be corrected in part during the resolution of OIG Audit Report 2012-FO-0003. See Follow-up on Prior Audits section in appendix A in this report.

Finding 2: HUD Officials Had Improved Controls Over HOME Data,
           but Data Reliability Was Insufficient
Although HUD officials had implemented controls to improve the reliability of HOME data in
the Integrated Disbursement and Information System, they lacked a complete process for
validating the data. This occurred because officials were concerned with implementing data
input controls and had not yet established data validation controls. As a result, they could not
show that the new controls were effective and HOME program data as a whole were complete,
accurate, and supported by appropriate documentation.


     Data Were Not Always Reliable

                  Despite regulations requiring grantees to properly report HOME information, OIG
                  audits have shown that grantees often inaccurately reported HOME data in the
                  Integrated Disbursement and Information System, such as program income,
                  commitments, and activity status, including not closing activities in a timely
                  manner. This condition was primarily due to grantee errors and omissions and
                  known system weaknesses such as the system’s method of accounting for
                  program income.

     HUD Officials Had Improved Controls Over Data Reliability and Compliance

                  HUD’s policy is to validate data for accuracy, completeness, and consistency to
                  the extent possible.13 The program office and its field offices use several
                  processes to promote data integrity and validity. The program office implements
                  system controls to ensure that some data are complete and within parameters. It
                  also generates system reports and posts them on its Web site. The rationale is that
                  if a report showed poor performance or noncompliance due to inaccurate or
                  incomplete data, the field office, local official, and grantee could detect and
                  correct the data.

                  HUD officials were aware of data and compliance issues and focused
                  considerable efforts on training, moving the system from an enterprise-based to a
                  Web-based system, and designing and implementing system controls to improve
                  data reliability and program compliance to address such issues as the following:

                       Program income issues – Officials modified the system to remove
                        limitations that discouraged and prevented grantees from complying with
                        requirements.


13 HUD Monitoring Desk Guide

               Commitment issues – Officials added an electronic certification to confirm
                that commitments complied with requirements and were supported by required
                documentation.

               Activity status and expenditure issues – Officials implemented a process to
                automatically cancel activities for which no funds were spent within their
                first year.

         System flags were also added to alert grantees when they were in danger of not
         meeting regulatory requirements such as the 5-year statutory expenditure limit.
         Therefore, if properly implemented, these additional controls should improve data
         reliability and grantee compliance.

HUD’s Validation Process Was Not Complete

           HUD’s field offices also test and validate data during individual onsite grantee
           performance reviews. However, HUD’s validation efforts were not complete in
           that the program office did not assess the extent of field offices’ data testing, the
           results, and whether data errors and findings had been corrected. For example,
           program officials did not verify whether field offices tested and verified that
           grantees properly reported program income, commitments, and expenditures in
           the HOME database and whether any deficiencies found were resolved. Thus, the
           program office did not know and could not readily show whether HOME data as a
           whole were accurate, complete, and consistent. Further, program officials may
           have missed opportunities for identifying additional systemic data issues.

Monitoring Data Can Be Used To Validate Data and Identify Reliability and Compliance Issues

           We found that field offices’ onsite monitoring results in the GMP database could
           and should be used to validate data. Field offices monitor the reliability of data in
           the Integrated Disbursement and Information System during onsite grantee
           performance reviews and enter their results into the GMP database. The GMP
           database can be queried to show what data tests were completed; their results,
           findings, and concerns; and whether data findings and concerns have been
           resolved.

           Thus, monitoring data in the GMP database can and should be used to assess the
           overall reliability of HOME data in the Integrated Disbursement and Information
           System and improve management of the HOME program. For example, if GMP
           queries show that grantees are properly recording program income, the results can
           be used as a basis to validate the reliability of program income data. If queries
            show that program income was not tested frequently, the program office could
            issue a directive to increase testing. Also, if queries indicate systemic data
            deficiencies or longstanding unresolved findings, the program office could
            investigate, determine the cause, and take action to mitigate them.
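
            As a rough illustration (not HUD code), the sketch below shows how onsite
            results of the kind described above could be rolled up into a program-wide
            estimate of how often program income is reported accurately in IDIS. The
            inputs and the match/no-match coding are hypothetical, and because field
            offices select grantees for monitoring on a risk basis, any real estimate
            would need to account for how the sample was drawn.

# Illustrative sketch only -- not HUD code. Input is a hypothetical list of 1/0
# flags, one per monitored grantee: 1 = program income reported in IDIS matched
# the grantee's records during the onsite review, 0 = a discrepancy was found.
import math

def estimate_accuracy(match_flags, z=1.96):
    """Return the observed accuracy rate and a simple normal-approximation interval."""
    n = len(match_flags)
    if n == 0:
        raise ValueError("no monitoring results to evaluate")
    p = sum(match_flags) / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical example: 10 monitored grantees, 8 with accurate program income.
rate, low, high = estimate_accuracy([1, 1, 0, 1, 1, 1, 0, 1, 1, 1])
print(f"Estimated accuracy {rate:.0%} (approximately {low:.0%} to {high:.0%})")

            The same roll-up works for coverage: a query showing how many monitored
            grantees had program income tested at all would indicate whether a directive
            of the kind described above is needed.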

     Conclusion

                  Although HUD officials had improved controls over HOME data in the Integrated
                  Disbursement and Information System, they did not validate the data to better
                  ensure that controls are effective and the data are reliable. This occurred because
                  officials were concerned with implementing data input controls and had not yet
                  established data validation controls. Reliable data are critical to HUD’s oversight
                  because HUD lacks the resources to visit all 642 participating jurisdiction
                  grantees and observe the 15,000 to 20,000 HOME activities. HUD officials rely
                  on the grantee-provided data to (1) report performance, (2) identify and correct
                  noncompliance, (3) determine which grantees to monitor onsite, and (4)
                  successfully implement the eCon14 system. Thus, reliable data are critical for
                  overseeing program compliance, are a primary source for selecting grantees that
                  will and will not be monitored, and are needed to respond to public and
                  congressional inquiries regarding the program.

     Recommendations

                  We recommend that the Acting Assistant Secretary for Community Planning and
                  Development

                   2A. Develop and implement a quality control system to validate HOME
                      program data recorded in the Integrated Disbursement and Information
                      System by using field office monitoring data in the GMP database or some
                      other auditable method, such as statistical sampling and testing of key
                      program data.

                  2B. Develop and implement formal procedures to continually assess the
                     effectiveness and completeness of field office data monitoring efforts using
                     GMP monitoring data to include (1) verifying that HOME data are tested, (2)
                     analyzing results to determine whether program data as a whole are reliable
                     and to identify systemic data issues or issues that should be addressed, and
                     (3) verifying that findings are corrected in a timely manner and monitoring is
                     complete.




14 The “eCon Planning Suite” is an online tool designed to help grantees with their needs analysis and strategic decision making.

                         SCOPE AND METHODOLOGY

We conducted the audit from our Hartford, CT, field office and at HUD’s Office of Affordable
Housing Programs in Washington, DC, between February and October 2012. The audit scope
generally covered the period between January 2006 and January 2012.

To accomplish our objectives, we

      Reviewed the existing and proposed HOME regulations, the Departmental Management
       Control Program Handbook, the HUD Monitoring Desk Guide, and relevant handbooks
       and notices.

      Interviewed HUD officials to identify and obtain an understanding of controls over the
       HOME program and status of the proposed regulations.

      Reviewed the 77 HUD OIG audit reports issued during the period to identify systemic
       deficiencies and traced questioned costs to the Audit Resolution and Corrective Action
       Tracking System to identify any problems with recovering funds not spent in accordance
       with program requirements.

      Determined whether HUD’s proposed regulation changes, combined with existing and
       proposed controls, would, if properly implemented, provide reasonable assurance that
       systemic deficiencies identified in our reports would be prevented, detected, and corrected.

This audit was limited to a review of policies and procedures and, thus, we did not test the
implementation of the controls. Therefore, our results may be relied upon only if HUD properly
implements its proposed regulations and existing and planned controls, such as updating the
monitoring handbook if and when the proposed regulation is final and taking appropriate
corrective and remedial actions for noncompliant grantees.

Regarding our reliance on automated data,

(1) We relied on HUD’s automated Audit Resolution and Corrective Action Tracking System to
    identify the status of audit findings and the recovery of questioned costs. The risk of
    inaccurate data was low due to system controls and separation of duties between the audit
    and HUD officials responsible for maintaining the system. Thus, we performed minimal
    exception testing by following up with audit officials and grantees to verify the accuracy of
    data indicating problems with resolving findings or recovering questioned costs. Our limited
    testing indicated no material data errors. Thus, we believe the data were reliable for our audit
    objectives.

(2) We considered data in HUD’s GMP system. We used these data obtained by HUD to show
    the number of grantees tested and findings and concerns. These data did not materially affect
    our results; thus, we considered the data adequate for our purposes.


(3) We considered data in HUD’s Integrated Disbursement and Information System. Our audit
    reports showed that the data reliability for this system was a systemic deficiency. Thus, we
    did not test the data during this audit and recommended that HUD validate the data (see
    finding 2).

We conducted the audit in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain sufficient, appropriate
evidence to provide a reasonable basis for our findings and conclusions based on our audit
objective(s). We believe that the evidence obtained provides a reasonable basis for our findings
and conclusions based on our audit objective.




                              INTERNAL CONTROLS

Internal control is a process adopted by those charged with governance and management,
designed to provide reasonable assurance about the achievement of the organization’s mission,
goals, and objectives with regard to

              Effectiveness and efficiency of operations,
              Reliability of financial reporting, and
              Compliance with applicable laws and regulations.

Internal controls comprise the plans, policies, methods, and procedures used to meet the
organization’s mission, goals, and objectives. Internal controls include the processes and
procedures for planning, organizing, directing, and controlling program operations as well as the
systems for measuring, reporting, and monitoring program performance.



 Relevant Internal Controls

               We determined that internal controls over the following systemic deficiencies were
               relevant to our audit objective:

                     Income eligibility
                     Commitments and expenditures
                     Property standards
                     Stalled activities
                     Terminated projects
                     Reporting on the Integrated Disbursement and Information System
                     Program income
                     Unsupported and ineligible costs, including missing documents and
                      improper procurement procedures
                     Ownership and lease issues
                     Monitoring grantees

               We assessed the relevant controls identified above.

               A deficiency in internal control exists when the design or operation of a control does
               not allow management or employees, in the normal course of performing their
               assigned functions, the reasonable opportunity to prevent, detect, or correct (1)
               impairments to effectiveness or efficiency of operations, (2) misstatements in
               financial or performance information, or (3) violations of laws and regulations on a
               timely basis.



Significant Deficiencies

             Based on our review, we believe that the following items are significant deficiencies:

                   The Office of Affordable Housing Programs lacked procedures for and did
                    not assess the effectiveness of its field offices’ grantee monitoring efforts
                    (see finding 1).

                   The Office of Affordable Housing Programs did not have adequate
                    controls to assess and ensure the reliability of HOME data in the
                    Integrated Disbursement and Information System (see finding 2).




                                      FOLLOW-UP ON PRIOR AUDITS

During the audit, we reviewed 77 HUD OIG external and internal audit reports issued between
January 2006 and January 2012 to identify systemic deficiencies.

We limited our follow-up for external audits to findings with questioned costs to determine
whether there were any systemic problems with recovering and realizing the questioned costs.
We determined that there were no material issues with recovering questioned costs.

We limited our follow-up for internal audits to findings and recommendations related to the
systemic HOME findings in the six internal reports issued during our audit period. Overall, the
open recommendations were not in dispute, HUD had submitted its proposed corrective action
plan to HUD OIG, and OIG agreed with proposed corrective actions. Therefore, we expected
that the open recommendations would be resolved through the normal audit resolution process.15

We noted that one issue was affecting the closure of several HOME and other program findings.
At issue was the method HUD used to account for grant funds and thereby account for
compliance with statutory spending requirements. OIG’s position was that the accounting
method HUD used did not comply with Federal financial management system requirements.
HUD did not agree, and OIG was waiting for a formal opinion from the U.S. Government
Accountability Office. However, this issue did not impact our results or conclusions for this
audit.




15 See appendix A for a complete listing of the reports and open recommendations.

                                     APPENDIXES

Appendix A

           DETAILS OF OPEN INTERNAL AUDITS AND
                    RECOMMENDATIONS

  1. OIG Audit Report 2009-AT-0001, “HUD Lacked Adequate Controls to Ensure the
     Timely Commitment and Expenditure of HOME funds.” We recommended that HUD’s
     General Deputy Assistant Secretary for Community Planning and Development

        1a - Ensure that field offices require grantees to close out in a timely manner
         $62,201,487 in activities reflected in its open activities report that are more than five
         years old and cancel the fund balances.

        1b - Require grantees to reimburse HUD from nonfederal sources any portion of the
         $11,634,558 for activities listed in appendix C that HUD determines had been
         terminated, voluntarily or involuntarily. When making this determination, HUD
         should consider the grantees’ lack of timely physical completion and/or production of
         affordable housing occupied by HOME income-eligible individuals.

        1c - Recapture any shortfalls generated by the closure and deobligation of fund
         balances associated with the open activities.

        1d - Establish and implement controls to ensure that field offices require grantees to
         close out future HOME activities within a timeframe that will permit reallocation and
         use of the funds for eligible activities in time to avoid losing them to recapture by the
         United States Treasury under provisions of Public Law 101-510.

        2a - Establish and implement procedures to monitor the accuracy of commitments
         that grantees enter into the information system. These procedures should include
         expanding HUD’s risk rating system to include risk factors for this review area and
         development of an appropriate monitoring checklist to ensure consistency and
         thoroughness of coverage among field offices.

        3a - Obtain a formal legal opinion from the Office of General Counsel on whether
         HUD’s cumulative technique for assessing compliance with commitment deadlines is
         consistent with and is an allowable alternative to the 24-month commitment
         requirement stipulated at Title II of the Cranston-Gonzalez National Affordable
         Housing Act.

        3b - Obtain a formal legal opinion from the Office of General Counsel on whether
         HUD’s first-in first-out method for assessing compliance with HOME expenditure
        requirements is consistent with and is an allowable alternative to the eight-year
        recapture deadline pursuant to Public Law 101-510.

      3c - Revise the regulations to ensure the procedures for assessing compliance with
       commitment and expenditure requirements are consistent with statutory requirements
       and discontinue use of the cumulative technique for assessing deadline compliance
       and the first-in first-out method to account for the commitment and expenditure of
       HOME funds.

2. OIG Audit Report Number 2009-CH-0002, “The Office of Affordable Housing
   Programs’ Oversight of HOME Investment Partnerships Program Income Was
   Inadequate.” We recommended that HUD’s General Deputy Assistant Secretary for
   Community Planning and Development require the Office to

      1a - Require the 26 participating jurisdictions to disburse the $39,611,376 in available
       Program income as of December 31, 2008, for eligible housing activities and/or
       administrative costs before drawing down Program funds from their treasury accounts
       as appropriate.

      1b - Implement adequate procedures and controls to ensure grantees disburse
       available Program income for eligible housing activities and/or administration costs
       before drawing down Program funds from their treasury accounts as appropriate. The
       procedures and controls should include but not be limited to updating HUD’s System
       to prevent participating jurisdictions from drawing down Program funds from their
       treasury accounts when they have available Program income and requiring
       participating jurisdictions to certify that they do not have available Program income
       when they draw down Program funds. In addition, the Office may need to implement
       interim procedures and controls until HUD’s System can be updated.

      2a - Implement adequate procedures and controls to ensure that grantees report
       Program income in HUD’s System accurately and in a timely manner. The
       procedures and controls should include but not be limited to creating a report from
       HUD’s System to identify grantees that may not be reporting all Program income in
       HUD’s System.

3. OIG Audit Report Number 2010-CH-0002, “The Office of Affordable Housing
   Programs’ Oversight of Resale and Recapture Provisions for HOME Investment
   Partnerships Program Assisted Homeownership Project Was Inadequate.” We
   recommended that HUD’s General Deputy Assistant Secretary for Community Planning
   and Development require the Office to

      1a - Implement adequate procedures and controls to ensure that participating
       jurisdictions (1) include appropriate resale and/or recapture provisions in their
       consolidated and/or action plans and (2) implement appropriate resale or recapture
       provisions for their projects.



      1b - Require the State of New York and Cobb County, GA, Consortium to reimburse
       their Programs $30,000 and $9,947, respectively, from non-Federal funds for the two
       projects that they did not ensure met HUD’s affordability requirements.

      1c - Require the State of Montana to place a deed restriction, land covenant, affidavit,
       and/or lien on the property to ensure that it would recoup all or a portion of the
       $3,139 in Program funds used for project number 3515 if the housing does not
       continue to be the principal residence of the household for the duration of the
       affordability period. If the State cannot place a deed restriction, land covenant,
       affidavit, and/or lien on the property, it should reimburse its Program $3,139 from
       non-Federal funds.

4. OIG Audit Report 2010-FO-0003, “Additional Details To Supplement Our Report on
   HUD’s Fiscal Years 2009 and 2010 Financial Statements.” We recommended that CPD

      1e - Determine whether the $24.7 million in unexpended funds for the HOME
       program from fiscal years 2001 and earlier that are not spent in a timely manner
       should be recaptured and reallocated in next year’s formula allocation.

      1f - Develop a policy for the HOME program that would track expenditure deadlines
       for funds reserved and committed to community housing development organizations
       and subgrantees separately.

      4a - Ensure that its programs are accounting for and reporting their financial and
       performance information in accordance with Federal financial management system
       requirements.

5. OIG Audit Report 2011-FO-0003, “Additional Details To Supplement Our Report on
   HUD’s Fiscal Years 2010 and 2009 Financial Statements.” We recommended that CPD

      1a - Cease the changes being made to IDIS [HUD’s Integrated Disbursement and
       Information System] for the HOME program related to the FIFO [first-in first-out]
       rules until the cumulative effect of using FIFO can be quantified on the financial
       statements.

      1b - Change IDIS so that the budget fiscal year source is identified and attached to
       each activity from the point of obligation to disbursement.

      1c - Cease the use of FIFO to allocate funds (fund activities) within IDIS and disburse
       grant payments. Match outlays for activity disbursements to the obligation and
       budget fiscal source year in which the obligation was incurred and match the
       allocation of funds (activity funding) to the budget fiscal year source of the
       obligation.

      1d - Include as part of the annual CAPER [consolidated annual performance
        evaluation report] a reconciliation of HUD’s grant management system, IDIS, to
        grantee financial accounting records on an individual annual grant basis, not
       cumulatively, for each annual grant awarded to the grantee.

      2c - Review the 510 obligations which were not distributed to the program offices
       during the open obligations review and deobligate amounts tied to closed or inactive
       projects, including the $27.5 million we identified during our review as expired or
       inactive.

      2g - In coordination with the CFO [Chief Financial Officer], develop and publish
       written guidance and policies to establish a benchmark for field directors to use to
       determine the validity of the open obligation. The guidance should include specific
       procedures for open obligation amounts, wherein the obligation was made prior to a
       specified amount of time, as well as disbursement inactivity beyond a specified
       amount of time.

      2h - In coordination with the CFO, develop procedures to periodically evaluate
       HUD’s program financial activities and operations to ensure that current accounting
       policies are sufficient and appropriate and to ensure that they are implemented and
       operating by program and accounting staff as intended.

6. OIG Audit Report 2012-FO-0003, “Additional Details To Supplement Our Report on
   HUD’s Fiscal Years 2011 and 2010 Financial Statements.” We recommended that CPD

      3d - Ensure that field offices have developed and implemented control activities,
       which are documented and can be periodically tested and monitored by the Office of
       Field Management, to ensure that the field offices have a system to ensure
       compliance with the requirements within the biennial risk analysis process Notices
       for Implementing Risk Analyses (CPD Notice 09-04) for Monitoring Community
       Planning and Development Grant Programs and the CPD Monitoring Handbook.

      3e - Review information within the GMP system for consistency and completeness
       and follow up with field offices when information is incomplete or inconsistent
       among the risk analysis, work plans, and completed monitoring efforts.

      3f - Ensure that all required information has been updated and entered into the GMP
       after the due dates for submissions have passed and follow up with field offices that
       have not entered their information.

      3g - Follow up on information in GMP to ensure that findings which had questioned
       costs have been repaid and noncompliance and internal control deficiencies have been
       addressed.

      3h - Develop, document, and implement internal control procedures for OAHP’s
       [Office of Affordable Housing Preservation] review to ensure that grantees comply
       with the terms of the grant agreement, which require the grantees to perform
       monitoring procedures.

Appendix B

        AUDITEE COMMENTS AND OIG’S EVALUATION


[The auditee’s comment letter appears in the original report as scanned pages, annotated in the margin with references to Comments 1 through 11; the scanned text is not reproduced in this rendition. OIG’s evaluation of each comment follows.]
                         OIG Evaluation of Auditee Comments

Comment 1   OIG does not have an incomplete understanding of CPD’s existing monitoring
            procedures. We used HUD's and GAO's standard for determining whether
            monitoring was complete and effective. Specifically, HUD's Monitoring Desk
             Guide, chapter 7, and the GAO consider monitoring complete and effective when
            deficiencies are corrected, the corrective action produces improvements, and it is
            decided that further management action is not needed. However, we believe that
            HUD officials lacked auditable and reliable procedures to verify that grantee
            deficiencies and findings observed during field office monitoring visits were
            adequately resolved. Our finding never mentions that absolute compliance is an
            expectation; thus, we recommend that CPD officials implement procedures to
            provide reasonable assurances to verify the extent to which its monitoring is
            effective and complete.

Comment 2   As stated in the report, QMR reviews were not an effective tool for identifying
            monitoring deficiencies and should not be relied on as a sole source for assessing
            and overseeing monitoring. Although CPD officials’ actions to implement
            training should improve monitoring, training in itself does not ensure that field
            offices will properly conduct monitoring or that deficiencies will be identified
            and corrected. Thus, we made no recommendations regarding training in this
            report, and suggest that other methods to complement how CPD officials assess
            their monitoring efforts be developed.

Comment 3   We agree headquarters should communicate with the field offices. However,
            officials provided no evidence that these discussions resulted in an overall
            assessment of whether field offices properly conducted monitoring, identified
            deficiencies, and ensured that grantee deficiencies were adequately resolved.

Comment 4   HUD’s policy is that, when travel resources are available, field offices should
            monitor a limited number of non-high-risk grantees to validate the soundness of
            the risk assessment rating criteria and obtain early warnings of potentially serious
            problems. Thus, officials are correct in that the policy does not explicitly require
            CPD officials to evaluate the results of non-high-risk monitoring to determine the
            appropriateness of risk assessment factors; however, CPD officials are responsible
            for establishing the risk assessment factors and procedures. Therefore, we
            maintain our recommendation that officials should analyze the results of this
            monitoring to determine whether low-risk grantees are being monitored, what the
            results were, and whether any changes to the risk assessment procedures are warranted.

Comment 5   OIG encourages HUD officials in their efforts to improve monitoring and
            acknowledges that the actions taken as a result of their contracting for an
            independent assessment of their risk analysis and monitoring procedures may be
            used to satisfy our recommendations if those actions ensure that CPD officials
            document that deficiencies are identified and corrected and that monitoring is
            completed per GAO standards.

Comment 6     We disagree that the finding is speculative in nature, as prior OIG reports clearly
              showed that IDIS data were not always reliable. We asked CPD officials to show
              us what controls they implemented to increase data reliability and how they
              validated the data. Officials showed us the controls they implemented to increase
              data reliability; however, they lacked procedures to document how they validated
              the data.

Comment 7     We agree that HUD’s new controls should increase data reliability; however, during
              the audit and in their comments, CPD officials provided no procedures or evidence
              to show that the controls were effective and the data were now reliable. Therefore,
              we maintain that they should develop specific procedures and controls to document
              how they validate the data. This process should be ongoing to ensure that IDIS
              data used to monitor program performance and compliance are valid and reliable.

Comment 8     Our recommendation does not require HUD to validate all data, nor do we imply
              that all data should be subjected to extensive validation procedures. HUD already
              performs some data validation during field office onsite reviews. Although each
              grantee is not tested, the results of this sample could be used to draw conclusions
              regarding the integrity of HOME data as a whole. The level and amount of
              validation are thus left to HUD’s discretion.

Comment 9     Field office monitoring of grantees and the GMP database are maintained at
              considerable expense to the taxpayers, and thus we believe they should be used to
              their maximum extent. During the audit, officials told us that they believed that
              the GMP could be queried at the question level with assistance from the
              contractor. However, some field offices were consolidating their monitoring
              results in PDF form rather than entering their results into the discrete GMP fields.
              Officials said this may have occurred because some staff members are still not
              comfortable with computers or because it saves time. Nonetheless, by
              consolidating results in PDF form, we agree the data are less usable. Thus, we
              suggest that CPD officials consult with their contractor to determine whether
              discrete GMP data fields can be developed and require field offices to enter
              monitoring results in the appropriate discrete GMP data fields so that the data can
              be analyzed. If CPD officials do not use the GMP, these changes may not be
              necessary.

Comment 10 Regarding CPD’s process for validating HOME IDIS data, CPD officials
           commented that (1) CPD compares project data to IDIS data and (2) headquarters
           and field offices periodically review HOME IDIS reports. However, during the
           audit and in their comments, CPD officials provided no records, reports, data, or
           other auditable evidence to show that the new IDIS controls were effective and that
           IDIS data are now reliable. Thus, we maintain our recommendation that CPD
           should develop formal written procedures and obtain auditable and verifiable
           information to validate data. This can be achieved using GMP monitoring data,
           statistical sampling, or some other method that shows IDIS data are reliable.

           As for the basis for our decision, OIG auditors used HUD’s Monitoring Desk
           Guide, chapter 7, and the GAO standards that consider monitoring complete and
           effective when deficiencies are corrected, the corrective action produces
           improvements, and it is decided that further management action is not needed.

           Note that we did not define the degree to which CPD officials should ensure that
           field offices are testing grantees’ HOME IDIS data. We are leaving that to CPD
           officials to define and determine what is practicable.

Comment 11 Congress has tasked OIG and HUD to increase controls over the HOME program.
           At a minimum, CPD officials’ oversight should provide reasonable assurance that
           known instances of noncompliance are addressed and corrected. Therefore, we
           strongly disagree that our findings are not substantiated. Finding 1 is being
           reported in part because CPD officials did not know and did not show that the 591
           HOME compliance and performance findings reported to Congress were resolved.
           We reported finding 2 because HUD uses IDIS to monitor compliance and prior
           OIG audit reports showed that IDIS data were not reliable, and during our
           review, CPD officials did not have auditable and verifiable procedures to show
           that HOME IDIS data were verified and reliable.




Appendix C

                              CONCLUSIONS REGARDING
                               SYSTEMIC DEFICIENCIES

During our review, we rolled up the results of 77 OIG-issued audit reports on HUD’s HOME
program. Specifically, we identified and classified 10 systemic HOME deficiency areas and
ranked them below in order of occurrence.

                                                           Deficiencies reported in
           Common Areas                             6 Internal audit        71 External audit
                                                        reports                  reports
      1    Unsupported and Ineligible Costs                0                       139
      2    Reporting on IDIS                               4                        55
      3    Commitments and Expenditures                    3                        50
      4    Property Standards                              0                        46
      5    Inadequate Monitoring Procedures                3                        32
      6    Program Income                                  1                        27
      7    Income Eligibility                              0                        25
      8    Terminated Projects                             1                        22
      9    Ownership/lease issues                          1                        15
      10   Stalled Activities                              1                        12
           Totals:                                        14                       423

We reviewed HUD’s proposed regulations and preventive, detective, and corrective controls
pertaining to common deficiency areas and concluded that if properly implemented, HUD’s
proposed changes to the HOME regulations and controls should mitigate the systemic
deficiencies identified in prior HUD OIG audit reports.16




16 With the exception of (1) the program office’s oversight of grantee monitoring (See Finding 1), and (2) validating the reliability of HOME data (See Finding 2).
