
Review of Case Management & Oversight's Program Review Function.

Published by the Department of Education, Office of Inspector General on 2000-09-01.


             Review of Case Management & Oversight’s
                     Program Review Function


                                FINAL AUDIT REPORT




                  Control Number ED-OIG / A04-90003
                           September 2000




Our mission is to promote the efficient and effective use of taxpayer dollars
in support of American education.

                                               U.S. Department of Education
                                               Office of Inspector General
                                               Atlanta, Georgia
                             NOTICE

Statements that management practices need improvement, as well as other
conclusions and recommendations in this report, represent the opinions of the
Office of Inspector General. Determination of corrective action to be taken will
be made by appropriate Department of Education officials.

In accordance with the Freedom of Information Act (5 U.S.C. §552), reports
issued by the Office of Inspector General are available, if requested, to
members of the press and general public to the extent information contained
therein is not subject to exemptions in the Act.
                              UNITED STATES DEPARTMENT OF EDUCATION
                                       OFFICE OF INSPECTOR GENERAL

                                                                                          THE INSPECTOR GENERAL
                                                September 21, 2000


MEMORANDUM


TO:              Greg Woods
                 Chief Operating Officer
                 Student Financial Assistance


FROM:            Lorraine Lewis

SUBJECT:         FINAL AUDIT REPORT
                 Review of Case Management & Oversight’s Program Review Function, Control
                 Number ED-OIG/A04-90003

Attached is our subject final report that covers the results of our review of Case Management
& Oversight’s program review function during fiscal year 1998. We received your comments
generally concurring with the findings and recommendations in our draft audit report.

Please provide the Supervisor, Post Audit Group, Financial Improvement, Receivables and
Post Audit Operations, Office of the Chief Financial Officer, and the Office of Inspector
General, Planning, Analysis, and Management Services, with semiannual status reports on
promised corrective actions until all such actions have been completed or continued
follow-up is unnecessary.

In accordance with the Freedom of Information Act (Public Law 90-23), reports issued by the
Office of Inspector General are available, if requested, to members of the press and general
public to the extent information contained therein is not subject to exemptions in the Act.
Copies of this audit report have been provided to the offices shown on the distribution list
enclosed in the report.

If you have any questions, please call Carol Lynch, Regional Inspector General for Audit, at
(404) 562-6462.


Attachment




                             400 MARYLAND AVE., S.W. WASHINGTON, D.C. 20202 – 1510

       Our mission is to ensure equal access to education and to promote educational excellence throughout the Nation
Table of Contents

Review of Case Management & Oversight’s Program Review Function
Control Number ED-OIG/A04-90003


Executive Summary

Background

Audit Results

     Finding #1
      The Number of On-Site Program Reviews Conducted Within the Mix of
      Case Management Tools Needs to be Increased

     Finding #2
      The Program Review Report Process Needs Improvement

Objective, Scope, and Methodology

Statement on Management Controls

EXHIBIT      Additional Details Regarding Findings

APPENDIX     Department’s Initial Comments to the Draft Report




                              EXECUTIVE SUMMARY

                  REVIEW OF CASE MANAGEMENT & OVERSIGHT’S
                          PROGRAM REVIEW FUNCTION




We performed an audit of Case Management & Oversight’s 1 (CMO) program
review function. The objective of our audit was to determine whether CMO is
utilizing program reviews effectively within its case management system to monitor
and improve institutional performance. The audit period covered Fiscal Year 1998
(October 1, 1997 through September 30, 1998).

Although CMO does have a process in place to conduct program reviews within the
case management system, we concluded that CMO does not have proper controls
to ensure the effective utilization of program reviews to monitor and improve
institutional performance.


       •    The number of on-site program reviews performed within the mix of case
            management tools needs to be increased. Whereas the number of on-site
            program reviews conducted prior to case management (FY96) was 746,
            the number conducted under case management (FY98) decreased to 128,
            a reduction of 618. In addition, we concluded that there were institutions
            where program reviews were warranted, but were not conducted.

       •    CMO’s program review process needs improvement. Specifically,
            improvements are needed in report quality, timeliness, supervisory
            review, file maintenance, documentation of program reviews, and data
            entry into the Case Management Information System (CMIS).


We recommend that the Chief Operating Officer (COO) for Student Financial
Assistance (SFA):

       1.      Institute management controls within the case management process to
               ensure a consistent and appropriately balanced use of program
               reviews to monitor institutional compliance with Title IV requirements.



1
 At the time of our review, Case Management & Oversight was known as Institutional Participation &
Oversight Service. In the fall of 1999, Student Financial Assistance (SFA) was reorganized. The
operational procedures of the Institutional Participation and Oversight Service did not change;
however, it was renamed Case Management & Oversight, a division within the SFA Schools Channel.

      2.     Update the 1994 Program Review Guide to reflect the reorganization
             to case management and clarify guidelines for the timely issuance of
             program review reports.

      3.     Establish policies and procedures that provide formal guidance to
             ensure consistent application of supervisory review, file maintenance,
             documentation of program reviews, and data entry into the CMIS.



DEPARTMENT’S REPLY

The Department generally concurred with much of the information provided in the
first finding and recommendation. However, they did not agree with all of the
assessments and conclusions. The Department stated that as they work to develop
program review measures and refine goals, they will clarify the importance of
program reviews for the case teams and emphasize the need for a more balanced
use of reviews in case management. The Department believes that concentrating
program review efforts and resources on institutions that appear to be high risk will
minimize the need for a large number of reviews. To accomplish this, the case
teams will increase the number of program reviews at high-risk institutions. To
examine the integrity of the risk analysis system, case teams will also conduct
reviews at 25 non-risk schools in FY2001.

The Department also concurred with the second finding and the recommendations.
The Department stated that a workgroup has been formed to update the 1994
Program Review Guide within the first half of FY2001. Policies and procedures will
be reviewed and updated as required.

See APPENDIX for the full text of the Department’s response to all findings.








                                          BACKGROUND



As of June 30, 1998, there were a total of 5,846 institutions within the United States
certified as eligible to participate in the Federal Student Financial Assistance (SFA)
programs and 122 institutional review specialists2 making up case teams to perform
institutional monitoring. Program reviews are required by the Higher Education Act
(HEA) to protect the interest of taxpayers and students. To achieve this, CMO case
teams monitor institutional compliance with the Title IV statute and its regulations
through on-site assessments of the administration of the Federal Student Financial
Assistance (SFA) programs. When institutions are identified that are seriously
mismanaging or abusing the SFA programs, they are referred to the Department’s
Administrative Actions & Appeals Division (AAAD) for administrative action, to
include termination, when appropriate. Program reviews also address financial harm
to the taxpayer through liability assessments.

Section 498A of the HEA states that the Secretary shall provide for the conduct of
program reviews on a systematic basis designed to include all institutions of higher
education participating in the SFA programs.

Work previously conducted by the Office of Inspector General in 1997 determined
that there had been a significant decrease in the number of program reviews
performed by CMO. During the eight month period ended May 31, 1997,
approximately 61 reviews were performed compared to 746 reviews during the
previous twelve month period. Reasons given for the decrease in the number of
program reviews were: resources were redirected to recertification, a significant
amount of training was conducted as a result of the CMO reorganization, and a risk-
based method of selecting institutions for review was adopted. While CMO had
given a low priority to program reviews for fiscal year 1997, we were informed that it
planned to conduct more reviews in fiscal year 1998.

Given the decrease in the number of program reviews performed by CMO, we
conducted an audit to determine whether CMO is utilizing program reviews
effectively within its case management process to monitor and improve institutional
performance.




2
    This figure was as of September 30, 1998.


                                 AUDIT RESULTS


                                       FINDING #1
THE NUMBER OF ON-SITE PROGRAM REVIEWS CONDUCTED WITHIN THE MIX
       OF CASE MANAGEMENT TOOLS NEEDS TO BE INCREASED




Since CMO’s reorganization to case management, the total number of on-site
program reviews conducted has decreased. Whereas the number of on-site
program reviews conducted prior to case management (FY96) was 746, the number
conducted under case management (FY98) decreased to 128, a reduction of 618.
In addition, we concluded that there were institutions where a program review was
warranted but one was not conducted.

RECENT HISTORY OF CMO
At the time of our review, CMO was one of six services within Student Financial
Assistance that was responsible for administering the SFA programs. Its
responsibilities included determining institutions’ eligibility to participate in the federal
SFA programs, certifying institutions for participation, developing and implementing
policies and procedures for monitoring institutions participating in the programs to
ensure compliance with the HEA, regulations, and policies, and conducting on-site
reviews of participating postsecondary institutions.

Prior to 1996, these functions were performed separately by different sections within
CMO. Compliance and financial statement audits were used to monitor institutions’
compliance with Title IV requirements. However, according to CMO officials,
program reviews were the primary monitoring device. Through these on-site
assessments of institutions’ administration of the SFA programs, institutions were
assessed liabilities for noncompliance and some of those identified as mismanaging
or abusing these programs were also referred to AAAD for a fine, limitation, and/or
termination action.

The HEA amendments of 1992 provided for additional areas of responsibility for
CMO with specific timeframes. Financial statement audits of institutions participating
in the SFA programs were required annually. Compliance audits, which were
required biennially prior to the 1992 amendments, were also required annually
(effective July 1, 1994). This required CMO to resolve twice as many audits as they
had previously. Additionally, all institutions’ eligibility to participate in the SFA
programs expired within five years after the date of enactment. Prior to the
amendments, once an institution was initially certified to participate in the SFA
programs, no review of its eligibility and certification was performed unless the
Department deemed it necessary. Also, CMO would have to recertify institutions to
participate in the SFA programs every four years.3


According to CMO officials, in order to accommodate these changes they began
restructuring into a team-based case management organization in November 1996.
This new multidisciplinary approach was designed to help CMO focus its resources
on institutions that posed a significant risk to the SFA programs. The reorganization
significantly changed the way that CMO conducts business. All aspects of
monitoring institutions that were once performed independently, such as
recertification, audit resolution, financial analysis, and program reviews, were
consolidated into several teams, with each team being responsible for an assigned
portfolio of institutions. Additionally, CMO’s focus shifted to a more customer
oriented approach and placed an emphasis on technical assistance.

NUMBER OF PROGRAM REVIEWS DECREASED
Subsequent to the reorganization, the number of program reviews performed
decreased significantly. Reasons given for the decrease were: resources were
redirected to recertification, a significant amount of training was conducted as a
result of the CMO reorganization, and the risk-based method of selecting institutions
for review was adopted. While CMO had given a low priority to program reviews for
FY97, we were informed in 1997 that it planned to conduct more reviews in FY98.

During FY98, the total number of on-site program reviews conducted nationwide
remained low. An analysis of data contained in the Postsecondary Education
Participants System (PEPS) revealed that CMO conducted 128 on-site program
reviews nationwide under case management (FY98). The number of reviews
conducted prior to case management (FY96) was 746. This is a reduction of 618
on-site program reviews. Based on data provided by CMO officials, during FY99 the
total number of on-site program reviews conducted was 116.

The following table shows the decrease in the total number of on-site program
reviews, by region, between FY96 and FY98.




3
 The 1998 amendments to the HEA extended the recertification period to every six years.



                       Total On-Site Program Reviews, FY96 vs. FY98

     Region                      1     2     3     4     5     6     7     8     9    10
     Number of Reviews, FY96    42   106    66   140    58    99    88    59    47    41
     Number of Reviews, FY98     3    14    10    15     7    14    28    18    10     9



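As a quick arithmetic check, the nationwide totals cited in this finding follow directly
from the regional counts shown above. The Python sketch below is illustrative only (it is
not part of the audit methodology) and simply sums the published figures:

    # Illustrative only: recompute the nationwide totals from the regional
    # counts published in the table above.
    fy96_by_region = [42, 106, 66, 140, 58, 99, 88, 59, 47, 41]
    fy98_by_region = [3, 14, 10, 15, 7, 14, 28, 18, 10, 9]

    total_fy96 = sum(fy96_by_region)   # 746 on-site reviews prior to case management
    total_fy98 = sum(fy98_by_region)   # 128 on-site reviews under case management
    print(total_fy96, total_fy98, total_fy96 - total_fy98)   # 746 128 618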

    Based on an analysis of case management decisions made during FY98 at
    judgmentally selected institutions in the three regions visited, we concluded that a
    program review was warranted, but was not conducted, at 11 out of 163 institutions.
    Our conclusions were reached using the same criteria that CMO case teams use
    when making the decision to conduct a program review. Areas such as the
    institution's financial strength, past program review history, audit history, default
    rates, fluctuations in loan volume, problems reported by state agencies or
    accrediting agencies, and drop out rates were all taken into consideration. In
    addition, we reviewed data available within CMIS and the Risk System that was
    available to the case teams during FY98, and spoke with Area Case Directors in
    each region to obtain information available about the institution that may not have
    been documented within these systems.

    One institution where a program review was warranted but was not conducted had a
    high risk score and was in borderline financial condition. The institution’s previous
    program reviews resulted in major findings and it was on the reimbursement method
    of payment. Another institution had significant audit findings resulting in liabilities of
    approximately $192,000 over a two-year period. The risk system showed “more
    loans defaulting than average” at this institution and “50% of audits deficient.” The
    Area Case Director explained that the region was waiting for guidance from
    headquarters before conducting this program review. These examples illustrate
    instances where CMO case teams did not consistently apply their criteria to
    determine whether or not a program review should be conducted.





Section 498A of the HEA, as amended, states that the Secretary shall provide for
the conduct of program reviews on a systematic basis designed to include all
institutions of higher education participating in the SFA programs. Priority for
program reviews shall be given to institutions with high cohort default rates,
significant fluctuation in Federal Stafford Loan, Federal Direct Stafford Loan or
Federal Pell Grant award volume, deficiencies or financial aid problems, high annual
dropout rates, or the failure to possess administrative capability or financial
responsibility as determined by the Secretary.

Although case management is the process used by CMO for monitoring and
oversight of institutions, we are concerned that it does not include a consistent and
balanced use of program reviews. This is evidenced by the instances we identified
where a program review seemed warranted but was not conducted, and by the
significant decrease in the number of reviews conducted. As CMO case teams
strive to meet these many demands, program reviews are no longer given the
priority that they were in the past.

DECREASED LIABILITIES
CMO informed us that although fewer program reviews are conducted under case
management, the program reviews conducted are better targeted at institutions truly
needing a review. However, we found that CMO has assessed fewer liabilities at
institutions as a result of a program review since its reorganization to case
management. This was determined by examining the relationship between liabilities
assessed at institutions for on-site program reviews, by region and nationwide, prior
to case management (FY96) and under case management (FY98) using information
extracted from the PEPS system.4 We found that $0 liabilities were assessed as a
result of a program review more often under case management (FY98) than prior to
case management (FY96). In FY98, 73% of the on-site program reviews assessed
liabilities of $0. Prior to the reorganization to case management (FY96), only 54% of
the on-site program reviews assessed liabilities of $0.

Not only did total liabilities assessed nationwide as a result of on-site program
reviews decrease from FY96 to FY98, but the average liabilities assessed per review
also decreased. From FY96 to FY98, total liabilities assessed decreased by $47
million and the average liabilities assessed as a result of an on-site program review
decreased by $25,444. During FY96, the average liabilities assessed per review
nationwide were $71,209, whereas in FY98 they were only $45,765. Additionally,
the average liabilities assessed per review from FY96 to FY98 decreased in 7 out of
10 regions. The following table shows the relationship between average liabilities
assessed per review in FY96 and FY98 by region and nationwide.




4
 All reviews assessing liabilities equal to all Title IV funds received over a several year period were
eliminated for the purposes of our analysis. Including reviews assessing such liabilities would have
distorted the average amount of liabilities imposed per Region.




              Average Liabilities per Review by Region, FY96 vs. FY98

     Region          Liability per Review, FY96    Liability per Review, FY98
     1                         $83,713                          $0
     2                         $16,360                      $14,341
     3                         $58,532                       $1,747
     4                         $27,594                          $0
     5                         $99,906                      $15,764
     6                        $122,703                         $176
     7                         $33,135                      $85,951
     8                         $25,172                      $58,163
     9                        $352,614                     $167,556
     10                        $32,809                      $44,199
     Nationwide                $71,209                      $45,765




In FY99, the liabilities assessed as a result of an on-site program review continued
to decrease. Data provided by CMO officials showed that during FY99 a total of
$536,398 in liabilities was assessed nationwide for on-site program reviews, and the
average liabilities assessed decreased to $4,624.

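The per-review figures cited in this finding can be reproduced from the published totals.
The Python sketch below is illustrative only; it uses the nationwide averages reported
above and the 116 on-site reviews conducted in FY99 (the underlying review-level data,
with certain reviews excluded per footnote 4, are not reproduced here):

    # Illustrative only: differences and averages computed from the figures
    # published in this finding.
    avg_fy96 = 71_209      # nationwide average liability per on-site review, FY96
    avg_fy98 = 45_765      # nationwide average liability per on-site review, FY98
    print(avg_fy96 - avg_fy98)          # 25444 -> the $25,444 decrease cited above

    fy99_total_liabilities = 536_398    # nationwide liabilities from on-site reviews, FY99
    fy99_reviews = 116                  # on-site program reviews conducted in FY99
    print(round(fy99_total_liabilities / fy99_reviews))   # 4624 -> the FY99 average of $4,624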
Program reviews protect the interests of taxpayers and students by identifying
institutions that are mismanaging SFA programs or not complying with Title IV
requirements. Without a consistent application of the criteria used to determine
whether a program review should be performed, an appropriate balance of program
reviews being performed within the mix of case management tools is not achieved.
There is also an increased risk of unidentified institutional misuse of federal funds.

DEPARTMENT’S REPLY
The Department generally concurred with much of the information provided in the
finding and the recommendation. However, they did not agree with all of the
assessments and conclusions. Specifically, they did not agree that program reviews
are neglected. Despite this disagreement, the Department stated that as they work
to develop program review measures and refine goals, they will clarify the
importance of program reviews for the case teams and emphasize the need for a
more balanced use of reviews in case management. The Department believes that
concentrating program review efforts and resources on institutions that appear to be
high risk will minimize the need for a large number of reviews. To accomplish this,
the case teams will increase the number of program reviews at high-risk institutions.
To examine the integrity of the risk analysis system, case teams will also conduct
reviews at 25 non-risk schools in FY2001.

Additionally, the Department did not entirely concur with our conclusion that program
reviews were not conducted when warranted for 11 of the 163 cases judgmentally
selected. See APPENDIX for the full text of the Department’s response to this
finding.

IG’S RESPONSE
We have considered the Department’s response regarding program reviews being
neglected and have modified the wording of the report.

The Department did not entirely concur with our conclusion that program reviews
were not conducted when warranted for 11 of the 163 cases judgmentally selected.
We have reviewed their response and have not changed our conclusion.

RECOMMENDATION
We recommend that the COO for SFA institute management controls within the case
management process to ensure a consistent and appropriately balanced use of
program reviews to monitor institutional compliance.








                                             FINDING #2
          THE PROGRAM REVIEW REPORT PROCESS NEEDS IMPROVEMENT




Management controls over CMO’s program review report process need
improvement. Based on our review of 29 program review report files and 24 final
program review determination files, and on interviews with various members of the
case teams, we determined the following:5


           •   Program review reports and final program review determinations contain
               errors in general report content and math computations. In addition, the
               scope of reports (where a focused review was conducted) is often stated
               in very general terms rather than reflecting the focus of the review.

           •   Program review reports and final program review determinations are not
               always issued in a timely manner. We found two different criteria
               regarding the time period within which a program review report should be
               issued. The first criterion is that a report should be issued within 30 days
               from the conclusion of the review. The second criterion is that a report is
               to be sent to the school within 60 days from the conclusion of the review.
               In many instances, CMO is failing to meet the suggested guideline of
               issuing the report within 60 days of the conclusion of the review.

           •   Extensive periods of time pass before the issuance of final program review
               determinations. During this time there is no evidence of communication
               between CMO and the institution.

           •   CMO does not have a formal and consistent supervisory review process in
               place. Each region visited has a unique supervisory review procedure for
               reports. This is documented differently in each region, and in one region it
               is not documented at all.

           •   Institutional files are poorly maintained and do not always adequately
               document program review information. File contents are organized
               inconsistently, which makes documents difficult to locate. Additionally,
               program review workpaper files do not always sufficiently document the
               work performed. Items such as entrance/exit conference notes, interview
               notes, and/or documents pertaining to sampling methodology are missing.




5
    Details regarding this information may be found in the EXHIBIT.



       •   Data is not entered consistently into the CMIS. The CMIS was designed
           as a communication vehicle for tracking, managing, and reporting case
           information across CMO. In many instances it was not possible to
           determine that a program review was performed by looking in the CMIS.
           Start and completion dates of program review activity were entered
           inconsistently.

We also found that CMO’s guidance for program reviews needs improvement. First,
CMO’s most recent program review guide is out of date. The 1994 Program Review
Guide does not reflect CMO’s reorganization to case management. Specifically,
survey reviews, previously the standard review approach, have been replaced with
more concentrated (focused) reviews. In addition, the guide specifies that a
program review report “generally should be issued no later than 30 days of the
conclusion of the review visit.” However, two Department publications conflict with
this requirement. The Blue Book (June 1999), whose primary purpose is to provide
guidance to institutional personnel who administer and manage SFA programs,
states that the program review team sends a program review report to a school
within 30 to 60 days of the review. According to the 1999-2000 Student Financial
Aid Handbook, the report should be sent to the school within approximately 60 days
of the review.
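
The timeliness test applied in the EXHIBIT (reports not issued within 60 days of the
conclusion of the review visit) reduces to a simple date comparison. The Python sketch
below is illustrative only; the dates are hypothetical examples, not taken from the audit
files:

    # Illustrative only: check a report issuance date against the 60-day
    # guideline discussed above. Dates are hypothetical, not from the audit files.
    from datetime import date

    review_concluded = date(1998, 3, 6)   # hypothetical end of the review visit
    report_issued = date(1998, 6, 1)      # hypothetical report issuance date

    days_elapsed = (report_issued - review_concluded).days
    print(days_elapsed, days_elapsed <= 60)   # 87 False -> outside the 60-day guideline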

The HEA requires guidelines for the conduct of program reviews. As stated in
Section 498A, as amended, “the Secretary shall establish guidelines designed to
ensure uniformity of practice in the conduct of program reviews of institutions of
higher education and make available to each institution participating in programs
authorized under this title complete copies of all review guidelines and procedures
used in program reviews.”


DEPARTMENT’S REPLY
The Department generally concurred with the recommendations. The Department
stated that a workgroup has been formed to update the 1994 Program Review
Guide within the first half of FY2001. Policies and procedures will be reviewed and
updated as required. See APPENDIX for the full text of the Department’s response
to this finding.


RECOMMENDATIONS
We recommend that the COO for SFA:
2.1 Update the 1994 Program Review Guide to reflect its reorganization to case
     management and clarify requirements regarding the recommended time period
     within which program reports should be issued.
2.2 Establish policies and procedures that provide formal guidance to ensure
     consistent application of supervisory review, file maintenance, documentation
     of program reviews, and data entry into the CMIS.







                      OBJECTIVE, SCOPE, AND METHODOLOGY


The objective of our audit was to determine whether CMO is utilizing program
reviews effectively within its case management system to monitor and improve
institutional performance.

The audit period covered Fiscal Year 1998 (October 1, 1997 through September 30,
1998). We performed our fieldwork in CMO headquarters, Washington, DC,
November 17-20, 1998. Fieldwork was also performed at four CMO regional offices
located in Kansas City (Region VII), January 11-15 and 26-29, 1999; Denver
(Region VIII), February 1-5, 1999; Philadelphia (Region III), March 22-26, 1999; and
Chicago (Region V), June 21-30, 1999. Additional analysis was performed at our
offices through November 1999, and an exit conference was held with CMO officials
on December 15, 1999. Our review was conducted in accordance with generally
accepted government auditing standards appropriate to the scope of the review
described above.

To achieve our objective we performed the following:

•   Interviewed agency officials, CMO case team directors, and other regional staff,
    analyzed and reviewed applicable laws and regulations, and reviewed the most
    recent copy of the Program Review Guide and Institutional Review Specialist
    Guide.

•   Analyzed information regarding the volume of program reviews performed and
    resolved (from PEPS), and technical assistance provided (from CMIS).

•   Selected a judgmental sample of program reviews conducted and resolved and
    technical assistance provided within each region visited. These were reviewed
    with particular attention given to the selection process, staff assignments, quality,
    supervisory review, documentation, and file maintenance.

•   Examined FY98 case management decisions at 163 selected institutions from
    those regions visited, and determined whether a program review was warranted.








               STATEMENT ON MANAGEMENT CONTROLS


As part of our review, we made an assessment of CMO’s management controls,
procedures, and practices applicable to the scope of our audit. Our assessment was
performed to determine the level of control risk for determining the nature and extent
of substantive tests to accomplish the audit objective. For the purposes of this
report, we reviewed management controls over the program review function, which
included the institutional selection process, quantity of reviews, assignment of staff
to perform the reviews, quality of reviews, supervisory review process,
documentation and file maintenance kept for each program review, closure of
reviews, and issuance of final program review determinations.

Because of inherent limitations, a study and evaluation made for the limited purpose
described above would not necessarily disclose all material weaknesses in the
control structure. However, we noted the management control weaknesses that are
discussed in the audit results section.








 EXHIBIT

                                           ADDITIONAL DETAILS REGARDING FINDINGS

CONDITION                                            KANSAS CITY            DENVER        PHILADELPHIA          CHICAGO              TOTAL

Program review reports contained errors in         8 out of 10 files   4 out of 5 files   5 out of 9 files   5 out of 5 files   22 out of 29 files
general report content, math computations,
and/or grammar.
Final Program Review Determinations                3 out of 7 files    1 out of 5 files   2 out of 4 files   2 out of 8 files   8 out of 24 files
contained errors in general report content,
math computations, and/or grammar.
Program review reports scope was stated in         6 out of 10 files   5 out of 5 files   0 out of 9 files   4 out of 5 files   15 out of 29 files
very general terms instead of stating the focus
of the review.
Program review reports were not issued within      2 out of 10 files   2 out of 5 files   6 out of 9 files   2 out of 5 files   12 out of 29 files
60 days after the conclusion of the review visit
Extensive periods of time passed before the        1 out of 7 files    1 out of 5 files   1 out of 4 files   5 out of 8 files   8 out of 24 files
issuance of Final Program Review
Determinations without any evidence of
communication between CMO and the
institution.
Overall file maintenance.                          POOR                GOOD               GOOD               POOR
(Based on institutional file being maintained in
such a way that documents were easy to locate
within each file).
Program review workpaper files inadequately        8 out of 10 files   2 out of 5 files   3 out of 9 files   3 out of 5 files   16 out of 29 files
documented the work performed during the
program review. Items such as entrance/ exit
conference notes, interview notes, and/or
documents pertaining to sampling methodology
were missing.
Adequate documentation was not present in          4 out of 10 files   1 out of 5 files   4 out of 9 files   0 out of 5 files   9 out of 29 files
CMIS to show that a program review was
performed.




                                                                                      APPENDIX
                    UNITED STATES DEPARTMENT OF EDUCATION
                                      Student Financial Assistance
                                        Chief Operating Officer




                                       August 4, 2000


Ms. Lorraine Lewis
Inspector General
U.S. Department of Education


Dear Ms. Lewis:

Thank you for the opportunity to review and comment on the draft audit report entitled “Review of
Case Management & Oversight’s Program Review Function,” ACN ED-OIG A04-90003.

We are pleased that your draft report notes the increased oversight responsibilities that the
Department has been required to perform. We share the Office of Inspector General’s
conclusion that conducting program reviews is a critical oversight function. However, we feel it is
important for other oversight tools, such as reviewing institutional audits and financial statements,
performing certification reviews, and providing technical assistance, to be recognized as being
equally effective and critical in ensuring institutional compliance with statutes and regulations and
in safeguarding student financial assistance funds.

While it is our goal to utilize every available oversight tool as effectively as possible, there will
always be situations where there are issues of judgment regarding which course of action is the
most appropriate. For example, the draft report states that program reviews were not conducted
at eleven schools that warranted an on-site review. We believe that our case teams took
appropriate actions considering the information available. Information on these schools is
contained in the enclosed Appendix. Nonetheless, we are using this opportunity to re-evaluate
and improve our existing procedures and practices for conducting institutional oversight.

The enclosure provides the Department’s response to each recommendation. Again, we
appreciate the opportunity to review and comment on the draft report.

                                          Sincerely,

                                              /S/

                                          Greg Woods

Enclosure

cc:   Carol Lynch
      Pat Howard
      Kay Jacks
      Jim Lynch
Response to OIG Draft Audit Report, "Review of Case Management and Oversight's Program
Review Function," Control Number ED-OIG / A04-90003, June 2000.

Finding 1

The number of on-site program reviews conducted within the mix of case management needs to be
increased.

Recommendation: Institute management controls within the case management process to ensure a
consistent and appropriately balanced use of program reviews to monitor institutional compliance
with Title IV requirements.

Response: We concur with much of the information provided in the finding and with the
recommendation. However, we do not agree with all of your assessments and conclusions.

The draft audit report recognizes that the number of financial and compliance audits has doubled
because of the statutory requirement for annual instead of biennial audit submissions. The draft audit
report also recognizes that Student Financial Assistance (SFA) was required to review every
participating institution's certification by 1997, and at least every four (recently changed to six) years
thereafter. We met our statutory responsibilities without increasing the number of staff but doing so
did mean we did not conduct as many on-site reviews. However, as the result of the recertification
process and the imposition of cohort default rate penalties, we have eliminated a significant number
of non-performing institutions.

The case management process is designed to be an outcome-based system. It is dependent on a
discussion of each institution's merits and problems as outlined by its financial strength, audit
findings and risk profile. Consistent and appropriate outcomes do not always conclude with the need
for a program review to be conducted. There are instances where we determine that technical
assistance would be a better approach and, to deliver such assistance effectively, institutional
improvement specialists were added to case management teams. Since then, SFA has endeavored to
increase institutional administrative capability and compliance through focused attention on
individual institutions and technical assistance to groups of institutions.

While the number of program reviews clearly decreased, we do not agree that program reviews are
neglected. In addition to program reviews, we believe the audit report should also recognize all the
oversight tools and technical assistance as critically important tools utilized by the case management
teams. However, as we work to develop program review measures and refine goals, SFA will clarify
the importance of program reviews for the case teams and emphasize the need for a more balanced
use of reviews in case management.
The draft report notes that liability assessment decreased. Liability assessment in and of itself is not
a valid measurement of the success or failure of any particular oversight tool. We cannot presume
that liabilities should continually increase in order to prove the success of any compliance activity.
Providing technical assistance to institutions and the closure of non-performing institutions may
account for the reduced liabilities assessed as a result of program reviews. If technical assistance
and other oversight activities are successful, liabilities should decrease.

While we believe that oversight through case management has partially compensated for the
reduction in the number of program reviews and liabilities, SFA believes that concentrating program
review efforts and resources on institutions that appear to be high risk will minimize the need for a
large number of reviews.

To accomplish this, the case teams will increase the number of program reviews at high-risk
institutions. However, to examine the integrity of the risk analysis system, case teams will also
conduct reviews at 25 non-risk schools in FY 2001. The balance of the reviews in FY 2001 will be
conducted specifically at high-risk schools. That number will be dependent upon the overall
capacity of the case teams, the introduction of the new Program Review Guide and the
implementation of training for program reviewers.

We do not entirely concur with your conclusion that program reviews were not conducted when
warranted for 11 of the 163 FY98 cases judgmentally selected by the IG for the audit.

Of the 11 institutions cited, case teams had recommended, but had not yet scheduled prior to the
audit, that program reviews be conducted at three institutions and that a technical assistance visit be
conducted at a fourth. Two program reviews were conducted: one resulted in no liability, the other
in a nominal liability. The third program review is currently underway and the technical assistance
visit will be conducted prior to the end of the fiscal year.

A review of the other seven institutions revealed the following: two were able to meet the financial
responsibility standards (one belonged to a corporate entity that issued an initial public offering and
raised sufficient funds to retire its debt, the second met the standards by posting a 50% letter of
credit) and a third institution took all required corrective actions and resolved all outstanding issues.
There was no other issue for these schools that would have warranted a program review.

The remaining four institutions were a part of the same corporate structure. The case team
conducted a program review at the two largest locations. Both reviews resulted in an Expedited
Final Program Review Determination letter being issued. Because neither of those reviews resulted
in a liability, it was determined that no further action was necessary at the other two institutions.
Finding 2

The program review report process needs improvement.

Recommendation 2.1: Update the 1994 Program Review Guide to reflect the reorganization to case
management and clarify requirements for the timely issuance of program review reports.

Response: We concur with the recommendation. A workgroup is being formed to accomplish this
task within the first half of FY 2001. However, we wish to note that while it is certainly preferable
to update and maintain the Program Review Guide (Guide) so it is always current and reflects the
current organizational structure, we strongly believe the case teams understood how to conduct
program reviews under the case team organizational structure, especially since the Guide is not the
only document case teams rely upon for guidance on conducting program reviews.

Regarding the timely issuance of program review reports, we will clarify the time period for
issuance. However, most delays in issuing reports were due to schools seeking extensions to gather
information to respond to tentative findings, not because case teams were being unreasonably
dilatory. We do not wish to preclude the ability of schools to provide their input because this saves
SFA time and expense in the long run by avoiding a lengthy resolution and appeal process. We will
examine how best to keep track of extensions requested by institutions to ensure case teams issue
reports as expeditiously as possible.

Recommendation 2.2: Establish policies and procedures that provide formal guidance to ensure
consistent application of supervisory review, file maintenance, documentation of program reviews,
and data entry into the CMIS.

Response: We concur with most of this recommendation. Policies and procedures will be reviewed
and updated as required. The new Guide will also clarify expectations and provide operational
guidance to staff. It will outline the steps and provide revised criteria necessary for a well-
documented, timely and error free program review report. Additionally, training will be developed
and provided to case team members responsible for the conduct or oversight of program reviews.

However, we believe the finding placed too great an emphasis on program review data within the
Case Management Information System (CMIS). CMIS was designed to be, and will continue to be
utilized as, an internal workflow tracking system. CMIS was not intended to be the official
repository for program review data. The PEPS system is, and will continue to be, the official
database for program review information. Accordingly, it is important to have complete, accurate
and up to date institutional information maintained in PEPS.
                         REPORT DISTRIBUTION LIST
                      CONTROL NUMBER ED-OIG/A04-90003
                                                                      No. of Copies
Action Official/Auditee

Mr. Greg Woods                                                           Original
Chief Operating Officer
Student Financial Assistance
U.S. Department of Education
400 Maryland Avenue, SW
Regional Office Building 3, Room 4004
Washington, DC 20202


Other ED Officials (via e-mail, except OGC, & OPA)

General Manager for Schools, Student Financial Assistance                  1

Director, Case Management and Oversight                                    1

Chief Financial Officer, Student Financial Assistance

Audit Liaison, Student Financial Assistance                                1

General Counsel and Assistant General Counsel                              3

Acting Director of Public Affairs                                          1
Office of Public Affairs

Supervisor, Post Audit Group                                                1
Office of the Chief Financial Officer

Office of Inspector General (via e-mail)

Inspector General                                                          1
Deputy Inspector General                                                   1
Acting Assistant Inspector General for Audit                               1
Acting Deputy Inspector General for Audit                                  1
Student Financial Assistance Advisory & Assistance Director & Staff        2
Inspector General for Analysis & Inspection Services                       1
Counsel to the Inspector General                                           1
Regional Inspectors General for Audit Services                        1 each