
Case Management & Oversight's Monitoring of Postsecondary Institutions.

Published by the Department of Education, Office of Inspector General on 2004-09-30.


                          UNITED STATES DEPARTMENT OF EDUCATION
                                          OFFICE OF THE INSPECTOR GENERAL



                                                      SEP 30 2004
MEMORANDUM 


TO:            Theresa S. Shaw
               Chief Operating Officer
               Federal Student Aid

FROM:          Helen Lew
               Assistant Inspector General for Audit
               Office of Inspector General

SUBJECT:       FINAL AUDIT REPORT
               Case Management and Oversight's Monitoring of Postsecondary Institutions
               Control No. ED-OIG/A04-D0014


Attached is the final audit report that covers the results of our review of Case Management and
Oversight's monitoring of postsecondary institutions during August 2001 through May 2003.
An electronic copy has been provided to your Audit Liaison Officer. We received your
comments nonconcurring with the findings and recommendations in our draft report. Our
response to your comments is included in the Audit Results section of the report.

Corrective actions proposed (resolution phase) and implemented (closure phase) by your office
will be monitored and tracked through the Department's Audit Accountability and Resolution
Tracking System (AARTS). ED policy requires that you develop a final corrective action plan
(CAP) for our review in the automated system within 30 days of the issuance of this report. The
CAP should set forth the specific action items, and targeted completion dates, necessary to
implement final corrective actions on the findings and recommendations contained in this final
audit report.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector
General is required to report to Congress twice a year on the audits that remain unresolved after
six months from the date of issuance.

In accordance with the Freedom of Information Act (5 U.S.C. §552), reports issued by the Office
of Inspector General are available to members of the press and general public to the extent
information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given us during this review. If you have any questions, please
call Regional Inspector General J. Wayne Bynum at 404-562-6477 or Assistant Regional
Inspector General Mary Allen at 404-562-6465.


Enclosure
                                      400 MARYLAND AVE., S.W., WASHINGTON, DC 20202-1510
                                                               www.ed.gov

              Our mission is to ensure equal access to education and to promote educational excellence throughout the nation.
                     Case Management & Oversight’s 

                   Monitoring of Postsecondary Institutions 



                                 FINAL AUDIT REPORT 





                                            ED-OIG/A04-D0014 

                                             September 2004




Our mission is to promote the efficiency, effectiveness, and integrity of the
Department's programs and operations.

                                                                 U.S. Department of Education
                                                                   Office of Inspector General
                                                                              Atlanta, Georgia
Statements that managerial practices need improvements, as well as other conclusions and
recommendations in this report, represent the opinions of the Office of Inspector General.
Determinations of corrective action to be taken will be made by the appropriate Department
of Education officials.


In accordance with the Freedom of Information Act (5 U.S.C. §552), reports issued by the
Office of Inspector General are available to members of the press and general public to the
extent information contained therein is not subject to exemptions in the Act.
                         TABLE OF CONTENTS


EXECUTIVE SUMMARY

AUDIT RESULTS

   Finding No. 1 – The Institutional Assessment Model Is An Ineffective Tool for
                   Identifying “At Risk” Institutions
   Recommendations

   Finding No. 2 – The Program Review Process Needs Improvement
   Recommendations

   Finding No. 3 – Technical Assistance Was Not Adequately Documented or
                   Followed Up
   Recommendations

   Finding No. 4 – CMO-HQ Monitoring of Regional Office Operations Needs
                   Improvement
   Recommendation

BACKGROUND

OBJECTIVE, SCOPE, AND METHODOLOGY

STATEMENT ON MANAGEMENT CONTROLS

APPENDIX A – CMO Regional Office Exceptions

APPENDIX B – Written Response to the Draft Report


                             EXECUTIVE SUMMARY


Our objectives were to evaluate (1) Case Management & Oversight’s (CMO) use of program
reviews as a compliance tool, (2) CMO’s use of technical assistance as a compliance tool, and
(3) CMO Headquarters (CMO-HQ) management controls over regional offices’ monitoring of
postsecondary institutions. Audit coverage included CMO monitoring of institutional
compliance with the Title IV, Student Financial Assistance (Title IV) requirements during the
period August 2001 through May 2003. To accomplish our objectives, we visited the CMO-HQ
in Washington, DC, and four CMO regional offices (Atlanta, GA; Chicago, IL; Dallas, TX; and
San Francisco, CA).

We identified weaknesses in the Institutional Assessment Model (IAM) used to identify and
select institutions for review, the CMO regional office program review process, and the CMO
regional office technical assistance process. We also found that CMO-HQ monitoring of
regional office operations needed strengthening.

The IAM did not contain complete and accurate information, and the IAM risk scores did not
always predict problematic institutions. There were no policies, procedures, and management
controls over the information entered into the IAM and no evaluation of its effectiveness. The
weaknesses identified with the IAM may prevent CMO from effectively prioritizing case
management efforts. Although the IAM was not the only methodology CMO used to identify
problematic institutions, it was a significant tool for identifying high-risk institutions for review.

The CMO regional office program review reporting process, retention of supporting
documentation, and consistency in the review process need improvement. Weaknesses in the
program review process were caused by a lack of detailed policies and procedures and a lack of
compliance with the limited existing policies and procedures. These weaknesses placed CMO at
risk of failing to adequately identify and report significant instances of noncompliance and of
being inconsistent and inequitable in its conduct and resolution of program reviews.

We also identified problems with the regional offices’ documentation of technical assistance and
a lack of follow-up on the results of technical assistance. These problems were caused by a
failure to comply with existing policies and procedures and a lack of detailed policies and
procedures for some compliance areas. Failure to document and follow up on technical
assistance prevented CMO management from having the ability to measure the effectiveness of
technical assistance as a compliance tool.

The CMO-HQ monitoring of regional office operations needed strengthening. CMO-HQ did not
(1) monitor regional offices’ use of the IAM, (2) provide guidance for the selection of
institutions for case management in the absence of an updated IAM risk list, (3) monitor regional
offices’ compliance with internal policies and procedures for program reviews and technical
assistance, (4) evaluate the effectiveness of program reviews or technical assistance conducted or
the consistency of regional offices’ selection of institutions for program review or technical
assistance, and (5) evaluate the effectiveness of the enforcement actions taken as a result of
regional office program reviews. These weaknesses were primarily a result of the level of
autonomy given to regional office managers over monitoring decisions. This also created the
potential for inconsistent treatment of institutions across the country.

CMO-HQ is currently developing a new electronic CMO (eCMO) initiative to improve the
overall monitoring process. According to CMO-HQ officials, the eCMO structure is grounded
in the case management process model and is focused primarily on updating tools and systems to
help support decision-making that is informed, effective, efficient, consistent, documented,
standardized, and distributed. During an end-of-fieldwork meeting, CMO-HQ officials provided
information on how the new eCMO initiative may address some of the concerns noted in this
audit report; however, the new initiative was still in the research and development phase and the
officials were unable to provide an estimated implementation date. Even if eCMO is fully
implemented, CMO will need to address the management deficiencies identified in this audit.

We recommend that the Chief Operating Officer for Federal Student Aid (FSA) require
CMO-HQ to:

   •  Develop and implement management controls to ensure that the data used to identify the
      most at-risk institutions are complete, accurate, and applicable to the institutions being
      evaluated.
   •  Develop a methodology for evaluating the effectiveness of any risk assessment model
      used to identify the institutions presenting the highest risk of loss of Title IV funds.
   •  In the absence of an effective risk model, provide guidance to the regional case
      management teams for identifying institutions for program review and technical
      assistance.
   •  Establish detailed policies and procedures over supervisory review of program reviews,
      record retention, off-site program reviews, specific items for making a program review or
      technical assistance the appropriate monitoring action, and the appropriate action to be
      taken as a result of a specific compliance issue identified at an institution.
   •  Develop a quality control process to ensure regional compliance with the policies and
      procedures concerning the program review function and consistency across the regions in
      decisions pertaining to monitoring actions taken and enforcement actions in the event of
      noncompliance.
   •  Develop and implement policies and procedures for providing technical assistance in a
      consistent manner across all regions, documenting the technical assistance provided,
      identifying when technical assistance ends and enforcement begins, and following up
      on technical assistance visits and measuring their effectiveness as a
      compliance/monitoring tool.
   •  Implement management controls to ensure consistent treatment of institutions across
      regional offices.
   •  Develop internal policies and procedures to ensure management oversight of CMO
      operations.

FSA did not agree with all of the audit findings; however, FSA agreed to take action on the
recommendations. We summarized FSA’s written response after each finding and included the
response as Appendix B to this report. Due to the large volume of pages, we did not include the
attachments to the written response. Our comments to FSA’s written response are included after
each finding.


                                     AUDIT RESULTS


Finding No. 1 – The Institutional Assessment Model Is An Ineffective Tool for
Identifying “At Risk” Institutions
We found that weaknesses in the IAM may prevent CMO from effectively prioritizing case management
efforts. The IAM did not contain complete and accurate information, and the IAM risk scores
did not always accurately identify problematic institutions. This occurred due to a lack of
policies, procedures, and management controls around the information used in the IAM and the
lack of evaluation of the effectiveness of the IAM. By maintaining a risk system that does not
accurately identify the most at-risk institutions, CMO may be making ineffective decisions about
the best use of its resources and ineffectively prioritizing its case management efforts.

Section 498A of the Higher Education Act of 1965 (HEA) prescribes the requirements for the
conduct of program reviews as follows:

       (a) GENERAL AUTHORITY. - In order to strengthen the administrative
       capability and financial responsibility provisions of this title, the
       Secretary- (1) shall provide for the conduct of program reviews on a
       systematic basis designed to include all institutions of higher education
       participating in programs authorized by this title.


According to this section of the HEA, the Secretary is to give priority for program review to
institutions of higher education that have a cohort default rate in excess of 25 percent, a loan
default rate that places the institution in the highest 25 percent of such institutions, a significant
fluctuation in Federal loan or Pell grant volume, reported deficiencies or financial aid problems,
high annual dropout rates, and institutions that the Secretary determines may pose a significant
risk of failure to comply with the administrative capability or financial responsibility provisions
of the Act.

In fiscal year (FY) 2001, CMO adopted the IAM to rank institutions according to their potential
risk of loss of Government funds. The IAM system, hosted by Oak Ridge National Laboratories
(ORNL), is a tool to prioritize case management efforts in selecting schools for on- and off-site
program reviews and technical assistance. The IAM software uses school data taken from
various sources, organizes it, and then presents it in a manner that will track, assess, and
anticipate risk among institutions participating in Title IV programs. The IAM is based on
statistical data collected and utilized over a period of time. The information is used to assess the
probability of a specific event befalling a school. For each specific financial problem that is
identified through the assessment, several probability measurements can be constructed. “At
risk” schools identified in the assessment are grouped into problem related categories pertaining
to surety, fines, reimbursements, or penalties.

Institutional data from the following U.S. Department of Education systems is submitted to
ORNL for use in arriving at the IAM risk score: National Student Loan Data System (NSLDS),
Recipient Financial Management System (RFMS), Grants Administration Payments System
(GAPS), Central Processing System (CPS), Postsecondary Education Participants System
(PEPS), and the Default Management System.

The IAM Did Not Contain Complete Information
We compared the institutions participating in Title IV programs (i.e., schools for which
disbursements were reported in NSLDS and GAPS) to institutions receiving an IAM score for
two years. Our analysis showed that approximately 525 of the 6,371 institutions (8.2 percent)
that participated in the Title IV programs were not assigned an IAM score in the July 2001 risk
assessment. In addition, approximately 500 of the 6,371 institutions (7.8 percent) that
participated in the Title IV programs were not assigned an IAM score in the November 2002 risk
assessment. Of this 500, 424 were the same institutions that did not receive a score in the July
2001 risk assessment.
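
To illustrate the type of completeness check described above, the sketch below (Python) compares the set of schools with reported Title IV disbursements against the set of schools that received an IAM score and reports the schools missing a score. The file names and the school-identifier field are hypothetical; this is an illustrative reconstruction of the analysis, not the actual OIG or CMO tooling.

    # Illustrative sketch only -- file names, columns, and formats are assumptions,
    # not the actual OIG/CMO data layout.
    import csv

    def load_ids(path, id_field):
        """Load a set of school identifiers (e.g., OPE IDs) from a CSV extract."""
        with open(path, newline="") as f:
            return {row[id_field].strip() for row in csv.DictReader(f) if row[id_field].strip()}

    # Schools with Title IV activity (e.g., drawn from NSLDS/GAPS disbursement extracts).
    participating = load_ids("title_iv_disbursements_2001.csv", "ope_id")
    # Schools that were assigned a score in a given IAM risk assessment.
    scored = load_ids("iam_scores_july_2001.csv", "ope_id")

    missing = participating - scored
    pct = 100.0 * len(missing) / len(participating)
    print(f"{len(missing)} of {len(participating)} participating schools "
          f"({pct:.1f}%) had no IAM score")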

CMO-HQ officials explained that institutions did not receive an IAM risk assessment score if
they were a new school or a “satellite” campus whose information was rolled into the main
campus score, had insufficient information for a score, were not on the eligibility list, or had
been inadvertently dropped because there was no case team assigned to them per the IAM
database. CMO-HQ officials said they reviewed a sample of the schools that were not assigned
an IAM risk score and found that although some of the schools should have received a score,
other schools did not receive a score because they had closed, merged with another school, lost
Title IV funding, or had not been in the program long enough to provide sufficient data to
support the calculation.

Information Submitted By the Department to Oak Ridge National Laboratories (ORNL)
Was Incomplete
ORNL used data provided from PEPS to assign an IAM risk score to institutions participating
in the Title IV programs. Some of the PEPS data did not accurately reflect an individual
institution’s financial responsibility, a factor used in calculating the IAM score. According to
CMO officials, financial statements that fail certain conditions (e.g., audit opinion, compliance
issues, contingent liabilities, debt agreement violation, change in auditor, late refunds) are
“flagged” in the PEPS system. If there are 10 schools covered by a financial statement (e.g.,
OMB A-133 Statewide Single Audit of public institutions), all 10 schools receive the same flag.
Thus, it is possible for schools with high risk not to be flagged and for schools with low risk to
be flagged as high risk. As a result, the IAM scores assigned to public institutions may not
necessarily reflect an accurate financial responsibility rating.
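
The statement-level flagging described above can be illustrated with a short sketch (Python). A single flagged financial statement propagates the same flag to every school it covers, which is why individually low-risk schools can appear high risk and vice versa. The school and statement identifiers are invented; this is a simplified illustration of the logic described by CMO officials, not the PEPS implementation.

    # Simplified illustration of statement-level flag propagation; identifiers are invented.
    statement_covers = {
        "statewide_single_audit_2002": ["school_A", "school_B", "school_C"],  # e.g., an A-133 audit
        "private_college_fs_2002": ["school_D"],
    }
    flagged_statements = {"statewide_single_audit_2002"}  # statement failed a condition

    school_flag = {}
    for statement, schools in statement_covers.items():
        for school in schools:
            # Every school under a flagged statement inherits the flag,
            # regardless of its individual financial condition.
            school_flag[school] = statement in flagged_statements

    print(school_flag)  # school_A..C flagged together; school_D unflagged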

In addition, information in PEPS relating to the total amount of liabilities assessed as a result of
noncompliance in a program review was not always correct. We obtained a PEPS extract dated
June 26, 2003, that contained the total amount of liabilities assessed as a result of all program
reviews conducted during our audit period. Our review of a random sample of program review
report files in the four regional offices visited revealed that liability amounts reported in the
CMO Final Program Review Determination (FPRD) letters did not always match the liability
amounts reported in PEPS. Of the 40 report files reviewed for which an FPRD letter had been
issued, we found 8 differences between liability amounts in PEPS and the FPRD letters. For
these 8 files, the FPRD letters showed liabilities totaling $778,140 while PEPS showed liabilities
totaling $180,864. As a result, some of the liabilities assessed as a result of reported
noncompliance may be incorrectly or incompletely reported in PEPS. ORNL used the PEPS
data to develop risk scores for institutions.
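
A reconciliation of the kind performed here can be sketched as follows (Python). The review identifiers and liability amounts are invented for illustration and are not the sampled data; the sketch simply compares the liability recorded in PEPS with the amount shown in the corresponding FPRD letter and totals the differences.

    # Illustrative reconciliation sketch; amounts and review IDs are invented.
    fprd_liabilities = {"PR-001": 250_000.00, "PR-002": 34_500.00, "PR-003": 0.00}
    peps_liabilities = {"PR-001": 250_000.00, "PR-002": 0.00, "PR-003": 0.00}

    mismatches = {
        review_id: (fprd_liabilities[review_id], peps_liabilities.get(review_id, 0.0))
        for review_id in fprd_liabilities
        if fprd_liabilities[review_id] != peps_liabilities.get(review_id, 0.0)
    }

    for review_id, (fprd_amt, peps_amt) in mismatches.items():
        print(f"{review_id}: FPRD ${fprd_amt:,.2f} vs PEPS ${peps_amt:,.2f}")

    print(f"{len(mismatches)} of {len(fprd_liabilities)} sampled reviews differ; "
          f"FPRD total ${sum(a for a, _ in mismatches.values()):,.2f}, "
          f"PEPS total ${sum(b for _, b in mismatches.values()):,.2f}")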

Information pertaining to program review findings in PEPS was also sometimes inaccurate. Our
review of a sample of program review report files in the regional offices visited revealed that the
CMO institutional review specialists did not always report all program review findings identified
during the review. This resulted in incomplete information pertaining to program review
findings being used to develop risk scores. Finding No. 2 provides additional information
regarding this problem.

IAM Scores Did Not Always Predict Problematic Institutions
As part of our institutional file review, we compared the relationship between the IAM risk
score and the findings of noncompliance at institutions identified in audit reports, program
reviews, and other documentation in the files indicating possible noncompliance. In 74 of the
155 school files reviewed (48 percent), there was no apparent relationship between the IAM
score and the findings of noncompliance identified at the institutions. For these 74 files, 58
institutions had a high IAM score with a low level of evidence supporting noncompliance issues,
and 16 institutions had a low IAM score with a high level of evidence supporting noncompliance
issues.
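
The comparison can be thought of as a simple cross-tabulation of score level against the level of evidence of noncompliance found in each file. The sketch below (Python) shows the structure of such an analysis using an invented sample and an arbitrary score cutoff; it is not the OIG workpaper calculation.

    # Illustrative cross-tabulation; records and thresholds are invented.
    files = [
        {"school": "school_1", "iam_score": 0.91, "evidence": "low"},
        {"school": "school_2", "iam_score": 0.12, "evidence": "high"},
        {"school": "school_3", "iam_score": 0.88, "evidence": "high"},
        {"school": "school_4", "iam_score": 0.20, "evidence": "low"},
    ]
    HIGH_SCORE = 0.75  # arbitrary cutoff for a "high" IAM score

    high_score_low_evidence = [f for f in files
                               if f["iam_score"] >= HIGH_SCORE and f["evidence"] == "low"]
    low_score_high_evidence = [f for f in files
                               if f["iam_score"] < HIGH_SCORE and f["evidence"] == "high"]

    mismatched = len(high_score_low_evidence) + len(low_score_high_evidence)
    print(f"{mismatched} of {len(files)} files show no apparent relationship "
          f"between IAM score and evidence of noncompliance")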

We found no policies, procedures, or management controls in place to evaluate the effectiveness
of the IAM or to ensure that the data provided to ORNL to identify the most at-risk institutions
was complete, accurate, and applicable to the institutions being evaluated. By maintaining a risk
system that does not accurately identify the most at-risk institutions, CMO may be making
ineffective decisions about the best use of its resources and incorrectly prioritizing its case
management efforts.

During our end-of-fieldwork briefing with CMO-HQ officials, we were informed that CMO and
ORNL were currently evaluating the overall effectiveness of the IAM. The officials said the
results of this review would be used to develop an improved risk model that will be part of the
future electronic CMO (eCMO). The officials said that since eCMO was in the design phase,
they were unable to provide an estimated completion date for the eCMO project and/or rollout of
the improved risk system. Until CMO and ORNL complete their analysis of the IAM, the
CMO-HQ should strengthen current policies, procedures, and management controls over the
determination of the most at-risk institutions.




RECOMMENDATIONS
We recommend that the Chief Operating Officer for Federal Student Aid require CMO-HQ to:

1.1 	 Develop and implement management controls to ensure that the data used to identify the
      most at-risk institutions are complete, accurate, and applicable to the institutions being
      evaluated.

1.2 	 Develop a methodology for evaluating the effectiveness of any risk assessment model used
      to identify institutions presenting the highest risk of loss of Title IV funds.

1.3 	 In the absence of an effective risk model, provide guidance to the regional case
      management teams for identifying institutions for program review and technical assistance.

FSA RESPONSE AND OIG COMMENTS

In general, FSA agreed that internal procedures and management controls can be strengthened in
the areas identified. FSA stated that it will review and revise the procedures as necessary and
provide training to the case teams on the new procedures. FSA agreed that the IAM system
could be enhanced, and stated that the Schools Eligibility Channel (SEC) staff have identified the
requirements for a new model as part of the development of the Integrated Partner Management
System.

Regarding FSA’s disagreement with certain statements in the audit report, the information
provided in the written response was not sufficient to convince us to amend the finding and
recommendations. FSA’s specific response to the draft report and our comments are
summarized below.

FSA Response. FSA stated that the audit findings were overstated and FSA took issue with
some of the statements in the report. FSA stated that the IAM is only one tool used by the SEC
to identify institutions with a probability of risk for case management. Case management is an
extensive review of a school, given the school’s individual circumstances. This case
management review determines appropriate oversight actions that may include an on-site or
off-site program review.

FSA said the SEC conducts oversight activities required by legislation and regulation to identify
at-risk institutions, including reviews of annual audits and financial statements, calculations of
default rates, eligibility reviews, and program reviews. In addition, the SEC conducts technical
assistance visits to help schools prevent problems. The SEC analyzes data to proactively identify
schools that may need intervention. All these activities are in addition to the risk probability
information being provided by the IAM system.

FSA disagreed with the statement that “by maintaining a risk system that does not accurately
identify the most at-risk institutions, CMO may not be making effective decisions about the best
use of its limited resources.” FSA said the current model was designed to identify schools with
four conditions of risk: the presence of surety (letter of credit), the presence of a fine greater
than $10,000, the condition of being on reimbursement, or the condition of having a liability
from audits or program reviews greater than $10,000. The model is a good predictor of
institutions likely to have these four conditions. These predictions are based on a type of
regression analysis that starts with identifying those schools that have the condition or problem,
and then looks for variables that contribute to the school having the condition.
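
FSA's description corresponds to a standard classification setup: label each school by whether it has one of the four risk conditions, then fit a model on candidate predictor variables. The sketch below uses logistic regression on a tiny synthetic data set to show the shape of that approach; the predictors, data, and library choice (scikit-learn) are illustrative assumptions, not the model ORNL actually fit.

    # Illustrative sketch of a condition-prediction model; data and predictors are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row is a school: [cohort_default_rate, change_in_title_iv_volume, prior_findings]
    X = np.array([
        [0.28, 0.40, 3],
        [0.05, 0.02, 0],
        [0.31, 0.15, 4],
        [0.10, 0.05, 1],
        [0.22, 0.30, 2],
        [0.04, 0.01, 0],
    ])
    # 1 = school has one of the four risk conditions (surety, fine, reimbursement, liability)
    y = np.array([1, 0, 1, 0, 1, 0])

    model = LogisticRegression().fit(X, y)
    new_school = np.array([[0.26, 0.35, 2]])
    print("Predicted probability of a risk condition:",
          model.predict_proba(new_school)[0, 1])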

OIG Comments. We disagree with FSA’s comment that the audit findings were overstated.
The report states that the IAM is one of several tools used by FSA to identify high risk
institutions for review. We placed emphasis on the IAM because it was a primary tool used
to identify and select institutions for review. As FSA stated, the IAM is a tool to identify
schools with a probability of risk for case management (an extensive review of a school), and
the case management review determines appropriate oversight actions that may include an
on-site or off-site program review. Therefore, it is important to correctly identify the most at-risk
institutions for proper allocation of resources and prioritization of workload. Since CMO had
not evaluated the effectiveness of the IAM, we did so through this audit and found that the IAM
was an ineffective tool for identifying “at risk” institutions because the IAM did not contain
complete information, the information submitted by the Department to ORNL for use in
calculating risk scores was incomplete, and the risk scores did not always predict problematic
institutions.

FSA Response. Regarding the schools that were not assigned an IAM score in the July 2001
and November 2002 risk assessments, FSA stated that most of these schools should not have had
a risk score for various reasons (e.g., closed schools, merged/consolidated schools, loss of
Title IV eligibility, loss of State accreditation/authorization or voluntary withdrawal, not eligible
and/or not certified, funding office only, and insufficient data to support calculation). FSA said
68 schools were inadvertently dropped from the July 2001 risk list due to an error (56 of which
should have received a risk score). This error was detected and corrected for the November 2002
list. Five of the 56 schools that should have received a risk score in July 2001 were included on
the November 2002 high-risk list. FSA said the fact that it had identified the 68 schools with
missing scores for July 2001 showed that it did perform analysis and checks on the IAM data and
system. FSA concluded that there were 67 schools with a valid missing risk score for the July
2001 and November 2002 risk assessments. FSA also provided a spreadsheet of its analysis on
why schools that we identified as missing an IAM score should not have had a score.

OIG Comments. Our point regarding the missing scores was that the IAM did not contain
complete information and, therefore, there could be schools that needed to be case managed for
which FSA did not have complete information. The problem with the 68 schools with missing
scores for July 2001 was not completely corrected for the November 2002 risk assessment. We
found that 13 of these 68 schools also did not receive a risk score in the November 2002 risk
assessment. The support for FSA’s statement that identifying the 68 schools with missing scores
showed it performed analysis and checks on the IAM data and system consisted of an e-mail
from the contractor dated July 31, 2003. This was a month after the start of this audit and about
eight months after the November 2002 IAM scores had been generated.

While there may be schools that validly did not have an IAM score, we identified discrepancies
in the data provided by FSA in response to this issue. Because of these discrepancies, we could
not determine the actual number of institutions that validly did not have an IAM score. These
discrepancies further support our conclusion that CMO-HQ needs to evaluate the effectiveness of
the IAM or any other methodology used to generate risk scores to ensure that all participating
schools receive coverage. For example, (1) a school that closed after the IAM risk assessment
did not receive a risk score; (2) a school that closed before the IAM risk assessment received a
risk score; and (3) discrepancies existed in the dates necessary for analysis (e.g., the data showed
a school began participating in the Title IV programs on January 1, 1965, and lost its State
authorization the same day, with the action updated 37 years later). Follow-up discussions with
FSA officials revealed that there were errors in the data and that additional schools should have
received a risk score.

FSA Response. FSA disagreed that IAM scores assigned to public institutions did not reflect
financial responsibility. FSA said the IAM indicator for all schools is the presence or absence of
a flagged financial statement, or a missing financial statement. Whenever a financial statement
fails a condition, the SEC sets a flag in the system for the school. If there are 10 schools covered
by the financial statement, all 10 schools receive the same flag.

OIG Comments. FSA’s comments did not change our conclusion regarding the IAM indicator
for public institutions. We did, however, amend the finding to better reflect how public schools
receive a flag in PEPS. FSA’s comments confirmed that it is possible for schools with high risk
not to be flagged and for schools with low risk to be flagged.

FSA Response. Regarding the differences between liability amounts in PEPS and the FPRD
letters, FSA said the differences were caused by data entry timing delays, FPRDs being issued
after OIG extracted the data from PEPS, data entry conducted on the same day that OIG
extracted its data, and the case team not reporting deficiencies that had been corrected by the
school. FSA also said current practice allows correction of program review findings on-site, or
shortly thereafter, and the finding is not included in the FPRD. FSA said a delay in entry of
program review information is not sufficient to support the OIG’s claim that information
submitted to IAM is incomplete. FSA said all of the data for the schools reviewed by OIG has
since been entered or corrected in PEPS.

OIG Comments. We amended the finding to reflect differences for 8 of the 40 files we
reviewed (instead of 11 of 40) based on the information provided by FSA in its written response.
Our review of the “data entry timing delay” justification offered by FSA revealed that the
applicable FPRDs were issued about 5, 7, 9, and 12 months prior to the time the PEPS data was
extracted for OIG (June 2003). We do not agree that such delays in entering FPRD data into
PEPS are justified. The IAM model was designed to identify schools with four conditions of risk,
one of which is a liability from audits or program reviews greater than $10,000. The failure to
enter program review liabilities into PEPS in a timely manner prevents the IAM from identifying
all institutions that meet this condition. We also disagree with FSA’s policy of not reporting all
program review findings if the findings are corrected on-site, or shortly thereafter. This issue is
discussed in more detail in Finding 2.

FSA Response. FSA said the statement that IAM scores did not always predict problematic
institutions is inaccurate because OIG (1) used a different definition of noncompliance from the
IAM definition (i.e., OIG used audit and program review findings to define noncompliance),
(2) used the peer group probability score in many instances, instead of the national score for
comparing noncompliance, and (3) scrutinized IAM as a distinct, independent application, not as
an integrated tool that is inherently aligned with the case management approach.

OIG Comments. For purposes of our analysis, we compared the IAM score to program
reviews, audit findings, and any other information found in the institutional file to indicate
possible noncompliance. Other information found in the file to indicate possible noncompliance
included the four conditions of risk upon which the IAM model is based (i.e., the presence of
surety (letter of credit), the presence of a fine greater than $10,000, the condition of being on
reimbursement, or the condition of having a liability from audits or program reviews greater than
$10,000).

At the beginning of the audit, we were informed that CMO used the peer group score for
selecting high-risk schools from the July 2001 risk assessment and used the national score for
selecting high-risk schools from the November 2002 assessment. We evaluated the correlation
of the risk scores (peer group scores for the July 2001 assessment and national scores for the
November 2002 assessment) with the information in the institutional files.

We recognize, as the report states, that the IAM is one of several tools used by FSA to identify
institutions for review. Although the IAM is one of several tools, it was a major tool used by the
regional offices to identify high-risk schools for review.

FSA Response. In response to Recommendation 1.1, FSA agreed to develop and implement a
process to validate critical data in PEPS. FSA said it believes that the data needs to be complete
and accurate regardless of the system used to determine the probability of risk.

OIG Comments. While we agree with FSA’s statement on the need for correct and accurate
data, FSA’s response did not fully address Recommendation 1.1. FSA should ensure that the
validation of PEPS data includes the accuracy of information regarding financial responsibility,
assessed liabilities, and program review findings. In addition, FSA needs to ensure that risk
scores are determined for all schools that receive Title IV funds and that the scores identify high-
risk institutions (regardless of whether the scores are determined by the IAM, the FY 2004
Compliance Initiative, or eCMO). FSA’s response did not address the implementation of
management controls to ensure the data used to identify the most at-risk institutions are
complete, accurate, and applicable to the institutions being evaluated.

FSA Response. In response to Recommendation 1.2, FSA agreed to evaluate the effectiveness
of the FY 2004 Compliance Initiative. FSA is currently using the FY 2004 Compliance
Initiative, not solely IAM, to identify schools with a potential for noncompliance in identified
areas.

OIG Comments. FSA’s response to Recommendation 1.2 did not fully address the
recommendation. FSA should also develop a methodology to evaluate the effectiveness of the
IAM if it continues to be used as a risk assessment model, the eCMO when it is implemented,
and any other model developed to assess risk.



FSA Response. In response to Recommendation 1.3, FSA stated that one of the basic
requirements and a continuing function of the case management process after performing a
comprehensive review of all functional area information is for the teams to recommend
appropriate next steps. This includes making recommendations to perform program reviews,
refer for administrative action, or provide technical assistance. FSA said as additional data
analysis is performed that identifies additional data outliers, the SEC would provide these
potential risk issues and guidance for resolution to the case teams. The SEC performed analysis
and identified several schools to be worked by the case teams as their risk list in the FY 2004
Compliance Initiative. The training of trainers for the current Compliance Initiative took place
on July 27-28, 2004. Management Improvement Services (technical assistance) procedures were
issued in July with an effective date of August 1, 2004. Training was conducted July 29, 2004.

OIG Comments. In response to Recommendation 1.3, FSA provided a summary of the FY
2004 Compliance Initiative, which outlined seven anomalies identified through data mining,
covering 379 institutions selected for case management. Although institutions were identified for
case management, the initiative documentation did not provide a methodology for case teams to
select institutions for program review and technical assistance. We also requested the
Compliance Initiative training materials and were informed that the materials were in draft.

Finding No. 2 – The Program Review Process Needs Strengthening
Management controls over CMO’s program review process need strengthening. Our review of
program review report files and interviews of case team members in four regional offices
identified weaknesses in the reporting process, retention of supporting documentation, and
consistency in the program review process across regions. This occurred due to a failure to
comply with existing policies and procedures and a lack of detailed policies and procedures for
some compliance areas. These weaknesses put CMO at risk of failing to properly identify and
report significant instances of noncompliance and of being inconsistent and inequitable in its
conduct and resolution of program reviews.

As previously noted, the HEA requires guidelines for the conduct of program reviews. Section
498A states “the Secretary shall establish guidelines designed to ensure uniformity of practice in
the conduct of program reviews of institutions of higher education. . . .”

Excess Cash Review
We were unable to determine whether or not Institutional Review Specialists adequately
reviewed excess cash as part of the program review. Determining whether or not an institution
is maintaining excess cash is part of the fiscal review to be performed during the program
review. The Program Review Guide outlines the procedures for performing a fiscal review of
institutional records to determine noncompliance with cash management regulations, and
requires that documentation be maintained to support this review. We did not find sufficient
documentation to support the conclusions reached in the excess cash reviews performed. In
three of the four regions visited, the only documentation found within the files to indicate that a
fiscal review was performed consisted of Grants Administration and Payment System (GAPS)
printouts and bank statements. No documentation was maintained to support tracing of Title IV
funds to ensure that funds were spent within required timeframes.
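
Tracing Title IV funds to confirm they were spent within required timeframes is essentially a comparison of drawdown dates against disbursement dates. The sketch below (Python) illustrates that kind of check; the sample records and the holding-period threshold are assumptions for illustration, not the regulatory standard applied by the reviewers.

    # Illustrative excess-cash check; dates, amounts, and the allowed holding period are assumptions.
    from datetime import date

    ALLOWED_DAYS = 3  # assumed maximum days funds may be held before disbursement

    drawdowns = [
        {"drawn": date(2002, 3, 1), "disbursed": date(2002, 3, 4), "amount": 50_000},
        {"drawn": date(2002, 4, 10), "disbursed": date(2002, 5, 2), "amount": 120_000},
    ]

    for d in drawdowns:
        held = (d["disbursed"] - d["drawn"]).days
        if held > ALLOWED_DAYS:
            print(f"Potential excess cash: ${d['amount']:,} held {held} days "
                  f"(drawn {d['drawn']}, disbursed {d['disbursed']})")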

Reporting Process
Institutional review specialists did not always report all findings in program review reports. In
two regions visited, although a finding was identified, the reviewers did not cite the correct
number of student file review exceptions for the finding. Instead, the reviewers cited a few
examples of the student files containing the problem. In addition, the reviewers sometimes
resolved findings while on-site and reported neither the problem nor the number of student file
review exceptions in the program review report. We found these issues in 15 of the 47 files
reviewed. As a result, liabilities were not consistently and fully assessed.

Adequacy of Full File Review Results
Institutional review specialists did not adequately review the results of institutional full file
reviews performed in response to program review findings. The procedures followed by
reviewers during their analyses of institutional full file reviews were inadequate in two of the
regions visited. The reviews of the institutional full file reviews were inadequate because the
reviewers did not identify the fact that the institution failed to include all student exceptions in
the documentation submitted to CMO for the full file reviews. In addition, the reviews did not
verify refund calculations submitted by the institutions. By failing to adequately analyze
information submitted by an institution for a full file review, there was no assurance that the full
file reviews could be relied upon, that the schools understood how to correctly administer the
Title IV programs, or that all liabilities from noncompliance had been assessed. This can lead
to repeated noncompliance.

CMO’s failure to report all findings in program review reports and adequately review the results
of the full file review documentation submitted by institutions occurred, in part, due to a lack of
supervisory review.1 Although case team members said that supervisory review of the working
papers and reports was performed prior to issuing a program review report, there was no
documentation in the files to support this statement. Also, the Program Review Guide did not
address the topic of supervisory review.

Retention of Supporting Documentation
We identified weaknesses in the retention of documentation to support program review results.
The time period for which documentation was retained to support program review findings
varied across the four regions visited. One region purged all
supporting documentation after the final program review determination appeal period ended;
another region maintained documentation up to 5 years after the appeal period ended, or longer if
necessary; one region maintained documentation within the office until all issues were resolved
after which time the records were archived for 5 years; and one region maintained documentation
up to three years or as long as storage space was available.

1
 OIG reported a similar finding in a “Review of Case Management & Oversight’s Program Review Function,”
Control No. ED-OIG A04-90003, issued in September 2000. This review found that CMO did not have a formal
and consistent supervisory review process in place. Each region visited had unique supervisory review procedures
for reports. This was documented differently in each region, and in one region it was not documented at all.


The Program Review Guide did not provide specific guidance on how long supporting
documentation was to be maintained. CMO-HQ officials said that the official document
retention policy for documentation to support program review reports was the same as FSA’s
record retention policy. FSA’s record retention policy is found within the Department of
Education Records Disposition Schedules (ED/RDS), Part 10. ED/RDS, Part 10, Item 23a
(N1-441-00-01, Item 4a) requires the maintenance of program review files indefinitely unless the
institution is terminated from the Title IV programs. If an institution is terminated, the files must
be retained for one year from termination.

FSA’s document retention policy was not readily available for reviewers on CMO’s website.
The most recent record retention documentation information was a September 2002 e-mail sent
to the case management teams. Furthermore, CMO officials were uncertain of the exact terms of
their document retention policy prior to researching its exact terms in response to our questions.
CMO’s weakness in the maintenance of documentation to support program review findings was
primarily due to its failure to follow existing Departmental document retention policies and
procedures or to make the required retention period for supporting documentation known to the
case teams.

Inconsistencies in the Program Review Process Across Regions
We identified three inconsistencies in the program review process across the regional offices
visited. First, the methodology for selecting and performing off-site program reviews was
inconsistent. Second, there were differences in the weight placed on factors considered to
determine whether or not a program review was warranted. Examples of these factors included
the IAM risk score, findings in prior program review reports and audit reports, and any other
information found within the institutional file suggesting potential noncompliance. Finally, the
actions taken as a result of program reviews with similar findings and liability assessments
varied across regions.

In one region, off-site reviews were similar to on-site program reviews. All documentation that
would normally be examined during an on-site program review was obtained from the institution
within 48 hours of the institution being notified of the review. Another region used off-site
program reviews to establish institutions’ liabilities to the Department. The remaining
two regions used off-site program reviews for a combination of purposes such as to establish a
liability; to take a school off of reimbursement; or to perform focused reviews, which examine
specific issues known by the case team prior to performing the review.

Inconsistencies in the selection and conduct of off-site reviews created the potential for
inaccurate reporting of data into FSA’s systems such as PEPS. Most of the off-site program
reviews reported by CMO were excess cash liability determinations made at the request of Direct
Loan Staff. Direct Loan excess cash reviews were limited reviews of Title IV fund drawdowns
and school expenditures for the purpose of establishing a liability to the Department for excess
cash that had been drawn down. However, these reviews were coded as off-site program
reviews. This inclusion of Direct Loan excess cash reviews as off-site reviews inflated the total
number of program reviews reported in PEPS. The information in PEPS did not accurately




reflect program reviews conducted by CMO. Table 2.1 below shows the number and amount of
Direct Loan excess cash reviews that were reported as off-site reviews.

Table 2.1 – Direct Loan Excess Cash Reviews Included in Off-Site Program Reviews

                                      2001                            2002
                          Direct Loan                     Direct Loan
                          Excess Cash      Reported       Excess Cash      Reported
                          Reviews          Reviews        Reviews          Reviews
 Reviews                          139           188               161           181
 Off-Site Liabilities      $5,182,259   $11,105,532        $7,433,266   $10,272,629

In early 2003, CMO stopped conducting Direct Loan excess cash reviews. At this point, the
Direct Loan Operations Section of FSA began assessing liabilities for excess cash.

We also noted differences in the weighting of factors considered for determining whether or not
a program review was warranted. We reviewed 234 files of institutions that were either case
managed, received technical assistance, or had an off-site Direct Loan excess cash review to
determine whether there was any information indicating that an on-site program review was
warranted. Factors that may have suggested that an on-site review was warranted included a
large number of findings or significant findings in prior audit and program review reports, large
liabilities assessed as a result of the audit or program review, and a long time period since the
last program review. We identified 10 institutions where an on-site program review was
warranted, and two institutions where technical assistance was warranted. For example, one
institution had a history of findings in six of its previous audit reports, several of which were
repeat findings, including refunds. An ED-OIG audit found that the school failed to
meet the 85/15 rule and recommended recovery of over $1 million. A program review had not
been conducted in the past 5 years. The regional office case-managed the institution and
determined that no further action was needed despite the problems reported. We found instances
in other regions where a similar compliance history triggered a program review.

The monitoring actions taken as a result of program review findings varied across the regions.
The actions taken as a result of program reviews with similar findings and liability assessments
varied both within and across regions. We identified instances where institutions within the
same region had a high number of program review findings and liability assessments; however,
one institution was placed on the reimbursement method of payment and another institution was
not. We identified similar inconsistencies across the regions visited. For example, we noted
instances where an institution within one region had few findings or liability assessments but
was placed on the reimbursement method of payment, while in another region an institution
fitting this same scenario was not.

The inconsistencies identified were caused by a lack of policies and procedures and of general
oversight by CMO-HQ. The Program Review Guide did not address all of the areas noted.
Off-site program reviews, the weight to assign potential issues in deciding whether to trigger a
program review, and the monitoring action to be taken in response to specific types of program
review findings were not addressed in sufficient detail to ensure consistency of monitoring and
enforcement within and across regions. CMO-HQ officials indicated that, because of regional
office staffs’ experience, expertise, and familiarity with institutions, they give regional directors
autonomy to do what they believe is needed to carry out the CMO mission. Although there may
be merit in this management philosophy, the inconsistencies we identified place the Department
at risk of failing to ensure compliance by institutions in a consistent and equitable manner.
Greater oversight on the part of CMO-HQ would provide the opportunity to identify best
practices and improve effectiveness.

Failure by CMO to identify noncompliance at an institution could result in additional problems
in the future, unidentified liabilities due to the Department, and potential harm to students
attending the institution. Furthermore, the overall weaknesses in the program review process
place the Department at risk of being inconsistent and inequitable in its monitoring of institutions
and its assessment of liabilities. They may also impair the Department’s ability to take action
against institutions in the future. Finally, the weaknesses in the data limited CMO management’s
ability to manage and prioritize monitoring efforts.

RECOMMENDATIONS
We recommend that the Chief Operating Officer for Federal Student Aid require CMO-HQ to:

2.1 	 Establish detailed policies and procedures over supervisory review of program reviews,
      record retention, off-site program reviews, specific items for making a program review or
      technical assistance the appropriate monitoring action, and the appropriate action to be
      taken as a result of a specific compliance issue identified at an institution.

2.2 	 Develop a quality control process to ensure regional compliance with the policies and
      procedures concerning the program review function and consistency across the regions in
      decisions pertaining to monitoring actions taken and enforcement actions in the event of
      noncompliance.

FSA RESPONSE AND OIG COMMENTS

FSA agreed that internal procedures should be strengthened and summarized the procedures that
are currently being enhanced. FSA agreed with Recommendations 2.1 and 2.2 and stated that
FSA will develop guidelines and procedures to address these issues. FSA will discuss its draft
action plan during the September 2004 managers’ meeting, which will include all Division
Directors, Area Case Directors, and Co-Team Leaders.

Regarding FSA’s disagreement with certain statements in the audit report, the information
provided in the written response was not sufficient to convince us to amend the finding and
recommendations. FSA’s specific response to the draft report and our comments are
summarized below.

FSA Response. FSA stated that if reviewers did not identify any fiscal findings, including
excess cash, there would be no documentation of the review. FSA agreed to clarify that the
fiscal review should be documented, whether there are findings or not. Regarding the reporting
process in general, FSA stated that institutional review specialists did not always report all
program review findings identified during the review because the current practice states that if a
finding is corrected while the reviewer is on-site at the school, or shortly thereafter, the finding
would not need to be included in the FPRD. The finding would not need to be included in the
FPRD if the reviewer concluded that the deficiency was inconsequential considering both the
qualitative and quantitative factors. It is standard procedure for reviewers to use their
professional judgment in determining whether the noncompliance issue identified has any
material significance in the administration of the Title IV program. FSA agreed to develop
procedures in this area to increase consistency of documentation.

OIG Comments. OIG disagrees with CMO’s policy of allowing reviewers the option of not
reporting all program review findings identified during the review if the findings are corrected
on-site, or shortly thereafter. This policy demonstrates a lack of management control that could
place institutional review specialists in a situation where they may be coerced by a school not
to report findings. It is OIG’s position that all program review findings, regardless of when
corrected, should be included in the program review report so that the problems identified
during the review are documented. Although FSA stated that a deficiency is not included as a finding if a
reviewer concludes it was inconsequential, we noted in our review significant issues such as
unmade refunds that were not recorded as findings. It is important that compliance problems be
documented so that independent public accountants, accrediting agencies, OIG, and other entities
will have a clear picture of the weaknesses identified with the administration of the Title IV
programs. In addition, it is impossible for FSA Headquarters to monitor consistency of regional
office operations regarding program review findings if the findings are not recorded.

FSA Response. FSA stated that the information for one school (RETS Tech Center) is
inaccurate because the review was a program review of dependency overrides. Since no guidance had been provided to
schools on this issue, all schools with this type of program review received a Special
Determination Letter rather than an FPRD. This school should not have been included as an
example of reviewers that did not report all findings.

OIG Comments. All findings identified during a review should be reported, regardless of the
type of report issued. All identified findings for the RETS Tech Center were not reported.
Findings were documented within the program review working papers, but not reported.

FSA Response. FSA disagreed that the CMO full file reviews for two schools were inadequate.
Since the CMO was successful in obtaining the return of $176,000 from one school (Trident
Technical College), FSA is unsure of the basis on which OIG claimed the file review was
inadequate. For another school (Victoria Beauty College), the program review report required
the school to reconstruct the fiscal records for the 2000-2001 and 2001-2002 award years and the
return of Title IV calculations for the period October 7, 2000, to the end date of the program
review (April 11, 2003). After the school’s initial response, it was given an opportunity to
provide additional explanation and/or documentation because the Title IV recalculations were
not acceptable. The school was afforded an opportunity to redo its calculations and provide the
reviewer with copies of the calculations as well as documentation. Based on the school’s
October 2003 response, the FPRD closed the review with no assessment of liabilities due to the
Department.



OIG Comments. CMO required Trident Technical College to conduct a full file review of
refunds due to the Title IV programs. According to the institutional review specialist, the
institution was required to submit a spreadsheet containing the results of the full file review.
CMO reviewed the accuracy of the formula used to total the amount of Title IV refunds due to
ED and lenders; however, there was no documentation to support CMO’s review of the accuracy
of the full file review.

We found no documentation in the Victoria Beauty College file to support the adequacy of
CMO’s review of the school’s full file review. The institutional review specialist explained that
the process for CMO’s review of a school’s full file review included reviewing the submission
from the school against the students identified in the full file review to determine if the school
had performed the calculations correctly. An additional sample would be checked for accuracy.
There was no documentation within the CMO institutional file to support that this procedure was
followed. There was also no documentation in the institutional file showing that the institution was
given an opportunity to provide additional explanation and/or documentation after its first
reconstruction was found unacceptable for failing to complete all steps of a correct recalculation. In
addition, there was no documentation in the file to support the October 2003 FPRD.

FSA Response. FSA stated that its record retention policy is contained in the Department’s
Records Disposition Schedules, and that it will inform the case teams and conduct training as
appropriate to ensure that records retention procedures are understood. FSA also stated that it
provided instructions on setting up and maintaining appropriate files. Regarding the OIG finding
of inconsistencies in the program review process across regions, FSA stated that off-site program
reviews are another method to ensure compliance in situations that do not require an on-site
presence; the method is productive and provides flexibility. FSA stated that it will enhance
procedures to ensure consistency, including guidance for more uniform decisions, and
monitoring and quality control checks.

OIG Comments. FSA did not provide complete details on its planned corrective actions, so our
recommendations are unchanged.

Finding No. 3 – Technical Assistance Was Not Adequately Documented or
Followed Up On
We identified problems with the documentation of technical assistance and a lack of follow-up
by regional offices on the results of technical assistance. We reviewed 40 instances where
technical assistance was provided and interviewed various members of the case teams in the four
regional offices visited. Two regions did not document technical assistance in the Case
Management Information System (CMIS) or the institutional file, and one region did not
document technical assistance in PEPS. Three regions performed informal follow-up and one
region performed no follow-up at all. In three regions, follow-up was not documented. This
occurred due to a failure to comply with existing policies and procedures and a lack of detailed
policies and procedures for some compliance areas. Failure to document and follow up on
technical assistance prevented CMO management from measuring the effectiveness of technical
assistance as a compliance tool.


The Institutional Improvement Specialist Guide for conducting technical assistance states that
technical assistance delivered by the institutional improvement specialist and the decision on
how to proceed with technical assistance will be documented in CMIS and/or the school file.
There was no CMO guidance requiring institutional review specialists (who usually conduct
program reviews) to document their technical assistance visits.

Weaknesses in Documentation of Technical Assistance
Institutional review specialists and institutional improvement specialists did not always
document technical assistance in CMIS, the institutional file, or PEPS. In two regions visited,
there was no documentation within CMIS or the institutional file of the technical assistance
provided by either the improvement specialist or the review specialist. Although there
was no requirement for improvement specialists and review specialists to input technical
assistance visits into PEPS, three regional offices visited documented technical assistance visits
in PEPS and one region did not.

CMO did not comply with existing policies and procedures for technical assistance, and did not
have formal policies and procedures for other aspects of the technical assistance process. Failure
to consistently document technical assistance within the institutional files, CMIS, and PEPS left
CMO management without accurate data on the amount and frequency of technical assistance
performed as a form of monitoring. Without accurate data, CMO management could not measure
the effectiveness of technical assistance as a compliance tool.

No Formal Follow-Up Procedures for Technical Assistance
CMO could not demonstrate that it consistently followed up on whether improvement had been
made at institutions receiving technical assistance visits. One of the four regions visited did not
have technical assistance follow-up procedures. The other three regions had informal procedures
to follow up on technical assistance; however, the follow-up results were documented in only one
of the regions.

In April 2002, the General Accounting Office (GAO) issued a Report to Congressional
Requesters entitled “FEDERAL STUDENT AID – Additional Management Improvements
Would Clarify Strategic Direction and Enhance Accountability” (GAO-02-255). GAO reported
that while FSA had developed strategies intended to improve schools’ regulatory compliance, it
was not clear how FSA would know whether its strategies were effective. In response to GAO’s
report, which recommended that FSA develop measures that better demonstrate whether its
technical assistance activities result in improved compliance among schools, CMO developed the
New School Initiative. This initiative called for technical assistance visits to new Title IV
institutions and a follow-up technical assistance visit one year later to evaluate the institution’s
understanding of the Title IV programs.

At the time of our review, regions had just begun to implement the New School Initiative. We
identified differences among the regions regarding how the initiative was being
implemented. In some regions, the institutional improvement specialist performed the majority
of the technical assistance being provided to new schools, while in some regions the case teams
provided it and in other regions both the institutional improvement specialist and the case team
performed it. CMO had not developed a formal follow-up initiative for schools already
participating in the Title IV programs. Failure to follow up on technical assistance places CMO
at a disadvantage because it does not know whether technical assistance resulted in improved
compliance by institutions.

RECOMMENDATION
We recommend that the Chief Operating Officer for Federal Student Aid require CMO-HQ to:

3.1   Develop and implement policies and procedures for
      • 	 providing technical assistance in a consistent manner across all regions,
      • 	 documenting the technical assistance provided,
      • 	 identifying when technical assistance ends and enforcement begins, and
      • 	 following up on technical assistance visits and measuring their effectiveness as a
          compliance/monitoring tool.

FSA RESPONSE

FSA did not agree with all aspects of the finding; however, FSA agreed with the
recommendation and issued new Management Improvement Services (technical assistance)
procedures in July 2004 to be effective August 1, 2004. The procedures cover selecting schools for
technical assistance, the use of corrective action plans, proper documentation, and follow-up.
Training was conducted on July 29, 2004, and a workgroup was formed to improve the data
collection on these services for effective analysis.

OIG COMMENTS

These procedures were not in effect during the course of our audit; therefore, we did not evaluate
their effectiveness. However, the new Management Improvement Services procedures should
aid FSA in improving its monitoring efforts, if they are fully implemented and consistently
followed.

Finding No. 4 – CMO-HQ Monitoring of Regional Office Operations Needs
Improvement
Our review of CMO-HQ procedures and processes for monitoring operations of regional offices
identified key management control areas that need improvement. The CMO-HQ did not
(1) monitor regional offices’ use of the IAM, (2) provide guidance to regional offices as to which
institutions to select for case management in the absence of an updated IAM risk list, (3) monitor
regional offices’ compliance with internal policies and procedures over program review and
technical assistance, (4) evaluate the effectiveness of program reviews conducted or the
consistency of regional offices’ selection of institutions for review, and (5) evaluate the
effectiveness or consistency of the enforcement actions taken as a result of regional office
reviews. These weaknesses occurred as a result of the level of autonomy given to each regional
office regarding monitoring decisions. This situation creates the risk of inconsistent treatment of
institutions across the country.

The Standards for Internal Control in the Federal Government, issued by the U.S. General
Accounting Office (GAO) in November 1999 (GAO/AIMD-00-21.3.1), defined internal control
as “An integral component of an organization’s management that provides reasonable assurance
that the following objectives are being achieved: effectiveness and efficiency of operations,
reliability of financial reporting, and compliance with applicable laws and regulations.” The
standards explain that internal control is a major part of managing an organization. It comprises
the plans, methods, and procedures used to meet missions, goals, and objectives. Internal
control, which is synonymous with management control, helps government program managers
achieve desired results through effective stewardship of public resources.

According to the standards, internal control should provide reasonable assurance that the
objectives of the agency are being achieved in the following categories: Effectiveness and
efficiency of operations including the use of the entity’s resources; reliability of financial
reporting, including reports on budget execution, financial statements, and other reports for
internal and external use; and compliance with applicable laws and regulations.

The standards explain that internal control is management control that is built into the entity as a
part of its infrastructure to help managers run the entity and achieve their aims on an ongoing
basis. One internal control standard is control activities. Internal control activities help ensure
that management’s directives are carried out. The control activities should be effective and
efficient in accomplishing the agency’s control objectives. Control activities are the policies,
procedures, techniques, and mechanisms that enforce management’s directives. An example of a
control activity is top level reviews of actual performance.

Another internal control standard is monitoring. Internal control monitoring should assess the
quality of performance over time and ensure that the findings of audits and other reviews are
promptly resolved. Internal control should generally be designed to assure that ongoing
monitoring occurs in the course of normal operations.

Institutional Assessment Model
The CMO-HQ did not monitor regional offices’ use of the IAM. Although regional offices were
required to risk-manage the highest-risk institutions identified by the IAM, CMO-HQ did not
follow up on actions taken by the regions regarding the institutions on the high-risk list. Regions
were not required to provide feedback regarding risk-management activities to the CMO-HQ.

According to CMO policy for 2003, regions were required to case manage the highest-risk
schools from the IAM list, those with a probability score of 80 percent or greater (approximately
600 schools). The CMO-HQ had not communicated to the regions how to meet the requirement to
case manage the top 600 high-risk schools in the absence of a new IAM risk list being
generated for 2004. The last IAM list was generated on November 11, 2002. At the time of our
review, regional offices had completed their requirement of case managing the top 600 schools
from the November 2002 list and were waiting for the new IAM list to be generated. According
to regional officials, limited information was communicated to them about when the next IAM
list would be generated or how they should proceed with their case management activities in the
absence of a new IAM list. Each region we visited had adopted its own methodology for
identifying schools for case management. As a result, CMO was vulnerable to inconsistent
treatment of institutions when selecting them for review.
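
To make the selection rule concrete, the January 2003 policy amounts to filtering the IAM risk
list to schools with a probability score of 80 percent or greater and case managing that group
(roughly 600 schools on the November 2002 list). The sketch below is purely illustrative; it is not
CMO's actual system, and the file layout and field names (school identifier and risk probability)
are assumptions made for the example.

    # Illustrative sketch only: selecting schools that meet the 80 percent IAM threshold.
    # The CSV layout and column names ("opeid", "school_name", "risk_probability") are
    # assumptions for this example, not actual PEPS or IAM field names.
    import csv

    THRESHOLD = 0.80  # 2003 CMO policy: case manage schools scoring 80 percent or greater

    def select_for_case_management(iam_list_path):
        """Return high-risk schools from an IAM risk list, highest probability first."""
        with open(iam_list_path, newline="") as f:
            schools = list(csv.DictReader(f))
        selected = [s for s in schools if float(s["risk_probability"]) >= THRESHOLD]
        selected.sort(key=lambda s: float(s["risk_probability"]), reverse=True)
        return selected

    if __name__ == "__main__":
        high_risk = select_for_case_management("iam_risk_list_nov2002.csv")
        print(len(high_risk), "schools meet the 80 percent threshold")  # about 600 in Nov. 2002

In the absence of an updated list, there is no input to such a filter, which is the gap the regional
offices filled with their own selection methodologies.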

Failure to Monitor Compliance with Internal Procedures
CMO policy required supporting documentation for excess cash reviews to be maintained.
CMO, as required by Department policy, developed a policy for the retention of records to
support program reviews. The policy requires program review files to be retained indefinitely
unless the institution has been terminated from Title IV programs. It was also CMO policy to
require institutional improvement specialists to document technical assistance visits in CMIS or
the institutional file. However, CMO-HQ did not monitor regional office compliance with these
policies.

Lack of Guidance Over Key Issues Pertaining to Monitoring
The CMO-HQ had not developed guidance on several key issues pertaining to the monitoring of
postsecondary institutions. CMO-HQ had not developed guidance for weighting the factors used
to select institutions for program review other than the IAM score, nor had it developed guidance
to ensure consistency in the type of enforcement action to be taken in response to program
review findings. CMO-HQ had also not developed a policy to monitor the effectiveness of
program reviews or technical assistance.

GAO’s Report on Federal Student Aid (GAO-02-255) stated that FSA’s draft fiscal year 2002
performance plan reflected increasing reliance on providing technical assistance to schools as a
way to ensure their compliance with financial aid rules and regulations. In the past, FSA relied
extensively on conducting on-site program reviews to assess schools’ compliance with rules and
regulations. GAO recommended that FSA develop measures that better demonstrate whether its
technical assistance activities result in improved compliance among schools.

Our visits to four CMO regional offices revealed that each regional office had autonomy over its
monitoring decisions. According to CMO-HQ officials, each regional office manager is trusted
to make the correct monitoring decisions. Since the majority of the staff in each regional office
has years of experience and institutional knowledge about FSA programs, CMO-HQ officials
believed the regional managers would do the “right thing.” CMO-HQ officials said it was important
that each region be given independence to monitor the institutions in its region in the manner that
the regional manager believes to be most appropriate. However, this monitoring philosophy can
create the potential for inconsistent treatment of institutions across the country. CMO-HQ had
no policies and procedures to ensure that the regional managers were doing the “right thing.”

An end of fieldwork briefing was held with CMO-HQ officials in February 2004. During this
meeting, CMO-HQ officials shared their current plan for the new electronic CMO (eCMO)
initiative. According to CMO-HQ officials, the eCMO initiative is grounded in the case
management process model and is focused primarily on updating tools and systems to help
support decision-making that is informed, effective, efficient, consistent, documented,
standardized and distributed. CMO-HQ officials provided information on how this new
initiative may address some of the concerns noted in this report; however, the officials were
unable to provide details such as an implementation date for the system. Failure to evaluate the
effectiveness of program reviews and technical assistance places CMO at a disadvantage because
it does not know whether its monitoring activities result in improved compliance by institutions.

RECOMMENDATIONS
We recommend that the Chief Operating Officer of Federal Student Aid require CMO-HQ to:

4.1 	 Implement management controls that provide for consistent treatment of institutions across
      regional offices.

4.2 	 Develop internal policies and procedures that provide for management oversight of CMO
      operations.

FSA RESPONSE AND OIG COMMENTS

FSA Response. FSA disagreed that each region had adopted its own methodology for
identifying schools for case management in the absence of an IAM list. FSA said the case teams
were managing schools as a result of recertification, audit resolution, financial analysis, program
review, and other trigger events. The case teams are not waiting for a new risk list from IAM.

OIG Comments. CMO issued a Performance Improvement Procedure covering the “Use of
IAM” on January 7, 2003, directing the regional offices to case manage those schools from the
IAM list with a probability score of 80 percent or greater (approximately 600 schools).
According to this procedure, case teams were also required to use the data from the IAM system
when case managing schools based on other information such as complaints, deficient audits,
flagged financial statements, recertification, and other trigger events. In the absence of an
updated IAM list, case teams could not carry out the requirements of this policy. Although we
found that the regional offices visited were case managing schools in FY 2004 as a result of
these trigger events, the case teams were unable to use data from the IAM in the case
management process. We found that the regional case teams had developed their own
methodology for selecting institutions in the absence of an up-to-date IAM risk assessment.

During interviews with CMO regional personnel, we were informed that they were waiting for a
new IAM risk list. We also learned that each region we visited received different information
regarding whether or not a new IAM list would be provided, when it would be provided, and
how to proceed with their directive of case managing schools with a probability score of
80 percent or greater.

FSA Response. FSA stated that during the audit period the case management teams had been
operating on outdated guidance for document retention. Because the file in Washington, DC,
was considered to be the “official” school file, regional offices were given the approval many
years ago to purge their records. FSA stated that when Electronic Records Management was
implemented, FSA received further guidance from the Office of the General Counsel that revised
the document retention procedures. This revision was shared with the case teams in September
2002 and is being formalized in the Electronic Records Management plan.


OIG Comments. While FSA’s response indicates action to correct the specific problem
identified, it does not address the underlying problem of the lack of management oversight and
monitoring that led to the record retention problem across the regions.

FSA Response. As a process improvement, FSA agreed to develop more guidance related to
selecting institutions for review, ensuring consistency in enforcement actions, and monitoring the
effectiveness of program reviews and technical assistance. FSA stated that the FY 2004
Compliance Initiative provides case teams with training in the identification, documentation, and
resolution of each of the reported areas of non-compliance; this also includes a specific process
approach to improve consistency in enforcement actions. This initiative also provides for
continuous feedback to closely monitor case team actions and gather results to inform future data
analysis initiatives and program monitoring opportunities.

OIG Comments. The 2004 Compliance Initiative does not provide all of the policies and
procedures we recommend. The 2004 Compliance Initiative is comprised of data mining and
related follow up procedures for certain specified issues, most of which are very limited in scope.
FSA did not provide information on how the compliance issues were identified and what made
those issues important for identifying institutions for program review and technical assistance.
The dollars identified at risk for the issues in the initiative ranged from $165,143 for 41
institutions to $47 million for 185 institutions. Of the 379 institutions identified in the 2004
Compliance Initiative, only 82 are identified as potentially having an on-site review as part of the
data mining verification. The 2004 Compliance Initiative does not include guidance on how to
address other compliance issues. CMO provided the Compliance Initiative Executive Summary,
dated August 6, 2004, as part of its response to the draft report. When we requested a copy of
the complete Compliance Initiative, we were informed that the document consisted only of the
Executive Summary. We were referred to training materials for the implementation of the
Compliance Initiative. When we requested the training materials we learned that they were still
in draft.

FSA Response. FSA stated that while it generally agrees that management controls and
procedures can be improved, FSA believes that it currently has an appropriate oversight and
monitoring process in place. FSA recognizes that, to achieve best-in-the-business oversight
strategies and desired outcomes, it must continually work to improve its processes. Therefore,
FSA is developing an action plan to identify and enhance appropriate procedures. This plan will
balance the identification of appropriate corrective actions, as allowed by regulation and
legislation, with ensuring program integrity and student access to educational opportunities.
The planned eCMO initiative is expected to further assist the case teams and
management to improve consistency in program oversight of schools. FSA will begin to gather
requirements for eCMO in FY 2005.

OIG Comments. We disagree with FSA’s belief that it currently has appropriate oversight and
monitoring processes in place. The exceptions cited in this report support our conclusion that a
plan for improving institutional monitoring is needed. The FSA response does not deal with the
management weaknesses noted in this finding. FSA needs to address CMO-HQ’s responsibility for
overseeing consistency among regional offices in the selection of institutions for program review
and technical assistance, in processes, and in results. Unless the management weaknesses
identified in this report are fully
addressed, we do not see how eCMO will improve consistency in program oversight of schools.




                                     BACKGROUND


CMO is an organizational component of the Office of Federal Student Aid’s Schools Eligibility
Channel. CMO-HQ is responsible for the oversight of operations in 10 regional offices
throughout the Nation. CMO-HQ is responsible for providing guidance to the regional offices
for, among other CMO actions, the conduct of program reviews and technical assistance. With
approximately 186 staff members, CMO’s function is to monitor postsecondary institutions’
compliance with statutory and regulatory requirements for participation in Title IV programs.
Monitoring activities include certifying FSA program eligibility for both new and established
participating schools, conducting on- and off-site program reviews at participating institutions,
analyzing financial statements, and providing technical assistance to institutions.

Case management is CMO's primary monitoring tool. It is a process where case team members
meet to discuss the following events and issues: compliance audit findings, financial statements
indicating a potential problem, program reviews, technical assistance, complaints, and
congressional inquiries. The case management process is started by a "trigger" event. The
trigger event may include the following: periodic recertification, financial statements indicating
a potential problem, liabilities identified through compliance audit reports, accreditation issues,
or a program review about to be conducted. Once an institution is identified through a trigger
event, the institution is assigned to a case team to be case managed. A case team member
examines the following areas: recertification, program review, program funding, IAM risk score,
audits, financial analysis, and reimbursement if applicable. Each member of the case team
researches his/her area of expertise and the team meets again to discuss the results of the
research. The purpose for the case team discussion of the events/issues is to determine whether
there is a need for the case team to take an action on an institution. The end result of the case
team examination may be an on-site program review, a limited scope off-site review, technical
assistance, or no action taken.

An on-site program review is an evaluation of an institution’s administration of Title IV
programs and generally encompasses the two most recent closed award years and the current
award year. On-site reviews generally take a week, but this timeframe may vary depending on
the scope of the review. Normally, an overall assessment review is chosen when the case
management team seeks a general evaluation of the school’s performance in meeting its
administrative and financial obligations relative to the FSA programs. However, when a
program review is needed to address specific issues known to the case management team, the
scope of the review will be narrowed to focus on those issues. This type of review is known as a
focused review. Although there was no formal definition of an off-site review and the regional
offices’ definition of it varied, such reviews generally served the purpose of reviewing an
institution where the case team already knew about a potential problem and needed to verify
whether or not the problem existed. For off-site reviews, the case team requests the institution to
forward specific documents to the case team for review. Technical assistance provided to
institutions may include telephone contacts, written guidance, specialized training for targeted
groups, and regional office assistance.



                 OBJECTIVE, SCOPE AND METHODOLOGY


Our audit objectives were to evaluate (1) CMO’s use of program reviews as a compliance tool,
(2) CMO’s use of technical assistance as a compliance tool, and (3) CMO-HQ management
controls over regional office monitoring of postsecondary institutions. Audit coverage included
CMO activities during the period August 2001 through May 2003.

To accomplish our audit objectives we:

    • 	 Interviewed CMO officials, Case Team Directors, and other regional staff.

    • 	 Analyzed and reviewed applicable laws and regulations, the most recent copies of the
        Program Review Guide and Institutional Review Specialist Guide, IAM contracts,
        planning documents for eCMO, and GAO audit reports related to FSA.

    • 	 Reviewed a random sample of on-site and off-site program reviews conducted between
        August 1, 2001, through November 3, 2002, and November 4, 2002, through May 5,
        2003. PEPS data was used to select a random sample of on-site and off-site program
        reviews. Our sample included 20 percent of all on-site program reviews and 20 percent
        of all off-site program reviews in each of the four regions visited (Atlanta, Chicago,
        Dallas, and San Francisco).2

                                Sample Sizes for Two Years Reviewed
                              On-Site Program Reviews      Off-Site Program Reviews
                             No. Reviewed      Universe     No. Reviewed     Universe
          Atlanta                 18              57             13             65
          Chicago                 14              63              6             25
          Dallas                  11              52              6             47
          San Francisco           12              35              3             63
          Total                   55             207             28            200

2
  There were a few exceptions to these sample sizes. In order to have adequate audit coverage, if the universe of
program reviews was small, we selected a larger percentage of files (from 25 percent to 50 percent) in order to have
a sample size large enough to form conclusions on our review. In every case, we reviewed a minimum of three files.

    • 	 Reviewed a random sample of on-site and off-site technical assistance cases conducted
        between August 1, 2001, through November 3, 2002, and November 4, 2002, through
        May 4, 2003. PEPS data was used to select a random sample of on-site and off-site
        technical assistance cases. Our sample included 20 percent of all on-site technical
        assistance cases and 20 percent of all off-site technical assistance cases in each of the
        four regions visited.3

                                Sample Sizes for Two Years Reviewed
                        On-Site Technical Assistance       Off-Site Technical Assistance
                         No. Reviewed       Universe        No. Reviewed       Universe
          Atlanta               8              27                  2                2
          Chicago               8              33                  7               13
          Dallas                4               7                  4               14
          San Francisco         8              19                  0                0
          Total                28              86                 13               29

    • 	 Reviewed a random sample of 20 school files from the IAM top 60/600 risk list (10 from
        the 2001 risk list and 10 from the 2002 risk list) that did not have a program review or
        technical assistance performed, but were case managed in each of the four regions
        visited.

    • 	 Reviewed a random sample of 10 schools that received direct loan excess cash reviews in
        each of the four regions visited.

To meet the objectives of our audit, we relied on computer-processed data in PEPS to identify
the universe of program reviews and technical assistance conducted. During our review of PEPS
data, we noted that the liabilities assessed as part of program reviews were not always the same
in PEPS as in the program review FPRD letters and that technical assistance visits made by one
region were not entered into PEPS. For the region that did not enter technical assistance visits
into PEPS, we obtained a list of technical assistance visits at the regional office from which to
select technical assistance for review. Overall, the PEPS data that we reviewed was determined
to be accurate. Therefore, we determined that the PEPS data was sufficiently reliable for use in
meeting the audit objectives with the exception of program review liabilities.

We examined program review (on-site and off-site) reports and supporting documentation,
available case management documentation, and available technical assistance documentation for
work conducted by CMO during the audit period. We visited the CMO-HQ in Washington, DC,
and the CMO regional offices in Atlanta, GA; Chicago, IL; Dallas, TX; and San Francisco, CA.
Audit work was performed during the period June through December 2003. We held an exit
conference with CMO-HQ officials on June 28, 2004. Our audit was conducted in accordance
with generally accepted government auditing standards appropriate to the scope of the review
described above.




3
  There were a few exceptions to these sample sizes. In order to have adequate audit coverage, if the universe of
technical assistance cases was small, we selected a larger percentage of files (from 25 percent to 100 percent) in
order to have a sample size large enough to form conclusions on our review. In every case, we reviewed a
minimum of three files.
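
The sample-selection rule described above (20 percent of each universe, a larger percentage for
small universes, and a minimum of three files) can be sketched as follows. This is a hypothetical
illustration of the stated rule, not the audit team's actual selection tool; the function and variable
names are assumptions.

    # Illustrative sketch of the stated sampling rule: 20 percent of each region's universe,
    # with at least three files reviewed. Function and variable names are assumptions.
    import math
    import random

    def sample_size(universe, base_rate=0.20, minimum=3):
        """Number of files to review for one universe of program reviews or TA cases."""
        if universe <= 0:
            return 0
        n = math.ceil(universe * base_rate)
        # The report describes raising the percentage (up to 50 or 100 percent) when the
        # universe is small; enforcing the three-file minimum captures the same intent here.
        return min(universe, max(n, minimum))

    def select_files(file_ids, seed=2003):
        """Randomly select file identifiers to review from one region's universe."""
        rng = random.Random(seed)
        return rng.sample(list(file_ids), sample_size(len(file_ids)))

    # Example: a universe of 57 on-site reviews yields ceil(57 * 0.20) = 12 files;
    # a universe of 8 falls back to the three-file minimum.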


            STATEMENT ON MANAGEMENT CONTROLS 


As part of our audit, we assessed the system of management controls, policies, procedures, and
practices applicable to CMO’s monitoring of postsecondary institutions. For the purposes of this
report, we assessed and classified significant controls into the following categories: (1)
completeness and accuracy of the IAM, (2) the program review process, (3) the technical
assistance process, and (4) CMO-HQ monitoring of regional office operations. Due to inherent
limitations, an evaluation made for the limited purposes described above would not necessarily
disclose all material weaknesses in the management controls. Our overall assessment disclosed
management control weaknesses in each of the control areas mentioned above. These
weaknesses are discussed in the AUDIT RESULTS section of this report.




APPENDIX A – CMO Regional Office Exceptions


                                              Regional Office      Regional Office     Regional Office   Regional Office
               Findings                             A                    B                   C                 D            Total
#1 The IAM Is An Ineffective Tool for Identifying At-Risk Institutions

Liabilities reported in FPRD did not               2 of 8                3 of 14            6 of 11           0 of 5       11 of 38
match those reported in PEPS
IAM scores did not reflect problems at            19 of 40               18 of 36           18 of 41         19 of 38      74 of 155
institutions

#2 The Program Review Process Needs Improvement

Reviewers did not always report all               7 of 11                2 of 14            6 of 11          0 of 11       15 of 47
findings in program review reports
Inadequate review of institutions’ full            1 of 7                 0 of 7             4 of 7           0 of 6        5 of 27
file review
Institution case managed, but should              2 of 20                1 of 20            4 of 20          0 of 20        7 of 80
have received program review (or
technical assistance)
Institution received technical assistance,        1 of 10                0 of 15             0 of 8           0 of 8        1 of 41
but should have received program
review
Institution received direct loan excess           0 of 11                0 of 10            4 of 10          0 of 10        4 of 41
cash off-site review, but should have
received on-site program review

#3 Technical Assistance Not Adequately Documented or Followed Up On

Technical assistance not documented in             2 of 9                0 of 15             6 of 8           0 of 8        8 of 40
CMIS or institutional file
Technical assistance follow up                    Informal               Informal           Informal          None
procedures




APPENDIX B – Written Response to the Draft Report




                                                      FEDERAL
                                                      STUDENT AID

                                          CHIEF OPERATING OFFICER


      TO:                   J. Wayne Bynum                                           AUG 20 2004
                            Regional Inspector General for Audit
                            Office of Inspector General

      FROM:                 Theresa S. Shaw
                            Chief Operating Officer

      SUBJECT:              Case Management and Oversight's Monitoring of Postsecondary
                            Institutions, Dated July 7, 2004
                            Control Number ED-OIG/A04-D0014
                            Draft Audit Report

      Thank you for providing us with an opportunity to respond to the Office of Inspector General's
      (OIG) Draft Audit Report, Case Management and Oversight's Monitoring of Postsecondary
      Institutions, Control Number ED-OIG/A04-D0014, dated July 7, 2004. The draft report states
      that your audit found the following weaknesses: 1) the Institutional Assessment Model is an
      ineffective tool for identifying "at-risk" institutions, 2) the program review process needs
      strengthening, 3) technical assistance was not adequately documented or followed up on, and 4)
      CMO-HQ monitoring of regional office operations needs improvement.

      In the attachment, we are providing a response to each finding and recommendation. In general,
      we believe that many of the findings are overstated, including the finding on the purpose and
      design of the Institutional Assessment Model (IAM). We take issue with the statement that the
      weaknesses identified with IAM may prevent Case Management and Oversight (CMO) from
      effectively prioritizing case management efforts. The School Eligibility Channel (SEC) conducts
      the oversight activities required by legislation and regulation to identify at-risk institutions,
      including reviews of annual audits and financial statements, calculations of default rates,
      eligibility reviews and program reviews. In addition, SEC conducts technical assistance visits to
      help schools prevent problems. SEC analyzes data to proactively identify schools that may need
      intervention. All of these activities are in addition to the risk probability information being
      provided by the IAM system.

      Additionally, the IAM was not designed to solely prioritize case management efforts in selecting
      schools for on-site and off-site program reviews and technical assistance. IAM is a tool to
      identify schools with a probability of risk for case management. Case management (as described
      above) is an extensive review of a school, given the school's individual circumstances. This case
      management review determines appropriate oversight actions that may include an on-site or
      off-site program review.


                                   830 First Street, NE, Washington, D.C. 20202
                                                  1-800-4-FED-AID
                                               www.studentaid.ed.gov

      Although we believe we have appropriate review processes in place, we do agree that our
      internal procedures and management controls can be strengthened in the areas identified. FSA
      will review and revise our procedures as necessary, and we will provide training to the case
      teams on the new procedures.

      Thank you again for the opportunity to respond to your concerns.

      Attachment


      cc: Jack Martin
          Charles Miller
          Pat Howard

       Finding No. 1 – The Institutional Assessment Model Is An Ineffective Tool for
       Identifying "At-Risk" Institutions

       The IG found that the IAM may prevent CMO from effectively prioritizing case management
       efforts. The IAM did not contain complete and accurate information, and the IAM risk scores
       did not always accurately identify problematic institutions. This occurred due to a lack of
       policies, procedures and management controls around the information used in the IAM and the
       lack of evaluation of the effectiveness of the IAM. By maintaining a risk system that does not
       accurately identify the most at-risk institutions, CMO may be making ineffective decisions about
       the best use of its resources and ineffectively prioritizing its case management efforts.

       OVERALL RESPONSE:
       The IAM is only one tool used by the SEC to identify institutions with a probability of risk. SEC
       conducts the oversight activities required by legislation and regulation to identify at-risk
       institutions. These include reviews of annual audits and financial statements, calculation of
       default rates, eligibility reviews and program reviews. In addition, SEC provides technical
       assistance to help schools prevent problems. SEC analyzes data to proactively identify schools
       that may need intervention. All of these activities are in addition to the risk probability
       information being provided by the IAM system. We do agree that the IAM system can be
       enhanced and have identified the requirement for a new model as part of the development of the
       Integrated Partner Management System.

       SEC disagrees with the statement that "by maintaining a risk system that does not accurately
       identify the most at-risk institutions, CMO may not be making effective decisions about the best
       use of its limited resources." The current model was designed to identify schools with four
       conditions of risk: the presence of surety (letter of credit), the presence of a fine greater than
       $10,000, the condition of being on reimbursement, or the condition of having a liability from
       audits or program reviews greater than $10,000. The model is a good predictor of institutions
       likely to have these four conditions. These predictions are based on a type of regression analysis
       that starts with identifying those schools that have the condition or problem, and then looks for
       variables that contribute to the school having that condition. The model was never designed to
       identify schools with audit findings. In fact, SEC relies on the annual audit to review audit
       findings. (See Appendix 1, pages 15 - 16 for a more detailed discussion on the predictability of
       the model.)

       The IAM process is statistical and does contain the possibilities of false positives (schools
       incorrectly predicted to have problems) and false negatives (schools incorrectly predicted to not
       have problems). SEC took a risk-averse, conservative approach in the first release of IAM,
       erring on the side of caution by including more false positives, which resulted in more schools to
       review that may not have any problems.

       We believe SEC is making effective use of its resources and prioritizing its oversight activities
       because it uses the oversight tools listed above as required by legislation and regulation. These
       tools allow SEC to make a holistic judgment on schools based on data from all relevant sources,
       including the IAM system.

       The IG's analysis showed that approximately 525 of the 6,371 institutions (8.2 percent) that
       participated in the Title IV programs were not assigned an IAM score in the July 2001 (FY 2002)
       risk assessment. In addition, 500 of the 6,371 institutions (7.8 percent) that participated in the
       Title IV programs were not assigned an IAM score in the November 2002 (FY 2003) risk
       assessment. Of this 500, 424 were the same institutions that did not receive a score in the July
       2001 risk assessment.

       RESPONSE:

       FSA identified a total of 584 unduplicated schools on the combined 2002 and 2003 lists. The
       majority of the schools that the IG identified as not having a risk score should not (based on the
       model design) have had a risk score for the following reasons:

       221     Closed Schools
        58     Merged/Consolidated Schools
        31     Loss of Title IV Eligibility
        13     Loss of State Accreditation/Authorization or Voluntary Withdrawal
         3     Not Eligible and/or not Certified
         2     Funding Office Only
       162     Initial Date 2000 and Later (insufficient data to support calculation, new school)

       490     TOTAL

       Please see Appendix 2 for a detailed list of these schools. For those schools that have merged or
       consolidated, we have attached PEPS screen shots showing the "new" school and the risk score
       for that school.

       There were 94 remaining schools that had an "Initial Date of 1999 and Prior" with no risk score.
       Of the 94, 68 were inadvertently dropped from the FY 2002 list because they were missing a
       team code, an error that was detected by the Oak Ridge National Laboratory (ORNL) and
       corrected for the FY 2003 list (See Appendix 3). The fact that this problem was identified and
       corrected is evidence that SEC does perform analysis and checks on the IAM data and system.
       However, 12 of the 68 schools should not have had a score because they were too new or there
       was insufficient data to calculate a score. There were a total of 56 schools with a valid missing
       score in FY 2002. These were corrected, and all received a score in FY 2003. Please note in FY
       2003, only five of these schools had a probability greater than 80 percent, and thus made the top
       600 list.

       Fifteen of the remaining 26 schools (94 - 68) should not have had a score. These schools should
       not have had a score because of insufficient data due to entering the Title IV program too late for
       the risk list as a result of reinstatements, closings and reopenings, late 1999 new certifications,
       etc. Appendix 4 provides a few of these examples. Of this subgroup, there were a total of 11
       schools with a valid missing score.

       Therefore, the combined total number of schools with a valid missing risk score for either FY
       2002 or 2003 is 67 (56 + 11), which is one percent of the universe of schools identified by the
       IG.

       Information Submitted By the Department to Oak Ridge National Laboratory (ORNL)
       Was Incomplete

       A. ORNL used data provided from PEPS to assign an IAM risk score to institutions
       participating in the Title IV programs. Some of the PEPS data did not accurately reflect an
       individual institution's financial responsibility, a factor used in calculating the IAM score. A
       financial responsibility composite score indicating financial responsibility was not calculated for
       public institutions because such institutions are backed by the full faith and credit of the State.
       Therefore, the IAM scores assigned to public institutions did not reflect financial responsibility.

       RESPONSE:

       A. We do not agree that the financial responsibility of public schools is not reflected in IAM
       because the public schools do not get an individual financial responsibility composite score as a
       result of reviewing their financial statements. The IAM indicator for all schools (including
       public schools) is the presence or absence of a flagged financial statement, or a missing financial
       statement. For example, if a school submits a financial statement, and the financial statement is
       not "flagged" for any reason (a failing financial responsibility composite score or other reason as
       listed below), the school gets an "OK" IAM indicator for financial responsibility. Conversely, if
       a school fails to submit a financial statement, it gets a "not OK" IAM indicator. Also, if a school
       submits a financial statement and the statement is "flagged" for any reason, the school gets a
       "not OK" IAM indicator.

       In FY 2002, out of a total of 1,926 domestic public schools, there were 217 schools with
       financial statements flagged for failing to meet certain conditions, as shown below:

               Audit Opinion                     129
               ED Compliance Issues                4
               Contingent Liabilities              1
               Going Concern                       1
               Debt Agreement Violation            0
               Change in Auditor                  71
               Other Issues                        7
               Deferred Income/Income
                Recognition Problem                3
               Late Refunds                        1
                             TOTAL               217

       Whenever a financial statement fails one of these conditions, SEC sets a flag in the system for
       the school. If there are ten schools covered by the financial statement, all ten schools get the
       same flag(s) set. This is appropriate for all the conditions listed above, except ED compliance

       issues and late refunds. These two conditions should only apply to the individual school that has
       the condition, not to all the schools in the group. However, the SEC computer system sets the
       flag for all the schools in the group for all the reasons. This data is then transferred to ORNL for
       inclusion in the IAM system.

       There were only five schools out of the total of 1,926 (.3 percent) public schools that had flags
       for compliance issues and/or late refunds. We do not believe this condition is a major impact on
       either an individual institution's financial responsibility assessment or its subsequent IAM score.

       B. In addition, information in PEPS relating to the total amount of liabilities assessed as a result
       of noncompliance in a program review was not always correct. Of the 40 report files reviewed
       for which a Final Program Review Determination Letter (FPRD) had been issued, we found 11
       differences between liability amounts in PEPS and the FPRD letters. For these 11 files, the
       FPRD letters showed liabilities totaling $816,805 while PEPS showed liabilities totaling
       $182,900.

       RESPONSE:

       B. The data for the IG audit was extracted from PEPS on June 26, 2003. Of the 11 cases
       identified by the IG where there appeared to be differences between the PEPS data and the
       FPRDs, SEC found that in four cases the differences were caused by data entry timing delays.
       In addition, in two cases the FPRD was not issued until September 2003 and October 2003,
       respectively, which was after the data was extracted from PEPS. In one case, the data entry was
       conducted on June 26, 2003 (Chicago School of Massage Therapy), the same day as the data was
       extracted. However, none of these differences affected the IAM scores, because liabilities are
       not an individual school input into the model. The model identifies schools with a probability of
       having liabilities greater than $10,000, an output. A delay in data entry of program review
       information, while not desirable, is not sufficient to support the IG's claim that information
       submitted to IAM is incomplete.


       School                    Liability       Reason for Difference

       Art Institute             $4,000          Data entry timing delay.
       Trident Technical         $176,737        FPRD was issued on 9-6-2002 and data was
       College                                   entered 9-6-2002. Data entry correction in
                                                 PEPS 7-8-03.
       Scot Lewis                $11,904         $6,017.64 due lenders; $935 not due as
                                                 under $1,000 limit; $4,890.90 school cash
                                                 return to Federal Account at Institution.
                                                 Deficiency corrected by school prior to
                                                 FPRD being issued.
       *MDTI Business            $413,485        Data entry timing delay. $163,529.29 due to
                                                 ED; $202,643.36 due to lenders; $47,312.36

APPENDIX B – Written Response to the Draft Report




                               ~,~

                               revised 10




       For Scot Lewis, the difference between the PEPS :scm:n and the fPRD was a rt:SU lt of the Case
       TcllllU following the pr1Idiee of not reporting deficiencies that had hecn corrected by tnc schoo l
       either on-site or IOhortly thcrcaf\a. Therefore il i! 1I()1 an issue ofthc PEPS data being inc;omp1ctc
       becaust the Cas>e Teams were following current practKc. We will darify the procedure
       n:;giIrdi.og !he approprilUC reponing of do::;flCicncia in the program review repons in !he FPRDs.
       and   EDLs. and in PEPS.

       All Dfttle dala for the 11 schools has sino: been entered or corrected in PEPS. Please _      the
       aLw::Ded screen soots from PEPS for these ochool~. (See Appendix S.)

C. Information pertaining to program review findings in PEPS was also sometimes inaccurate. Our review of a sample of program review report files in the regional offices visited revealed that the CMO institutional review specialists did not always report all program review findings identified during the review. This resulted in incomplete information pertaining to program review findings being used to develop risk scores.

RESPONSE:

C. The current practice allows correction of findings on-site, or shortly thereafter, and the finding is not included in the FPRD. Program review findings are not included in the development of the risk score. However, as noted above, we agree that we need to develop procedures in this area to increase consistency of documentation.





IAM Scores Did Not Always Predict Problematic Institutions

As part of its institutional file review, IG compared the relationship between the IAM risk score and the findings of noncompliance at institutions identified in audit reports, program reviews, and other documentation in the files indicating possible noncompliance. In 74 of the 155 school files reviewed (48 percent), there was no apparent relationship between the IAM score and the findings of noncompliance identified at the institutions. For these 74 files, 58 institutions had a high IAM score with a low level of evidence supporting noncompliance issues, and 16 institutions had a low IAM score with a high level of evidence supporting noncompliance issues.

IG found no policies, procedures or management controls in place to evaluate the effectiveness of the IAM or to ensure that the data provided to ORNL to identify the most at-risk institutions was complete, accurate and applicable to the institutions being evaluated. By maintaining a risk assessment that does not accurately identify the most at-risk institutions, CMO may be making ineffective decisions about the best use of its resources and incorrectly prioritizing its case management efforts.

RESPONSE:

We believe this finding is inaccurate for three reasons. The first is that IG used a different definition of noncompliance from the IAM definition to reach their conclusion. The IG used audit and program review findings to define noncompliance. The IAM score focuses on predicting a very specific set of four outcomes (surety, reimbursement, fines over $10k, and liabilities over $10k) to develop a quantifiable model. As referenced earlier from the Probit Paper, we believe the resulting model is a good predictor for these four outcomes.
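
As an illustration of the probit approach described above, the sketch below combines school-level inputs into a linear index and passes it through the standard normal CDF to produce a probability of an adverse outcome, such as liabilities over $10,000; this is also why liabilities are an output of the model rather than an input. The input names, coefficients, and the use of Python with scipy are illustrative assumptions only, not the ORNL/IAM specification.

    # Minimal sketch of a probit-style risk score. Coefficients and input
    # variables are hypothetical, not the actual ORNL/IAM model.
    from scipy.stats import norm

    # Hypothetical fitted coefficients: an intercept plus one weight per input.
    COEFFICIENTS = {
        "intercept": -1.8,
        "funding_change_pct": 1.2,    # large funding swings raise predicted risk
        "cohort_default_rate": 2.5,
        "prior_audit_findings": 0.6,
    }

    def probit_risk_score(school: dict) -> float:
        """Return P(adverse outcome), e.g. liabilities over $10,000: Phi(x'beta)."""
        index = COEFFICIENTS["intercept"]
        for name, beta in COEFFICIENTS.items():
            if name != "intercept":
                index += beta * school.get(name, 0.0)
        return float(norm.cdf(index))

    # Example: a new school with a large funding change but few other indicators.
    example = {"funding_change_pct": 0.9, "cohort_default_rate": 0.1, "prior_audit_findings": 0}
    print(f"Predicted probability of liabilities over $10k: {probit_risk_score(example):.2%}")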

The second is that IG used the "Peer Group" probability score in many instances, instead of using the national score for comparing noncompliance. SEC switched to using the national probability score for the FY 2003 list to assure that the schools with the highest probability nationwide received scrutiny. We continue to calculate both scores. The national probability score ranks all schools against each other, and the peer group probability score ranks only those schools in a similar peer group to each other. These two scoring methods need to be separated for analysis because the meaning of the two measures is quite different. For example, a school may have a low national probability score, but may have a high peer group probability score relative to the other schools in that peer group. Because of this difference, the IAM score SEC used in FY 2003 is actually 21.70 percentage points lower than the peer group probability score reported by IG. As a result, IG misstated the IAM score in 32 of the 74 schools cited (43 percent). See Appendix 6 for a comparison of the risk scores using the national vs. the peer group score.
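
The difference between the two measures can be illustrated with a small sketch that ranks the same probability scores two ways: against all schools nationally, and only within each school's peer group. The schools, peer groups, and probabilities below are invented for illustration; the actual IAM peer group definitions are not reproduced here.

    # Sketch contrasting a national ranking with a peer-group ranking of the
    # same probability scores. Schools, groups, and scores are hypothetical.
    from collections import defaultdict

    schools = [
        # (name, peer_group, predicted probability of an adverse outcome)
        ("School A", "proprietary", 0.15),
        ("School B", "proprietary", 0.05),
        ("School C", "public 4-year", 0.40),
        ("School D", "public 4-year", 0.35),
    ]

    def percentile_rank(score, pool):
        """Fraction of the pool at or below this score (higher = riskier within the pool)."""
        return sum(s <= score for s in pool) / len(pool)

    national_pool = [p for _, _, p in schools]
    peer_pools = defaultdict(list)
    for _, group, p in schools:
        peer_pools[group].append(p)

    for name, group, p in schools:
        national = percentile_rank(p, national_pool)
        peer = percentile_rank(p, peer_pools[group])
        print(f"{name}: national rank {national:.0%}, peer-group rank {peer:.0%}")

    # School A lands mid-pack nationally but at the top of its peer group,
    # the kind of divergence between the two measures described above.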

The third is that IG scrutinized IAM as a distinct, independent application, not as an integrated tool that is inherently aligned with the case management approach. The IAM process is statistical and does contain the possibility of false positives (schools incorrectly predicted to have problems) and false negatives (schools incorrectly predicted to not have problems).







SEC therefore took a risk-averse, conservative approach in the first release of IAM, erring on the side of caution by including more false positives, which resulted in more schools to review that may not have any problems.

The IAM methodology of using the highest risk indicator (i.e., the peer score) was initially developed with the view that Case Managers had access to a wide variety of tools to make common sense decisions on anomalous "false positives" or at least quickly rule them out. A good example of this approach is the California State University at Monterey Institute (OPEID 03260300) that the IG identified because it has a high probability score (100 percent) and presumably no risk. This was a new school, and the high probability score came primarily from a large change in funding as the school expanded. When the FY 2002 risk list was first "published" for Case Team use, SEC managers discussed this particular school. SEC decided that even though a school might otherwise be low risk, such a change in funding should be brought to the attention of Case Managers so that they can decide if the issue needs to be pursued.

To review the false negatives, we took a sample of the 16 schools identified by the IG that had a low risk score and a high level of evidence supporting noncompliance issues (as determined by the IG's definition of noncompliance) and found that each one of the four schools in our sample was reviewed as part of the standard case management processes. Although we are not certain what criteria the IG was using to make a judgment that these schools had "noncompliance issues," our sample indicated we had appropriately case managed these schools. The resulting actions occurred due to other controls that Case Management Teams used, such as annual provisional certification reviews, annual audit reviews, program reviews from high default rate or referrals, etc. See Appendix 7 for the results of the sample.

ORNL conducted an analysis of the previous penalty point risk model compared to the proposed Probit model (The Probit Measure of Schools at Risk, November 2000). See Appendix 1. This analysis evaluated the effectiveness of the two models and was the basis for adopting the Probit Model, which is the current model. In addition, ORNL has received a variety of input from Case Team staff to evaluate for possible changes to the IAM system. "CM&O Risk Management Assessment and Enhancement Project: User Questions and IAM Information Requests" and examples of comments, etc. are included in Appendix 8.

Appendix 9 contains all of the procedures that ORNL uses to produce the risk list each year, "Guide to Processing Data for the Department of Education Case Management and Oversight Data Library." ORNL uses a thorough step-by-step quality procedure for processing data, including cleaning, scrubbing and checking data from FSA. Page 11 of Appendix 9 lists five standard quality assurance checks, such as missing values, duplicate OPEIDs, etc. Step-by-step instructions begin on page 15. One of the final steps of the edit process is a data file load of the latest Eligibility Table. Schools that have closed or are no longer eligible would not be included on this table. SEC staff participates with ORNL staff to assure that the process is followed and questions are answered.








ORNL continues to use this Quality Program, and each year they have improved the extract-transform-load process by automating data transformations, by migrating from an Access database to FoxPro, and by utilizing better software such as SAS to perform their analysis.
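
The kinds of checks described above (missing values, duplicate OPEIDs, and a final load against the latest Eligibility Table) can be sketched roughly as follows. The field names, sample records, and eligibility set are illustrative assumptions, not the actual ORNL procedure.

    # Illustrative sketch of routine data quality checks of the type described:
    # missing values, duplicate OPEIDs, and filtering against an eligibility
    # table. Field names and records are hypothetical.
    from collections import Counter

    records = [
        {"opeid": "00123400", "name": "School A", "funding": 1_200_000},
        {"opeid": "00123400", "name": "School A (duplicate)", "funding": 1_200_000},
        {"opeid": "00567800", "name": "School B", "funding": None},
    ]
    eligible_opeids = {"00123400"}  # stand-in for the latest Eligibility Table

    def quality_checks(rows):
        problems = []
        # Check 1: required fields with missing values.
        for row in rows:
            missing = [field for field, value in row.items() if value is None]
            if missing:
                problems.append(f"OPEID {row['opeid']}: missing {missing}")
        # Check 2: duplicate OPEIDs.
        counts = Counter(row["opeid"] for row in rows)
        problems += [f"duplicate OPEID {o} ({n} rows)" for o, n in counts.items() if n > 1]
        return problems

    def load_eligible(rows):
        # Final edit step: keep only schools still on the eligibility table.
        return [row for row in rows if row["opeid"] in eligible_opeids]

    print(quality_checks(records))
    print([row["name"] for row in load_eligible(records)])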

RECOMMENDATIONS

We recommend that the Chief Operating Officer for Federal Student Aid require CMO-HQ to:

1.1   Develop and implement management controls to ensure that the data used to identify the most at-risk institutions are complete, accurate, and applicable to the institutions being evaluated.

RESPONSE:

We agree that we will develop and implement a process to validate critical data in PEPS. FSA believes that the data needs to be complete and accurate regardless of the "system" used to determine the probability of risk.


1.2   Develop a methodology for evaluating the effectiveness of any risk assessment model used to identify the most at-risk institutions for potential loss of Title IV funds.

RESPONSE:

We agree that we will evaluate the effectiveness of the FY 2004 Compliance Initiative. FSA is currently using the FY 2004 Compliance Initiative, not solely IAM, to identify schools with a potential for noncompliance in identified areas.


1.3   In the absence of an effective risk model, provide guidance to the regional case management teams for identifying institutions for program review and technical assistance.

RESPONSE:

One of the basic requirements and a continuing function of the case management process, after performing a comprehensive review of all functional area information, is for the teams to recommend appropriate next steps. This includes making recommendations to perform program reviews, refer for administrative action, or provide technical assistance. However, we agree that as additional data analysis is performed that identifies additional data outliers, we will provide these "potential" risk issues and guidance for resolution to the Case Teams. In fact, we performed analyses and identified several schools to be worked by the Case Teams as their "risk" list in the FY 2004 Compliance Initiative.








The training of trainers for this current compliance initiative took place on July 27-28, 2004. See Appendix 10 for an Executive Summary of the FY 2004 Compliance Initiative. Management Improvement Services (technical assistance) procedures were issued in July with an effective date of August 1, 2004. Training was conducted July 29, 2004.


Finding No. 2 - The Program Review Process Needs Strengthening

Management controls over CMO's program review process need strengthening. IG's review of program review report files and interviews of case team members in four regional offices identified weaknesses in the program review process, reporting process, record retention, and consistency in the program review process across regions. This occurred due to a failure to comply with existing policies and procedures and a lack of detailed policies and procedures for some compliance areas.

OVERALL RESPONSE:

We generally agree that internal procedures should be strengthened, and this response summarizes those procedures we are enhancing currently. The Program Review Guide and the expertise and knowledge of our program reviewers provide a good basis for identifying and reporting instances of noncompliance. Whenever an issue of noncompliance was identified, the harm or potential harm to the programs was remedied. We do not believe that our current procedures requiring schools to correct deficiencies and come into compliance, which is the overall goal of monitoring, lead to inequitable treatment of schools.

An example of the strength of our program review process is the fact that program review findings are sustained during the appeal process. When dollar amounts are reduced, it is usually because the school was able to provide additional documentation that was not available at the time the Case Team reached its conclusion. The finding itself is not changed. This sustainability demonstrates that the program review process is successful in identifying and correcting noncompliance.

Excess Cash Review

The IG was unable to determine whether or not Institutional Review Specialists adequately reviewed excess cash as part of the program review. Determining whether or not an institution is maintaining excess cash is part of the fiscal review to be performed during the program review. The Program Review Guide outlines the procedures for performing a fiscal review of institutional records to determine noncompliance with cash management regulations, and requires that documentation be maintained to support this review. The IG did not find sufficient documentation to support the conclusions reached in the excess cash reviews performed.








RESPONSE:

According to the Program Review Guide, documentation is needed to verify a finding. If reviewers did not identify any fiscal findings, then there would not be any documentation. We agree to clarify that the fiscal review should be documented, whether there are findings or not. In addition, it is possible that if a focused program review is conducted, and the focus is something other than fiscal, such as campus crime, there would be no fiscal review.

Reporting Process

Institutional Review Specialists did not always report all findings in program review reports. In two regions visited, although a finding was identified, the reviewers did not cite the correct number of student file review exceptions for the finding. Instead, the reviewers cited a few examples of the student files containing the problem. In addition, the reviewers sometimes resolved findings while on-site and neither reported the problem nor the number of student file review exceptions in the program review report.

RESPONSE:

As mentioned in Finding 1, the current practice states that if a finding is corrected on-site at the school, or shortly thereafter, the finding would not need to be included in the FPRD, if the reviewer concluded that the deficiency was inconsequential considering both qualitative and quantitative factors. It has been standard procedure followed by reviewers to use their professional judgment in determining whether the noncompliance issue identified has any material significance in the administration of the Title IV program. We do recognize the importance of documenting these procedures and will include this issue in our revised procedures.

Some reviewers make notes on the worksheet (space for comments is provided) if they have questions about something they see in the files or written in a catalog, but this does not necessarily mean there is a finding. These notes remind the reviewer to verify the questions with school officials or through other resources. If the institution can provide an explanation or correct the problem, that question does not automatically become a finding.

The finding on RETS Tech Center is inaccurate. This was a program review to examine dependency overrides. Because no guidance had been provided to schools on this issue, all schools with this type of program review received a Special Determination Letter rather than an FPRD. RETS Tech Center should not have been included in "reviewers did not report all findings."

Adequacy of Full File Review Results

Institutional Review Specialists did not adequately review the results of institutional full file reviews performed in response to program review findings. The reviews of the institutional full file reviews were inadequate because the reviewers did not identify the fact that the institution failed to include all student exceptions in the documentation submitted for the full file reviews.






In addition, the reviews did not verify refund calculations submitted by the institutions.

RESPONSE:

According to the Program Review Guide, Chapter III, page 18, if a file review is received, the reviewer determines whether verification of the results is required. The Program Review Guide indicates that documentation will be obtained to sustain the finding. And as noted above in the Overall Response to Finding 2, SEC has been successful in sustaining program review findings.

We reviewed the following two schools that were identified by the IG as ones in which SEC did not verify the full file review. The results are summarized below:

Trident Technical College - SEC was successful in obtaining the return of over $116,000 to ED and to lenders. Therefore, we are unsure of the basis on which the IG claims that the file review was inadequate.

Victoria Beauty College - The 05/21/2003 program review report required the school to reconstruct the following:

    Fiscal records for the 2000-2001 and 2001-2002 award years (Finding #1)
    Return of Title IV calculations for the period of October 7, 2000 to the end date of the program review, April 11, 2003 (Finding #4)

After review of the institution's 06/30/2003 response to the program review report findings/requirements, in the 08/20/2003 follow-up letter the institution was given an opportunity to provide additional "explanation and/or documentation" for Finding #1 and Finding #4. In addition, the institution was informed that its return of Title IV calculations reconstruction was not acceptable since it failed to complete all of the steps to arrive at a correct return calculation (a simplified sketch of this type of calculation appears after this list). The institution was afforded an opportunity to redo its calculations and provide the reviewer with copies of the recalculation, as well as documentation on institutional charges associated with the payment period upon which each calculation was based. Based on the school's response, the 10/08/2003 FPRD closed the review with no assessment of liabilities due to the Department. This is evidence that we did conduct a review of the file results submitted by the institution.
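
As a simplified sketch of the return calculation concept referenced above: under the general Return of Title IV funds approach, aid is treated as earned in proportion to the portion of the payment period completed and as fully earned once more than 60 percent of the period is completed. The figures below are hypothetical, and the actual regulatory calculation involves additional steps (institutional charges, order of return, institutional versus student responsibility) that are not shown.

    # Simplified, hypothetical sketch of a Return of Title IV funds calculation.
    # The real calculation has more steps; this only shows the earned/unearned split.

    def earned_fraction(days_completed: int, days_in_period: int) -> float:
        """Fraction of aid earned; treated as fully earned past the 60 percent point."""
        fraction = days_completed / days_in_period
        return 1.0 if fraction > 0.60 else fraction

    def amount_to_return(aid_disbursed: float, days_completed: int, days_in_period: int) -> float:
        """Unearned portion of disbursed aid to be returned (simplified)."""
        earned = earned_fraction(days_completed, days_in_period) * aid_disbursed
        return max(aid_disbursed - earned, 0.0)

    # Example: $3,000 disbursed, withdrawal 30 days into a 90-day payment period.
    print(amount_to_return(aid_disbursed=3000.0, days_completed=30, days_in_period=90))  # 2000.0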

We will review the procedures for file reviews and revise the procedures appropriately.

Retention of Supporting Documentation

We identified weaknesses in the retention of documentation to support program review results. The time period for which documentation was retained to support program review findings varied across the four regions visited.








RESPONSE:

SEC's record retention policy is found within the Department of Education Records Disposition Schedules. We will put a link to this policy on CMONet and inform the Case Teams of its location and use via a procedures memo. If needed, we will also conduct appropriate training to ensure that the Case Teams are aware of and understand the record retention procedures. The Case Team Self Assessment that was issued in September 2002 addresses setting up and maintaining appropriate files on page 12. See Appendix 11. These procedures were re-sent to the Case Teams in June 2003, and again in August 2004. Additionally, we will review these procedures to determine any needed changes.

Inconsistencies in the Program Review Process Across Regions

The IG identified three inconsistencies in the program review process across the regional offices visited. First, the methodology for selecting and performing off-site program reviews was inconsistent. Second, there were differences in the weight placed on factors considered to determine whether or not a program review was warranted. Finally, the actions taken as a result of program reviews with similar findings and liability assessments varied across regions.

RESPONSE:

One of SEC's monitoring activities is conducting program reviews. On-site program reviews require significant resources and are used most often when other monitoring activities have not fully addressed all the issues. Off-site program reviews provide flexibility and another method to ensure compliance in situations that do not require an on-site presence. For example, schools that appear to be incorrectly prorating loans for programs less than 900 hours lend themselves to off-site reviews focused on just that issue. Off-site reviews for Direct Loan excess cash are also appropriate. These are focused reviews that identify and assess liabilities in one compliance area. These reviews were very productive and were accurately recorded as off-site reviews. FSA's FY 2001 Annual Report did not state that the $58 million in liabilities was a result of both on-site and off-site reviews, but that amount is accurate, as is the 172 on-site reviews reported. FSA is very proud of CMO's efforts to review and recover Direct Loan excess cash.

Many school cases appear similar when reviewing the number of findings and the liabilities. However, not all findings are of equal severity, or result in a material weakness. Each case must receive a comprehensive review by analyzing the school's financial strength and administrative capability, size, funding level, etc., and then concluding as to whether any deficiency identified warrants on-site action. The Case Teams then apply their knowledge of the situation and their judgment to determine the action that would result in successful program monitoring, while also protecting access to education. This is another area we will address through enhanced procedures to ensure consistency, including guidance that will result in more uniform decisions on what type of review to do or assistance to provide, as well as monitoring and providing QC checks.








RECOMMENDATIONS

We recommend that the Chief Operating Officer for Federal Student Aid require CMO-HQ to:

2.1 Establish detailed policies and procedures over supervisory review, record retention, off-site program reviews, specific items for making a program review or technical assistance the appropriate monitoring action, and the appropriate action to be taken as a result of a specific compliance issue identified at an institution.

RESPONSE:

We agree with this recommendation, and we will be developing guidelines and procedures to address these issues. We are developing a draft action plan, and will be discussing the plan during our September 14-16, 2004 Managers' meeting, which includes all Division Directors, Area Case Directors, and Co-Team Leaders. Our proposed completion date is September 30, 2005.

2.2 Develop a quality control process to ensure regional compliance with the policies and procedures over the program review function and consistency across the regions in decisions pertaining to monitoring actions taken and enforcement actions in the event of noncompliance.

RESPONSE:

We agree with this recommendation, and we will be developing guidelines and procedures to address these issues, including a collaboration procedure for Case Team staff to raise awareness of and consistency with treatment for emerging compliance issues. We are developing a draft action plan, and will be discussing it during our September 14-16, 2004 Managers' meeting, which includes all Division Directors, Area Case Directors, and Co-Team Leaders.


Finding No. 3 - Technical Assistance Was Not Adequately Documented Or Followed-Up On

IG identified problems with the documentation of technical assistance and a lack of followup by regional offices on the results of technical assistance. IG reviewed 40 instances where technical assistance was provided and interviewed various members of the case teams in the four regional offices visited. Two regions did not document technical assistance in the Case Management Information System (CMIS) or the institutional file, and one region did not document technical assistance in PEPS. Three regions performed informal followup and one region performed no followup at all. In three regions, followup was not documented.



A) Institutional Review Specialists and Institutional Improvement Specialists did not always document technical assistance in CMIS, the institutional file, or PEPS.








RESPONSE:

Four of the eight instances of technical assistance noted by IG were technical assistance training, and coded as such in PEPS. Current procedures do not require that training be documented in CMIS. However, documentation of the training should have been included in PEPS. All of the notes have since been entered into PEPS and/or CMIS, as appropriate. See Appendix 12.

No Formal Follow-up Procedures for Technical Assistance

B) CMO did not follow up on whether or not improvement had been made at institutions receiving technical assistance visits.


RESPONSE:

While we disagree that no followup was performed, SEC issued new Management Improvement Services (technical assistance) procedures in mid-July 2004 for the purpose of clarifying policies and procedures to be followed, and training was conducted on July 29, 2004. Those procedures include selecting schools for technical assistance, the use of corrective action plans, proper documentation, and followup. The procedures were effective August 1, 2004.

IG noted regional differences in implementing the New Schools Initiative to evaluate the effectiveness of technical assistance strategies. Technical assistance is not only provided by the Institutional Improvement Specialists. Other Case Team members can and do provide technical assistance, which is the difference noted by IG. SEC considers this flexibility to be an advantage in allocating Case Team resources and addressing workload.



RECOMMENDATION

We recommend that the Chief Operating Officer for Federal Student Aid require CMO-HQ to:

3.1 Develop and implement policies and procedures for
    • providing technical assistance in a consistent manner across all regions,
    • documenting the technical assistance provided,
    • identifying when technical assistance ends and enforcement begins, and
    • following up on technical assistance visits and measuring the effectiveness of it as a compliance/monitoring tool.

RESPONSE:

We agree with this recommendation, and we issued the new Management Improvement Services (technical assistance) procedures in July 2004, effective August 1, 2004. Procedures include selecting schools for technical assistance, the use of corrective action plans, proper documentation, and followup. Training was conducted on July 29, 2004, and a workgroup was formed to improve the data collection on these services for effective analysis.





Finding No. 4 - CMO-HQ Monitoring of Regional Office Operations Needs Improvement

IG's review of CMO-HQ procedures and processes for monitoring operations of regional offices identified key management control areas that need improvement. They found that CMO-HQ did not (1) monitor regional offices' use of the IAM, (2) provide guidance to regional offices as to which institutions to select for case management in the absence of an updated IAM risk list, (3) monitor regional offices' compliance with internal policies and procedures over program review and technical assistance, (4) evaluate the effectiveness of program reviews conducted or the consistency of regional offices' selection of institutions for review, and (5) evaluate the effectiveness or consistency of the enforcement actions taken as a result of regional office reviews.

RESPONSE:

We disagree that each region has adopted its own methodology for identifying schools for case management in the absence of a FY 2004 IAM list. All Case Teams were case managing schools as a result of recertification, audit resolution, financial analysis, program review and other trigger events. They were not waiting for a new risk list from IAM. In reviewing the use of the IAM list, SEC found that the IAM scores often lagged behind events, and the Case Teams had already reviewed the school. Therefore, based on this input from the Case Teams, SEC management decided to treat IAM scores as another indicator, not the deciding indicator, for case managing schools.

With regard to document retention, during the audit period the Case Management Teams had been operating on outdated guidance of retaining documents for seven years. Because the file in Washington, D.C. was considered the "official" school file, regional offices were given the approval many years ago to purge their records. (See Appendix 13.) Once we implemented Electronic Records Management, we received further guidance from the Office of General Counsel and revised those procedures. This revision was shared with the teams in September 2002 and is being formalized in our Electronic Records Management plan (also included in Appendix 13).

As a process improvement, we also agree to develop more guidance related to selecting institutions for review, ensuring consistency in enforcement actions, and monitoring the effectiveness of program reviews and technical assistance. Our most recent FY 2004 Compliance Initiative provides our teams with training in the identification, documentation, and resolution of each of the reported areas of noncompliance; this also includes a specific process approach to improve consistency in enforcement actions. This initiative also provides for a continuous feedback loop to closely monitor Case Team actions and gather results to inform future data analysis initiatives and program monitoring opportunities.








RECOMMENDATIONS

We recommend that the Chief Operating Officer of Federal Student Aid require CMO-HQ to:

4.1   Implement management controls that provide for consistent treatment of institutions across regional offices.

4.2   Develop internal policies and procedures that provide for management oversight of CMO operations.

RESPONSE:

While we generally agree that management controls and procedures can be improved, we believe that currently FSA has an appropriate oversight and monitoring process in place. Our performance plans have continually and successfully demonstrated emphasis on improvements to program monitoring and will continue to support the case management process of integrating information from different functional areas. This process approach has proven to be effective in identifying compliance issues and providing information to our staff for making eligibility and enforcement decisions regarding institutions. We also recognize that to achieve the "best in the business" in oversight strategies and desired outcomes, we must continually work to improve our processes. Therefore, we are developing an action plan to identify and enhance appropriate procedures. This plan will balance identification of appropriate corrective actions as allowed by regulation and legislation, while ensuring program integrity and access for students to educational opportunities. Please note that the planned eCMO initiative is expected to further assist the teams and our management to improve consistency in program oversight of schools. FSA will begin to gather requirements for eCMO in FY 2004-05.



