Fourth Quarter and FY 2007 Summary, FCS Survey

Published by the Farm Credit Administration, Office of Inspector General on 2007-12-01.

                       Fourth Quarter and Fiscal Year Summary Report

      Office of Inspector General’s Survey of Farm Credit System (FCS) Institutions
                      Regarding the Agency’s Examination Function

                                               2007

Introduction

Based on the interface FCS institutions had with the Agency's examination function during the
period July 1 – September 30, 2007, OE identified 16 FCS institutions that were in a position to
provide meaningful survey responses. (Institutions are surveyed no less frequently than every
18 months and, generally, no more frequently than every 12 months.)

The OIG sent surveys to those 16 institutions on October 22. A follow-up e-mail was sent to
nonresponding institutions on November 27. Of the 16 institutions surveyed, 11 submitted
completed surveys. If any of the 5 nonresponding institutions subsequently submit completed
surveys, their responses will be included in the next quarterly report.

One response to a survey issued for the third quarter was received subsequent to the third
quarter report and is included in this fourth quarter report. As a result, this report covers a total
of 12 responding institutions.

The OIG will continue to provide an e-mail report to you at each fiscal year quarter-end,
i.e., December 31, March 31, June 30, and September 30, so that you may take timely action,
as you deem necessary, to address the responses. The September 30 report will continue to
include fiscal year summary data for each of the ten survey statements.

The survey asked respondents to rate each survey statement from "1" (Completely Agree) to "5"
(Completely Disagree). The rating options are as follows:

       Completely Agree                1
       Agree                           2
       Neither Agree nor Disagree      3
       Disagree                        4
       Completely Disagree             5

There is also an available response of "Does Not Apply" for each survey statement.
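The report does not spell out how the average responses below are computed. As an illustrative sketch (the function name is ours, and the assumption that "Does Not Apply" responses are excluded from the average, as they are from the Appendix's percentage calculations, is ours as well), the averages can be reproduced as:

```python
def average_response(counts):
    """Mean of a 1-5 rating distribution, rounded to one decimal.

    counts: tallies for ratings 1 ("Completely Agree") through
    5 ("Completely Disagree"). "Does Not Apply" responses are
    assumed to be excluded, as they are from the report's
    percentage calculations.
    """
    rated = sum(counts)
    weighted = sum(rating * n for rating, n in enumerate(counts, start=1))
    return round(weighted / rated, 1)

# Survey statement 9, FY 2007 (counts from the Appendix): 5 / 37 / 12 / 2 / 0,
# plus one "Does Not Apply" response that is left out of the average.
print(average_response([5, 37, 12, 2, 0]))  # -> 2.2
```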

Narrative responses are provided verbatim, except that any identifying information has been
removed and any grammatical or punctuation errors may have been corrected. Narrative in
“brackets” is explanatory information provided by the OIG based on communication with the
institution.

Survey Results - Fourth Quarter FY 2007

 1. Average numerical responses to survey statements 1 - 10 ranged from 1.5 to 2.3 (third
    quarter range was 1.8 to 2.3, second quarter and first quarter ranges were 1.7 to 2.2).

 2. The average response for all survey statements was 1.8 (third quarter average was 2.0,
    second and first quarter averages were 1.9).

One institution rated survey statement 8 [Examiners fairly considered the views and responses
of the board and management in formulating conclusions and recommendations.] as a "4"
(Disagree). See the first bullet under survey statement 8 for the corresponding comment. [The
CEO explained that the examiners looked only at a point in time versus considering the trend
when evaluating return on assets.]

The majority of narrative comments to survey statements 1 - 10 were positive, many very much
so. However, 23 percent of the comments were negative to varying degrees. These comments
are listed below under numbers 3, 6, 7, 8, and 10. They may provide opportunities for you to
refine examination methodology and communications, and examiner training.

Survey item 11a asks for feedback on the most beneficial aspects of the examination process.
Consistent with prior quarters’ responses to this survey item, many very positive
comments were provided about the examiners and the examination process. One area to which
the System responded very favorably is the streamlined approach to examination, including
off-site examinations; institutions’ comments center on the time and resources this
approach saves them.

Survey item 11b asks for feedback on the least beneficial aspects of the examination process.
Only a few comments were received for this survey item. While none were very negative,
they may provide opportunities for you to refine examination methodology and communications,
and examiner training.

Survey item 12 asks for any other comments. All comments were positive except one, i.e.,
“Seemed to be a conflict of opinion as to why the examination was conducted. Was this a
routine examination or because of recent credit problems discovered by the institution?” [The
CEO explained that because they normally do not receive an on-site examination they were
wondering why the examiners chose to come on site. The institution had recently self-identified
some credit problems and, while the examiners acknowledged this, the institution was not clear
if these emerging problems were the reason for the on-site examination. However, the CEO
expressed complete understanding that such emerging credit problems were a good reason for
the examiners to want to come on-site.]

Survey Results – Fiscal Year 2007 Summary

For fiscal year 2007, the OIG issued 72 surveys and received 57 completed surveys, an
approximately 80 percent response rate, which is very favorable. The response rate for 2005, the last full year
the OIG surveyed prior to updating the survey, was only 63 percent. The improvement is due to
the revised format of the survey; the survey’s ease of completion and submission, i.e., all
electronic; and our new follow-up process on surveys distributed.

See the Appendix summarizing numerical responses to all ten survey statements for all 57
responding institutions.

Survey statement 9 [The results and recommendations of the Office of Examination’s national
examination activities (e.g., information technology, finance, credit, etc.) and its reports on
identified best practices have assisted your institution.] has consistently received evaluations
somewhat lower than the other nine survey statements. While some of the comments received
for this survey statement have been favorable, about two-thirds were not. This may offer an
opportunity for OE to make adjustments to its national examination activity methodology and
communications.

Responses to Survey Statements 1–10

                              Risk-Based Examination Process

Survey Statement 1:          The scope and depth of examination activities focused on areas of
                             risk to the institution and were appropriate for the size, complexity,
                             and risk profile of the institution.

   Average Response:         1.8 (2.0 third quarter, 1.9 second quarter, 1.8 first quarter)

   Comments:

    •   Appropriate concentration on key management areas identified in the last review: business
        continuity, information security, and internal audit.
    •   Scope and depth were very adequate in relation to the institution’s present risk position.

Survey Statement 2:          The examination process helped the institution understand its
                             authorities and comply with laws and regulations.

   Average Response:         1.9 (1.9 third quarter, 2.2 second quarter, 2.0 first quarter)

   Comments:

    •   Good suggestions on Audit Committee self-evaluation and training.
    •   Constant reinforcement of laws and regulations is necessary, especially in areas that
        are infrequently dealt with.
    •   As the Audit Director, I appreciate the FCA examination process.


Survey Statement 3:          The results and recommendations of the examination process
                             covered matters of safety and soundness, and compliance with
                             laws and regulations.

   Average Response:         1.9 (2.0 third quarter, 1.9 second quarter, 1.8 first quarter)

   Comments:

    •   Many of the areas addressed focus on interpretation of technical issues, not safety and
        soundness.
    •   Concentration on recommendations from prior review related to internal audit,
        information security and business continuity. No consideration of financial soundness
        during this review.


Survey Statement 4:          Examiners were knowledgeable and appropriately applied laws,
                             regulations, and other regulatory criteria.

   Average Response:         1.7 (2.0 third quarter, 1.9 second and first quarters)

   Comments:

    •   Streamlined Examination process – no on-site visit this cycle.
    •   Examiners were well versed in laws, regulations and other regulatory criteria and did an
        effective job of applying this knowledge to the institution’s operations.
    •   Several of the examiners were fairly new to the team but in general were
        knowledgeable.

                           Communications and Professionalism

Survey Statement 5:          Communications between the Office of Examination staff and the
                             institution were clear, accurate, and timely.

   Average Response:         1.5 (1.9 third quarter, 1.7 second and first quarters)

   Comments:

    •   Very.
    •   Lead examiner did an excellent job communicating pertinent and helpful information,
        and did not take excessive amounts of management’s time doing so. He understands
        the difference between safety and soundness issues and management issues.
    •   Excellent communications between the examiners and us in terms of timing, scope,
        required documents, findings, etc.

Survey Statement 6:          Examination communications included the appropriate amount
                             and type of information to help the board and audit committee
                             fulfill their oversight responsibilities.

   Average Response:         1.7 (1.9 third quarter, 1.8 second and first quarters)

   Comments:

    •   Only minimally.
    •   The examiners met with the Audit Committee chairperson to discuss the results of the
        examination and the examiners also met with the Audit Committee to discuss their
        findings and recommendations and to allow the committee members to ask questions.

Survey Statement 7:          The examiners were organized and efficiently conducted
                             examination activities.

   Average Response:         1.8 (1.8 third quarter, 1.9 second quarter, 1.7 first quarter)

   Comments:

    •   FCA examiners had computer problems that created some inefficiency.

    •   No on-site exam this cycle.
    •   The examination was conducted quickly and with minimal disruption to daily work.
        Good planning between the examiners and the institution allowed some of the
        examiners’ work to be done before fieldwork began.
    •   As is typical with learning a new job, efficiency varied according to experience level.

Survey Statement 8:            Examiners fairly considered the views and responses of the board
                               and management in formulating conclusions and
                               recommendations.

   Average Response:           1.8 (1.8 third and second quarters, 2.0 first quarter)

   Comments:

    •   In the area of concern over the institution’s ROA, only one quarter was considered. This
        was perfectly within the institution’s business plan budget and should have been
        addressed at the time of the business plan review. The very next quarter the ROA rose
        considerably above the minimum requirement set by FCA. The institution is not sure this
        call was made by the examiner-in-charge; it was very possibly made by his supervisor,
        who may not even have seen the business plan.
    •   The Audit Committee chairperson’s and the Audit Committee’s views were considered
        when the examiners formulated their recommendations.
    •   Examiners were very open to listening to the institution’s views and responses.
    •   No on-site exam this cycle.

                           Best Practices and Regulatory Guidance

Survey Statement 9:            The results and recommendations of the Office of Examination’s
                               national examination activities (e.g., information technology,
                               finance, credit, etc.) and its reports on identified best practices
                               have assisted your institution.

   Average Response:           2.3 (2.3 third quarter, 2.1 second quarter, 2.2 first quarter)

   Comments:

    •   Best practice suggestions (audit and security involvement in new application
        development) were already on our work plan.
    •   The national examination activity did bring us up to date on an outdated VMI [assume
        Variable Mortgage Interest] form and corrections needed to some policies.

Survey Statement 10:           FCS-wide guidance from the Office of Examination (e.g.,
                               bookletters, informational memoranda, etc.) was timely, proactive
                               and helpful.

   Average Response:           1.9 (2.1 third quarter, 1.9 second and first quarters)

   Comments:

    •   It is very difficult to be certain that we are receiving all the information since it is
        electronically filed.

       •   General guidance is helpful and timely. New system-wide memos are a good way to
           identify and react to specific hot-button areas without the need for soup-to-nuts
           examinations.

                      Responses to Additional Survey Items 11a, 11b, and 12

Survey Item 11a:                  What aspects of the examination process did you find most
                                  beneficial?

   •       Off-site exam—required little of the institution’s time.
   •       Re-emphasizing the need to ensure compliance with consumer disclosure laws and
           regulations.
   •       Identification of credit administration weaknesses/strengths.
   •       Its efficiency. With flexibility of streamlined exam process, FCA resources can be
           directed where appropriate based on risk/performance. Exams for associations which
           pose little risk and have history of performance can be done off-site, allowing association
           resources to be directed to customer service and mission initiatives. Also, FCA’s use of
           technology makes exam process and reporting very efficient and convenient.
   •       No on-site examination activities.
   •       That the examination is completed off site and [our operations are] not interrupted.
   •       Continued emphasis on information security and business continuity topics is helpful, as
           they are key focus areas for us going forward.
   •       The President and CEO are fairly new in their jobs and both appreciated feedback on
           how the institution is completing its mission.
   •       The open communication between examiners, Board and management.
   •       A regulator’s look at our institution keeps everyone focused on the safety and soundness
           of the institution.


Survey Item 11b:                  What aspects of the examination process did you find least
                                  beneficial?

   •       Nit-picking the annual report.
   •       Best practices recommendations did not always fit for the size of our institution.
   •       Can’t think of any.
   •       None.
   •       The fact that this is probably the best shape that the institution has ever been in, yet we
           still get a CAMELS rating of 2.

Survey Item 12:                   Please provide any additional comments about the examination
                                  process and related communications.

   •       Seemed to be a conflict of opinion as to why the examination was conducted. Was this
           a routine examination or because of recent credit problems discovered by the institution?
   •       Good focused review of high-priority areas. Well planned between the examiners and
           the institution, executed quickly and painlessly while in the field, findings were reported
           clearly and with no surprises.
   •       Feedback was appreciated.
   •       Overall, the examiners were fair and provided valuable guidance.
                                                                                                                                      7


                                                               APPENDIX


                 Results for FY 2007 of Numeric Responses to Questions 1-10

Each cell shows the number of responses and, in parentheses, the percent of total responses.

 Question   Completely   Agree       Neither       Disagree   Completely   Does Not   Total       Average
            Agree (1)    (2)         Agree nor     (4)        Disagree     Apply *    Responses   Response
                                     Disagree (3)             (5)          (6)
 --------------------------------------------------------------------------------------------------------
     1       11 (19%)    44 (77%)     2  (4%)      0 (0%)      0 (0%)        0           57         1.8
     2        8 (14%)    41 (72%)     7 (12%)      1 (2%)      0 (0%)        0           57         2.0
     3       10 (18%)    44 (77%)     3  (5%)      0 (0%)      0 (0%)        0           57         1.9
     4       14 (25%)    37 (65%)     6 (10%)      0 (0%)      0 (0%)        0           57         1.9
     5       20 (35%)    34 (60%)     3  (5%)      0 (0%)      0 (0%)        0           57         1.7
     6       17 (30%)    36 (63%)     3  (5%)      1 (2%)      0 (0%)        0           57         1.8
     7       18 (32%)    33 (59%)     5  (9%)      0 (0%)      0 (0%)        1           57         1.8
     8       16 (28%)    34 (61%)     4  (7%)      2 (4%)      0 (0%)        1           57         1.9
     9        5  (9%)    37 (66%)    12 (21%)      2 (4%)      0 (0%)        1           57         2.2
    10        9 (16%)    41 (73%)     6 (11%)      0 (0%)      0 (0%)        1           57         1.9
 --------------------------------------------------------------------------------------------------------
  Total     128 (23%)   381 (67%)    51  (9%)      6 (1%)      0 (0%)        4          570         1.9


* “Does Not Apply” responses not used in percentage calculations.
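As a cross-check (a sketch of ours, not part of the original report), the reported averages for all ten survey statements can be reproduced from the tabulated counts, assuming "Does Not Apply" responses are excluded from each average:

```python
# Counts for ratings 1-5, "Does Not Apply" count, and reported average,
# transcribed from the Appendix table for survey statements 1-10.
appendix = [
    ([11, 44, 2, 0, 0], 0, 1.8),   # statement 1
    ([8, 41, 7, 1, 0], 0, 2.0),    # statement 2
    ([10, 44, 3, 0, 0], 0, 1.9),   # statement 3
    ([14, 37, 6, 0, 0], 0, 1.9),   # statement 4
    ([20, 34, 3, 0, 0], 0, 1.7),   # statement 5
    ([17, 36, 3, 1, 0], 0, 1.8),   # statement 6
    ([18, 33, 5, 0, 0], 1, 1.8),   # statement 7
    ([16, 34, 4, 2, 0], 1, 1.9),   # statement 8
    ([5, 37, 12, 2, 0], 1, 2.2),   # statement 9
    ([9, 41, 6, 0, 0], 1, 1.9),    # statement 10
]

for counts, does_not_apply, reported in appendix:
    rated = sum(counts)                       # responses used in the average
    assert rated + does_not_apply == 57       # 57 total responses per statement
    average = sum(r * n for r, n in enumerate(counts, start=1)) / rated
    assert round(average, 1) == reported      # matches the table's last column
```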

Total Number of Surveys Sent to Institutions: 72

Total Number of Surveys Received: 57