Third Quarter FY 2007 OIG Report on the Survey of Farm Credit System (FCS) Institutions Regarding the Agency's Examination Function for the Period April 1 – June 30, 2007

Introduction

Based on the interface FCS institutions had with the Agency's examination function during the period April 1 – June 30, 2007, OE identified 12 FCS institutions that were in a position to provide meaningful survey responses. (Institutions are surveyed no less frequently than every 18 months and, generally, no more frequently than every 12 months.) The OIG sent surveys to those 12 institutions on July 23, 2007. A follow-up e-mail was sent to nonresponding institutions on August 24. Of the 12 institutions surveyed, 8 submitted completed surveys. If the 4 nonresponding institutions subsequently submit a completed survey, they will be included in the next quarterly report. One response to a survey issued for the second quarter was received after the second quarter report was issued and is included in this third quarter report. As a result, this report covers a total of 9 responding institutions.

For the first three quarters of the fiscal year, the OIG has issued 56 surveys and received 45 completed surveys. This 80 percent response rate is very favorable. The response rate for 2005, the last full year the OIG surveyed prior to updating the survey, was only 63 percent. The improvement is due to the revised format of the survey; the survey's all-electronic ease of completion and submission; and our new follow-up process on surveys distributed.

The OIG will provide an e-mail report to you as of each fiscal year quarter-end, i.e., December 31, March 31, and June 30, so that you may take timely action, as you deem necessary, to address the responses. A summary report covering aggregate survey results will be issued to you for each fiscal year ended September 30.
The survey asked respondents to rate each survey statement from "1" (Completely Agree) to "5" (Completely Disagree). The rating options are as follows:

Completely Agree            1
Agree                       2
Neither Agree nor Disagree  3
Disagree                    4
Completely Disagree         5

A "Does Not Apply" response is also available for each survey statement. Narrative responses are provided verbatim, except that any identifying information has been removed and any grammatical or punctuation errors may have been corrected. Bracketed information in the comments has been substituted by the OIG in an effort to ensure the confidentiality of responses.

Survey Results - Third Quarter FY 2007

1. Average numerical responses to survey statements 1 - 10 range from 1.8 to 2.3 (second quarter and first quarter ranges were 1.7 to 2.2).

2. The average response for all survey statements is 2.0 (second quarter and first quarter averages were 1.9).

One institution rated survey statement 9 [The results and recommendations of the Office of Examination's national examination activities (e.g., information technology, finance, credit, etc.) and its reports on identified best practices have assisted your institution.] as a "4" (Disagree). The corresponding comment was "We seldom are informed of Best Practices." In my follow-up with the Chairman of the institution's Audit Committee on this comment, he indicated that his and the Audit Committee members' intent was that the examiners should be more aggressive in indicating to the board and management those practices in the institution that are good and those that need improvement.

The majority of narrative comments to survey statements 1 - 10 were positive, many very much so. However, 33 percent of the comments were negative to varying degrees. These comments are listed below under survey statements 3, 5, and 7 - 9. They may provide opportunities for you to refine examination methodology, communications, and examiner training.
Survey item 11a asks for feedback on the most beneficial aspects of the examination process. Many very positive comments were provided about the examiners and the examination process. Survey item 11b asks for feedback on the least beneficial aspects of the examination process. The comments received in response to this item may also provide opportunities for you to refine examination methodology, communications, and examiner training. Survey item 12 asks for any other comments. Only two comments were provided, and both were very positive, reflecting well on the examiners and the examination process.

Responses to Survey Statements 1 - 10

Risk-Based Examination Process

Survey Statement 1: The scope and depth of examination activities focused on areas of risk to the institution and were appropriate for the size, complexity, and risk profile of the institution.

Average Response: 2.0 (1.9 second quarter report, 1.8 first quarter report)

Comments:
• We agree with the overall scope relative to our organization, prior results, and the stable management function here.
• Key risks were properly identified and the examination activities were properly focused.

Survey Statement 2: The examination process helped the institution understand its authorities and comply with laws and regulations.

Average Response: 1.9 (2.2 second quarter report, 2.0 first quarter report)

Comments:
• There was periodic communication throughout the examination period regarding current issues and areas of concern identified by FCA. This supplemented [our funding bank's] ongoing focus on compliance.

Survey Statement 3: The results and recommendations of the examination process covered matters of safety and soundness, and compliance with laws and regulations.

Average Response: 2.0 (1.9 second quarter report, 1.8 first quarter report)

Comments:
• Have some concern that examiners were behind the curve when looking at non-ag customers and the perceived collateral risk with certain high value per acre loans.
• The examination process appropriately included consideration of safety and soundness and compliance issues.

Survey Statement 4: Examiners were knowledgeable and appropriately applied laws, regulations, and other regulatory criteria.

Average Response: 2.0 (1.9 second and first quarter reports)

Comments:
• The examination team was experienced and professional. Team members had a solid understanding of the laws and regulations under which we operate.

Communications and Professionalism

Survey Statement 5: Communications between the Office of Examination staff and the institution were clear, accurate, and timely.

Average Response: 1.9 (1.7 second and first quarter reports)

Comments:
• Very professional and generally timely.
• Ongoing communication throughout the examination period was appropriate.
• May not be the case with operating personnel. (In a discussion with the commenter, I learned that the emphasis of this comment is that examiners need to be more forthright with management personnel as to examination conclusions and recommendations.)

Survey Statement 6: Examination communications included the appropriate amount and type of information to help the board and audit committee fulfill their oversight responsibilities.

Average Response: 1.9 (1.8 second and first quarter reports)

Comments:
• Good balance between relevant/topical reporting and not providing excess filler.
• Communication with the Board and Audit Committee was timely and appropriate. Examination communications provide value to the governance of [the institution].

Survey Statement 7: The examiners were organized and efficiently conducted examination activities.

Average Response: 1.8 (1.9 second quarter report, 1.7 first quarter report)

Comments:
• Examiners do a good job. Some of the material provided in the onsite review had previously been provided in Board materials already sent. While it is difficult, better coordination with onsite examiners might free up time to conduct other reviews.
• The examiners were very organized and prepared when conducting the examination and meetings.
• The ongoing nature of examination activities involved frequent submission of information and materials to the examiners throughout the period and provided multiple feedback opportunities.
• Examiner training took place during the last exam. (The commenter indicated that there were too many trainee examiners onsite and that, while he recognized the importance of training, the institution received a lesser quality examination as a result of too high a percentage of trainees.)

Survey Statement 8: Examiners fairly considered the views and responses of the board and management in formulating conclusions and recommendations.

Average Response: 1.8 (1.8 second quarter report, 2.0 first quarter report)

Comments:
• Generally so. At times, felt that conclusions were based more on macro issues than on the specific needs of our organization.
• We appreciate the opportunities to share views and pursue mutual understanding of examination issues, conclusions, and recommendations.

Best Practices and Regulatory Guidance

Survey Statement 9: The results and recommendations of the Office of Examination's national examination activities (e.g., information technology, finance, credit, etc.) and its reports on identified best practices have assisted your institution.

Average Response: 2.3 (2.1 second quarter report, 2.2 first quarter report)

Comments:
• We see these as a good "heads-up" and, to some degree, an affirmation of our own audit activities and general controls.
• The national examination activities reporting process is continuing to develop. It has been confusing as to FCA's expectations regarding "best practices" documented in examination reports. Recently, we have received unofficial clarification from the Office of Examination delineating expectations regarding Required Actions, Recommendations, and "best practices," which has been helpful but needs to be formalized.
• We seldom are informed of Best Practices. (The commenter indicated he meant that examiners should be more aggressive in indicating to the board and management those practices in the institution that are good and those that need improvement.)

Survey Statement 10: FCS-wide guidance from the Office of Examination (e.g., bookletters, informational memoranda, etc.) was timely, proactive, and helpful.

Average Response: 2.1 (1.9 in second and first quarter reports)

Comments:
• We view these as a valuable source for safety and soundness issues relevant to the entire Farm Credit System.
• Guidance regarding Office of Examination interpretations and current issues provides insight that is helpful. It is important that such guidance remain focused on regulatory compliance and safety and soundness issues.

Responses to Additional Survey Items 11a, 11b, and 12

Survey Item 11a: What aspects of the examination process did you find most beneficial?

• Senior reviewers did visit with management on how they looked at certain types of loans.
• We appreciated the preparedness of the examiners. With the training group on-site, efficiency may have suffered, but we support these important efforts by the Office of Examination.
• Less disruption to our daily operations.
• All aspects were beneficial to the extent that there would be no significant differential.
• The most beneficial aspects have been communication about areas of focus and issues identified by the agency. This avoids surprises and provides for timely consideration of these matters.
• Examiners meeting [with] the Audit Committee.

Survey Item 11b: What aspects of the examination process did you find least beneficial?

• Younger examiners asked very simple questions.
• The total headcount, considering examiners and trainees, was rather large, and it can be difficult to accommodate that number of people.
• Requests for information previously submitted.
• The national examination activities' focus on "best practices" needs to include a high level of discipline, as "best practices" are often based on individual opinions rather than regulatory direction. "Best practices" in place in some institutions may not be appropriate in other institutions, based on the unique operations and management of each institution. The requirement of justifying why an institution does not adopt a particular "best practice" can involve significant amounts of time and resources. Care must be taken that the agency does not try to manage institutions through the examination function and "best practice" pronouncements.
• The close-out meeting with management was too brief. (The commenter meant that, due to the large number of trainee examiners, the presentation was conducted more as a best-case scenario for the trainees to observe. The commenter felt that with fewer trainees the presentation would have been structured with a normal focus on a full discussion of examination findings and conclusions.)

Survey Item 12: Please provide any additional comments about the examination process and related communications.

• We were pleased with the examination and with the thoroughness and professionalism of the group. The audit committee and full board also appreciate the personal service and attendance at the board meeting by the lead examiners.
• Overall, the examination process was professionally conducted and communications have been clear and effective.
Published by the Farm Credit Administration, Office of Inspector General on 2007-09-01.