oversight

Audit of the Information Technology Security Controls of the U.S. Office of Personnel Management's Enterprise Server Infrastructure General Support System FY 2011

Published by the Office of Personnel Management, Office of Inspector General on 2011-05-16.

Below is a raw (and likely hideous) rendition of the original report. (PDF)

                           UNITED STATES OFFICE OF PERSONNEL MANAGEMENT

                                              Washington, DC 20415



   Office of the
Inspector General

                                              Audit Report


                              U.S. OFFICE OF PERSONNEL MANAGEMENT

                        AUDIT OF THE INFORMATION TECHNOLOGY SECURITY
                     CONTROLS OF THE U.S. OFFICE OF PERSONNEL MANAGEMENT'S
                              ENTERPRISE SERVER INFRASTRUCTURE
                                    GENERAL SUPPORT SYSTEM
                                                FY 2011

                                             WASHINGTON, D.C.




                                     Report No. 4A-CI-00-11-016


                                     Date:           5/16/2011








                                                                        Michael R. Esser
                                                                        Assistant Inspector General
                                                                           for Audits





                               UNITED STATES OFFICE OF PERSONNEL MANAGEMENT
                                                    Washington, DC 20415


  Office of the
Inspector General

                                                   Executive Summary

                                       U.S. OFFICE OF PERSONNEL MANAGEMENT


                             AUDIT OF THE INFORMATION TECHNOLOGY SECURITY

                          CONTROLS OF THE U.S. OFFICE OF PERSONNEL MANAGEMENT'S

                                   ENTERPRISE SERVER INFRASTRUCTURE

                                         GENERAL SUPPORT SYSTEM

                                                   FY 2011


                                                      WASHINGTON, D.C.




                                               Report No. 4A-CI-00-11-016


                                               Date:                    5/16/2011

             This final audit report discusses the results of our review of the information technology security
             controls of the U.S. Office of Personnel Management's (OPM) Enterprise Server Infrastructure
             General Support System (ESI). Our conclusions are detailed in the "Results" section of this
             report.

             During this audit we documented the following opportunities for improvement:
                     •  The ESI information system security plan (ISSP) was prepared in accordance with the
                        format and methodology outlined in NIST guidance. However, the ESI ISSP does not
                        contain details of the interconnections between ESI and other systems as required by
                        NIST SP 800-18.
                     •  Several weaknesses identified during disaster recovery exercises have not been addressed
                        or remediated.
                     •  The Office of the Chief Information Officer (OCIO) has not formally
                        documented common controls provided by ESI or implemented a process to share this
                        information with the owners of other applications relying on this support system.




We also determined that the following elements of the ESI security program appear to be in
full FISMA compliance:
•   A security certification and accreditation (C&A) of ESI was completed in September
    2010 by the Bureau of Public Debt.
•   The OIG agrees with the security categorization of “high” for ESI.
•   A risk assessment was conducted for ESI in 2010 that addresses all the required elements
    outlined in relevant NIST guidance.
•   The security controls of ESI were tested by an independent source and internally by the
    OCIO.
•   The ESI contingency plan is routinely maintained and tested in accordance with NIST
    Guidance.
•    A privacy threshold analysis (PTA) was conducted for ESI. The PTA revealed that ESI
    does not require a privacy impact assessment. We agree with this assessment.
•   The ESI Plan of Action and Milestones (POA&M) follows the format of the OPM
    POA&M guide, and has been routinely submitted to the Office of the Chief Information
    Officer for evaluation.
•   We independently tested 24 security controls for ESI and found that 1 of the security
    controls was not in place during the fieldwork phase of the audit.




                                            ii
                                                                 Contents
                                                                                                                                               Page

Executive Summary ......................................................................................................................... i
Introduction ......................................................................................................................................1
Background ......................................................................................................................................1
Objectives ........................................................................................................................................1
Scope and Methodology ..................................................................................................................2
Compliance with Laws and Regulations..........................................................................................3
Results ..............................................................................................................................................4
         I. Certification and Accreditation Statement ........................................................................4
        II. FIPS 199 Analysis .............................................................................................................4
       III. Information System Security Plan .....................................................................................4
      IV. Risk Assessment ................................................................................................................6
       V. Independent Security Control Testing ...............................................................................6
      VI. Security Control Self-Assessment .....................................................................................7
     VII. Contingency Planning and Contingency Plan Testing ......................................................7
   VIII. Privacy Impact Assessment ...............................................................................................8
      IX. Plan of Action and Milestones Process .............................................................................9
       X. NIST SP 800-53 Evaluation ...............................................................................................9
   Major Contributors to this Report ..............................................................................................11
Appendix: Office of the Chief Information Officer’s February 3, 2011 response to the draft
          audit report, issued January 13, 2011
                                        Introduction
On December 17, 2002, President Bush signed into law the E-Government Act (P.L. 107-347),
which includes Title III, the Federal Information Security Management Act (FISMA). It requires
(1) annual agency program reviews, (2) annual Inspector General (IG) evaluations, (3) agency
reporting to the Office of Management and Budget (OMB) the results of IG evaluations for
unclassified systems, and (4) an annual OMB report to Congress summarizing the material
received from agencies. In accordance with FISMA, we evaluated the information technology
(IT) security controls related to the Office of Personnel Management’s (OPM) Enterprise Server
Infrastructure General Support System (ESI).

                                        Background
ESI is one of OPM’s 43 critical IT systems. As such, FISMA requires that the Office of the
Inspector General (OIG) perform an audit of IT security controls of this system, as well as all of
the agency’s systems on a rotating basis.

The Office of the Chief Information Officer (OCIO) has been designated with ownership of ESI.
ESI supports OPM in meeting its goals by serving as an infrastructure environment for the
processing of payroll and benefit related actions for current and former federal government
employees. ESI operates in a                       environment. The mainframe infrastructure is
supported by the agency’s Data Center Group within the OCIO.

This was our second audit of the security controls surrounding ESI. The findings from the first
ESI audit report, issued in 2004, were closed prior to the start of this audit. We discussed the
results of our audit with OCIO representatives at an exit conference.

                                          Objectives
Our objective was to perform an evaluation of security controls for ESI to ensure that the OCIO
officials have implemented IT security policies and procedures in accordance with standards
established by OPM, FISMA, and the National Institute of Standards and Technology (NIST).

OPM’s IT security policies require managers of all major information systems to complete a
series of steps to (1) certify that their system’s information is adequately protected and (2)
authorize the system for operations. The overall audit objective was accomplished by reviewing
the degree to which a variety of security program elements have been implemented for ESI,
including:
•   Certification and Accreditation Statement;
•   FIPS 199 Analysis;
•   Information System Security Plan;
•   Risk Assessment;
•   Independent Security Control Testing;
•   Security Control Self-Assessment;
•   Contingency Planning and Contingency Plan Testing;
•   Privacy Impact Assessment;


                                                 1
•   Plan of Action and Milestones Process; and
•   NIST Special Publication (SP) 800-53 Security Controls.

                                Scope and Methodology
This performance audit was conducted in accordance with Government Auditing Standards,
issued by the Comptroller General of the United States. Accordingly, the audit included an
evaluation of related policies and procedures, compliance tests, and other auditing procedures
that we considered necessary. The audit covered FISMA compliance efforts of the OCIO
officials responsible for ESI, including IT security controls in place as of January 2011.

We considered the ESI internal control structure in planning our audit procedures. These
procedures were mainly substantive in nature, although we did gain an understanding of
management procedures and controls to the extent necessary to achieve our audit objectives.

To accomplish our objective, we interviewed representatives of OPM’s OCIO office and other
program officials with ESI security responsibilities. We reviewed relevant OPM IT policies and
procedures, federal laws, OMB policies and guidance, and NIST guidance. As appropriate, we
conducted compliance tests to determine the extent to which established controls and procedures
are functioning as required.

Details of the security controls protecting the confidentiality, integrity, and availability of ESI
are located in the “Results” section of this report. Since our audit would not necessarily disclose
all significant matters in the internal control structure, we do not express an opinion on the ESI
system of internal controls taken as a whole.

The criteria used in conducting this audit include:
•   OPM Information Technology Security Policy Volumes 1 and 2;
•   OMB Circular A-130, Appendix III, Security of Federal Automated Information Resources;
•   E-Government Act of 2002 (P.L. 107-347), Title III, Federal Information Security
    Management Act of 2002;
•   NIST SP 800-12, An Introduction to Computer Security;
•   NIST SP 800-18 Revision 1, Guide for Developing Security Plans for Federal Information
    Systems;
•   NIST SP 800-30, Risk Management Guide for Information Technology Systems;
•   NIST SP 800-34, Contingency Planning Guide for Information Technology Systems;
•   NIST SP 800-37, Guide for the Security Certification and Accreditation of Federal
    Information Systems;
•   NIST SP 800-53 Revision 3, Recommended Security Controls for Federal Information
    Systems;
•   NIST SP 800-60 Volume II, Guide for Mapping Types of Information and Information
    Systems to Security Categories;
•   Federal Information Processing Standard Publication 199, Standards for Security
    Categorization of Federal Information and Information Systems; and
•   Other criteria as appropriate.



                                                 2
In conducting the audit, we relied to varying degrees on computer-generated data. Due to time
constraints, we did not verify the reliability of the data generated by the various information
systems involved. However, nothing came to our attention during our audit testing utilizing the
computer-generated data to cause us to doubt its reliability. We believe that the data was
sufficient to achieve the audit objectives. Except as noted above, the audit was conducted in
accordance with generally accepted government auditing standards issued by the Comptroller
General of the United States.

The audit was performed by the OPM Office of the Inspector General, as established by the
Inspector General Act of 1978, as amended. The audit was conducted from November through
December 2010 in OPM’s Washington, D.C. office.

                    Compliance with Laws and Regulations
In conducting the audit, we performed tests to determine whether OCIO’s management of ESI is
consistent with applicable standards. Nothing came to the OIG’s attention during this review to
indicate that the OCIO is in violation of relevant laws and regulations.




                                               3
                                             Results
 I. Certification and Accreditation Statement

    A security certification and accreditation (C&A) of ESI was completed in September 2010.

    NIST SP 800-37, Guide for the Security Certification and Accreditation of Federal
    Information Systems, provides guidance to federal agencies in meeting security accreditation
    requirements. The ESI C&A appears to have been conducted in compliance with NIST
    guidance.

    The Bureau of Public Debt (BPD) was contracted by the OCIO to prepare the C&A package
    for ESI. OPM’s Senior Agency Information Security Officer reviewed the ESI C&A
    package and signed the system’s certification package on September 29, 2010. OPM’s Chief
    Information Officer signed the accreditation statement and authorized the continued
    operation of the system on September 29, 2010.

II. FIPS 199 Analysis

    Federal Information Processing Standard (FIPS) Publication 199, Standards for Security
    Categorization of Federal Information and Information Systems, requires federal agencies to
    categorize all federal information and information systems in order to provide appropriate
    levels of information security according to a range of risk levels.

    NIST SP 800-60 Volume I, Guide for Mapping Types of Information and Information
    Systems to Security Categories, provides an overview of the security objectives and impact
    levels identified in FIPS Publication 199.

    The ESI security categorization analysis categorizes information processed by the system and
    its corresponding potential impacts on confidentiality, integrity, and availability. ESI is
    categorized with a high impact level for confidentiality, high for integrity, moderate for
    availability, and an overall categorization of high.

    The security categorization of ESI appears to be consistent with the guidance of FIPS 199
    and NIST SP 800-60, and the OIG agrees with the categorization of high.

III. Information System Security Plan

    Federal agencies must implement on each information system the security controls outlined
    in NIST SP 800-53 Revision 3, Recommended Security Controls for Federal Information
    Systems. NIST SP 800-18 Revision 1, Guide for Developing Security Plans for Federal
    Information Systems, requires that these controls be documented in an Information System
    Security Plan (ISSP) for each system, and provides guidance for doing so.




                                                 4
The ISSP for ESI was created using the template outlined in NIST SP 800-18. The template
requires that the following elements be documented within the ISSP:
•   System Name and Identifier;
•   System Categorization;
•   System Owner;
•   Authorizing Official;
•   Other Designated Contacts;
•   Assignment of Security Responsibility;
•   System Operational Status;
•   Information System Type;
•   General Description/Purpose;
•   System Environment;
•   System Interconnection/Information Sharing;
•   Laws, Regulations, and Policies Affecting the System;
•   Minimum Security Controls;
•   Plan Completion Date; and
•   Plan Approval Date

The ESI ISSP contains the majority of the elements outlined by NIST. However, the ESI
ISSP does not contain details of the interconnections between ESI and other systems.

The ISSP correctly states that NIST does not require systems to list interconnections with
internal organizations, but the ISSP also indicates that ESI interfaces with several systems
owned by external entities. The details of these external interfaces are not disclosed in the
ISSP as required by the NIST guide. Specifically, the ESI ISSP does not detail the following
information about each interfacing system: name, organization, type of interconnection,
authorizations, dates of agreement, FIPS 199 category, C&A status, and name and title of
authorizing official.

Recommendation 1
We recommend that the ESI ISSP be revised to include identifiers of the external systems
that interconnect with ESI (name, organization, type of interconnection, authorizations, dates
of agreement, FIPS 199 category, C&A status, name and title of authorizing official).

OCIO Response:
“We concur.”

OIG Reply:
As part of the audit resolution process, we recommend that the OCIO provide OPM’s
Internal Oversight and Compliance (IOC) with evidence indicating this recommendation has
been implemented.




                                              5
IV. Risk Assessment

    A risk management methodology focused on protecting core business operations and
    processes is a key component of an efficient IT security program. A risk assessment is used
    as a tool to identify security threats, vulnerabilities, potential impacts, and probability of
    occurrence. In addition, a risk assessment is used to evaluate the effectiveness of security
    policies and recommend countermeasures to ensure adequate protection of information
    technology resources.

    As part of the C&A process, BPD conducted a vulnerability assessment of ESI and evaluated
    the risk of each vulnerability in accordance with NIST SP 800-30 standards. BPD identified
    18 vulnerabilities during this assessment, and for each one documented:
    •   Vulnerability Description;
    •   Threat Source;
    •   Existing Controls;
    •   Likelihood, Impact, and Risk Rating; and
    •   Control Recommendations.

    ESI provided BPD sufficient evidence to close findings for five vulnerabilities and
    determined that one vulnerability was due to a false positive test result. Remediation
    activities for the remaining 12 vulnerabilities are appropriately tracked with the ESI Plan of
    Action and Milestones (POA&M) (see section IX below).

V. Independent Security Control Testing

    A security test and evaluation (ST&E) was completed for ESI as a part of the system’s C&A
    process in September 2010. The ST&E was conducted by BPD, an OPM contractor that was
    operating independently from the OCIO. The OIG reviewed the controls tested to ensure that
    they included a review of the appropriate management, operational, and technical controls
    required for a system with a “high” security categorization according to NIST SP 800-53
    Revision 3, Recommended Security Controls for Federal Information Systems.

    The ST&E labeled each security control as common, system-specific, or hybrid. A common
    control is a security control that is inherited from another system or physical environment. A
    system-specific control is a control that is implemented directly on an individual application.
    A hybrid control is where part of the control is deemed common and part is deemed system
    specific. All types of controls were tested as part of the ST&E due to the fact that ESI is a
    general support system that both inherits and provides common security controls.

    The possible outcomes for each control test were fully satisfied, partially satisfied, and not
    satisfied. BPD reviewed and tested over 200 controls as part of the ST&E and concluded
    that 33 were partially satisfied and the rest were fully satisfied. The 33 partially satisfied
    control tests were condensed into the 18 security weakness findings discussed in Section IV
    above.




                                                   6
VI. Security Control Self-Assessment

     FISMA requires that IT security controls of each major application owned by a federal
     agency be tested on an annual basis. In the years that an independent ST&E is not being
     conducted on a system, the system’s owner must conduct an internal self-assessment of
     security controls.

     The designated security officer for ESI conducted a self-assessment of the system’s controls
     in April 2010. The assessment included a review of the relevant management, operational,
     and technical security controls outlined in the NIST SP 800-53 Revision 3. The OCIO
     attempts to perform a complete and thorough security self-assessment each year. The OCIO
     did not detect any security weaknesses in the FY 2010 self-assessment.

     Although the ESI self-assessment indicated that there were zero security weaknesses in the
     system, an OIG review of the same security controls indicated that a weakness does exist (see
     section X, below).

VII. Contingency Planning and Contingency Plan Testing

     NIST SP 800-34, Contingency Planning Guide for IT Systems, states that effective
     contingency planning, execution, and testing are essential to mitigate the risk of system and
     service unavailability. The OPM IT security policy requires that OPM general support
     systems and major applications have viable and logical disaster recovery and contingency
     plans, and that these plans be annually reviewed, tested, and updated.

     Contingency Plan
     The ESI Disaster Recovery (DR) Plan documents the functions, operations, and resources
     necessary to restore and resume mainframe operations when unexpected events or disasters
     occur. The ESI DR plan is reviewed and updated annually and contains the majority of
     elements recommended by NIST SP 800-34 guidelines, including:
     •   System background information;
     •   Concept of operations;
     •   Notification/activation phase;
     •   Recovery operations; and
     •   Procedures to return to normal operations.

     Contingency Plan Test
     NIST SP 800-34 provides guidance for conducting and documenting contingency plan tests.
     Contingency plan testing is a critical element of a viable disaster response capability.

     In May 2010, the OCIO conducted its annual disaster recovery test. The test involved
     restoring all mission critical functions at a remote facility. The documentation resulting from
     the testing activity contains the majority of the items required by the NIST guide including
     the scope, objectives, participants, and logistics of the test.



                                                      7
      The test summary included a section of “areas for further review” that documents the issues
      or concerns that were discovered during the test. There were 19 issues detected during the
      FY 2010 test, several of which were considered “major” in nature. The majority of the issues
      were also identified in the disaster recovery tests from FY 2008 and FY 2009. Although the
      OCIO has documented the fact that issues exist, it does not appear that they have attempted
      to remediate these weaknesses. We acknowledge the fact that remediation activity for
      several of these issues requires support from OPM program offices outside of the OCIO.
      However, we believe that the OCIO should take primary responsibility for coordinating
      remediation activity since ESI is a critical general support system that many other OPM
      applications rely on for common controls.

      Recommendation 2
      We recommend that the OCIO develop and implement a plan to remediate weaknesses
      identified during ESI disaster recovery tests; remediation activities should be tracked on the
      ESI POA&M.

      OCIO Response:
      “We disagree in part with the recommendation. Clearly there are not 19 weaknesses.
      However, the list of observations should be reviewed to determine which, if any, of the
      items are actual weaknesses. The Data Center agrees that any items found to be actual
      weaknesses need to be documented in a POA&M and a plan developed to remediate them.
      However, the Data Center does not control infrastructures outside the ESI, nor does it
      determine which tests will be conducted by the Lines of Business or other organizations.
      During the ESI DR exercise the Data Center recovers the ESI environment and executes
      tests to ensure the platform is wholly recovered. While the Data Center can make test
      recommendations, decisions regarding the testing of infrastructure external to the ESI and
      customer applications are outside the control of the Data Center. Any weaknesses found
      during the review of the list should be documented and tracked in the POA&M of the
      organization responsible for taking corrective actions; not necessarily the ESI POAM.
      Likewise, plans to remediate any weaknesses should be developed by the parties
      responsible for taking corrective actions.”

      OIG Reply:
      After reviewing the OCIO’s response to the draft report, we acknowledge that there may be
      fewer than 19 weaknesses identified during the most recent disaster recovery exercise. The
      intent of our recommendation is to encourage the OCIO to use the formal POA&M process
      to track any weaknesses that are identified; a statement to which the OCIO agrees. As part of
      the audit resolution process, we recommend that the OCIO provide IOC with evidence
      indicating that weaknesses identified during the FY 2011 disaster recovery exercise are
      tracked on the ESI POA&M.

VIII. Privacy Impact Assessment

      The E-Government Act of 2002 requires agencies to perform a screening of federal
      information systems to determine if a Privacy Impact Assessment (PIA) is required for that


                                                     8
    system. OMB Memorandum M-03-22 outlines the necessary components of a PIA. The
    purpose of the assessment is to evaluate any vulnerabilities of privacy in information
    systems and to document any privacy issues that have been identified and addressed.

    The OCIO completed an initial privacy screening of ESI and determined that a PIA was not
    required for this system because it does not contain Personally Identifiable Information (PII).
    Although several applications residing on the ESI mainframe contain PII, the OCIO staff
    supporting ESI does not have access to this data.

IX. Plan of Action and Milestones Process

    A POA&M is a tool used to assist agencies in identifying, assessing, prioritizing, and
    monitoring the progress of corrective efforts for IT security weaknesses. OPM has
    implemented an agency-wide POA&M process to help track known IT security weaknesses
    associated with the agency’s information systems.

    The OIG evaluated the ESI POA&M and verified that it follows the format of OPM’s
    standard template, and has been routinely submitted to the OCIO’s Security and Privacy
    Group for evaluation. Nothing came to our attention to indicate that there are any current
    weaknesses in the management of the ESI POA&M.

X. NIST SP 800-53 Evaluation

    NIST SP 800-53 Revision 3, Recommended Security Controls for Federal Information
    Systems, provides guidance for implementing a variety of security controls for information
    systems supporting the federal government. As part of this audit, we evaluated the degree to
    which a subset of these controls had been implemented for ESI, including:

    •   AC-2 Account Management
    •   AC-5 Separation of Duties
    •   AC-6 Least Privilege
    •   AC-7 Unsuccessful Login Attempts
    •   AC-11 Session Lock
    •   AT-3 Security Training
    •   AU-2 Auditable Events
    •   AU-3 Contents of Audit Records
    •   AU-6 Audit Review, Analysis, and Reporting
    •   CA-7 Continuous Monitoring
    •   CM-2 Baseline Configuration
    •   CM-3 Configuration Change Control
    •   IA-1 Identification and Authentication
    •   IA-5 Authenticator Management
    •   MA-1 Maintenance Policy and Procedures
    •   MA-2 Controlled Maintenance
    •   MP-6 Media Sanitization and Disposal
    •   PE-1 through PE-18 Physical and Environmental Controls
    •   PL-4 Rules of Behavior
    •   PM-1 Information Security Program Plan
    •   PS-4 Personnel Termination
    •   RA-5 Vulnerability Scanning
    •   SC-5 Denial of Service Protection
    •   SI-2 Flaw Remediation

    These controls were evaluated by interviewing individuals with ESI security responsibilities,
    reviewing documentation and system screenshots, viewing demonstrations of system
    capabilities, and conducting tests directly on the system.



                                                   9
Although it appears that the majority of NIST SP 800-53 Revision 3 security controls have
been successfully implemented for ESI, one tested control was not fully satisfied.

a) PM-1 Information Security Program Plan

   ESI is a general support system that provides common security controls to other
   information systems and applications. ESI also inherits several security controls from
   program offices outside the OCIO (primarily physical controls related to building
   security).

   Although the OCIO’s Security and Privacy Group is currently developing a list of
   common controls that ESI shares with other systems, this information has not been
   formally documented and shared with other OPM program offices. Without a well
   defined list of common controls, the owners of other systems must use their own
   judgment to determine which security controls are inherited from ESI, increasing the risk
   that these systems have controls that are not adequately implemented or tested.

   NIST SP 800-53 Revision 3 control PM-1 states that an organization should develop an
   agency-wide Information Security Program Plan that documents the program
   management controls and organization-defined common controls.

Recommendation 3
We recommend that the OCIO formally document common controls provided by ESI and
implement a process to share this information with the owners of other applications relying
on this support system.

OCIO Response:
“We concur. This work is in progress.”

OIG Reply:
As part of the audit resolution process, we recommend that the OCIO provide OPM’s IOC
with evidence indicating this recommendation has been implemented.




                                             10
                   Major Contributors to this Report
This audit report was prepared by the U.S. Office of Personnel Management, Office of
Inspector General, Information Systems Audits Group. The following individuals
participated in the audit and the preparation of this report:

•                  , Group Chief
•                    , Senior Team Leader
•                       , IT Auditor




                                          11
                                                     Appendix

                              UNITED STATES OFFICE OF PERSONNEL MANAGEMENT


Chief Information
     Officer


        MEMORANDUM FOR

                       CHIEF, INFORMATION SYSTEMS AUDIT GROUP

        FROM:                         MATTHEW E. PERRY
                                      CHIEF INFORMATION OFFICER

        Subject:                      Response to the Draft Audit Report No. 4A-CI-00-11-016

                                      FY 2011 IT Security Controls of OPM's Enterprise Server

                                      Infrastructure General Support System


        Thank you for the opportunity to comment on the subject report. The results provided in the
        draft report consist of a number of recommendations. The recommendations are valuable to our
        program improvement efforts and after a careful review of the report, we offer the following
        comments.

        III.        Information System Security Plan

        The 2010 OIG Audit report states: "The ESI ISSP contains the majority of the elements outlined
        by NIST. However, the ESI ISSP does not contain details of the interconnections between ESI
        and other systems."

        CIO Comment:

        We concur.


        The 2010 OIG Audit report recommends: "Recommendation 1: We recommend that the ESI
        ISSP be revised to include identifiers of the external systems that interconnect with ESI (name,
        organization, type of interconnection, authorizations, dates of agreement, FIPS 199 category,
        C&A status, name and title of authorizing official)."

        CIO Comment:

        We concur.


        VII.        Contingency Planning and Contingency Plan Testing

        The 2010 OIG Audit report states: "There were 19 issues detected during the FY 2010 test,
        several of which were considered "major" in nature. The majority of the issues were also
        identified in the disaster recovery tests from FY 2008 and FY 2009. Although the OCIO has
        documented the fact that issues exist, it does not appear that they have attempted to
        remediate these weaknesses."





                                                                                                     2


CIO Comment:
We disagree with this finding as it appears to reflect a misunderstanding of the 19 issues
referenced. The 19 issues referenced are from a list titled "Areas for Further Review" that was part
of a Data Center internal document. This list documents observations (good and bad) from the
2010 ESI DR exercise. The document was not intended for publication; it was simply an internal
record, and as such it had not been edited for language or for use by personnel not intimately
familiar with the ESI DR test process. The 19 observations in the list can be grouped as follows
depending upon their nature:

Observations 4, 5, 10, 12, and 13 were included on the list simply to document that these functions,
which may not have been tested in previous exercises, were in fact successfully tested in the 2010
ESI DR exercise. Their inclusion on the list was a positive, not a negative, comment. They require
no further attention.

Observations 1, 6, and 16 were included on the list to document that these functions, which may not
have been tested in previous exercises, were in fact successfully tested on a small scale in the FY
2010 ESI DR exercise. Their inclusion on the list was intended to document their successful tests
and suggest that broader testing might be appropriate in the future. Responsibility for expanding
the testing of these three functions lies outside the purview of the Data Center.

Observation 3 documents the fact that the capacity of the circuit between the Sterling Forest DR site
and Boyers needed to be increased. This upgrade has since been completed and the new circuit
tested. The new circuit will be employed in the upcoming 2011 ESI DR exercise.

Observations 2, 9, 11, 14, 15, and 17 were included on the list to document the fact that the parties
responsible for these functions chose not to test them during the 2010 ESI DR exercise.
Organizations outside the Data Center decide which functions to test based upon their priorities,
resources, and previous tests. These specific functions may have been tested at other times
independent of the ESI DR exercise. Their inclusion on the list was intended to document
functions which the responsible parties may wish to consider testing during future ESI exercises.
Responsibility for testing these six functions lies outside the purview of the Data Center.

Observations 7 and 8 were included on the list to document network related configuration changes
needed to provide or improve disaster recovery access from specific functional areas. These
changes are recommended by the Data Center but are outside the control of the Data Center.

Observations 18 and 19 were included on the list to document the continuous need to work as a
team with other organizations in refining the ESI DR test environment preparation process. These
items do not affect the ability to recover ESI services during a real disaster. The DR test
infrastructure configuration is much more complex than an actual disaster recovery configuration
because during a DR test both the live production environment and the DR testing environment
must operate concurrently while physically and logically separated. Observations 18 and 19 are
part of an ongoing process to improve preparation and deployment of the DR test environment
without disruption to the live production environment. This process has no finite end point;
instead it evolves as technology and the OPM infrastructure evolve.
                                                                                                           3


Below is the "Areas for Further Review" list cited in the OIG draft audit. Comments (in bold)
have been included below each item to add clarity.

 Areas for Further Review - The overall testing was quite successful with only a few areas which
 need to be reviewed. A number of the areas could be considered major. These are in connectivity
 to the customer base. Having the IBM Enterprise Servers systems available is a prerequisite of
 the test but there also has to be connectivity to where the end user is located.

  1.  There was no Disaster Recovery _____ available prior to the test. Network Management
      made the decision not to include the _____ because it was being phased out and not to
      include the _____ because it was too new. In the event of a disaster, all _____ is now
      being hosted from both TRB and Macon, GA OPM locations. The impact of not having
      _____ available would be severe and mean there would be no remote access into general
      OPM applications which are not running on _____. But many of the remote users rely on
      _____ for their access to _____ applications from home, especially for all R&B
      applications. FIS PIPS users do not use _____ although CIS and FIS support personnel
      are dependent on it to maintain applications. DC has ways to access _____ applications,
      maintaining them remotely with only a VPN connection. In the event of a disaster, many
      users have been told to work at home. On the second day of the test, NM changed its
      position and assisted one MSA&C home user in San Francisco to gain access to new
      _____ which was successful.

      _____ access was successfully tested on a small scale. This entry is intended to
      document that success and suggest expanding testing of _____ access during future
      DR exercises. More robust tests of the _____ DR will be conducted after the new
      _____ infrastructure is deployed in Macon, GA. NM manages _____ and decides
      the scope of the _____ test.

  2.  There was no e-mail access during the test as requested by many users. There is a plan to
      recover some e-mail services in Boyers, PA as part of NM's Disaster Recovery Plan.
      OPM users who are at home and have their own Internet Service Provider could use
      WebMail to access the recovered system provided the e-Mail servers are not hosted in
      TRB. Home users who rely on _____ will not be able to access e-mail.

      This entry is intended to raise the possibility of testing e-Mail during future DR
      exercises. Tests of e-Mail DR have been successfully performed independent of the
      ESI DR exercise. NM manages e-Mail and related e-Mail DR tests.

  3.  There is a continuing review underway to address the speed of the two communication
      lines: Sterling Forest, NY to Boyers, PA and Sterling Forest to Macon, GA. The Sterling
      Forest to Boyers connectivity consists of three T-1 circuits today and may need to be
      upgraded to DS-3 speeds in a real DR. DC needs to ensure the process is in place to
      exercise the option. The T-1 circuit from Sterling Forest to Macon, GA may need to be
      upgraded in the event of a disaster since Macon would be the location of OPM's ISP.
      NM would implement diverse routing. ISP traffic could flow from Macon to Boyers over
      DS-3 lines and then come into Sterling Forest on one of the three T-1 circuits.

      A DS-3 communication circuit between Sterling Forest, NY and Boyers, PA has
      been installed and tested and will be used in the 2011 ESI DR exercise.

  4.  There are 40+ FIS Federal remote sites which are connected through Sprint MPLS
      connectivity into Washington DC's TRB. The plan is to fail over from TRB to Boyers, PA
      in the event of a disaster. This was tested and was successful for the three locations
      tested.

      FIS relies on work performed at FIS remote sites. This entry documents the fact
      that Sprint MPLS connectivity, though not ESI hosted, was successfully tested
      during this year's DR exercise. This is positive; not negative.

  5.  There are about 10+ FIS Federal remote sites connected using an Internet connection. A
      small VPN appliance was hosted out of OPM Macon, GA which serviced the testing from
      Miami, FL. The test was successful.

      FIS relies on work performed at FIS remote sites. This entry documents the fact
      that an Internet connection was successfully tested during this year's DR exercise.
      This is positive; not negative.

  6.  FIS has field investigators who carry laptops and access the PIPS system remotely. The
      remote test coming through the Internet was successful even though there is no ISP
      provider for Boyers, PA. OPM has links to the Internet through TRB and Macon, GA.
      VPN access into PIPS is very new and expanding. A portion of remote access is through
      dial circuits into VPN concentrators. A growing population of remote FIS users are
      coming through the Internet, which would imply remote connectivity using the Internet
      would have to come through OPM's Macon, GA ISP. Macon was provisioned with a
      small VPN appliance for the test and it was successful. The location is not hosted with
      significant sized VPN appliances to host the entire FIS workload. There are no VPN
      concentrators hosted in Sterling Forest as part of the _____. Therefore in the event of
      disaster, FIS Federal Investigators would have to visit their many remote sites to
      enter data.

      FIS relies on the investigators being able to upload their data from their laptops via
      the Internet. This entry documents the successful test of this functionality but raises
      the potential capacity limitation of the Macon, GA VPN concentrator in providing
      access for large numbers of FIS investigators during a disaster. NM manages the
      VPN concentrators and related DR tests.


  7.  There was no capability for the fixed FIS remote sites (numbering 50+) to be able to
      print reports during the disaster. The LAN printing methodology implemented has yet to
      provide redundant LAN print queues in other than the TRB location. Print from PIPS
      travels from the _____ to the remote location PIPS terminal and then is
      handed off to the _____. The local high speed network printer is
      only accessible using Washington DC TRB hosted _____.

      This DR printing capability issue is understood by NM and FIS. DC worked with
      others to develop a detailed set of instructions on how to utilize "Named Printer"
      capability that mitigates the problem by bypassing the _____. These
      instructions were distributed to about half of the remote FIS locations. In order to
      _____ the staff in each location must make changes to bypass the
      _____. Some of the field offices deployed the changes and found they
      work well; other offices did not attempt to make the changes. The "Named Printer"
      change mitigates this problem, but the change must be performed in the field by FIS
      staff.

  8.  Merit Systems Accountability & Compliance personnel are located in external OPM sites
      around the country. Their offices are connected to OPM into Washington DC's TRB.
      There are no NM provisions for these circuits to be replaced by comparable ones in
      Boyers, PA or Macon, GA. Testers from the San Francisco, CA and Philadelphia, PA
      offices were successful accessing their _____ application called _____ from their homes
      using specially provisioned means of access called _____. Using this home access they
      have no facility to print. Printing is one of their requirements. The implication is all
      Human Capital Leadership and Merit Accountability offices who are connected using
      dedicated T-1 circuits into Washington DC's TRB must work from home using _____.
      There was a very limited test from San Francisco using the new _____ system. _____
      has not been implemented to attempt to do _____ printing. It is being recommended to
      MSA&C they request to be moved to NM's MPLS or Internet connections using a VPN.
      If this is completed then they will have access to the Disaster Recovery system in
      Sterling Forest, NY.

      Merit Systems Accountability & Compliance personnel do not have access to _____
      applications during a disaster because they are still using dedicated T-1 circuits.
      These circuits should be replaced with modern communications capability.
      This is a NM engineering issue.

  9.  In the 2009 test, the Service Credit application was never successfully recovered. In the
      2010 test, the R&B Retirement application called Service Credit was not attempted
      because of problems in the application unrelated to Disaster Recovery.

      The ESI hosts the bulk of the Retirement System applications. A number of years
      ago a key part of the system, Service Credit, was moved outside the ESI to the
      distributed platform. The Data Center recommends that Service Credit be included
      in the annual ESI DR exercise as it is an integral part of the retirement system.
      Recovering and testing it is outside the purview of the DC.

 10.  In the 2009 test, the _____ was successfully recovered but only able to be
      tested in Boyers. In the 2010 test, the R&B Retirement application called _____
      was successfully recovered on the replacement _____ in Boyers, PA. Testing of
      the system was successfully completed by personnel in Boyers, PA and able to be tested
      successfully by personnel in the Gaithersburg, MD testing location.

      This entry documents the fact that _____, though not ESI hosted, was successfully
      recovered and tested during this year's ESI DR exercise. This is positive; not
      negative.

 11.  The Chief Financial Officer's (CFO) system called PFIS was never successfully
      recovered on the replacement _____ in Boyers, PA. In the 2009 test, the system
      was never successfully recovered. The new implementation of CBIS at an outsourced
      location has a dependency on PFIS within OPM to process financial data and invoices
      for FIS. The CFO chose to exclude PFIS from the 2010 DR test.

      The PFIS application runs on a server outside the ESI. Recovery of PFIS was not
      attempted during the ESI DR exercise. This is mentioned for the sake of
      completeness as PFIS is a financial component that interfaces with the FIS
      application suite. Recovering and testing it is outside the purview of the DC.

 12.  The R&B Insurance Services application called FEHB2000 was successfully recovered
      on a replacement _____ located in Boyers, PA. The system was thoroughly
      tested and this is the second time in a row it has been successfully recovered and used in a
      DR test.

      This entry documents the fact that FEHB2000, though not ESI hosted, was
      successfully recovered and tested during this year's ESI DR exercise. This is positive;
      not negative.

 13.  The FIS e-QIP server did not participate. The e-QIP operational plan has it
      being hosted in Boyers, PA for six (6) months and then hosted in Washington DC's TRB
      for six (6) months. The server was located in Boyers already during this test. Fail-over is
      demonstrated every six (6) months. This is sufficient evidence that e-QIP is recoverable
      in the event of a disaster.

      The ESI hosts FIS's Personnel Information Processing System (PIPS) application.
      e-QIP, an integral part of the PIPS system, is hosted on a _____ outside the
      ESI. For the sake of completeness, the independent e-QIP test was reported in the
      ESI exercise summary. This is positive; not negative.


 14.  There was also no connection available for DR to the FIS contractor hosted
      _____ for outside agency access using the Agency Menu. In the event of
      a disaster, this would exclude outside agency access, numbering 2K+ users, from
      accessing PIPS. In the event of a disaster, this critical requirement would not be
      available within the 12 hour window required.
                                                                                                  7


      The ESI hosts FIS's PIPS application. A key PIPS remote user access facility is
      hosted at _____ (a contractor site). FIS contracted for these services and did not
      include them in the ESI DR exercise. Remote access has always been part of each
      ESI DR exercise, and the non-participation of _____ has been reported to FIS each
      year. They have taken no action to correct this deficiency. Since the contract is
      owned and managed by FIS, correcting this deficiency is outside the purview of the
      DC.

 15.  There was no FIS Department of Defense (DOD) JPAS connection available for DR
      where inquiries are passed from DOD to OPM. In the event of a disaster, this critical
      requirement would not be available within the 12 hour window required.

      The ESI hosts FIS's PIPS application. A key PIPS remote DOD user access facility,
      JPAS, is hosted through a connection from the _____. FIS
      requested the connection originally through the Pentagon and now has the
      connection to _____ directly. Remote access has always been part of each ESI DR
      exercise, and the lack of a JPAS DR connection has been reported to FIS each year.
      They have taken no action to correct this deficiency. Since the connection
      agreement is between FIS and DOD, correcting this deficiency is outside the
      purview of the DC.

 16.  A number of _____ file transfers were included in the Plan supporting various
      Lines of Business:

          a.  FIS - _____ for credit information (future)
          b.  FIS - _____ for credit information (future)
          c.  FIS - US Census (future)
          d.  FIS - FBI (future)
          e.  FIS - Agency Delivery (future)
          f.  FIS - IRS (future)
          g.  E-HRI - Human Resources data from e-HRI's contractor was successful despite
              IP addressing issues on NM's part along with e-HRI's need to cut short the
              time allocated to the exercise.
          h.  R&B - Annuity Payroll data completed to FMS's Kansas City, MO location
              (successful)
          i.  Human Resources Solutions - Data exchanges (successful)
          j.  R&B - Social Security Administration (future)

      The ESI provides the bulk of OPM's electronic data exchange services. As part of
      the disaster preparedness services provided by the DC, recommendations are
      provided to Lines of Business and CIO's application support areas. The above list
      describes those data exchanges the DC believes to be key and should be considered
      for testing by the Lines of Business. Since each Line of Business determines what is
      important for them to test, the DC only offers its recommendations. For the sake of
      completeness, this observation documents the advice and results. Of the 10 tests
      recommended, 3 were successfully tested and 7 were deferred by the Lines of
      Business. The Lines of Business may wish to consider testing these data exchanges
      in the 2011 ESI DR exercise.

 17.  No discussions were conducted by FIS of testing DR connectivity for its USIS, Kroll, and
      CACI contractors. This should be considered for the 2011 DR test. These contractors are
      an essential part of FIS operations and would be needed in the event of a disaster.

      The ESI hosts FIS's PIPS application. As part of the disaster preparedness services
      provided by the DC, recommendations are provided to Lines of Business and CIO's
      application support areas. The above observation lists contractors the DC believes
      FIS should consider including in the ESI DR test. Since each Line of Business
      determines what is important for them to test, the DC is only in a position to offer its
      recommendations. FIS may wish to consider including the above contractors in
      future tests, but doing so is a FIS decision.

 18.  There were DNS problems throughout the test. IP addressing is the responsibility of
      NM. One of the major problems was the lack of documentation created by NM and in the
      coordination of DC and NM about what IP addressing will be used during the test. NM
      personnel are rotated into the test each year, which does not provide time to
      complete the experience of one test and carry it forward into the next year. DC and NM
      staffs need to work closer prior to the test to ensure sufficient knowledge of relevant
      network topology and settings is in place in order to debug network issues in a timely
      manner. A bright spot in this year's test for NM is the work of _____ who
      performed the duties of NM's DR Project Manager. His organizational skills greatly
      assisted in coordinating the work of the NM participants. Unlike DC staff who are
      located in Sterling Forest and Gaithersburg, NM has staff located in Sterling Forest,
      Gaithersburg, Boyers, Macon, and Ft. Meade.

      The structure of the network topology during a real disaster would have few if any
      changes. However, during an ESI DR exercise the production systems in TRB must
      continue to operate but be blocked from Sterling Forest and Gaithersburg
      recovery site access. The complexity associated with reconfiguring the network and
      rerouting applications for the ESI DR exercise is significant. Each year the
      coordination between the various organizations has improved. The ultimate goal is
      to have the overall test be executed precisely and have all parts work the first time.
      This observation is intended as a reminder to ensure all ESI DR exercise
      participants strive to improve DR test documentation prior to the annual exercise to
      achieve this goal. This does not impact the recovery of the ESI during an actual
      disaster.


 19.  The refinement of the documentation provided by DC of the DR URLs needs to
      be continued. There were a few cases where the URL in the Test Plans did not match with
      what eventually worked. Work needs to be focused on how these URLs are made
      available through the DNS Servers maintained by NM.
                                                                                                  9


          This issue relates to Observation 18 (above). Along with refining the DR exercise
          documentation, the method of accurately determining and deploying URLs should
          be improved to avoid errors. This must be a joint effort between NM and DC. This
          does not impact the recovery of the ESI during an actual disaster.

The 2010 OIG Audit report recommends:
"Recommendation 2
We recommend that the OCIO develop and implement a plan to remediate weaknesses identified
during ESI disaster recovery tests; remediation activities should be tracked on the ESI POA&M."

CIO Comment:
We disagree in part with the recommendation. Clearly there are not 19 weaknesses. However, the
list of observations should be reviewed to determine which, if any, of the items are actual
weaknesses. The Data Center agrees that any items found to be actual weaknesses need to be
documented in a POA&M and a plan developed to remediate them. However, the Data Center
does not control infrastructures outside the ESI, nor does it determine which tests will be
conducted by the Lines of Business or other organizations. During the ESI DR exercise the Data
Center recovers the ESI environment and executes tests to ensure the platform is wholly recovered.
While the Data Center can make test recommendations, decisions regarding the testing of
infrastructure external to the ESI and customer applications are outside the control of the Data
Center. Any weaknesses found during the review of the list should be documented and tracked in
the POA&M of the organization responsible for taking corrective actions; not necessarily the ESI
POA&M. Likewise, plans to remediate any weaknesses should be developed by the parties
responsible for taking corrective actions.

X.       NIST SP 800-53 Evaluation

The 2010 OIG Audit report states:
"Although the OC10's Security and Privacy Group is currently developing a list ofcommon
controls that ES1shares with other systems. this information has not been formally documented
and shared with other OPMprogram offices. Without a well defined list ofcommon controls, the
owners ofother systems must use their own judgment to determine which security controls are
inheritedfrom ESI, increasing the risk that these systems have controls that are not adequately
implemented or tested. ..

CIO Comment:
We concur.

The 2010 OIG Audit report recommends:
"Recommendation 3
We recommend that the OCIO formally document common controls provided by ESI and implement a
process to share this information with the owners of other applications relying on this support
system."

CIO Comment:

We concur. This work is in progress.