
Results of the OIG's Special Review of OPM's Quality Assessment of USIS's Background Investigations

Published by the Office of Personnel Management, Office of Inspector General on 2015-09-22.

Below is a raw (and likely hideous) rendition of the original report. (PDF)

                           UNITED STATES OFFICE OF PERSONNEL MANAGEMENT 

                                                  Washington, DC 20415



  Office of the                                  September 22, 2015
Inspector General




      MEMORANDUM FOR BETH F. COBERT
                     Acting Director

      FROM:	                   PATRICK E. McFARLAND
                               Inspector General

      SUBJECT: 	               Results of the OIG’s Special Review of OPM’s Quality Assessment of
                               USIS’s Background Investigations (Report No. 4A-RS-00-15-014)

      The Office of the Inspector General (OIG) recently conducted a special review of the U.S. Office
      of Personnel Management’s (OPM) Quality Assessment over US Investigations Services’ (USIS)
      Background Investigations. The purpose of our special review was to analyze the validity of
      OPM’s Federal Investigative Services’ (FIS) Quality Assessment methodology and to ensure its
      findings objectively represented the sampled USIS background investigations (also referred to as
      cases), as stated in OPM’s memorandum for the record titled “Federal Investigative Services
      Case Review - Round Two Sample Results.”

      We issued our draft special review memorandum to Merton W. Miller, Associate Director, FIS,
      on June 2, 2015. FIS’s July 1, 2015 comments on the draft special review were considered in
      preparing this final report and are included in Attachment 4. For specific details on the special
      review findings, please refer to the “Findings” section of the memorandum.

      This memorandum has been issued by the OIG to OPM officials for resolution of the findings
      and recommendations contained herein. As part of this process, OPM may release the report to
      authorized representatives of the reviewed party. Further release outside of OPM requires the
      advance approval of the OIG. Under section 8M of the Inspector General Act, the OIG makes
      redacted versions of its final reports available to the public on its webpage.

      To help ensure that the timeliness requirement for resolution is achieved, we ask that FIS
      coordinate with OPM’s Internal Oversight and Compliance (IOC) office to provide its initial
      response to the OIG within 60 days from the date of this memorandum.

      IOC should be copied on all responses to this final memorandum on our Special Review.
      Subsequent resolution activity for all report findings should also be coordinated with IOC. FIS
      should provide periodic reports through IOC to the OIG, no less frequently than each March and
      September, detailing the status of corrective actions, including documentation to support this
      activity, until all findings have been resolved.






BACKGROUND:

The mission of OPM’s FIS is to ensure the Federal Government has a suitable workforce that
protects national security and is worthy of the public trust. FIS is responsible for providing
investigative products and services for over 100 Federal agencies to use as the basis for a variety
of adjudicative decisions, including but not limited to security clearance and suitability decisions
as required by Executive Orders and other rules and regulations. Over 95 percent of the
Government’s background investigations are provided by OPM. Prior to October 1, 2014, OPM
held both fieldwork and support services contracts with USIS to assist FIS with completing
background investigations. However, on September 9, 2014, OPM informed USIS that it would
not exercise options to extend the term of these contracts beyond September 30, 2014.

An investigation by the OIG determined that during the period March 2008 through September
2012, under the fieldwork contract, USIS failed to perform contractually required quality reviews
of background investigations prior to submitting them to FIS (hereafter referred to as “dumped
investigations”).1 The OIG, the House Committee on Oversight and Government Reform, and
the Senate Committee on Homeland Security and Governmental Affairs expressed concern that
the final closing review of a portion of these dumped investigations was performed by the same
company, USIS, under its support services contract. 2

OPM and the OIG agreed that FIS would proceed with a Quality Assessment of cases that were
both (1) dumped by USIS under the fieldwork contract and (2) reviewed and closed by USIS
under the support services contract. The OIG would then verify FIS’s Quality Assessment.
During late March and early April 2014, staff from FIS and the OIG met to discuss the
methodology proposed by FIS for conducting their Quality Assessment and on April 7, 2014, the
OIG communicated our general agreement (See Attachment 1) with FIS’s proposed
methodology and we requested that the methodology be documented in writing. Subsequently,
on April 11, 2014, FIS provided the OIG with a written copy of its proposed methodology (See
Attachment 2). FIS completed its Quality Assessment and provided the OIG with the summary
of its results on September 22, 2014.

In order to evaluate the overall quality of the background investigations at the time FIS released
them to customer agencies (i.e., after the closing review process was complete), FIS reviewed a
representative sample of 1,100 out of 103,369 fieldwork intensive investigations that were
allegedly dumped and closed by USIS under the support services contract.

FIS’s draft Quality Assessment results concluded that “It does not appear there was any effort on
the part of USIS to intentionally close investigations and not refer those meeting criteria to the
Federal staff. During the time frame of the alleged dumping, USIS continued to refer cases to
Federal review in large numbers. The Quality Assessment revealed that most of the cases
(90.7%) were closed in accordance with the contract and were found to be Complete or Justified
(i.e., any missing coverage was properly annotated). A smaller subset (6.1%) was determined to
be incomplete, but Acceptable for Adjudication in accordance with the March 10, 2010
Department of Defense (DoD) Memorandum entitled, ‘Adjudicating Incomplete Personnel
Security Investigations.’ Only 3.2% were determined to be Missing Coverage or Issue
Resolution and most of these errors appear to be the result of a lack of attention to detail.”

1 Fieldwork can be defined as background investigative coverage obtained primarily through human interactions and
can include personal interviews, communications with record providers, and human searches of databases.
2 The support services contractor was responsible for the review and closing of background investigations.

OIG’s SPECIAL REVIEW METHODOLOGY:

In order to determine the validity and objectivity of FIS’s Quality Assessment, we assessed the
statistical sampling methodology, as developed by OPM’s Planning, Policy, and Analysis office
for FIS. Then, we judgmentally selected a sample for our own testing purposes in order to assess
the accuracy of FIS’s categorization of cases sampled (i.e., data reliability).

FIS’s sampling universe consisted of 103,369 fieldwork intensive background investigations that
were allegedly dumped by USIS between March 2008 and September 2012, and also reviewed
and closed by USIS under the support services contract. The universe was stratified based on the
type of background investigation, the seriousness of issues identified during the background
investigation (moderate or elevated), and the fiscal year (FY) in which the case was closed.
Cases from FYs 2008 and 2009 were combined since those cases were considered of lower risk.
Cases from FYs 2011 and 2012 were also grouped together because there were very few dumped
background investigations in 2012.

The strata were proportionally sampled by FIS based on risk – FYs 2008 and 2009 cases and
suitability cases were sampled at a lower rate because they were viewed to be potentially of less
concern. Cases reviewed for Top Secret clearance eligibility, or involving elevated final case
closing seriousness codes, were sampled at a higher rate. The statistical estimation of the
sampling results was appropriately weighted based on the sampling rates amongst the various
strata.

Based on our review of the statistical sampling methodology used for FIS’s Quality Assessment,
nothing came to our attention to indicate that it was not consistent with principles of statistical
sampling. In addition, we sought the opinion of the U.S. Department of Veterans Affairs (VA)
OIG’s Office of Statistical Operations to further validate our sampling methodology. The VA
OIG determined that the “sampling methodologies selected are appropriate to compute
statistically valid estimates.”

After reviewing FIS’s statistical sampling methodology, we judgmentally selected a sample of
120 of the 1,100 cases reviewed by FIS during its Quality Assessment.3 We reviewed all
available documentation relevant to these cases in order to determine whether we concurred with
FIS’s conclusions regarding the quality of each case in our sample. 



3
  For 110 of the background investigations, we randomly selected 10 percent of the cases from each stratum. We also
judgmentally selected an additional 10 background investigations that were deemed Incomplete, but Acceptable by
the Department of Defense’s Memorandum and Unacceptable in FIS’s Quality Assessment.


FINDINGS:

Based on our analysis of the background investigations we reviewed, we disagree with FIS’s
Quality Assessment results, as described in the Findings below.4

4 Refer to Attachment 3 for further details.

        Improper use of Department of Defense (DoD) Memorandum

             •  FIS’s Quality Assessment methodology included utilizing guidance contained in a
               DoD Memorandum, dated March 10, 2010, to categorize certain cases as Incomplete
               per FIS’s quality standards, but still Acceptable for Adjudication. We raised no
               objection to this approach when FIS initially described the protocols for its Quality
               Assessment, as it provided a means of categorizing potential quality issues by
               severity. However, after analyzing the cases in our sample, we do not agree with
               how FIS applied the DoD Memorandum during its Quality Assessment.
               Specifically, the DoD Memorandum indicates that an explanation should be
               provided in the background investigation report when information is missing or
               incomplete. Our sample included 13 cases which FIS categorized as Incomplete, but
               Acceptable for Adjudication per the DoD Memorandum. We do not concur with
               FIS on any of these cases because no explanation or “Investigator’s Note” was
               provided to explain the missing coverage. Therefore, we believe these cases should
               have been categorized as Unacceptable.

             •  We are also concerned that FIS used the DoD Memorandum as a blanket
               justification for the incomplete background investigations of other independent,
               non-DoD entities when it should have applied only to DoD background
               investigations.

             •  Finally, we observed that the DoD Memorandum was not an agreement between
               DoD and OPM, but rather direction from DoD to its components on whether and
               how to adjudicate background investigation reports that were Incomplete. We
               recognize that categorizing certain cases in FIS’s Quality Assessment as Incomplete,
               but Acceptable for Adjudication has value for FIS when attempting to determine the
               severity of quality issues and whether corrective action is required. However, the
               fact remains that all of the background investigations so categorized failed to meet
               FIS’s established quality standards, and the quality issues in these cases should have
               been identified and corrected during the original closing review process by FIS.

      FIS’s Response:

      “Your memorandum indicated that you had no objection to the approach whereby FIS used
      the March 10, 2010 DoD Memorandum to categorize certain cases as Incomplete but still
      Acceptable for Adjudication. You also indicated that you do not agree with how FIS applied
      the 2010 DoD Memorandum during the Quality Assessment. It is difficult to understand
      why you found that we improperly used the DoD Memorandum, as the methodology for our
   Quality Assessment was developed in coordination with your office, OPM's Office of
   Planning and Policy Analysis (PPA) and the Chief of Staff at OPM. During the period
   February through April of 2014, there were several teleconferences and email exchanges
   among the four parties to discuss the methodology for selecting the sample population of
   investigations for review as well as the criteria for the analysis of these investigations. We
   sought transparency and collaboration prior to the FIS review and provided detailed
   documentation of our review process and methodology to the OIG. We also provided your
   office the 2010 DoD Memorandum and indicated how we used it to provide a defined three-
   tiered metric for assessing the degree to which information was missing from these
   investigations. As stated in your letter, this methodology was generally agreed upon by both
   parties at the time the FIS review commenced. As such, we proceeded with our review using
   this documented and agreed-upon methodology.

   Subsequent to the completion of the FIS review, the OIG requested that FIS provide training
   for selected OIG personnel so that they could begin an independent evaluation of FIS’s
   results. In March 2015, FIS personnel provided two days of high-level training for three OIG
   staff members on the investigative requirements for the case types in the selected sample.
   FIS also provided office space for three to four OIG staff members for the period of
   March 17, 2015 to April 8, 2015, while they conducted the special review of 120
   investigations selected from the FIS sample of 1,100 investigations. During this time, the
   OIG and FIS staff enjoyed a collaborative working relationship and met several times each
   week to discuss specific case scenarios, as well as FIS investigative and operational policies.
   FIS personnel also explained to OIG staff the rationale for using the March 2010 DoD
   Memorandum to categorize the completeness of investigations within the sample.

   Another of your concerns was that FIS used the DoD Memorandum as "blanket justification"
   for incomplete background investigations for non-DoD entities when it should not have been
   applied to these entities. We would like to reiterate that the DoD Memorandum was not used
   as a "blanket justification" for either DoD or non-DoD entities, but as stated above, the
   criteria in the memorandum was used as a standardized gradient measure of the information
   missing from all investigations, regardless of requesting agency. As previously noted by
   both OIG and FIS, this methodology was mutually agreed to at the onset of the sampling.

   Your third point related to this finding is that the DoD Memorandum was not an agreement
   between DoD and OPM, but direction from DoD to its components on how and when to
   adjudicate incomplete investigations. We concur and recognize this fact. We agree that
   cases categorized as incomplete failed to meet OPM quality standards and as a result, our
   assessment included these cases in the approximate 10% of investigations that were not
   closed in accordance with the USIS support contract. However, it is important to note that
   while we do agree that the DoD Memorandum was a directive to its various components
   regarding adjudication of incomplete investigations, the memorandum is just that; guidance
   to the DoD components on how to adjudicate investigations that although technically
   incomplete, are sufficient enough to render determinations in accordance with established
   adjudicative guidelines.”


    OIG’s Reply:

    We acknowledge that we knew it was FIS’s intent to use the March 10, 2010 DoD
    Memorandum during its Quality Assessment. However, that does not mean we were going
    to automatically agree with FIS’s interpretation of how the DoD Memorandum was used in
    the Quality Assessment, without a complete understanding of its use. In addition, in the
    OIG’s memorandum, dated April 7, 2014, we informed FIS that “Once the FIS review is
    complete, we intend to perform a subsequent independent evaluation of FIS's work, and
    therefore request that FIS maintain all relevant documentation and artifacts relevant to its
    review.” Therefore, we are not persuaded by FIS’s argument that prior discussions of FIS’s
    proposed methodology negate our findings regarding how the DoD Memorandum was
    actually used. Once we fully evaluated FIS’ Quality Assessment process, we determined
    that FIS’s methodology regarding the use of the DoD Memorandum was not a proper
    application, because: (1) there were no investigator’s notes for the background investigations
    as required; (2) it was not intended for non-DoD agencies; and, (3) while the DoD
    Memorandum does have value as a “standardized gradient measure of the information
    missing,” the fact remains that the background investigations did not meet FIS’s Quality
    Standards.

    Inaccurate Conclusions on Background Investigations

         •  We identified five background investigations in the sample that FIS deemed
            Complete/Justified where we did not reach the same conclusion. In our opinion, the
            five background investigations were Unacceptable and did not meet FIS’s quality
            standards for background investigations, due to missing law enforcement checks and
            employment records.5

         •  In addition, we concluded one background investigation in the sample met FIS’s
            quality standards and should have been categorized as Acceptable; however, FIS
            concluded it was Unacceptable.

    FIS’s Response:

    “Your review identified six background investigations where you did not agree with the
    conclusions made in our assessment. We agree that in these six cases our findings were
    inaccurate based on OPM's operational guidance. We agree with your assessment that the
    evaluation was complicated by the fact that, in four of the five cases identified with missing
    law coverage, the coverage was not missing in its entirety and was provided in part.”




5
  To clarify, law enforcement checks were not missing in their entirety; instead, in four out of the five cases, several
law enforcement checks were required such as state, city, or military base, and only one of those required checks
was missing. The one remaining case was missing an employment record.


   Lack of Documentation

      •  In three sample cases we reviewed, FIS had identified issues in the previous
         background investigations and was unable to provide them for our review.


                                                FIS’s Quality Assessment deemed these cases as
          Complete/Justified. However, we believe that FIS cannot reach a conclusion on the
          quality of a dumped investigation without having all of the documentation that was
          available at the time the investigation was initially completed, especially when the
          prior background investigation contained derogatory information.

   FIS’s Response:

   “You found that in three investigations there were prior files with issues that were not
    provided as part of your review and that FIS could not reach a conclusion on the quality of
    an investigation without the prior file for review. The FIS review relied on the issue code
    information available for each item in the Personnel Investigations Processing System
    (PIPS) for these prior investigations to reach reasonable conclusions. Using that data for
    these three particular investigations, there was no indication that prior issues persisted into
    the current investigation. In addition, all of the prior investigations were adjudicated
    favorably and the issues in the prior investigations were coded as non-actionable at the time
    the investigations were previously closed. As you noted in your findings, the purpose of
    reviewing prior files is to determine if issues present during a prior investigation could
    impact the current investigation. While in three documented cases FIS was unable to review
    the prior files in their entirely, FIS did meet the intent of the procedure and reviewed the
    prior investigation to determine if any issues that could impact the current investigation
    were present. Since, in each of the three cases, the prior investigations were each favorably
    adjudicated and found to contain no actionable issues, FIS did in fact review the prior files
    to establish no issues were present that would impact the current investigations. Therefore,
    we disagree with this finding.”

   OIG’s Reply:

   We strongly believe it is imperative that FIS obtain the previous background investigation
   files of cases with issues in their entirety, rather than relying on the Personnel Investigations
   Processing System’s issue codes and favorable agency adjudications. In our opinion,
   physically reviewing the previous background investigation is the only way to accurately
   determine if prior issues persist into the current background investigation. Furthermore, a
   previous favorable adjudication does not exempt FIS from following its own policies and
   procedures,


   In addition, we feel that apart from the OPM policy and procedures requirement, not
   reviewing prior background investigations in their entirety leaves OPM susceptible to
   missing key issues and patterns of behavior that could potentially impact current
   background investigations. As a hypothetical example, if the subject of a background
   investigation was found to be a recovering alcoholic with no sign of alcohol abuse or
   treatment during the coverage period, the subject’s background investigation may have been
   favorably adjudicated with few, if any, issue codes. That does not preclude the possibility of
   a later relapse, and if the prior background investigation file was not reviewed during the
   current investigation, the prior history may be overlooked.

CONCLUSION:

We disagree with FIS’s Quality Assessment results because we identified 21 background
investigations (18 percent of our sample of 120) that FIS deemed Acceptable but which we
believe were not in compliance with FIS’s background investigations quality standards. In
addition, we identified one case that met FIS’s quality standards; however, FIS concluded it was
Unacceptable.

It is important to note that we did not attempt to assess the severity of the quality issues in those
background investigations where our conclusions differed from FIS’s because our intent was
only to analyze the validity and objectivity of FIS’s Quality Assessment, and not to make a new
assessment. Additionally, we recognize that the adjudicating agencies that received these
background investigations made individual assessments and final adjudications of these cases
and could have returned the background investigations to OPM, if the adjudicators found the
background investigations were of insufficient quality for adjudication. However, it remains
FIS’s responsibility to provide a complete background investigation to the customer agency.

Finally, we take issue with FIS’s statement that “It does not appear there was any effort on the
part of USIS to intentionally close investigations and not refer those meeting criteria to the
federal staff.” In our opinion, FIS’s Quality Assessment was not designed in a manner that
would allow such a conclusion to be drawn since there was no comparison between the
background investigations that were dumped by USIS and those that were not.

RECOMMENDATION:

We recommend that FIS evaluate the 103,369 dumped background investigations, as follows:

      •  If or when the subjects of those background investigations are submitted for
       reinvestigation, FIS should determine if there was any missing coverage in the dumped
       investigations and, if so, FIS should schedule those missing items as part of the
       reinvestigation.

      •  For those subjects who have already been reinvestigated since the identification of
      USIS’s alleged misconduct, FIS should determine if there was any missing coverage in
      the dumped investigations and, if so, schedule those missing items as soon as possible.


FIS’s Response:

“We do not agree with the draft recommendation to evaluate and potentially reopen 103,369
dumped background investigations as the scale of such a recommendation is not commensurate
with the findings reflected in your draft memorandum. As previously stated, your review
essentially identified only six background investigations where you did not reach the same
conclusion as our review. The primary basis for your disagreement with our assessment is based
on 13 investigations that we categorized as Incomplete but Acceptable for Adjudication that you
concluded should have been rated as Unacceptable, although doing so would have been
inconsistent with the mutually agreed-upon methodology for the assessment. In addition, none of
the quality errors in any of the sampled investigations were significant enough for the
adjudicating agencies to request that the investigations be reopened. The issue at hand is 13
investigations that are missing an Investigator Note to explain the absence of otherwise required
coverage. All of these investigations at issue were adjudicated by the requesting agency without
any requests for corrections or additional work by the requesting agency. An Investigator's Note
does not provide any additional coverage, but serves to document and/or explain why otherwise
required coverage is missing. Therefore, the substantive and adjudicative information within
each of the 13 investigations would remain unchanged.

The re-evaluation of over 103,000 investigations because 13 investigations that we acknowledge
contained quality errors, but in your view were not categorized properly, is not feasible.
Evaluating these investigations to determine the potential for missing investigative coverage that
is unlikely to change an adjudicative outcome would require an excessive number of resources
that would be diverted from FIS's primary and critical function of providing background
investigations in a timely manner to over 95% of the Federal Government.

An alternative recommendation arising from the FIS review and the OIG analysis of that review,
and one that has already been implemented, would be that FIS implement a fully federalized
investigative review process where all investigations receive a complete federal review before
delivery to the customer agency. In addition, it should be noted that FIS did not renew the USIS
fieldwork or USIS support contracts in September 2014.”

OIG’s Reply:

We do not expect and did not recommend that FIS reopen 103,369 background investigations.
We do recommend that FIS perform an evaluation in order to categorize and flag those dumped
investigations due to the risk of quality errors. The recommended categorization will:
1) identify and address those which have already been reinvestigated, and 2) identify and flag
those which have not been reinvestigated yet, so that they receive additional scrutiny when FIS
next has occasion to open an investigation on that subject. This will allow FIS the opportunity to
address any issues and to apply additional scrutiny to these background investigations that may
not have had proper review. We want to ensure that FIS does its due diligence in ensuring
individuals are suitable for the clearances for which they are sponsored.

Further, FIS’s statement that “The issue at hand is 13 investigations ...” disregards the fact that
only a sample of cases was reviewed. These 13 cases represent more than 10 percent of the
sample reviewed, and while this error rate cannot be projected to the full population, it does
provide an indication that a large number of cases may have contained “quality errors.”

In addition, we acknowledge FIS’s intent to implement a fully Federal review process; however,
we do not feel a future implementation retroactively addresses potential discrepancies in those
103,369 background investigations. Therefore, we stand by our initial recommendation.

Please contact me at (202) 606-1200 if you have any questions, or someone from your staff
may wish to contact Michael R. Esser, Assistant Inspector General for Audits, at                      ,
or Michelle Schmitz, Assistant Inspector General for Investigations, at             .

Attachments

cc: 	Chris Canning
     Acting Chief of Staff

   Mark W. Lambert
   Associate Director, Merit System Accountability and Compliance

   Janet L. Barnes
   Director, Internal Oversight and Compliance

                                                                                        ATTACHMENT 1 



                         UNITED STATES OFFICE OF PERSONNEL MANAGEMENT 

                                               Washington, DC 20415 


   Office of the 

Inspector General 

                                                 April 7, 2014

       MEMORANDUM FOR JEFFREY C. FLORA
                      Deputy Associate Director, Quality
                      Federal Investigative Services

       FROM:                    LEWIS F. PARKER, Jr.
                                Deputy Assistant Inspector General for Audits
                                Office of the Inspector General

                                KIMBERLY A. HOWELL
                                Deputy Assistant Inspector General for Investigations
                                Office of the Inspector General

       SUBJECT:                 Review of USIS Dumped Background Investigation Cases

      The purpose of this memorandum is to communicate the Office of the Inspector General's (OIG)
      comments related to the Federal Investigative Service's (FIS) proposed review of background
       investigation cases performed by a contractor, United States Investigations Services (USIS), that
      were allegedly closed without an adequate quality review ("dumped" cases).

       FIS intends to select a sample of dumped cases to be subject to an evaluation by FIS's Quality
      Assurance. While we generally agree with FIS's proposed methodology for this review, we have
      one recommendation related to this process. FIS planned to exclude cases from 2008 and 2009
      from the sample population because individuals investigated in 2008 should have been subject to
       a re-investigation in 2013, and those from 2009 should be re-investigated in 2014. However, FIS
      is unable to determine which specific individuals have, in fact, been re-investigated, so we
      recommend that all cases from 2008 and 2009 be included in the sample universe.

      We request that FIS formally document the details of its final sampling methodology and quality
      review process, and provide this information to the OIG in advance of starting its review. Once
       the FIS review is complete, we intend to perform a subsequent independent evaluation of FIS's
      work, and therefore request that FIS maintain all relevant documentation and artifacts relevant to
      its review.

      Please note that the OIG's support of this current FIS quality review does not indicate that our
      office will not perform future audits, evaluations, or reviews of the USIS dumped cases or the
      FIS background investigation process as a whole.

      Please contact us if you have any questions regarding this memo, or your staff may wish to
      contact                    , Special Agent in Charge on              , or                  ,
      Chief, Information Systems Audits Group, on

cc: 	   Ann Marie Habershaw
        Chief of Staff

        Norbert E. Vint 

        Deputy Inspector General 


        Merton W. Miller 

        Associate Director 

        Federal Investigative Services 


        Michelle Schmitz 

        Assistant Inspector General for Investigations 

        Office of the Inspector General 


        Michael R. Esser 

        Assistant Inspector General for Audits 

        Office of the Inspector General 


                           

        Special Agent in Charge 

        Office of the Inspector General 


                         

        Chief, Information Systems Audits Group 

        Office of the Inspector General 


                        

        Supervisory Case Analyst 

        Federal Investigative Services 


                      

        Manager, Survey Analysis 

        Policy Planning and Analysis 

                                                                                        ATTACHMENT 2 





                               UNITED STATES OFFICE OF PERSONNEL MANAGEMENT 

                                                     Washington, DC 20415 


Federal Investigative
      Services
                                                       April 11, 2014


                 MEMORANDUM FOR 	
                                             Special Agent in Charge
                                             Office of the Inspector General


                 FROM: 	                     JEFFREY C. FLORA 

                                             Deputy Associate Director, Quality

                                             Federal Investigative Services 


                 SUBJECT: 	                  Proposed FIS Review

              Per your request, the purpose of this memorandum is to formally document the details of the
              Federal Investigative Services' (FIS) proposed review of background investigations submitted
              by United States Investigations Services (USIS) that allegedly did not receive a contractually
              required quality review (hereafter referred to as "dumped" cases). The review will cover
              alleged dumped investigations closed by USIS contractor personnel during the period March
              2008 to September 2012, and will be conducted at the FIS office in Ft. Meade, Maryland.

              Objective

              The objective of our review will be to evaluate the overall quality of a sample of dumped
              investigations from the population of upper level case types (i.e., all case types that include
              fieldwork except NACLC/ANACI) closed by USIS contractor personnel during the period
              March 2008 to September 2012. This review is focused on the investigations closed by USIS,
              as the Congressional Committee on Oversight and Government Reform is particularly
              concerned that these dumped investigations were being reviewed by the same company,
              USIS, which allegedly dumped them. Examining the contractor-closed investigations will
              allow us to direct the analysis toward any potential conflict of interest issues that may exist in
              these investigations. These objectives will be met by a review and analysis of a statistically
              valid sample selected from the upper level case type population of dumped investigations
              closed by USIS.

              Sampling Methodology
             The sampling methodology for this review was provided by the Policy Planning and Analysis
              (PPA) staff and is included as an attachment.
Quality Review Process

The selected sample of investigations will be reviewed by investigation case analysts at the Ft.
Meade FIS office. The analysts will review each investigation closed by the USIS contractors
from the Closing Authorization and Support Team (CAST) to determine if the investigations
were closed in accordance with policies and procedures in effect at the time the cases were
closed.

To conduct this review, the analysts will use criteria reflected in the following documentation:
•	
•	    Annex A to Director of Central Intelligence Directive 6/4 -Investigative Standards for
      Background Investigations for Access to Classified Information
•	    DoD Memorandum, "Adjudicating Incomplete Personnel Security Investigations,"
      dated March 10, 2010
•	
•	
•	
•	

The analysts will be reviewing the investigations to determine:

1. If CAST performed the closing action in accordance with the criteria provided in the
applicable operational instructions.

2. If the overall quality of the cases that did not receive the contractually required review by
the field contractor was acceptable.
    • 	 These cases will be evaluated for quality using a three-tiered strategy. Cases will be
        evaluated and placed in one of three categories based on an assessment of each
        investigation:
            1. 	 Complete or Justified:
                     • 	 Complete- Those investigations in which all required
                          leads/investigative elements are obtained in full. There are no gaps in
                          scope (the timeframe requiring coverage of leads) and any issues
                         present are sufficiently resolved. Since all investigative elements are
                         completed in full, no leads or elements contain an explanation for
                         lacking coverage.
                     • 	 Justified - Those investigations in which there are gaps in, or missing
                         required investigative elements. The gaps or missing elements are
                          either: Impossible to obtain (i.e., the lead does not exist and no amount
                         of additional effort would result in obtaining the lead), or reasonably
                         exhaustive efforts were made to fill the gap or obtain the coverage, but
                         the efforts were unsuccessful. The gaps in coverage or missing
                         elements are accompanied by a sufficient explanation which details the
                         efforts made to obtain the element and why those efforts were
                         unsuccessful. Any issues present are sufficiently resolved to the extent
                         possible.
            2. 	 Investigations with incomplete or missing information, but may be adjudicated
                 per established DoD guidance (Reference: DoD Memorandum "Adjudicating
                  Incomplete Personnel Security Investigations," dated March 10, 2010);
            3. 	 Coverage is missing without explanation or issues are not sufficiently resolved
                  (excluding the exceptions noted in 2. above). These investigations will be
                 evaluated against clearance databases to determine if clearances have been
                 granted erroneously or if additional work was performed by the adjudicator.

Please contact me at                 or             at              if you have any questions 

regarding this memorandum. 


Attachment: 

FIS Investigation Audit Sampling Methodology 


cc: 	   Ann Marie Habershaw
        Chief of Staff

        Norbert E. Vint 

        Deputy Inspector General 


        Merton W. Miller 

        Associate Director 

        Federal Investigative Services 


        Michelle Schmitz 

        Assistant Inspector General for Investigations 

        Office of the Inspector General 


        Michael R. Esser 

        Assistant Inspector General for Audits 

        Office of the Inspector General 


        Kimberly A. Howell 

        Deputy Assistant Inspector General for Investigations 

        Office of the Inspector General 


        Lewis F. Parker, Jr. 

        Deputy Assistant Inspector General for Audits

        Office of the Inspector General 


                         

        Chief, Information Systems Audit Group 

        Office of the Inspector General 

 Attachment 1


                          FIS Investigation Audit Sampling Methodology
                                              Survey Analysis 

                                        Planning and Policy Analysis 

                                               April 10, 2014


Background
The purpose of this document is to detail the methodology behind selecting a sample of background
investigations that were allegedly "dumped" by a sub-contractor between March 2008 and September
2012. The first portion of the document describes the methods used to draw a stratified, random
sample of 1,096 NACLC and ANACI investigations for re-review. The second portion of the
document describes the proposed methods to select a stratified, random sample of 1,100 investigations
from a complementary population of investigations (i.e., those not classified as a NACLC or ANACI).

 Sampling Methods
 Round 1: NACLC and ANACI Investigations
 FIS provided the Survey Analysis (SA) group of Planning and Policy Analysis a cleaned sample
frame, or comprehensive list of the 77,333 investigations eligible to be audited. The file contained a
unique investigation identifier, an indication as to whether it was a NACLC or ANACI investigation,
and a closing date of the investigation. SA imported the raw data into the statistical software package
SAS® and grouped the investigations into a mutually exclusive and exhaustive set of groupings, or
strata. Table 1a shows the original set of population counts broken out by investigation type and
calendar year, and Table 1b shows how these were collapsed to form six strata. Table 1b also includes
the stratum sample sizes. Specifically, within each stratum, an independent sample was selected using
the SURVEYSELECT procedure in SAS, which has a built-in randomized algorithm that SA has utilized
for a variety of sampling efforts, such as the Federal Employee Viewpoint Survey (FEVS).
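
To make the mechanics concrete, a stratified draw of this kind can be expressed with PROC
SURVEYSELECT roughly as follows. This is a minimal sketch under assumed names: the data set
FRAME, the stratum variable STRATUM, and the seed are hypothetical placeholders rather than the
actual PPA program, and the per-stratum sizes simply echo Table 1b (listed in sorted stratum order).

   /* Illustrative sketch only, not the PPA program.  Assumes a cleaned sample
      frame data set FRAME with a variable STRATUM identifying the six strata. */
   proc sort data=frame;
      by stratum;                 /* SURVEYSELECT expects the frame sorted by the strata variable */
   run;

   proc surveyselect data=frame out=round1_sample
                     method=srs                         /* simple random sample within each stratum */
                     sampsize=(100 100 18 300 300 278)  /* per-stratum sizes as in Table 1b */
                     seed=20140411;                     /* fixed seed so the draw can be reproduced */
      strata stratum;             /* draw an independent sample from each stratum */
   run;

Because the strata are sampled at different rates, any full-sample estimate computed from the output
must carry stratum weights (the population count divided by the sample count for each stratum), as
discussed following Table 1b.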

 Some of the benefits of a stratified random sampling design are as follows:
   • 	 Enables more control over the types of cases sampled.

    • 	 Increased precision for the overall estimated proportion (i.e., a narrower confidence interval).

    • 	 There is no need to sample from each stratum at a uniform rate. If there are investigations of
        particular analytic interest or of heightened concern that can be pre-identified on the sample
        frame, they can be oversampled relative to other investigations.


The sample allocation shown in Table 1b was developed after deliberating with subject matter experts
in FIS. The more recently completed investigations were sampled at a higher rate relative to those
completed earlier. And there were so few cases from 2012 that it was considered most appropriate to
census those cases.

The key statistic to be estimated from the sample is the percentage (synonymously, a proportion or
rate) of investigations that had the potential for an improper e-adjudication. The difficulty associated
with designing a sample that targets specific precision levels (e.g., a maximum margin of error) for
this kind of statistic is that the precision is a function of the estimated percentage itself, which is not
known for certain until the sample has been drawn and the data collected. Nonetheless, after
working through some "what-if" scenarios and consulting reports from comparable audits conducted
by GAO, the level of precision to be achieved from the design summarized by Table 1b appeared more
than sufficient.
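
The dependence of precision on the unknown rate can be made explicit with the usual planning
formula (shown here for clarity; it is not reproduced from the PPA documentation). For a simple
random sample of size n drawn from a stratum of N investigations, the approximate 95 percent
margin of error for an estimated proportion \(\hat{p}\) is

\[
\text{MOE}_{95\%} \;\approx\; 1.96\,\sqrt{\frac{\hat{p}\,(1-\hat{p})}{n}\left(1-\frac{n}{N}\right)} .
\]

Because \(\hat{p}(1-\hat{p})\) is largest at \(\hat{p}=0.5\), using 0.5 as the planning value in the
"what-if" scenarios yields the most conservative (widest) margin of error.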

Table 1a: Population Counts of NACLC and ANACI Investigations

             Type: NACLC
                          Population
    Year      Count          Percent
    2008     21,124           27.3%
    2009     20,641           26.7%
    2010     12,849           16.6%
    2011     12,124           15.7%
    2012        278            0.4%
 Subtotal    67,016           86.7%

             Type: ANACI
                          Population
    Year      Count          Percent
    2008      2,995            3.9%
    2009      3,064            4.0%
    2010      2,412            3.1%
    2011      1,828            2.4%
    2012         18            0.0%
 Subtotal    10,317           13.3%

    Total    77,333


Table 1b: Stratum Counts and Sample Sizes of NACLC and ANACI Investigations

              Type: NACLC
     Year    Population   Sample   Sample Rate
2008-2009        41,765      300          0.7%
2010-2011        24,973      300          1.2%
     2012           278      278        100.0%
 Subtotal        67,016      878          1.3%

              Type: ANACI
     Year    Population   Sample   Sample Rate
2008-2009         6,059      100          1.7%
2010-2011         4,240      100          2.4%
     2012            18       18        100.0%
 Subtotal        10,317      218          2.1%

    Total        77,333    1,096          1.4%

 Although stratification can achieve efficiencies, it complicates the estimation process. Specifically, to
 account for the disproportionate sample rates, stratum-specific weights must be assigned and utilized
 during any kind of full-sample analysis. For example, after re-reviewing the 1,096 investigations that
 were drawn as part of Round 1 of the audit, it was determined that 6 were improperly flagged for
 potential e-adjudication. The estimated error rate is not simply 6/1,096 = 0.5474%, but a weighted
 average that compensates for the disparate representation of strata in the sample. The estimated error
 rate accounting for the sample design was somewhat higher, 0.8630%.
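
Written out, the design-based estimate is the standard stratified estimator (the formula is included
here for clarity; the stratum-by-stratum counts needed to reproduce the 0.8630% figure are not shown
in this attachment):

\[
\hat{p} \;=\; \sum_{h=1}^{H}\frac{N_h}{N}\,\hat{p}_h
      \;=\; \frac{1}{N}\sum_{h=1}^{H}\frac{N_h}{n_h}\,x_h ,
\]

where, for stratum h, \(N_h\) is the number of investigations in the population, \(n_h\) is the number
sampled, \(x_h\) is the number of sampled investigations flagged for potential improper
e-adjudication, \(N=\sum_h N_h\), and \(N_h/n_h\) is the stratum-specific weight.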

Round 2: Non-NACLC and Non-ANACI Investigation Types
In this section we outline our proposed methods for sampling the complementary investigation types,
those not classified as either a NACLC or an ANACI. As in Round 1, FIS has provided SA a cleaned
sample frame containing a unique investigation identifier and the following variables that are
candidates for the stratification scheme: (1) investigation close date; (2) investigation type; and (3)
seriousness code. Because there were numerous investigation types and case seriousness codes, many
of which were similar in nature, SA consulted with subject matter experts in FIS to dichotomize them
as follows:

Investigation Type:
    1.  Top Secret. These consist of the following case types:

             • 	 SSBIPR
             • 	 PhasedPR
             • 	 SSBI
             • 	 SDI 13-36
             • 	 SGI 37-60
             • 	 SGI0-36
    2. 	 Suitability. These consist of the following case types:

            •	   PRI
            •	   PRIR
            •	   MBI
            •	   LBI
            •	   LDI 13-36
            •	   BI
            •	   BDI 13-36
            •	   PTSBI
            •	   BGI0-36
            •	   RSI

Seriousness Code:
     1.  Moderate. These consist of the following codes:

            • 	 A= There are potentially actionable issues which, standing alone, would not be
                considered disqualifying under security/suitability considerations.
            • 	 B= There are potentially actionable issues which, standing alone, would probably not
                be disqualifying under security/suitability considerations.
            •  G = There are no issues
            • 	 R = There are no actionable issues

    2.  Elevated. These consist of the following codes:

              • 	 E = There are other matters, such as qualifications, medical issues, or inconclusive
                   results, that may affect your determination.
              • 	 W = (This code is no longer used) - This investigation developed issues which,
                   depending on the mission of your organization and/or the duties of the position, you
                  may wish to consider when making the suitability/security determination in this case.

The population counts and proposed sample design for these strata are summarized in Tables 2a and
2b, respectively. The overall sample size proposed (1, 100) is very similar to that from Round 1
(1,096). Also similarly to Round 1, we propose grouping investigations by close date, but with a
somewhat different collapsing routine. Aside from the fiscal year delineation as opposed to calendar
year, we propose grouping the very small number of investigations from 2012 with those from 2011,
as well as those from 2008 with those from 2009. This is because the investigations conducted in
2008 are scheduled for re-investigation in 2013, and those conducted in 2009 are scheduled for a re-
investigation in 2014, both of which will involve a Federal review. Because these are potentially of
less concern, they will be sampled at a lower rate than the more recently completed investigations.
The design also places a greater emphasis on top secret investigation types and those with elevated
seriousness codes.

Table 2a: Population Counts of Non-NACLC and Non-ANACI Investigations

                                   Year: 2008
                                                                    Population
Type                           Seriousness              Count          Percent
Suitability                    Moderate                 2,278            2.2%
Suitability                    Elevated                 3,869            3.7%
Top Secret                     Moderate                 9,009            8.7%
Top Secret                     Elevated                13,911           13.5%
Subtotal                                               29,067           28.1%

                                   Year: 2009
                                                                    Population
Type                           Seriousness              Count          Percent
Suitability                    Moderate                 2,520            2.4%
Suitability                    Elevated                 5,474            5.3%
Top Secret                     Moderate                 9,067            8.8%
Top Secret                     Elevated                21,827           21.1%
Subtotal                                               38,888           37.6%

                                   Year: 2010
                                                                    Population
Type                           Seriousness              Count          Percent
Suitability                    Moderate                 3,126            3.0%
Suitability                    Elevated                 4,924            4.8%
Top Secret                     Moderate                 4,624            4.5%
Top Secret                     Elevated                14,106           13.6%
Subtotal                                               26,780           25.9%

                                   Year: 2011
                                                                    Population
Type                           Seriousness              Count          Percent
Suitability                    Moderate                   234            0.2%
Suitability                    Elevated                    77            0.1%
Top Secret                     Moderate                 4,949            4.8%
Top Secret                     Elevated                 3,250            3.1%
Subtotal                                                8,510            8.2%

                                   Year: 2012
                                                                    Population
Type                           Seriousness              Count          Percent
Suitability                    Moderate                     0            0.0%
Suitability                    Elevated                     0            0.0%
Top Secret                     Moderate                    70            0.1%
Top Secret                     Elevated                    54            0.1%
Subtotal                                                  124            0.1%

Total                                                 103,369



Table 2b: Stratum Counts and Sample Sizes of Non-NACLC and Non-ANACI Investigations

                                  Year: 2008-2009
                                              Population
Type           Seriousness            Count      Percent   Sample   Sample Rate
Suitability    Moderate               4,798         4.6%       50          1.0%
Suitability    Elevated               9,343         9.0%       50          0.5%
Top Secret     Moderate              18,076        17.5%      100          0.6%
Top Secret     Elevated              35,738        34.6%      250          0.7%
Subtotal                             67,955        65.7%      450          0.7%

                                     Year: 2010
                                              Population
Type           Seriousness            Count      Percent   Sample   Sample Rate
Suitability    Moderate               3,126         3.0%       50          1.6%
Suitability    Elevated               4,924         4.8%       50          1.0%
Top Secret     Moderate               4,624         4.5%       75          1.6%
Top Secret     Elevated              14,106        13.6%      225          1.6%
Subtotal                             26,780        25.9%      400          1.5%

                                   Year: 2011-2012
                                              Population
Type           Seriousness            Count      Percent   Sample   Sample Rate
Suitability    Moderate/Elevated        311         0.3%       50         16.1%
Top Secret     Moderate               5,019         4.9%      100          2.0%
Top Secret     Elevated               3,304         3.2%      100          3.0%
Subtotal                              8,634         8.4%      250          2.9%

Total                               103,369                  1,100         1.1%
                                                                                                                                                       ATTACHMENT 3 


                                    OIG QUALITY ASSESSMENT RESULTS
                                  FIS' QUALITY
 OIG                                                   OIG ASSESSMENT
        CASE NUMBER   CASE NAME   ASSESSMENT                                                                          OIG COMMENTS
COUNT                                                      RESULTS
                                    RESULTS

                                   Incomplete but                        The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
  1                                  Acceptable           Unacceptable   assessment results.

                                   Incomplete but                        The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
  2                                  Acceptable           Unacceptable   assessment results.


                                   Incomplete but                        The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
  3                                  Acceptable           Unacceptable   assessment results.

                                                                         • The employment record for the                              was not obtained.
                                                                         • In addition, FIS was unable to provide the previous background investigation for our review.
  4                               Complete/Justified      Unacceptable   This case should be marked as "unacceptable" in FIS' quality assessment results.

                                   Incomplete but                        The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
  5                                  Acceptable           Unacceptable   assessment results.
                                                                         FIS was unable to provide the previous background investigation for our review. We are unable to reach a
                                                                         conclusion without the previous background investigation. This case should be marked as "unacceptable" in
  6                               Complete/Justified      Unacceptable   FIS' quality assessment results.
                                                                         The law check for                Superior Court was not scheduled to obtain coverage. This case should be
  7                               Complete/Justified      Unacceptable   marked as "unacceptable" in FIS' quality assessment results.

                                                                         FIS was unable to provide the previous background investigation for our review. We are unable to reach a
  8                               Complete/Justified      Unacceptable   conclusion without the previous background investigation.

                                                                         The law check for               military base was not scheduled to obtain coverage. This case should be
  9                               Complete/Justified      Unacceptable   marked as "unacceptable" in FIS' quality assessment results.

                                                                         The law check for            Texas military base was not scheduled to obtain coverage. This case should be
  10                              Complete/Justified      Unacceptable   marked as "unacceptable" in FIS' quality assessment results.

                                    Incomplete but                       The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
  11                                  Acceptable          Unacceptable   assessment results.

                                    Incomplete but                       The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
  12                                  Acceptable          Unacceptable   assessment results.
                                                                         FIS was unable to provide the previous background investigation for our review. We are unable to reach a
                                                                         conclusion without the previous background investigation. This case should be marked as "unacceptable" in
  13                              Complete/Justified      Unacceptable   FIS' quality assessment results.


                                    Incomplete but                       The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
  14                                  Acceptable          Unacceptable   assessment results.
      Incomplete but                     The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
15      Acceptable        Unacceptable   assessment results.



      Incomplete but                     The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
16      Acceptable        Unacceptable   assessment results.


      Incomplete but                     The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
17      Acceptable        Unacceptable   assessment results.



      Incomplete but                     The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
18      Acceptable        Unacceptable   assessment results.


                                                We determined the case meets FIS' quality standards; therefore, the case should be marked as "Acceptable" in
 19     Unacceptable        Acceptable          FIS' quality assessment results.

                                                The law check was scheduled for                    and the case was closed with the law check pending.
                                                Therefore, we cannot determine if the law check was favorable or had issues. This case should be marked as
 20   Complete/Justified   Unacceptable         "unacceptable" in FIS' quality assessment results.


      Incomplete but                     The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
21      Acceptable        Unacceptable   assessment results.


      Incomplete but                     The DoD memo was not used properly. This case should be marked as "unacceptable" in FIS' quality
22      Acceptable        Unacceptable   assessment results.
                                                                                                       ATTACHMENT 4 



                                  UNITED STATES OFFICE OF PERSONNEL MANAGEMENT



 Federal Investigative
        Services                                                         July 1, 2015


           MEMORANDUM FOR MICHELLE SCHMITZ
                                            Assistant Inspector General for Investigations

                                            MICHAEL R. ESSER
                                            Assistant Inspector General for Audits

           FROM:                            MERTON W. MILLER
                                            Associate Director
                                            Federal Investigative Services

           SUBJECT:                         OPM Response to the OIG's Special Review of OPM's Quality
                                            Assessment of USIS's Background Investigations
                                            (Report No. 4A-RS-00-15-014)

           Thank you for giving the Federal Investigative Services (FIS) the opportunity to comment on the
           Office of Personnel Management's (OPM) Office of the Inspector General (OIG) draft
           memorandum of findings from the special review of OPM's Quality Assessment of US
           Investigations Services (USIS) background investigations. We are committed to continuing to
           work with you in our efforts to improve the quality of FIS background investigations.

           OIG Finding #1
           Improper Use of Department of Defense (DoD) Memorandum

           Your memorandum indicated that you had no objection to the approach whereby FIS used the
           March 10, 2010 DoD Memorandum to categorize certain cases as Incomplete but still
           Acceptable for Adjudication. You also indicated that you do not agree with how FIS applied the
           2010 DoD Memorandum during the Quality Assessment. It is difficult to understand why you
           found that we improperly used the DoD Memorandum, as the methodology for our Quality
           Assessment was developed in coordination with your office, OPM's Office of Planning and
           Policy Analysis (PPA) and the Chief of Staff at OPM. During the period February through April
           of 2014, there were several teleconferences and email exchanges among the four parties to
           discuss the methodology for selecting the sample population of investigations for review as well
           as the criteria for the analysis of these investigations.¹ We sought transparency and collaboration
           prior to the FIS review and provided detailed documentation of our review process and
           methodology to the OIG. We also provided your office the 2010 DoD Memorandum and
           indicated how we used it to provide a defined three-tiered metric for assessing the degree to
           which information was missing from these investigations. As stated in your letter, this


           ¹ Attachment 1 documents the coordination efforts that occurred prior to FIS's Quality Assessment.





methodology was generally agreed upon by both parties at the time the FIS review commenced.
As such, we proceeded with our review using this documented and agreed-upon methodology.

Subsequent to the completion of the FIS review, the OIG requested that FIS provide training for
selected OIG personnel so that they could begin an independent evaluation of FIS's results. In
March 2015, FIS personnel provided two days of high-level training for three OIG staff members
on the investigative requirements for the case types in the selected sample. FIS also provided
office space for three to four OIG staff members for the period March 17, 2015 to April 8, 2015,
while they conducted the special review of 120 investigations selected from the FIS sample of
1,100 investigations. During this time, the OIG and FIS staff enjoyed a collaborative working
relationship and met several times each week to discuss specific case scenarios, as well as FIS
investigative and operational policies. FIS personnel also explained to OIG staff the rationale for
using the March 2010 DoD Memorandum to categorize the completeness of investigations
within the sample.

Another of your concerns was that FIS used the DoD Memorandum as "blanket justification" for
incomplete background investigations for non-DoD entities when it should not have been applied
to these entities. We would like to reiterate that the DoD Memorandum was not used as a
"blanket justification" for either DoD or non-DoD entities but, as stated above, the criteria in the
memorandum were used as a standardized gradient measure of the information missing from all
investigations, regardless of requesting agency. As previously noted by both OIG and FIS, this
methodology was mutually agreed to at the onset of the sampling.

Your third point related to this finding is that the DoD Memorandum was not an agreement
between DoD and OPM, but direction from DoD to its components on how and when to
adjudicate incomplete investigations. We concur and recognize this fact. We agree that cases
categorized as incomplete failed to meet OPM quality standards and, as a result, our assessment
included these cases in the approximately 10% of investigations that were not closed in
accordance with the USIS support contract. However, it is important to note that while we do
agree that the DoD Memorandum was a directive to its various components regarding
adjudication of incomplete investigations, the memorandum is just that: guidance to the DoD
components on how to adjudicate investigations that, although technically incomplete, are
sufficient to render determinations in accordance with established adjudicative guidelines.

OIG Finding #2
Inaccurate Conclusions on Background Investigations

Your review identified six background investigations where you did not agree with the
conclusions made in our assessment. We agree that in these six cases our findings were
inaccurate based on OPM's operational guidance. We agree with your assessment that the
evaluation was complicated by the fact that, in four of the five cases identified with missing law
coverage, the coverage was not missing in its entirety and was provided in part.


OIG Finding #3
Lack of Documentation

You found that in three investigations there were prior files with issues that were not provided as
part of your review and that FIS could not reach a conclusion on the quality of an investigation
without the prior file for review. The FIS review relied on the issue code information available
for each item in the Personnel Investigations Processing System (PIPS) for these prior
investigations to reach reasonable conclusions. Using that data for these three particular
investigations, there was no indication that prior issues persisted into the current investigation. In
addition, all of the prior investigations were adjudicated favorably and the issues in the prior
investigations were coded as non-actionable at the time the investigations were previously
closed. As you noted in your findings, the purpose of reviewing prior files is to determine if
issues present during a prior investigation could impact the current investigation. While in three
documented cases FIS was unable to review the prior files in their entirety, FIS did meet the
intent of the procedure and reviewed the prior investigation to determine if any issues that could
impact the current investigation were present. Since, in each of the three cases, the prior
investigations were favorably adjudicated and found to contain no actionable issues, FIS did in
fact review the prior files to establish that no issues were present that would impact the current
investigations. Therefore, we disagree with this finding.

OIG Conclusion

We do not agree with the draft recommendation to evaluate and potentially reopen 103,369
dumped background investigations, as the scale of such a recommendation is not commensurate
with the findings reflected in your draft memorandum. As previously stated, your review
essentially identified only six background investigations where you did not reach the same
conclusion as our review. The primary basis for your disagreement with our assessment is the
13 investigations that we categorized as Incomplete but Acceptable for Adjudication that you
concluded should have been rated as Unacceptable, although doing so would have been
inconsistent with the mutually agreed-upon methodology for the assessment. In addition, none of
the quality errors in any of the sampled investigations were significant enough for the
adjudicating agencies to request that the investigations be reopened. The issue at hand is 13
investigations that are missing an Investigator's Note to explain the absence of otherwise
required coverage. All of these investigations at issue were adjudicated by the requesting agency
without any requests for corrections or additional work by the requesting agency. An
Investigator's Note does not provide any additional coverage, but serves to document and/or
explain why otherwise required coverage is missing. Therefore, the substantive and adjudicative
information within each of the 13 investigations would remain unchanged.

Re-evaluating over 103,000 investigations because 13 investigations that we acknowledge
contained quality errors were, in your view, not categorized properly is not feasible.
Evaluating these investigations to determine the potential for missing investigative coverage that
is unlikely to change an adjudicative outcome would require an excessive number of resources
that would be diverted from FIS's primary and critical function of providing background
investigations in a timely manner to over 95% of the Federal Government.


An alternative recommendation arising from the FIS review and the OIG analysis of that review,
and one that has already been implemented, would be that FIS implement a fully federalized
investigative review process where all investigations receive a complete federal review before
delivery to the customer agency. In addition, it should be noted that FIS did not renew the USIS
fieldwork or USIS support contracts in September 2014.


Again, thank you for the opportunity to comment on the draft memorandum. If you have any
questions or want to discuss further, please feel free to contact Jeff Flora at          .



Attachment 1
Documented Joint Efforts between FIS, OIG, PPA and OPM Chief of Staff Prior to FIS Quality
Assessment

February 6, 2014 - Conference call was held to discuss the process for selecting the population
of USIS "dumped" investigations closed by the support contractor for review. Participants
included OPM's Chief of Staff, personnel from OPM's Office of Planning and Policy Analysis
(PPA), and FIS personnel.

February 1_, 2014 - Conference call between FIS and PPA personnel to determine the way
forward on selecting a representative sample size for the population of cases for this review.

March 10, 2014 - FIS sent an email to PPA and attached spreadsheets containing all of the
BI-type investigations dumped by USIS and closed by CAST during the period March 2008 to
September 2012 for the purposes of obtaining a statistically valid sample selection.

March 27, 2014 - FIS, PPA and OIG personnel participated in a teleconference to discuss the
sampling methodology for this review of cases as well as the criteria for the analysis of the cases.

March 31, 2014 - OIG sent an email to OPM-FIS with a list of questions generated as the result
of the March 27th teleconference. Most of the questions were related to the sampling
methodology, but the email also requested the March 2010 DoD Memorandum regarding
adjudicating incomplete investigations.

April 3, 2014 - FIS, in coordination with the OPM Chief of Staff, provided an email response to
the OIG's questions, and included a copy of the March 2010 DoD Memorandum regarding
adjudicating incomplete investigations as well as a document outlining FIS's quality processes.

April 7, 2014 - OIG provided a memorandum to FIS with comments relating to the OPM-FIS
review of the USIS dumped cases. The memorandum indicated the following: "While we
generally agree with FIS's proposed methodology for this review, we have one recommendation
related to this process. FIS planned to exclude cases from 2008-2009 from the sample population
because individuals investigated in 2008 should have been subject to a re-investigation in 2013,
and those from 2009 should be re-investigated in 2014. However, FIS is unable to determine
which specific individuals have, in fact, been re-investigated, so we recommend that all cases
from 2008 and 2009 be included in the sample universe." FIS concurred with this
recommendation and included cases from the 2008-2009 timeframe per the OIG's request.

April 10, 2014 - FIS provided a memorandum to the OIG that formally documented the details
of the FIS review of the USIS dumped background investigations. The documentation included
the sampling methodology, the quality review process and the three-tiered strategy used to
evaluate the sampled investigations.


April 11, 2014 - OIG sent an email to FIS regarding the memorandum, requesting more specific
information regarding the factors that went into the sample-size selection, to include confidence
level, margin of error, precision, etc.

April 21, 2014 - FIS sent an email to OIG containing information provided by PPA that
addressed OIG's questions presented in their April 11th email.

April 23, 2014 - OIG sent an email to FIS indicating: "Thanks for providing this information.
Based on this response and the previous documentation that you provided, we are comfortable
with your plan for conducting this review. Please let me know if you have any questions. At
some point in the near future we would like to meet with you for a status update. Thanks."

April 25, 2014 - FIS personnel commenced the review of the 1,100 investigations jointly
selected for the Quality Assessment.