
Report 2003-004-AMR - Evaluation of EEOC's Performance and Results Reporting

Published by the Equal Employment Opportunity Commission, Office of Inspector General on 2004-01-01.




TABLE OF CONTENTS

EXECUTIVE SUMMARY

1.0      INTRODUCTION
         1.1  Background
         1.2  Purpose of Review
         1.3  Scope and Methodology

2.0      FINDINGS & CONCLUSIONS
         2.1  Organization and Other Practical Aspects
                    2.1.1 Organization and Length
                    2.1.2 Readability, Clarity, and Visual Features
                    2.1.3 Accessibility
                    2.1.4 Conclusions
                    2.1.5 Recommendations

         2.2  Goals and Objectives
                    2.2.1 Extent to Which Stated Goals Are Met
                    2.2.2 Adequacy of Strategic Goals and Objectives
                    2.2.3 Conclusions
                    2.2.4 Recommendations

         2.3  Performance Measures
                    2.3.1 Objectivity, Measurability and Usefulness
                    2.3.2 Performance Measure Targets
                    2.3.3 Conclusions
                    2.3.4 Recommendations

         2.4  Budget Links, Management Challenges, External Factors, Verification and
              Validation of Data, and Program Evaluations
                    2.4.1 Links Between EEOC’s Budget and its GPRA Reporting
                    2.4.2 Management Challenges
                    2.4.3 External Factors
                    2.4.4 Verification and Validation of Data
                    2.4.5 Program Evaluation
                    2.4.6 Conclusions
                    2.4.7 Recommendations

3.0      MATTERS FOR CONSIDERATION, CONTEXT OF RECOMMENDATIONS,
         AND EVALUATION FOLLOW-UP





APPENDICES

Appendix A      Acronym List
Appendix B      Office of Research, Information and Planning Comments and OIG Response


EXHIBITS

Exhibit 1       Useful and Less Useful GPRA Goals and Measures
Exhibit 2       Summary Table of Ratings for the 2004 Plan and 2002 Report–Organization
                and Other Practical Aspects
Exhibit 3       Summary Table of Ratings on Goals and Strategic Objectives
Exhibit 4       Summary Table of Ratings on Performance Measures
Exhibit 5       Objective, Measurable, and Quantifiable Performance Measures
Exhibit 6       Performance Measures That Fail to Meet Standards
Exhibit 7       Summary Table of Ratings on Management Challenges, External Factors,
                Verification and Validity of Data, and Program Evaluations






EXECUTIVE SUMMARY

The Government Performance and Results Act of 1993 (Public Law 103-62), also known as GPRA,
requires federal agencies, including the EEOC, to align program outputs and activities and to
measure outcomes that clearly impact American citizens. GPRA contains three performance
management requirements–Agency Strategic Plans, Annual Performance Plans, and Annual
Performance Reports.

This report is intended as a baseline assessment of the Agency's Fiscal Year 2004 Budget Request
and GPRA Annual Performance Plan and Annual Program Performance Report–FY 2002. The
review’s primary objective is to determine how well EEOC presents GPRA-related information,
including support for key data. We emphasize that this is not an assessment of agency performance,
but an assessment of how performance is reported. We discuss the 2004-2009 Strategic Plan, but
because it is primarily a strategic, and not a reporting, document, this assessment does not focus on
it. Major objectives are to determine:
•       The usefulness, to Congress and other stakeholders, of the Fiscal Year 2004 Budget Request
        and GPRA Annual Performance Plan and the Annual Program Performance Report–FY 2002
•       EEOC’s status in achieving key outcomes and meeting key challenges
•       The adequacy of the goals, objectives, and performance measures contained in the 2004 Plan
        and 2002 Report
•       The quality of the organization and other practical aspects of the 2004 Plan and 2002 Report

Our review found EEOC’s latest Performance Report and Performance Plan contain much
information that is useful in determining:
•      intended performance across the Agency
•      credibility of EEOC’s performance information
•      progress towards meeting Agency goals

However, in each of these areas, we found significant gaps between what EEOC reported and
standards set forth in GPRA reporting guidelines. Major areas for improvement include:
•      presenting information more clearly and providing timely public access to the Performance
       Report and Performance Plan
•      including goals and measures that will provide a fuller picture of intended performance
•      adding adequate descriptions of the methods used to ensure the accuracy and reliability of
       performance data

In March 2002, OIG provided ORIP with a draft report assessing how well EEOC reported GPRA
performance information in the 2000 Annual Report and the 2002 Annual Plan. Some of the
weaknesses in the 2000 Annual Report and the 2002 Annual Plan, noted by OIG in that draft report
(OIG-01-12-AMR, a document intended to promote discussion and improvement in reporting
practices), have been eliminated or reduced significantly.





Improvements include:
•     objectiveness and clarity of performance measures
•     performance measure target setting
•     integrating budget and performance data
•     graphics and other visual features

Major conclusions and recommendations are highlighted below–the full results of this review are
described throughout the report.

CONCLUSIONS ABOUT THE PERFORMANCE REPORT AND PERFORMANCE PLAN

Primary conclusions: organization and other practical aspects

Conclusion 2.1.4a: Overall, the 2002 Annual Performance Report and 2004 Annual Performance
Plan are well organized and presented in a reasonable manner.

Conclusion 2.1.4b: The 2002 Annual Report and 2004 Annual Plan are not made available easily
or in a timely manner to the public.

Primary conclusions: GPRA goals and objectives

Conclusion 2.2.3a: Several major Agency activities, including cross-cutting efforts, lack adequate
outcome-focused goals.

Primary conclusions: GPRA performance measures

Conclusion 2.3.3a: The Agency should continue its efforts to improve the objectiveness and clarity
of performance measures.

Conclusion 2.3.3b: The Agency could significantly improve GPRA reporting by ensuring that each
major goal and program contains sufficient performance measures to make an overall assessment
of progress to goals.

Conclusion 2.3.3d: The Agency could significantly enhance its GPRA reporting by obtaining and
using baseline and benchmark data.

Primary conclusions: management challenges, external factors, verification and validity of data,
and program evaluations

Conclusion 2.4.6a: Linkages between the budget and performance goals exist but need significant
improvement.

Conclusion 2.4.6b: Management challenges and external factors should be clearly identified and
discussed.

Conclusion 2.4.6c: Reported validation and verification activities are not adequate.

Conclusion 2.4.6d: EEOC recognizes that program evaluation reporting and activities need
improvement.

RECOMMENDATIONS AND MATTERS FOR CONSIDERATION
Based on the conclusions, we offer ten recommendations. We have taken into account the new
EEOC strategic plan and ORIP comments on the draft report (see Appendix B). In reviewing
actions taken in response to recommendations, OIG will take into consideration ORIP’s need for
cooperation with other parties, and other time-consuming factors.




Recommendation 2.1A: Improve the presentation of tables and graphics through uniform
presentation standards.

Recommendation 2.1B: Make the Performance Plan and Performance Report accessible and timely
for the public by ensuring that they are on the Agency website within one week after they are
published on paper.

Recommendation 2.2A: Develop goals and appropriate performance measures for the cross-cutting
efforts described in the Strategic Plan.

Recommendation 2.3A: Refine customer satisfaction measures for mediation and education and
training.

Recommendation 2.3B: Include baseline data for new measures, to ensure that proper targets are
selected and that readers understand the targets are reasonable.

Recommendation 2.3C: Improve performance measure targets. Targets should be achievable but
challenging, and should require significant improvement.

Recommendation 2.4A: Show increasing links between the budget and performance goals and
activities in subsequent Performance Reports.

Recommendation 2.4B: Clearly identify all major management challenges and external factors and
include a brief discussion of EEOC’s actions to address the challenges and external factors.

Recommendation 2.4C: Improve Validation and Verification reporting, including providing a clear
description of the methods and procedures in place for limiting significant error, and of any
weaknesses in those procedures, so that the reader can judge whether performance data will be free
of bias and other significant error.

Recommendation 2.4D: Add information about the ongoing and planned program evaluations.

Matters for Your Consideration: Partner with another agency to improve performance reporting.

Additional detail concerning findings, conclusions, and recommendations is provided throughout
this report. Appendix B contains ORIP comments on the draft report and our response to the
comments.






1.0 INTRODUCTION


This section of our report provides background on performance reporting, including the Government
Performance and Results Act of 1993 (Public Law 103-62), commonly known as GPRA. This
section also introduces the issues and terms covered in the report and describes the scope and
methodology of our review.

1.1     BACKGROUND

GPRA is one of several pieces of legislation that affect the way Federal agencies are required to
manage their programs. These laws aim to improve mission performance, management of programs,
and use of resources.

GPRA is intended to focus government decision making, management, and accountability on results
and outcomes. GPRA requires agencies, including EEOC, to link Agency resources, program
outputs and activities and to measure outcomes that clearly impact American citizens. Impact, the
ultimate result of an organization's efforts, is often not entirely within the control of the organization.

GPRA focuses on achievement of mission related results through requirements for planning,
budgeting, assessment, and accountability. These results (outcomes) are what an agency wishes to
accomplish, not agency products and services (outputs) or processes. GPRA contains three
performance management requirements–Agency Strategic Plans, Annual Performance Plans, and
Annual Performance Reports. Together, these three requirements make up the GPRA Performance
Management Cycle.

The Strategic Plan looks ahead six years at what the Agency plans to accomplish. The Performance
Plan links the goals in the Strategic Plan to Agency activities by identifying measures for tracking
Agency performance. The Performance Report demonstrates how well the agency performed in its
efforts to accomplish its goals and meet its performance goals (targets).

In order to meet the outcome-oriented purpose of GPRA, EEOC must define, measure, and report
on key aspects of performance. Performance in any organization can be measured at three
levels–activity, output, and outcome. Impact is a higher-level form of outcome.

GPRA focuses on outcomes. Therefore, organizations that can focus performance measurements
on outcomes and impacts may better achieve GPRA's planning and reporting requirements than
those that do not. Unfortunately, outcomes are often difficult and costly to measure.

Examples of useful and less useful GPRA goals, taken from the cited agencies’ plans, are shown in
Exhibit 1.




                  Exhibit 1. Useful and Less Useful GPRA Goals and Measures

 Agency                       Goal or Measure                       Usefulness as a Performance
                                                                    Indicator
 U.S. Small Business          Jobs Created by SBA Borrowers         Useful: the measure ties directly to the
 Administration                                                     purpose of the 504 loan program*
 U.S. Small Business          Improved access to capital and        Not useful: it is output-based, not
 Administration               credit                                directly linked to whether a business
                                                                    succeeded*
 Agencies with work processes
 similar to EEOC’s
 National Mediation           Mediate to closure 70 railroad        Useful: results oriented, directly ties to
 Board                        and airline mediation cases           agency mission**
 Federal Labor Relations      Use and promote alternative           Not useful: it is activity, not result-
 Authority                    methods of resolving and              based**
                              avoiding disputes and provide
                              services to enhance labor-
                              management relationships
* Source of analysis: Office of Inspector General, U.S. Small Business Administration, May 2001
** Source of analysis: Office of Inspector General, U.S. Equal Employment Opportunity Commission, June 2003

EEOC Performance Planning and Reporting
EEOC began its GPRA activities in 1996. After a one-year planning process, EEOC submitted
its first strategic plan to Congress in 1997 and submitted an updated strategic plan in 2000. A
new strategic plan (for fiscal years 2004-2009) was completed in September 2003. The Strategic
Planning Working Group (SPWG), composed of at least one staff member from each EEOC
office, developed the new strategic plan. The new strategic plan is not a performance-based
document and therefore not within the primary scope of our review. Because the new strategic
plan affects the context for our recommendations, however, we took it into consideration in
formulating them.

The Strategic Planning and Management Controls Division (SPMCD), located within the Office
of Research, Information and Planning (ORIP), is responsible for leading and/or coordinating
several EEOC GPRA activities.






In 2002, OIG provided ORIP with a draft report concerning the 2000 Annual Report and the
2002 Annual Plan. Many of the weaknesses noted by OIG in that report have been eliminated or
reduced significantly. Improvements include:
•      objectiveness and clarity of performance measures
•      performance measure target setting
•      integrating budget and performance data
•      graphics and other visual features

Therefore, this report contains fewer recommendations than the 2002 draft report. As stated in
the 2002 draft report, we understand that implementing some changes may take a considerable
period of time and may require redeployment of resources and/or additional resources, as well as
extensive coordination and cooperation with other EEOC offices. Other changes, such as those
regarding accessibility, require modest resources and could be implemented shortly.

OIG has also conducted related performance work. For example, OIG recommended that EEOC
reevaluate the performance measurement process for the Agency’s Revolving Fund (Performance
Audit of the Education, Technical Assistance and Training Revolving Fund, January 22, 2002).


1.2     PURPOSE OF REVIEW

The Office of Inspector General (OIG) conducted a review of the Agency’s performance
reporting. A baseline assessment of the Agency’s performance reporting can be helpful to
Agency management and others wishing to assess EEOC's progress in:
•       Meeting its mission and goals
•       Meeting management and other goals laid out by OMB and others
•       Meeting management challenges

This review is intended as a general assessment of the Agency's Annual Program Performance
Report–FY 2002 (Performance Report) and Fiscal Year 2004 Budget Request and GPRA Annual
Performance Plan (Performance Plan). It is not a review of the Agency's reporting system
and/or processes, although an understanding of the system and processes is necessary to place
the assessment in the proper context. (In the course of the project, we obtained key information
about the GPRA processes at EEOC.) It also does not assess the adequacy of the budget request
amounts. The review’s primary objective is to determine how well EEOC presents GPRA-
related information, including support for key data. Major objectives are to determine:





•       The usefulness, to Congress and other stakeholders, of the Annual Program Performance
        Report–FY 2002 and the Fiscal Year 2004 Budget Request and GPRA Annual
        Performance Plan
•       EEOC’s status in achieving key outcomes and meeting key challenges
•       The adequacy of the goals, objectives, and performance measures contained in the
        Performance Plan and Performance Report
•       The quality of the organization and other practical aspects of the Performance Plan and
        Performance Report

Because there are no published assessments of EEOC’s performance reporting and planning
documents, this report seeks to establish a baseline on the most critical items in those documents,
including those items cited above. Other Inspectors General have conducted reviews of their
agencies’ GPRA reporting, including evaluation of performance measures.


1.3     SCOPE AND METHODOLOGY


The scope of this review focuses on the adequacy of EEOC's goals, measures, objectives, and
strategies to meet goals and objectives; the availability and reliability of data supporting
performance information; and the quality and availability of communication in the Performance
Plan and Performance Report.

OIG interviewed Agency officials and reviewed pertinent documents, including:

•       EEOC's 2002, 2003, and 2004 Performance Plans
•       EEOC's 2000, 2001, and 2002 Performance Reports
•       EEOC's 2000-2005 Strategic Plan and update materials
•       EEOC's 2004-2009 Strategic Plan
•       Government Performance and Results Act of 1993 (Public Law 103-62)
•       OMB Circular No. A-11, part 2 (and other GPRA reporting guidance)

OIG compared elements of EEOC’s most recent final GPRA documents (the 2004 Performance
Plan and 2002 Performance Report) to various standards. The standards used for evaluating the
Performance Plan and Performance Report were developed from several sources, including
GAO, OMB, and the Mercatus Center (a research institution affiliated with George Mason
University). Standards derived from GAO include those found in GAO/GGD-10.1.20 Guide to
Assessing Agency Annual Performance Plans, 1998. OMB-derived standards are found in OMB
Circular No. A-11, part 2.

This review was conducted in accordance with generally accepted government auditing standards
as published in the Comptroller General’s Government Auditing Standards, 1999 Revision, and
the subsequent amendments through 2003. The review took place from March 2003 through June 2003.


2.0 FINDINGS AND CONCLUSIONS

OVERALL
Our review found EEOC’s latest Performance Report and Performance Plan contain much
information that is useful in determining:
•      intended performance across the Agency
•      credibility of EEOC’s performance information
•      progress towards meeting Agency goals

However, in each of these areas, we found significant gaps between what EEOC reported and
standards set forth in GPRA reporting guidelines. Major areas for improvement include:
•      presenting information more clearly and providing timely public access to the
       Performance Report and Performance Plan
•      including goals and measures that will provide a fuller picture of intended performance
       (the Annual Plan)
•      adding adequate descriptions of the methods used to ensure the accuracy and reliability
       of performance data

ORGANIZATION
This Section provides findings, conclusions, and recommendations regarding EEOC’s Annual
Program Performance Report–FY 2002 and 2004 Budget Request and GPRA Annual Performance
Plan. Section 2.1 presents organization and other practical aspects, Section 2.2 assesses goals and
objectives, Section 2.3 evaluates performance measures, and Section 2.4 covers management
challenges, external factors, verification and validity of data, and program evaluations. The final
subsection of each Section presents conclusions and recommendations.


2.1     ORGANIZATION AND OTHER PRACTICAL ASPECTS

Subsection 2.1.1 assesses the EEOC's Performance Plan and Performance Report on Organization
and Length; Subsection 2.1.2 assesses Readability, Clarity, and Visual Features; Subsection 2.1.3
discusses Accessibility; 2.1.4 presents conclusions; and 2.1.5 contains recommendations.

Organization and other practical aspects of the Performance Plan and Performance Report are
important because they affect readability and the overall impression the reader receives.
Organization and the other characteristics reviewed in this Section reflect criteria used by the
Mercatus Center.

Overall, EEOC receives a “Fair” rating in this category. Many features receive a “Good” rating.
However, accessibility receives a “Poor” rating. Exhibit 2 summarizes the ratings in Organization
and Other Practical Aspects of the Performance Plan and Performance Report.




   Exhibit 2. Summary Table of Ratings for the EEOC 2004 Performance Plan and 2002
              Performance Report–Organization and Other Practical Aspects

                                      C A T E G O R Y
              General       Length  Readability  Clarity  Visual    Accessibility  Overall
              Organization                                Features
 2004 Plan    Good          Good    Good         Good     Fair      Poor           Fair
 2002 Report  Good          Good    Fair         Good     Fair      Poor           Fair
 Notes:       Good = almost always met standard/criteria
              Fair = often met standard/criteria
              Poor = usually did not meet standard/criteria

Several categories, including clarity and visual features, are improved from the 2003 Plan and 2001
Report, while accessibility declined.

2.1.1    Organization and Length

Organization

The 2004 Performance Plan and the 2002 Performance Report are generally well organized. Each
document contains an easy-to-follow table of contents, and the organization of material is logical.
For example, the 2004 Performance Plan addresses major programs and activities that directly
affect the public before discussing executive direction and support. Simple and effective
organization is critical to making each document easily understood.

Length

Both documents rate highly for effective length. The Performance Plan, at 81 pages, is a good
length for the size and scope of Agency activities and the blending of budget request and
performance information. A reader can obtain a high degree of detail, but not get bogged down in
information that is not central to each issue. Similarly, the Performance Report, at 35 pages,
generally presents an appropriate level of detail, with additional information contained in
appendices.

2.1.2    Readability, Clarity, and Visual Features

Readability

The Performance Plan and Performance Report are, for the most part, highly readable. For example,
each document contains adequate white space to enable the reader to move quickly through each
page. In addition, exhibits are usually presented on the same page as the accompanying text.


However, both the Performance Plan and Performance Report fail to place the Strategic Objective
text on the same page as the measures to support the objective. This means that the reader (unless
she memorizes the Strategic Objective) must go back a page or more to understand how each
measure relates to the objective.

Clarity

The 2002 Performance Report and 2004 Performance Plan each attained good results for clarity.
The Performance Plan and Performance Report contain generally clear and declarative statements.
Each document is also relatively free of jargon that would make it hard for a non-employment law
expert to understand. For non-specialists to quickly understand the contents of any agency’s
Performance Reports and Performance Plans, the documents must use clear language.

Visual Features

Some visual features of each document are effective, such as the use of bullet points to emphasize
key information. However, the visual features of the Performance Report and Performance Plan
also contain several characteristics that detract from their usefulness.

One visual feature deficiency is that some tables are interrupted with blank space, making it difficult
to determine if some data may be missing. A distracting feature is that the words “employee,”
“employees,” “employer,” and “employers” are partially underlined in the Performance Report.

Tables and graphs are used moderately well in the Performance Plan and Performance Report. For
example, Agency workload data is clearly presented in table format. Most tables and graphs in the
report are consistent in appearance, and many of the inconsistent and distracting graphics present
in previous reports were eliminated. However, the remaining tables vary considerably in
appearance, causing the reader to pay attention to the form rather than the substance, making the
information more difficult to comprehend.

2.1.3   Accessibility

Accountability to the public requires readily available Performance Reports and Performance Plans.
Unfortunately, the 2004 Performance Plan was not made available on the public website until more
than six months after issuance, and the 2002 Performance Report was not made available on the
website until more than three months after issuance. This is a major weakness, given that these
documents can provide stakeholders and the general public with vital information about past
performance and plans for improving performance. OMB states that following transmittal to
Congress, both the Performance Report and Performance Plan should be made quickly and readily
available, in electronic format, to the public.





2.1.4   Conclusions

Conclusion 2.1.4a     Overall, the 2002 Performance Report and 2004 Performance Plan are well
organized and presented in a reasonable manner. Tables and graphs could be improved with
consistent standards.

Conclusion 2.1.4b Neither performance document was made available to the public in a timely
manner via the Agency website. Making these documents readily available and easy to find (within
three mouse clicks) would remedy the problem.

2.1.5 Recommendations

OIG recommends the Director, Office of Research, Information and Planning, ensure that:

2.1A The Strategic Planning and Management Controls Division improve the presentation of tables
and graphics by ensuring uniform presentation standards.

2.1B The Office of Research, Information and Planning make the Performance Plan and
Performance Report accessible and timely for the public by ensuring that they are on the Agency
website within one week after they are published on paper.


2.2      GOALS AND OBJECTIVES

This section assesses the goals and objectives in the Performance Report and Performance Plan.
Subsection 2.2.1 assesses the extent to which EEOC reporting documents meet the stated major
goals and objectives; Subsection 2.2.2 assesses several measures of adequacy of selected goals and
objectives; Subsection 2.2.3 presents conclusions; and Subsection 2.2.4 contains recommendations.

The Agency’s 2004 Performance Plan and 2002 Performance Report received ratings of fair because
some goals and objectives are either not objective or not focused on end results and outcomes.
Strategic goals and objectives are important because they link the mission of an agency and the
measures that assess progress in achieving the mission. Exhibit 3 summarizes these results for each
key criterion.





         Exhibit 3. Summary of Ratings for Strategic Goals and Strategic Objectives

                                      C A T E G O R Y
              Stated       Key Goal  Strategic Objectives      Goals Reflect          Overall
              Goals Met    Areas     Include Outcomes          Activities Supporting  Rating
                           Included  Reflecting the Strategic  Cross-cutting Program
                                     Goals & Mission
 2004 Plan    Not          Fair      Fair                      Fair                   Fair
              applicable
 2002 Report  Fair         Fair      Fair                      Fair                   Fair
 Notes:   Good = almost always met standard/criteria
          Fair = often met standard/criteria
          Poor = usually did not meet standard/criteria

The 2004-2009 Strategic Plan presents several strategic goals and objectives that differ from those
in the 2004 Annual Plan. If these are adequately incorporated into EEOC reporting documents, this
area could improve.

2.2.1    Extent to which Stated Goals Are Met

The 2002 Performance Report receives a rating of “Fair” because the objectives and measures often
support the goals; however, they do not adequately cover the breadth of the goals, preventing a full
assessment of whether the goals were met.

For example, Strategic Goal 2 is to “Promote equal opportunity in employment.” Of the nine
measures supporting the two Strategic Objectives under Strategic Goal 2, eight are activity based
and/or vague, including:
•      measure 2.1.1–The number of consultations with employer stakeholders on operational and
       legal issues (activity based)
•      measure 2.2.4–Provide EEOC informational material to federal sector employees and
       employee groups (activity based and vague)

The reliance on numbers of events, persons, or activities does not provide for assessment of the
effectiveness of the events and activities. Therefore, the nine measures, while presenting some
useful information, are of limited use.





2.2.2   Adequacy of Strategic Goals and Objectives

The 2004 Performance Plan received a rating of “Fair” in adequacy of strategic goals and strategic
objectives. While each of the strategic goals is adequate, and some strategic objectives are focused
on end results and outcomes, a significant number of objectives are not.

Key Strategic Areas Included in GPRA Reporting

The three major strategic goals relate closely to the Agency’s mission by emphasizing enforcement,
outreach, and management outcomes. For the 2004 Performance Plan, the Agency retained each
of its three Strategic Goals from the 2003 Performance Plan. All strategic objectives were also
retained. This continuity allows the Agency to maintain a history of performance against established
goals.

Objectives Include Outcomes Reflecting the Strategic Goals and Mission

The Agency's strategic objectives are important because they link the broad strategic goals to the
particular performance measures. Some of the Agency's Strategic Objectives are outcome-based
and cover critical Agency efforts, making the objectives highly useful. However, many of the
strategic objectives are not effective because they are process-oriented, and do not link to mission-
related outcomes. All Strategic Objectives are evaluated below.

Strategic Objective 1.1

 Improve the effectiveness of the private sector enforcement program, including the use of
 charge prioritization, mediation and, litigation.



This strategic objective is outcome oriented, focusing on improved effectiveness of the private
sector enforcement program. This strategic objective supports quantitative performance measures
concerning the business of enforcement.

Strategic Objective 1.2

  Enhance the effectiveness of the federal sector program by utilizing a comprehensive
  enforcement strategy.


This objective is also outcome oriented and centered on improved effectiveness, promoting the
creation of business-like quantitative performance measures.





Strategic Objective 1.3
 Strengthen partnerships with State and Local Fair Employment Practices Agencies (FEPAs)
 and Native American Tribal Employment Rights Organizations (TEROs) to enhance effective
 implementation of laws addressing employment discrimination.


This Strategic Objective is process-oriented (how something is performed), not outcome (bottom
line) oriented. The cited objective is to strengthen partnerships with various state and local
organizations in order “to enhance effective implementation of laws addressing employment
discrimination.” A more pertinent objective would capture an outcome directly related to effective
implementation of employment discrimination laws. In this way, the means (partnerships) are
secondary to the end (implementation of laws) which can then be measured.

Strategic Objective 2.1
  Encourage and facilitate voluntary compliance with equal employment opportunity laws
  among employers and employer groups in the private and federal sectors.


This strategic objective is process-oriented, not outcome-oriented, and sets the stage for output,
rather than outcome, performance measures (e.g., number of classes taught or posters posted, rather
than measurable changes in the policies or behaviors of employers and employer groups). Given the
Agency's emphasis on promoting equality of opportunity in the workplace and enforcing federal
laws prohibiting employment discrimination, this strategic objective would be improved by adding
focus on the effectiveness of Agency actions, not the amount or type of actions taken.

Strategic Objective 2.2
 Increase knowledge about individual rights under equal employment opportunity laws among
 the public and employee groups.


This objective is process-oriented, not outcome-oriented, and sets the stage for output, rather than
outcome, performance measures (e.g., number of consultations or publications, rather than
measurable changes in the education level among the public and employee groups). This strategic
objective places emphasis on the means by which EEOC attempts to influence equal opportunity
results.
Clearly, the Agency needs to continue measuring numbers of meetings and publications, as these
outputs are necessary to achieve key outcomes. However, including these numbers in GPRA
reporting is only moderately useful.





Strategic Objective 3.1
  Enhance staff capabilities and substantive knowledge to improve work processes and job
  functions through training, partnership, team-based approaches, and customer-based
  principles.


This Strategic Objective is outcome oriented and captures several key workforce targets. However,
the Agency does not include any other human capital strategic objectives in the 2004 Performance
Plan. This objective does not take into account two areas that should have performance goals:
retention of employees and appraisals linked to program performance.

Human capital is a key issue for federal agencies.

        “Attention to strategic human capital management is important because building agency
        employees' skills, knowledge, and individual performance must be a cornerstone of any
        serious effort to maximize the performance and ensure the accountability of the federal
        government. GPRA, with its explicit focus on program results, can serve as a tool for
        examining the programmatic implications of an agency's strategic human capital
        management challenges.” (From the U.S. General Accounting Office, GAO-01-872T)

Strategic Objective 3.2

 Provide policy direction and guidance to achieve all Strategic Goals.


This Strategic Objective possesses limited usefulness. Its strength is recognizing the importance of
policy in accomplishing strategic goals. However, the objective is means-oriented and vague,
greatly limiting its usefulness. For example, it appears very difficult to produce measures that can
be quantified and that have established baselines and meaningful indicators.

Strategic Objective 3.3
  Instill a knowledge base by attaining and maintaining a robust technological competency
  and through research, analysis and evaluation of organizational components, procedures
  and processes.


This Strategic Objective seeks to address technology acquisition and simplifying directives, two
issues that significantly affect accomplishment of the Agency mission. Because the Strategic
Objective describes internal activities related to the Agency's knowledge infrastructure, means-
oriented performance measures are likely to be, and are, used to assess progress. Means are the
resources used to accomplish something. In this case, the objective is adequate because the nature
of infrastructure lends itself to measures that are means toward accomplishing mission-related
goals, rather than outcomes that determine success in accomplishing those goals.

However, the Strategic Objective is quite broad, encompassing technology and evaluation of various
organizational characteristics. Technology acquisition and implementation does not appear to
overlap significantly with streamlining directives. This makes a combined assessment of progress
difficult and not as useful as separate assessments.

Cross-cutting Issues

The Performance Plan should identify goals reflecting critical aspects of cross-cutting programs.
This means that for programs where other federal agencies are involved, it is helpful for agencies'
Performance Plans to show coordination. Efforts noted in an Agency's Strategic Plan are often
logical choices for inclusion in the Performance Plan. EEOC's 2000-2005 Strategic Plan identifies
several such efforts. The 2004 Performance Plan contains some goals and corresponding measures
and targets for these cross-cutting efforts. Cross-cutting efforts that are included in the 2004
Performance Plan include several performance measures under Strategic Objective 1.2. For
example, Performance measure 1.2.4 relates to collecting and verifying EEO data from other federal
agencies.

However, several significant cross-cutting efforts are not included. The Performance Plan does not
include measures concerning the cross-cutting efforts with the Department of Justice, including
coordination with the Department of Justice on cases involving allegations of discrimination against
state and local government employees. Justice is responsible for litigating these types of cases.
Similarly, EEOC and the Department of Labor issued memoranda in 1999 enabling both agencies
to improve the enforcement of laws prohibiting workplace bias, particularly laws against
employment discrimination in compensation. These memoranda establish better coordination and
communications between the two agencies and provide for training to increase staff awareness of
potential compensation discrimination cases. The 2004-2009 Strategic Plan includes several cross-
cutting efforts as highly important in achieving prevention of employment discrimination.

2.2.3   Conclusions

Conclusion 2.2.3a   Several major Agency activities, including cross-cutting efforts, lack adequate
outcome-focused goals.

2.2.4   Recommendations

OIG recommends the Director, Office of Research, Information and Planning, ensure that:

2.2A The SPMCD continue to work with Office of Field Programs, Office of Federal Operations,
and other offices as appropriate to establish goals and appropriate performance measures for cross-
cutting efforts. It may be necessary to coordinate with the other agencies. EEOC could work with

the other agencies to develop a consistent set of performance measures that address these efforts.
EEOC should strive to identify outcome-focused measures that track the effects of the cooperation.


2.3      PERFORMANCE MEASURES

Subsection 2.3.1 assesses the extent to which the measures are objective, quantifiable, and useful;
2.3.2 assesses the adequacy of measures’ targets; Subsection 2.3.3 presents conclusions; and 2.3.4
presents recommendations. Overall, the content of the performance measures earns a rating of
“Fair.” Exhibit 4 summarizes the results.

                   Exhibit 4. Summary of Ratings for Performance Measures

                                       C A T E G O R Y
                 Objective Measurable &            Useful/          Targets      Overall Rating
                           Quantifiable            Appropriate      Adequate
2004 Plan        Fair           Fair               Fair             Fair         Fair
2002 Report      Fair           Fair               Fair             Fair         Fair
Notes:          Good = almost always met standard/criteria
                Fair = often met standard/criteria
                Poor = usually did not meet standard/criteria

2.3.1    Degree to Which Performance Measures Are Objective, Measurable and Quantifiable,
         and Useful

Objective, Measurable and Quantifiable
Generally, the performance measures in the Performance Report and Performance Plan are
objective, measurable, and quantifiable. Examples of measures meeting standards in these areas are
shown in Exhibit 5. The analysis focuses mostly on performance measures contained in the 2004
Plan because many of the measures in the 2002 report are no longer in use.





         Exhibit 5. Objective, Measurable, and Quantifiable Performance Measures

 Measure 1.1.2 (2004 Performance Plan): Percent of sampled district office charge files with
 information supporting the categorization of the charges as “A,” “B,” or “C”
 How it meets standards:
 • what should be observed, time frame, and population are specified
 • a specific target level or level of improvement is cited

 Measure 3.2 (2004 Performance Plan): Improve the efficiency of paying commercial vendors
 How it meets standards:
 • what should be observed, time frame, and population are specified
 • a specific target level or level of improvement is cited

However, in several instances, the measures are not clear. For example, Measures 1.3.2 and 3.1.1
in the 2004 Performance Plan are vague. Exhibit 6 details several instances of other inadequate
performance measures.

                Exhibit 6. Performance Measures That Fail to Meet Standards

 Measure 1.2.5 (2004 Performance Plan): Comprehensive review and assessment of federal
 agencies’ EEO and affirmative employment programs.
 Why the measure does not meet standards:
 • not outcome focused (conducting a review is an activity)
 • critical terms are not objective (i.e., what is a comprehensive review and assessment?)

 Measure 3.–5 (2004 Performance Plan): Percent of all employees who pass that are required to take
 a core curriculum course using distance learning employee development center (e-learning)
 Why the measure does not meet standards:
 • not outcome focused (completing a course is useful only if it leads to more effective employee
   performance)
 • difficult to understand

 Measure 2.–2 (2004 Performance Plan): Target outreach activities based on a FY2003 survey of
 respondents in order to increase the respondent mediation participation rate.
 Why the measure does not meet standards:
 • critical terms are not objective (i.e., what is targeting?)
 • the measure does not match its objective (i.e., the purpose of the measure is to quantify the
   increase in participation, not target activities)





  Usefulness

  Many of the performance measures contained in the Performance Plan and Performance Report
  contribute substantially to determining if the Agency is making progress towards performance
  goals. Performance Measures 1.1.1 (“Percent of private sector charges resolved within 180 days”)
  and 1.1.3 (“Number and percentage of class cases successfully litigated”) in the 2004 Performance
  Plan tie directly to Strategic Objective 1.1 and Strategic Goal 1.

  Financial performance for the Agency is a key issue, and is included in the 2002 Performance
  Report in the form of several performance measures (e.g., Performance Measure 3.3.2).
  Unfortunately, a key financial measure of the Revolving Fund, present in the 2002 Report, is not
  included in the 2004 Performance Plan, contrary to the recommendation in OIG report
  #01-07-APO, “OIG
  Performance Audit of the Education, Technical Assistance and Training Revolving Fund.” When
  cost is a key issue, it should be identified and assigned performance measure(s).

  EEOC has also made progress in its reporting by eliminating many performance measures that
  simply measured activity levels and therefore were not particularly useful. For example, the
  performance measures aimed at increasing or simply recording the number of educational and
  outreach activities no longer exist. The 2002 Performance Report, in its narrative, duly notes some
  of these numbers so that readers can understand the scale of the activities. Performance measures
  that correspond with educational and outreach activities are now more closely geared to outcomes.

  Several performance measures contained in the 2004 Performance Plan and 2002 Performance
  Report do not adequately indicate progress towards the Strategic Goals or Strategic Objectives.
  For example, Performance Measure 3.–.4 in the 2004 Performance Plan (“Employee
  performance plans linked to Agency strategic goals”) is an internal measure that, at best, is only
  vaguely related to how effectively the Agency performs its services (e.g., processing
  discrimination complaints or educating employers) and therefore does not indicate progress
  towards the strategic goal of Enhancing Agency Effectiveness.

  The 2004-2009 Strategic Plan includes a limited number of performance measures and states that
  the plan will be used to establish annual performance measures for inclusion in the performance
  reporting documents. Many of the measures in the 2004-2009 Strategic Plan are objective,
  measurable, and quantifiable. Not enough information is provided to determine if the targets are
  adequate.

  2.3.2   Performance Measure Targets

  The Agency set many useful targets in the 2004 Performance Plan. However, some of the targets
  are not appropriate or not enough information is given to determine why particular targets were
  chosen.





  Appropriateness of Targets

  The 2004 Performance Plan and the 2002 Performance Report show that the Agency often sets
  adequate targets, an improvement over early GPRA reporting. For example, the target for measure
  1.2.3 in the 2004 Performance Plan doubles the percentage (from 20 to 40) of EEO appeals cases
  from other federal agencies that are resolved within 180 days. Targets, also known as
  performance goals, are a desired level of performance expressed as a tangible, measurable
  objective, against which actual performance can be compared.

  EEOC has improved target setting, but weaknesses remain. EEOC no longer meets or exceeds
  every target, as was the case in the 2002 Performance Report. This indicates that EEOC is setting
  more challenging targets, a good sign. However, some targets are still inadequate. For example,
  the proposed 2003 and 2004 targets for “Percent of private sector charges resolved within 180
  days” (Performance Measure 1.1.1) are 60 percent. EEOC achieved 64 percent and 65 percent in
  2001 and 2002, respectively. Setting a goal lower than the current level of performance (or at an
  extremely modest level) does not encourage improved performance and indicates that a lower
  achievement level is acceptable.

  Most targets are clear, including the proposed 2003 and 2004 measures under Strategic Goal 3,
  "Enhance Agency Effectiveness," contained in the 2004 Performance Plan, but several are not,
  rendering the data of limited use. One example is "Pilot comprehensive review and assessment of
  6 federal agencies." (It is not clear what would constitute success for this target: initiating the
  reviews, completing the reviews, or some other level of performance.)

  Baseline and Benchmark Data

  The performance measures in EEOC's Performance Report sometimes do not include adequate
  baseline and benchmarking data to allow a firm understanding of progress. GPRA reporting on
  performance measures should include baseline data to allow for a view of progress. For example,
  Performance Measure 2.–3. in the 2004 Performance Plan does not provide information about the
  extent to which individuals currently believe training is useful. Without an indication of the
  baseline, a target of 50 percent can appear unambitious or arbitrarily chosen.

  Benchmark data from other organizations could be helpful in setting targets and assessing results.
  For example, if the benchmark organization (i.e., a comparable government organization with
  comparable systems) strives for 90 percent, EEOC would want to reconsider 50 percent as a target.
  EEOC does not regularly obtain, use, and report baseline and benchmark data when developing
  performance measure targets.
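
  The target-adequacy reasoning discussed above can be illustrated with a brief sketch. The code
  below is purely illustrative and is not an EEOC system or an OIG tool; the function name, the
  benchmark parameter, and any values other than the 60, 64, and 65 percent figures cited in this
  section are assumptions made for the example only.

        # Illustrative sketch: flag a proposed performance target that does not
        # exceed recent actual results or that falls below a benchmark level.
        # All names are hypothetical; only the 60/64/65 percent figures come
        # from the discussion of measure 1.1.1 above.

        def assess_target(measure, target, recent_actuals, benchmark=None):
            """Return warnings about the adequacy of a proposed target."""
            warnings = []
            if recent_actuals and target <= max(recent_actuals):
                warnings.append(
                    f"{measure}: target {target}% does not exceed the best recent "
                    f"actual result ({max(recent_actuals)}%)."
                )
            if benchmark is not None and target < benchmark:
                warnings.append(
                    f"{measure}: target {target}% is below the benchmark level "
                    f"({benchmark}%)."
                )
            return warnings

        # Example using the figures discussed above: a 60 percent target when
        # 64 and 65 percent were achieved in 2001 and 2002.
        for warning in assess_target("Measure 1.1.1", 60, [64, 65]):
            print(warning)

  A check of this kind would flag the 60 percent target for measure 1.1.1 because it sits below the
  64 and 65 percent results already achieved, which is the weakness described above.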





  2.3.3 Conclusions

  Conclusion 2.3.3a      While most performance measures in the Performance Plan and Report are
  objective and measurable, the Agency should continue its efforts to improve the objectivity and
  clarity of the measures. Thorough review of the wording of measures and development of new
  measures would result in improved understanding by decision makers and those working to
  achieve and accurately measure success in meeting goals. Measures identified in this report and
  other sources as “needing improvement” are logical measures to address. The performance
  measures included in the 2004-2009 Strategic Plan are a good beginning towards overall
  improvement in performance measures.

  Obtaining knowledge from experts outside the Agency and gathering input from those in the
  Agency with expertise and resources for developing, reviewing, and refining performance
  measures could help ensure continued improvement in performance measure development.

  Conclusion 2.3.3b     The Agency could significantly improve performance reporting by ensuring
  that each major goal and program contains sufficient performance measures to make an overall
  assessment of progress toward goals.

  Conclusion 2.3.3c     The Performance Plan and Performance Report often lack the baseline or
  benchmark information necessary to determine the adequacy of performance measure targets and
  progress towards a goal.

  Conclusion 2.3.3d     The Agency could significantly enhance its performance reporting by
  obtaining and using baseline and benchmark data when developing performance measure targets.

  2.3.4   Recommendations

  OIG recommends the Director, Office of Research, Information and Planning ensure that:

  2.3A The Strategic Planning and Management Controls Division work with the Office of Field
  Programs and other offices as appropriate to improve measurement of customer satisfaction for
  mediation and education and training. The EEOC, particularly in the 2004-2009 Strategic Plan,
  has recognized customer satisfaction as a vital component of assessing progress towards goals.
  Surveying best practices in assessing customer satisfaction (private and public sector) should
  provide helpful insights for developing new, and improving existing, measures.

  2.3B The Strategic Planning and Management Controls Division work with the Office of Field
  Programs and other offices as appropriate to include baseline data for new measures to ensure that
  proper targets are selected, as well as to ensure that readers understand the reasonableness of
  targets.

  2.3C The Strategic Planning and Management Controls Division coordinate with the Office of
  Field Programs and other offices as appropriate to continue improving performance measure
  targets. Targets should be achievable, but challenging, and require significant improvements.
  Establishing targets that are not challenging or not based on existing data does not encourage
  maximum effort from Agency employees.


  2.4     BUDGET LINKS, MANAGEMENT CHALLENGES, EXTERNAL FACTORS,
          VERIFICATION AND VALIDATION OF DATA, AND PROGRAM
          EVALUATIONS

  Section 2.4.1 covers the connection between the Agency’s budget and its GPRA reporting. Section
  2.4.2 addresses Management Challenges, Section 2.4.3 assesses External Factors, Section 2.4.4
  discusses Verification and Validation of Data (V&V), Section 2.4.5 assesses Program Evaluation,
  conclusions are presented in Section 2.4.6, and recommendations are presented in Section 2.4.7.

  The Performance Report and Performance Plan should contain information concerning
  evaluations, management challenges and external factors affecting progress, and verification
  and validity of data. The 2004 Performance Plan and 2002 Performance Report provided a
  substantial amount of useful information in these categories and each receives an overall rating
  of "Fair." However, in some areas, there are major gaps in the type and quantity of information
  provided. The Agency receives a "Good" rating for Management Challenges and a "Poor/Fair"
  or "Fair" rating in each of the other three categories. Exhibit 7 summarizes the ratings.

    Exhibit 7. Summary of EEOC Ratings for Management Challenges, External Factors,
                 Verification and Validity of Data, and Program Evaluation

                                       C A T E G O R Y
                 Budget and     Management    External     Verification    Program       Overall
                 Reporting      Challenges    Factors      and Validity    Evaluation    Rating
                 Links                                     of Data
  2004 Plan      Fair           Good          Poor/Fair    Poor/Fair       Fair          Fair
  2002 Report    Not Applicable Good          Poor/Fair    Poor/Fair       Poor/Fair     Fair

  Notes: Good = almost always met standard/criteria
         Fair = often met standard/criteria
         Poor = usually did not meet standard/criteria

  2.4.1 Links Between EEOC’s Budget and its GPRA Reporting

  EEOC has begun to establish necessary linkages between its budget and its performance.
  However, the linkage lacks several useful details. The 2004 Performance Plan contains a
  modest amount of linkage. For example, budget and GPRA performance information is
  supplied for the category “Comprehensive Enforcement Program.” However, the amount of
  funding for a set of activities or performance measures within the category (i.e., “Prevention of
  Employment Discrimination”) is not provided.

  OMB guidance states that a performance plan should display “the amount of funding being
  applied to achieve the performance goals and indicators for that activity.” Annual
  Performance Plans should link directly to an agency’s budget. Such linkage shows the results
  an agency intends to achieve with the funds requested. According to the Director, SPMCD,
  the new and planned systems, including the Integrated Mission System and the Integrated Financial
  Management System, will provide data that will be helpful in this regard. For example, the
  IFMS should provide improved cost allocation data, a feature useful in linking budget with
  performance.

  2.4.2 Management Challenges

  Mission critical management challenges and program risks must be addressed in order to help
  improve the performance of any agency. An agency’s performance plan should set
  performance goals concerning major management problems. The 2004 Performance Plan
  received a “Good” rating in this category because several management challenges are
  discussed in detail.

  Most of the top management challenges (as identified in the OIG’s Inspector General
  Assessment of Management Challenges, November 2003) facing EEOC are addressed in the
  Plan:

  •       workforce repositioning
  •       strategic management of human capital
  •       financial performance
  •       budget/performance integration
  •       competitive sourcing
  •       E-government

  For example, several human capital issues are addressed in the plan, including linking strategic
  goals to employee performance and improving employee knowledge and skills. The 2004
  Performance Plan is an improvement over previous Performance Plans, because human capital
  issues are addressed in greater detail and the Information Resources Management Strategic
  Plan uses timeliness as a factor in accomplishment of key objectives.

  However, identifying the specific management challenges is difficult, because many of the
  challenges are implied in the “Program Highlights” sections that follow presentation of each
  strategic objective. The 2004 Performance Plan does not contain an identifiable section for
  Management Challenges and External Factors.





  2.4.3 External Factors

  The 2004 Performance Plan received a “Fair” rating in this category because several external
  factors are noted and discussed in moderate detail. EEOC has made significant progress from
  previous plans by incorporating additional external factors into its analysis. Review of external
  factors (e.g., demographic changes and public opinion) assists decision makers in assessing the
  likelihood of achieving goals and the actions needed to attain them.

  The introduction to the 2004 Performance Plan notes an increase in workload and complexity
  caused by passage of the Americans with Disabilities Act in 1990 and by the aging workforce. The
  Performance Plan also clearly explains how the Agency is managing, and plans to continue to
  manage and mitigate, these factors. Several other external factors, such as demographic trends
  and technological advancements, are also mentioned. The 2004-2009 Strategic Plan notes
  several additional external factors.

  2.4.4   Verification and Validation of Data

  The Performance Plan and Report describe verification and validation (V&V) efforts broadly,
  but the reader cannot determine if existing verification and validation procedures are credible.
  Presenting credible performance information is necessary to adequately assess progress
  towards agency goals. Decision makers, inside and outside the Agency, need assurance that
  program and financial data is timely, complete, accurate, useful and consistent to make well-
  informed decisions.

  The Performance Report for each Agency should contain a comparison of actual performance
  with performance targets identified in the Performance Plan. In addition, GPRA requires the
  Performance Plan to include a description of the means used to verify and validate measured
  values. For the Performance Report to be useful, the reported target data must be accurate and
  supported adequately.

  Verification is the checking or testing of data to lessen the risk of erroneous data. Validation
  means the testing of data to ensure significant bias is not created by errors. Significant error,
  including bias, reduces the ability to assess the extent to which performance goals are
  achieved.

  Validation and Verification Reporting

  The 2004 Performance Plan and 2002 Performance Report contain adequate descriptions of
  several validation and verification efforts. However, no overall statement or details about the
  current reliability of the performance data are provided. Therefore, the Agency has not provided
  sufficient information to allow the reader to judge whether existing data is sufficiently free of
  bias and other error. This makes it difficult to judge if the Agency is meeting performance
  targets and goals. The 2002 Report noted that EEOC is making progress in implementing
  technology that is critical for data collection and verification. The Director, SPMCD, stated
  that in order to adequately verify and validate much reported data, the new systems need to be
  implemented. This indicates that much of the data currently reported can be neither verified nor
  validated.

  Implementation (recently accomplished and/or soon to be accomplished) of several tools is
  noted, including:
  •       issuing Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and
          Integrity of Information Disseminated by the U.S. Equal Employment Opportunity
          Commission
  •       the Integrated Financial Management System

  Plans for various improvements need to have dates associated with them, so that readers can
  assess progress in meeting V&V goals. One ongoing V&V effort not noted in the GPRA
  reporting is the quarterly reconciliation of private sector charges.
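
  A reconciliation of the kind noted above can be thought of as a simple comparison of counts
  drawn from two systems. The sketch below is a hypothetical illustration only; the system labels
  and figures are assumptions, and it is not a description of EEOC's actual reconciliation
  procedure.

        # Hypothetical sketch of a reconciliation-style verification check:
        # compare charge counts reported by two systems and flag any quarter
        # where the counts disagree. All data shown are illustrative.

        def reconcile(counts_system_a, counts_system_b):
            """Return quarters whose counts differ between the two systems."""
            discrepancies = {}
            for quarter, count_a in counts_system_a.items():
                count_b = counts_system_b.get(quarter)
                if count_b is None or count_a != count_b:
                    discrepancies[quarter] = (count_a, count_b)
            return discrepancies

        # Illustrative quarterly counts from two hypothetical tracking systems.
        print(reconcile({"Q1": 20100, "Q2": 19800},
                        {"Q1": 20100, "Q2": 19750}))

  Any quarter flagged by such a check would prompt the kind of correction that verification and
  validation efforts are intended to produce.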

  Support for Performance Results

  For each of the two performance measures tested, the supporting data and the manner in which the
  results were reported were generally accurate. Data reported for Performance Measure 1.1.3 (cited in
  the 2002 Performance Report) are supported by a spreadsheet that details each case that
  contributed to the results. Data for Performance Measure 3.3.5 are supported by data
  generated by the government agency to which EEOC contracts the work.

  For information on gathering of baseline data that can support data validation and verification
  efforts, see Section 2.3.2.

  2.4.5 Program Evaluation

  The 2004 Performance Plan shows a modest improvement from previous Plans. The 2004
  Plan demonstrates that EEOC conducts a limited program evaluation effort, but understands
  the importance of program evaluations and intends to increase its evaluation efforts.

  The 2004 Plan describes a variety of evaluation efforts, including the efforts of EEOC, GAO,
  and the OIG. The 2004 Plan also notes planned studies of Alternative Dispute Resolution
  initiatives and training provided through the revolving fund. EEOC states that these studies
  would include customer/participant satisfaction. This type of evaluation can be highly
  effective in determining if stakeholder concerns are being met.

  However, as in previous Plans, there is no schedule of evaluations or presentation of results.
  The Director, SPMCD, stated that EEOC has not yet committed to the type and timing of
  evaluations. In addition, the 2004 Plan does not state the results from program evaluations,
  making the program evaluation section only somewhat useful. OMB and GAO guidance call
  for stating evaluation results. To integrate evaluation into performance reporting, the results
  and actions resulting from evaluations should be included in the Performance Plan.


  Some of the program evaluation efforts cited in the 2004 Performance Plan are components of
  program evaluation activities, not program evaluations. In general, these efforts appear to be
  useful, though not a substitute for program evaluation as envisioned under GPRA. For
  example, reviews of outreach reports and participant evaluation forms are included in program
  evaluation reporting by EEOC. These types of reviews are not program evaluations, though
  they may contain useful data and analysis.

  Reporting on program evaluation is required. The Performance Report and Performance Plan
  may include:

  •       summary of findings and recommendations of program evaluations
  •       adjustments to program evaluation schedules
  •       information on how interested parties may obtain copies of program evaluations

  Program evaluations are important in determining whether an agency is achieving desired
  results and outcomes, the factors affecting performance, and opportunities for improvement. In
  addition, program evaluations can be used by the Agency to identify the degree to which its
  efforts are contributing to results and what actions the Agency may take to better
  accomplish its goals.

  Under GPRA, program evaluations are assessments, through objective means, of the manner
  and extent to which programs achieve their objectives. Therefore, all program evaluation
  efforts in the GPRA context should be designed to:
  •       assess, objectively, the target program or activity
  •       link directly to EEOC’s GPRA goals and objectives
  •       contain assessment of results/outcomes and effectiveness (rather than resources
          expended)
  •       be useful to EEOC stakeholders

  The 2004-2009 Strategic Plan shows promise for improving Agency performance reporting.
  The Strategic Plan includes both descriptions and a schedule of program evaluations. This is
  the type of information, along with results from evaluations, that should be included in future
  performance reporting.

  2.4.6 Conclusions

  Conclusion 2.4.6a: Existing links between the budget and performance goals and activities
  should be improved.

  Conclusion 2.4.6b: Management challenges and external factors should be clearly identified
  and addressed. The 2004-2009 Strategic Plan includes information that could be included in
  subsequent performance reporting documents.



  Conclusion 2.4.6c    Reported Validation and Verification activities are not adequate.
  Developing stronger data reliability efforts would significantly improve the Performance Plans
  and Performance Reports, and increase the ability of EEOC to improve performance.

  Conclusion 2.4.6d       EEOC recognizes that program evaluation activities and reporting need
  improvement. Conducting a limited number of formal program evaluations could provide the
  Agency management with critical information about outcomes and processes, while also
  satisfying the intent of GPRA. The current schedule of evaluations (as included in the 2004-
  2009 Strategic Plan) should be included in subsequent performance reporting documents. The
  progress and results of evaluations, and deviations from the evaluation schedule should be
  noted and explained.

  2.4.7 Recommendations

  OIG recommends the Director, Office of Research, Information and Planning ensure that:

  2.4A Subsequent Performance Reports show increasing links between the budget and
  performance goals and activities. OMB guidance (OMB circular No. A-11) provides various
  and flexible options for achieving closer links. EEOC could plan for particular improvements
  with each year’s submission of the Performance Plan.

  2.4B Agency GPRA products clearly identify and discuss all major management challenges
  and external factors. Identifying and obtaining pertinent information regarding the challenges
  will require annual input from, and discussion among, the Office of the Chair and Senior
  Agency managers in the field and headquarters.

  2.4C The Strategic Planning and Management Controls Division include in the EEOC
  Performance Report and Performance Plan general descriptions of current methods and
  procedures in place for ensuring that data are free of significant error, and any problems with
  those procedures. If no such procedures are in place for key data, this should be reported as well.
  This will ensure that the reader can fairly judge whether current performance data will be free
  of bias and other significant error. In doing so, the Performance Plan and Report should
  disclose fully all target data limitations. This may require revising the Performance Data Call
  Letter to include requirements for inclusion of data limitations.

  2.4D EEOC include the program evaluation schedule, deviations from the schedule, and
  program evaluation results in subsequent performance reporting products. We note that
  effective program evaluation requires significant resources. These resources may take the form
  of additional resources (such as for a new contract awarded to a consulting firm), and/or a
  redistribution of existing resources (such as ORIP or OIG using existing staff). The GAO
  report GAO-03-454, Program Evaluation–An Evaluation Culture and Collaborative
  Partnerships Help Build Agency Capacity, notes that various strategies may overcome the
  impediments of constraints on spending and restrictions on federal information collection.
  These include collaborative partnerships and expert and peer reviews.

  3.0 MATTERS FOR CONSIDERATION, CONTEXT OF
  RECOMMENDATIONS, AND EVALUATION FOLLOW-UP

  This section provides matters for consideration that are based on the conclusions taken as a
  whole. In addition, the context of the recommendations and evaluation follow-up information
  is provided.

  Matters for Your Consideration
  OIG recommends that the Director, Office of Research, Information and Planning establish a
  partnership with one or more similar agencies to improve performance planning and reporting.
  Partnerships are a systematic method for mutual learning and discussion of issues of common
  concern. It could be particularly beneficial to partner with an agency that performs similar
  activities (e.g. case processing and mediation). For example, experiences in developing goals
  and performance measures could be mutually beneficial. In addition, various methods for
  verifying and validating data could be discussed in an environment that allows for a full discussion.
  Forums such as seminars and large group meetings do not typically allow for these discussions
  and similar benefits. Finally, EEOC could benefit from shared resources. For example,
  program evaluation resources could be shared, providing greater efficiency.

  Context of Recommendations
  All recommendations in this report are intended to prompt key actions that will result in
  significantly improved performance planning and/or reporting. The recommendations are not
  intended as short-term prescriptions that will result in flawless performance plans and
  reports. Instead, the recommendations are meant to provide several key milestones that should
  be met in EEOC’s evolving effort to make GPRA an increasingly useful tool for EEOC and
  those who need to assess EEOC’s accomplishments.

  To successfully implement some recommendations in this report, changes to various EEOC
  performance reporting and planning processes may be needed. Therefore, ORIP will need to
  work over an extended period of time with other offices to make those process changes.
  Primary offices involved in these efforts may include Office of the Chair, Office of the Chief
  Financial Officer and Administrative Services, Office of Federal Operations, Office of Field
  Programs, Office of Human Resources, Office of Information Technology, and the Office of
  Legal Counsel.

  ORIP comments on the OIG’s draft 2002 report stated that ORIP does not implement or
  directly control GPRA planning and implementation; therefore, recommendations in the report
  cannot be effectively carried out or resolved by ORIP. However, ORIP is responsible for
  EEOC’s GPRA reporting efforts and therefore should be the lead office for recommendations.
  To ensure that ORIP has support in implementing recommendations that fall outside of ORIP’s
  authority, OIG believes that the Office of the Chair should provide all necessary support to
  ORIP.


  We understand that implementing some changes may take a considerable period of time and
  require redeployed and/or additional resources, as well as extensive coordination and
  cooperation with other EEOC offices. This condition does not eliminate the need to make the
  changes. Other changes, such as those regarding report accessibility, require a minimum of
  resources and coordination and should be implemented in the upcoming reporting cycle (2003 Annual
  Report and 2005 Annual Plan).

  We note that while the new strategic plan will result in new performance measures, most of the
  recommendations in this report will not be affected (e.g., report presentation and accessibility).


  Evaluation Follow-up
  The Office of Management and Budget issued Circular Number A-50, Audit Followup, to
  ensure that corrective action on audit findings and recommendations proceed as rapidly as
  possible. EEOC Order 192.002, Audit Followup Program, implements Circular Number A-50
  and requires that, for unresolved recommendations, a corrective action workplan should be
  submitted within 30 days of the final evaluation report date, describing specific tasks and
  completion dates necessary to implement audit recommendations. Circular Number A-50
  requires prompt resolution and corrective action on recommendations. Resolution should be
  made within six months of final report issuance.

  In response to the 2002 draft report, ORIP objected to use of audit follow-up for some of the
  draft recommendations, noting that some recommendations required cooperation with other
  EEOC offices and that audit resolution and actions to implement recommendations could be
  difficult. While agreeing to actions that will be taken, and taking those actions, can be
  difficult, improving EEOC performance reporting will greatly benefit EEOC’s stakeholders.





  APPENDIX A: Acronym List


  CFO             Chief Financial Officer
  DoJ             Department of Justice
  DoL             Department of Labor
  EEOC            Equal Employment Opportunity Commission
  FY              Fiscal Year
  GAO             U.S. General Accounting Office
  GPRA            Government Performance and Results Act
  IG              Inspector General
  OIG             Office of Inspector General
  OMB             Office of Management and Budget
  ORIP            Office of Research, Information and Planning
  SPMCD           Strategic Planning and Management Controls Division
  V&V             Verification and Validity





  Appendix B: Office of Research, Information and Planning Comments
  and OIG Response

  SUMMARY:

  OIG thanks ORIP for its timely response to, and detailed observations about, our draft
  report. We are pleased that you believe it provides useful material to improve the Agency’s
  GPRA products. We have made changes, as appropriate, to the report. Our responses to your
  comments can be summarized in the following manner:

  •   Our assessment focuses on EEOC performance reporting, not strategic planning; therefore,
      EEOC can improve its reporting by addressing the weaknesses found in the most recent
      annual performance plan and report
  •   If OIG fails to issue a final report, as you recommend, it would harm, not help, EEOC in its
      efforts to report performance information in a timely and efficient manner–other agencies
      have implemented plans to address their OIG's recommendations regarding performance
      reporting
  •   OIG disagrees that the scoring system used in the draft report is inadequate; as a result, no
      changes have been made
  •   OIG has taken into account EEOC's strategic plan (issued after the draft report) in the final
      report–while this does not affect our findings, several of the recommendations are no
      longer required


  For details on these and other issues, please see OIG responses (in italics) following the
  relevant ORIP comment below. Your memo indicates you may have further comments. We
  believe these additional comments may assist OIG in assessing future performance reports, and
  we encourage you to send your comments to OIG at your earliest convenience.




                  U.S. EQUAL EMPLOYMENT OPPORTUNITY COMMISSION
                                 Washington, D.C. 20507

Office of Research,
Information and Planning
     ORIP’S COMMENTS AND OIG’S RESPONSES

                                              October 15, 2003


     MEMORANDUM

     TO              :        Aletha L. Brown
                              Inspector General

     FROM            :        Deidre M. Flippen
                              Director, Office of Research, Information and Planning

     SUBJECT         :        ORIP Review and Recommendations of the OIG’s Draft Report
                              No. 03-14-AMR, Evaluation of EEOC’s Performance and Results
                              Reporting

              Thank you for the opportunity to review the OIG's draft Report No. 03-14-AMR,
     Evaluation of EEOC’s Performance and Results Reporting, submitted to me with your
     memorandum dated September 24, 2003. In the interests of time, I have only focused on
     several major areas of the report, even though I may have comments to offer in other areas.
     Also, to put my comments and recommendations in context, I offer preliminary observations
     about the EEOC’s strategic planning process and the significant enhancements we have made
     since the Chair’s arrival.

     Enhancing the Agency’s Strategic Planning Environment

             The EEOC has issued three Strategic Plans since passage of the Government
      Performance and Results Act of 1993 (GPRA). The most recent one was issued on October 1,
     2003, and covers fiscal years 2004-2009. Each of the agency’s Strategic Plans governed the
     formulation of our Annual Performance Plans and our subsequent reports. In recent years, we
     have made progress in developing a Performance Budget to integrate our yearly performance
     plan with our budget submission and meet the requirements of the President’s Management
     Agenda. Finally, for the first time this year, the agency is required to submit a Performance
     and Accountability Report, which will integrate our GPRA Annual Program Performance
     Report, the agency’s annual financial statements, and the annual assurance about the adequacy
      of our management controls. This history illustrates the accelerating pace for fully
     upgrading and integrating GPRA’s strategic concepts with the agency’s management of its
     financial and other resources.



        Our Strategic Plan for Fiscal Years 2004 - 2009 sets the stage for dramatic improvements
we must make over the next few years. For instance, the Plan was already used to prepare the
fiscal year 2005 Performance Budget we submitted to the Office of Management and Budget in
September.

        In addition to being restructured around the agency's Five-Point Plan, it contains two
major improvements from our earlier plans. For the first time, the Strategic Plan includes
performance measures, including measures of public/customer confidence by using surveys to
collect critical data. These measures track 6-year targets to achieve specific results over the life
of the Plan. Before the current Plan, the agency’s Annual Performance Plan was the only GPRA
document that contained measures, and they only forecast one-year targets. The majority of the
Strategic Plan’s measures are new and more “outcome” focused, in light of the requirements in
recent years throughout government. Previously, most of our measures were activity-based or
“output” measures. Also for the first time, we have included in the Plan a schedule of program
evaluations we will conduct to assess program results.

OIG response: Our assessment focuses on EEOC reporting of performance, not on strategic
planning. We are pleased that the Strategic Plan now includes elements previously absent or
inadequately addressed. It will be useful for these elements, and the changes recommended by OIG
in this report, to be included in EEOC's future performance reporting documents.

Comments in Response to OIG’s Report

        I address several common themes in the OIG Report with my observations in mind:

•       The value of OIG findings and recommendations is unfortunately diminished when its
        reviews of GPRA documents and implementation efforts are delayed. For example, the
        current OIG report only briefly acknowledges the new Strategic Plan. Its adoption and
        implementation dramatically affects the usefulness of the report's findings and
        recommendations, however. The Plan’s new framework, the inclusion of outcome
        measures, and the incorporation of a program evaluation schedule needed to be
        considered, because they would have affected the report’s evaluation and assessment of
        our GPRA process and presentation.

        On page 2, the report states: “We note that while a new strategic plan is in development
        and will likely result in new performance measures, many of the following
        recommendations will not be affected (e.g., report presentation and accessibility).”
        [underline added] We do not agree with this statement. In our opinion, 8 of the 13
        recommendations2 are no longer relevant and should be significantly reevaluated based
        on the new Strategic Plan.


        2
                The 8 recommendations are: 2.2A, 2.2C, 2.3A, 2.3B, 2.3C, 2.3D, 2.4B*, and 2.4D
                (*recommendations 2.4A and 2.4B in the summary on page 3 do not agree with
                the descriptions on page 26. We used the description of 2.4B on page 26.)
        The new Strategic Plan was developed over the past year by two cross-organizational
        teams, with OIG participating on both teams. Although finalized this past September, the
        Plan’s basic structure and most of the performance measures were in place in sufficient
        time for the OIG’s review to have considered them as part of this assessment. Because of
        the dramatic changes made in the Plan, we can not use the OIG’s report effectively to
        improve the GPRA program.

The product has not been delayed and therefore its usefulness is not diminished. In fact, by
taking into account the final strategic plan, our assessment is fully up-to-date. Our
recommendations are intended to be useful not only for the current performance reporting cycle,
but beyond. Several of the recommendations could well take longer than the current reporting
cycle to implement.

Our assessment focuses on the existing performance reporting (not on planned reporting), because
because we believe that in order to see where one is going, one needs to know where one has
been. Without a baseline assessment of actual annual performance reporting, assessments
would necessarily focus on process and long-term planning. Both of these are useful, but
fundamentally different types of assessments. The current Strategic Plan is not relevant to our
assessment of previously issued annual plans and annual reports. It is, however, relevant in
helping construct useful recommendations for improving future performance reporting
documents.

For these reasons, the issuance of a new Strategic Plan does not diminish the need to improve
EEOC’s performance reporting. The Strategic Plan contains some new performance measures
scheduled for implementation in 2004. However, our review focused on the performance
reporting conducted by EEOC for 2002 (the Performance Report) and 2004 (the Performance Plan).
This reporting is linked to the 2000-2005 Strategic Plan. Because our review focuses on
performance reporting, the structure of the new Strategic Plan (in draft or final form) was not
assessed.

Regarding OIG input to the 2004-2009 Strategic Plan: the OIG review of the draft strategic
plan and its contribution to the Strategic Planning Working Group were efforts that stand apart
from this report.

•       I am also concerned about the “scoring” technique used in the report. The technique is
        too subjective since it does not use a balanced scale, there is no detailed discussion to
        support a particular score, and it is not evident if, or how, individual criteria were
        weighted to determine an overall score. See OIG response following the detailed
        comments on scoring for a discussion of the balanced scale we used in this assessment.
        The OIG Report references three sources that were used to develop standards applied to
        the scoring. The General Accounting Office's Guide3 incorporates the concepts contained
        in Part 2 of OMB Circular A-11, suggesting it may be the most useful document to
        reference for the purpose of this review.

        3
                The Results Act: An Evaluator's Guide to Assessing Agency Annual Performance
                Plans, GAO/GGD-10.1.20 (April 1998)

OIG Response: We developed a set of standards that incorporates elements for assessing
performance documents from Mercatus, GAO, and OMB. However, we did not adopt GAO’s
conceptual framework for evaluating GPRA documents, but adopted a more balanced
framework and an approach that reflected this balance. While we found GAO concepts helpful,
our framework and methodology do not mirror GAO's (or Mercatus's). Therefore, we should
not be expected to follow suggestions based primarily on GAO documents. A discussion of GAO and other
guidance follows.

As we developed our own assessment and scoring framework, we found that GAO’s guidance is
useful, but not the only useful guidance. The Mercatus Scorecard is also widely recognized as
useful in assessing performance reporting. For example, current and former members of
Congress (e.g., Hon. Todd Platts, Chairman, House Government Reform Subcommittee on
Government Efficiency and Financial Management), agency heads (e.g., Sec. of Labor Elaine
Chao), and various experts view Mercatus assessments as useful. GAO referred to a Mercatus
assessment when reviewing the General Services Administration’s performance. GAO also
solicited comments from Mercatus when developing its own 2003 performance and
accountability report. Therefore, we did not change scoring to more closely reflect GAO’s
conceptual guidance.

        The Guide describes a conceptual framework to help evaluate GPRA documents, and it
        provides an example of a scoring technique. GAO’s approach illuminates my concerns
        with the report’s application of the scoring technique. For example, GAO cautions that
        the Guide was developed for a “general” assessment. To amplify this point, the Guide
        notes that evaluators should assess individual criteria within three broad categories,
        framed as questions, but combine these assessments into an “overall judgement about
        what level of progress the plan as a whole represents....” (Page 11) GAO further
        elaborates that “[s]coring the plans by tallying up the instances of compliance–for
        example, by counting the number of performance measures that were clearly linked to
        the strategic plan–is not recommended. Rather, in this example, the concern should be
        that, if there are no performance goals specified for a given strategic goal, or if there are a
        large number of performance goals with no visible link to the agency’s strategic plan,
        then there should be an explanation of those exceptions.” (Page 12)

        GAO only applies its “scoring” system to a combined assessment. In contrast, the
        technique in OIG’s report scores individual criteria. The GAO does not recommend
        “scoring” individual “criteria” but to focus, instead, on an overall assessment.

OIG response: In addition to our comments above regarding our balanced approach compared
to GAO’s, we believe that efficient improvement of overall performance reporting cannot take
place without improving individual elements. Therefore, identifying elements to improve is
critical. It follows that without grading of these elements, it would be difficult to justify
recommendations for improving those elements. The GAO Guide, as you point out, is intended
as a general assessment of the performance plans, while our objective was to convey both
overall and specific assessments of both the annual plan and the annual report. Therefore, we
use several scoring standards, both GAO and non-GAO. In appropriate cases, in order to
document and support a rating, we quantified instances where criteria were met or not met.
Quantifying instances of compliance with reporting criteria is an accepted practice; see, for
example:
•       OIG of the Securities and Exchange Commission, "GPRA Performance Reports,"
        March 2002
•       OIG of the Federal Communications Commission, "Report on Audit of the Federal
        Communications Commission Implementation of the Government Performance and
        Results Act (GPRA)," March 1999

Also, the scoring used in the OIG report does not represent a balanced scale. Two of the three
gradations (Poor and Fair) are usually associated with a “below average” assessment and have a
negative connotation. The remaining one (Good) is usually associated with an “average”
assessment and is not considered the top of a linear, three-point scale. For example, the three
points of a three-point scale could be Inadequate, Adequate and Excellent to cover the full range
of possible assessments. Although we do not think a scoring approach should be used at all,
particularly at the individual criteria level, we would suggest that a five-point scale would be
more appropriate (Inadequate, Weak, Adequate, More than Adequate, Excellent). GAO’s
example in the Guide (“generally meets,” “partially meets” and “falls well short of meeting”)
also could be considered as a more appropriate scale for these types of assessments. However,
as GAO recommends, any scale should only be used for an overall assessment and not individual
criteria.

OIG Response: We disagree; the scale we use is balanced, with one positive rating, one neutral
rating, and one negative rating. The word "fair" is defined by Webster's New Collegiate
Dictionary as "sufficient but not ample: ADEQUATE." [emphasis is Webster's] We also provide
a more detailed definition of "fair" throughout the report: often met standard/criteria.
Therefore, it should be clear to readers that "fair" is not a negative rating.

In addition, "average" is defined as a level typical of a group, class, or series (i.e., a
comparison to others). We did not compare EEOC's performance reporting to that of other
agencies; we feel this would be extremely time consuming and minimally useful. Instead, OIG
compared EEOC’s reporting against well-established criteria.

Finally, "excellent" should be used on scales with more than three ratings. "Excellent" and
“failing/unacceptable” are not included because we chose a three point scale so that readers
could easily and quickly grasp where EEOC rates. A three point scale simply places all grades
above fair into one category and similarly places all grades below “fair” into another. This
means there are no “excellent” or “failing” scores, but general categories indicating
performance less than or better than adequate. Therefore, even if we had included
"excellent" and "failing," it would not have changed the results of the assessment.



Some organizations use three point scales, some use four or five point scales, some use multiple
scales. For example, the SBA Office of Inspector General uses a three point scale to assess the
usefulness of performance measures, as does the President's Management Agenda to assess
results. Neither scale is inherently superior: a three point scale is simpler to grade and its
results are easier to comprehend, while a five point scale provides more depth. Future
evaluations may include scales with more than three levels for added depth.

        In addition to the scale used, I am concerned about other aspects of the scoring technique.
        It is not clear how a specific “score” was determined. We recognize that judgements are
        involved in selecting a score; however, there is no descriptive section or attachment in
        the report to elaborate on the factors used to determine a particular score. GAO provides
        lengthy descriptions so the reader can better understand how the overall assessment was
        determined.

OIG Response: The assessment uses criteria found in GAO and Mercatus publications. Each
specific score (e.g., for accessibility) was determined by comparing what was reported to the
criteria. We often cite examples to illustrate why EEOC performance reporting does or does not
meet the criteria. The assessment is intended, in a concise manner, to present findings,
conclusions, and recommendations. Future assessments could include detailed appendixes such
as those you describe.

        Also, it is not clear whether individual criteria were weighted, or appropriately weighted,
        to determine an overall assessment score. As GAO notes in its Guide on page 11, “[i]n
        assessing progress towards the ideal [performance plan], evaluators should also recognize
        that some deficiencies are more important than others, and that an unsuccessful agency
        effort to address a particularly challenging task shows greater progress towards
        accomplishing the Act’s [GPRA’s] goals than does not having attempted the task at all.”
        It is not evident from the report's assessment whether any weighting was used. Consider
        the following questions: Was a deficiency less important in the overall scheme? Was the
        observed deficiency a result of the agency's unsuccessful attempt to implement the
        criteria, or did the agency not try to implement them at all?

OIG Response: Our overall assessment of EEOC’s performance reporting is provided in both
the draft and final product (on page 1 of the executive summary and on page 9 of the final
report). To support such an overall assessment, we scored individual criteria, used the
individual criteria scores to produce an overall rating for each category, and then assessed
category performance to determine an overall assessment. Please note that the overall assessment is our
judgement and not a "grade." Rather, the overall assessment notes positive and negative
aspects of EEOC’s performance reporting, highlighting areas for improvement.

        In our opinion, the report suggests that each criterion has equal weight. This is a critical
        problem with the scoring technique. For example, for the six criteria assessed on page 8
        for "Organization and Other Practical Aspects" (2.1), the 2004 Plan received 4 "Good"
        scores, a "Fair" score, and a "Poor" score in Accessability. The "Overall" assessment,
        however, is "Fair," apparently based on the judgement that the Plan was not
        electronically accessible on the agency’s web site. In fact, the Plan was accessible in
        other ways, which was not acknowledged. There is no discussion of how GAO's two
        considerations for weighting a particular area factored into the overall assessment. Was
        the deficiency less or more important than the other areas assessed?
        Did the “agency” initiate attempts to electronically post the Plan but, for a variety of
        reasons, was unsuccessful? Or, did it not try to post the Plan at all? The answers to these
        questions are not addressed in the report. It appears that this “deficiency” may have
        disproportionately influenced the overall assessment. This is just one example of our
        concern with the construction and application of the scoring technique used in the report.

OIG Response: The overall assessment for each major category (e.g., Organization and Other
Practical Aspects) is based on the judgement that each criterion is vital to that category;
therefore, a "Poor" rating in one criterion would make it difficult (given there are only six
criteria in the category) for the overall rating to be above "Fair." Recall that "Good" means
criteria were almost always met. We stress that while a category's overall rating is noteworthy,
improvement can occur only if the individual criteria show improvement. It is not our intention
to rate criteria that are not extremely important. Therefore, we did not assign a given weight to
each criterion, but instead made a judgement about a category's overall assessment.
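
One possible formalization of the rollup described in this response is sketched below. The OIG
response describes a judgement rather than a formula, so the aggregation rule in the sketch (a
single "Poor" generally keeps a category from rating above "Fair") is an illustrative assumption,
not the OIG's actual method.

        # Illustrative sketch of rolling individual criterion ratings up into a
        # category rating. The aggregation rule is an assumption made for this
        # example; the report describes the rollup as a judgement, not a formula.

        def category_rating(criterion_ratings):
            """Combine Good/Fair/Poor criterion ratings into one category rating."""
            goods = criterion_ratings.count("Good")
            if "Poor" in criterion_ratings:
                # A single Poor generally caps the category rating at Fair.
                return "Fair" if goods > len(criterion_ratings) / 2 else "Poor"
            return "Good" if goods > len(criterion_ratings) / 2 else "Fair"

        # ORIP's example: four Good ratings, one Fair, and one Poor for
        # "Organization and Other Practical Aspects" yields an overall "Fair".
        print(category_rating(["Good", "Good", "Good", "Good", "Fair", "Poor"]))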

To restate: this report assesses performance reporting results, rather than efforts at
performance reporting. From this outcome/results-based viewpoint, whether EEOC attempted to
post a report or took other reporting actions is not vital.

We defined accessibility as readily available Performance Reports and Performance Plans. This
is very similar to the Mercatus definition. Because timely electronic posting provides readily
available documents and most citizens have access to the web (in their homes or public
libraries), other systems for distribution to the public are of marginal importance. Because
EEOC did not perform well in either of these areas, the results for this criterion were very low and
thus received a rating of "Poor." Given the low cost of posting documents, prompt electronic
posting to the public website is a valid measure for this criterion. Finally, to discuss each rating
for each criterion and category at the level of detail you describe would be extremely expensive,
detract from the results-oriented report we hope to provide, and result in a voluminous, less
timely, and therefore, less useful report.

•       Finally, we have previously raised our concern about using OMB Circular A-50, Audit
        Followup, and the EEOC Order 192.002, Audit Follow-up Program, in situations similar
        to these types of OIG efforts that, we believe, do not fall within the purview of audit
        followup. I feel that working together in a collaborative effort to refine the GPRA
        process and to develop our GPRA products would be more beneficial than having to
        evaluate and respond to individual recommendations through the cumbersome processes
        established for audit followup. Although we may not fully endorse all of your
        recommendations, we feel it would be more informative and useful to the agency to
        discuss findings and recommendations with a more flexible approach.



        OIG Response: The audit follow-up process is well established and is appropriate for all
        recommendations contained in this report. This issue is discussed in the
        Recommendations section.

Recommendations

The following recommendations provide a good opportunity to use the OIG’s Report No. 03-14-
AMR, Evaluation of EEOC’s Performance Reporting, to support a continuous process of
improving the agency’s strategic planning endeavors.

•       Do not issue the report at this time.

OIG Response: Failing to issue a final report would not support improvement of agency
performance reporting. As you point out, “[the draft report] provides useful material to improve
the agency’s GPRA reports.” We stress that the report does not focus on “strategic planning
endeavors," but on performance reporting. OIGs publish performance report assessments at
various times in the calendar year; we would like to explore with you an optimal report issuance
schedule.

    The report does not address the significant impact the new Strategic Plan has already had on
    our GPRA and budget documents. At least 8 of the 13 recommendations are no longer
    relevant (see footnote 1). It would be valuable for the OIG to review the new Strategic Plan
    so that the agency can benefit from its observations and recommendations as we move
    forward with its implementation and its effect on our other GPRA and budget documents in
    the future.

OIG Response: Our review focuses on the performance reporting contained in the 2004 Annual
Plan and the 2002 Annual Report. Your comments do not specify which GPRA documents have
already been impacted by the Final Strategic Plan. Regardless, because the scope of our review
covers only documents issued prior to the Strategic Plan, our analysis of these documents is
not changed by the Final Strategic Plan or documents altered subsequent to its issuance.

However, we take into account the final Strategic Plan as it relates to several conclusions and
recommendations. Two recommendations were removed, and one was modified as a result of
your comments and assessment of the Final Strategic Plan. For example, the program
evaluation conclusions and recommendation were modified because the Strategic Plan
contained an evaluation schedule. In addition, we plan to review the entire final Strategic Plan
as part of an upcoming review.


•   If a report is issued, a different scoring technique and scale need to be used.

    If the report is issued, it is important to alter the scoring technique. The report should not
    score assessments of individual criteria. Even some of the categories should not be scored
    separately. Instead, they could be combined into overall assessments. The rating scale needs
    to be changed to reflect a more appropriate range. The report should provide explanations
    for distinctions made in the assessments and scores. In particular, scores should be
    appropriately weighted to reflect the relative importance of different criteria and allow for
    agency efforts to address particular criteria, as noted by the GAO.

OIG Response: The scoring system has not been altered. Your concerns regarding scoring are
addressed in OIG comments earlier in this appendix. No scoring system is perfect; however, we
believe ours is straightforward and points clearly to specific areas for improvement. Many
different scoring systems have been used to assess GPRA documents in the federal government.
We used a system that is simple to administer and that makes it easy for readers to comprehend
the results. We
will review our scoring system. Future assessments may be more sophisticated, as knowledge of
assessing performance reporting increases and EEOC performance reporting improves.

•   Remove audit follow-up procedures for this type of evaluation and these recommendations.

    This type of report can provide useful guidance to improve the agency's GPRA reports. The
    report acknowledges that many of the changes will take a long time to have an effect. This
    type of evaluative work is not an audit and is not conducive to corrective action plans. For a
    variety of reasons, there may be concerns with implementing some of the recommendations.
    Under audit followup procedures, valuable time and resources are often required to resolve
    outstanding issues, particularly where judgements are involved. It is important to
    encourage collaborative efforts to improve GPRA plans and approaches, which evolve and
    can only be assessed over time.

OIG Response: While collaborative efforts are vital to improving agency GPRA reporting, we
believe that implementation of recommendations will be most efficient and effective, and can be
tracked most accurately, under the established follow-up procedures. When assessing GPRA
reporting, other OIGs issue recommendations and work with the agencies using the same type of
process used for traditional audits (e.g., an EPA OIG report assessing performance measures
noted that the agency developed a corrective action to implement the report’s
recommendations). We will be happy to discuss, as appropriate, any concerns you may have.

    Thank you again for this opportunity to comment on the draft of OIG's report. It provides
    useful material to improve the agency's GPRA products. We look forward to working with
    you in a flexible framework to refine these products, ensuring that their presentation
    continuously improves.

c: Leonora L. Guarraia
   Chief Operating Officer



