
Planning and Evaluation Services' Implementation of the Government Performance and Results Act 1999 Performance Reports and 2001 Plans (A&I 2000-016)

Published by the Department of Education, Office of Inspector General on 2001-01-01.


INSPECTION MEMORANDUM


TO:           Dr. Alan L. Ginsburg
              Planning and Evaluation Service Director
              Office of the Under Secretary

FROM:         Mary Mitchelson
              Assistant Inspector General for Analysis and Inspection Services
              Office of Inspector General

SUBJECT:      Review of Planning and Evaluation Services’ Implementation of the
              Government Performance and Results Act 1999 Performance Reports
              and 2001 Plans (A&I 2000-16)


Planning and Evaluation Services (PES) requested that the Office of Inspector General (OIG) conduct a
review of PES’s implementation of the Government Performance and Results Act (GPRA) 1999
Performance Reports and 2001 Plans. You requested that we obtain responses from Department of
Education (ED) employees who had worked on creating the current version of GPRA reports and
plans. You also requested that we gather information on how program offices are using information
from these documents in their programs.

Overall, a majority of interviewees stated that PES did a commendable job implementing GPRA,
particularly since this was the first performance report the Department submitted to Congress. The
interviewees acknowledged that some problems and challenges associated with the GPRA process
existed, but overall their experience was positive.

In general, interviewees commented favorably on PES’s assignment of specific representatives to
assist individual programs. Program staff suggested that this personal interaction made the process
easier to understand. Interviewees also stated that PES representatives who worked with the programs
for an extended period of time gained an understanding of the programs which helped communication
between PES and the programs.

A few recurring suggestions surfaced during our interviews. Interviewees stated that there were too
many changes made by PES throughout the GPRA reporting process. Most interviewees suggested
that changes made to the process should be kept to a minimum, particularly changes made toward the
closing stages of the report preparation. Another common suggestion from interviewees was to adjust
the timeframe of the GPRA reporting process. Many interviewees stated that the timeframe was
“rushed,” particularly toward the end of the process. Some suggested PES attempt to coordinate the
GPRA timeframe with that of the budget process since both activities require similar information from
programs.
Approach
We developed questions in five broad categories: Background Information, Interaction with PES,
Creating the GPRA Reports and Plans (Vol. I/II), Using the Results, and Interviewee General
Thoughts.

Over the course of two months, we interviewed approximately 30 managers and staff from a variety of
program offices within ED headquarters. We selected a majority of the interviewees from a list
provided by PES. To achieve a broader representation of programs within the Department, we
selected additional interviewees based on the budget size of their programs. (A list of our questions
and interviewee comments may be found in the attached tables.)

Results
The following information summarizes our findings, which are broken into the five broad categories
mentioned above.
                                     Background Information

Almost half of the interviewees (12/28) have worked for ED for 10 or more years with most spending
more than a year in their current positions. The majority of the interviewees did not work on one
specific aspect of the GPRA report, but participated in many aspects of the process. Additionally,
several interviewees indicated they had read the final draft but focused only on the section dealing
with their programs.
                                       Interaction with PES
When questioned about the quality of interaction with PES, the majority of the interviewees responded
positively. Interviewees praised PES representatives who worked individually with the various
programs. Further, interviewees commended PES representatives for their willingness to learn more
about particular programs. Although there were a few complaints about the interaction with PES, the
consensus among most interviewees was that PES staff members were approachable and that a
collaborative environment existed.

Approximately half of the interviewees had not taken the data-quality training provided by PES. The
majority of those individuals who took the training commented that the training was either too basic
or not applicable to them. One individual suggested PES create different levels of training (basic,
intermediate, advanced) to ensure that opportunities would be available to those with varying levels of
experience. A few interviewees said the training was useful for those who were unfamiliar with the
GPRA process.

While PES was generally praised for the quality of interaction with each program, a few
improvements were offered. Interviewees commented that PES should allow individual programs
more input when developing performance indicators. Another common suggestion was that PES
should give programs more time to complete the process in order to create a product of higher quality.
An additional suggestion was that PES refrain from changing and updating indicators and goals in the
middle of the process, which resulted in confusion and frustration. Finally, the consensus among
interviewees was that the GPRA process is evolving and that PES and programs need to continue to
work collaboratively to improve the process.

                    Creating the GPRA Reports and Plans (Volumes I/II)
Our questions for this part of the interview sought information concerning interviewees’ thoughts on
creating and writing the current version of the GPRA reports and plans. While responses varied by
interviewee, a few themes emerged from the responses.

Many positive comments emerged concerning the creation of these documents. A few interviewees
stated that the GPRA process was valuable because it empowered their programs to hold grantees
more accountable for results. Another interviewee stated that the easiest part of the process was
writing the document since a majority of it was written by PES.

Of the 28 people interviewed, half indicated that the individual program indicators were appropriate
because they selected them. The interviewees who thought the indicators were inappropriate for their
programs generally stated that the indicators were chosen by PES. One interviewee suggested that
PES should try harder to involve programs in the process of creating indicators.

Another concern of interviewees was the constant change in format during the creation of the current
version of the GPRA reports and plans. This was a difficult obstacle for interviewees because drafts
that were near completion needed to be rewritten due to the changes.

A final concern for interviewees was the timeline of the GPRA process. Most interviewees felt rushed
to meet the deadline. Some suggested that PES begin the GPRA process earlier in the year to ensure
that programs would not be as pressured at the conclusion of the process. Another interviewee
suggested that PES coordinate its timeline with Budget Services’ timeline. This action would be
helpful since both offices require the same information from programs. Moreover, it would be less
stressful for interviewees if timelines were better coordinated.

A majority of the interviewees stated that the GPRA process required more than half of their time and
effort during the weeks prior to the deadline. During the remainder of the year, however, GPRA
responsibilities accounted for a small portion of their daily activities. Half of the interviewees worked
with other principal offices when writing the current version of the GPRA reports and plans. Most of
these interviewees stated that it was easy to communicate and share information with the other offices.

                                          Using the Results
This portion of our interview measured how effectively the results of GPRA were used in each
program office. The majority of interviewees stated that the intent of GPRA was beneficial to the
Department’s mission. Furthermore, many of our interviewees agreed that GPRA enhanced their
particular program’s objectives. Some program staff, however, indicated they had already established
goal-enhancing objectives.

A number of interviewees acknowledged that it will be difficult to foresee the positive outcomes of
GPRA while baseline data are still being developed.


Many interviewees indicated that GPRA has refocused their views of the objectives established for
their programs. Many stated that GPRA places more of an obligation on grantees because it ultimately
directs them to link their programs with the GPRA indicators. Grantees have therefore become more
accountable since the implementation of GPRA. Additionally, some program offices have created a
guidance section to demonstrate to grantees how their programs relate to GPRA. Frequently,
interviewees stated their programs had benefited from the GPRA process.

Several interviewees said creating the GPRA reports and plans has been useful for their individual
programs. They indicated that the process gives each program office an opportunity to make a case
for why its program should be funded and why it represents a good use of taxpayer money. The
majority of interviewees stated that GPRA has led to stronger and higher quality performance from
their programs. Finally, all interviewees reported that GPRA is an evolving process which should
continue to improve.
                                  Interviewee General Thoughts
Approximately one-third of interviewees indicated they read reviews (e.g., congressional or General
Accounting Office reports) of ED’s plan. Those who elaborated indicated that the reviews failed to
provide any new information concerning the GPRA plan. Additionally, they found the reviews
unhelpful or confusing.

Almost all interviewees rated the process of creating the current version of the GPRA report and plan
as equal to or better than the year before. Only one interviewee indicated that the process was inferior
compared to the previous year. Many interviewees realized that PES had a monumental task in
compiling data; they understood the process as fluid and evolutionary in nature. Most interviewees
maintain a positive outlook for an improved process in the future.

Recommendations for improving the GPRA process within the Department were varied. A number of
individuals suggested the Deputy Secretary and the Assistant Secretaries need to do more to
emphasize the importance of GPRA. One individual suggested that the Deputy Secretary conduct
meetings not only with managers but also with employees closely involved in the process. Another
interviewee commented that while GPRA is on “the radar screen” (i.e., visible) more resources are
needed to compile accurate data.

Conclusion
Comments and suggestions regarding the GPRA process were wide-ranging; hence, no single
dominant, recurring theme emerged. Most interviewees, however, reported that constant changes to
indicators and timelines made the process more difficult to complete.
Overall, interviewees stated that even though preparing the GPRA reports and plans is a difficult and
frustrating process, PES did a commendable job assembling this year’s products. We have no formal
recommendations for PES at this time.



Attachment


