Expenditures, Problems, and Prospects of Management and Utilization of Program Evaluation

Published by the Government Accountability Office on 1977-10-06.


                         DOCUMENT RESUME
03710 - [B2834090]

Expenditures, Problems, and Prospects of Management and
Utilization of Program Evaluation. October 6, 1977. 9 pp. + 8
appendices (51 pp.).
Testimony before the Senate Committee on Human Resources; by
Harry S. Havens, Director, Program Analysis Division.
Issue Area: Program Evaluation Systems: Information for Congress
    and the Public (2603).
Contact: Program Analysis Div.
Budget Function: General Government: Other General Government
    (806).
Congressional Relevance: Senate Committee on Human Resources.
Authority: Program Evaluation Act of 1977; S. 2 (95th Cong.).
    Congressional Budget Act of 1974. Elementary and Secondary
    Education Act.
          In the last ten years, the most important stimulus for
development of program evaluation capability has been the
Congress itself. This has been accomplished by increasingly
requiring evaluation of a program in the authorizing
legislation, by considering proposals for the provision of
general evaluation capabilities for meeting Congressional needs,
and by strengthening the evaluation role of the General
Accounting Office. Evaluation processes and specific evaluation
products still have many inherent problems, including:
timeliness, relevance, inadequate data, analytical quality,
incomprehensibility of reports, and inadequate followup. There
seems to be no universal working definition of the term
evaluation. Expectations are too high about the knowledge that
can be gained from any given evaluation. Agencies need to make a
greater effort to anticipate the needs of Congress, but Congress
should help by making clear its committees' oversight needs. It
might be wise to have several different evaluations for any
given program. It must be recognized that different users have
different needs for evaluation. To assure credibility,
evaluation results should be available for reanalysis by other
evaluators. None of the problems of evaluation techniques are
insurmountable. (Author/SS)
                        United States General Accounting Office
                               Washington, D.C. 20548

                                                             FOR RELEASE ON DELIVERY
                                                             Expected at 10:00 a.m.
                                                             Thursday, October 6, 1977

                                      STATEMENT OF
                                   HARRY S. HAVENS
                        DIRECTOR, PROGRAM ANALYSIS DIVISION

                                      BEFORE THE
                        SENATE COMMITTEE ON HUMAN RESOURCES


Mr. Chairman and Members of the Committee:
        We appreciate the opportunity to appear at these hearings on the cost,
management, and utilization of program evaluation in human resources
programs. GAO's concern for these matters is grounded both in our longstanding
responsibilities for reviewing Government programs and the additional
responsibilities assigned to us by Titles VII and VIII of the
Congressional Budget Act of 1974.

        I would like to focus briefly on several key problems which need to
be overcome if evaluation is to achieve its greatest utility. In
addition to these remarks, I would like to submit for the record several
appendices to which I will be referring.
    In recent years, the most important stimulus for development of
evaluation capability has been the Congress itself, which has taken
action of two general types.

    In the last 10 years the Congress has increasingly required evaluation
of a program in the legislation authorizing that program. This language
ranges from general requirements in some laws to rather detailed
specifications for evaluation in other laws.

    Another major congressional initiative has been more general in
scope. The Congress has considered a number of proposals during the
past 10 years for the provision of general evaluation capabilities for
meeting the needs of the Congress. Title VII of the Congressional
Budget Act of 1974 is perhaps the most important example of this sort
of initiative. It specifically strengthens the evaluation role of the
General Accounting Office by requiring us:
    --to review and evaluate Government programs carried on under
      existing law,

    --to develop and recommend methods for reviewing and evaluating
      Government programs and activities carried on under existing law, and

    --to assist Committees on request in developing statements of
      legislative objectives and goals and methods for assessing and
      reporting actual program performance.

GAO has continued to expend a substantial part of its resources on its
review and evaluation of particular programs. We also have initiated
a number of activities to develop and test methods and approaches which
will improve the usefulness to the Congress of the evaluation information
it receives.
      More recently, congressional interest has arisen in developing a
process for assuring that programs will be systematically evaluated and
an opportunity afforded for the results to be translated into
congressional action. The Sunset approach, currently under consideration
as the Program Evaluation Act of 1977 (S. 2), is a leading example of
this sort of initiative.
      Congressional initiatives appear to reflect a growing recognition of
the importance of evaluation as a source of information for decision-
making, coupled with an awareness that the art of evaluation still has
a long way to go. I believe there is also a growing recognition that
the Congress and its Committees will need to play a more direct role in
improving the use of evaluation research products.


      A host of problems can be identified in evaluation processes and in
specific evaluation products. There are regular discussions in the
community about problems of timeliness, relevance, inadequate data,
analytical quality, incomprehensibility of reports, inadequate follow-up,
and so on. These are crucial problems and warrant our deepest concern
because they go to the heart of the usefulness of evaluation. Other
factors are more peripheral, but can still impede progress when they
come to the surface. For example, there is no uniform understanding
of the term "evaluation." I am not convinced that a rigid definition is
desirable, but the absence of a common understanding can cause management
problems because of uncertain responsibilities and can greatly complicate
any effort to assess the overall costs and effectiveness of evaluation.
    In identifying the problems which we will discuss today, we are
generalizing from several sources of information. We find these problems
in broad surveys of departments' and agencies' evaluation activities,
and in compiling inventories of evaluation reports such as those listed
in our three-volume series of Congressional Sourcebooks. (Appendix 1)

We also have obtained information from surveys by OMB and some preliminary
information from a survey of knowledge production activities, including
evaluation, by the National Academy of Sciences. Another source of
information on problems is our own program reviews, in which we obtain
and attempt to use information from evaluation reports and data on
particular programs or activities. (Appendix 2) Still another source
of this information is our in-depth assessment of evaluations performed
in particular agencies or under particular laws requiring or authorizing
evaluation.
    Today I would like to concentrate on the matters which are most
likely to affect the use of evaluation by the Congress.

    First, expectations are still too high about the knowledge that can
be gained from any given evaluation. The Congress needs information
about the cost of programs and their impact on beneficiaries and others.
A realistic expectation is that a well planned and managed evaluation
will help supply that information. (Appendices 3 and 4) But evaluations
are not black boxes into which evaluation funds can be poured and
optimum decisions come out. Only the political process can judge
the value of program results and compare their priority to the results
of other programs competing for a limited budget. There is no black box
for that.

    A second key problem has to do with timing of evaluations, making
information available when it is needed by decisionmakers. This is
closely related to the problem of over-expectations. I suspect that
we may never truly "solve" this problem. Once a question has been
asked, the questioner is understandably impatient for the answer. But
there may be an unavoidable time lag of several years between the
decision to conduct an evaluation and the issuance of a report, par-
ticularly if the program is a complex one.

    While this time lag may be unavoidable, its effects can be mitigated
by better long-range planning of evaluation activities in the agencies.
None of us has perfect foresight, but it should be possible to
anticipate needs and aim evaluation products at key congressional decisions
such as reauthorizations. (Appendix 5) This problem can also be alleviated
by improvements in the flow of information and its accumulation over
time in ways that coincide with the decision process. For example, we
are currently working with your Committee to assess your requirements for
systematic information on programs authorized by the Elementary and
Secondary Education Act. (Appendix 6)


    Another problem is relevance. An evaluation which answers the
wrong questions isn't likely to get much attention. Here again, agencies
need to make a greater effort to anticipate the needs of Congress.
But Committees also need to help by specifying the congressional
oversight issues and questions they want considered when the agency
designs the evaluation. In response to a request from Senator Leahy for
assistance in studying the feasibility of his proposed resolution
(S. Res. 307), introduced in the 94th Congress, we are in the process of
developing a suggested approach that could be used by the Congress when
it desires to establish a structured oversight process. (Appendix 7)

    After attempting to apply the proposed resolution to selected pieces
of legislation, GAO found that many different evaluations--entailing
varying costs, times to complete, and levels of measurement precision--
seemed both possible and plausible for any particular program. In
order to narrow the list of possible evaluations to those that the Congress
will find most useful and worth the cost, the Congress itself needs
to communicate its oversight and evaluative information needs and
priorities to those responsible for conducting the evaluation.

Needs of Varied Users

    In discussing the problem of assuring relevance, it must be recognized
that different users have different needs for evaluation. The individual
program manager, for example, may be primarily looking for information
which will help him improve the efficiency of the operation for which he
is responsible. Other users--the agency head, OMB, the President, and the
Congress--may find information of this sort interesting, but may really
need evaluation results which will help them compare the impact of this
program against competing programs.

    The task of identifying the intended user of an evaluation and assur-
ing that the planned evaluation will meet that user's needs is further
complicated by the existence of various levels within an agency at which
evaluation studies are performed. It is quite common, for example, to
have both a central evaluation staff which reports to the Secretary and
evaluation staffs in the major operating groups of an agency. In
addition, of course, evaluation activities may be found in other parts
of an agency, such as in the internal audit staff, the budget shop, and
the legislative or policy development office. An arrangement like this
can help assure that the needs of various users within the agency are
met. But that makes it even more important to have good communication
among the various evaluators and an effective evaluation planning process.
It is essential that the planning process integrate the needs of users
both within and outside the agency and allocate available resources as
effectively as possible to meet the full range of needs. This is par-
ticularly important in view of the complexities introduced by varied
sources of funds for conducting evaluations and research.

Assuring Validity

       There is also an important question of how to assure the credibility
of research findings, once they are determined to be relevant to a
policy question. Results of research and their interpretation may be
affected in complex ways by the research design, research procedures,
and methods of analysis. GAO's role as an independent auditor leads it
to be particularly alert to these problems, and the social research
community as a whole is giving increased attention to them. (Appendix 8)

One approach being used by GAO and others is the reanalysis of one
researcher's data by another. Careful reanalysis of evaluations and
other research can identify the often subtle impact of particular
methods on the research data or on its analysis and interpretation.

    This sort of reanalysis, however, requires open access to the
research and statistical data. On occasion this may include access to
individually identified information for purposes of verification. While
this sort of access raises several difficult matters, we believe the
climate is good for acceptance by the social research community of the
need for audit and reanalysis, particularly of policy-oriented research.
I believe the question is not whether it will be done, but how best to
do it.
     Indeed, I believe the climate is good for starting to deal effectively
with most of the problems I have outlined in this statement. We have
frequent contact with various parts of the community, both inside
Government and outside, and have made use of individuals and groups as
expert consultants in developing our evaluation capability. We are in
the process of establishing a special group of experts who can help us
in the task of developing solutions, both to the general problems I've
been discussing and to the problems of how best to carry out a specific
evaluation of a particular program. Our experience has been that most
of the people who work in this area recognize the importance of solving
these problems and want to help do so.
     There is always a risk that enumerating problems, as I have done,
will leave the impression that there is very little good to say about
program evaluation. I want to dispel that notion. I have talked about
problems because that's where we can make the greatest progress. At the
same time, I want to emphasize our belief that a great deal of useful
work is going on now. The Congress and the Executive Branch have access
to better information than ever before about the costs and impacts of
Federal programs, and much of that information is being used.

     To us, dissatisfaction with the state of program evaluation is a
basis for optimism. It reflects the fact that the questions being
asked by decisionmakers are, properly, getting tougher to answer and
that the answers are likely to be used. If the questions were easy,
there would be no evaluation problems; if the answers went unused, it
wouldn't matter whether we solved the evaluation problems or not.
Because we believe the answers are needed and will be used, we are
confident that the evaluation problems can and will be overcome.

     That concludes my prepared remarks, Mr. Chairman. My colleagues and
I would be happy to try to answer any questions.

      United States General Accounting Office
              Washington, D.C. 20548

                   APPENDICES TO
                   STATEMENT OF
                  HARRY S. HAVENS
                    BEFORE THE
       SENATE COMMITTEE ON HUMAN RESOURCES

                  October 6, 1977

APPENDIX

   1       Congressional Sourcebook Series

   2       Examples: GAO Assessments Of Agency
           Evaluations Of Particular Programs
           Or Activities

   3       Evaluation And Analysis To Support

   4       The Use Of Large Scale Models

   5       The Problem Of Timing Of Evaluation

   6       GAO Assistance To The Senate Committee
           On Human Resources In Specifying And
           Developing Requirements For Fiscal,
           Budgetary And Program-Related Information

   7       Finding Out How Programs Are Working:
           Suggestions Being Developed For Congressional

   8       Assistance By Social Science Research Council
           In Methods For Audit And Reanalysis By The
           General Accounting Office
                                                         APPENDIX 1


Federal Program Evaluation

     One of the products developed in response to the Congressional Budget
Reform Act of 1974, and as a direct result of the 1975 joint OMB/GAO survey of
evaluation in 18 Federal agencies, was the directory titled Federal
Program Evaluations. This directory is intended to be an important link
between the Federal agencies who develop evaluation information and the
Congressional staffs who may require evaluation information for legislative
and oversight purposes.

     The first edition listed over 1700 evaluation reports that had been
produced in fiscal years 1973, 1974, and 1975. The key features of the
directory are its several indexes, which allow the user to search for inform-
ation by subject, legal authorization (name, public law number, and U.S.
Code), agency, and program name. In addition, the basic citations identify
the agencies or subunits responsible for both the program and the evaluation.

     Currently we are updating Federal Program Evaluations. During the
summer of 1977 we contacted more than 60 Federal departments, agencies,
commissions, and other organizations. Approximately 45 provided nearly
1500 evaluations covering fiscal years 1976 and 1977. The largest set
of citations, over 400, represents, as might be expected, the GAO program
evaluation effort. HEW provided approximately 250 entries, and the Agency
for International Development and the Department of Housing and Urban
Development each contributed more than 100. Other major sources included
the Departments of Agriculture, Interior, Justice, Labor, and Transportation
and the Veterans Administration. Most of the non-contributing agencies,
generally among the smallest, simply do not perform a formal report-writing
evaluation process.

     We plan to publish this second edition with an improved format and
additional features early in 1978. Successive editions, possibly on a
biennial basis, are envisioned. The information contained in the directory
will also be available through SCORPIO, the Library of Congress' computer-
based information retrieval system. As additional program evaluation
information becomes available, the computer data base can be updated and
Congress will have access to the very latest information without the
need for republishing at frequent intervals.

Federal Information Sources and Systems

     This volume describes approximately 1,400 Federal sources and systems,
maintained by 91 executive agencies, which contain fiscal, budgeting, and
program-related information.

Requirements for Recurring Reports to the Congress

     This volume describes the various requirements for recurring
reports to the Congress from the executive, legislative, and judicial
branches and other agencies of the Federal Government.

                                                                   APPENDIX 2

                          EXAMPLES: GAO ASSESSMENTS
                      OF AGENCY EVALUATIONS OF PARTICULAR
                            PROGRAMS OR ACTIVITIES

Multi-Agency (including multi-HEW agency)

     Returning the Mentally Disabled to the Community: Government
Needs to Do More, HRD-76-152, HRD-76-152A, January 7, 1977.

     Although deinstitutionalization of the mentally disabled has
been a national goal since 1963, Federal agencies that can influence
this goal have not yet developed a comprehensive and clearly defined
national plan to achieve the goal, or a management system to insure that
the goal is properly implemented.

               "In the absence of a national strategy or
          management system to implement deinstitutionalization,
          Federal officials responsible for other programs that
          affect deinstitutionalization generally (1) were not
          aware of the national goal or had not received instructions
          on implementation, (2) had not implemented their programs
          to help achieve the goal, (3) had not undertaken joint
          efforts directed at deinstitutionalization, or (4) had not
          monitored or evaluated their programs' impact on
          deinstitutionalization."


     Social Research and Development of Limited Use to National
Policymakers, HRD-77-34, April 4, 1977.

     The Office of Management and Budget has issued no directive
establishing standardized or preferable criteria for monitoring social
R & D performers; and the criteria established by HEW for monitoring
prospective performers was so broad that it was often of limited use to
agency officials. Also, a uniform methodology for monitoring social
R & D projects has not yet been established by HEW.

               "During our review of the National Institute of
          Education, we noted that an opportunity exists for
          more consistent and effective project monitoring of
          R & D projects. We found

               --a lack of detailed, formal guidance for
                 assessing projects and

               --inadequate staffing procedures which
                 resulted in (1) some monitors being
                 overloaded with projects and (2) monitors
                 being assigned to oversee projects in
                 areas where they have little expertise.

               "At the Social and Rehabilitation Service,
          established guidelines for the monitoring of social
          R & D projects did not exist. We found

               --project officers being responsible for
                 monitoring as few as 1 and as many as 18
                 projects simultaneously,

               --progress reports submitted as often as
                 monthly or as infrequently as semiannually,

               --project officers not visiting or making
                 different numbers of visits to projects."

     Regarding monitoring procedures, also see: Grant and Contract
Activities of the National Center for Health Services Research,
MWD-76-89, April 6, 1976.

     We found that: "The Center

               --had not clearly defined the role its
                 project officers were to fulfill in carrying
                 out monitoring activities and

               --had not established any procedures or guidelines
                 for carrying out monitoring responsibilities."

     Inequalities in the Preventive Health Services Offered to
Federal Employees, MWD-76-62, June 14, 1976. A major message in
this report was the lack of data evaluating the value of providing
preventive health services to Federal employees.

     How States Plan for and Use Federal Formula Grant Funds to
Provide Health Services, MWD-75-85, December 9, 1975. Few program
evaluations or analyses identify the need for program improvements and
methods or approaches to health problems which show success.

     Fundamental Improvements Needed for Timely Promulgation of
Health Program Regulations, HRD-77-23, February 4, 1977. This report
evaluated and analyzed the process for the issuance of regulations for seven
HEW agencies. This analysis was initially requested by the House Sub-
committee on Health and the Environment, Committee on Interstate and
Foreign Commerce. However, after the evaluation was started, the Secretary
of HEW requested the study in order to assist in the Department's own
evaluation of regulation processing.

     The Well-Being of Older People in Cleveland, Ohio, HRD-77-70,
April 19, 1977.

               "To answer this question, the Congress
          needs information on the impact of Federal programs on
          the people they are trying to help. Such information
          is spread piecemeal throughout many Federal, State,
          local, and private agencies. As a consequence, no
          Federal agency has evaluated the combined effect of the
          many programs on older people. Currently, even the
          amount of Federal funds supporting programs for older
          people cannot be determined. An overview of the impact
          of Federal programs on older people--multiprogram
          evaluation--is needed.

               "Multiprogram evaluations performed by a single
          agency looking across agency lines at many different
          departments are necessary. To assist the Congress and
          demonstrate that meaningful multiprogram evaluations can
          be made, we attempted to determine the impact of Federal
          programs on older people. We looked at 23 Federal
          programs administered by various agencies, including the
          Departments of Agriculture; Health, Education, and Welfare
          (HEW); Housing and Urban Development; Labor; and
          Transportation."

     Most Agencies' Programs to Assist Employees With Alcohol-Related
Problems Still Ineffective, HRD-77-75, September 7, 1977. Federal
agencies and the CSC should direct their efforts toward evaluations of
civilian alcoholism program activities or they will not be in a
position to know how effective programs really are.

Office of Human Development

     New Child Support Legislation--Its Potential Impact and How to
Improve It, MWD-76-63, April 5, 1976.

               "The lack of action by HEW to administer
          and monitor the program was one major weakness noted.
          This was characterized by no single organization having
          total program responsibility, program efforts lacking
          coordination, and basic program information not
          being available."

               "GAO is recommending that legislative changes
          be made and that the annual program report to the
          Congress contain information to help determine how
          much the new legislation has improved program

     More Can Be Learned And Done About the Well-Being of Children,
MWD-76-23, April 4, 1976.

               "The report addresses the need for Federal
          evaluation of programs concerning the well-being of
          children, for research directed toward problems
          identified through such evaluation, and for greater
          dissemination of research knowledge.

               "GAO devised an unprecedented method for
          measuring the progress of children accepted
          for protective services by welfare agencies. This
          method focuses on the well-being of children rather
          than on the number and types of services provided or
          available."
Office of Education/National Institute of Education

(also see appendix 5)

     Follow Through: Lessons Learned From Its Evaluation And Need
To Improve Its Administration, MWD-75-34, October 7, 1975.

     The Office of Education contracted for a national evaluation to assess
effects of approaches undertaken in the Follow Through program--an experimental
program designed to find more effective approaches to teaching young
children from low-income families.

               "We recommend that the Secretary of HEW
          direct the Office of Education to insure that
          future experimental programs are not designed
          apart from evaluation to maximize the degree to
          which experimental results will be statistically
     Bilingual Education: An Unmet Need, MWD-76-25, May 19, 1976.

               "Because adequate plans were not made to
          carry out, evaluate, and monitor the Bilingual
          Education Program, the Office of Education has
          progressed little toward

               --identifying effective ways to provide
                 bilingual education instruction,

               --adequate training of bilingual education
                 teachers, and

               --developing suitable teaching materials.

               "No comprehensive information is available on
          the program's effect on students' academic progress,
          but the Office of Education has contracted for
          a national evaluation on this. Local project
          evaluation reports have been inadequate and of
          little use to local and Federal decisionmakers."

     The National Assessment of Educational Progress: Its Results
Need To Be Made More Useful, HRD-76-113, July 20, 1976.

     The National Assessment of Educational Progress is a project
which annually surveys the knowledge, skills, and attitudes of
young Americans. We reported that HEW needs to redirect the project by
(1) identifying informational and other needs of decisionmakers,
(2) determining the feasibility and cost effectiveness of alternative
approaches to satisfy the needs, and (3) deciding on the assessment
approach to be used.

Center for Disease Control

     The Urban Rat Control Program Is In Trouble, MWD-75-90,
September 29, 1975. Our report specifically pointed out the need for the
agency and Congress to determine more measurable objectives for program
progress. Our review included a verification of the agency's evaluation
process, in which we concluded that the conclusions reached by the
agency evaluations were subject to variables that had not, but should
have been, considered.

Health Services Administration

     Factors That Impede Progress in fmplementing the Health Maintenance

Organization Act of 1973, HRD-76-128,    September 3, 1976.     This review

involved a nationwide questionnaire instrumental in evaluating the

impact and attitudes of potential beneficiaries of the        MHO Act of 1973.

This information and that resulting from concurrent HEW and GAO studies

of HEW management resulted in specific legislative changes to the EMO Act.

In addition, GAO requested legislation which was passed to revise legally

mandated evaluation requirements that the orignial act placed .on GAO.

     We found also in this review that "HEW has developed data-
reporting requirements which, alone, will not provide sufficient
information for the evaluations required by section 1315 of the Act.
HEW will rely on special studies to fully meet its evaluation
requirements."

     Progress, but Problems in Developing Emergency Medical Services
Systems, HRD-76-150, July 13, 1976.  This report commented on the
slow progress of the Federal program that encourages national
emergency medical services systems and specifically criticized HEW
by noting the Department's need for improved guidelines for
evaluating grantee progress and assessing readiness to proceed with
system development.

     Letter report to Senator James Abourezk on Investigation of
Allegations Re:  Indian Health Service, HRD-77-3, November 4, 1976.
This report was the second of two reports that dealt specifically
with the inability of the agency to respond to its management
information and evaluation systems.  We also criticized the input
process of the evaluation system with regard to the number of
American Indian women who had undergone sterilization procedures in
the Indian hospitals.
     Outpatient Health Care in Inner Cities:  Its Users, Services,
and Problems, MWD-75-81, June 6, 1975.  One source of data for this
review was health studies and demographic data from the various
public and private agencies.

                "HEW has developed an indicator to identify
          medically underserved areas.  This indicator is
          composed of:  percentage of population with income
          below the poverty level, percentage of population
          65 and over, infant mortality rate, and physicians
          per 1,000 population.

                "Using this indicator, we determined that the
          eight social planning areas in Cleveland were
          significantly medically underserved.  Using the
          same indicator, the Erie County Health Department
          determined that an area in Buffalo having a large
          concentration of the poor also was significantly
          medically underserved.  This area included most
          of the model cities area.  In both cities these
          medically underserved areas contain the greatest
          concentrations of low income people."
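     The report does not give HEW's actual weighting formula for this
indicator.  As a minimal sketch of how the four measures named above
might be combined into a single composite score, the fragment below
ranks areas on each measure and averages the ranks; every area name,
figure, and the ranking scheme itself are hypothetical, not HEW's
method.

```python
# Illustrative sketch only -- not HEW's formula.  Each area is ranked
# on each of the four measures; a higher average rank means the area
# looks more medically underserved on this composite.

def underserved_scores(areas):
    """Return a composite score per area (higher = more underserved)."""
    measures = [
        ("pct_poverty", True),       # higher poverty share -> worse
        ("pct_65_and_over", True),   # higher elderly share -> worse
        ("infant_mortality", True),  # higher mortality rate -> worse
        ("mds_per_1000", False),     # fewer physicians -> worse
    ]
    scores = {name: 0.0 for name in areas}
    for key, higher_is_worse in measures:
        # Sort so that rank 1 is the LEAST underserved on this measure.
        ordered = sorted(areas, key=lambda a: areas[a][key],
                         reverse=not higher_is_worse)
        for rank, name in enumerate(ordered, start=1):
            scores[name] += rank / len(measures)
    return scores

# Hypothetical planning areas with invented figures.
areas = {
    "Area A": {"pct_poverty": 32.0, "pct_65_and_over": 14.0,
               "infant_mortality": 28.5, "mds_per_1000": 0.4},
    "Area B": {"pct_poverty": 9.0, "pct_65_and_over": 10.0,
               "infant_mortality": 14.1, "mds_per_1000": 1.8},
}
scores = underserved_scores(areas)
```

     Averaging ranks rather than raw values sidesteps the problem of
the four measures being on different scales, at the cost of discarding
how far apart the areas are on each measure.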

     Improving Federally Assisted Family Planning Programs, MWD-75-25,
April 15, 1975.  This report evaluated and questioned the usefulness
of the National Reporting System for the Federal Family Planning
Program and also recommended that HEW establish criteria for
monitoring and evaluating costs and performance of family planning
programs.
Community Services Administration

     Improvements Needed in Community Services Administration's Grantee
Self-Evaluation System, HRD-76-151, July 20, 1976.

                "Our review centered on the agency's system requiring
          grantee self-evaluation.  We assessed how grantees
          in CSA's Chicago, San Francisco, and Philadelphia regions
          had implemented the system.  Our review included
          discussions with Federal, State, and local program
          officials and an examination of self-assessment,
          planning, and other related reports used in evaluating
          antipoverty programs.

                "In July 1975 CSA issued standards to evaluate
          the effectiveness of CSA-administered programs and
          projects.  In June 1976 CSA was completing development
          of guidelines for using these standards in making
          CSA funding determinations.  Following are CSA's
          standards, which generally restate the 1969 OEO standards
          of effectiveness for local community action and
          other programs.

                --Strengthen community capacity to plan
                    and coordinate poverty-related programs.

                --Improve organization of services related
                    to needs of the poor.

                --Maximize participation of the poor in the

                --Broaden community resources invested in
                    antipoverty activities.

                --Increase innovative approaches attacking
                    the causes of poverty.

                --Maximize employment and training opportunities
                    for groups served.

                "CSA headquarters needs to provide better oversight
          and guidance to its regional offices on implementing the
          self-evaluation process.  Specifically, we found that:

                --Regional and headquarters offices had
                    not established or appropriately staffed
                    formal organizational structures for over-
                    sight of CAA evaluation activities.

                --Regional offices were not obtaining
                    and using relevant CAA self-evaluation
                    and planning reports.

                --Inconsistent regional guidance contributed
                    to disparity in the existence and quality
                    of CAA self-evaluation systems."

     Data Gathered on 189 Federal Programs Benefiting the Poor,
MWD-75-87, June 2, 1975.  Our review gathered data on 189 Federal
programs benefiting the poor.  Data included 150 studies made during
fiscal years 1969-73 by internal audit staffs, agency evaluation
groups, and contractors.  About half of the 150 could be classified
as program evaluation or effectiveness studies, usually performed by
contractors.
     "We identified 71 reports, issued by our
Office from July 1968 through June 1974, dealing with
reviews of the effectiveness of Federal programs
benefiting the poor.

     "Several of our reports in recent years have
pointed out the need for more coordination among
Federal programs.  Where several agencies are providing
assistance to individuals or communities, often no
single agency is assigned responsibility for
coordinating all programs having similar objectives.

     "Our analysis of those reports dealing
with the evaluation of programs having similar
objectives suggests that in several areas
persons can be served by more than one program,
not necessarily duplicative but certainly similar
in nature.  Thus, a person might be eligible for
similar benefits from at least two programs, one
based on the type of assistance offered and one
based on the category of persons served."
Department of Labor

     Department of Labor's Past and Future Role in Offender
Rehabilitation, MWD-75-91, August 1975.

                "Labor has tried a wide range of research
          and demonstration projects to find ways to alleviate
          the difficult problem of criminal offender
          rehabilitation.  Some programs appeared to have promise.
          Pretrial intervention is a preventive program which
          seeks to save individuals from having criminal records
          while putting them on a constructive path to productive
          lives in society.  Inmate training seemed to offer some
          help to offenders in developing employable skills.  The
          model ex-offender program, as a job placement effort,
          assisted offenders in finding jobs.

                "Because the objective of any research and
          development is to determine the best method for solving
          a problem, evaluation of these efforts is important
          and is needed to decide the best course of action.
          Labor's past efforts to evaluate criminal rehabilitation
          programs have been hampered by poor recordkeeping and
          difficulties in locating ex-offenders after release from
          prison.  Because followup data on ex-offenders who have
          completed rehabilitation programs is a key element in
          the present evaluation process, it may be necessary to
          revise evaluation concepts if there is no significant
          improvement in obtaining this data."
     Labor stated that a set of goals and objectives is under review
and programs are being evaluated.  Labor said a study would be made
to find the best way to make post-release followup on offenders.

Social Security Administration

     Improvements Needed in Medicaid Program Management Including
Investigations of Suspected Fraud and Abuse, MWD-75-74, April 14, 1975.

                "Utilization review is the system used to determine
          the appropriateness of medical care provided and to
          identify and prevent overutilization of medical services.
          Utilization review has two basic purposes:  (1) to help
          insure that individuals receive high quality medical
          care and (2) to control program costs by preventing
          unnecessary use.

                "The Social Security Act requires States to
          have operational utilization review systems for all
          services provided by Medicaid and lists specific
          requirements for utilization reviews of institutional
          services.

                "The compliance problems relating to utilization
          review reported by the regions and in numerous HEW
          audit reports indicate a lack of SRS action to insure
          that States have effective utilization review systems.
          HEW's delay in issuing regulations and its failure to
          impose penalties have delayed the effective implementation
          of utilization review systems in the States.  SRS should
          move rapidly to assist the States in improving their
          systems to protect against unnecessary and inappropriate
          utilization and thereby reduce Medicaid costs and improve
          the quality of care provided under Medicaid.  Improved
          utilization review systems should also help detect and
          control fraud and abuse."

    Improvements Needed in Rehabilitating Social Security Disability

Insurance Beneficiaries, MWD-76-66, May 13, 1976.

                "Under the 1966 program management agreement,
          the Rehabilitation Services Administration (RSA) agreed
          to furnish data to SSA for evaluating the Beneficiary
          Rehabilitation Program's effectiveness.  In turn, SSA
          intended to provide an evaluation to the Board of
          Trustees for its annual report to the Congress.  How-
          ever, RSA has not furnished the necessary data and
          SSA has not developed it independently.  As a result,
          program planning and evaluation have not occurred as
          originally intended, and the Board of Trustees has not
          had the information necessary to report program
          effectiveness to the Congress.

                "Inadequate staffing for the Beneficiary Rehabi-
          litation Program and the lack of an adequate management
          information system resulted in inadequate HEW assessments
          of program progress and potential and insufficient guid-
          ance to State vocational rehabilitation agencies in
          understanding the program's goal and in interpreting
          eligibility criteria.

                "This may explain why, nationally, the number
          of beneficiaries reported by HEW as rehabilitated
          has increased each year since the beginning of the
          Beneficiary Rehabilitation Program, while the number
          of beneficiaries being removed from the benefit rolls
          has leveled off at about 2,500, having peaked at
          3,078 in 1970.

                "HEW and the Board of Trustees have not been able
          to provide accurate information on the program's operation
          and potential to the Congress.

                "Program administration would be improved by the
          periodic monitoring of progress and performance
          assessments which are provided for in the Secretary's
          Operational Planning System.  This would also assist
          the Board of Trustees in presenting to the Congress
          an evaluation of the program's operation."

     Legislation Needed To Improve Program for Reducing Erroneous
Welfare Payments, HRD-76-164, August 1, 1977.

                "Since the quality control program was initiated
          in 1973, HEW has, continually through 1976, overstated
          the program's accomplishments.  Savings estimates
          resulting from error reduction were not based on
          valid statistical projections and included actions
          which did not necessarily produce direct savings
          in welfare payments.  HEW did not consider the
          administrative costs that would be associated with
          implementing corrective actions.  In addition, States
          generally did not conduct cost effectiveness
          studies before starting corrective actions,
          although required to do so by HEW."
                                                               APPENDIX 3



     The General Accounting Office issued a document, Evaluation and
Analysis to Support Decisionmaking, PAD-76-9, September 1, 1976, which
it described as "* * * a first step in collecting and disseminating
general concepts on these activities and how they are related to
other activities in the continuum of decisionmaking about Government
programs."
Excerpts from Introduction

                "Thus ultimate choices about programs--decisions
          about whether to do or not to do something--will be
          policy choices.  However, political leaders, public
          administrators, and the public need as much information
          as possible on the choices that must be made.  This
          need has stimulated the development of various analytical
          techniques which have been grouped under labels such
          as program evaluation and policy analysis.  The art of
          evaluation and analysis is not yet sufficiently developed
          to permit preparation of a manual covering 'how to do
          it' in every situation."
                "Thus, we offer this document as a first step in
          collecting and disseminating lessons learned in GAO
          and elsewhere about analysis and evaluation.  Generally
          speaking, we offer this guidance for the use of anyone
          who is 'evaluating' programs and 'analyzing' policy
          choices in the sense of engaging in a careful appraisal
          of what happened, why it happened, what choices are
          available for future actions, and what the implications
          are of those choices.

                "The concepts and guidance which we offer must be
          adapted to specific program situations.  Program
          objectives are seldom as clearly stated or agreed
          upon as would be desirable for evaluative purposes; no
          program operates in isolation from other social or
          economic events; and data and measurement techniques
          are almost always less adequate than desired.  It is
          in the adaptation of the ideal and the theory to the
          specific situation that the persons doing the work
          show their worth."
Excerpts from Chapter 3:  The Evaluation and Analysis Continuum

                "For purposes of this document, drawing sharp
          distinctions between evaluation and analysis is
          less useful than focusing on the two basic questions
          which decisionmakers, and their staff, face:
          (1) What actually has happened as a result of past
          or current policies and programs and what have we
          learned? and (2) What should be done in the future
          and what are our options?  Answering these questions
          can, in turn, be roughly translated into broad classes
          of activities:  appraising the results of policies
          and programs and assessing alternative policies and
          programs."
Excerpts from Chapter 4:  Appraising Results of Policies
 and Programs and Assessing Alternative Approaches

                "The activities of appraising results and
          assessing alternatives of programs and policies share
          certain fundamental concepts in which the mode of
          inquiry is essentially the same.  These fundamental
          concepts include:

             --ascertaining decisionmakers' needs,

             --defining the nature and scope of the problem,

             --determining valid objectives, and

             --specifying comprehensive measures.
                       *           *              *

                "The process of appraising results should begin
          concurrently with policy or program implementation and
          continue as needed.  Continuous appraisal, through a
          well-structured management information system, should
          be maintained, but even when it exists there will be
          a need for special reviews from time to time.

                "After the fundamental concepts discussed above
          are understood they must be further developed through
          application of other more specific concepts and
          methods such as:

             --making valid comparisons,

             --developing needed information,

             --interpreting program results, and

             --checking the completeness of the appraisal.
                "As in the case of appraising policy and
          program results, the methods used in assessing
          policy and program alternatives build on the funda-
          mentals discussed at the beginning of this chapter.
          In this case also, there are additional concepts
          and methods which are needed, such as:

             --developing a range of alternatives,

             --screening the preliminary alternatives,

             --estimating the measurable consequences,

             --assessing provisional orderings,

             --determining the impact of constraints,

             --reassessing the ordering of alternatives, and

             --checking the completeness of the assessment."
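     The assessment steps quoted above amount to a simple pipeline:
develop a range of alternatives, screen them, estimate consequences,
order them provisionally, and apply constraints.  The sketch below is
purely illustrative; the alternatives, the cost and benefit figures,
and the budget constraint are all invented, and real assessments
weigh far more than one net-benefit number.

```python
# Illustrative pipeline for the assessment steps listed above.
# All names and figures are hypothetical.

alternatives = [
    {"name": "Expand program",   "est_cost": 9.0, "est_benefit": 12.0},
    {"name": "Status quo",       "est_cost": 5.0, "est_benefit": 6.0},
    {"name": "Pilot in 3 sites", "est_cost": 2.0, "est_benefit": 3.5},
    {"name": "Terminate",        "est_cost": 0.5, "est_benefit": 0.0},
]

def assess(alternatives, budget):
    # Screen the preliminary alternatives: drop any whose estimated
    # benefit does not exceed its estimated cost.
    screened = [a for a in alternatives if a["est_benefit"] > a["est_cost"]]
    # Provisional ordering: rank by estimated net benefit.
    ordered = sorted(screened,
                     key=lambda a: a["est_benefit"] - a["est_cost"],
                     reverse=True)
    # Determine the impact of constraints: keep only options that
    # fit within the available budget.
    return [a for a in ordered if a["est_cost"] <= budget]

ranked = assess(alternatives, budget=4.0)
```

     Note how the constraint step can eliminate the provisionally
top-ranked option, which is why the guidance calls for reassessing
the ordering after constraints are applied.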

Contents of Chapter 5:  Practical Aspects of Managing and
 Performing Studies

            Formulating an agenda of studies

                Identifying emerging problems

                Deciding which problems to study

            Beginning a study

                Preparing a detailed study plan

                Selecting the study team

                Establishing lines of communication

                Selecting appropriate methods

            Conducting a study

                Collecting relevant data

                Testing the reliability of data

                Protecting the confidentiality of
                  information about individuals

                Documenting and referencing

                Adhering to time schedules

                Leading and coordinating the study

                Using computer-based models (see also Appendix 4)

            Communicating study results

                Specifying the nature of reports

                Communicating with clarity and conciseness

                Following up
                                                                APPENDIX 4

                         THE USE OF LARGE SCALE MODELS

     To deal with complex issues in such areas as social welfare, food,
energy, the environment, transportation, and urban planning, Government
policy analysts and decisionmakers, in increasing numbers, have been
using conceptual models, often implemented on a computer, to perform
program and policy analyses.  In concept, a model is a simplified
representation of the underlying structure of an issue.  Such a model
can be used by analysts to assist decisionmakers in assessing the
interaction of several elements of an issue and the combined response
of these elements to specific alternative policy options.

     Models allow analysts and decisionmakers to deal with aspects of
these issues which are not readily susceptible to analysis with other
tools.  However, a model is a simplified representation of an issue
based on simplifying assumptions, approximations, and judgments, all
of which affect the validity, reliability, and accuracy of the model's
results.  Obviously there is a need to guard against the temptation to
view a model as a magic "black box" which automatically gives truthful
and complete answers.  The fact that aspects of an issue are examined
by computer in minute detail and at electronic speed can give a false
air of reality to the results.  A prospective policy analyst or
decisionmaker may use a model's results without being fully aware of
the assumptions, approximations, and judgments that went into the
model, and how they affect these results.  Thus, GAO feels it is
essential that these models be carefully evaluated to establish
confidence in their results.
     With regard to programs and policies of concern to the Committee
on Human Resources, we have conducted an assessment of the Transfer
Income Model (TRIM).  Several versions of this model have been used to
provide estimates to policymakers, in both the executive branch and
the Congress, of the distribution effects, program costs, and other
impacts of proposed changes in major social programs.  TRIM has been
used to analyze programs such as Aid to Families with Dependent
Children, Food Stamps, Supplemental Security Income, and Federal
individual income tax programs; variations of a housing allowance
program; and negative income tax proposals such as the Income
Supplement Program and the Allowance for Basic Living Expenses
Program.  TRIM is being used to support the work of President Carter's
Welfare Reform Task Force.

     GAO's evaluation of TRIM sought answers to a number of questions,
such as:

     --What are the major assumptions made in the model?

     --What effect do these assumptions have on the model's results?

     --Is the model documentation sufficient to understand, use, and
       maintain the model?
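     To make the microsimulation idea behind a model like TRIM
concrete, the toy sketch below applies a candidate transfer rule to
individual household records and weights the results up to population
totals.  The households, the survey weights, and the negative-income-
tax parameters (guarantee, phase-out rate) are all invented for
illustration; TRIM itself operates on large survey files and far more
detailed program rules.

```python
# Miniature illustration of the microsimulation approach: apply a
# hypothetical transfer rule record by record, then aggregate using
# survey weights.  All figures below are invented.

def nit_benefit(income, guarantee=3600.0, phaseout=0.5):
    """Negative-income-tax benefit under a hypothetical rule: the
    guarantee is reduced 50 cents per dollar of income, never below
    zero.  (Not the parameters of any actual proposal.)"""
    return max(0.0, guarantee - phaseout * income)

# Hypothetical sample records:
# (annual income, survey weight = number of households represented)
sample = [(0.0, 1000), (2400.0, 1500), (6000.0, 2000), (15000.0, 3000)]

# Weighted aggregates of the kind a policymaker would ask for:
total_cost = sum(nit_benefit(income) * weight
                 for income, weight in sample)
recipients = sum(weight for income, weight in sample
                 if nit_benefit(income) > 0)
```

     Rerunning the same records under a different guarantee or
phase-out rate is what lets analysts compare the cost and
distributional effects of alternative proposals; the caveats in the
text apply, since every result inherits the assumptions built into
the records and the rule.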

                                                                  APPENDIX 5


     A recent GAO report, Problems and Needed Improvements in
Evaluating Office of Education Programs (HRD-76-165, September 8,
1977), focused on the usefulness and limitations of federally
supported evaluations of programs under the Elementary and Secondary
Education Act.  This report demonstrates the need to consider
differing views regarding effectiveness measures preferred by
policymakers at different levels of Government.  The report discusses
actual difficulties reported by evaluation officials in implementing
study results in the policymaking climate prevailing when the results
were published.  Difficulties occurred even in cases where a direct
effort had been made to promote utilization through use of a "policy
implication memorandum" procedure implemented by OE in 1972.

     In our opinion, OE needs to give higher priority to policy
implication memorandums or some other procedure for achieving
increased use of evaluation findings.  We recommended to HEW that
these evaluation results be better timed to coincide with the
legislative cycle and that more attention be given to more frequent
briefing of congressional committee staffs on the objectives, data,
and effectiveness measures being used in these evaluations.

Excerpts from Introduction of GAO Report

                "The table below lists the funds available
          to OE for planning and evaluation.  According to OE,
          these sums, although substantial, represent less than
          three-tenths of 1 percent of OE's total annual program
          appropriations and must cover approximately 85 legislative
          programs.  OE's Assistant Commissioner for Planning,
          Budgeting, and Evaluation estimated that from about
          1971 on, approximately two-thirds of the OE planning
          and evaluation appropriation funds have been used for
          OE evaluation activities.  (Chapter 4 provides funding
          information on State- and local-level evaluations of
          elementary and secondary education programs.)

                          OE Planning and Evaluation Funds

                                              OE program
                           OE planning        funds used
                         and evaluation    for evaluation
      Fiscal year        appropriations   (notes a/ and b/)       Total
                                     (000 omitted)

        1968                $ 1,250                -             $ 1,250
        1969                  1,250                -               1,250
      c/1970                  9,512           $ 4,155             13,667
   c/,d/1971                 12,475             8,724             21,199
   d/,e/1972                 11,225             3,950             15,175
      d/1973                 10,205             9,880             20,085
      d/1974                  5,200             5,268             10,468
      d/1975                  6,858            11,043             17,901
        1976                  6,383            10,512             16,895
a/Includes funds authorized from Follow Through, Emergency
  School Assistance Act, title I of the Elementary and Secondary
  Education Act, Basic Opportunity Grants, Project Information
  Packages, and Career Education programs.

b/Does not include program funds used by State and local education
  agencies for evaluations under Elementary and Secondary Education
  Act, titles I, III, VII, and VIII.

c/Does not include $5 million appropriated for grants to States
  for planning and evaluation under Elementary and Secondary
  Education Act, title V, part C--Comprehensive Educational
  Planning and Evaluation.

d/Includes support for the Educational Policy Research Centers
   (at Stanford Research Institute and Syracuse University Research
  Center) for the following fiscal years: $900,000 (1971);
  $900,000 (1972); $950,000 (1973); and $450,000 (1974). Monitor-
  ship of the centers was transferred to the Office of the Assistant
  Secretary for Education in fiscal year 1974.

e/Excludes $1 million earmarked for NIE planning.

                "Systematic, comprehensive evaluation of Federal
          education programs at the Federal level dates back only
          to 1970.  At that time the Congress increased OE
          evaluation funds in response to HEW's request.
          According to OE, such efforts were largely precluded
          before then by insufficient appropriated funds for
          evaluation and too few technically qualified evaluation
          staff members.  Since fiscal year 1970, OE has
          attempted to expand and upgrade its evaluation
          activities and capabilities.  The equivalent of 23
          professional full-time staff members are now assigned
          to evaluation.

                "The Office of Planning, Budgeting, and Evaluation
          has designed and begun over 100 evaluation and planning
          studies; instituted an annual evaluation plan highlighting
          yearly priorities; and implemented a process for
          disseminating, chiefly at the Federal level, the major
          results of evaluation studies.

                "Almost all OE evaluation and planning studies are
          performed under contract.  OE's evaluation office issues
          a request for proposals after determining the study's
          design and the techniques to be used--for example,
          sample size, analysis method, and data collection
          method.  Contractors are selected competitively.
          After a contract is awarded, an OE project monitor
          from the evaluation office monitors the contractor's
          performance by exercising approval over the approach
          to be used, making site visits, and reviewing progress
          reports.  The project monitor also reviews and approves
          the draft report's technical adequacy, completeness,
          and responsiveness before the report is finally
          accepted."

Excerpts from Chapter 4 of GAO Report

         "State and local education agency officials, responding
          to our questionnaires, indicated a need to improve
          evaluation reports, including the credibility of
          findings and the qualification and quantification
          of measurement data.  Other evaluation problems at
          the State and local level include relevance to policy
          issues, completeness and comparability of data reported,
          and report timeliness."
                                                              APPENDIX 6


     Title VIII of the Congressional Budget Act of 1974 requires that
the General Accounting Office (GAO) assist the committees of Congress
in specifying and developing their requirements for fiscal, budgetary,
and program-related information.  Upon request, GAO has been working
with the Senate Human Resources Committee, advising and assisting in
specifying and developing its overall information requirements in
order to fully participate in the new congressional budget process
and to strengthen its oversight function.

     Often the lack of an adequate base of information contributes
significantly to the difficulty of conducting responsive program
evaluations.  Responsive program evaluations can be performed only if
relevant and timely information is readily available.  Further, the
range of information (budget, financial, program performance, impact,
etc.) must be structured or linked together in a meaningful way.
Consequently, we consider the GAO responsibilities under Titles VII
and VIII of the Congressional Budget Act not only closely related but
mutually supportive.  As we assess the information needs of
committees, we also assess the information needs as they relate to
conducting program evaluations.  In addition, we consider the
management planning and feedback process needed to make the best use
of the evaluative information and to capture timely and useable
information.
Support for Committee Reports on Views and Estimates

       With the full participation and support of the agencies responsible

for programs within the Committee's legislative jurisdiction, detailed

budget information was specified and collected to support the March 15,

19i5, 1976 and 1977 "views and estimates" reports required by the Budget

Act.    Continued collaboration is essential to support the Committee in

its annual data requirements for the development of the March 15 "views

and estimates" reports as well as other phases of the budget process.

GAO will continue to support this maintenance requirement with an auto-

mated data base.     The Senate Computer Center is implementing a tracking

system for the Committee from a conceptual framework developed by GAO.

This system will follow budget-related congressional actions for the Com-

mittee's programs.     The system will be initiated each year from the GAO
automated data base which supports the Committee's "views and estimates"
reports.


Support for Committee Oversight

       In addition to this budget oriented development effort, we have been

working directly with the Committee staff and the executive agencies on

selected programs in defining and assessing the Committee's other infor-

mation requirements.    Fulfilling all the Committee's requirements will

need continued work.

       On December 18, 1975, we provided the Committee with an initial docu-
ment, "Discussion of Information Needs, Senate Committee on Labor and
Public Welfare" (OPA-76-57), which identified in broad terms the type of
information the Committee needed and described the conceptual framework

 for providing that information.   We then began work on two information
 systems to satisfy some of these needs.

      The first information system was discussed above.  The second focuses
on program planning, execution and performance information to support
oversight responsibilities.

Elementary and Secondary Education Act (ESEA) Programs

      On May 25, 1976, we published a three-part document entitled "Pro-
posed Formats for Information Collection from Selected Agencies of the
Department of Health, Education and Welfare: Part I, Office of Education;
Part II, the National Institutes of Health; and Part III, the Center for
Disease Control, the National Institute for Occupational Safety" (PAD-76-
33).  Attachment I displays a list of the types of information elements
these collection formats attempted to capture.

      Information used to test the Elementary and Secondary Education Act
(ESEA) programs part of this system was collected mainly from the Office
of Education.  The programs included are those authorized by ESEA Title I -
Educationally Deprived Children; Title IV, Part B - Libraries and Learning
Resources (Consolidation); Title IV, Part C - Innovation and Support (Con-
solidation); and Title VII - Bilingual Education.  The complete program
structure is identified in Attachment II.

     During last summer and fall GAO personnel assisted Office of Educa-
tion program and budget personnel in completing the formats designed to
collect information.  An initial evaluation of the results has been made
of the usefulness of the information to the Committee and the feasibility
and desirability of providing the information to the Committee.

     The selected education programs were and are currently being analyzed
as to their potential for supplying information for measuring accomplish-
ments against legislative, judicial, and executive operating objectives.

The information collected included program overview and budget execution

information, as well as performance and impact indicators.  In accordance
with guidance from the Committee staff we analyzed information and clas-

sified the requirements as (1) easily filled, (2) filled with some addi-

tional efforts, and (3) requirements needing long-range development to

fill the information gaps.     This effort is described in a GAO document

which will soon be provided to the staff of the Committee on Human Re-

sources.  The appendices to the draft document exhibit the information
collected from the Office of Education and demonstrate a display which
could be used to present the information.

     We believe that budget, financial and program information currently

available within OE offers an opportunity to provide an improved base of

information to the Committee.  There is, however, no cohesive presentation

of this information available to the Committee.       If properly displayed
and packaged for the Committee, we feel this information could be of valu-

able assistance to the Committee in carrying out its oversight and budget-

ary functions.    This should facilitate linking the budgetary, evaluation,

and performance information with congressional and agency decision-making

processes.    Also, linking the information with the congressional and agency

decision-making process should force better timing of evaluations, more

consistent information, and focus evaluation objectives on program impacts.

One of the objectives of our project is to demonstrate the feasibility of

agencies providing such a presentation.

      We believe there is sufficient information to support the initiation

of a system of information for the Committee.  However, there is still a
void in the area of good, hard evaluation and performance data.

     Adequate program performance and impact data are often not system-
atically available in agency information systems.  For instance, we have
found that evaluation information for some programs is not centrally coordi-
nated.  Further, the performance and impact information that is available
is often difficult to link with planning and resource allocation structures.

Education Information at State and Local Education Agencies
      Because of the information systems' inadequacies discussed above,
conducting sound evaluations is difficult.  In an attempt to determine
whether information is available outside the federal sector which could

help fill this void we have recently obtained assistance from the GAO

New York Regional Office/Albany staff.        We are jointly conducting an ex-
ploratory search of Education information at the State and local education

agencies.  This will not be a performance evaluation.  Through this work
an analysis will be made of Federal, State and local ESEA implementing
regulations, focusing on the information available at the State and local
education agencies in relation to the regulation requirements.  The re-
sults of this search will include an assessment of the potential useful-
ness of the information available at the State and local level, location

of the information, apparent voids in information, and any changes in

legislation needed to provide required information.  The results of this
work will be incorporated in our overall information systems development
effort.


Further Assessment

     As further Regional Office staff becomes available we will direct

other projects in evaluating and assessing the appropriateness and use-

fulness of currently available indicators.     This work could also result

in further refinements in the information requirements.

     Any information systems developed from this work will be made avail-

able to the agency as well as the Committee.

Social Indicators
     The Committee also requested that we assist it in specifying and

developing an operational system of social indicators.      Social indicators

related to employment were chosen for this preliminary work.

     We have reviewed the available employment data series.      These in-

cluded employment and unemployment statistics, wage and salary data, work-

ing conditions and benefit data, new job satisfaction measures, and

worker health and safety data.   We were able to describe the strengths

and weaknesses of the data as social indicators.  We found that these em-
ployment concerns were represented in several budget functions and many
programs to varying degrees.

     We examined the current state of the art for systems of social indi-
cators and found that the indicators were not systematically related to
particular programs or budget functions.  Presently, the only operational
system is a list of descriptive statistics.  It is difficult to specify
the contents of a list as complete because a theory which can measure
social well-being in employment has not been developed.  We are working with

the Committee staff to determine how to set up an operational system of

employment indicators to improve oversight and to assist it in budgetary
decisions.

Need for Initiation of Systematic Information Processes

     We believe that the approaches we are developing and testing for

the Committee will assist the agencies in developing more responsive

planning of evaluations and feedback of results and the Committee in

receiving useful and timely information.   It is our opinion that the
usefulness of performance and evaluative information for the Committee

could be improved through the initiation of a systematic process.

                                                                 ATTACHMENT I

                                       INFORMATION ELEMENTS

    -- authorizing legislation
    -- pending and proposed legislation
    -- time limits (if any) on the authorizing legislation
    -- funding constraints (if any) included in the authorizing legislation
    -- program/subprogram objectives
    -- other programs with similar or related objectives
    -- short program/subprogram description
    -- program/subprogram manager
    -- program/subprogram evaluation (accomplished, planned and in process)
    -- recipient information
    -- target group
    -- project, grant and loan information
    -- participating institution information
    -- program performance indices
    -- geographic distribution information
    -- budget authority
    -- receipts and reimbursements
    -- transfers
    -- reprogrammed funds
    -- estimated unobligated balances (prior years and current year)
       (available and unavailable)
    -- OMB budget account number
    -- year's financial plan
    -- program direction and operation costs

                                                                ATTACHMENT II

             Elementary and Secondary Education Act Programs

Title I - Educationally Deprived Children

     - Grants to Local Education Agencies
     - State Administration
     - Special Grants to Urban/Rural (Repealed June 30, 1975)
     - Incentive Grants
     - State Programs
          -- Handicapped in State Schools
          -- Migratory Children
          -- Neglected and Delinquent Children
     - Studies and Evaluation
          -- Participation Study
          -- Study on Updating Count of Children
          -- Study of Compensatory Education
          -- Study on Measure of Poverty
          -- Program Evaluation

Title IV, Part B - Libraries and Learning Resources, Consolidated

     - State Administration
     - Equipment and Minor Remodeling
          -- State Administration
          -- Loans to Non-profit Schools
     - School Library Resources
          -- Administration of the State Plan
     - Guidance, Counseling, Testing
          -- State Activities

Title IV, Part C - Innovation and Support, Consolidation

     - Strengthening State and Local Education Agencies
     - Supplementary Educational Centers and Services
     - Dropout Prevention
     - Nutrition and Health
     - Comprehensive Planning and Evaluation
     - State Administration

Title VII - Bilingual Education

     - Basic Program (Grants to Local Education Agencies for Classroom)
     - Materials Development
     - Advisory Council
     - Training
          -- Professional Development
          -- Resource Centers
     - Assistance to State Education Agencies
     - Commissioner's Report on Bilingual Education
                                                          APPENDIX 7


      A suggested oversight procedure is being developed by GAO in response

 to a request from Senator Leahy and in fulfillment of GAO's responsibilities

under the Congressional Budget Act, to develop and recommend to the

Congress methods for the review and evaluation of Government programs.

     The suggested oversight procedure being developed, if applied by
the Congress, would establish a disciplined process for agencies to
follow in monitoring, evaluating, and reporting on their programs in
order to answer congressional oversight questions.

     This suggested procedure is being designed to avoid pitfalls common

to program evaluation and to give the Congress several opportunities

to communicate and clarify its oversight concerns to the responsible

executive agencies.

     Under the procedure being developed, the Congress would first
establish its oversight requirements in authorizing legislation.  The
purpose of these requirements is to assure that the agencies know, as
explicitly as possible at the time the legislation is enacted, what
it is they are to report to the Congress, and when, about the implemen-
tation and evaluation of the program.

     The required reporting about program implementation and evaluation
following enactment would be aimed at establishing the basis for trans-
lating the general oversight concerns of the Congress into practical
questions and evaluation criteria that fit the legislation or program

under review.

     The procedure under development would provide several opportunities

for discussion between committees and agencies on the oversight questions

which are most important and on the evaluation measures which can sat-

isfactorily answer those questions.

     Thus, the oversight procedure being developed, while establishing

a disciplined review process, would permit case-by-case flexibility

for tailoring the type of evaluation to the nature of the program or

legislation under review.

     The procedure under development would provide for the Congress

to consider whether oversight questions such as the following can be

answered in a manner consistent with legislative intent, before requiring
an agency to conduct a detailed, time-consuming, and costly evaluation:

     1. Has the executive branch initiated implementation of the program?

     2. Has the responsible executive agency developed, designed, and
        established the program?

     3. Are specific program activities and operations being carried
        out at the field or operating level of the program?

     4. Can the operating program be evaluated and can congressional
        oversight questions be answered using agreed-upon measurements
        and comparisons within acceptable limits of time, cost, and

     Since the cost of answering each of the preceding questions increases

as one proceeds down the list, GAO's suggested oversight process is being

designed to proceed in a systematic manner both during and after the

enactment of authorizing legislation in order to answer these kinds of

basic oversight questions first.  In this way, it would be possible to
detect and resolve, as necessary, any problems which may arise in program
implementation and program evaluation planning before an evaluation study
of a program's outcomes, impacts, and/or performance is conducted.

                                                              APPENDIX 8


    The GAO's involvement in the audit and reanalysis of social experiments
has led to heightened interest, both inside and outside of the GAO, in
the development of standards and procedures for such work.  The develop-
ment of standards and procedures for audit and reanalysis requires that
the audit community and the social research community resolve several
difficult matters.  Among these are: (1) issues about the premature release
of research data; (2) the possibility that audit and reanalysis may be
viewed as intrusive procedures which might affect the outcome of an
experiment; and (3) issues about the protection of individually identified
research data obtained from participants.  On this latter point, in
addition to important legal and moral considerations, there is the
practical matter that without a reasonable guarantee of confidentiality,
citizen candor for evaluation purposes may suffer.

    To prepare ourselves to deal with these difficult matters, the GAO
awarded a contract to the Social Science Research Council (SSRC).

The contract has the general purpose of assisting the GAO in its develop-

ment of methods and techniques for auditing social experiments.   Included

in the contract scope is the specific purpose, "to identify and analyze

alternative methods by which GAO might meet its legislated responsibilities

in ways that will avoid being an undue influence on and/or causing damage

to experimental design and research results."

   GAO perceived the need to describe for the SSRC and the social

research community its reasoning behind its desire to audit and re-

analyze social experiments.   Consequently, the attached background paper

dated April 8, 1977, was prepared.   The background paper describes GAO's
statutory responsibility to review and evaluate the results of Government

programs and activities, which include social research and social experi-

mentation.   It also describes the nature of GAO's interest when it
accesses individually identified personal data for such purposes.

    We are conducting other related studies of audit and reanalysis

experience from our own work.   It is expected that these studies coupled
with the report by SSRC will enable GAO to publish additional guidelines

for the review and evaluation of social programs in accordance with its
responsibilities under Title VII of the Congressional Budget Act.  This
work, it should be pointed out, is occurring simultaneously with what
appears to be heightened awareness by social researchers that an appropriate
next step is the development of a set of comprehensive standards and
procedures by which to judge the quality of work performed by members
of the field.

                                                             APRIL 8, 1977

                        BACKGROUND PAPER FOR USE BY





Purpose of this paper

     SSRC has been awarded a contract which has the general purpose

of assisting the GAO in its development of methods and techniques for

auditing social experiments.   Included in the contract scope is the

specific purpose to identify and analyze alternative methods by which GAO

might meet its legislated responsibilities in ways that will avoid being

an undue influence on and/or causing damage to experimental design and

research results.

     This paper is intended to assure that the SSRC Committee has a
full understanding of the GAO responsibilities so that any alternative
methods are accurately evaluated in terms of meeting those GAO responsi-
bilities.

Implications for research and experiments
  of GAO responsibilities

     The Budget and Accounting Act, 1921, requires the Comptroller General

to investigate all matters relating to the receipt, disbursement and

application of public funds and to make investigations and reports required

by either House of Congress or by their Committees.    So that he may do so,

the Act also provides the Comptroller General or his authorized employees

access to and the right to examine any books, documents, papers or records

of all departments and establishments of the Government except the

Legislative Branch and the Supreme Court.

The Legislative Reorganization Act of 1970, as amended by the

Congressional Budget Act of 1974, requires the Comptroller General to

review and evaluate the results of Government programs and activities

carried on under existing law when ordered by either House of Congress,

or upon his own initiative, or when requested by any Congressional committee

having jurisdiction over such programs and activities.

     In order to carry out these broad investigative, evaluative, and

reporting duties imposed on the Comptroller General, the GAO needs the

access to records also provided by the above statutes.     That access

includes access to research and statistical records maintained in individually

identifiable form.

     The Privacy Act of 1974 specifically provides for GAO's access to

records on individuals maintained by Federal agencies, including provision

that such records may be disclosed to the Comptroller General or his author-

ized representatives in the course of performing the duties of the Office,

without the written request of, or prior written consent of, the subject of

the record.

The audit of social experiments

     Social experiments, as distinct from the usual experiment conducted

within the research community, are large and expensive, and are intended to

have direct impact on the policy process.     In contrast to the usual pro-

cedures of social research, such experiments are difficult to replicate.

Verification through replication is an essential canon of the scientific

tradition.    All of the above point to the fact that by their very nature

the conditions which surround social experiments are such that they are not

necessarily conducive to the open exchange, criticism, and careful examination

which one would expect in any experiment.      Thus, a strong case is made for

the audit function as a surrogate for replication, a particularly important

consideration in view of the policy-oriented objectives of a social experiment.

     The GAO audit function is viewed by Congress as an independent source

of information needed for use in its oversight of Federal programs and in

its authorization and appropriations activities.      The audit function can be

viewed in an experimental program as a particularly important source of

information about an experiment which, depending upon the outcome, could

contribute to Congressional debate about whether to adopt a new national
program.


     The GAO activities in this type of audit may be broadly described as

consisting of two functions:

           1. The evaluation of the experimental design of a
              social experiment in order to determine whether
              the design is adequate to supply the data necessary
              to answer the questions which underlie the experiment.

           2. Sufficient verification activities to assess the
              adequacy of the implementation of the experimental
              design in the actual data collection efforts.

Audit practice and privacy of data

      In 1972, GAO issued a document for guidance of government auditors,

 Standards For Audit Of Governmental Organizations, Programs, Activities

 And Functions.    This document provides specific rationale for the

standards which an auditor is expected to follow in his work.  Regarding

 the possible need for the auditor's access to individually identified data

for purposes of verification such as reinterview, the following standards

are particularly relevant:

        -- Due professional care is to be used in conducting
           the audit and in preparing related reports.

        -- Sufficient, competent, and relevant evidence is to
           be obtained to afford a reasonable basis for the
           auditor's opinions, judgments, conclusions, and
           recommendations.

     However, the function of audit of social research and social experimenta-

tion and its need for direct access to the data raises the question of

privacy and protection of individually identified social research data.

There have been various occasions in which GAO has obtained selective

access to such data, adequate for the objectives of the particular audit.

     In its reviews of social research and social experimentation, GAO

is not interested in personal information about individuals to make

determinations about them or about their rights and entitlements.     GAO is

interested in that information only as an aid in evaluating the research

or the experimental program being reviewed.

Most effective methods for the
  audit of social experiments

     The essence of an experiment should be to test some new idea.     Such

an experiment requires collection of data in a carefully designed procedure

 to measure the effect of some experimental treatment or treatments in com-

parison with what exists in very similar situations without the experiment.

     GAO may decide that to test the validity of an experimental data

base, access to individually identified data is required for several reasons.

One reason might be that the auditor needs to reinterview a sample of the

participants to verify that the subject selection procedures have been

carried out in accordance with the experimental design.   Another reason
might be to verify that other conditions or variables of the experiment

are correctly recorded, e.g., type and quality of health care services

received, quality of a participant's housing, etc.

     Accordingly, it is important for GAO to consider, in planning each

audit where review of a social experiment is involved, the costs and

benefits of alternatives to reinterviewing that satisfy GAO's responsibilities.

It should be expected that GAO will decide in some cases that in meeting

its responsibilities to the Congress, reinterviewing is the most effective

method to achieve its objective.
