
Potentially Overlapping High School Programs.

Published by the Department of Education, Office of Inspector General on 2011-12-01.

Below is a raw (and likely hideous) rendition of the original report. (PDF)

                                  UNITED STATES DEPARTMENT OF EDUCATION
                                         OFFICE OF INSPECTOR GENERAL


                                                                                                                  Control Number
                                                                                                               ED-OIG/A19K0013
                                                         December 1, 2011
Tony Miller
Deputy Secretary
U.S. Department of Education
400 Maryland Avenue, S.W.
Washington, DC 20202

Dear Mr. Miller:

This final audit report, entitled Potentially Overlapping High School Programs, presents the
results of our audit. The objectives of our audit were to (1) assess the extent to which the
Department of Education’s (Department) high school programs are duplicative, and
(2) determine if the Department has collected data that show whether these programs appear to
be effective and efficient in reducing gaps between low-income and minority students and their
peers in high school graduation and college access/success.




                                                      BACKGROUND



The Department establishes policy for, administers, and coordinates most Federal assistance to
education. The Department’s mission is to serve America’s students – more specifically, to
promote student achievement and preparation for global competitiveness by fostering
educational excellence and ensuring equal access.

To facilitate the administration of grant programs authorized and funded by Congress, the
Department is organized into a number of Principal Offices (POs). Each PO is responsible for a
portfolio of distinct, albeit related, programs and initiatives. For example, the Office of
Elementary and Secondary Education (OESE) administers programs designed to assist State and
local educational agencies (LEAs) in improving the achievement of elementary and secondary
school students, particularly those who are disadvantaged. Programs under the Office of
Postsecondary Education (OPE) are intended to address the national need to increase access to
quality postsecondary education, strengthen the capacity of colleges and universities, and
provide teacher and student development resources. The Office of Vocational and Adult
Education (OVAE) oversees programs related to adult education and literacy, career and
technical education, and community colleges.

Other POs with significant numbers of grant programs include the Office of Innovation and
Improvement (OII), which supports trials of innovations in the education system and broadly
disseminates lessons learned; the Office of Safe and Drug-Free Schools (OSDFS), which
provides financial assistance for activities aimed at drug and violence prevention and the
promotion of health and well-being of students in elementary and secondary schools and
institutions of higher education; and the Office of Special Education and Rehabilitative Services
(OSERS), which supports programs that help educate and provide for the rehabilitation of
individuals with disabilities, as well as research.

Many of the programs administered by these POs are geared toward high school students, the
focus of this audit. As noted in the Department’s High-Priority Performance Goals, the
President’s vision is that “… by 2020, America will again have the best-educated, most
competitive workforce in the world with the highest proportion of college graduates of any
country. To do this, the United States must also close the achievement gap, so that all youth –
regardless of their backgrounds – graduate from high school ready to succeed in college and
careers.” Among the related educational outcomes listed are improving all states’ overall and
disaggregated high school graduation rates and improving the nation’s overall and disaggregated
college completion rate.

To inform its efforts concerning high school programs, the Office of the Deputy Secretary
requested that the Office of Inspector General (OIG) perform work that would answer the
objectives stated above.




                                     AUDIT RESULTS



Our audit found that while none of the Department’s high school-related programs appears to be
duplicative, there is some overlap among programs. Specifically, we noted that 6 of the 18
(33 percent) high school-related programs that we identified appear to overlap with at least one
other program. We also noted that a number of the programs we reviewed, including all six of
the programs that appear to overlap with other high school programs, have been
proposed for elimination and/or consolidation in past Department budget submissions as well as
its most recent Elementary and Secondary Education Act reauthorization proposal, partly due to
concerns over duplication with other programs.

We found that, although the Department has collected performance data on the 18 programs
included in our review, it has not collected data or established performance measures specifically
related to the programs’ effectiveness in reducing gaps between low-income and minority
students and their peers in high school graduation and college access/success. Although data
on these programs’ effectiveness in closing achievement gaps are unavailable, we noted that
eight of the programs (44 percent) do have measures that require the collection of data
specific to low-income and minority student performance with regard to high school
graduation rates or college access/success among program participants. The Department may be
able to further use such data to determine program impact on reductions in achievement gaps.

Of these eight programs, we noted that five (63 percent) – Advanced Placement Incentive
Program (APIP), Advanced Placement Test Fee Program (APTF), Gaining Early Awareness and
Readiness for Undergraduate Programs (GEAR UP), Talent Search, and Upward Bound Math-
Science (UBMS) – generally appear to be showing positive results regarding high school
graduation rates or college access/success in the noted populations, based on a review of
available Departmental performance data. Conversely, three of the programs – Migrant
Education-High School Equivalency Program (ME-HEP), Prevention and Intervention Programs
for Children and Youth who are Neglected, Delinquent, or At-Risk (N-D), and Upward Bound
(UB) – may not be producing such results.

In its response to the draft audit report, the Office of the Deputy Secretary (ODS) agreed with the
recommendations and described corrective actions planned. ODS stated it appreciated OIG’s
insight and would continue to examine whether the Department’s support for high schools is
configured to have the most positive effects for students. ODS also noted that it did not believe
that program overlap is inherently undesirable, and believed that the clarity and accuracy of the
report would be improved by providing further information or explanation in some areas. After
reviewing the comments, we have modified some areas of the report to provide further clarity as
requested. We have also modified recommendation 1.3 to recognize that the Department is
limited by statute in its ability to prevent grantees from receiving funding under similar
programs. The recommendation now focuses exclusively on efforts that could assist in ensuring
students are not over-served by similar programs. Other than the modifications noted, we have
not made any additional changes to our findings and recommendations. ODS’ comments are
summarized at the end of each applicable finding. The full text of ODS’ response is included as
Attachment 5 to this report.


FINDING NO. 1 – Overlap Exists Among Some Department High School Programs

Our audit found that while none of the Department’s high school-related programs appears to be
duplicative, there is some overlap among programs. Specifically, we noted six high school-
related programs that appear to overlap with at least one other program.

We identified a total of 18 Department grant programs 1 that either serve high school students
only (directly or indirectly) or include them as a primary target population. Eight of these
programs are administered by OESE, five by OPE, two each by OVAE and OII, and one by
OSDFS. [See Attachment 1 for more detailed information on these programs.]

While each of the 18 programs reviewed contains some unique characteristics, we found that they
can be grouped, essentially, into two main categories: (1) those with a focus on one subject area
or on a specific subpopulation of students; and (2) those with a broad focus on encouraging high
school graduation and/or promoting college access/success, primarily (but not solely) among
low-income and minority students. The latter category can also be subdivided into programs that
relate to career and technical education (administered by OVAE) and programs focused mainly
on academic preparation for postsecondary education (administered by OESE and OPE).

1
 For purposes of this audit, a “grant program” was defined as any program with a separate listing in the
Department’s “Guide to Education Programs” and/or a unique Catalog of Federal Domestic Assistance number.

As shown in Table 1 below, we identified nine programs that fall under the “Specific Subject
Area or Subpopulation” category (Category A), and nine programs that fall under the
“High School Graduation and/or College Access/Success” category (Category B). The nine
programs included under Category A have more narrowly-focused goals, objectives, and
performance measures, and/or are targeted toward certain, often hard to reach, subpopulations.
These programs are also generally smaller, in terms of annual funding, than those included in
Category B and, despite sharing some similarities, offer fundamentally different services to
unique populations. As a result, there appeared to be little potential for substantial overlap or
duplication with the other high school programs. We subsequently focused our work on
assessing the extent to which overlap occurs among the programs included under Category B.

                                        Table 1: Program Focus

 Category A – Specific Subject Area or Subpopulation (9 programs):
     Advanced Placement Incentive Program (APIP) (D)
     Advanced Placement Test Fee Program (APTF) (D)
     Migrant Education-High School Equivalency Program (ME-HEP) (D)
     Prevention and Intervention Programs for Children and Youths Who Are Neglected,
        Delinquent, or At Risk (N-D) (F)
     Striving Readers (D)
     American Academies for History and Civics (AAHC) (D)
     Close Up Fellowship Program (Close Up) (E)
     Grants to Reduce Alcohol Abuse (GRAA) (D)
     Upward Bound Math-Science (UBMS) (D)

 Category B – High School Graduation and/or College Access/Success (9 programs):
     High School Graduation Initiative (HSGI) (D)
     School Improvement Grants (SIG) (F)
     Smaller Learning Communities (SLC) (D)
     College Access Challenge Grants (CACG) (F)
     Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) (D)
     Talent Search (D)
     Upward Bound (UB) (D)
     Career and Technical Education (CTE)* (F)
     Tech Prep Education (Tech Prep)* (F)

 “D” denotes discretionary grant programs, “E” denotes earmarks or Congressionally-directed programs, and “F”
 denotes formula grant programs and noncompetitive discretionary grant programs.
 * Denotes programs related to career and technical education administered by OVAE.

Table 2 shows additional detail on the nine OESE, OPE, and OVAE high school programs
included under Category B above that we identified as having a broad focus on encouraging high
school graduation and/or promoting college access/success.

                    Table 2: High School Graduation and/or College Access/Success
     Program Name       PO      Program Office 2                  Goal                             Target Population
    High School        OESE       AITQ       To support effective, sustainable, and coordinated    Students in
    Graduation                               statewide school dropout prevention and reentry       schools with
    Initiative                               programs.                                             high dropout
                                                                                                   rates

    School             OESE       SASA       To improve student achievement in Title I             Students in low-
    Improvement                              schools identified for improvement, corrective        performing
    Grants                                   action, or restructuring so as to enable those        schools
                                             schools to make adequate yearly progress (AYP)
                                             and exit improvement status.
    Smaller            OESE       AITQ       To assist high schools in creating smaller learning   Students in large
    Learning                                 communities that can prepare all students to          schools
    Communities                              achieve to challenging standards in college and
                                             careers.
    College Access      OPE     HEP/State    To increase the number of low-income students         Low-income
    Challenge                    Service     prepared to enter and succeed in postsecondary        students
    Grant                                    education by fostering partnerships among
                                             Federal, state, and local governments and
                                             philanthropic organizations through matching
                                             challenge grants.
     Gaining Early       OPE        HEP/      To significantly increase the number of low-          Low-income
    Awareness and                 Student    income students who are prepared to enter and         students
    Readiness for                 Service    succeed in postsecondary education.
    Undergraduate
    Programs

    Talent Search       OPE        HEP/      To increase the percentage of low-income, first-      Low-income,
                                  Student    generation college students who successfully          potentially first-
                                  Service    pursue postsecondary educational opportunities.       generation
                                   (TRIO)                                                           college students

    Upward Bound        OPE        HEP/      To increase the percentage of low-income, first-      Low-income,
                                  Student    generation college students who successfully          potentially first-
                                  Service    pursue postsecondary educational opportunities.       generation
                                   (TRIO)                                                           college students

    Career and         OVAE       DATE       To increase access to and improve educational         All students
    Technical                                programs that strengthen education achievement,
    Education                                workforce preparation, and lifelong learning.

    Tech Prep          OVAE       DATE       To increase access to and improve educational         All students
    Education                                programs that strengthen education achievement,
                                             workforce preparation, and lifelong learning.




2
    Refer to Attachment 4 for definition of noted acronyms.

All seven of the OESE and OPE programs contain elements that are designed to improve student
academic achievement, encourage high school graduation, and promote college access/success –
although the degree to which each of these activities occurs varies from program to program.
HSGI, SIG (in part), SLC, and GEAR UP are typically thought of as having a more pronounced
effect on the first two areas, while CACG, Talent Search, and UB provide a link between
secondary and postsecondary education. Both of the OVAE programs, CTE and Tech Prep,
promote the integration of academic, career, and technical education between secondary and
postsecondary schools.

In conducting our audit, we identified essentially four areas where overlap can occur:
(1) program goals, objectives, and performance measures; (2) target population; (3) services
provided; and (4) the manner in which services are provided. We established that to be
duplicative, a program would have to match another program in all four areas. We noted that
none of the programs could be deemed duplicative; however, six of the nine programs
(67 percent) appear to overlap to varying degrees with at least one other program, as follows:

    •	 CACG, GEAR UP, Talent Search, and UB: This group of programs provides similar
       services to similar target populations, including assistance in the college admissions
       process and academic, career, and financial counseling. Talent Search and UB are
       especially alike, in that both are discretionary grant programs that target individual
        students; share the same goal, objective, and performance measures; and, according
       to the Department’s Fiscal Year (FY) 2012 TRIO 3 Budget Justification, provide the same
       services (although UB also provides an on-campus residential summer component and
       work-study positions). The main difference between these two programs, as described to
       us by program officials, is in the level of intensity of services provided and, subsequently,
       impact observed. Talent Search is a “light touch” program, focused primarily on the
       various types of counseling described above, that served 360,000 individuals in FY 2010,
       at a cost to the Federal government of approximately $400 per participant. UB, on the
        other hand, offers a more comprehensive program, including academic instruction in
        various subjects in addition to the counseling described above, and provided services to
        53,000 participants at a cost of almost $5,000 per participant. The number of participants
       per project also differs significantly, averaging about 780 for Talent Search and 80 for
       UB. Lastly, Talent Search can provide services to middle school students, while UB
       focuses on high school students only.

    •	 CTE and Tech Prep: Both programs share a common goal, serve the same target
       population, and report on identical performance measures in the Department’s
       performance reporting system and annual budget justifications to Congress. They differ
       somewhat in how the goal is achieved – with CTE implemented within individual school
       districts, in accordance with local and State plans, while Tech Prep, although also part of
       local and State plans, requires the use of articulation agreements 4 between consortia of
       schools – but both nevertheless strive toward the same goal. Officials with whom we
       spoke readily acknowledged overlap. They noted that Congress included a provision in
       the 2006 reauthorization of the Carl D. Perkins Career and Technical Education Act
       (Perkins IV) that allows States to consolidate their CTE and Tech Prep funds. In its
       FY 2012 Tech Prep Budget Justification, the Department reported that 28 States
       consolidated at least a portion, and generally all, of their Tech Prep funds into the CTE
       program.

       We noted that the Department did not request separate funding for Tech Prep in its last
       two budget submissions. Rather, it proposed redirecting, or consolidating, funding for
       the program into CTE in order to give States and local entities more flexibility in
       allocating funds. The final FY 2011 appropriation eliminated funding for Tech Prep,
       effectively terminating the program; however, the possibility exists that funding could
       later be restored.

3
  The Federal TRIO Programs are Federal outreach and student services programs designed to identify and provide
services for individuals from disadvantaged backgrounds. TRIO includes eight programs targeted to serve and assist
low-income individuals, first-generation college students, and individuals with disabilities to progress through the
academic pipeline from middle school to postbaccalaureate programs. We included three of these programs
(Talent Search, UB, and UBMS) in this audit because they focus specifically on high school students.

The Government Accountability Office’s (GAO) “Standards for Internal Control in the Federal
Government” states:

        Internal control should provide reasonable assurance that the objectives of the agency are
        being achieved in the following categories:

             •	 Effectiveness and efficiency of operations including the use of the entity’s
                resources.

The Department of Education Organization Act, P.L. 96-88, Section 102, states that among the
purposes of the Department are to

             •	 Improve the coordination of Federal education programs;
             •	 Improve the management of Federal education activities; and
             •	 Increase the accountability of Federal education programs to the President, the
                Congress, and the public.

Proposals for Congressional Action

We noted that a number of the programs we reviewed, including all six of the programs noted
above that appear to overlap with other high school programs, have been proposed for
elimination and/or consolidation in past Department budget submissions as well as in its most
recent Elementary and Secondary Education Act (ESEA) reauthorization proposal. Among the
Department’s reasons for eliminating or consolidating programs are that the program: (1) is too
small to have a significant impact nationally, (2) duplicates other programs, (3) has achieved its
intended purpose, (4) has consistently failed to achieve its intended purpose, or (5) would be
more appropriately financed by State and local agencies and the private sector. Until recently,
however, Congress for the most part continued to fund these programs. The final FY 2011
appropriation, enacted in April 2011, eliminated funding for five of the programs in our review:
(1) SLC, (2) Striving Readers, (3) AAHC, (4) Close Up, and (5) Tech Prep.

4
  An articulation agreement is an officially approved agreement that matches coursework and/or governs the transfer
of credits between schools. In the case of Tech Prep, each project is carried out under an articulation agreement
between participants in a consortium and consists of at least 2 years of high school followed by 2 years or more of
higher education or apprenticeship. The idea is to develop a structural link between secondary and postsecondary
institutions that integrates academic and career and technical education and better prepares students to make the
transition from high school to college and from college to careers.

Attachment 2 shows the programs in our review that have been proposed by the Department for
elimination and/or consolidation in recent years.

Coordination Efforts

While the Department has made some improvements in coordination among its high
school programs, additional improvements are needed. Specifically, we noted that the
Department’s current efforts might be strengthened by placing a greater emphasis on
encouraging coordination between program offices regarding administrative and operational
matters.

During our audit, we learned that a group referred to as the Secondary Schools Working Group
(SSWG) began meeting in November 2009. The group’s purpose is to review programs and
policies within the Department, with a focus on improving coordination between program
offices, as well as to discuss promising initiatives and best practices underway in high schools
across the country. Based on our audit work, it appears that much more time and attention
has been afforded to the second stated objective, with SSWG’s main product thus far being a
document submitted to the Department’s Policy Committee that identifies overarching goals for
the nation’s high schools and high school students, significant challenges, and short- and long-
term strategies for achieving these goals.

SSWG participants, who include political appointees and career staff from most of the
Department’s POs, met weekly from November 2009 until June 2010, and began meeting again
starting in December 2010. Each meeting is normally devoted to one or two special topics, with
outside experts often brought in to discuss related issues. POs also sometimes give presentations
on their high school programs and provide news that may be of value to group members. During
our discussions, however, we learned that many of the officials who administer the programs in
our review were either unaware of the SSWG or were aware of its existence but did not attend
meetings. Others stated that they had attended meetings in the past, but have not done so on a
regular basis.

We discovered that there have been other largely informal efforts at coordination among related
programs as well. OPE recently underwent a reorganization that placed GEAR UP and Talent
Search in the same program office, thus allowing staff – who will be assigned grants under both
programs – to collaborate more directly to achieve related goals and objectives. Similarly, five
of the OESE programs on our list are administered by OESE/AITQ’s High School Programs
Group. Most of the group’s staff work on multiple programs and are thus well-positioned to
identify inconsistencies if the same grantee submits an application for funding under multiple,
similarly-focused grant programs. They also have a better chance of preventing a potential
grantee from using funds awarded under different grants for the same activity, which can be
determined during the application cost analysis and budget review. OESE maintains a file
identifying all schools served under its grants to keep track of where funds are being spent – an
activity that we determined OPE also performs. As for coordination with other offices, officials
described working with GEAR UP, in particular, in an effort to mitigate potential overlap.

Overlapping programs increase the administrative burden on Department staff, as each program
has its own legislative and regulatory requirements, as well as application, award, and reporting
requirements. Eliminating or combining programs could help reduce the number of award
competitions, simplify the preparation of program guidance and materials, and perhaps most
importantly, allow the Department to more efficiently and effectively focus resources on
monitoring and oversight activities. Many of the program officials that we met with during this
audit stated that they wished they had more time for monitoring activities.

In addition, administering overlapping programs that do not appear to be performing effectively
or producing a positive impact allows funds to continue to be used for programs that may
provide little or no added value. Some of the programs we identified as overlapping and that
have been previously recommended by the Department for elimination or consolidation continue
to be funded, even though the most recently available performance results and evaluations
indicate that the programs may not be realizing their goals and objectives. Specifically, the UB
program was rated as ineffective in its Program Assessment Rating Tool (PART) review and was
noted as having limited to no effect on its overall population of students in related studies. 5 CTE
was also rated as ineffective in its PART review and shown to have mixed or inconclusive results
in related studies. [See Finding No. 2 for additional information.]

Overlapping programs can also increase the burden on grantees with regard to administration and
oversight. In addition, the risk exists that grantees are receiving multiple related awards and
potentially providing overlapping services to the same students and/or schools while other
qualifying students and/or schools are overlooked.

At the grantee level, we noted 168 instances of a single grantee receiving funds under both the
Talent Search (with 265 grants awarded between FYs 2007 and 2011) and UB (with 967 grants
awarded during this same time period) programs. 6 For those grantees where funds were received
under multiple programs, we found 54 instances where the same Project Director was listed in
the Department’s Grant Award Database. We note that the authorizing statute specifically
permits grantees to receive funds under both programs at the same time. We also note that the
Office of Planning, Evaluation and Policy Development (OPEPD)/Policy and Program Studies
Service (PPSS) recently contracted for a study that will analyze Department data and grantee
performance reports to determine the extent to which there is overlap in schools with GEAR UP
and UB grants.

5
  PART was designed and implemented under the previous Administration to help assess the management and
performance of Federal programs. It was used by the Department and the Office of Management and Budget
(OMB) to evaluate a program’s purpose, design, planning, management, results, and accountability to determine its
overall effectiveness. The current Administration has opted not to continue to use this particular tool, instead
promoting a focus on transparency and accountability throughout the Federal government and an increased emphasis
on rigorous, independent program evaluations.
6
  Number of grants awarded under each program includes new grants with actual award dates noted between
FY 2007 and FY 2011 in the Department’s Grant Award Database.

Recommendations

We recommend that the Deputy Secretary:

1.1	   Continue to actively promote coordination among similar programs, ensure that key staff
       are aware of such efforts and encouraged to participate, refocus some of the Department’s
       current efforts to better reflect coordination efforts, emphasize coordination as relating to
       administrative and operational matters, and consider formalizing other notable informal
       coordination efforts.

1.2	   Continue to work with Congress to consolidate or eliminate programs that overlap with
       one another, with an emphasis on those that do not appear to be achieving intended results.

1.3	   Ensure monitoring efforts at schools, local education agencies, and/or grantees include a
       review of program participant listings to help ensure that students are not being over-
       served by similar programs and services to the detriment of other eligible students that
       could also benefit from such programs and services.

Department Comments

In its comments to the draft audit report, ODS stated that while it was encouraged that no
instances of program duplication were identified, it will nevertheless continue to examine
whether the Department’s support for high schools is configured to have the most positive effects
for the nation’s students. ODS also stated, however, that it does not believe program overlap to
be inherently undesirable, provided that services offered under similar programs are
complementary and coordinated to the extent possible. Additionally, ODS cited areas in which
it believed the clarity and accuracy of the report could be improved upon by providing further
information or explanation, particularly with regard to the differences in intensity of services
provided between the Talent Search and UB programs.

ODS agreed, in general, with all of our recommendations, stating that it will continue to promote
coordination among similar high school programs through the SSWG and by other means, such
as a CTE Strategy Workgroup established in summer 2010, in an effort to improve
administrative efficiency and overall program impact. It also referenced both its annual budget
development process and the Administration’s ESEA reauthorization proposal, which serve as
vehicles for the identification of programs that are duplicative or not achieving intended results
and contain suggestions to Congress concerning program consolidation and elimination. Lastly,
ODS stated that it agrees in principle with the idea that students should not be over-served by
Federal education programs. However, it noted that the authorizing statute for the TRIO
programs – a significant component of our review – specifically permits an entity to receive
multiple grants under different programs. Consequently, although Department staff track
whether entities are receiving multiple related grants, their ability to prevent potential service
overlap – particularly between the Talent Search and UB programs – is somewhat limited.

OIG Response

While we agree that some degree of overlap between programs may not always be undesirable or
entirely preventable, we encourage the Department to continue its efforts to identify such
programs, explore opportunities for collaboration and coordination, and consider consolidation
or elimination where appropriate. As ODS noted in its comments, reducing and eliminating
duplication is a key step toward increasing efficiency and productivity.

After reviewing ODS’ comments, we have modified some areas of Finding 1 to provide further
clarity. We have also modified recommendation 1.3 to recognize that the Department is limited
by statute in its ability to prevent grantees from receiving funding under similar programs. The
recommendation now focuses exclusively on efforts that could assist in ensuring students are not
over-served by similar programs.


FINDING NO. 2 – Performance Measures and Available Data on the Reduction of
                Gaps Between Low-Income and Minority Students and Their
                Peers Are Lacking

We found that, although the Department has collected performance data on the 18 programs
included in our review, it has not collected data or established performance measures specifically
related to the programs’ effectiveness in reducing gaps between low-income and minority
students and their peers in high school graduation and college access/success. However,
although data on all of these programs’ effectiveness in closing achievement gaps are unavailable,
we noted that eight programs (44 percent) do have measures that require the collection of data
specific to low-income and minority student performance with regard to high-school graduation
rates or college access/success among program participants. The Department may be able to
further use such data to determine program impact on reductions in achievement gaps. We noted
that not all of the remaining 10 programs would necessarily have similar measures or
results: some are newly authorized or reauthorized, such that final measures have not yet been
established or reported on, while others are narrowly focused on unique subpopulations of
students.

Of the eight programs with measures concerning low-income and/or minority student
performance in these areas, we noted that five (63 percent) – APIP, APTF, GEAR UP, Talent
Search, and UBMS – appear to generally be showing positive results, based on a review of
available Departmental performance data. Conversely, three of the programs – ME-HEP, N-D,
and UB – may not be producing such results. 7 In addition, we noted that some of the
performance data available, particularly in terms of reports posted on program websites, are
dated; in other cases, data are unavailable.




7
 Of these eight, three – GEAR UP, UB, and Talent Search – were identified under Finding No. 1 as potentially
overlapping programs.

Available Performance Data

Based on our review of available Departmental documentation and discussions with Department
officials, we determined that there are four main sources for information on program
effectiveness and efficiency: (1) annual performance plans/reports, to include data provided in
the Department’s yearly budget justifications; (2) PART reviews, instituted in the early 2000s
and administered by OMB; (3) evaluations conducted by the Institute of Education Sciences
(IES); and (4) evaluations conducted by OPEPD/PPSS. 8

Table 3 shows what information is currently available for each of the 18 programs reviewed as
part of this audit, as well as which programs were specifically identified as having performance
measures relating to low-income and minority student high school graduation rates and/or
college access/success (indicated by a check in the performance measures column). It does not
include any evaluations that may currently be underway.

           Table 3: Available Performance Data / Low-Income and/or Minority Student
                                     Performance Measures

                           Annual        Low-Income and/or
                           Performance   Minority Student    PART         IES             OPEPD/PPSS
     Program Name          Plan/Report   Performance         Assessment   Evaluation(s)   Evaluation(s)
                                         Measures
     APIP                       ✓              ✓                 ✓
     APTF                       ✓              ✓                 ✓
     HSGI
     ME-HEP                     ✓              ✓                 ✓
     N-D                        ✓              ✓                 ✓
     SIG
     SLC                        ✓                                ✓                              ✓
     Striving Readers           ✓                                               ✓
     AAHC                       ✓
     Close Up                   ✓
     CACG
     GEAR UP*                   ✓              ✓                 ✓                              ✓
     Talent Search*             ✓              ✓                 ✓                              ✓
     UB*                        ✓              ✓                 ✓                              ✓
     UBMS*                      ✓              ✓                                                ✓
     GRAA                       ✓
     CTE                        ✓                                ✓                              ✓
     Tech Prep                  ✓                                ✓                              ✓
     Total (18)                15              8                10              1               7
     * Denotes programs for which we located more than one OPEPD/PPSS evaluation.

For the eight programs identified as having measures concerning low-income and/or minority
student graduation rates and/or college access/success, we further reviewed detailed data on


8
 Our review did not include evaluations conducted by external entities. We included only those data sources that
were prepared by the Department or for which the Department played a key role in the development process.

program effectiveness and efficiency included in each of the above noted data sources. A
summary of our review is presented in Table 4 below and the related narrative that follows.

                             Table 4: Summary of Program Performance Data 9

For each program, the table reports: effectiveness (percent of effectiveness targets met/exceeded
and percent of targets showing progress over the previous year); efficiency (the same, for
efficiency targets); the PART rating and program results score; IES evaluation results; and
OPEPD/PPSS evaluation results.

     APIP            Effectiveness: 100%; 100% (2010). Efficiency: ~. PART: Moderately
                     Effective, 42% (2005). IES: None. OPEPD/PPSS: None.
     APTF            Effectiveness: 25%; 75% (2010). Efficiency: 100%; 100% (2008). PART:
                     Moderately Effective, 42% (2005). IES: None. OPEPD/PPSS: None.
     ME-HEP          Effectiveness: 50%; 100% (2010). Efficiency: n/a;* 0% (2009). PART:
                     Results Not Demonstrated, 0% (2004). IES: None. OPEPD/PPSS: None.
     N-D             Effectiveness: 0%; 0% (2009). Efficiency: 0%; 0% (2009). PART:
                     Adequate, 33% (2007). IES: None. OPEPD/PPSS: None.
     GEAR UP         Effectiveness: 71%; 29% (2008/2009). Efficiency: ^. PART: Adequate,
                     13% (2003). IES: None. OPEPD/PPSS: Too early to tell; inconclusive
                     (2003) / Generally positive effect (2008).
     Talent Search   Effectiveness: 100%; 100% (2009). Efficiency: n/a;* 100% (2009). PART:
                     Moderately Effective, 50% (2005). IES: None. OPEPD/PPSS: Mixed;
                     inconclusive (2004) / Generally positive effect (2006).
     UB              Effectiveness: 100%; 100% (2009). Efficiency: n/a;* 100% (2009). PART:
                     Ineffective, 16% (2002). IES: None. OPEPD/PPSS: Generally limited to
                     no effect (2004) / Generally limited to no effect (2009).
     UBMS+           Effectiveness: 100%; 100% (2009). Efficiency: n/a;* 100% (2009). PART:
                     None. IES: None. OPEPD/PPSS: Generally positive effect (2006) /
                     Generally positive effect (2010).
     ~ The efficiency measure for the AP programs appears to relate only to APTF.
     * Although ME-HEP, Talent Search, UB, and UBMS have established efficiency measures and reported data for a
     number of years, they do not provide annual targets.
     ^ The Department’s FY 2012 GEAR UP Budget Justification identifies its efficiency measure as “… the cost of a
     successful outcome, where success is defined as enrollment in postsecondary education by GEAR UP students
     immediately following high school graduation.” However, it also notes that the Department has not yet determined
     how to calculate this measure.
     + Disaggregated UB and UBMS data are provided in OPE’s annual grantee-level performance results report.

9
    Where applicable, the year of the most current data available at the time of our review is noted.

Annual Performance Reports

In reviewing available performance data, we sought to determine not only whether a program
had met the targets established for its performance measures, but also whether it showed
progress over the previous year. We reasoned that this information, taken together, would
provide a more comprehensive and accurate picture as to whether results are trending positive or
negative. With regard to program effectiveness, we noted that two programs that did not meet all
of their most recent targets did, in fact, show improvement [APTF; ME-HEP]. We also noted
one program where the opposite is true [GEAR UP]. With regard to program efficiency, we
found the Department did not have available data for two of the eight programs in our review
[APIP; GEAR UP]. In addition, four of the programs that established efficiency measures did
not provide annual targets, thus preventing us from noting whether targets were met/exceeded or
not met [ME-HEP; Talent Search; UB; UBMS]. We were, however, able to note whether
progress was made over the previous year, as these programs did report historical data. Of the
six programs with an efficiency measure(s), one met its target [APTF] and three others showed
some improvement over the previous year [Talent Search; UB; UBMS].

We noted that the Department does not report separately on UBMS in its annual TRIO budget
justifications, nor include such results in its performance reporting system. Rather, data are
aggregated with the results for regular UB. However, the Department does include
disaggregated results for each program in its annual grantee-level performance results report, the
most recent of which (2008-2009) was provided by OPE.

Lastly, we noted that not all of the performance data available are timely. This is particularly
true of reports posted on program websites, many of which have not been updated for several
years. Although there may be valid reasons why this occurs, including that larger programs with
more grantees likely take longer to collate data, there is also an overarching need for increased
transparency and accountability.

PART Assessments

We reviewed applicable PART questionnaires for the seven programs that had assessments
performed. We found that two of the programs were rated “Adequate” [N-D; GEAR UP], three
were rated “Moderately Effective” [APIP; APTF; Talent Search], one was rated “Ineffective”
[UB], and one was rated “Results Not Demonstrated” [ME-HEP]. 10 In addition, none of the
programs scored over 50 percent on the program results section of the assessment. This section
focused on results that programs can report with accuracy and consistency, and itself accounted
for half of a program’s overall score. One program in our review scored 0 percent [ME-HEP]
and two others scored under 20 percent [GEAR UP; UB]. These results suggest that there may
be deficiencies with regard to the programs’ ability to achieve both short- and long-term
performance goals. However, it should be noted that all of the PART reviews were conducted
between 2002 and 2007 and may not represent the most current information on these programs.
IES officials also expressed concern that the quality and rigor of the evaluation evidence on


10
   According to the PART website, a rating of “Results Not Demonstrated” is given when a program – regardless of
its overall score – does not have agreed-upon performance measures or lacks baselines and performance data.

which the PART reviews were based varies significantly across programs, a sentiment echoed by
other officials throughout the Department.

IES and OPEPD/PPSS Evaluations

Two separate offices are currently responsible for program and policy evaluation at the
Department: (1) IES, through its National Center for Education Evaluation and Regional
Assistance, which was established in 2002 as the successor to the Office of Educational Research
and Improvement; and (2) OPEPD/PPSS, formerly known as the Planning and Evaluation
Service. Department officials in each office described the differences between the two as
follows: OPEPD/PPSS focuses on short-term evaluation activities (fewer than 18 months),
policy analysis, performance measurement, and knowledge management activities, while IES is
responsible for longer-term (18 months or longer) program implementation and impact studies.

We noted that four of the eight programs (50 percent) did not have any evaluations performed by
either IES or OPEPD/PPSS [APIP; APTF; ME-HEP; N-D]. Of the four that did, three were
noted as having a generally positive effect [GEAR UP; Talent Search; UBMS] and one was
noted as having limited to no effect for participants as a whole [UB]. We also learned that the
Department is in the process of completing its analysis of data collected through evaluation
activities related to the GEAR UP and UB programs to determine if they provide information
that would be useful for program improvement.

Departmental Directive OS-01, “Handbook for the Discretionary Grant Process,” dated
January 26, 2009, states

       The Government Performance and Results Act of 1993 directs Federal departments and
       agencies to improve the effectiveness of their programs by engaging in strategic
       planning, setting outcome-related goals for programs, and measuring program results
       against those goals. … ED must establish meaningful performance standards and
       measurements for its programs so that it can provide evidence to OMB that its programs
       are effective as rated by the PART.

OMB Memorandum 10-32, “Evaluating Programs for Efficacy and Cost-Efficiency,” dated
July 29, 2010, states

       Rigorous, independent program evaluations can be key resources in determining whether
       government programs are achieving their intended outcomes as effectively as possible
       and at the lowest possible cost. Evaluations can help policymakers and agency managers
       strengthen the design and operation of programs. … Ultimately, evaluations can help the
       Administration and Congress determine how to spend taxpayer dollars effectively and
       efficiently, by investing taxpayers’ resources in what works.

We noted that programs may not have related data or measures because they do not have
goals and objectives that specifically reference low-income and/or minority students. As a
result, the Department has not established performance measures that would enable it to measure

and report on the programs’ success as relating to these populations and allow it to subsequently
use the data in analyses related to the effectiveness of the programs in closing achievement gaps.
Three of the programs – HSGI, SIG, and CACG – are relatively new programs, for which
performance measures are in the process of being developed, according to program officials. 11
Of these, only CACG, which is administered by OPE, includes language in its program goal to
address increasing the number of low-income students who are prepared to enter and succeed in
postsecondary education. The other two, administered by OESE, are not specifically targeted
toward low-income and/or minority students, but, rather, will be implemented in low-performing
schools and schools with high dropout rates. Nevertheless, these programs – one of which
focuses almost exclusively on at-risk and out-of-school youth and the other of which employs a
comprehensive approach to school improvement that addresses issues concerning students,
teachers, administrators, and the schools themselves – will have notable coverage with respect to
these populations of students. We reviewed performance measures contained in the FY 2010
HSGI Notice Inviting Applications and October 2010 SIG Notice of Final Requirements and
noted that it appears the Department intends to collect data on performance by student subgroup.
However, it remains to be seen whether aggregated or disaggregated data will be reported in
annual performance reports and budget justifications.

The CTE and [former] Tech Prep programs both have a broad focus on high school graduation
and college access/success and, despite noting some efforts at disaggregating results for “special
populations” of students, do not have national measures reflected in the Department’s
performance reporting system or annual budget justifications that would require reporting
separately on the performance of low-income and/or minority students. Some programs, such as
GRAA (and, until recently, Striving Readers, AAHC, and Close Up), are very narrowly-focused
programs that do not necessarily lend themselves to measures that would require the collection of
data on low-income and minority students to potentially determine their effectiveness in closing
achievement gaps in high school graduation and college access/success.

As for areas in which program evaluations are lacking, IES and OPEPD/PPSS officials reiterated
the same point made by a number of program officials: most of the Department’s high school
programs are relatively small and do not include set-asides of sufficient size to permit IES to
conduct rigorous evaluations, each of which can cost between $5 million and $15 million and
take multiple years to complete. They noted it is not cost-effective to spend more on program
evaluations than on the programs themselves. Further, IES officials stated that although some
programs have national activities accounts from which funds can be allocated for evaluations,
resources are generally either limited relative to the size of the program or used by the office for
other purposes.

It was noted that most of the work that is done is conducted under grants awarded by the
National Center for Education Research or National Center for Special Education Research and
is organized around topics or strategies in education, as opposed to specific Federal programs.
Further, studies conducted by IES are generally initiated at the request of the program office, so

11
   HSGI was first funded in FY 2010, as the successor to the previously unfunded School Dropout Prevention
Program. SIG was first funded in FY 2007; however, the Department recently redefined the program to include a
stronger focus on high schools. CACG was first funded in FY 2008 as a 2-year temporary program. In FY 2010,
the program was extended for an additional 5 years.

if this does not occur, and if there is no set-aside of sufficient size, as noted above, a program
may go unevaluated. Officials also stated that work aimed at determining the overall
effectiveness of the TRIO programs, in particular, will likely prove much more difficult going
forward, in light of new restrictions on rigorous evaluations contained in the Higher Education Act
(HEA). It was noted that Section 402H of the HEA effectively prohibits randomized controlled
trials and requires that any evaluations focus primarily on the identification of effective program
or project practices, as opposed to comprehensive assessments of program performance.

Programs that lend themselves to but do not have measures specifically related to the
performance of low-income and/or minority students provide limited opportunity for insight into
the effect that these programs may be having on reducing historically persistent achievement
gaps and prevent the Department and grantees from identifying areas of needed improvement.
Further, a lack of or dated performance information and evaluations, including results on both
program effectiveness and efficiency, hinders the Department’s ability to determine whether a
program is achieving its intended purpose and goals in a cost-effective manner and to take
necessary action if warranted.

We noted that the Department is in the process of soliciting requests for proposals for a contract
intended to improve the quality and reporting of outcomes and impact data from its grant
programs. This effort will be overseen by OPEPD and represents a continuation and
strengthening of the Data Quality Initiative project, which began in 2006. The contractor will be
tasked with providing technical assistance to Department program offices and their grantees
regarding the design and conduct of program evaluations. Other responsibilities will include
helping program offices structure their grant competitions to encourage grantees to plan for and
collect more accurate and meaningful performance data and providing data collection and
analytical assistance to program offices in the preparation of annual reports.

Recommendations

We recommend that the Deputy Secretary:

2.1	   Establish performance measures related to low-income and minority student performance
       with regard to high school graduation and college access/success rates in applicable
       programs.

2.2	   Use the data collected from the performance measures above to analyze the effect that
       these programs are having on closing achievement gaps.

2.3	   Ensure that related performance data are available and are as current as possible to enable
       analysis on whether programs are achieving their intended outcomes as effectively as
       possible and at the lowest possible cost and to inform future proposals on program
       eliminations and consolidations.

Department Comments

ODS did not explicitly agree or disagree with Finding No. 2, but commented that it believed the
opening statement, as originally presented, to be open to interpretation and potentially
misleading and provided suggested revised language. ODS also stated that its ability to collect,
for comparison purposes, data on the peers of the low-income and minority students served by its
high school discretionary grant programs is limited by funding and other constraints, rendering
implementation of performance measures that examine local gaps in achievement or attainment
generally unfeasible.

Much of the remainder of ODS’s comments related to OIG’s conclusions on the effectiveness, or
possible lack thereof, of some of the programs included in our review. ODS questioned the basis
on which some of these determinations were made, specifically citing the ME-HEP and UB
programs, noting that improvement on performance measures would seem to reflect positive
results and also noting limitations regarding the usefulness of PART reviews. ODS also stated
that more recent performance data were now available for the N-D program that would affect
OIG’s statements on the performance of the program. ODS stated that OIG may have incorrectly
characterized findings from previous program evaluations for the GEAR UP and Talent Search
programs in describing results as generally positive, when, in fact, it would be more accurate to
say that some correlational evidence in line with the desired outcomes of the programs was
found.

ODS agreed, in general, with all of our recommendations, stating that it will ensure that program
websites contain the most recent acceptable performance data and also develop performance
measures related to high school graduation and college access/success rates for low-income and
minority students, provided that they are consistent with statutory and regulatory requirements
and determined by staff to be appropriate. However, it also described a number of limitations in
its ability to conduct useful analyses of achievement gaps of students served by its high school
discretionary grant programs. Among these are its relative inability to collect data for the peers
of such students, as noted above, and statutory provisions that restrict rigorous impact
evaluations of the TRIO programs, through which other possible causes of changes in outcomes could
be isolated to determine the effect of specific programs on student achievement. Despite these
limitations, ODS stated that it recognizes the importance of continuing to work to obtain data
that can be used to assess the effectiveness of its high school programs and identified activities
that might be undertaken in support of this effort.

OIG Response

We agree with ODS’ suggested revision to the opening statement of Finding 2 and have made
the applicable change. With regard to concerns raised over some of the statements regarding
program performance, we note that our objective required that factors other than the attainment
of performance goals or improvement on these measures be taken into consideration when
describing programs that did or did not appear to be showing positive results. This included
PART reviews and evaluations conducted by IES and OPEPD/PPSS, if available. With regard to
specific concerns expressed over our ME-HEP characterization, while exceeding the target for
one of its two effectiveness measures is encouraging, it did not show progress on efficiency
measures and, although older, its PART review noted there were no results demonstrated. When
all noted performance data sources are considered collectively, we believe our conclusion is
supported, as is also the case with our conclusion on the UB program. In addition, we do not
believe that we mischaracterized the published evaluation results for the GEAR UP and Talent
Search programs. For the purposes of this audit, evaluations citing measured improvements
were given a generally positive characterization, regardless of whether the program was noted
as the primary or a contributory cause of the positive results. Lastly, we requested
the updated performance data for the specific program noted in ODS’ comments but did not
receive it for consideration by the time of issuance of our final report.

We recognize that, in some cases, there may be statutory and regulatory requirements or
limitations that hinder the Department’s ability to plan for and conduct useful analyses of the
effectiveness of its high school discretionary grant programs in closing achievement gaps
between low-income and minority students and their peers in high school graduation and college
access/success. Nevertheless, we encourage the Department to continue to pursue any and all
efforts that could assist it in determining whether achievement is improving for students served
under these programs.




                 OBJECTIVES, SCOPE, AND METHODOLOGY



The objectives of our audit were to (1) assess the extent to which the Department’s high school
programs are duplicative, and (2) determine if the Department has collected data that show
whether these programs appear to be effective and efficient in reducing gaps between low-
income and minority students and their peers in high school graduation and college
access/success.

To achieve the audit objectives, we:

   •	 Reviewed information on all of the Department’s grant programs to identify those with a
      primarily high school-related focus;
   •	 Reviewed legislation and regulations governing each of the selected programs, as well as
      background information available on the program websites;
   •	 Conducted discussions with OPEPD and Budget Service officials to obtain a Department-
      wide overview and understanding of such programs;
   •	 Interviewed program officials responsible for administering selected high school
      programs in OESE, OII, OPE, OSDFS, and OVAE;
   •	 Obtained and reviewed program performance data, including annual performance plans,
      annual performance reports, PART assessments, and evaluations conducted by IES and
      OPEPD/PPSS;
   •	 Interviewed IES and OPEPD/PPSS officials to gain a better understanding of relevant
      program evaluations and the Department’s program evaluation process in general; and
   •	 Reviewed prior OIG and GAO audit reports on overlapping or duplicative programs and
      also any reports pertaining to the programs under review.

The scope of our review was limited to Department grant programs that either serve high school
students only (directly or indirectly) or include them as a primary target population, as identified
through a review of the Guide to U.S. Department of Education Programs (FY 2009) and
corroborated by Department program officials. As noted in Finding 1, we identified a total of
18 such Department grant programs. We subsequently grouped them into two main
categories: (1) those with a focus on one subject area or on a specific subpopulation of students;
and (2) those with a broad focus on encouraging high school graduation and/or promoting
college access/success, primarily (but not solely) among low-income and minority students. We
identified nine programs that fall under the first category and nine programs that fall under the
second category. We determined that the nine programs in the first category have more
narrowly focused goals, objectives, and performance measures, and/or are targeted toward
certain, often hard-to-reach, subpopulations; despite sharing some similarities, they offer
fundamentally different services to unique populations. As a result, there appeared to be little
potential for substantial overlap or duplication with the other high school programs. We
subsequently focused our work on assessing the extent to which overlap or duplication occurs
among the programs included under the second category.

We compared the programs selected for review for similarities between: (1) program goals,
objectives, and performance measures; (2) target population; (3) services provided; and (4) the
manner in which services are provided. We established that, to be duplicative, a program would
have to match another program in all four areas, whereas, to be overlapping, programs need only
exhibit similarities in one or more areas.

We relied, in part, on computer-processed data from the Department’s Grant Award Database to
determine the extent to which recent OESE and OPE grantees have received or are currently
receiving funds under multiple, potentially overlapping high school programs. Because these
data were used for informational purposes only and did not materially affect the findings and
resulting conclusions noted in this report, we did not assess their reliability.

We conducted fieldwork at Department offices in Washington, D.C., during the period
November 2010 through June 2011. We provided our audit results to Department officials
during an exit conference held on June 9, 2011.

We conducted this performance audit in accordance with generally accepted government
auditing standards appropriate to the scope of the review. Those standards require that we plan
and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for
our findings and conclusions based on our audit objectives. We believe that the evidence
obtained provides a reasonable basis for our findings and conclusions based on the audit
objectives.
Final Audit Report
ED-OIG/A19K0013                                                                       Page 21 of 21


                            ADMINISTRATIVE MATTERS



Corrective actions proposed (resolution phase) and implemented (closure phase) by your office
will be monitored and tracked through the Department’s Audit Accountability and Resolution
Tracking System (AARTS). Department policy requires that you develop a final corrective
action plan (CAP) for our review in the automated system within 30 days of the issuance of this
report. The CAP should set forth the specific action items, and targeted completion dates,
necessary to implement final corrective actions on the findings and recommendations contained
in this final audit report.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector
General is required to report to Congress twice a year on the audits that remain unresolved after
6 months from the date of issuance.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the
Office of Inspector General are available to members of the press and general public to the extent
information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given us during this review. If you have any questions, please
call Michele Weaver-Dugan at (202) 245-6941.

                                             Sincerely,




                                             Keith West /s/

                                             Assistant Inspector General for Audit

                                                                                            Attachment 1

                                        High School Programs
 Program Name        PO       Program Office                 FY 2010 Funding    FY 2011 Funding
 APIP                OESE     AITQ                               $27,225,355        $19,909,339
 APTF                OESE     AITQ                               $17,969,460        $23,343,981
 HSGI*               OESE     AITQ                               $50,000,000        $48,902,000
 ME-HEP              OESE     OME                                $19,948,431        $19,709,450
 N-D                 OESE     SASA                               $50,427,000        $50,326,146
 SIG*                OESE     SASA                              $545,633,000       $534,561,734
 SLC                 OESE     AITQ                               $80,107,636                 $0
 Striving Readers*   OESE     AITQ                              $250,000,000                 $0
 AAHC                OII      TQP                                 $1,815,000                 $0
 Close Up            OII      IP                                  $1,942,000                 $0
 CACG*               OPE      HEP/State Service                 $150,000,000       $150,000,000
 GEAR UP             OPE      HEP/Student Service               $323,212,000       $302,816,154
 Talent Search       OPE      HEP/Student Service (TRIO)        $141,954,000       $138,659,000
 UB                  OPE      HEP/Student Service (TRIO)        $257,831,000       $305,840,000
 UBMS                OPE      HEP/Student Service (TRIO)         $35,230,000        $33,812,000
 GRAA                OSDFS    DVP National Programs              $32,712,000         $6,907,158
 CTE                 OVAE     DATE                            $1,143,497,334     $1,123,659,178
 Tech Prep           OVAE     DATE                              $102,923,000                 $0
Program Office: Academic Improvement and Teacher Quality Programs (AITQ), Office of Migrant Education
(OME), Student Achievement and School Accountability Programs (SASA), Teacher Quality Programs (TQP),
Improvement Programs (IP), Higher Education Programs (HEP), TRIO Programs (TRIO), Drug-Violence
Prevention, National Programs (DVP), Division of Academic and Technical Education (DATE).
* Denotes relatively new programs or established programs operating under newly revised rules.
                                                                                               Attachment 2

                             Proposed Eliminations and/or Consolidations
 Program Name        Elimination FYs    Consolidation FYs   New Authority~
 APIP                                   2011-2012           College Pathways and Accelerated Learning
 APTF                                   2011-2012           College Pathways and Accelerated Learning
 HSGI*               2004-2005          2011-2012           College Pathways and Accelerated Learning
 ME-HEP
 N-D
 SIG
 SLC                 2004-2009          2011-2012           Expanding Educational Options-Promoting
                                                            Public School Choice Grants
 Striving Readers                       2011-2012           Effective Teaching and Learning: Literacy
 AAHC                2007-2010          2011-2012           Effective Teaching and Learning for a
                                                            Well-Rounded Education
 Close Up            2004-2010          2011-2012           Effective Teaching and Learning for a
                                                            Well-Rounded Education
 CACG                2010
 GEAR UP             2006-2007
 Talent Search       2006-2007
 UB                  2006-2007
 UBMS
 GRAA                2004-2009          2011-2012           Successful, Safe, and Healthy Students
 CTE                 2006-2007          2012                CTE
 Tech Prep           2004-2009          2012                CTE
 Total               11                 10
* The previous administration proposed eliminating this program when it was known as the School Dropout
Prevention Program.
~ The “New Authority” column identifies consolidated funding streams, proposed by the current administration,
under which the programs would operate. In general, overall funding would remain the same, but there would be
fewer programs to administer.
                                                                                                                                           Attachment 3


                                                                    Summary Table
Columns (left to right): Program Name; Objective One: Overlap and Duplication (Specific Subject Area or
Subpopulation; High School Graduation and/or College Access/Success); Objective Two: Efficiency and
Effectiveness (Annual Performance Plan/Report; Low-Income and/or Minority Student Performance Measures;
PART Assessment (2002-2008); IES Evaluation(s); OPEPD/PPSS Evaluation(s); Proposed for Elimination
(2004-2012); Proposed for Consolidation (2004-2012))
APIP                                                                                                                                        
APTF                                                                                                                                        
HSGI                                                                                                                                          
ME-HEP                                                                          
N-D                                                                             
SIG                                    
SLC                                                                                                                                        
Striving                                                                                                                                     
Readers
AAHC                                                                                                                                         
Close Up                                                                                                                                     
CACG^                                                                                                                           
GEAR                                                                                                                        
UP^*
Talent                                                                                                                      
Search^*
UB^*                                                                                                                        
UBMS*                                                                                                          
GRAA                                                                                                                                         
CTE^                                                                                                                                       
Tech                                                                                                                                       
Prep^
    18             9                   9                 15           8            10             1               7              11             10
^ Denotes programs that were identified as overlapping (color-coded).
* Denotes programs for which we located more than one OPEPD/PPSS evaluation.
                                                                            Attachment 4

               Acronyms/Abbreviations/Short Forms Used in this Report

AAHC         Academies for American History and Civics

AITQ         Academic Improvement and Teacher Quality Group

APIP         Advanced Placement Incentive Program

APTF         Advanced Placement Test Fee Program

CACG         College Access Challenge Grant Program

CTE          Career and Technical Education

DATE         Division of Academic and Technical Education

Department   U.S. Department of Education

DVP          Drug-Violence Prevention, National Programs

ESEA         Elementary and Secondary Education Act

FY           Fiscal Year

GAO          Government Accountability Office

GEAR UP      Gaining Early Awareness and Readiness for Undergraduate Programs

GRAA         Grants to Reduce Alcohol Abuse

HEA          Higher Education Act

HEP          Higher Education Programs

HSGI         High School Graduation Initiative

IES          Institute of Education Sciences

IP           Improvement Programs

ME-HEP       Migrant Education – High School Equivalency Program

N-D          Prevention and Intervention Programs for Children and Youths Who Are
             Neglected, Delinquent, or At Risk

OESE         Office of Elementary and Secondary Education
OIG     Office of Inspector General

OII     Office of Innovation and Improvement

OMB     Office of Management and Budget

OPE     Office of Postsecondary Education

OPEPD   Office of Planning, Evaluation, and Policy Development

OSDFS   Office of Safe and Drug-Free Schools

OVAE    Office of Vocational and Adult Education

PART    Program Assessment Rating Tool

PO      Principal Office

PPSS    Program and Policy Studies Service

SASA    Student Achievement and School Accountability Programs

SIG     School Improvement Grants

SLC     Smaller Learning Communities

SSWG    Secondary Schools Working Group

TRIO    TRIO Programs

TQP     Teacher Quality Programs

UB      Upward Bound

UBMS    Upward Bound Math-Science
                          Department Response to Draft Audit Report                    Attachment 5


                         UNITED STATES DEPARTMENT OF EDUCATION
                                                                                THE DEPUTY SECRETARY




                                          October 26, 2011



MEMORANDUM

TO:            Michele Weaver-Dugan
               Director, Operations Internal Audit Team
               Office of Inspector General

FROM:          Anthony W. Miller /s/

SUBJECT:       Draft Audit Report, Potentially Overlapping High School Programs (ED-
               OIG/A19K0013)

Thank you for the opportunity to comment on the draft audit report, “Potentially Overlapping High
School Programs.” As you know, the Secretary and I strongly believe that, in order to help the
Nation reach President Obama’s goal of out-educating the rest of the world, we must become
more efficient and productive as an agency. Reducing and eliminating duplication in our
programs, including those that serve high school students, is a key step toward this end. We are
thus encouraged that the Office of Inspector General (OIG) found no instances of program
duplication in its review.

The Department’s responses to the findings and recommendations of the draft report follow.

FINDING NO. 1 - Overlap Exists Among Some Department High School Programs

On pages 8-9 of the report, OIG discusses the potentially negative effects of overlap on program
administration and oversight. We appreciate OIG’s insight in this area and will continue to
examine whether our support for high schools is configured to have the most positive effects for
students. However, we wish to note that we do not believe that program overlap (as opposed to
program duplication) is inherently undesirable, nor that an instance of overlap should necessarily
cause the Department to seek elimination or consolidation of the affected programs. In fact,
overlapping programs can provide valuable complementary services to local educational
agencies, schools, or students. In such cases, we believe that the most appropriate action may be
to seek to coordinate administration of the programs so that delivery of services is as efficient
and effective as possible.

We believe that the clarity and accuracy of the report would be improved by providing further
information or explanation in certain other areas. For instance, on page 6 of the draft report, OIG
discusses potential overlap among the College Access Challenge Grants, GEAR UP, Talent
Search (TS), and Upward Bound (UB) programs, with a focus on overlap between the TS and
UB programs, including overlap in services provided. We believe that the report would benefit



                            400 MARYLAND AVE. SW, WASHINGTON, DC 20202
                                           www.ed.gov

from a more detailed discussion of the differences in intensity of service between the two
programs. UB is an intensive academic program designed to generate in program participants
the skills and motivation needed to complete high school and enter and succeed in postsecondary
education. While designed to encourage participants to complete high school and undertake a
program of postsecondary education, TS provides, in comparison, limited academic support;
instead, TS provides academic and career counseling and also assists students with the
postsecondary education application process, including applying for financial aid. We believe
this additional detail would help ensure the reader has a proper understanding of the nature and
extent of potential service overlap between the programs.

On page 8, OIG discusses the Secondary School Working Group. We would like to note that the
Secondary School Working Group has a high level of participation, with representatives from a
majority of Department offices (including all offices with high school programs) attending
meetings on a regular basis and an average of 20 participants at each meeting.

On page 8, OIG also discusses the reorganization of the Office of Postsecondary Education
(OPE). This discussion should be updated to reflect the fact that the reorganization is now
complete, with administration of the GEAR UP and TS programs occurring in the same division,
allowing staff who have been assigned grants under the programs to collaborate more directly to
achieve program goals and objectives.

On page 9, OIG notes with respect to the TS and UB programs that there are no statutory or
regulatory prohibitions on grantees receiving funds under both programs at the same time. In
fact, the authorizing statute specifically permits an entity to receive multiple TRIO program
grants and permits the director of a program receiving funds to administer one or more additional
programs for disadvantaged students (e.g., GEAR UP) operated by the sponsoring institution or
agency, regardless of the funding sources of such programs. These statutory provisions clearly
have an impact on the Department’s ability to prevent potential service overlap in the TS and UB
programs. We recommend that OIG revise the text accordingly.

Lastly, we recommend that Attachment 1 include a column that provides fiscal year 2011
funding levels for the programs. This will help reinforce the finding that funding for certain
programs covered by the report (Smaller Learning Communities, Striving Readers, Close Up,
and Tech Prep) was eliminated and thus that there is currently less potential for program overlap
than in previous years.

Recommendations

We recommend that the Deputy Secretary:

1.1 	    Continue to actively promote coordination among similar programs, ensure that key
         staff are aware of such efforts and encouraged to participate, refocus some of the
         Department’s current efforts to better reflect coordination efforts, emphasize
         coordination as relating to administrative and operational matters, and consider
         formalizing other notable informal coordination efforts.
         We agree with Recommendation 1.1 and, in order to improve administrative efficiency
         and overall program impact, will continue to promote coordination among similar high
         school programs through the Secondary School Working Group and other means.

         We note, as another example of our coordination efforts, that the Office of Vocational
         and Adult Education, in partnership with the Office of Planning, Evaluation, and Policy
         Development, established in summer 2010 a Career and Technical Education (CTE)
         Strategy Workgroup consisting of representatives of numerous offices throughout the
         Department. The CTE Strategy Workgroup collaboratively developed a CTE
         Transformation Strategy that is being used to help guide the Department’s proposal to
         reauthorize the Carl D. Perkins Career and Technical Education Act. We recommend
         that these efforts be recognized in the report.

1.2 	    Continue to work with Congress to consolidate or eliminate programs that overlap
         with one another, with an emphasis on those that do not appear to be achieving
         intended results.

         We agree with Recommendation 1.2. As the draft report notes, the Department has in
         past years recommended for consolidation or elimination a number of Federal education
         programs. Through our annual budget development process and other means, we will
         continue to work to identify for consolidation or elimination programs that are
         duplicative or not achieving intended results or that otherwise do not warrant funding.

         We note that the Administration’s proposal to reauthorize the Elementary and Secondary
         Education Act (ESEA) would create a new high school program, College Pathways and
         Accelerated Learning. This program would replace, with a more comprehensive and
         flexible authority, several, sometimes narrowly targeted, ESEA programs that offer
         accelerated learning opportunities or seek to prevent students from dropping out of
         school, including the Advanced Placement programs and the High School Graduation
         Initiative.

1.3 	    Develop and implement policies and procedures related to Department grantee
         application review and monitoring efforts that would help ensure that local
         educational agencies, schools and/or students are not being over-served by similar
         programs and services.

         We agree with Recommendation 1.3 but do not believe that action by the Department
         with respect to the TS and UB programs is needed. Department regulations require that
         recipients of TS and UB grants collaborate with other Federal TRIO projects, GEAR UP
         projects, or programs serving similar populations that are serving the same target schools
         or target area in order to minimize the duplication of services and promote collaborations
         so that more students can be served (34 CFR 643.11(b); 645.21(a)(4)). In addition, staff
         for these programs currently track, as part of budget reviews, whether entities are
         receiving multiple related grants. In light of these regulatory requirements and review
         procedures, and because this recommendation appears to be intended to address potential
         service overlap in the TS and UB programs specifically, we do not believe that further
         action with respect to the recommendation is warranted.

         Although we agree in principle with the idea that students should not be over-served by
         Federal education programs, we note that the report did not identify instances of such
         over-service, and we are not aware of any situations in which this is occurring. Thus,
         OIG may wish to delete “and/or students” from the recommendation.

FINDING NO. 2 - Performance Measures and Available Data on the Reduction of Gaps
Between Low-Income and Minority Students and Their Peers Are Lacking

The draft report’s statement, “We found that the Department has not collected data on any of the
18 programs included in our review, nor has it established related performance measures,
specifically related to the programs’ effectiveness in reducing gaps between low-income and
minority students and their peers in high school graduation and college access/success,” can be
interpreted in different ways and is potentially misleading to the reader. For clarity, we
recommend that it be revised as follows: “We found that, although the Department has collected
extensive performance data on the programs under review, it has not collected data or established
performance measures specifically on effectiveness in reducing gaps between low-income and
minority students and their peers in high school graduation and college access/success.” In
addition, we note that our ability to collect, for comparison purposes, data on the peers of the
low-income and minority students served by our high school discretionary grant programs is
generally limited by funding and other constraints, rendering performance measures that examine
local gaps in achievement or attainment generally not feasible to implement.

We believe that the report’s statement to the effect that the Migrant Education – High School
Equivalency Program (ME-HEP) may not be producing positive results regarding high school
graduation rates or college access/success is not accurate. As the draft report notes, ME-HEP
exceeded the target for one of its two effectiveness measures (Measure 1.1: The percentage of
ME-HEP participants receiving a General Educational Development (GED) certificate) in 2010
and also made progress from the previous year on both measures. We believe that such
performance reflects positive results and recommend that general conclusions regarding
ME-HEP performance be revised accordingly.

We believe that the report would benefit from additional discussion of the limitations of using
PART reviews, which were conducted only through 2007, to assess current program
performance - particularly for programs for which, according to OIG, PART results suggest
deficiencies in ability to achieve short- and long-term performance goals. For one of these
programs, ME-HEP, current annual performance data show, as OIG notes, improvement with
respect to performance measures. In another case, UB, OIG notes that the program has currently
met or exceeded all of its performance measure targets. We believe that these results cast doubt
on claims of deficiency in these programs made on the basis of older information from PART
reviews.

We would also like to correct OIG's characterization, on page 14 and in Table 4 on page 12, of
the findings from previous program evaluations. Of the four high school programs studied
through previous evaluations, only the Upward Bound Math and Science program was found to
have a generally positive effect. The evaluations of the GEAR UP and TS programs found some
correlational evidence in line with the desired outcomes of the programs. In addition, for the
evaluation activities currently under way related to GEAR UP and UB, we believe it is more
accurate to say that the Department is completing its analysis of the data collected through these
activities to determine if they provide information that would be useful for program
improvement.

Lastly, we wish to note that more recent data (for 2010 instead of 2009) are available for the
Neglected or Delinquent program. These data impact OIG's statements on the performance of
this program. We would be happy to provide the data if requested.

Recommendations

We recommend that the Deputy Secretary:

2.1 	   Establish performance measures related to low-income and minority student
         performance with regard to high school graduation and college access/success rates
         in applicable programs.

         We agree with Recommendation 2.1 and will initiate, by December 1, 2011, development
         of such performance measures where they are consistent with statutory and regulatory
         requirements and determined by staff to be appropriate for an affected program.

2.2 	    Use the data collected from the performance measures above to analyze the effect
         that these programs are having on closing achievement gaps.

         We agree with Recommendation 2.2 to the extent it is practicable. As discussed above,
         our ability to collect achievement data for the peers of the low-income and minority
         students served by our high school discretionary grant programs is generally limited. As
         a result, we do not believe that we can conduct useful analyses of achievement gaps of
         students served by these programs. However, we will consider the feasibility of using
         achievement data collected under the programs to assess the extent to which these
         programs are serving their target populations.

         Furthermore, the Department cannot determine the effect of programs on student
         achievement or other important outcomes without isolating the other possible causes of
         changes in outcomes. This is not possible using data from performance measures alone.
         Unfortunately, statutory provisions in the Higher Education Act restrict the Department's
         ability to conduct rigorous impact evaluations of the TRIO programs, which further limits
         our ability to determine the effectiveness of these programs in narrowing achievement
         gaps and accomplishing their other statutory purposes.

         These limitations notwithstanding, we recognize that we must continue to work to obtain
         data that can be used to assess the effectiveness of our high school programs. As an
         example of such work, we note that we already use performance measures in formula
         grant programs, including Title I Grants to LEAs and Individuals with Disabilities
         Education Act Grants to States, to examine achievement gap closings and will explore the
         feasibility of using data from such measures to determine whether achievement is
         improving for students served under our high school programs.

         Lastly, we acknowledge that the Department must continue to work to conduct rigorous
         program evaluations in a cost-effective manner. We believe that our support for the
         development and expansion of State longitudinal data systems will help significantly in
         this effort.

2.3 	    Ensure that related performance data are available and are as current as possible to
         enable analysis on whether programs are achieving their intended outcomes as
         effectively as possible and at the lowest possible cost and to inform future proposals
         on program eliminations and consolidations.

         We agree with Recommendation 2.3 as it pertains to making current performance data
         publicly available and will ensure that the Web sites of affected programs contain the
         most recent acceptable performance data by December 1, 2011.

Thank you for conducting this audit. OIG's work in this area will provide a valuable
contribution to the Department's ongoing efforts to help improve achievement and attainment in
our Nation's high schools.


Attached to this memorandum are recommended technical edits to the draft report.