
OIG's Independent Report on the Department's Performance Summary Report for Fiscal Year 2012

Published by the Department of Education, Office of Inspector General on 2013-02-28.

Below is a raw (and likely hideous) rendition of the original report. (PDF)

                           UNITED STATES DEPARTMENT OF EDUCATION 


                                 OFFICE OF ELEMENTARY AND SECONDARY EDUCATION 





February 28, 2013

Honorable R. Gil Kerlikowske
Director, Office of National Drug Control Policy
Executive Office of the President
Washington, D.C. 20500

Dear Director Kerlikowske:

In accordance with the Office of National Drug Control Policy (ONDCP) Circular, Drug Control
Accounting, enclosed please find detailed information about performance-related measures for
key drug control programs administered by the U.S. Department of Education contained in the
U.S. Department of Education's Performance Summary Report for Fiscal Year 2012, along with
the Department of Education Assistant Inspector General's authentication of the management
assertions included in that report.

Please do not hesitate to contact me if you have any questions about this information.

                                           Sincerely,



                                           David Esquith
                                           Director, Office of Safe and Healthy Students

Enclosure #1: Department of Education Performance Summary Report for Fiscal Year 2012

Enclosure #2: Authentication letter from Patrick J. Howard, Assistant Inspector General for
Audit

cc: Patrick J. Howard




                         400 MARYLAND AVE., S. W., WASHINGTON, D.C. 20202
                                         www.ed.gov

 Our mission is to ensure equal access to education and to promote educational excellence throughout the nation.
                    Department of Education 





                       Performance Summary Report 


                                      Fiscal Year 2012 





                                         In Support of the

                              National Drug Control Strategy 


       As required by ONDCP Circular: Drug Control Accounting 




                                         February 28, 2013




                                                  U.S. Department of Education 


                            Performance Summary Report for Fiscal Year 2012 


                                                         TABLE OF CONTENTS 


Transmittal Letter ................................................................ 1

Performance Summary Information ................................................... 2

          Safe Schools/Healthy Students ........................................... 2

          Student Drug Testing .................................................... 8

          Safe and Supportive Schools ............................................ 10

          Grants to Reduce Alcohol Abuse ......................................... 12

Assertions ....................................................................... 21

Criteria for Assertions .......................................................... 21

                          UNITED STATES DEPARTMENT OF EDUCATION

                                 OFFICE OF ELEMENTARY AND SECONDARY EDUCATION




February 21, 2013

Kathleen S. Tighe
Inspector General
U.S. Department of Education
400 Maryland Avenue, S.W.
Washington, DC 20202-1510

Dear Ms. Tighe:

As required by the Office of National Drug Control Policy (ONDCP) Circular, Drug Control
Accounting, enclosed please find detailed information about performance-related
measures for key drug control programs administered by the U.S. Department of
Education, in accordance with the guidelines in the circular dated May 1, 2007. This
information covers the Safe and Drug-Free Schools and Communities program, which is
the Drug Control Budget Decision Unit under which the 2012 budgetary resources for
the Department of Education are displayed in the Fiscal Year 2013 National Drug
Control Budget Summary.

Consistent with the instructions in the ONDCP Circular, please provide your
authentication to me in writing and I will transmit it to ONDCP along with the enclosed
Performance Summary Report. ONDCP requests these documents by February 28,
2013. Please do not hesitate to contact me if you have any questions about the
enclosed information.

                                           Sincerely,

                                           David Esquith
                                           Director, Office of Safe and Healthy Students




             FY 2012 Performance Summary Information


                           Safe Schools/Healthy Students

Measure 1: The percentage of grantees demonstrating a decrease in substance
abuse over the three-year grant period. (Safe Schools/Healthy Students - FY
2005 and 2006 cohorts)

Table 1

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2005      34.2      n/a       n/a       n/a       n/a       n/a       n/a
2006      66.7      66.7      n/a       n/a       n/a       n/a       n/a

The measure. This performance measure is for the Safe Schools/Healthy
Students (SS/HS) initiative, a joint project of the Departments of Education,
Health and Human Services, and Justice. The initiative provides grants to local
educational agencies to support the development and implementation of a
comprehensive plan designed to prevent student drug use and violence and
support healthy youth development.

This measure, one of four for this initiative for the FY 2004, 2005, and 2006
cohorts, focused on one of the primary purposes of the initiative - reduced
student drug use. This measure was directly related to the National Drug Control
Strategy's goal of preventing drug use before it begins. Grantees selected and
reported on one or more measures of prevalence of drug use for students. For
the FY 2004 - 2006 cohorts, the items selected by grantees to respond to this
measure were not common across grant sites but, rather, reflected priority drug
use problems identified by sites.

FY 2012 Performance Results. FY 2012 targets were not set, nor actual
performance data aggregated for any grant cohorts, as grant projects were no
longer active.


FY 2013 Performance Targets. Neither the FY 2005 nor the FY 2006 grant
cohort was operating in FY 2012; thus, no targets were set for FY 2013.


Methodology. Data for these grant cohorts were collected by grantees, generally
using student surveys. Data were furnished in the second of two semi-annual
performance reports provided by grantees each project year. If grantees
identified more than one measure of drug abuse or provided data for individual
school-building types (for example, separate data for middle and high schools),
grantees were considered to have experienced a decrease in substance abuse if
data for a majority of measures provided reflected a decrease. If a grant site
provided data for an even number of measures and half of those measures
reflected a decrease and half reflected no change or an increase, that grant site
was judged not to have demonstrated a decrease in substance abuse. While
most sites were able to provide some data related to this measure, we
considered as valid data only data from sites that used the same elements/items
in each of two years. We considered a grant site to have experienced a
decrease if data supplied reflected a decrease over baseline data provided.
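
For clarity, the decision rule described in the preceding paragraph can be expressed
as a short sketch. The sketch is illustrative only and is not part of the original
report; the function name and data layout are hypothetical.

    # Illustrative sketch of the decision rule above: a grant site counts as
    # demonstrating a decrease only if a strict majority of its valid measures
    # declined from baseline; an even split counts as no decrease.
    def site_shows_decrease(measures):
        """measures: list of (baseline_value, latest_value) pairs for one site,
        limited to items collected identically in both years."""
        if not measures:
            return None  # no valid data; the site is excluded from the tabulation
        decreases = sum(1 for baseline, latest in measures if latest < baseline)
        return decreases > len(measures) / 2  # strict majority required

    # Two of three measures fell, so the site counts as a decrease.
    print(site_shows_decrease([(40.0, 35.0), (20.0, 18.5), (10.0, 12.0)]))  # True
    # Even split (one down, one up): judged not to have demonstrated a decrease.
    print(site_shows_decrease([(40.0, 35.0), (10.0, 12.0)]))  # False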

The contractor for the SS/HS national evaluation used data for this measure as
part of the program evaluation. The evaluation contractor reviewed data
submitted, and worked with grantees to seek clarifying information and provide
technical assistance if grantees were having difficulty in collecting or reporting
data for this measure.

Grantees that failed to provide data were not included in the tabulation of data for
the measures. Also, grantees that did not provide data for two consecutive
project years (so that we could determine if a decrease in substance abuse had
occurred) were not included in data reported for the measure. Authorized
representatives for the grant site signed the annual performance report and, in
doing so, certified that to the best of the signer's knowledge and belief, all data in
the performance report were true and correct and that the report fully disclosed
all known weaknesses concerning the accuracy, reliability, and completeness of
the data included. Generally, the Department relied on the certification
concerning data supplied by grantees and did not conduct further reviews.

Measure 2: The percentage of SS/HS grantees that report a decrease in
students who report current (30-day) marijuana use. (SS/HS - FY 2007,
2008, and 2009 cohorts)

Table 2

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2007      53.8      42.9      37.5      51.9      n/a       n/a       n/a
2008      n/a       50.0      43.6      58.3      61.8      45.6      n/a
2009      n/a       n/a       0         55.2      56.9      55.1      58.4

The measure. This performance measure is for the Safe Schools/Healthy
Students initiative, a joint project of the Departments of Education (ED), Health
and Human Services (HHS), and Justice. The initiative provides grants to local
educational agencies to support the development and implementation of a
comprehensive plan designed to prevent student drug use and violence and
support healthy youth development. Beginning with the FY 2007 cohort, the
project period for SS/HS grants was 48 months.



This measure, one of six for this initiative for cohorts from FY 2007 - 2009, and a
revision of the measure used by previous cohorts of grants, focuses on one of
the primary purposes of the initiative - reduced student drug use. The initiative
and this measure are directly related to the National Drug Control Strategy's goal
of reducing illicit drug use.

FY 2009 was the last cohort of new grants made under the program and, as the
grants were for a four-year project period, the FY 2009 cohort's last year of
continuation funding was made in FY 2012. FY 2012 is the last year of
performance data submitted for the FY 2008 cohort.

FY 2012 Performance Results. Beginning with the FY 2007 cohort, grantees
were required to provide baseline data prior to implementing interventions;
generally, after the first project year, grantees reported baseline data and year
one actual performance data. Across all cohorts (FY 2007, 2008, and 2009),
some sites experienced significant delays in beginning implementation of
interventions. Reasons for delays include the need to finalize partnership
agreements, complete a project logic model, develop an evaluation plan, and, for
some, to collect baseline data.

For the FY 2008 - 2009 grant cohorts, FY 2012 actual performance data have
been aggregated and are reported in Table 2. The FY 2007 cohort of grantees'
projects had ended, and thus no FY 2012 data are reported.

Neither the FY 2008 nor the FY 2009 cohort met its FY 2012 target. For the
FY 2008 cohort, there was a significant decrease in the percentage of grantees
in the entire cohort reporting a decrease in students who report current (30-day)
marijuana use. This may be explained by this cohort of grantees being in the
final stages of the grant cycle and reaching a ceiling effect related to gains
realized. At the grantee level, for the most part, a plateau effect seemed to have
occurred, with grantees showing only minor increases or decreases compared to
the prior year in the percentage of students who report current (30-day)
marijuana use. For the FY 2009 cohort, almost the same percentage of grantees
made progress related to this measure, compared to FY 2011 actual performance.

FY 2013 Performance Targets. The FY 2013 performance target for the FY 2009
cohort was set based on an analysis of prior-year performance. The FY 2007
cohort data (from FY 2008 and 2009) showed that the cohort's initial project
year (FY 2008) performance results were better than second project year (FY
2009) performance results. Staff analysis of grantee data resulted in the
identification of numerous factors thought to contribute to the decline in
performance results in the second year of the project; that analysis informed the
setting of subsequent targets.





Based on this analysis, and considering the changes made to the GPRA
measures, targets were initially set over multiple years using an incremental
annual increase of baseline plus 2, 3, and 6 percent for the FY 2007, 2008,
and 2009 cohorts, respectively.

However, given the variation in the percentage of the FY 2009 cohort of grantees
achieving the performance benchmarks in the past, we have deviated from using
our initial formula, and instead adjusted targets based on past-year actual cohort
performance. Given that the FY 2012 target was not met, we are setting the FY
2013 target as the FY 2012 actual performance plus an incremental increase of
six percent.
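For example, the FY 2009 cohort's FY 2012 actual performance of 55.1 percent,
increased by six percent (55.1 x 1.06), yields the FY 2013 target of approximately
58.4 percent shown in Table 2.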

Methodology. Data are collected by grantees, generally using student surveys.
Data are furnished in the second of two semi-annual performance reports
provided by grantees each project year.

The contractor for the SS/HS national evaluation is using data for this measure
and from these cohorts as part of the national program evaluation. Through the
FY 2011 data collection, the evaluation contractor reviewed performance data
submitted by grantees, and worked with grantees to seek clarifying information
and provided technical assistance if grantees were having difficulty in collecting
or reporting data for this measure. The contractor supplied data for the measure
after it had completed data cleaning processes.

In FY 2012, the SS/HS national evaluation contract supported only completion of
the final evaluation report, and ED/HHS staff compiled and aggregated
performance data from annual performance reports submitted by grantees. If
data for this measure were not available at the time that performance reports
were submitted, staff followed up with sites to attempt to obtain data for the
measure.

Grantees that failed to provide data or that provided data that do not respond
to the established measure are not included in the tabulation of data for the
measures. Authorized representatives for the grant site sign the annual
performance report and, in doing so, certify that to the best of the signer's
knowledge and belief, all data in the performance report are true and correct and
that the report fully discloses all known weaknesses concerning the accuracy,
reliability, and completeness of the data included. Generally, the Department
relies on the certification concerning data supplied by grantees and does not
conduct further reviews.

Measure 3: The percentage of SS/HS grantees that report a decrease in
students who report current (30-day) alcohol use. (SS/HS - FY 2007, 2008,
and 2009 cohorts)





Table 3

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2007      71.4      47.8      66.7      70.4      n/a       n/a       n/a
2008      n/a       56.0      60.0      75.0      79.5      63.1      n/a
2009      n/a       n/a       0         58.6      60.4      65.5      69.4


The measure. This performance measure is for the Safe Schools/Healthy
Students initiative, a joint project of the Departments of Education, Health and
Human Services, and Justice. The initiative provides grants to local educational
agencies to support the development and implementation of a comprehensive
plan designed to prevent student drug use and violence and support healthy
youth development. Beginning with the FY 2007 cohort, the project period for
SS/HS grants is 48 months.

This measure, one of six for this initiative for cohorts from FY 2007 and forward,
focuses on the prevalence of alcohol use. While the National Drug Control
Strategy is focused most intensively on preventing the use of controlled
substances, the Strategy does address the role of alcohol as a substance of
choice for teenagers. Data do suggest that early use of alcohol is more likely to
result in heavy use of alcohol later in life.

FY 2009 was the last cohort of new grants made under the program and, as the
grants were for a four-year project period, the FY 2009 cohort's last year of
continuation funding was made in FY 2012. FY 2012 is the last year of
performance data submitted for the FY 2008 cohort.

FY 2012 Performance Results. Beginning with the FY 2007 cohort, grantees
were required to provide baseline data prior to implementing interventions;
generally, after the first project year, grantees reported baseline data and year
one actual performance data. Across all cohorts (2007, 2008, and 2009), some
sites experienced significant delays in beginning implementation of interventions.
Reasons for delays include the need to finalize partnership agreements,
complete a project logic model, develop an evaluation plan, and, for some, to
collect baseline data.

For the FY 2008 - 2009 grant cohorts, FY 2012 actual performance data have
been aggregated and are reported in Table 3. The FY 2007 cohort of grantees'
projects had ended, and thus no FY 2012 data are reported.





For the FY 2008 cohort there was a significant decrease, compared to FY 2011
actual performance, in the percentage of grantees in the entire cohort reporting a
decrease in students who report current (30-day) alcohol use, and the FY 2012
target was not met. This may be explained, as in the case of similar cohort
declines for the previous measure, by this cohort of grantees being in the final
stages of the grant cycle and reaching a ceiling effect related to gains realized.
At the grantee level, for the most part, a plateau effect seemed to have occurred,
with grantees showing only minor increases or decreases compared to the prior
year in the percentage of students who report current (30-day) alcohol use. For
the FY 2009 cohort, there was an increase in the percentage of grantees that made
progress related to this measure compared to the FY 2012 performance target set.

FY 2013 Performance Targets. The FY 2013 performance target for the FY 2009
cohort was set based on an analysis of prior-year performance. The FY 2007
cohort data (from FY 2008 and 2009) showed that the cohort's initial project
year (FY 2008) performance results were better than second project year (FY
2009) performance results. Staff analysis of grantee data resulted in the
identification of numerous factors thought to contribute to the decline in
performance results in the second year of the project; that analysis informed the
setting of subsequent targets. Based on this analysis, and considering the changes
made to the GPRA measures, targets were initially set using an incremental annual
increase of baseline plus 2, 3, and 6 percent for the FY 2007, 2008, and 2009
cohorts, respectively.

However, given the variation in the percentage of the FY 2009 cohort of grantees
achieving the performance benchmarks in the past, we have deviated from using
our initial formula, and instead adjusted targets based on past-year actual cohort
performance. Given that the FY 2012 target was met, we are setting the FY
2013 target as the FY 2012 actual performance plus an incremental increase of
six percent.

Methodology. Data are collected by grantees, generally using student surveys.
Data are furnished in the second of two semi-annual performance reports
provided by grantees each project year.

The contractor for the SS/HS national evaluation is using data for this measure
and from these cohorts as part of the national program evaluation. Through the
FY 2011 data collection, the evaluation contractor reviewed performance data
submitted by grantees, and worked with grantees to seek clarifying information
and provided technical assistance if grantees were having difficulty in collecting
or reporting data for this measure. The contractor supplied data for the measure
after it had completed data cleaning processes.

In FY 2012, the SS/HS national evaluation contract supported only completion of
the final evaluation report, and ED/HHS staff compiled and aggregated
performance data from annual performance reports submitted by grantees. If
data for this measure were not available at the time that performance reports
were submitted, staff followed up with sites to attempt to obtain data for the
measure.

Grantees that failed to provide data or that provided data that do not respond
to the established measure are not included in the tabulation of data for the
measures. Authorized representatives for the grant site sign the annual
performance report and, in doing so, certify that to the best of the signer's
knowledge and belief, all data in the performance report are true and correct and
that the report fully discloses all known weaknesses concerning the accuracy,
reliability, and completeness of the data included. Generally, the Department
relies on the certification concerning data supplied by grantees and does not
conduct further reviews.

                               Student Drug Testing

Measure 1: The percentage of student drug testing grantees that experience a 5
percent reduction in current (30-day) illegal drug use by students in the target
population. (Student Drug Testing - FY 2006, 2007, and 2008 cohorts)

Table 4

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2006      66.7      12.5      57.0      n/a       n/a       n/a       n/a
2007      33.0      41.7      50.0      n/a       n/a       n/a       n/a
2008      n/a       49.0      65.0      35.5      n/a       n/a       n/a

The measure. This measure was one of two measures for the Student Drug
Testing Program grant competition. The competition provided discretionary
grants to local educational agencies (LEAs), community-based organizations, or
other public and private entities to support implementation of drug testing of
students, consistent with the parameters established by the U.S. Supreme Court,
or for students and their families that voluntarily agree to participate in the
student drug testing program.

Student drug testing was prominently featured in different versions of the
National Drug Control Strategy between FY 2003 and FY 2009 as a
recommended drug prevention intervention.

FY 2008 was the last cohort of new grant awards made under the program and,
as the grants were for a three-year project period, the FY 2008 cohort's last year
of continuation funding was made in FY 2010.

FY 2012 Performance Results. FY 2012 targets were not set, nor actual
performance data aggregated for any grant cohorts, as grant projects were no
longer active.



FY 2011 performance data for the FY 2008 cohort were not included in the
FY 2011 performance report, as a significant number of projects were still under
no-cost extensions and we were awaiting more complete cohort data. These
FY 2011 results, shown in Table 4 above, are reported here for the first time.

FY 2013 Performance Targets. No FY 2013 targets are applicable. FY 2011
was the last year of performance reporting for any of the Student Drug Testing
grantees.

Methodology. Data for the FY 2006 cohort came from the evaluation conducted
by a Department of Education contractor and were collected annually. Data for
subsequent cohorts were collected by grantees using student surveys and
provided as part of the grantees' annual performance reports. Generally,
grantees prior to the FY 2008 cohort did not use the same survey items to collect
data for this measure but, rather, self-selected survey items (often from surveys
already administered) in order to provide these data. Beginning with the FY 2008
cohort, we asked grantees to provide data for current (prior 30-day) use of
marijuana as a proxy for illegal drug use. Beginning with the FY 2008 cohort, we
also instructed grantees to collect baseline data for this measure before
beginning implementation of their student drug testing program.

Authorized representatives for the grant site signed the annual performance
report and, in doing so, certified that to the best of the signer's knowledge and
belief, all data in the performance report were true and correct and that the report
fully disclosed all known weaknesses concerning the accuracy, reliability, and
completeness of the data included. Generally, the Department relied on the
certification concerning data supplied by grantees and did not conduct further
reviews.

Measure 2: The percentage of student drug testing grantees that experience a 5
percent reduction in past-year illegal drug use by students in the target
population. (Student Drug Testing - FY 2006, 2007, and 2008 cohorts)

Table 5

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2006      55.5      12.5      57.0      n/a       n/a       n/a       n/a
2007      33.0      33.3      54.0      n/a       n/a       n/a       n/a
2008      n/a       58.0      58.0      37.7      n/a       n/a       n/a

The measure. This measure was one of two measures for the Student Drug
Testing Program grant competition. The competition provided discretionary
grants to LEAs, community-based organizations, or other public and private
entities to support implementation of drug testing of students, consistent with the
parameters established by the U.S. Supreme Court or for students and their
families that voluntarily agree to participate in the student drug testing program.


Student drug testing was prominently featured in annual editions of the National
Drug Control Strategy between 2003 and 2009 as a recommended drug
prevention intervention.

FY 2008 was the last cohort of new grants made under the program and, as the
grants were for a three-year project period, the FY 2008 cohort's last year of
continuation funding was made in FY 2010.

FY 2012 Performance Results. FY 2012 targets were not set, nor actual
performance data aggregated for any grant cohorts, as grant projects were no
longer active.

FY 2011 performance data for the FY 2008 cohort were not included in the
FY 2011 performance report, as a significant number of projects were still under
no-cost extensions and we were awaiting more complete cohort data. These
FY 2011 results, shown in Table 5 above, are reported here for the first time.

FY 2013 Performance Targets. No FY 2013 targets are applicable. FY 2011
was the last year of performance reporting for any of the Student Drug Testing
grantees.

Methodology. Data for the FY 2006 cohort came from the evaluation conducted
by a Department of Education contractor and were collected annually. Data for
subsequent cohorts were collected by grantees using student surveys and
provided as part of the grantees' annual performance reports. Generally,
grantees prior to the FY 2008 cohort did not use the same survey items to collect
data for this measure but, rather, self-selected survey items (often from surveys
already administered) in order to provide these data. Beginning with the FY 2008
cohort, we asked grantees to provide data for past-year use of marijuana as a
proxy for illegal drug use. Beginning with the FY 2008 cohort, we also instructed
grantees to collect baseline data for this measure before beginning
implementation of their student drug testing program.

Authorized representatives for the grant site signed the annual performance
report and, in doing so, certified that to the best of the signer's knowledge and
belief, all data in the performance report were true and correct and that the report
fully disclosed all known weaknesses concerning the accuracy, reliability, and
completeness of the data included. Generally, the Department relied on the
certification concerning data supplied by grantees and did not conduct further
reviews.

                           Safe and Supportive Schools

In FY 2010 the Department awarded the first round of awards under the Safe and
Supportive Schools program. Awards were made to State educational agencies
to support statewide measurement of, and targeted programmatic interventions
to improve, conditions for learning in order to help schools improve safety and
reduce substance use. Projects must take a systematic approach to improving
conditions for learning in eligible schools through improved measurement
systems that assess conditions for learning, which must include school safety,
and the implementation of programmatic interventions that address problems
identified by data.

FY 2012 Performance Results. Complete cohort performance data are currently
not available. Baseline data will be available by May 2013 on performance
measures for the FY 2010 cohort.

FY 2013 Performance Targets. No targets are currently set for FY 2013, as
baseline data on which to set these performance targets are not yet available.
These targets will be set by May 2013, once baseline data are aggregated for the
entire FY 2010 grant cohort.

Measures. ED has established several GPRA performance measures for
assessing the effectiveness of Safe and Supportive Schools grants. The
measures related to addressing the goals of the National Drug Control Strategy
include:

 (a) Percentage of eligible schools implementing programmatic interventions
       funded by Safe and Supportive Schools that experience a decrease in the
       percentage of students who report current (30-day) alcohol use;
 (b) Percentage of eligible schools implementing programmatic interventions
       funded by Safe and Supportive Schools that experience an increase in the
       percentage of students who report current (30-day) alcohol use;
 (c) Percentage of eligible schools implementing programmatic interventions
       funded by Safe and Supportive Schools that experience an improvement
       in their school safety score;
 (d) Percentage of eligible schools implementing programmatic interventions
       funded by Safe and Supportive Schools that experience a worsening in
       their school safety score.

The school safety score is an index of school safety that may include the
presence and use of illegal drugs (including alcohol and marijuana).

Methodology. These measures constitute the Department's indicators of success
for the Safe and Supportive Schools grant program. Consequently, we advised
applicants for a grant under this program to give careful consideration to these
measures in conceptualizing the approach and evaluation for their proposed
programs. Each grantee will be required to provide, in its annual performance and
final reports, data about its progress in meeting these measures.





                        Grants to Reduce Alcohol Abuse

Measure 1: The percentage of grantees whose target students show a
measurable decrease in binge drinking. (Grants to Reduce Alcohol Abuse
Program - FY 2005, 2007, 2008, 2009, and 2010 cohorts)

Table 6

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2005      59.3      n/a       n/a       n/a       n/a       n/a       n/a
2007      61.5      47.0      83.0      n/a       n/a       n/a       n/a
2008      n/a       50.7      64.0      40.3      n/a       n/a       n/a
2009      n/a       n/a       57.1      67.0      77.0      pending   n/a
2010      n/a       n/a       n/a       50.0      n/a       75.0      n/a

The measure. This measure examines a key outcome for the Grants to Reduce
Alcohol Abuse (GRAA) program - reduction in binge drinking for the target
student population. Research suggests that early use of alcohol is more likely to
result in heavy later use of alcohol.

New grant awards were last made in FY 2010. Funds were not appropriated in
FY 2012 for new or continuation awards and, as a result, the FY 2010 cohort of
grantees was not provided its FY 2012 year 3 continuation award.

FY 2012 Performance Results. At the time of submission of the FY 2011
performance report, performance data for the FY 2008 cohort had not been
aggregated due to the large number of grantees in no-cost extensions. FY 2011
actual performance data for this cohort are being reported here for the first time.

The FY 2008 and previous cohorts had completed grant activities by FY 2011,
and therefore no actual performance data are available, nor were targets set, for
FY 2012. The FY 2009 cohort performance data have not been aggregated due
to the large number of grantees in no-cost extensions.

The only cohort for which FY 2012 performance data are currently available is
the FY 2010 cohort, a small cohort of grantees for which FY 2011 and FY 2012
data are being reported for the first time. In the FY 2011 performance report, we
indicated that FY 2012 targets for this cohort would be set once the FY 2011
performance data were aggregated and used as a baseline. However, once it
became clear that the final year of continuation funding would not be awarded to
this cohort of grantees, we decided not to set a FY 2012 target, as it was not clear
to what extent grantees would have the capacity to gather and report, and
respond to clarification questions about, the FY 2012 actual performance data.


However, grantees were ultimately able to provide actual FY 2012 performance
data, and the FY 2010 cohort made significant gains compared to FY 2011 actual
performance.

As we have received data from across cohorts for this measure and for this
program, we continue to find it difficult to discern a pattern of performance that
can serve as a basis for establishing future targets. We have carefully
considered performance reports submitted by grantees, as well as our
experience in monitoring and providing technical assistance to grantees, and
have identified some challenges that may have impeded grant performance.
Some common problems include turnover in leadership (at the authorized
representative or project director level) and challenges with collecting and
reporting valid data about the measure.

Another variable that might affect performance in sites is related to project
design. For example, we are uncertain how to assess the likely impact of a site
that is implementing a single research-based program versus sites that have
adopted a more comprehensive strategy that includes a community-based
intervention that complements school-based curricula. Finally, cohort size and
composition vary from cohort to cohort. In some years funding for a large
number of new awards was available, and in others only a handful of sites
received grants, as was the case in FY 2010 with a cohort of 8 grants.

Increasingly, over time, it became clear that a series of variables serve to make
each cohort unique, and that the way we established targets for this measure in
the past was problematic. Given these challenges, and improvements we made
in data quality (including generally requiring grantees to collect baseline data for
their projects before interventions are implemented), we modified our process for
establishing targets.

While prior cohort performance may have provided some insights about general
patterns of performance that we could incorporate into our target-setting
processes, we ultimately decided to establish numerical performance targets
after baseline data are received for the new cohorts. We generally entered these
targets for new grant cohorts into the Department's Visual Performance System
(VPS) as "administrative" targets (for example, baseline plus 5 percent), and then
converted the targets to numerical targets after baseline data were collected and
aggregated. We believed that this process revision helped us better match
targets to cohort performance.

FY 2013 Performance Targets. Targets have not been set for any grant cohorts
as none will conduct significant activity during FY 2013.

Methodology. Data for this measure are collected by grantees and reported as
part of annual performance reports. If data for this measure are not available at
the time that performance reports are submitted, staff follow up with sites to
attempt to obtain data for the measure. Grantees that fail to provide data are not
included in the tabulation of data for the measures. Also, grantees that did not
provide data for two consecutive project years (so that we could determine if a
decrease in binge drinking had occurred) are not included in the aggregate data
reported for the measure. Authorized representatives for the grant site sign the
annual performance report and, in doing so, certify that to the best of the signer's
knowledge and belief, all data in the performance report are true and correct and
that the report fully discloses all known weaknesses concerning the accuracy,
reliability, and completeness of the data included. Generally, the Department
relies on the certification concerning data supplied by grantees and does not
conduct further reviews.

ED does not mandate data collection protocols or instruments for grantees.
Grantees select a survey item that reflects the concept of binge drinking, and
collect and report data about that survey item as part of their performance
reports. As a result, data are not comparable across grant sites, but individual
grant sites are required to use the same survey items across performance
periods. We consider sites that have experienced a decrease of one percent or
greater in the rate of binge drinking to have achieved a measurable decrease in
binge drinking.
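
Stated as a short sketch (illustrative only and not part of the original report; it
assumes the one-percent threshold refers to a drop of one percentage point in the
reported rate, and the function name and inputs are hypothetical):

    # Illustrative sketch of the "measurable decrease" rule described above.
    # Rates are the percentages of target students reporting binge drinking on
    # the grantee's chosen survey item, compared across performance periods.
    def measurable_decrease(baseline_rate, latest_rate, threshold=1.0):
        return (baseline_rate - latest_rate) >= threshold

    print(measurable_decrease(22.0, 20.5))  # True: the rate fell 1.5 points
    print(measurable_decrease(22.0, 21.5))  # False: the rate fell only 0.5 points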

Initially, applicants were not required to furnish baseline data as part of their
applications. Data supplied after year one were considered baseline data for the
projects. Projects required two years of data in order to determine if a decrease
in binge drinking among target students had occurred. However, the FY 2007
and subsequent cohorts were instructed to provide baseline data in their
application, or if that data was not available, to collect it before beginning project
implementation. Thus, we are able to report on grantee and cohort performance
at the end of year one.

We have provided significantly increased guidance and technical assistance to
grantees beginning with the FY 2007 cohort, and believe that these efforts have
produced data that are of higher quality and more comparable across sites than
those of previous cohorts.

Measure 2: The percentage of grantees that show a measurable increase in the
percentage of target students who believe that alcohol abuse is harmful to their
health. (Grants to Reduce Alcohol Abuse - FY 2005, 2007, 2008, 2009, and
2010 cohorts)





Table 7

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2005      59.3      n/a       n/a       n/a       n/a       n/a       n/a
2007      69.2      76.5      88.9      n/a       n/a       n/a       n/a
2008      n/a       58.6      60.0      75.8      n/a       n/a       n/a
2009      n/a       n/a       100.0     67.0      100.0     pending   n/a
2010      n/a       n/a       n/a       50.0      n/a       50.0      n/a



The measure. This measure examines a key outcome for the Grants to Reduce
Alcohol Abuse (GRAA) program - perception of health risk for alcohol abuse
among target students. While the National Drug Control Strategy is focused
most intensively on preventing the use of controlled substances, the Strategy
does address the role of alcohol use as a drug of choice for teenagers. Data do
suggest that changes in perceptions about risks to health resulting from alcohol
use are positively correlated with reductions in alcohol use.

New grant awards were last made in FY 2010. Funds were not appropriated in
FY 2012 for new or continuation awards and, as a result, the FY 2010 cohort of
grantees was not provided its FY 2012 year 3 continuation award.

FY 2012 Performance Results. At the time of submission of the FY 2011
performance report, performance data for the FY 2008 cohort had not been
aggregated due to the large number of grantees in no-cost extensions. FY 2011
actual performance data for this cohort are being reported here for the first time.

The FY 2008 and previous cohorts had completed grant activities by FY 2011,
and therefore no actual performance data are available, nor were targets set, for
FY 2012. The FY 2009 cohort performance data have not been aggregated due
to the large number of grantees in no-cost extensions.

The only cohort for which FY 2012 performance data are currently available is
the FY 2010 cohort, a small cohort of grantees for which FY 2011 and FY 2012
data are being reported for the first time. In the FY 2011 performance report, we
indicated that FY 2012 targets for this cohort would be set once the FY 2011
performance data were aggregated and used as a baseline. However, once it
became clear that the final year of continuation funding would not be awarded to
this cohort of grantees, we decided not to set a FY 2012 target, as it was not clear
to what extent grantees would have the capacity to gather and report, and
respond to clarification questions about, the FY 2012 actual performance data.
However, grantees were ultimately able to provide actual FY 2012 performance
data, and the FY 2010 cohort made no progress compared to FY 2011 actual
performance.

As we have received data from across cohorts for this measure and for this
program, we continue to find it difficult to discern a pattern of performance that
can serve as a basis for establishing future targets. We have carefully
considered performance reports submitted by grantees, as well as our
experience in monitoring and providing technical assistance to grantees, and
have identified some challenges that may have impeded grant performance.
Some common problems include turnover in leadership (at the authorized
representative or project director level) and challenges with collecting and
reporting valid data about the measure.

Another variable that might affect performance in sites is related to project
design. For example, we are uncertain how to assess the likely impact of a site
that is implementing a single research-based program versus sites that have
adopted a more comprehensive strategy that includes a community-based
intervention that complements school-based curricula. Finally, cohort size and
composition vary from cohort to cohort. In some years funding for a large
number of new awards was available, and in others only a handful of sites
received grants, as was the case in FY 2010 with a cohort of 8 grants.

Increasingly, over time, it became clear that a series of variables serve to make
each cohort unique, and that the way we established targets for this measure in
the past was problematic. Given these challenges, and improvements we have
made in data quality (including generally requiring grantees to collect baseline
data for their projects before interventions are implemented), we modified our
process for establishing targets.

While prior cohort performance may have provided some insights about general
patterns of performance that we could incorporate into our target-setting
processes, we ultimately decided to establish numerical performance targets
after baseline data are received for the new cohorts. We generally entered these
targets for new grant cohorts into the Department's Visual Performance System
(VPS) as "administrative" targets (for example, baseline plus 5 percent), and then
converted the targets to numerical targets after baseline data were collected and
aggregated. We believed that this process revision helped us better match
targets to cohort performance.

FY 2013 Performance Targets. Targets have not been set for any grant cohorts
as none will conduct significant activity during FY 2013.

Methodology. Data for this measure are collected by grantees and reported as
part of annual performance reports. If data for this measure are not available at
the time that performance reports are submitted, staff follow up with sites to
attempt to obtain data for the measure. Grantees that fail to provide data are not
included in the tabulation of data for the measures. Also, grantees that did not
provide data for two consecutive project years (so that we could determine if an
increase in student perceptions of harm had occurred) are not included in the
aggregate data reported for the measure. Authorized representatives for the
grant site sign the annual performance report and, in doing so, certify that to the
best of the signer's knowledge and belief, all data in the performance report are
true and correct and that the report fully discloses all known weaknesses
concerning the accuracy, reliability, and completeness of the data included.
Generally, the Department relies on the certification concerning data supplied by
grantees and does not conduct further reviews.

ED does not mandate data collection protocols or instruments for grantees.
Grantees select a survey item that reflects the concept of perceptions of harm,
and collect and report data about that survey item as part of their performance
reports. As a result, data are not comparable across grant sites, but individual
grant sites are required to use the same survey items across performance
periods. We consider sites that have experienced an increase of one percent or
greater in the percentage of students who believe alcohol abuse is harmful to
have achieved a measurable increase for the measure.

Initially, applicants were not required to furnish baseline data as part of their
applications. Data supplied after year one were considered baseline data for the
projects. Projects required two years of data in order to determine if an increase
in perceptions of harm among target students had occurred. However, the FY
2007 and subsequent cohorts were instructed to provide baseline data in their
application, or if that data was not available, to collect it before beginning project
implementation. Thus, we are able to report on grantee and cohort performance
at the end of year one, as is done for the FY 2009 cohort in this report.

We have provided significantly increased guidance and technical assistance to
grantees beginning with the FY 2007 cohort, and believe that these efforts have
produced data that are of higher quality and more comparable across sites than
those of previous cohorts.

Measure 3: The percentage of grantees that show a measurable increase in the
percentage of target students who disapprove of alcohol abuse. (Grants to
Reduce Alcohol Abuse - FY 2005, 2007, 2008, 2009, and 2010 cohorts)





Table 8

Cohort    FY 2008   FY 2009   FY 2010   FY 2011   FY 2012   FY 2012   FY 2013
          Actual    Actual    Actual    Actual    Target    Actual    Target
2005      74.1      n/a       n/a       n/a       n/a       n/a       n/a
2007      69.2      47.0      88.9      n/a       n/a       n/a       n/a
2008      n/a       49.3      58.3      72.6      n/a       n/a       n/a
2009      n/a       n/a       100.0     67.0      100.0     pending   n/a
2010      n/a       n/a       n/a       66.7      n/a       100.0     n/a

The measure. This measure examines a key outcome for the Grants to Reduce
Alcohol Abuse (GRAA) program - disapproval of alcohol abuse among target
students. While the National Drug Control Strategy is focused most intensively
on preventing the use of controlled substances, the Strategy does address
the role of alcohol use as a drug of choice for teenagers. Research does
suggest that increases in the percentage of target students who believe that
alcohol abuse is not socially acceptable are associated with declines in
consumption of alcohol. New awards were last made in FY 2010. Funds were
not appropriated in FY 2012 for new or continuation awards.

FY 2012 Performance Results. At the time of submission of the FY 2011
performance report, performance data for the FY 2008 cohort had not been
aggregated due to the large number of grantees in no-cost extensions. FY 2011
actual performance data for this cohort are being reported here for the first time.

The FY 2008 and previous cohorts had completed grant activities by FY 2011,
and therefore no actual performance data are available, nor were targets set, for
FY 2012. The FY 2009 cohort performance data have not been aggregated due
to the large number of grantees in no-cost extensions.

The only cohort for which FY 2012 performance data are currently available is
the FY 2010 cohort, a small cohort of grantees for which FY 2011 and FY 2012
data are being reported for the first time. In the FY 2011 performance report, we
indicated that FY 2012 targets for this cohort would be set once the FY 2011
performance data were aggregated and used as a baseline. However, once it
became clear that the final year of continuation funding would not be awarded to
this cohort of grantees, we decided not to set a FY 2012 target, as it was not clear
to what extent grantees would have the capacity to gather and report, and
respond to clarification questions about, the FY 2012 actual performance data.
However, grantees were ultimately able to provide actual FY 2012 performance
data, and the FY 2010 cohort made significant gains compared to FY 2011 actual
performance.

As we have received data from across cohorts for this measure and for this
program, we continue to find it difficult to discern a pattern of performance that
can serve as a basis for establishing future targets. We have carefully
considered performance reports submitted by grantees, as well as our
experience in monitoring and providing technical assistance to grantees, and
have identified some challenges that may have impeded grant performance.
Some common problems include turnover in leadership (at the authorized
representative or project director level) and challenges with collecting and
reporting valid data about the measure.

Another variable that might affect performance in sites is related to project
design. For example, we are uncertain how to assess the likely impact of a site
that is implementing a single research-based program versus sites that have
adopted a more comprehensive strategy that includes a community-based
intervention that complements school-based curricula. Finally, cohort size and
composition vary from cohort to cohort. In some years funding for a large
number of new awards was available, and in others only a handful of sites
received grants, as was the case in FY 2010 with a cohort of 8 grants.

Increasingly, over time, it became clear that a series of variables serve to make
each cohort unique, and that the way we established targets for this measure in
the past was problematic. Given these challenges, and improvements we have
made in data quality (including generally requiring grantees to collect baseline
data for their projects before interventions are implemented), we modified our
process for establishing targets.

While prior cohort performance may have provided some insights about general
patterns of performance that we could incorporate into our target-setting
processes, we ultimately decided to establish numerical performance targets
after baseline data are received for the new cohorts. We generally entered these
targets for new grant cohorts into the Department's Visual Performance System
(VPS) as "administrative" targets (for example, baseline plus 5 percent), and then
converted the targets to numerical targets after baseline data were collected and
aggregated. We believed that this process revision helped us better match
targets to cohort performance.

FY 2013 Performance Targets. Targets have not been set for any grant cohorts
as none will conduct significant activity during FY 2013.

Methodology. Data for this measure are collected by grantees and reported as
part of annual performance reports. If data for this measure are not available at
the time that performance reports are submitted, staff follow up with sites to
attempt to obtain data for the measure. Grantees that fail to provide data are not
included in the tabulation of data for the measures. Also, grantees that did not
provide data for two consecutive project years (so that we could determine if an
increase in students disapproving of alcohol abuse had occurred) are not
included in the aggregate data reported for the measure. Authorized
representatives for the grant site sign the annual performance report and, in
doing so, certify that to the best of the signer's knowledge and belief, all data in
the performance report are true and correct and that the report fully discloses all
known weaknesses concerning the accuracy, reliability, and completeness of the
data included. Generally, the Department relies on the certification concerning
data supplied by grantees and does not conduct further reviews.

ED does not mandate data collection protocols or instruments for grantees.
Grantees select a survey item that reflects the concept of disapproval of alcohol
abuse, and collect and report data about that survey item as part of their
performance reports. As a result, data are not comparable across grant sites,
but individual grant sites are required to use the same survey items across
performance periods. We consider sites that have experienced an increase of
one percent or greater in the percentage of students who disapprove of alcohol
abuse to have achieved a measurable increase for the measure.

Initially, applicants were not required to furnish baseline data as part of their
applications. Data supplied after year one were considered baseline data for the
projects. Projects required two years of data in order to determine if an increase
in disapproval of alcohol abuse among target students had occurred. However,
the FY 2007 and subsequent cohorts were instructed to provide baseline data in
their application, or if that data was not available, to collect it before beginning
project implementation. Thus, we are able to report on grantee and cohort
performance at the end of year one, as is done for the FY 2009 cohort in this
report.

We have provided significantly increased guidance and technical assistance to
grantees beginning with the FY 2007 cohort, and believe that these efforts have
produced data that are of higher quality and more comparable across sites than
those of previous cohorts.





Assertions
                          Performance Reporting System

The Department of Education has a system in place to capture performance
information accurately, and that system was properly applied to generate the
performance data in this report. In instances in which data are supplied by
grantees as part of required periodic performance reports, the data that are
supplied are accurately reflected in this report.

Data related to the drug control programs included in this Performance Summary
Report for Fiscal Year 2012 are recorded in the Department of Education's
software for recording performance data and are an integral part of our budget
and management processes.

               Explanations for Not Meeting Performance Targets

The explanations provided in the Performance Summary Report for Fiscal Year
2012 for not meeting performance targets, and the recommendations for plans to
revise performance targets, are reasonable given past experience, available
information, and available resources.

               Methodology for Establishing Performance Targets

The methodology described in the Performance Summary Report for Fiscal Year
2012 to establish performance targets for the current year is reasonable given
past performance and available resources.

          Performance Measures for Significant Drug Control Activities

The Department of Education has established at least one acceptable
performance measure for each Drug Control Decision Unit identified in its
Detailed Accounting of Fiscal Year 2012 Drug Control Funds.

                             Criteria for Assertions

No workload or participant data support the assertions provided in this report.
Sources of quantitative data used in the report are well documented. These data
are the most recently available and are identified by the year in which the data
were collected.

                            Other Estimation Methods

No estimation methods other than professional judgment were used to make the
required assertions. When professional judgment was used, the objectivity and
strength of those judgments were explained and documented. Professional
judgment was used to establish targets for programs until data from at least one
grant cohort were available to provide additional information needed to set more
accurate targets. We routinely re-evaluate targets set using professional
judgment as additional information about actual performance on measures
becomes available.

                               Reporting Systems

Reporting systems that support the above assertions are current, reliable, and an
integral part of the Department of Education's budget and management
processes. Data collected and reported for the measures discussed in this report
are stored in the Department of Education's Visual Performance System (VPS).
Data from the VPS are used in developing annual budget requests and
justifications, and in preparing reports required under the Government
Performance and Results Act of 1993, as amended.





                                 UNITED STATES DEPARTMENT OF EDUCATION
                                                       OFFICE OF INSPECTOR GENERAL

                                                                                                                     AUDIT SERVICES

                                                         February 28, 2013

Memorandum

To:                  David Esquith
                     Director, Office of Safe and Healthy Students
                     Office of Elementary and Secondary Education

From:                Patrick J. Howard
                     Assistant Inspector General for Audit

Subject:             Office of Inspector General's Independent Report on the U.S. Department of
                     Education's Performance Summary Report for Fiscal Year 2012, dated
                     February 28, 2013

Attached is our authentication of management's assertions contained in the U.S. Department of
Education's Performance Summary Report for Fiscal Year 2012, dated February 28, 2013, as
required by section 705(d) of the Office of National Drug Control Policy Reauthorization Act of
1998 (21 U.S.C. § 1704(d)).

Our authentication was conducted in accordance with the guidelines stated in the Office of
National Drug Control Policy Circular: Drug Control Accounting, dated May 1, 2007.

If you have any questions or wish to discuss the contents of this authentication, please contact
Michele Weaver-Dugan, Director, Operations Internal Audit Team, at (202) 245-6941.




                                 UNITED STATES DEPARTMENT OF EDUCATION
                                                       OFFICE OF INSPECTOR GENERAL

                                                                                                                     AUDIT SERVICES

Office of Inspector General's Independent Report on the U.S. Department of Education's
Performance Summary Report for Fiscal Year 2012, dated February 28, 2013

We have reviewed management's assertions contained in the accompanying Performance
Summary Report for Fiscal Year 2012, dated February 28, 2013 (Performance Summary Report).
The U.S. Department of Education's management is responsible for the Performance Summary
Report and the assertions contained therein.

Our review was conducted in accordance with generally accepted government auditing standards
for attestation review engagements. A review is substantially less in scope than an examination,
the objective of which is the expression of an opinion on management's assertions. Accordingly,
we do not express such an opinion.

We performed review procedures on the "Performance Summary Information," "Assertions,"
and "Criteria for Assertions" contained in the accompanying Performance Summary Report.
In general, our review procedures were limited to inquiries and analytical procedures appropriate
for our review engagement. We did not perform procedures related to controls over the reporting
system noted in the attached report.

Based on our review, nothing came to our attention that caused us to believe that management's
assertions, contained in the accompanying Performance Summary Report, are not fairly stated in
all material respects, based upon the Office of National Drug Control Policy Circular:
Drug Control Accounting, dated May 1, 2007.


                                                                   Patrick J. Howard
                                                                   Assistant Inspector General for Audit



