
U.S. Department of Education's and Selected States' Oversight of the 21st Century Community Learning Centers Program

Published by the Department of Education, Office of Inspector General on 2013-06-21.


UNITED STATES DEPARTMENT OF EDUCATION
OFFICE OF INSPECTOR GENERAL

Control Number: ED-OIG/A04L0004

June 21, 2013

Deborah S. Delisle
Assistant Secretary
Office of Elementary and Secondary Education
U.S. Department of Education
400 Maryland Avenue, S.W.
Washington, DC 20202-4300

Dear Ms. Delisle:

This final audit report, “U.S. Department of Education’s and Selected States’ Oversight of the
21st Century Community Learning Centers Program,” presents the results of our audit. The objectives
of the audit were to (1) determine whether the U.S. Department of Education (Department) effectively
monitored and tracked program performance measures for 21st Century Community Learning Centers
(21st CCLC) grantees 1 to ensure that grantees met program objectives and (2) assess the processes and
controls that four selected State educational agencies (SEA) used to award and monitor subgrants.

The Department designed and followed a monitoring plan to provide oversight of SEAs and relied on
the subgrantees’ submitted data to track program performance measures. However, we found that the
Department can improve its oversight of the accuracy, reliability, and completeness of the SEAs’
annual reports on 21st CCLC program performance. In addition, although the Department monitored
the SEAs’ processes to award and monitor subgrants, we identified areas in which the Department can
improve its oversight of the SEAs’ award and monitoring processes.

Our review covered the Department’s monitoring of SEAs and the processes used by the four selected
SEAs—the Alabama State Department of Education (Alabama), Florida Department of Education
(Florida), Mississippi Department of Education (Mississippi), and Puerto Rico Department of
Education (Puerto Rico)—to award and monitor subgrants during fiscal year (FY) 2011, from
October 1, 2010, through September 30, 2011. We expanded our scope to FY 2010 for Florida and
Puerto Rico based on circumstances related to those SEAs’ grant award competition periods. Specific
details of the expanded scope are discussed in the “Objective, Scope, and Methodology” section of this
report.




1 The term “grantee” refers to State educational agencies that received 21st CCLC formula grants from the Department.

                                                 BACKGROUND


The 21st CCLC program is authorized under Title IV, Part B, of the Elementary and Secondary
Education Act of 1965, as amended by the No Child Left Behind Act of 2001 (ESEA). The purpose of
the program is to establish or expand community learning centers that provide academic enrichment
opportunities during non-school hours to students to help them meet State and local academic
achievement standards in core academic subjects. Targeting students in high-poverty areas who attend
low-performing schools, the program supports a broad range of before- and after-school activities such
as tutoring services, drug and violence prevention, counseling, art, music, recreation, technology, and
character education programs. The support activities are designed to reinforce and complement the
students’ regular academic program. The program also offers literacy and other educational services to
the families of participating children.

The Department awards 21st CCLC program formula grants to SEAs based on each State’s funding
under Title I, Part A of the ESEA (Title I). SEAs award the 21st CCLC grants through subgrants to
eligible local educational agencies (LEA), community-based organizations, other eligible public and
private entities, and consortiums of two or more of such agencies, organizations, or entities. The SEAs
award subgrants to eligible entities on a competitive basis through a peer review process of subgrant
applications or other method that ensures the quality of the applications. The awards must be for a
period of not less than 3 years and not more than 5 years, in amounts not less than $50,000 per year.
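
For illustration, the statutory award terms in the paragraph above (a period of 3 to 5 years and at least $50,000 per year) can be expressed as a simple check. The following is a minimal sketch; the function and parameter names are hypothetical and are not part of any Department or SEA system.

    def subgrant_terms_ok(period_years: int, annual_amount: float) -> bool:
        """True if a proposed award meets the terms above: a period of
        3 to 5 years and an amount of at least $50,000 per year."""
        return 3 <= period_years <= 5 and annual_amount >= 50_000

    print(subgrant_terms_ok(3, 50_000))   # True
    print(subgrant_terms_ok(2, 75_000))   # False: period is shorter than 3 years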

The Department’s Academic Improvement and Teacher Quality Programs, under the Office of
Elementary and Secondary Education (OESE), has oversight responsibility of the 21st CCLC program.
The Academic Improvements Programs Group (AIPG) within the Academic Improvement and
Teacher Quality Programs is composed of nine staff—a group leader and eight program officers—who
monitor the SEAs’ administration and implementation of 21st CCLC formula grants and provide
technical assistance to promote program success and ensure compliance with statutory requirements.
Each program officer is responsible for overseeing from 3 to 10 assigned SEAs. In addition, the AIPG
is supported by Berkeley Policy Associates, a contractor who assists AIPG in planning and conducting
monitoring site visits to SEAs and reporting site visit results.

For FY 2011, the Department awarded nearly $1.1 billion in 21st CCLC program funds to the States
and territories, of which nearly $131.7 million was awarded to the four SEAs we selected for review,
as detailed in the following table. 2 The SEAs awarded the 21st CCLC grants to 378 subgrantees.




2 Related to the expanded scope, Florida was awarded nearly $53.9 million and Puerto Rico nearly $43 million in FY 2010.


            Table 1: 21st CCLC Grant Awards Fiscal Year 2011 (Dollars in Millions)

                SEA             Grant Award Amount    Number of Subgrantees
                Alabama                  $16.7                   93
                Florida                  $56.1                  121
                Mississippi              $15.4                   87
                Puerto Rico              $43.5                   77
                Total                   $131.7                  378
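
(As a check on the figures above, the award amounts sum to $16.7 + $56.1 + $15.4 + $43.5 = $131.7 million, and the subgrantee counts sum to 93 + 121 + 87 + 77 = 378, consistent with the totals cited in the preceding paragraph.)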

The Department is required under the Government Performance and Results Act of 1993 (GPRA), as
amended by the GPRA Modernization Act of 2010, to develop an annual performance plan with
performance goals and indicators for each program activity set forth in its budget and report annually
on the actual performance achieved. The Department uses the GPRA indicators to evaluate the
effectiveness and efficiency of 21st CCLCs operating nationwide. The Department uses data the SEAs
and subgrantees submit to report program performance related to 16 GPRA-related performance
indicators. Specifically, the Department tracks 21st CCLC program performance from data collected
through the Profile and Performance Information Collection System (PPICS), a Web-based system
maintained by Learning Point Associates. PPICS consists of various data collection modules,
compiled from data the SEAs and subgrantees provided. The Annual Performance Report module
within PPICS collects the data required to address the GPRA indicators for the 21st CCLC program,
which include information on the progress subgrantees made during the preceding year toward meeting
their established program objectives. The SEAs either collect the Annual Performance Report data
from subgrantees and upload it into PPICS, or have the subgrantees upload the data and the SEA
approves the submitted data in the system. In FY 2011, Alabama, Mississippi, and Puerto Rico
delegated the responsibility for performance data submission through PPICS to their subgrantees,
whereas Florida delegated some data entry responsibilities to its subgrantees and performed some data
entry tasks at the State level from data provided by its subgrantees.



                                       AUDIT RESULTS


The Department could more effectively monitor and track SEAs’ 21st CCLC program performance
measures by ensuring that SEAs develop processes sufficient to provide reasonable assurances of the
accuracy, reliability, and completeness of the performance information provided. Although the
Department used PPICS to track 21st CCLC program performance measures, we found that neither the
Department nor three of the four SEAs we reviewed validated the performance data that the
subgrantees submit. As a result, the Department is unable to ensure grantees have met program
objectives because it cannot be sure of the accuracy, reliability, and completeness of the performance
data reported by SEAs.

In addition, although the Department monitored the SEAs’ processes to award and monitor subgrants,
we found that the Department did not identify internal control weaknesses we found at the selected
SEAs. We identified areas in which the Department can improve its oversight of the SEAs’ award and
monitoring processes. Specifically, we identified deficiencies in the award process at all four of the
SEAs reviewed and in the SEAs’ subgrantee monitoring processes at three of the four SEAs—
Alabama, Mississippi, and Puerto Rico. The Department had previously identified and reported the
peer reviewer selection process deficiencies we found at Alabama, the monitoring deficiencies we
found at Alabama and Mississippi, and the program evaluation issues we found at Puerto Rico. At the
time of our review, the Department had recently reported these deficiencies in monitoring reports and
was in the process of either obtaining SEA responses and corrective action plans (CAP) or evaluating
the ones received. In addition, we identified deficiencies in the award process at Alabama, Mississippi
and Puerto Rico and monitoring deficiencies at Puerto Rico that the Department had not identified in
its monitoring site visits.

The Department’s most recent monitoring visits 3 conducted in FY 2012 identified deficiencies in
SEAs’ processes to award and monitor subgrants. This indicates that the deficiencies in this report are
not limited to the States we reviewed and, therefore, warrant the Department’s follow-up and
continuous oversight of the 21st CCLC program funds.

In the “Other Matters” section of this report, we discuss the Department’s and the SEAs’ performance
measures for timely issuance of reports detailing deficiencies identified in monitoring site visits.

In response to the draft audit report, the Department agreed with the two findings and either agreed or
partially agreed with all but one of the recommendations. Although the Department disagreed with
Recommendation 1.2, it proposed additional training to the 21st CCLC program office to address the
finding. Based on the Department’s comments, we revised Recommendations 1.2 and 2.1. We added
training as an option to the Department for assessing the SEAs’ monitoring efforts over the SEAs’
validation of the reported 21st CCLC performance data to Recommendation 1.2. We clarified that the
Department’s monitoring and oversight procedures should include a review of how states are assessing
peer reviewers’ qualifications in Recommendation 2.1. We did not make any other changes to the
findings or related recommendations. The Department’s comments are summarized at the end of each
finding. The full text of the Department’s response and proposed corrective actions are included as
Attachment 2 to this report.

FINDING NO. 1 – The Department Can Improve Oversight of Program
Performance Data
The Department designed and followed a monitoring plan that provided oversight of SEAs.
Specifically, the Department

    •   tracked 21st CCLC program performance using data collected from SEAs and subgrantees
        through PPICS;
    •   conducted monitoring site visits 4 at 19 SEAs during FY 2011 and 17 SEAs during FY 2012 in
        accordance with its monitoring plan for each fiscal year based on its 3-year cycle for providing
        monitoring visits to all SEAs; and
    •   reported deficiencies that, if properly corrected, could result in improvements in the grantees’
        program performance. 5
3 The Department’s FY 2012 monitoring plan used the same monitoring protocol as its FY 2011 plan.
4 The Department conducted the site visits from January through September 2011.
5 Examples of the deficiencies the Department identified through monitoring SEAs are detailed in Finding No. 2 in this audit report.

However, the Department did not effectively monitor the SEAs to ensure that the data submitted are
reliable for use in assessing SEA program performance. We found that the Department can improve
its oversight of the accuracy, reliability, and completeness of the SEAs’ annual reports on 21st CCLC
program performance. We also found that the Department did not have comprehensive written
policies and standard operating procedures for monitoring the 21st CCLC program.

Performance Data Validation Could Improve Program Oversight

The Department relies on the SEAs to report accurate, reliable, and complete 21st CCLC program
performance data through PPICS. However, it does not ensure that the SEAs validate the data that
subgrantees submit. Absent sufficient testing to provide reasonable assurance of the accuracy,
reliability, and completeness of reported performance data, the Department cannot ensure that grantees
have met program objectives because it has not been assured that the data it uses to report program
performance addressing GPRA indicators are accurate, reliable, and complete.

PPICS includes internal edits that are designed to validate the internal consistency of the reported data;
detect any extreme values or values outside the norm (outliers); confirm that all required sections have
been completed; and notify the subgrantees, the SEAs, and the Department of any problems detected
with the data. However, the Department relies on the SEAs to ensure subgrantees take appropriate
action to address problems detected with the data and ensure that the data are accurate, reliable, and
complete. As detailed below, we found that three of the four SEAs reviewed—Alabama, Mississippi,
and Puerto Rico—did not sufficiently verify the accuracy, reliability, and completeness of the program
performance data reported by their subgrantees. In addition, we identified a data validation practice at
one of the SEAs reviewed—Florida—that may be considered a promising practice for SEAs.
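
To make the edit checks described above concrete, the following is a minimal sketch of the kinds of automated validations a PPICS-style system or an SEA reviewer might run: completeness of required sections, internal consistency between grade-level counts and the reported total, and simple outlier flags. It is illustrative only; the field names and thresholds are hypothetical and do not reflect the actual PPICS implementation.

    REQUIRED_SECTIONS = {"total_students_served", "students_by_grade", "hours_of_operation"}

    def edit_check(report: dict) -> list:
        """Return a list of problems detected in one subgrantee's annual report."""
        problems = []

        # Completeness: confirm that all required sections have been completed.
        for section in sorted(REQUIRED_SECTIONS - report.keys()):
            problems.append(f"missing required section: {section}")

        # Internal consistency: grade-level counts should sum to the reported total.
        by_grade = report.get("students_by_grade", {})
        total = report.get("total_students_served")
        if total is not None and by_grade and sum(by_grade.values()) != total:
            problems.append(f"grade-level counts sum to {sum(by_grade.values())}, "
                            f"but the reported total is {total}")

        # Outliers: flag values outside a plausible range (threshold is illustrative).
        hours = report.get("hours_of_operation")
        if hours is not None and not (0 < hours <= 2000):
            problems.append(f"hours_of_operation outside expected range: {hours}")

        return problems

    # Hypothetical report whose grade-level counts (sum = 101) disagree with its total.
    example = {
        "total_students_served": 103,
        "students_by_grade": {"K": 20, "1": 18, "2": 17, "3": 16, "4": 15, "5": 15},
        "hours_of_operation": 540,
    }
    print(edit_check(example))

A consistency check of this kind would have flagged the type of discrepancy described below for the Alabama subgrantees, whose grade-level breakdowns did not match the number of students actually served.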

        Alabama required each 21st CCLC grant recipient to input the required subgrantee
        performance data to PPICS, and Alabama used a contractor to ensure that each grant recipient
        entered the data into the system. However, neither the contractor nor Alabama verified whether
        the data reported to the Department were accurate, reliable, and complete. We conducted
        limited tests on the accuracy of selected data elements that 2 of the 93 Alabama subgrantees
        reported in their FY 2011 Annual Performance Reports. 6 In reviewing the data for reported
        students at all of the grade levels, we found that both subgrantees reported an inaccurate
        number of students served by grade level. Specifically, one subgrantee reported 103 students
        served in 6 grade levels (kindergarten through fifth grade) and, although the subgrantee
        accurately reported 103 students served, the number reported by each grade level contained
        discrepancies. The second subgrantee reported 92 students served in 9 grade levels
        (kindergarten through eighth grade) but the actual number of students served was 94 and the
        number reported by each grade level contained discrepancies except for those reported in the
        sixth grade. The inaccuracies in the grade level served ranged from two to nine students
        overreported and from one to eight students underreported when compared with the number of
        students actually served by grade level. Alabama did not detect the reported inaccuracies.
        Such discrepancies in grade level reporting are important because 10 of the 16 GPRA-related
        performance indicators for the 21st CCLC program used student grade level to measure
        program performance. In response to our finding, Alabama stated that, to strengthen its
        monitoring of the 21st CCLC performance data reported to the Department through PPICS, it
        was in the process of updating its written policies, procedures, and instruments to include
        monitoring steps to (1) verify the accuracy, reliability, and completeness of the reported data
        and (2) ensure that subgrantees implement adequate data management systems that allow for
        accurate reporting of the data.

6 We did not conduct similar tests for any of the other SEAs reviewed. More details of the limited tests conducted of the accuracy of the selected data elements are discussed in the “Objective, Scope, and Methodology” section of this report.

      Mississippi compared the data reported in the subgrantees’ Annual Performance Reports with
      data included in the subgrantees’ applications for continuation grants. The Director of
      Mississippi’s 21st CCLC program stated that during onsite monitoring and desk audits, staff
      compared the reported data with data in documents reviewed. However, although Mississippi
      has a total of 87 subgrantees, we found that it conducted only three onsite monitoring reviews
      and no desk audits in FY 2011. The three FY 2011 onsite monitoring reports did not contain
      any information indicating that Mississippi assessed the accuracy, reliability, and completeness
      of the program performance data gathered as of the date of the reviews. In response to our
      finding, Mississippi stated that it had developed written policies, procedures, and monitoring
      instruments to verify the accuracy, reliability, and completeness of 21st CCLC performance
      data reported to the Department through PPICS. Mississippi stated that it planned to
      implement the procedures in the closeout reviews to Year 5 subgrantees, scheduled for the
      summer of 2012, and to all subgrantees in its FY 2013 monitoring cycle.

      Puerto Rico approved the 21st CCLC program data that its subgrantees submitted in PPICS.
      Specifically, a staff member was responsible for reviewing and approving the data prior to
      FY 2011 and the 21st CCLC Coordinator approved the FY 2011 data. According to the
      Puerto Rico 21st CCLC program office, it did not have sufficient staff to review subgrantee
      documentation and validate the reported data. In addition, the written policies, procedures, and
      instruments that Puerto Rico used for fiscal and programmatic monitoring visits did not include
      steps to review a sample of documents to test the accuracy, reliability, and completeness of the
      subgrantees’ reported program performance data. In response to our finding, Puerto Rico
      stated that it was reviewing the fiscal and programmatic monitoring guides and instruments to
      determine the steps it needed to add to verify the accuracy, reliability, and completeness of the
      program performance data its subgrantees reported through PPICS. By updating the
      monitoring instrument to include data verification, Puerto Rico would shift the responsibility
      for review from the 21st CCLC program office to Puerto Rico’s Office of Federal Affairs
      monitoring unit.

      Florida provided a data collection instrument that assisted its subgrantees in collecting and
      reporting the profile and performance data they needed to complete Annual Performance
      Reports submitted through PPICS at the end of the year. Florida required its subgrantees to
      submit the data and information demonstrating progress in a midyear report, an end-of-year
      report, and a summative evaluation report. Florida also required its subgrantees to submit
      monthly reports on the participating students’ average daily attendance and the number of
      hours of operation; further, Florida required attendance lists and sign-in sheets in support of the
      information the subgrantees reported. According to the Director of Florida’s 21st CCLC
      program, monitoring the average daily student attendance data, rather than enrollment, allowed
      Florida to assess the level of funding the subgrantees needed to provide the services. In
      addition, Florida developed a calendar of evaluative visits to selected subgrantees based on a
      number of risk factors (for example, year of program funding and findings from previous years’
      evaluations). During these visits, staff reviewed specific, tangible evidence, such as results of
        pre- and post-tests administered to participating students and student report card grades, that the
        subgrantees collected and used to report on program performance. According to the Director of
       Florida’s 21st CCLC program, Florida compared the Annual Performance Report data with the
       data subgrantees reported to Florida throughout the year and to observations obtained from the
       site visits. Florida provided us 33 evaluation reports completed in FY 2010 and 70 evaluation
       reports completed in FY 2011. We found that Florida’s process provided adequate validation
       and could be presented to other SEAs as a promising practice for overseeing reported
       performance data.

The GPRA, as amended by the GPRA Modernization Act of 2010, requires agencies to clarify their
missions, set strategic and annual performance goals, and measure and report on performance towards
those goals. According to the U.S. Government Accountability Office’s Standards for Internal Control
in the Federal Government, internal control plays a significant role in helping managers achieve those
goals.

The Department’s monitoring instrument for SEAs includes a step to assess whether the SEAs
monitored subgrantees to ensure that PPICS data were accurate and submitted on time. However, that
step lacked specificity to adequately assess the SEAs’ monitoring efforts. Consequently, the most
recent monitoring reports that the Department issued to Alabama, Mississippi, and Puerto Rico did not
identify deficiencies with these SEAs’ written policies, procedures, and monitoring instruments
related to validating the reported 21st CCLC performance data.

Without sufficient oversight of the accuracy, reliability, and completeness of the SEAs’ reported
21st CCLC program performance data through PPICS, the Department risks using inaccurate,
unreliable, or incomplete information to determine the overall success of the 21st CCLC program,
SEAs’ progress in achieving program performance objectives, and technical assistance needs of SEAs
to ensure successful program implementation.

The Department Should Have Comprehensive Written Policies and Standard Operating
Procedures for Monitoring

Although the Department implemented oversight activities, it did not have comprehensive written
standard operating procedures for monitoring the 21st CCLC program. The AIPG had written
procedures in the form of a monitoring logistics guide, which outlined the roles and responsibilities of
the Department staff and a contractor for planning and conducting monitoring site visits to all SEAs
once during a 3-year cycle and issuing monitoring reports. The AIPG also had monitoring
instruments, which were tools with questions and steps to be completed during monitoring. However,
the Department did not have written policies and standard operating procedures for coordinating and
conducting key monitoring activities, including approving or rejecting CAPs submitted by SEAs,
notifying the SEAs of the acceptance or rejection of the CAP, and providing technical assistance to the
SEAs. Further, the Department did not have written procedures for other key monitoring activities that
it conducted, including
   •   calling SEAs quarterly to discuss their training, technical assistance, and monitoring activities
       for subgrantees and the States’ evaluation of the program;
   •   reviewing SEAs’ drawdown activity of 21st CCLC program funds during the quarter;
      •    reviewing SEAs’ grant award competition data submitted through PPICS; 7 and
      •    evaluating CAPs in response to 21st CCLC program-related findings reported in the U.S.
           Office of Management and Budget Circular A-133 single audit reports. 8

Both the Acting Director of the Academic Improvement and Teacher Quality Programs and the former
AIPG Group Leader acknowledged the need to develop and implement written policies and standard
operating procedures governing the monitoring process. According to the U.S. Government
Accountability Office’s Standards for Internal Control in the Federal Government, internal control is a
major part of managing an organization, and written policies and standard operating procedures are
instrumental components of effective internal control. Without such policies and standards for
monitoring SEAs and for ensuring adherence to those policies and standards, the Department’s ability
to consistently monitor and improve the operational efficiency of the 21st CCLC program is limited.

RECOMMENDATIONS

We recommend that the Assistant Secretary for OESE require the AIPG to—

1.1       Ensure that SEAs implement written policies, procedures, and monitoring instruments to
          sufficiently test the 21st CCLC performance data and provide reasonable assurance of the
          accuracy, reliability, and completeness of data reported to the Department.

1.2       Revise its SEA site visits monitoring instrument or develop complementary written guidance to
          sufficiently evaluate SEA monitoring activities over the reliability of the performance data
          reported to the Department; and provide training to 21st CCLC staff on the implementation of the
          revised instrument or guidance.

1.3       Identify promising practices from the Department’s monitoring visits and communicate those
          practices to all SEAs.

1.4       Develop and implement written policies and standard operating procedures for coordinating and
          conducting key monitoring activities, including approving or rejecting CAPs submitted by SEAs,
          notifying the SEAs of the acceptance or rejection of the CAP, and providing technical assistance
          to the SEAs.

Department Comments

The Department agreed with Finding No. 1, agreed with Recommendations 1.1 and 1.3, disagreed with
Recommendation 1.2, and partially agreed with Recommendation 1.4. In response to
Recommendation 1.1, the Department proposed additional training to its staff to ensure that SEAs
implement written policies, procedures, and monitoring instruments to sufficiently test 21st CCLC
data. In response to Recommendation 1.3, the Department acknowledged that it would be useful to
collect, review, and share promising practices on performance data quality. As such, the Department
proposed developing and implementing a plan to efficiently and effectively collect and disseminate
promising practices to SEAs, and to identify compliance issues for monitoring.

7 SEAs report about the outcomes of State-level grant competitions—the number of applicants for each competition and the number and amounts of grants awarded.
8 All non-Federal entities that expend $500,000 or more of Federal awards in a year are required to obtain an annual audit in accordance with the Single Audit Act of 1984, as amended in 1996; and the Office of Management and Budget Circular A-133. The Post Audit Group within the Department’s Office of the Chief Financial Officer is responsible for following up on and resolving single audit findings.

The Department disagreed with Recommendation 1.2 and stated that its existing monitoring
instruments already include a step that requires (1) addressing SEA monitoring activities over
subgrantees to ensure that PPICS data are submitted accurately and on time and (2) reviewing the
SEAs’ documentation, procedures, guidance, and sanctions for noncompliance. According to the
Department, its proposed plan to address Recommendation 1.1 also addresses Recommendation 1.2.
Specifically, the Department proposed additional training to the 21st CCLC program office staff,
including providing examples of what constitutes sufficient testing, and providing more detailed,
written guidance articulating how subgrantee monitoring plays a critical role in ensuring performance
data reliability.

The Department partially agreed with Recommendation 1.4 and stated that the existing 21st CCLC
monitoring instruments provide guidance regarding the types of evidence that should be reviewed and
the factors that should be considered in determining compliance. However, it agreed that more
detailed guidance on how to assess the evidence and address issues related to inadequate evidence
would be useful. The Department further agreed that its monitoring teams could benefit from clear,
written guidance on activities such as reviewing SEAs’ CAPs and assessing the need for, and
providing appropriate, technical assistance to the SEAs. The Department proposed revising its
monitoring instruments and procedures based on the results of an analysis of its monitoring site visit
findings. The Department also agreed that it needed to provide technical assistance to SEAs about its
CAP process.

OIG Response

In response to the Department’s comments to the draft audit report and its proposed corrective actions,
we revised Recommendation 1.2. Specifically, we acknowledged the Department’s proposal of
training as an option for focusing on the SEAs’ monitoring efforts over PPICS data reliability. We
agree that using the Department’s current monitoring instrument, a sufficiently trained reviewer could
reasonably assess the SEAs’ monitoring efforts over the validation of the reported 21st CCLC
performance data. However, because written guidance will facilitate a reviewer’s evaluation of SEA
monitoring activities, we did not make any other changes to the finding or related recommendations.
The Department’s implementation of the proposed corrective actions in response to our finding and
related recommendations should help the Department improve its oversight of the 21st CCLC program
performance data.

FINDING NO. 2 – The Department Can Improve Oversight of SEAs’ Processes to
               Award and Monitor 21st CCLC Program Subgrants

The Department monitored the SEAs’ processes to award and monitor subgrants, and its monitoring
efforts identified deficiencies and reported those deficiencies to the SEAs. However, we found
unreported deficiencies in the subgrant award process and the subgrantee monitoring efforts. Further,
we found that the Department’s monitoring instrument did not include steps to assess whether
SEAs carried out key steps in the subgrant award process. As a result, the Department can improve its
oversight of the SEAs and provide greater assurance that SEAs have adequate processes and controls
for awarding and monitoring 21st CCLC subgrants. In addition, in FY 2012 site visits to Maryland
and North Dakota, the Department identified similar deficiencies to those included in this report,
indicating that the deficiencies we identified in SEAs’ processes to award and monitor subgrants are
not limited to the States we reviewed. The prevalence of these deficiencies warrants the Department’s
follow-up and continuous oversight of the 21st CCLC program funds. 9

SEAs’ Processes to Award 21st CCLC Program Subgrants Need to be Improved

We identified deficiencies in the process used to award 21st CCLC program subgrants at all four of the
SEAs reviewed. 10 Specifically, we found

    •   internal control weaknesses in the peer reviewer selection process at Alabama, Florida, and
        Mississippi;
    •   inaccurate application scores used by Alabama to award subgrants to new subgrantees;
    •   deficiencies in Puerto Rico’s grant application assessment process; and
    •   lack of supporting documentation for one of five peer reviewers’ scores used to award 1 of the
        87 subgrants at Mississippi.

The Department’s monitoring instrument included steps to assess whether the SEAs awarded subgrants
to eligible entities on a competitive basis and established and implemented a peer review process for
awarding grants. However, the Department’s monitoring instrument did not include steps to assess
whether SEAs (1) verified the peer reviewers’ educational qualifications and professional experience,
(2) implemented processes for ensuring accurate scoring and ranking of subgrant applications,
(3) maintained sufficient documentation to support the award of subgrants, (4) used scoring rubrics or
checklists that verified grant applications’ compliance with State-established eligibility requirements,
and (5) resolved monitoring findings with subgrantees. Lacking these specific steps, the Department’s
monitoring efforts had not previously identified deficiencies in the award process at the SEAs we
reviewed, with the exception of the internal control weaknesses in Alabama’s peer reviewer selection
process. The Department used the same monitoring protocol in 2012.

Internal Control Weaknesses in the Peer Reviewer Selection Process. Alabama, Florida, and
Mississippi did not sufficiently verify the peer reviewers’ educational qualifications and professional
experience before selecting them to evaluate and score 21st CCLC grant applications. 11

9 In May 2012, the Department reported that the Maryland State Department of Education did not have a written monitoring plan and the monitoring process in effect did not adequately integrate technical assistance to subgrantees. In August 2012, the Department reported that the North Dakota Department of Public Instruction did not provide evidence that 21st CCLC subgrants were awarded on a competitive basis, failed to conduct thorough outreach to eligible entities about the award competition, and did not provide written documentation of a comprehensive and detailed peer review process for selecting applications.
10 The four States reviewed were judgmentally selected. See the “Objective, Scope, and Methodology” section of the report for more detail on the selection process.
11 States select their own peer reviewers based on each State’s selection criteria.

         Alabama’s Request for Application for FY 2011 required entities that applied for 21st CCLC
         grants to submit the name and contact information of a person to be a peer reviewer. Alabama
         preferred that the entities select an individual with some knowledge of the 21st CCLC
         program and the grant reviewing process. Subsequently, Alabama approved and selected all
         the individuals referred by the grant applicants as peer reviewers without verifying the peer
         reviewers’ educational qualifications and professional experience or requiring any form of
         assurances from the peer reviewers that they had no potential conflicts of interest. We
        reviewed expenditure information for 5 of 93 Alabama subgrantees and found that
        2 subgrantees paid the peer reviewers whom the subgrantees had referred. 12 Although the peer
        reviewers did not evaluate and score the applications of the subgrantees from which they
        received payment, they scored applications that were in competition with their employer’s
        application. In response to our finding, Alabama stated that it had implemented updated
        policies and procedures that ensure that Alabama obtains evidence of the peer reviewers’
        qualifications and professional experience, such as a resume or curriculum vitae; assesses each
        peer reviewer’s qualifications; and obtains conflict of interest assurances from peer reviewers.

        Florida required peer reviewer applicants to complete a Web-based peer reviewer profile form
        that required the applicants to describe their experience in education or a related field and their
        familiarity with Federal education programs and program reviews. Florida selected the peer
        reviewers who (1) met the desired expertise and experience based on information submitted on
        their applications, (2) completed a conflict of interest form, and (3) completed mandated online
        training. However, Florida officials acknowledged that they did not require peer reviewer
        applicants to provide evidence of their educational qualifications and professional experience.
        We reviewed the documentation supporting the information for 5 of Florida’s 147 peer
        reviewers. For one of the five, Florida could not locate the conflict of interest form to verify
        the documentation supporting compliance with Florida’s qualification criteria. We did not find
        any subgrantee payments to Florida’s peer reviewers in our review of expenditure information
        for 1 of 121 Florida subgrantees. In response to our finding, Florida stated that it did not agree
        that it was necessary to require evidence of peer reviewers’ qualifications and professional
        experience because many were known to Florida or had been recommended by reliable sources
        and because Florida did not pay its peer reviewers for their services. In response to our finding
        that a conflict of interest form was missing, Florida stated that in FY 2012, it implemented an
        online system to collect and upload the peer reviewers’ conflict of interest forms.

        Mississippi’s 21st CCLC program coordinator stated that peer reviewers were selected from a
        pool of service providers. To be added to the pool of peer reviewers, applicants submitted a
        resume and an application; Mississippi reviewed the information submitted in the resume and
        application, and the deputy superintendent’s office approved the addition to the pool of
        reviewers. According to the coordinator, the selection of peer reviewers for the 21st CCLC
        program application review gave preference to applicants who had experience with after-school
        programs, school improvement, teaching, and school administration and to personnel from
        community and faith-based organizations. Although peer reviewers were required to sign a
        confidentiality agreement and conflict of interest statement, Mississippi officials acknowledged
        that they did not verify or obtain evidence of the peer reviewers’ educational qualifications and
        professional experience. In addition, Mississippi did not have written policies and procedures
        that specified the method for entry to the pool of 21st CCLC program peer reviewers and the
        minimum qualifications (selection criteria) wanted from the peer reviewers. In response to our
        finding, Mississippi provided a revised written peer reviewer selection process with added
        procedures that included (1) peer reviewer selection criteria, (2) a requirement for verification
        of teachers’ licenses in a Mississippi database, as applicable, and (3) a requirement for
        contacting peer reviewers’ references for verification of qualifications and experience.


12 We reviewed information of five Alabama subgrantees’ expenditures to determine whether any of the peer reviewers who participated in the grant award competition received payments from the grant applicants.

Title IV, Part B, Section 4204(e) of the ESEA requires SEAs to use a peer review process or other
methods to ensure the quality of their review of applications for subgrants. The Department’s
21st CCLC Non-Regulatory Guidance, Section F-26, encourages SEAs to seek qualified individuals to
review applications and to consider soliciting potential reviewers from a large array of organizations to
develop a pool of highly-qualified reviewers and thereby ensure that quality applicants are chosen as
grantees. The 21st CCLC Guidance also suggests that SEAs consider potential conflicts of interest that
may arise in selecting peer reviewers.

Inaccurate Application Scores to Award Grants to New Subgrantees. Alabama did not have
sufficient controls to ensure the accuracy of the scoring process used to award the 21st CCLC grants to
new subgrantees for FY 2011. Out of 96 entities that applied for new 21st CCLC grants in FY 2011,
51 received grant awards. We reviewed the calculation of scores 13 for 13 of the 96 applications (7 of
the 13 reviewed received grants) and found that the scores for 5 of the 13 applications were not added
correctly. Two of the five applications with incorrect scores received grants. However, the scoring
errors did not affect the overall results of the grant award competition. Without sufficient controls
over the scoring of applications, Alabama risks awarding future funds to grantees that may not be the
most qualified.

We also found that one of the three entities that did not receive a grant award received a low score (one
out of eight possible points) in 1 of 18 sections of the application. Alabama did not provide the
sustainability section of the application to one of the three peer reviewers. The peer reviewer who did
not receive the complete application scored the section even though it was not provided, giving it a
score of one out of eight possible points. The other two peer reviewers reviewed the sustainability
section and scored it higher (one reviewer gave it a six and the other gave it a seven out of eight
possible points). Alabama did not select the application for a grant based on the application’s overall
score, which missed the minimum average score needed to be eligible for an award by five points. If
the peer reviewer had received the complete application and scored the sustainability category
consistent with the other two readers, the applicant may have been eligible to receive the grant.

According to 34 C.F.R. Section 76.770, States must have procedures for reviewing and approving
applications for subgrants, and according to Section 76.731, States and subgrantees must keep records
to show compliance with program requirements.

In response to our finding, Alabama stated that it will implement updated grant award procedures in
the next 21st CCLC grant award competition. The updated procedures will include the electronic
calculation of application scores, and Alabama will recalculate the electronic scores for a sample of
applications to ensure accurate scores.
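
The recalculation step Alabama proposed can be illustrated with a short sketch. The structure below (reviewers’ category points summed to a cumulative score, with totals rechecked for a sample of applications) follows the scoring process described in footnote 13; the data layout, function names, and sample values are hypothetical.

    import random

    def cumulative_score(category_points: dict) -> int:
        """One reviewer's cumulative score: the sum of the category points."""
        return sum(category_points.values())

    def recheck_sample(applications: dict, sample_size: int) -> None:
        """Recalculate recorded totals for a random sample of applications."""
        sample = random.sample(list(applications), min(sample_size, len(applications)))
        for app_id in sample:
            for reviewer, sheet in applications[app_id].items():
                expected = cumulative_score(sheet["category_points"])
                if sheet["recorded_total"] != expected:
                    print(app_id, reviewer, "recorded", sheet["recorded_total"],
                          "recomputed", expected)

    # Hypothetical scoring sheets; the second reviewer's total was added incorrectly.
    applications = {
        "APP-001": {
            "reviewer_1": {"category_points": {"need": 7, "plan": 6, "sustainability": 6},
                           "recorded_total": 19},
            "reviewer_2": {"category_points": {"need": 8, "plan": 7, "sustainability": 7},
                           "recorded_total": 21},   # correct sum is 22
        },
    }
    recheck_sample(applications, sample_size=1)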

13 To award 21st CCLC grants to eligible entities for the FY 2011 competition, Alabama used teams of three peer reviewers to score and rank all applications. Each team of peer reviewers received three applications, and each member of the team reviewed and scored the three applications. Each category in an application was given points on a scoring sheet to arrive at a cumulative score per application, known as the Absolute Criterion Score.

Deficiencies in the Applications’ Assessment Process. Puerto Rico did not comply with its guidelines
for awarding 21st CCLC grants; as a result, it approved an application for a grant that did not meet the
established eligibility requirements. Puerto Rico’s 21st CCLC Program Guide provided instructions
for conducting an assessment of subgrant applications’ eligibility. According to the Guide, grant
applications that propose the purchase of brand-name equipment or material from an exclusive
distributor will be rejected. However, we found that in FY 2010, Puerto Rico approved an application
proposing the purchase of brand-name products—800 brand-name software licenses at a cost of
$10,250 and 165 brand-name handheld devices at a cost of $98,175, representing about 27 percent of
the $396,745 proposed project cost. As such, the subgrantee’s application should have been rejected
without further consideration based on Puerto Rico’s established criteria.
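
(For reference, the two brand-name purchases total $10,250 + $98,175 = $108,425, and $108,425 ÷ $396,745 ≈ 0.273, consistent with the approximately 27 percent of the proposed project cost cited above.)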

The checklist that Puerto Rico used to assess applications did not include a step to assess whether the
applications included a proposal for the purchase of brand-name products from an exclusive
distributor. In response to our finding, Puerto Rico stated that it included steps in a revised application
assessment checklist to determine whether grant applicants complied with the eligibility requirements.
However, we reviewed the revised checklist and did not find a step to assess whether the application
proposed the purchase of brand-name products from an exclusive distributor.

Lack of Supporting Documentation for a Peer Reviewer’s Score. Mississippi awarded 21st CCLC
grants to eligible entities for the FY 2011 competition using a peer review process that consisted of
scoring and ranking grant applications. However, Mississippi did not maintain documentation of one
of the five peer reviewers’ scores for one of the six funded grant applications we reviewed. 14 In response to
our finding, Mississippi stated that the peer reviewers will begin to input their scores into electronic
scoring rubrics, and Mississippi’s 21st CCLC program office will maintain electronic files of all
scoring rubrics so the documentation will be available when needed.

SEAs’ Processes to Monitor 21st CCLC Program Subgrants Need to be Improved

We found deficiencies in Alabama, Mississippi, and Puerto Rico’s monitoring processes. The
Department also identified and included in monitoring reports the deficiencies discussed below, with
the exception of the monitoring deficiencies we found at Puerto Rico. At the time of our review, the
Department was in the process of either obtaining SEA responses and CAPs or evaluating the ones
received.

        Alabama did not have a formal monitoring plan for all types of entities that received
        21st CCLC subgrants. In its March 2011 monitoring report, the Department reported that
        Alabama’s monitoring plan included monitoring of each LEA subgrantee once every 3 years as
        a part of Alabama’s overall Federal Programs monitoring process. However, it did not include
        monitoring of non-LEA subgrantees such as community-based organizations, and other public
        and private entities that received 21st CCLC subgrants from Alabama. According to PPICS, of
        the 51 Alabama subgrantees that received new grant awards in FY 2011, 46 were LEAs or
        school districts, 4 were community-based organizations, and 1 was a faith-based organization.
        Of the 42 Alabama subgrantees that received continuation grant awards, 34 were LEAs or
        school districts, 7 were community-based organizations, and 1 was a nonprofit organization. In
        an April 2011 response to the Department’s monitoring report, Alabama proposed
        implementing a revised monitoring plan that included (1) onsite monitoring for each
        subgrantee, including non-LEA subgrantees, on a rotating 3-year cycle starting in FY 2012, and
        (2) remote reviews of subgrantees during the years of the cycle when onsite monitoring does
        not occur.

14 We reviewed 6 of 19 applications that Mississippi awarded grants in FY 2011. Five peer reviewers reviewed and scored each application for a 21st CCLC subgrant.

         Mississippi did not conduct regular systematic monitoring of its 21st CCLC subgrantees in
         FY 2011. Mississippi’s “Consolidated Federal Program Monitoring Protocol” applied to Title I
         grants. The Protocol required monitoring of all 87 subgrantees in FY 2011. In a January 2011
         monitoring report, the Department reported that Mississippi did not have a monitoring process
         that was specific to the 21st CCLC program. In response to the Department, Mississippi
         revised its monitoring policies and procedures to include a monitoring plan and protocol
         specific to the 21st CCLC program. The revised monitoring plan still required Mississippi to
         monitor all 87 subgrantees in FY 2011. However, Mississippi monitored only 14 of the 87
         subgrantees. Mississippi did not dispute that it had not completed all planned FY 2011
         monitoring, but stated that it was on schedule to complete its FY 2012 monitoring activities in
         accordance with its policies, procedures, and monitoring plan, and it provided a summary of the
         activities completed. The summary indicated that Mississippi had completed 36 monitoring
         activities for FY 2012 as of March 2012.

         Puerto Rico monitored subgrantees but did not resolve monitoring findings with one subgrantee
         within required time frames to ensure that corrective actions were implemented to address
         findings in a monitoring report. In June 2011, Puerto Rico reviewed one of the subgrantee’s
         FY 2010 invoices and issued a monitoring report in July 2011 requiring the subgrantee to
         submit its comments or a CAP within 30 days. 15 Puerto Rico awarded a total of $590,683 to
         the subgrantee in FY 2010 and $753,842 in FY 2011 for the continuation of the project. After
         we discussed the issue with Puerto Rico officials, Puerto Rico sent a follow-up email to the
         subgrantee in November 2011. The subgrantee responded to Puerto Rico’s findings in
         November 2011, but as of February 2012, the monitoring report’s findings had not been fully
         resolved. The Department did not identify the deficiency in Puerto Rico’s monitoring process
         in its August 2011 onsite monitoring report. In response to our finding, Puerto Rico stated that
         it will include in its monitoring guides and instruments a specific timeline for subgrantees to
         respond to monitoring reports, and will inform subgrantees of the possible consequences of not
         responding within the specified time, which could include the return of funds.

         We also found that Puerto Rico did not conduct a comprehensive Statewide evaluation of the
         effectiveness of the 21st CCLC program and activities implemented during FY 2011. The
         Department issued a monitoring visit report to Puerto Rico in August 2011 and, based on the
         Department’s finding that the program evaluations Puerto Rico’s contractor conducted in
         FY 2010 were inadequate, Puerto Rico delayed action to determine what corrective actions
         were required so that it could implement those actions in a new external program evaluation
         contract. As a result, Puerto Rico did not evaluate the effectiveness of the FY 2011 21st CCLC
         program. In its September 2011 response to the Department’s monitoring report, Puerto Rico
         submitted a CAP stating that it was in the process of procuring a new contractor for external
         program evaluations. In response to our finding, Puerto Rico provided documentation
         indicating that in March 2012 it initiated an internal process for contracting an external
         evaluator. However, as of March 2013, Puerto Rico had not awarded a contract.

15 The subgrantee is the same one that included brand-name products in its application, as discussed earlier in this finding. One of Puerto Rico’s monitoring findings was that the subgrantee had purchased 50 units of brand-name handheld devices at a cost of $50,000 without obtaining three quotes.

In addition, we found that Florida has reasonable processes and controls to monitor subgrants.
However, we found that Florida can improve on the timeliness of its monitoring reports. (See the
“Other Matters” section in this report for more details.) Florida used a tiered approach for monitoring
subgrantees with a focus on two types of monitoring activities: (1) subgrantees’ self-evaluations and
(2) onsite and desktop monitoring. Florida reviewed self-evaluations conducted by all Florida’s
subgrantees using a set of documents called work papers 16 that assessed the subgrantees’ level of
compliance with the requirements for 21st CCLC funding. Florida conducted onsite 17 and desktop 18
monitoring for a sample of subgrantees selected based on a risk analysis. Florida completed
10 desktop monitoring reviews and 10 onsite monitoring site visits in FY 2011. In addition, Florida
evaluated the quality and effectiveness of Florida’s 21st CCLC programs by (1) analyzing qualitative
and quantitative data submitted by all subgrantees through completion of data collection instruments
and (2) site visits to 21st CCLC programs (subgrantees’ community learning centers) selected based on
a number of risk factors.
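
As an illustration of the risk-based selection Florida describes, the sketch below scores subgrantees on a few risk factors and picks the highest-risk ones for onsite or desktop review. The factors named in the report (year of program funding and findings from previous reviews) are reflected, but the weights, the additional factor, and the number of review slots are hypothetical.

    def risk_score(subgrantee: dict) -> int:
        """Higher score means higher priority for onsite or desktop monitoring."""
        score = 0
        if subgrantee["funding_year"] == 1:          # first-year programs
            score += 3
        score += 2 * subgrantee["prior_findings"]    # findings from previous reviews
        if not subgrantee["reports_on_time"]:        # late or missing reports
            score += 1
        return score

    def select_for_monitoring(subgrantees: list, slots: int) -> list:
        """Pick the highest-risk subgrantees up to the number of review slots."""
        return sorted(subgrantees, key=risk_score, reverse=True)[:slots]

    subgrantees = [
        {"name": "Center A", "funding_year": 1, "prior_findings": 0, "reports_on_time": True},
        {"name": "Center B", "funding_year": 3, "prior_findings": 2, "reports_on_time": False},
        {"name": "Center C", "funding_year": 2, "prior_findings": 0, "reports_on_time": True},
    ]
    for s in select_for_monitoring(subgrantees, slots=2):
        print(s["name"], risk_score(s))   # Center B (score 5), then Center A (score 3)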

According to 34 C.F.R. Section 80.40(a), grantees are responsible for managing the day-to-day
operations of grant and subgrant supported activities covering each program, function, or activity,
including subgrant supported activities, to ensure compliance with applicable Federal requirements and
that grantees and subgrantees are achieving performance goals.

Title IV, Part B, Section 4203(a)(13) of the ESEA requires a State to describe the process it will use to
evaluate the effectiveness of programs and activities carried out with 21st CCLC program funds.
According to the Department’s 21st CCLC Non-Regulatory Guidance, Section H-5, States must
conduct a comprehensive evaluation (directly or through a grant or contract) of the effectiveness of
programs and activities provided with 21st CCLC funds.

RECOMMENDATIONS

We recommend that the Assistant Secretary for OESE require the AIPG to—

2.1   Provide sufficient monitoring and oversight of SEAs’ processes to award and monitor 21st CCLC
      formula grants to subgrantees, including assessing the qualifications of peer reviewers, evaluating
      and scoring grant applications, and maintaining sufficient documentation to support the award of
      subgrants.

2.2   Enhance monitoring of SEAs receiving 21st CCLC formula grants to ensure they develop and
      implement sufficient policies, procedures, instruments, and plans that allow the SEAs to
      promptly resolve monitoring findings with subgrantees.

2.3   Assess whether additional technical assistance would benefit SEAs in the areas of peer reviewer
      selection, processes for awarding 21st CCLC grants, subgrantee monitoring, and conducting
      program evaluations.

Department Comments

The Department agreed with Finding No. 2, partially agreed with Recommendation 2.1, and agreed
with Recommendations 2.2 and 2.3. In response to Recommendation 2.1, the Department stated its
monitoring instrument and written guidance to 21st CCLC program staff required that they review
various elements related to the SEAs’ selection of peer reviewers and their qualifications, including
criteria for selecting peer reviewers, the vetting process for assessing the peer reviewers’ qualifications,
and the SEAs’ peer reviewers’ comments to ensure that they support the scores. However, the
Department proposed strengthening its monitoring instrument by adding a probe to look into the peer
reviewers’ educational qualifications and professional experience. The Department also agreed to
examine ways to sufficiently sample, test, and provide technical assistance on common problems with
State scoring and evaluation documentation.

The Department agreed with Recommendation 2.2 and proposed revising its monitoring instrument, as
necessary, to ensure rigorous oversight of SEAs' policies, procedures, instruments, and plans so
that the SEAs promptly resolve monitoring findings with subgrantees. The Department also agreed
with Recommendation 2.3 and proposed developing a plan to conduct a needs assessment to identify
technical assistance needs specific to program requirements and refine the technical assistance
guidance offered to SEAs.

OIG Response

We did not make any changes to the finding, but made a clarifying change to Recommendation 2.1
based on the Department’s comments. As stated in the finding, the Department’s monitoring
instrument included a question that asked whether the SEA established and implemented a peer review
process for awarding grants. To answer that question, the monitoring instrument called for the review
of certain documents, including the criteria for selecting the peer reviewers and peer reviewers’
comments. However, the Department’s monitoring instrument did not include steps to test to what
extent the SEAs verified the peer reviewers’ educational qualifications and professional experience.
We modified our recommendation to clarify that the Department's monitoring and oversight
procedures should include a review of how States are assessing peer reviewers' qualifications.
The Department's implementation of the proposed corrective actions in
response to our finding and related recommendations should help the Department improve its oversight
of the SEAs’ processes to award and monitor 21st CCLC program subgrants.



                                        OTHER MATTERS


The Department Should Evaluate its Performance Standard for Issuing Monitoring Reports

The Department did not meet its performance standard in issuing site visit monitoring reports to the
four SEAs we reviewed. Report issuance ranged from 5 to 109 days beyond the 45-business-day
standard in the Department’s Guide. In addition, the Department did not officially notify the SEAs
that it could not send them monitoring reports within the established time frames. The Department’s
Monitoring Logistics Guide states that the team leader for the 21st CCLC program should mail the
final monitoring reports to the SEAs within 45 business days of the last day of the site visits. The
Guide also states that the Department should notify the SEAs if the team leader cannot send the reports
within the 45 business days due to legal or other issues. However, the 45-day standard in the Guide is
inconsistent with the established GPRA performance target of 40 days for FY 2010. In its
FY 2011 GPRA indicators, the established target was 35 days. The GPRA indicators are intended to
measure the Department’s progress in achieving the objective of improving the operational efficiency
of the 21st CCLC program. As a result, even if the Department had met the standard established in its
Guide for issuing monitoring reports to SEAs during the two fiscal years, the Department would not
have met its GPRA performance target.

According to the former AIPG group leader, the delays in issuing the monitoring reports to the SEAs
were primarily due to limited personnel. Delays in issuing site visit monitoring reports and notifying
the SEAs may result in delays in the SEAs implementing corrective actions needed to achieve program
goals and ensure effective administration of 21st CCLC grants in compliance with applicable laws and
regulations.

We suggest that the Assistant Secretary for OESE evaluate the current GPRA performance target for
issuing site visit monitoring reports and determine whether the target is reasonable or should be
adjusted to reflect the process necessary to issue a report.

In its response to the draft report, the Department stated it is coordinating the monitoring and technical
assistance across program staff and tracking completion and report dissemination dates. This effort
will help the Department determine whether the 45-day standard for monitoring report issuance is
reasonable and identify any substantive barriers to achieving the standard.

The Department Should Encourage SEAs to Issue Timely Monitoring Reports

Florida issued untimely final onsite monitoring reports to four of seven subgrantees for which it
conducted onsite monitoring during FY 2010. 19 Florida’s 21st CCLC Policy, Monitoring, and
Compliance Unit Standard Operating Procedures required Florida to issue final monitoring reports to
subgrantees within 45 calendar days of the date of the preliminary report, or 45 calendar days from the
date of receiving the subgrantee’s request for reconsideration of findings. However, we found that
Florida issued the final onsite monitoring reports to the four subgrantees from 25 to 83 days beyond
the established 45-calendar-day target.

We suggest that the Assistant Secretary for OESE encourage SEAs to communicate monitoring
findings timely to subgrantees to facilitate corrective actions needed to ensure that subgrantees spend
funds for their intended purposes and achieve program goals.

In its response to the draft report, the Department stated that it hopes that renewed efforts to meet its
own 45-day reporting standard and future SEA technical assistance opportunities will enhance existing
monitoring and oversight efforts and serve to encourage SEAs to issue timely monitoring reports.




19
     Florida’s monitoring activities included onsite monitoring of 7 of its 131 subgrantees during FY 2010.



                           OBJECTIVE, SCOPE, AND METHODOLOGY


The objectives of the audit were to (1) determine whether the Department effectively monitored and
tracked program performance measures for 21st CCLC grantees to ensure that grantees met program
objectives and (2) assess the processes and controls that four selected SEAs used to award and monitor
subgrants.

To evaluate the Department’s monitoring and tracking of 21st CCLC performance measures, we
reviewed the Department’s monitoring of SEAs during FY 2011, from October 1, 2010, through
September 30, 2011. In addition, we reviewed the most recent monitoring reports the Department
issued to the four selected SEAs (the reports were dated from November 30, 2009 through August 18,
2011). We also reviewed the SEAs’ responses to the monitoring reports and the Department’s
acceptance or rejection of the proposed corrective actions as of November 21, 2011. Further, we
reviewed the Department’s monitoring reports to Maryland issued in May 2012 and to North Dakota
issued in August 2012 because the Department identified significant issues in both States’ processes
for awarding and monitoring subgrants relevant to our work on the 21st CCLC program.

To evaluate the SEAs’ processes to award and monitor subgrants, we judgmentally selected four
SEAs—Alabama, Florida, Mississippi, and Puerto Rico. We selected Alabama and Florida based on
potential risks identified through available information on ongoing related work, and Mississippi and
Puerto Rico based on recommendations made by the Department due to issues it identified in
monitoring site visits. We reviewed the processes the selected SEAs used to award and monitor
subgrants during the FY 2011 period. We expanded our scope at Florida and Puerto Rico based on
circumstances related to those SEAs’ grant award competitions. Specifically, Florida did not perform
a grant award competition for new subgrants during the period reviewed; as such, we expanded our
scope and reviewed the processes Florida used to award subgrants in FY 2010. In addition, we
conducted a limited review of the grant award processes Puerto Rico used in FY 2010 because 20 of its
FY 2010 subgrant applications were reevaluated in FY 2011 20 and because Puerto Rico issued a report
in FY 2011 that identified deficiencies that occurred in FY 2010.

20
     We had selected 2 of the 20 FY 2010 subgrant applications that Puerto Rico reevaluated in FY 2011 for our review.

To accomplish our objectives, we—
       •   Assessed the Department’s written policies, procedures, monitoring plans, and monitoring
           instruments for monitoring SEAs and tracking program performance measures.
       •   Reviewed monitoring reports the Department issued to the four selected SEAs and the
           proposed corrective actions for findings related to the SEAs’ grant award processes and
           subgrantee monitoring.
       •   Reviewed a contract the Department awarded to a third party to manage PPICS and analyze the
           performance data, and a contract the Department awarded to a third party to support the
           Department’s monitoring and evaluation activities for the 21st CCLC program.
       •   Gained an understanding of the PPICS’ internal data validation controls and of the processes
           the four selected SEAs used to validate PPICS data submitted by subgrantees.


   •   Assessed the SEAs’ written policies, procedures, and processes for awarding 21st CCLC
       program subgrants, requests for proposals, scoring rubrics, and application assessment tools.
   •   At each SEA, reviewed the records of the professional qualifications for a selection of peer
       reviewers, and interviewed some of the peer reviewers from Alabama and Puerto Rico. The
       selection criteria of peer reviewers’ records varied by SEA. (See Tables 2 through 5 below for
       more detail on the selection criteria used to select the peer reviewers’ records at each SEA.)
   •   Reviewed a selection of funded and unfunded subgrant applications submitted to the four SEAs
       and records showing peer reviewers’ scores given to those applications. The selection criteria
       of funded and unfunded applications varied by SEA. (See Tables 2 through 5 below for more
       detail on the selection criteria used to select the subgrant applications at each SEA.)
   •   Recalculated the scores peer reviewers gave to the applications reviewed at each SEA and
       verified that the scores were accurately calculated and adequately supported, and that SEAs
       ranked the applications consistent with the scores the applications received.
   •   At Alabama, Florida, and Puerto Rico, reviewed expenditure information for a judgmental
       selection of subgrantees. Our review was limited to verifying whether the subgrantees issued
       any payments to the peer reviewers who evaluated 21st CCLC subgrant applications during the
       grant award competition that the subgrantees participated in. The basis of the judgmental
       selection of subgrantees for reviewing their expenditure information varied by SEA. (See
       Tables 2 through 5 below for more detail on the selection criteria used to select the
       subgrantees’ expenditure information at each SEA.)
   •   Assessed the SEAs’ written policies, procedures, monitoring plans, and monitoring instruments
       for monitoring subgrantees.
   •   Reviewed monitoring and program evaluation reports issued by the SEAs, and the proposed
       corrective actions for subgrantees.
   •   Performed limited tests of the accuracy of selected data elements reported by two Alabama
       subgrantees in their FY 2011 Annual Performance Reports. We judgmentally selected
       Alabama to determine the effect of weaknesses in monitoring subgrantees. In conducting the
       tests on the accuracy of data reporting, we judgmentally selected the only two community-
       based organization subgrantees within the seven funded applications reviewed based on the
       availability of expenditure information at the SEA. For both subgrantees, we reviewed
       documents provided by the subgrantees in support of data elements reported in their Annual
       Performance Reports, including the total number of students who participated in the program
       during the reporting period, the number of students who participated in the program by grade
       level, the number of reported community partners who contributed to the program, and the
       number of paid staff. We also verified whether the two subgrantees met one of the program
       performance objectives each reported as met in their Annual Performance Reports.
    •   Interviewed Department AIPG officials with responsibility over the 21st CCLC program and
        officials from the four SEAs reviewed.
    •   Conducted a limited assessment of the Department’s and the SEAs’ internal controls significant
       to our audit objectives.

We used electronic data for sampling purposes; we did not use electronic data to develop report
findings and conclusions. Specifically, we selected the examples of the items reviewed randomly and
judgmentally from nonstatistical samples of data provided by the States. We did not test the data
provided by the States for completeness, but conducted tests to assess the accuracy of the data as
outlined in the report. The results presented in this audit report are based on our review of the selected
samples and cannot be projected to the universe of the items reviewed. The universe of the various
items tested and the selection methodology are described in more detail by SEA in Tables 2 through 5.
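
To illustrate the nonstatistical selection approach described above, the following sketch shows how a
combined random and judgmental selection could be drawn. It is purely illustrative: the universe size,
item identifiers, and risk flags below are hypothetical and are not taken from the audit.

    import random

    # Hypothetical universe of 51 subgrant applications; the identifiers and
    # risk flags are invented for illustration only.
    universe = [{"id": "APP-%03d" % i, "high_risk": i % 17 == 0} for i in range(1, 52)]

    def select_examples(items, n_random, seed=0):
        """Nonstatistical selection: a random draw plus judgmental additions."""
        rng = random.Random(seed)
        random_picks = rng.sample(items, n_random)           # random component
        judgmental = [x for x in items if x["high_risk"]]    # judgment-based component
        # Combine the two, dropping any judgmental item already drawn at random.
        selected = {x["id"]: x for x in random_picks + judgmental}
        return list(selected.values())

    sample = select_examples(universe, n_random=5)
    print(len(sample), sorted(x["id"] for x in sample))

As with the selections described in this report, results from such a nonstatistical sample cannot be
projected to the universe of items reviewed.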

                      Selection Methodology of Examples of Items Reviewed

                                           Table 2: Alabama
                                            Number of Examples
        Item                Universe              Selected                  Selection Methodology
                                                                      5 Random
Funded Subgrant                51                     7               2 Judgmental selections based on
Applications                                                          potential risks identified in our
                                                                      review of the grant awards.
                                                                      5 Random
Unfunded Subgrant              45                     6               1 Judgmental selection based on
Applications                                                          potential risks identified in our
                                                                      review of the grant award.
                                                                      10 Random
                                                                      4 Judgmental selections based on
Peer Reviewer                  96                     14              community-based organizations
Records                                                               with new subgrant referrals
                                                                      (5 referrals, but 1 was already
                                                                      included in the random sample).
                                                                      Judgmental selections based on
Expenditure                    51                     5               availability of expenditure
Information                                                           information for the community-
                                                                      based organizations.
Monitoring Reports             11                     5               Random
Program Evaluation
Reports                        93                     5               Random
                                                                      Judgmental selections of the
Subgrantees’ Annual
                                7                     2               community-based organizations in
Performance Reports
                                                                      our sample of funded applications.

                                            Table 3: Florida
                                            Number of Examples
         Item               Universe              Selected                 Selection Methodology
Funded Subgrant
Applications                   59                     3               Random
Unfunded Subgrant
Applications                   80                     3               Random
Peer Reviewer
Records                       147                     5               Random
Expenditure                    3                      1               Judgmental selection based on
Information                                                           largest amount of funds received.
                           FY 2010: 7              FY10: 1            Random
Monitoring Reports
                          FY 2011: 20              FY11: 2
Program Evaluation       FY 2010: 33              FY 2010: 2          Random
Reports                   FY 2011: 162            FY 2011: 2
                                          Table 4: Mississippi
                                           Number of Examples
         Item              Universe               Selected                Selection Methodology
Funded Subgrant              19                      3                            Random
Applications
Unfunded Subgrant             20                     3                            Random
Applications
Peer Reviewer
Records                       10                     2                            Random
Monitoring Reports             3                     3                             N/A

                                          Table 5: Puerto Rico
                                           Number of Examples
         Item              Universe               Selected                Selection Methodology
Funded Subgrant
Applications                  16                     3               Random
Unfunded Subgrant
Applications                  54                     3               Random
Peer Reviewer
                               6                     6               Entire Universe
Records
                                                                     Judgmental selection based on
Expenditure
                              77                     1               potential issues identified with one
Information
                                                                     subgrantee’s use of funds.
                                                                     3 Random
                         26 Site Visits         3 Site Visits        1 Judgmental selection based on
Monitoring Reports
                          18 Invoices            1 Invoice           potential issues identified with one
                                                                     subgrantee’s use of funds.
                        No information
                        available after              1               Judgmental selection based on
Program Evaluation         contract                                  potential issues identified with one
Reports                  expired with                                subgrantee’s use of funds.
                         the external
                          evaluator

During fieldwork, we visited the Department’s AIPG offices located in Washington, D.C., and
performed site visits to the four SEAs selected for review—Alabama on June 20–24, 2011; Florida on
June 7–10, 2011; Mississippi on June 20–24, 2011; and Puerto Rico on June 20–24, 2011. We also held
exit conferences with the Department on March 26, 2012, and the four SEAs during the month of
March 2012, and discussed the results of our review. In addition, we obtained written comments from
the four SEAs on the preliminary audit results discussed during the exit conferences and summarized
their comments in the body of this audit report. We provided the Department with the
preliminary audit results for the four SEAs and copies of their written comments.

We conducted this performance audit in accordance with generally accepted government auditing
standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate
evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives.
We believe that the evidence obtained provides a reasonable basis for our findings and conclusions
based on our audit objectives.



                              ADMINISTRATIVE MATTERS


Corrective actions proposed (resolution phase) and implemented (closure phase) by your office will be
monitored and tracked through the Department’s Audit Accountability and Resolution Tracking
System (AARTS). Department policy requires that you develop a final CAP for our review in the
automated system within 30 days of the issuance of this report. The CAP should set forth the specific
action items and targeted completion dates necessary to implement final corrective actions on the
findings and recommendations contained in this final audit report.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector General
is required to report to Congress twice a year on the audits that remain unresolved after 6 months
from the date of issuance. Statements that managerial practices need improvements, as well as
other conclusions and recommendations in this report, represent the opinions of the Office of
Inspector General. Determinations of corrective action to be taken will be made by the appropriate
Department of Education officials.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the Office of
Inspector General are available to members of the press and general public to the extent information
contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given us during this review. If you have any questions, please call
Denise Wempe at (404) 974-9416.

                                            Sincerely,

                                            /s/

                                            Patrick J. Howard
                                            Assistant Inspector General for Audit


Attachments


                                                                             Attachment 1

              Acronyms, Abbreviations, and Short Forms Used in This Report

21st CCLC            21st Century Community Learning Centers

AIPG                 Academic Improvement Programs Group

Alabama              Alabama State Department of Education

CAP                  Corrective Action Plan

C.F.R.               Code of Federal Regulations

Department           U.S. Department of Education

ESEA                 Elementary and Secondary Education Act of 1965, as amended by the
                     No Child Left Behind Act of 2001

Florida              Florida Department of Education

GPRA                 Government Performance and Results Act of 1993, as amended by the GPRA
                     Modernization Act of 2010

LEA                  Local Educational Agency

Mississippi          Mississippi Department of Education

OESE                 Office of Elementary and Secondary Education

OIG                  Office of Inspector General

PPICS                Profile and Performance Information Collection System

Puerto Rico          Puerto Rico Department of Education

SEA                  State Educational Agency

Title I              Title I, Part A of the ESEA


                                                                                                  Attachment 2


                                            WRITTEN COMMENTS
                         FROM THE OFFICE OF ELEMENTARY AND SECONDARY EDUCATION
                                  IN RESPONSE TO THE DRAFT AUDIT REPORT,
                    U.S. DEPARTMENT OF EDUCATION'S AND SELECTED STATES' OVERSIGHT OF
                THE 21ST CENTURY COMMUNITY LEARNING CENTERS PROGRAM, ED-OIG/A04L0004
                                               May 20, 2013

           The Office of Elementary and Secondary Education (OESE) appreciates the opportunity to
           provide written comments on the Draft Audit Report entitled, "U.S. Department of Education's
           and Selected States' Oversight of the 21st Century Community Learning Centers Program," ED-
           OIG/A04L0004, dated March 21, 2013 (Draft Audit Report).

           Our comments will help clarify the additional steps we are taking to provide effective
           monitoring and oversight of the 21st Century Community Learning Centers program (21st CCLC).
           We believe these steps will help ensure that the funds Congress appropriates for this grant
           program are used to significantly improve the quality of out-of-school time services that
           support student academic achievement in core subject areas.

          OESE's comments on this Draft Audit Report and our Corrective Action Plan follow. Any
          subsequent questions, comments, or concerns should be addressed to:

          Dr. Sylvia Lyles
          Director, Academic Improvement and Teacher Quality
          U.S. Department of Education
          Office of Elementary and Secondary Education
          400 Maryland Avenue, SW
          Washington, DC 20202

          We appreciate your efforts in helping OESE continuously improve the operation of the 21st
          Century Community Learning Centers program.






           FINDING NO. 1- The Department Can Improve Oversight of Program
           Performance Data

           Recommendation 1.1 - Ensure that SEAs implement written policies, procedures, and
           monitoring instruments to sufficiently test the 21" CCLC performance data and provide
           reasonable assurance of the accuracy, reliability, and completeness of data reported to the
           Department.

          Comments. OESE agrees with Finding 1, and agrees with Recommendation 1.1. The Draft Audit
          Report notes that the Department's existing monitoring instrument "assess[es] whether the
          SEAs monitored subgrantees to ensure that PPICS data were accurate and submitted on time"
          (page 7). Moreover, the existing monitoring instrument requires the monitoring team to review
          the SEAs' "1) written communication and guidance to subgrantees regarding PPICS collection
          efforts; 2) quality assurance procedures for data collection; and 3) sanctions for subgrantees
          not in compliance." The protocol further prompts the monitoring team to consider how data
          quality is verified, whether it is verified for all subgrantees, and how often.

          Although we believe that our existing monitoring instrument is adequate, to better ensure that
          SEAs implement written policies, procedures, and monitoring instruments to sufficiently test
          21st CCLC performance data and provide reasonable assurance of its validity and completeness,
          we plan to provide additional training to program staff around assessing SEAs' efforts in this
          area.



          Recommendation 1.2- Add a step in its SEA site visits monitoring instrument to ensure that
          SEAs are including monitoring activities to sufficiently test the reliability of the performance
          data reported to the Department.

          Comments. OESE agrees with Finding 1 but disagrees with Recommendation 1.2. As noted in
          our response to Recommendation 1.1 above, the existing monitoring instruments already
          include a step that requires the monitor to address whether "the SEA monitor[s] subgrantees to
          ensure that the PPICS data are submitted accurately and on time" and to review the SEAs'
          documentation, procedures, guidance, and sanctions for non-compliance, as described in our
          response to Recommendation 1.1 above.

          However, consistent with our response to Recommendation 1.1, we plan to provide additional
          training to 21st CCLC program staff that includes examples of what constitutes sufficient testing
          and more detailed, written guidance articulating how SEAs' subgrantee monitoring plays a
          critical role in ensuring performance data reliability.



          Recommendation 1.3 -Identify promising practices from the Department's monitoring visits
          and communicate those practices to all SEAs.







           Comments. OESE agrees with Finding 1, and agrees with Recommendation 1.3. While past
           monitoring visits have identified SEA promising practices, we acknowledge that it would be
           useful to collect, review, and share SEA promising practices on performance data quality.

          It is important to note that we provide ongoing technical assistance to grantees on program
          requirements and implementation guidance, including information on promising practices.
          During 2012, our annual Summer Institute and quarterly SEA meetings -- led by the Academic
           Improvement and Teacher Quality (AITQ) Program Director -- provided a forum for
           approximately two thousand 21st CCLC SEA coordinators, center directors, and subgrantee
           staffs to discuss and address issues of concern that the 21st CCLC program staff has identified
           through monitoring. For example, one of the most recent meetings (Beyond School Hours
           Conference 2013) featured an overview, guidance, and question-and-answer period with SEA
           coordinators and representatives of local educational agencies (LEAs), community-based
          organizations (CBOs), other subgrantees, and prospective applicants on program eligibility for
          LEAs and non-LEAs. A second example of how the Department is facilitating the sharing of
          promising practices is the ongoing peer-to-peer technical assistance that we have facilitated
          since 2005 amongst SEAs around the effective use of SEA evaluations to help SEAs address
          concerns that they are spending significant award funds for program evaluations but are not
          using them to inform program improvements.

           Beginning in mid-2013, in response to the Department's new policy limiting the cost and scale
           of large meetings like the Summer Institute, the 21st CCLC program has been coordinating small-
           scale Regional Meetings to provide targeted technical assistance. So far, these technical
           assistance sessions, conducted by AITQ program staff, have occurred in February 2013 in
          Jacksonville, FL, and in March 2013 in Richmond, VA. Additional Regional Meetings are
          scheduled for later this year and in 2014, to allow for continued opportunities to share
          promising practices geared towards ensuring effective and efficient program implementation.



          Recommendation 1.4- Develop and implement written policies and standard operating
          procedures for coordinating and conducting key monitoring activities, including approving or
          rejecting CAPs submitted by SEAs, notifying the SEAs of the acceptance or rejection of the
          CAP, and providing technical assistance to the SEAs.

          Comments. OESE agrees with Finding 1, and we partially agree with Recommendation 1.4. It
           should be noted that Recommendation 1.4 is outside the scope of this finding, since it reaches
           beyond Finding 1's focus on testing the accuracy of performance data. This recommendation
           appears to focus on developing and implementing policies and procedures for other types of
           monitoring activities, such as those related to the review and approval of CAPs and providing
           technical assistance to SEAs.

           The 21st CCLC grant program has standard operating procedures in place for coordinating and
           conducting monitoring site visits. The monitoring protocols provide guidance regarding the
           types of evidence that should be reviewed and the factors that should be considered in
           determining compliance.

           Nevertheless, we agree that more detailed guidance regarding how to assess the evidence and
           address issues related to inadequate evidence would be useful. We further agree that our
           monitoring teams could benefit from clear, written guidance on activities such as reviewing SEA
           Corrective Action Plans (CAPs) and assessing the need for and providing appropriate technical
           assistance to SEAs. We describe our plans for improving on these monitoring activities in our
           proposed corrective action plan.



           FINDING 2-The Department Can Improve Oversight of SEAs' Processes to
           Award and Monitor 21st CCLC Program Subgrants


           Recommendation 2.1- Provide sufficient monitoring and oversight of SEAs' processes to
           award and monitor 21" CCLC formula grants to subgrantees, including selecting peer
           reviewers, evaluating and scoring grant applications, and maintaining sufficient
           documentation to support the award of subgrantees.

           Comments. OESE agrees with Finding 2 in that we can improve oversight of SEAs' processes to
           award and monitor 21" CCLC program subgrants, and agrees in part with the specific strategy
           suggested in Recommendation 2.1. The Draft Audit Report states that the Department's
           "monitoring instrument did not include steps to assess whether SEAs (1) verified the peer
           reviewers' educational qualifications and professional experience." The monitoring protocol
            does include the question, "Has the SEA established and implemented a peer review process
            for awarding grants on a competitive basis?" Additionally, the written guidance to 21st CCLC
            staff who conduct monitoring activities requires that they review a number of elements
           related to the selection of peer reviewers and their qualifications, including the criteria for
           selection of peer reviewers, the list of peer reviewers and organization affiliations, conflicts of
           interest, and the vetting process for assessing the qualifications of reviewers. The monitoring
            protocol guidance further instructs program staff to review peer reviewers' comments to
            ensure that they support the scores. However, to strengthen the monitoring protocol, a probe
            regarding the peer reviewers' educational qualifications and professional experience will be
            added. The Department will examine ways to efficiently sample, test, and provide technical
           assistance on common state scoring and evaluation documentation problems.

            Recommendation 2.2- Enhance monitoring of SEAs receiving 21st CCLC formula grants to
           ensure they develop and implement sufficient policies, procedures, instruments and plans
           that allow the SEAs to promptly resolve monitoring findings with subgrantees.






           Comments. OESE agrees with Finding 2 and agrees with Recommendation 2.2. The Draft Audit
           Report appropriately notes that all but one of the deficiencies identified during the audit had
           already been identified by the 21st CCLC program staff and discussed in monitoring reports.
           Further, the 21st CCLC monitoring protocol includes the inquiry, "Does the SEA notify
           subgrantees of recommendations, findings and corrective actions?" The protocol instructs the
          monitoring team to review written SEA procedures for corrective actions and review written
          correspondence to subgrantees regarding findings and corrective actions. We acknowledge,
          however, that the development of clear, written guidance on reviewing SEAs' monitoring
          findings from monitoring reviews of subgrantees for prompt action and resolution would
          improve the overall effectiveness of our monitoring protocol.



           Recommendation 2.3- Assess whether additional technical assistance would benefit SEAs in
          the areas of peer reviewer selection, processes for awarding 21st CCLC grants, subgrantee
          monitoring, and conducting program evaluations.

          Comments. OESE agrees with Finding 2 and agrees with Recommendation 2.3. A systematic
          needs assessment would assist SEAs, some of which have experienced repeated staff turnover
           since 2010, and the 21st CCLC program office by providing reliable data needed to prioritize
          monitoring, program staff training, and technical assistance efforts more effectively.

          We will also continue to utilize and, where appropriate, expand upon our ongoing technical
          assistance efforts, including the annual Summer Institutes, quarterly meetings led by the AITQ
          Program Director, and quarterly monitoring calls that include technical assistance
          opportunities. This ongoing technical assistance has already served as a helpful resource for
          SEAs to gain new knowledge and share strategies for addressing the issues they face.

          Other Matter- The Department Should Evaluate its Performance Standard for Issuing
          Monitoring Reports

          We concur with the former Program Group Leader's statement that limited personnel,
          including the lack of a Team Leader, created challenges in issuing monitoring reports within the
           preferred timeframe. A 21st CCLC Program Group Leader and Team Leader are now in place in
           the 21st CCLC program and they will coordinate monitoring and technical assistance efforts
           across program staff, including tracking completion and dissemination of monitoring reports.
           This coordination effort, along with the monthly team debriefings that are now taking place,
          will help the Department determine, by July 31, whether or not the 45-day standard for
          monitoring report issuance is reasonable and identify any substantive barriers to meeting this
          standard.

          Other Matter- The Department Should Encourage SEAs to Issue Timely Monitoring Reports

           We agree that timely reports improve overall monitoring efforts. We are hopeful that our
           renewed efforts to model timeliness by meeting our own 45-day standard, while also noting the
           benefits of timely reports during future technical assistance opportunities with SEAs, will
           enhance our existing monitoring and oversight efforts and serve to encourage SEAs to issue
           timely monitoring reports.






                                       PROPOSED CORRECTIVE ACTION PLAN
                           FROM THE OFFICE OF ELEMENTARY AND SECONDARY EDUCATION
                                    IN RESPONSE TO THE DRAFT AUDIT REPORT,
                      U.S. DEPARTMENT OF EDUCATION'S AND SELECTED STATES' OVERSIGHT OF
                  THE 21ST CENTURY COMMUNITY LEARNING CENTERS PROGRAM, ED-OIG/A04L0004
                                                 May 20, 2013



             I.     FINDING NO. 1 -The Department Can Improve Oversight of Program Performance
                    Data
                       a. Recommendation 1.1. - Ensure that SEAs implement written policies,
                           procedures, and monitoring instruments to sufficiently test the 21st CCLC
                          performance data and provide reasonable assurance of the accuracy,
                          reliability, and completeness of data reported to the Department.

                          Proposed Corrective Action: We agree with Recommendation 1.1 and plan to
                          provide additional training to program staff around assessing SEAs' efforts in this
                          area.



                       b. Recommendation 1.2- Add a step in its SEA site visits monitoring instrument
                          to ensure that SEAs are including monitoring activities to sufficiently test the
                          reliability of the performance data reported to the Department.

                           Proposed Corrective Action: We disagree with Recommendation 1.2 for the
                           reasons stated in the Written Comments. Nevertheless, as stated in the Written
                           Comments, we plan to provide additional training to 21st CCLC staff.



                       c. Recommendation 1.3 -Identify promising practices from the Department's
                          monitoring visits and communicate those practices to all SEAs.

                          Proposed Corrective Action: We agree with Recommendation 1.3. The 21st CCLC
                           program monitoring visits are intended to identify compliance issues as well as
                          identify promising practices. To that end, program staff will develop a draft plan
                          for an efficient and effective approach to collecting and disseminating promising
                          practices on performance data quality to SEAs by September 1, 2013, to be fully
                          implemented in fiscal year 2014.



                       d. Recommendation 1.4 - Develop and implement written policies and standard
                          operating procedures for coordinating and conducting key monitoring
                          activities, including approving or rejecting CAPs submitted by SEAs, notifying




                                                           7
Final Audit Report
ED-OIG/A04L0004                                                                                          Page 31 of 33

                        the SEAs of the acceptance or rejection of the CAP, and providing technical
                        assistance to the SEAs.

                        Proposed Corrective Action: We partially agree with Recommendation 1.4. In the
                        short-term, the Team Leader shared the Draft Audit Report concerns with
                        program staff during the scheduled monthly team meeting (April 26, 2013). The
                        team is actively conducting monitoring reviews through early June and can begin
                        to take action to address concerns raised in the Draft Audit Report during
                        monitoring visits and desk monitoring reviews scheduled during May and June
                        2013.

                         The new 21st CCLC monitoring contractor has been working closely with the
                         program staff to identify some key areas where patterns of concern have arisen
                         (e.g., SEAs' oversight of fiscal management by community-based organizations
                         and faith-based organizations). The 21st CCLC program's immediate response to
                        those concerns was to augment on-site monitoring teams with contracted
                        monitoring staff who have knowledge of and experience in non-profit financial
                        management in order to effectively monitor how SEAs are providing oversight of
                        this issue.

                         Upon the completion of the spring 2013 monitoring site visits and desk reviews,
                         program staff will work with the Department's contractor to chart and analyze
                         findings across sites visited in 2012 by July 31, 2013. We anticipate using this
                         analysis to inform a longer-term, comprehensive review and revision process of
                         the monitoring protocols and procedures. We propose submitting a draft plan
                         for this robust monitoring protocol revision process by September 1, 2013. This
                        deadline will allow the plan to incorporate all issues that emerge from final
                        monitoring reports of all monitoring reviews conducted in spring 2013 (reports
                        which are scheduled to be completed by late July, based on the current
                        monitoring schedule).

                        We also agree that there is a need to provide technical assistance to SEAs around
                        the CAP process. We plan to incorporate this technical assistance into our
                        proposed comprehensive technical assistance plan described under II. c.

              II. FINDING 2 - The Department Can Improve Oversight of SEAs' Processes to Award and
                  Monitor 21st CCLC Program Subgrants

                     a. Recommendation 2.1- Provide sufficient monitoring and oversight of SEAs'
                        processes to award and monitor 21st CCLC formula grants to subgrantees,
                        including selecting peer reviewers, evaluating and scoring grant applications,
                        and maintaining sufficient documentation to support the award of
                        subgrantees.






                        We agree with Recommendation 2.1 in part. We will strengthen the monitoring
                        protocol by adding a probe regarding the peer reviewers' educational
                        qualifications and professional experience. We agree to examine ways to
                        efficiently sample, test, and provide technical assistance on common problems
                         with state scoring and evaluation documentation.



                     b . Recommendation 2.2 - Enhance monitoring of SEAs receiving 21st CCLC formula
                         grants to ensure they develop and implement sufficient policies, procedures,
                         instruments and plans that allow the SEAs to promptly resolve monitoring
                         findings with subgrantees.

                         Proposed Corrective Action: We agree with Recommendation 2.2. To ensure we
                         continue rigorous oversight and to more thoroughly monitor SEAs' policies,
                         procedures, tools, and plans to resolve subgrantee findings, we will review this
                        specific section of the monitoring protocol to determine whether additional
                        written guidance for the monitoring team is needed. We propose to incorporate
                        any necessary additions, revisions, and written guidance around these issues as
                        part of the monitoring protocol revision process described under
                        Recommendation 1.4 by July 31, 2013.




                      c. Recommendation 2.3- Assess whether additional technical assistance would
                        benefit SEAs in the areas of peer reviewer selection, processes for awarding
                        21st CCLC grants, subgrantee monitoring, and conducting program evaluations.

                        Proposed Corrective Action: We agree with Recommendation 2.3. We propose a
                         two-pronged approach to technical assistance. First, we will develop a plan for
                        conducting a needs assessment of current technical assistance needs specific to
                        program requirements. We propose to complete a needs assessment plan by
                        June 30, 2013. We anticipate the needs assessment will include feedback from
                         SEA coordinators and center directors of multiple subgrants. We will ensure that
                        the needs identified and addressed include support to SEAs around corrective
                        action plans, as suggested in Recommendation 1.4 of the Draft Audit Report.

                         Second, the program will continue to refine the technical assistance guidance it
                         will offer during Regional Meetings with SEA coordinators and a subset of
                         subgrantees that SEA coordinators select. The following Regional Meetings will
                        be conducted by the 21st CCLC program staff over the next 12 months:

                            •   August 6, 2013 (confirmed): Southwest Regional Meeting, to host Texas,
                                Arkansas, Oklahoma, Louisiana, Mississippi, Alabama, Tennessee,
                                Missouri, Kansas, New Mexico and Colorado.





                        •   October 23, 2013 (confirmed): Northwest Regional Meeting, to host
                            Utah, Arizona, Idaho, Montana, Wyoming, Washington, Oregon, Nevada,
                                 Alaska, Hawaii and California.
                        •   November 21, 2013 (confirmed): Northeast Regional Meeting, to host
                            New Jersey, Connecticut, Rhode Island, Massachusetts, New York, Maine,
                            New Hampshire and Vermont.
                        •   April 2014 (tentative): Midwest Regional Meeting, to host Iowa, North
                            Dakota, South Dakota, Minnesota, Wisconsin, Illinois and Nebraska.

                     In addition to these meetings, we are in the preliminary planning stages for a
                     2014 all-virtual Spring Institute, tentatively scheduled for May 2014. This virtual
                     conference would entail the identification of promising practices across both
                      program requirements and implementation, with possible video clips, video
                      conferencing, and similar web-based delivery of technical assistance that would
                     reach approximately 2,500 grantees and subgrantees. This virtual approach is
                     among the increasingly limited options open to the program since the
                     Department instituted its policy that generally prohibits large-scale conferences
                     such as the 21st CCLC Summer Institute conducted during 2012.



