
California Department of Education: Management Controls Over IDEA, Part B - Special Education Performance Data.

Published by the Department of Education, Office of Inspector General on 2001-03-30.

Below is a text rendition of the original report. (PDF)

                    California Department of Education
                         Management Controls Over
            IDEA, Part B - Special Education Performance Data


                             FINAL AUDIT REPORT


                     Control Number ED-OIG/A09-A0016
                               March 2001


Our mission is to promote the efficient          U.S. Department of Education
and effective use of taxpayer dollars            Office of Inspector General
in support of American education.                Sacramento, California
                                   NOTICE

Statements that management practices need improvement, as well as other
conclusions and recommendations in this report, represent the opinions of the
Office of Inspector General. Determination of corrective action to be taken
will be made by appropriate Department of Education officials.

In accordance with the Freedom of Information Act (5 U.S.C. §552), reports
issued by the Office of Inspector General are available, if requested, to
members of the press and general public to the extent information contained
therein is not subject to exemptions in the Act.
                                Table of Contents

Executive Summary

Audit Results
  Finding No. 1 – CDE Did Not Fully Meet the Data Quality Standard for
                  Ensuring That Data Definitions and Counts Are Correct
  Finding No. 2 – CDE Did Not Fully Meet the Data Quality Standard for
                  Ensuring That Data Used Are Correct, Internally
                  Consistent and Without Mistakes
  Finding No. 3 – Varying Methods for Determining Placement Percentage
                  Could Impact Accuracy of Reported Data

Other Matters

Background

Purpose, Scope and Methodology

Statement on Management Controls

Attachment A – IDEA, Part B Program Objectives, Performance Indicators
               and Performance Data

Attachment B – CDE’s Comments to the Report
                                 Executive Summary

The California Department of Education (CDE) should take additional steps to improve
management controls over the collection and reporting of performance data provided to the
U.S. Department of Education (ED). The Government Performance and Results Act of 1993
(GPRA) requires that Federal agencies submit annual performance reports to Congress. The
Office of Special Education Programs (OSEP) within ED’s Office of Special Education and
Rehabilitative Services (OSERS) administers programs funded under the Individuals with
Disabilities Education Act (IDEA), Part B. OSEP uses performance data reported by state
educational agencies in preparing ED’s report to Congress on the outcomes of the IDEA, Part B
programs.

For reporting outcomes under ED’s 2001 Annual Plan, OSEP uses state educational agencies’
performance data for the following performance indicators:

    •    Earlier identification and intervention (intervention)
    •    Inclusive settings/Regular education settings (placement)
    •    Graduation (exiting)
    •    Suspensions or expulsions (discipline)
    •    Qualified personnel (personnel)
CDE is required by IDEA, Part B to submit this performance data to ED. Attachment A to this
report shows the relationship between the IDEA, Part B program objectives, performance
indicators and performance data.

Our review of procedures and available documentation at CDE, one special education local plan
area (SELPA) and three school districts identified weaknesses in CDE’s management controls
covering performance data for placement, exiting and discipline for school year 1998-99.
Specifically, we found that CDE did not fully meet two of the six Data Quality Standards
developed by ED for use by its managers when monitoring grantees and evaluating the quality of
the reported data.[1]

CDE did not fully meet the standards covering accurate descriptions because:

    •    The SELPA and school districts used exiting categories that did not correlate to the
         categories on the OSEP reporting form.
    •    CDE used the child’s age on the exiting date when determining exiting counts rather than
         the child’s age on December 1st.
    •    CDE included children more than once in the reported child counts for exiting.
    •    Not all of California’s school districts provided discipline data.



[1] In March 2000, ED included the Data Quality Standards as an Appendix in Volume 1 of its
publication titled 1999 Performance Reports and 2001 Plans.

In addition, CDE did not fully meet the standards covering editing because:

   •     CDE’s computer software recorded a zero placement percentage in the statewide
         database when school districts reported that a child would spend 100 percent of his or her
         time in a regular education program.
   •     CDE, the SELPA and school districts did not perform sufficient reviews of the data
         entered by school district staff to ensure that the data are correct, internally consistent and
         without mistakes.

We also concluded that school districts’ use of varying methods for determining a child’s
placement percentage could impact CDE’s ability to provide reliable data.

California has 119 SELPAs that report performance data to CDE for a total of 1,101 school
districts. Our audit was limited to reviews of procedures and documents at CDE, one SELPA
and three school districts. Since the procedures used by other SELPAs and school districts may
vary from those covered by our review, our audit would not necessarily disclose all material
weaknesses in the management controls related to the reporting of performance data. We
concluded that the identified management control weaknesses disclosed in the report may be
systemic in nature rather than limited to the particular SELPA or school districts.

CDE has already taken steps to address some of the identified weaknesses. We recommend that
the Assistant Secretary for Special Education and Rehabilitative Services require CDE to take
additional action to address the remaining identified weaknesses in the management controls
over reported performance data. The Audit Results section of the report describes the corrective
actions taken by CDE and our specific recommendations for each of the findings.

The Other Matters section of the report discloses that, for some children included on the
12/1/98 child count, individualized education program (IEP) and triennial assessment dates in
the statewide database were not within the required time frames. Federal regulations require that a child’s IEP is
reviewed at least annually and that an assessment is conducted at least once every three years.

In its comments to the report, CDE expressed no objections to our findings and described the
corrective action planned or taken. CDE’s comments are summarized in the report following
each finding. The full text of CDE’s comments is included as Attachment B.




                                       Audit Results

Performance Indicator 4.7.c. of ED’s 2001 Strategic Plan states that all ED program managers
will assert that the data used for their program’s performance measurement are reliable, valid and
timely, or will have plans for improvements. Annually, the ED managers must provide the
Office of the Under Secretary with a signed formal attestation covering their data. ED developed
the Data Quality Standards to assist ED managers as they collect, analyze and report data about
Federal programs. For the IDEA, Part B special education programs, the data used for
measuring performance include data reported by the individual states. CDE’s management
controls did not fully meet two of ED’s Data Quality Standards for evaluating the quality of
reported data. There are six standards:

    •    Validity – Data adequately represent performance.
    •    Accurate Description – Definitions and counts are correct.
    •    Editing – Data are clean.
    •    Calculation – The math is right.
    •    Timeliness – Data are recent.
    •    Reporting – Full disclosure is made.

For each of the data quality standards, ED provided examples of conditions that meet or fail to
meet the standard. ED also provided Data Quality Checklists for use by primary data providers
and secondary data managers. For school year 1998-99, CDE management controls over the
collection and reporting of performance data for placement, exiting and discipline did not meet
all elements contained in the Data Quality Standards for accurate description and editing.

CDE used three data collection processes to collect the IDEA, Part B performance data that it
reported to OSEP for school year 1998-99. For intervention, placement and exiting data, CDE
used two statewide databases created by the California Special Education Management
Information System (CASEMIS). For discipline data, CDE used electronic files submitted by
SELPAs to create a statewide database. For personnel data, CDE used information taken from
hardcopy reports prepared by school districts to create a statewide database.

Intervention, placement and exiting. CDE used CASEMIS to create two databases for each
school year. The databases contained student-level information on each child receiving special
education in the state. The first database (child count database) contained information as of the
December 1st child count. The child count database was used to report intervention and
placement data. The second database (end-of-year database) contained information as of the end
of the school year. This database was used for reporting the exiting data.[2] The information on
the two CASEMIS databases originates from computerized files or hard copy reports that
individual school districts provided to their respective SELPA.



[2] Beginning with school year 1999-00, CDE will also use the end-of-year database for
reporting discipline data.

The school districts either used a system adopted by their SELPA or maintained their own
computerized or manual systems with information on children who received special education in
their district. Twice each school year, the school districts submitted their files/reports to their
respective SELPA. The SELPA was responsible for consolidating the files/reports into one
computerized file that was in the prescribed CASEMIS format and passed the data edit checks
performed by the CASEMIS software. When the file passed all edit checks, the CASEMIS
software generated a certification, which the SELPA director signed certifying that the data was
accurate and complied with applicable laws and regulations. The signed certification and
consolidated file were sent to CDE. CDE used the consolidated file submitted by the individual
SELPAs to create the CASEMIS statewide databases.
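To illustrate the consolidation-and-certification flow just described, the following minimal
Python sketch is ours; the field names and edit rules are hypothetical and do not reproduce the
actual CASEMIS record layout or edit checks.

```python
# Hypothetical sketch of a SELPA-level consolidation pass with format edit
# checks. Field names and checks are illustrative only; they are not the
# actual CASEMIS record layout or edit rules.
import csv
from datetime import datetime

REQUIRED_FIELDS = ["student_id", "birth_date", "district_code", "placement_pct"]

def check_record(record: dict) -> list[str]:
    """Return a list of edit-check errors for one student record."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            errors.append(f"missing field: {field}")
    try:
        datetime.strptime(record.get("birth_date", ""), "%m/%d/%Y")
    except ValueError:
        errors.append("birth_date not in MM/DD/YYYY format")
    return errors

def consolidate(district_files: list[str], out_path: str) -> bool:
    """Merge district files into one file; return True only if all records pass."""
    all_ok = True
    with open(out_path, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=REQUIRED_FIELDS)
        writer.writeheader()
        for path in district_files:
            with open(path, newline="") as f:
                for record in csv.DictReader(f):
                    if check_record(record):
                        all_ok = False  # fails edit checks; no certification
                    else:
                        writer.writerow({k: record[k] for k in REQUIRED_FIELDS})
    return all_ok  # the SELPA director signs the certification only when True
```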

We concluded that CDE needed to take steps to improve management controls covering
placement and exiting data reported to OSEP. For school year 1998-99, the CASEMIS software
recorded a zero in the CASEMIS databases when school districts reported a 100 percent
placement percentage, improperly used the child’s age as of the exiting date for reporting exiting
data and included children more than once in the exiting reporting form. We also found that the
SELPA and school districts we reviewed used exiting categories that did not correlate to the
categories on the OSEP reporting form and that they did not perform sufficient reviews of the
data entered by school district staff. In addition, CDE did not perform sufficient reviews to
ensure that the CASEMIS statewide databases accurately reflected the information contained in
school district records.

Discipline. For school year 1998-99, CDE had SELPAs submit the discipline data for their
school districts in an electronic file using a prescribed format. The format required the SELPAs
to report each disciplinary occurrence. CDE used the electronic files submitted by the SELPAs
to create a separate statewide database.

The procedures used by SELPAs to collect the discipline data from the school districts varied.
The reviewed SELPA had the school districts submit the data on a hardcopy form designed by
the SELPA staff. SELPA staff entered the information from the hard copy forms into the
electronic file.

The discipline data on the statewide database for school year 1998-99 was incomplete. CDE did
not include the discipline data for six SELPAs because the data was not provided in a useable
form. We also found that the discipline data submitted by the reviewed SELPA did not include
information from all its school districts.

Personnel. CDE collected personnel data for each school district using a CDE-designed
hardcopy form. The school districts submitted the forms to CDE through their respective
SELPA. CDE staff entered the information from the hardcopy forms into an electronic file.
Nothing came to our attention during our limited assessment and testing of management controls
that caused us to doubt the acceptability of CDE’s reported performance data for personnel for
the school year 1998-99.

In addition to the management control weaknesses, we noted that the school districts’ use of
varying methods for determining a child’s placement percentage may impact CDE’s ability to
provide reliable placement data.



Finding No. 1 – CDE Did Not Fully Meet the Data Quality Standard for
               Ensuring That Data Definitions and Counts Are Correct

Standard Two—Accurate Description of the Data Quality Standards ensures that data definitions
and counts are correct. We found that, for exiting and discipline data, CDE did not fully meet
the standards covering accurate description because:

    •    The SELPA and school districts used exiting categories that did not correlate to the
         categories on the OSEP reporting form.
    •    CDE used the child’s age on the exiting date rather than the child’s age on December 1st
         when determining exiting counts.
    •    CDE included children more than once on the exiting reporting form.
    •    Not all of California’s school districts provided discipline data.


The Exiting Categories Did Not Correlate to
the Categories on the OSEP Reporting Form
California SELPAs and school districts use their own systems and formats (i.e., record layouts
and data definitions) for collecting the individual child information that is submitted to CDE for
inclusion in the CASEMIS database. Standard Two specifies the requirement that “[a]ll data
providers use the same agreed-upon definitions.” We found that definitions used by the SELPA
and school districts we reviewed did not correlate to CASEMIS and the OSEP reporting form for
the “dropped out” and “moved” categories.

For school year 1998-99, the reviewed SELPA and school districts used exiting categories that
included “other” and “reason unknown.”[3] According to the SELPA director, the “other” code
was used whenever the reason for the student leaving special education did not fit other available
exiting categories. Staff at one school district told us that the “other” category was used for a
child in our sample because the child was a runaway. When preparing the student information
file for submission to CDE, the SELPA recorded the exiting categories of “other” and “reason
unknown” as “moved, not known to be continuing.”

Table 4 of the OSEP reporting form contains eight exiting categories: (1) no longer receiving
special education; (2) graduated with regular high school diploma; (3) received a certificate;
(4) reached maximum age; (5) died; (6) moved, known to be continuing; (7) moved, not known
to be continuing; and (8) dropped out. OSEP defined “moved, not known to be continuing” as
children who moved out of the catchment area (i.e., school district jurisdiction) and are not
known to be continuing in another education program (i.e., enrolled in another school). OSEP
defined “dropped out” as children who did not exit through any of the other described categories
and provided the following examples: dropouts, runaways, GED recipients, expulsions, status
unknown and other exiters.

[3] To assess management controls, we reviewed the collection and reporting procedures and
supporting documentation at CDE, the East San Gabriel Valley SELPA, Pomona Unified School
District, Azusa Unified School District and Charter Oak Unified School District.


Based on OSEP’s definitions, children categorized as “other” and “reason unknown” should
have been recorded as “dropped out” in the file submitted to CDE. Thus, the exiting categories
used by the SELPA and school districts were not aligned with the definitions in the OSEP
Table 4 reporting form.
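The following Python sketch illustrates a mapping consistent with the definitions quoted above.
The local code names on the left are hypothetical; only the OSEP Table 4 categories and the
rule that “other” and “reason unknown” exits belong under “dropped out” come from the report.

```python
# Illustrative mapping from hypothetical local exiting codes to the OSEP
# Table 4 categories. Local code names are assumptions for the sketch.
LOCAL_TO_OSEP = {
    "no_longer_special_ed": "no longer receiving special education",
    "graduated_diploma": "graduated with regular high school diploma",
    "certificate": "received a certificate",
    "max_age": "reached maximum age",
    "died": "died",
    "moved_continuing": "moved, known to be continuing",
    "moved_not_continuing": "moved, not known to be continuing",
    # Per OSEP's definition, exits that fit no other category (runaways,
    # GED recipients, status unknown, "other") belong under "dropped out":
    "other": "dropped out",
    "reason_unknown": "dropped out",
    "runaway": "dropped out",
}

def osep_exit_category(local_code: str) -> str:
    # Default unmatched codes to "dropped out", consistent with OSEP's
    # definition of that category as the residual bucket.
    return LOCAL_TO_OSEP.get(local_code, "dropped out")
```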

One of OSEP’s performance indicators is “Graduation: The percentage of children with
disabilities exiting school with a regular high school diploma will increase, and the percentage
who drop out will decrease.” CDE may be underreporting its “dropped out” child count, which
would affect the accuracy of data used for this indicator. As shown below, CDE’s exiting data
for the “dropped out” category was not consistent with the data reported by the other 49 states.

                             Total Exits    Moved, Not Known       Dropped Out
                                            to Be Continuing
      CDE                        60,450       11,036 (18%)         2,694 (4%)
      Reviewed SELPA              1,094          200 (18%)             1 (0%)
      Other 49 States           448,066       49,442 (11%)        71,524 (16%)



CDE Improperly Used the Child’s Age on
the Exiting Date for Reporting Exiting Data
CDE did not use the agreed-upon definition of a child’s age for its exiting reports. CDE reported
the child counts in the age categories of the exiting report based on the child’s age on the exiting
date rather than the child’s age on the count date (December 1). OSEP instructed the states to
use the OSEP IDEA, Part B 1998 Data Dictionary when reporting performance data. The Data
Dictionary states that “[a]ge is a child’s actual age in years on the date of the child count….”

The age of the child was used to identify the children included in the measure for the above
OSEP performance indicator for “Graduation.” The performance measures used child counts for
children ages 14 through 21. As a result of using the child’s age at the time of the exit, CDE:

   •  Included children who were 14 at the time of the exit but were only 13 years of age on
      December 1st (i.e., birth dates after December 1st and before the end of the school year).

   •  Excluded children who were 14 on December 1st, but were 13 at the time of the exit
      (i.e., birth dates between the beginning of the school year and December 1st).

   •  May have misclassified children who were 21 years old on December 1st under the
      “22+” age category when the child reached age 22 before exiting.
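A minimal sketch of the age calculation the Data Dictionary requires follows; the sketch is ours,
for illustration only.

```python
from datetime import date

def age_on_count_date(birth_date: date, count_date: date) -> int:
    """Child's age in whole years on the December 1st count date, per the
    OSEP IDEA, Part B 1998 Data Dictionary."""
    years = count_date.year - birth_date.year
    # Subtract one if the birthday falls after the count date that year.
    if (birth_date.month, birth_date.day) > (count_date.month, count_date.day):
        years -= 1
    return years

# Example: a child born 2/15/1985 who exited on 6/10/1999 was 14 at exit
# but only 13 on the 12/1/98 count date, so the child should not be in
# the ages 14-21 graduation measure.
assert age_on_count_date(date(1985, 2, 15), date(1998, 12, 1)) == 13
```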




CDE May Have Included Children More
Than Once in Its Reported Exiting Counts
CDE included children more than once in the child counts on the exiting report. Standard Two
of the Data Quality Standards ensures that occurrences are not double counted. The instructions
that accompany the OSEP exiting data collection forms direct the state educational agencies to
provide an unduplicated count of children exiting special education.

Our analysis of the CASEMIS database identified 1,314 children with the same last name, first
name, birth date and gender. These children may have been included in the report totals more
than once. According to a CDE official, the year-end database is not checked for multiple
records per child because CDE uses the database to track moves of children transferring from
district to district. CASEMIS software does not take into account multiple records when
preparing the child counts for the exiting report.
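A sketch of the kind of duplicate screen applied in our analysis is shown below; field names are
illustrative, not the actual CASEMIS record layout.

```python
from collections import Counter

# Group records by last name, first name, birth date and gender, and flag
# any key appearing more than once. Field names are assumptions.
def potential_duplicates(records: list[dict]) -> list[tuple]:
    keys = Counter(
        (r["last_name"], r["first_name"], r["birth_date"], r["gender"])
        for r in records
    )
    return [key for key, count in keys.items() if count > 1]

# Records sharing all four identifiers are candidates for review before an
# unduplicated exiting count is prepared.
```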


CDE Reported Incomplete Discipline Data
When we initiated the audit, CDE officials informed us that its reported discipline counts were
incomplete because not all SELPAs provided CDE with usable discipline data. Also, at the
reviewed SELPA, we found that one of the three school districts reviewed did not provide
discipline data to the SELPA. Standard Two of ED’s Data Quality Standards specifies that all
instances are counted, and no instances are omitted. CDE did not disclose the data limitation on
the discipline report submitted to OSEP.

School year 1998-99 was the first year that OSEP required state educational agencies to report
discipline data. Since collecting the discipline data for that year, CDE revised the CASEMIS
software to include discipline data for each child. The revised CASEMIS software was used to
collect discipline data for school year 1999-00.


Recommendations
The Assistant Secretary for Special Education and Rehabilitative Services should:

1.1      Require CDE to issue guidance to school districts on the proper category to use when the
         reason for the exit is unknown so that CDE can properly include such exits in the
         “dropped out” category on the OSEP reporting form.

1.2      Require CDE to ensure that it and the SELPAs, if applicable, have an acceptable method
         for grouping the exits in the categories specified on the OSEP reporting form.

1.3      Require CDE to use the child’s age on December 1st for reporting exiting counts.

1.4      Require CDE to implement procedures to eliminate multiple counts of children on the
         exiting report form.



CDE’s Comments
In its comments to the report, CDE described the corrective action planned or taken to address
the recommendations. CDE stated that training would be provided to SELPAs and school
districts on exit reporting categories. Starting with school year 2001-02, CDE will include all of
OSEP’s dropout categories in CASEMIS. In addition, CDE plans to verify the data collection
and reporting process as part of its monitoring of school districts.

CDE stated that CASEMIS application software would be changed to calculate children’s ages
as of December 1st. CDE also stated it had concerns regarding use of that date and plans to
continue to work with OSEP to clarify this issue. CDE also plans to add a verification routine in
its software to identify duplicate records in the SELPA files submitted for inclusion in
CASEMIS.


OIG Response
To further ensure that children are included only once in the exiting report, CDE should also
implement procedures to identify duplicate records for children that were included in more than
one SELPA’s file.



Finding No. 2 – CDE Did Not Fully Meet the Data Quality Standard for
                Ensuring That Data Used Are Correct, Internally Consistent
                and Without Mistakes

Standard Three—Editing ensures that data are clean. ED’s definition for this standard is “[d]ata
used are correct, internally consistent and without mistakes.” We found that CDE did not fully
meet the editing standard because:

   •     CASEMIS software recorded a zero placement percentage in the database when school
         districts reported that a child would spend 100 percent of his or her time in a regular
         education program.
   •     CDE and the SELPA and school districts we reviewed did not perform sufficient reviews
         of the data entered by school district staff to ensure that the data are correct and complete.

CASEMIS Software Recorded a Zero Placement Percentage
When School Districts Reported 100 Percent
The CASEMIS system recorded a zero placement percentage when school districts reported that the
child would spend 100 percent of the time in a regular education program. Our analysis of the
CASEMIS 12/1/98 child count database for the school year 1998-99 identified records for
3,373 children that showed the child in the regular education program with designated instruction
service (CASEMIS code #410) and zero percent of time spent in a regular education program.




By definition, children with CASEMIS code #410 would spend a portion of the time in a regular
education program.[4]

One of OSEP’s performance indicators is “the percentage of children with disabilities ages 6 to
21 who are reported by states as being served in the regular education classroom at least 80
percent of the day will increase.” Incorrect placement percentages impact the accuracy of data
used to measure this indicator.

The percentages were recorded as zero because the CASEMIS software used for school
year 1998-99 was designed to allow only two characters for the placement percentage data. CDE
has modified the CASEMIS software to provide a three-character field for recording the
placement percentage. The revised CASEMIS software will be used in gathering data for school
year 2000-01.
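The following Python sketch (ours, not CASEMIS code; function and parameter names are
hypothetical) illustrates the failure mode and the kind of edit check contemplated in
Recommendation 2.1 below. Only the two-character field width and the code #410 rule come from
this report.

```python
# A value of "100" cannot fit in a two-character field, which is how
# 100 percent came to be stored as zero. Widening the field and pairing it
# with an edit check prevents silent loss.
def store_placement_pct(value: int, width: int = 3) -> str:
    """Format a placement percentage for a fixed-width character field,
    refusing to truncate instead of silently recording "00"."""
    text = str(value)
    if len(text) > width:
        raise ValueError(
            f"placement percentage {value} does not fit in {width} characters")
    return text.zfill(width)

def placement_edit_check(setting_code: str, pct: int) -> list[str]:
    """Flag records where setting and percentage disagree: code #410 implies
    some time in regular class, so a zero percentage is inconsistent."""
    if setting_code == "410" and pct == 0:
        return ["setting 410 requires a nonzero placement percentage"]
    return []

print(store_placement_pct(100))        # "100" with the three-character field
print(placement_edit_check("410", 0))  # flags the inconsistency
```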


Insufficient Reviews of Data
Entered From Source Documents
CDE and the reviewed SELPA and school districts did not conduct sufficient reviews to ensure
that individual student information was accurately reflected in the CASEMIS databases.

School District Level. The school districts were responsible for the accuracy and completeness
of the data submitted to the SELPA and CDE. We found that school districts did not have
procedures for an independent verification of the data entered by district staff. We also found
that the schools received computer printouts from the district but had no procedures for
confirming that the data was entered completely and correctly.

SELPA Level. SELPA staff did not conduct reviews to confirm that data entered into the
CASEMIS system accurately reflected information in the school’s records. SELPA staff did
address exceptions found by CASEMIS edit checks, and the SELPA director stated that he
“eyeballed” CASEMIS summary reports. The SELPA also provided school district staff with the
CASEMIS summary reports, which the district staff indicated they also “eyeballed.”

State Level. For school year 1998-99, CDE did not conduct reviews to ensure that the
information on the CASEMIS databases reflected accurate and complete information contained
in school files. CASEMIS did include edit checks to ensure that data fields in individual records
contained data in the appropriate format. Also, CDE had procedures for identifying and
eliminating duplicate records on the December 1st child count databases.

Our review identified instances where information in the CASEMIS databases was incorrect.
We compared selected information on the database used for the 12/1/98 child count with records
at the reviewed school districts. We found that the CASEMIS information for 13 of the
87 children reviewed was inconsistent with school records:

[4] For the children selected for file reviews at the three school districts, one child had zero
percent on the CASEMIS database. We confirmed that the school district intended to report a
placement percentage of 100 percent for this child.


      •   11 children – The placement percentage used to calculate the time a child spends in the
          regular classroom did not match the percentage documented in the child’s IEP.
      •   1 child – The child’s birth date did not match the date in the school files.
      •   1 child – The child’s IEP contained a statement that the child was no longer receiving
          special education as of June 1998.

We also compared information from the CASEMIS end-of-year exit database for school
year 1998-99 with records at the school districts. We found that the CASEMIS information for
10 of the 29 children reviewed was inconsistent with school records or the school records did not
support the information.

      •   4 children – The school records contained no information to confirm that the exit reason
          and date in the CASEMIS database were correct.
      •   4 children – The exiting reason in the CASEMIS database did not match the reason in the
          school file. Also, for 1 of the 4 children, the child’s exit date was incorrect on the
          CASEMIS database.
      •   2 children – The school records showed that the children were not enrolled at the school
          during school year 1998-99.[5]

CDE used the CASEMIS databases to prepare reported child counts for intervention, placement
and exiting. Since neither CDE nor the reviewed SELPA and school districts conducted reviews
of the data entered into their databases, we have little assurance that the information contained in
the CASEMIS database was correct and without mistakes.

During school year 1999-00, CDE implemented a school file review process that included
confirming information on the CASEMIS child count database. For the initial cycle, CDE
conducted reviews at selected school districts. The reviews included confirming database
information for children selected from the CASEMIS database for the 12/1/97 child count with
school records. For the next cycle, CDE plans to conduct reviews at additional school districts
and select children from the CASEMIS database used for the 12/1/99 child count. According to
CDE staff, these databases were the most recent completed database available at the time of the
reviews. CDE does not plan to perform reviews of information on the CASEMIS database used
for the 12/1/98 child count.




[5] These 2 children were not included on the CASEMIS file for the 12/1/98 child count.

Recommendations
The Assistant Secretary for Special Education and Rehabilitative Services should:

2.1      Recommend CDE implement an edit check in the CASEMIS system to ensure that the
         placement setting and placement percentage are in agreement.

2.2      Recommend CDE ensure that school districts implement procedures for a review by a
         second person to confirm that data fields related to the OSEP reporting form are properly
         recorded in the CASEMIS database.

2.3      Recommend that CDE consider requiring SELPAs to conduct periodic reviews of school
         district procedures and data to support their certifications that the data is accurate.


CDE’s Comments

CDE stated that an edit check was not necessary because the placement setting categories will be
removed from CASEMIS after school year 2000-01. CDE plans to work with school districts
and SELPAs to implement the recommended reviews. Also, CDE will verify that reviews are
being performed during its monitoring of school districts and SELPAs.


OIG Response
After receiving the response, we inquired about the procedures CDE would use to report
placement data after removal of the placement categories. CDE stated that it would use
information contained in two new data fields in CASEMIS. If there is a relationship between the
data fields, CDE should implement edit checks in CASEMIS to ensure that data in the two new
data fields are in agreement.




Finding No. 3 – Varying Methods for Determining Placement Percentage
                Could Impact Accuracy of Reported Data

School districts used varying methods to determine a child’s placement percentage. We found
that each of the three school districts we reviewed had its own standard percentage charts and
procedures for estimating a child’s time in regular class based on the type of educational
placement setting. At one school district, the staff used a specific percentage taken from the
standard chart. At the other two school districts, the standard chart showed a range of
percentages for each type of educational placement setting. At one of these districts, staff used
their judgment to estimate a percentage of time in regular class that was within the specified
range shown on the chart. At the other district, the staff used the high-end percentage of the
range. The accuracy of data used to measure the placement indicator is impacted by
inconsistencies in methods used to determine the placement percentage.

Neither OSEP nor CDE has provided specific instructions for calculating the placement
percentage. In the CASEMIS User’s Manual, CDE suggested to school districts that they obtain
the placement percentage by taking into account the amount of instructional time (minutes per
day if it is a daily program or per week if it is a weekly program) spent by the student in the
regular class and dividing this time by the total amount of instructional time. OSEP officials
informed us that the percentage should be based on actual time, but OSEP has not provided
guidance to the states on how to determine the placement percentage.
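As a rough illustration of the calculation suggested in the CASEMIS User's Manual, the sketch
below is ours and is not official guidance; the worked example values are hypothetical.

```python
# Placement percentage = instructional time in regular class divided by
# total instructional time, over the same daily or weekly period.
def placement_percentage(regular_class_minutes: float,
                         total_instructional_minutes: float) -> int:
    """Percent of instructional time spent in the regular classroom."""
    if total_instructional_minutes <= 0:
        raise ValueError("total instructional time must be positive")
    return round(100 * regular_class_minutes / total_instructional_minutes)

# Example: 250 of 330 daily instructional minutes in regular class yields 76,
# which falls below the 80 percent threshold in the placement indicator.
print(placement_percentage(250, 330))
```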


Recommendation
3.1      The Assistant Secretary for Special Education and Rehabilitative Services should work
         with CDE to develop guidance for SELPAs and school districts on determining
         placement percentages.


CDE’s Comments

CDE stated that it looks forward to working with OSERS to clarify how placement percentages
should be calculated.




                                      Other Matters


Individualized Education Program and
Triennial Assessment Dates
For some children included on the 12/1/98 child count, IEP and triennial assessment dates in the
CASEMIS database were not within the required time frames. Pursuant to 34 CFR §300.343(c)(1)
and 34 CFR §300.536, each public agency shall ensure that a child’s IEP is reviewed at least annually
and that a reevaluation (triennial assessment) is conducted at least once every three years. For
children included in the CASEMIS database for the 12/1/98 child counts, school records should
include an IEP that was prepared or reviewed between 12/1/97 and 12/1/98 and a triennial
assessment that was conducted between 12/1/95 and 12/1/98.
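A sketch of the time-frame test applied in our analysis follows; it is illustrative only, and the
parameter names are not CASEMIS field names.

```python
from datetime import date

# For the 12/1/98 count, the IEP must have been prepared or reviewed within
# the preceding year and the triennial assessment within the preceding
# three years.
def dates_within_time_frames(iep_date: date, assessment_date: date,
                             count_date: date = date(1998, 12, 1)) -> bool:
    iep_window_start = count_date.replace(year=count_date.year - 1)
    triennial_window_start = count_date.replace(year=count_date.year - 3)
    return (iep_window_start <= iep_date <= count_date
            and triennial_window_start <= assessment_date <= count_date)

# An IEP reviewed 5/4/1998 and an assessment conducted 5/4/1996 both fall
# within the required windows for the 12/1/98 count.
print(dates_within_time_frames(date(1998, 5, 4), date(1996, 5, 4)))  # True
```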

Our analysis of the CASEMIS database for the 12/1/98 child count found that 115,265 of the
623,651 children in the database had IEP and/or triennial assessment dates that were not within
the required time frames. At the three school districts we selected for review in our audit,
935 children had dates outside the required time frames; we reviewed school records for 29 of
these children. Based on the reviews, we concluded that:

   •     For 15 children, school records did not contain an IEP and/or triennial assessment that
         was conducted within the required time frames. Information on the CASEMIS database
         for the 12/1/99 child count indicates that the 15 children were receiving special education
         services after 12/1/98. Therefore, it is likely that the children were receiving service on
         12/1/98 and were appropriately included in the reported child counts.

   •     For 13 children, the school records contained an IEP and triennial assessment conducted
         within the required time frames, but the school districts had not updated the IEP and
         assessment dates in the computerized files provided to the SELPA.

   •     For 1 child, the school records included an IEP containing a statement that the child was
         no longer receiving special education as of June 1998. This child should not have been
         included in the reported child counts since the child was not receiving special education
         services on 12/1/98.

The State of California does not limit its reported child counts to children that have IEP and/or
assessments conducted within the required time frames for the December 1st count date.
Therefore, the late IEP/assessments and failure to record current dates in CASEMIS did not have
an impact on CDE’s reported child counts for intervention, placement, exiting and discipline as
long as the children included in the counts were receiving services. As noted above, we found
that 1 of the 29 children was not receiving special education as of the 12/1/98 child count.




During school year 1999-00, CDE took action to address the untimely dates for
IEPs/assessments. CDE provided individual school districts with a list of children on the
CASEMIS database for the 12/1/99 child count whose IEP dates were more than one year old or
whose assessment dates were more than three years old. CDE instructed the districts to conduct
the required IEPs and assessments. CDE advised school districts of its intent to impose
sanctions for issues of long-standing noncompliance and to monitor the districts’ compliance
using the CASEMIS year-end database for school year 1999-00.




                                        Background

GPRA, enacted on August 3, 1993, specifies the purposes of the Act, which include:

   •     To help Federal managers improve service delivery, by requiring that they plan for
         meeting program objectives, and by providing them with information about program
         results and service quality; and

   •     To improve congressional decision-making, by providing more objective information on
         achieving statutory objectives and on the relative effectiveness and efficiency of Federal
         programs and spending.

GPRA requires that Federal agencies prepare a five-year strategic plan for their program
activities. Starting with fiscal year 1999, Federal agencies must prepare annual performance
plans and report on program performance.

ED published its Strategic Plan 1998-2002 in September 1997. ED’s 1999 Performance Report
and 2001 Annual Plan were submitted to Congress in March 2000. The 2001 Annual Plan
contained nine performance indicators for the IDEA, Part B Special Education Program. ED
relies on state-reported data in measuring performance for six of the nine listed indicators.

CDE is responsible for administering the IDEA, Part B–Special Education Program in the State
of California. The state has 119 SELPAs. Each SELPA is responsible for providing oversight
for one or more of the state’s 1,101 school districts. The SELPAs are responsible for collecting
special education data from the school districts and submitting the data to CDE.

CDE received $469 million of IDEA Part B funds for the 1999-2000 award year. CDE reported
that 623,651 children were receiving special education services in the state on 12/1/98.




                      Purpose, Scope and Methodology


The purpose of the audit was to: (1) identify the process used by CDE to accumulate and report
IDEA, Part B performance data to OSEP, (2) determine whether CDE management controls
ensured that the performance data was reliable and (3) identify barriers or obstacles that may
impact CDE’s ability to provide quality performance data. The audit was limited to
state-reported data used by OSEP to report on program objectives and outcomes as required by GPRA.
Our review covered the state-reported school year 1998-99 data for the performance indicators:
inclusive settings and regular education settings (placement), earlier identification and
intervention (intervention), graduation (exiting), suspensions and expulsions (discipline) and
qualified personnel (personnel).

To accomplish our objectives, we interviewed state officials and staff responsible for collecting,
processing and reporting the performance data to OSEP. We evaluated CDE’s procedures to
ensure that data reported by school districts was accurately recorded in the CASEMIS databases
and that the data reported to OSEP was supported by the data contained in the CASEMIS
databases. For the school year 1999-00, the State of California had 1,101 school districts that
reported performance data to 119 SELPAs.

Six SELPAs had two or more assigned school districts and reported over 10,000 children for the
IDEA, Part B program. From this group, we randomly selected one SELPA and reviewed its
procedures and the procedures used by 3 of the 12 school districts assigned to the SELPA. To
ensure that we evaluated procedures at school districts with varying populations of special
education children, we grouped the 12 school districts based on their reported child counts and
selected one school district from each group.

           Number of
         School Districts   Reported Child Count            Selected School District
              1             Over 2,000                      Pomona Unified School District
              5             Between 1,000 and 2,000         Azusa Unified School District
              6             Between 300 and 999             Charter Oak Unified School District

In addition, we reviewed CDE’s single audit reports for the fiscal years ended June 30, 1998, and
1999 and the single audit reports for each selected school district for the fiscal year ended
June 30, 1999.

Intervention, Placement and Exiting. To achieve our audit objectives for the intervention,
placement and exiting performance indicators, we extensively relied on computer-processed data
extracted from CDE’s CASEMIS databases. Our assessment of the reliability of the database
was limited to (1) gaining an understanding of the procedures used by CDE and the selected
SELPA and three school districts to collect, process and review the data and (2) confirming that
selected data in the CASEMIS databases was supported by information contained in school
records.




To confirm that school records supported information on the database, we randomly selected two
groups of children from the CASEMIS database for the 12/1/98 child count. The first group was
selected from all children receiving special education in the selected school districts. The second
group was selected from those children in the school districts with IEP and/or assessment dates
that were not within the required time frames for the 12/1/98 child count. The following table
shows the total children in each group and the number of children for which we reviewed school
files:

                 Children Receiving                Children Receiving Special Education
                 Special Education                   with IEP/Assessment Dates Not
                                                      Within Required Time Frames
  School         Total Children    Student Files   Total Children      Student Files
  District         Reported          Reviewed         Reported           Reviewed
  Pomona             3,063              30               560                10
  Azusa              1,092              30               187                10
  Charter Oak          646              30               188                10

From our assessment and tests, we concluded that CDE should take additional steps to improve
management controls over the collection and reporting of performance data reported to OSEP for
placement and exiting. The Audit Results section of the report provides details on our findings.

Discipline. To achieve our audit objective for the discipline performance indicator, we gained
an understanding of the procedures used by CDE, and the selected SELPA and three school
districts. We confirmed the CDE official’s statement that the reported child counts for school
year 1998-99 were incomplete. We also reviewed CASEMIS documentation to confirm that the
file structure was changed to include discipline data. The Audit Results section of the report
provides details on our findings.

Personnel. To achieve our audit objectives for the personnel performance indicator, we
(1) gained an understanding of the procedures used by CDE and the selected SELPA and three
school districts to collect, process and review the data, (2) confirmed that data reported to OSEP
was supported by information contained on CDE’s electronic file, (3) tested CDE’s entry of data
provided by the three school districts into the electronic file and (4) confirmed that data provided
by the three school districts was supported by information contained in school records. The
following table shows the total personnel reported by each school district and the number of
randomly selected personnel files reviewed:

                                    Total Personnel Reported       Personnel Files
              School District      (Full-Time Equivalencies)          Reviewed
              Pomona                        299.82                       10
              Azusa                         128.73                       10
              Charter Oak                    55.29                       10

Nothing came to our attention during our limited assessment and testing of management controls
that caused us to doubt the acceptability of CDE’s procedures and the data submitted by the three
school districts.

We performed our fieldwork at CDE in Sacramento, California and at the East San Gabriel
Valley SELPA. We also performed fieldwork at the district and school offices of the Pomona
Unified School District, Azusa Unified School District and Charter Oak Unified School District.
Fieldwork was conducted from June to September 2000. We held our exit meeting with CDE
officials on 12/19/00. Our audit was performed in accordance with generally accepted
government auditing standards appropriate to the scope of the review described above.




                     Statement on Management Controls


As part of our review, we assessed the system of management controls, policies, procedures and
practices applicable to CDE’s process for collecting and reporting performance data for the
IDEA, Part B program as required by GPRA. Our assessment was performed to determine
whether the processes used by CDE and the reviewed SELPA and school districts provided a
reasonable level of assurance that CDE reported reliable performance data to OSEP.

For the purpose of this report, we assessed and classified CDE’s significant controls related to
collecting and reporting performance data into the following categories:

   •     Guidance and technical assistance,
   •     Collection of data from school districts,
   •     Data compilation and report preparation and
   •     Monitoring school district data collection and reporting processes.

Because of inherent limitations, a study and evaluation made for the limited purpose described
above would not necessarily disclose all material weaknesses in the management controls.
However, our assessment disclosed management control weaknesses that adversely affected
CDE’s ability to report accurate performance data for IDEA, Part B. These included (1) using
exiting categories that did not correlate to the categories on the OSEP reporting form, (2) using
the child’s age on the exiting date when determining exiting counts rather than the child’s age on
December 1st, (3) including children more than once in the reported child counts for exiting,
(4) reporting incomplete discipline data, (5) computer software recording a zero placement
percentage instead of the school district’s reported 100 percent, and (6) insufficient reviews of
the data entered by school district staff.




                                                                                                                                                  Attachment A

                   IDEA, Part B Program Objectives, Performance Indicators and Performance Data
                                                2001 Annual Plan

PROGRAM OBJECTIVE: All preschool children with disabilities receive services
that prepare them to enter school ready to learn.

  Performance Indicator 1.1 – Inclusive settings. The percentage of preschool
  children with disabilities who are receiving special education and related
  services in inclusive settings will increase.

  Performance data collected from OSEP forms: State educational agencies
  report the number of students ages 3-5 by age and educational placement.

PROGRAM OBJECTIVE: All children who would typically be identified as being
eligible for special education at age 8 or older and who are experiencing
early reading or behavioral difficulties receive appropriate services earlier
to avoid falling behind their peers.

  Performance Indicator 2.1 – Earlier identification and intervention. The
  percentage of children served under IDEA ages 6 or 7, compared to ages 6-21,
  will increase.

  Performance data collected from OSEP forms: State educational agencies
  report the number of disabled children receiving special education by:
    • disability and age and
    • disability and ethnicity

PROGRAM OBJECTIVE: All children with disabilities have access to the general
curriculum and assessments, with appropriate accommodations, support, and
services, consistent with high standards.

  Performance Indicator 3.1 – Regular education settings (school age). The
  percentage of children with disabilities ages 6-21 who are reported by
  states as being served in the regular education classroom at least
  80 percent of the day will increase.

  Performance data collected from OSEP forms: State educational agencies
  report the number of students ages 6-21, by age category, disability and
  placement.

  Performance Indicator 3.3 – Suspensions or expulsions. The percentage of
  children with disabilities who are subject to long-term suspension or
  expulsion, unilateral change in placement or change in placement if their
  current placement is likely to result in injury to someone, will decrease.

  Performance data collected from OSEP forms: State educational agencies
  report the number of students suspended or expelled, unilaterally removed
  or removed based on a hearing by:
    • disability and basis of removal and
    • ethnicity and basis of removal

PROGRAM OBJECTIVE: Secondary school students with disabilities receive the
support they need to complete high school prepared for postsecondary
education or employment.

  Performance Indicator 4.1 – Graduation. The percentage of children with
  disabilities exiting school with a regular diploma will increase and the
  percentage who drop out will decrease.

  Performance data collected from OSEP forms: State educational agencies
  report the number of students ages 14-21 that exited special education by:
    • age, disability and basis of exit,
    • age and basis of exit and
    • ethnicity and basis of exit

PROGRAM OBJECTIVE: States are addressing their needs for professional
development consistent with their comprehensive system of personnel
development.

  Performance Indicator 5.1 – Qualified personnel. The number of states and
  outlying areas where at least 90 percent of special education teachers are
  fully certified will increase.

  Performance data collected from OSEP forms: State educational agencies
  report the number and type of teachers and other personnel to provide
  special education and related services for children ages 3-21. State
  educational agencies must report the number of staff:
    • fully certified and
    • not fully certified




                                             Attachment B




         CDE’s Comments to the Report




                   REPORT DISTRIBUTION SCHEDULE
                          ED-OIG/A09-A0016
                                                            No. of copies
Auditee

Delaine Eastin                                                     1
State Superintendent of Public Instruction
California Department of Education
721 Capitol Mall
Sacramento, California 95814

Action Official

Andrew J. Pepin                                                    2
Office of Special Education and Rehabilitative Services
U.S. Department of Education
330 C Street, SW Room 3124
Washington, D.C. 20202


Other ED Offices

Director, Office of Special Education Programs                     1
Chief of Staff, Office of the Secretary                            1
Office of the Under Secretary                                      1
Office of General Counsel                                          2
Office of the Chief Financial Officer
  Financial Improvement and Post Audit Operations                  1
Office of Public Affairs                                           1

Office of Inspector General
Inspector General                                                  1
Deputy Inspector General                                           1
Deputy Assistant Inspector General for Audit                       1
Assistant Inspector General for Audit                              1
Assistant Inspector General for Analysis and Inspection            1
Assistant Inspector General for Investigations                     1
Director, State and Local Program Advisory and Assistance          1
Counsel to the Inspector General                                   1
Regional Inspector General for Audit                               1 (each)