
Audit of Kansas State Department of Education Management Controls over IDEA, Part B - Special Education Performance Data.

Published by the Department of Education, Office of Inspector General on 2001-07-20.

Below is a raw (and likely hideous) rendition of the original report. (PDF)

                  Kansas State Department of Education
                       Management Controls Over
            IDEA, Part B – Special Education Performance Data



                                 FINAL AUDIT REPORT




                                       ED-OIG/A07-A0020
                                           July 2001




Our mission is to promote the efficiency,                    U.S. Department of Education
effectiveness, and integrity of the                          Office of Inspector General
Department’s programs and operations.                        Region VII - Kansas City Office
                                     NOTICE
       Statements that management practices need improvement, as well as other
     conclusions and recommendations in this report, represent the opinions of the
   Office of Inspector General. Determination of corrective action to be taken will be
                 made by appropriate Department of Education officials.

In accordance with the Freedom of Information Act (5 U.S.C. Section 552), reports issued
      by the Office of Inspector General are available, if requested, to members of the
     press and general public to the extent information contained therein is not subject
                                  to exemptions in the Act.
 Audit of the Kansas State Department of Education Management
Controls Over IDEA, Part B – Special Education Performance Data

                                 Table of Contents


                                                                                     Page
Executive Summary                                                                     1

Audit Results                                                                         4

      Finding No. 1 – KSDE IDEA, Part B Exit Data Reported to the Department
      Is Not Reliable                                                                 6

      Finding No. 2 – Monitoring System Being Developed Lacks Systematic
      On-Site Verification of IDEA, Part B Data                                       13

      Finding No. 3 – KSDE Needs To Strengthen Its Management Controls Over its
      IDEA, Part B Management Information System Function to Ensure Data Validity,
      Reliability, and Timeliness                                                     16

Background                                                                            20

Objectives, Scope and Methodology                                                     22

Statement on Management Controls                                                      25

Attachment A – IDEA, Part B Program Objectives, Performance Indicators and
      Performance Data                                                                26

Attachment B – KSDE’s Comments to the Report                                          27




______________________________________________________________________________
                              Control Number ED-OIG/A07-A0020

                                          Executive Summary


The Kansas State Department of Education (KSDE) in general has a well-designed Individuals
with Disabilities Education Act (IDEA), Part B1 centralized database system. KSDE needs to
implement additional internal controls to minimize data errors and irregularities. The integrity of
IDEA, Part B state-reported data is of particular importance because the U.S. Department of
Education (Department) relies on it to provide to Congress an objective and accurate measure of
the success of its special education programs, as required under the Government Performance
and Results Act (GPRA) of 1993. The Office of Special Education Programs (OSEP) within the
Department’s Office of Special Education and Rehabilitative Services (OSERS) administers
programs funded under IDEA, Part B. OSEP uses performance data reported by state
educational agencies (SEAs) in preparing the Department’s report to Congress on the outcomes
of IDEA, Part B programs.

For reporting outcomes under the Department’s 2001 Annual Plan, OSEP uses SEAs’
performance data for the following performance indicators:

    •    Earlier identification and intervention (intervention)
    •    Inclusive settings/regular education settings (placement)
    •    Graduation (exiting)
    •    Suspensions or expulsions (discipline)
    •    Qualified personnel (personnel)

KSDE is required by IDEA, Part B to submit this performance data to the Department.
Attachment A to this report shows the relationship between IDEA, Part B program objectives,
performance indicators and performance data.

Performance Indicator 4.7.c of the Department’s 1999 Performance Reports and 2001 Plans
states that all departmental program managers will assert that the data used for their program’s
performance measurement are valid, reliable, and timely, or will have plans for improvement.
Annually, Assistant Secretaries must provide the Office of the Under Secretary with a signed
formal attestation regarding this data.


1
 Part B of IDEA authorizes the Secretary of Education to provide grants to states to assist them in providing special
education and related services to children with disabilities.

In order to ensure that KSDE provides valid, reliable, and timely data that Department managers
can attest to, we found that KSDE needs to:

        comply with OSEP requirements for reporting IDEA, Part B exit data;

        strengthen system controls over its IDEA, Part B exit data to ensure that the data reported
        to the Department is valid, reliable, and timely;

        incorporate on-site verification of its IDEA, Part B data in its monitoring activities; and

        strengthen management controls over its IDEA, Part B management information system
        (MIS) function.

Our review of KSDE exit data disclosed numerous errors and irregularities, such as the inclusion
of students who did not meet OSEP’s age reporting requirements; duplication of reporting data
(duplicate exit reporting for the same student); inclusion of in-state transfers; inclusion of prior
year exits; and use of recycled student identification numbers.

Our review disclosed that KSDE needs a stronger local education agency (LEA) monitoring
function, and stronger management and system controls over its IDEA, Part B data. Specifically,
we found that KSDE lacked on-site monitoring procedures for testing the validity and reliability
of LEA special education data; an updated MIS procedures manual; a contingency plan for its
MIS function; and an independent review function for its IDEA, Part B data. Without a strong
internal control environment, KSDE management cannot assure that its IDEA, Part B data is
valid, reliable, and timely.

We recommend that the Assistant Secretary for Special Education and Rehabilitative Services:

        monitor KSDE’s compliance with OSEP’s exit-age reporting requirement;2

        require KSDE not to report multiple exits for students who transfer and subsequently exit
        from special education programs in other school districts or agencies;

        require KSDE to eliminate the option from its data dictionary that allows an LEA to defer
        reporting an exit to the following school year;

        require KSDE to strengthen its data-system controls, such as data verification and
        validation, for its exit performance data reported to the Department, e.g., for detecting
        duplicate exit data;

2
 OSEP instructs SEAs to report children ages 14 or older who exit special education programs. For age
determination purposes, OSEP requires that children counted on the report be 14 years old or older on December 1
of the school year in question.

       request KSDE to revise its edit checks to detect “recycled pseudo social security
       numbers” (i.e., reuse of student identification numbers);

       request KSDE to incorporate in its new LEA monitoring process a systematic on-site
       review of LEAs that includes appropriate testing of the IDEA, Part B performance data;

        require KSDE to direct its LEAs not to report as dropouts students who cease receiving
        special education and are later discovered to have moved to another school within or
        outside the district and are known to be continuing special education;

        request KSDE to develop and maintain updated written policies and procedures for
        collecting, reviewing, and reporting special education performance data;

       request KSDE to direct its LEAs to develop and maintain updated contingency plans for
       their MIS functions; and

       request KSDE to review the IEP formats in use in the state and ensure that they provide
       for consistent and accurate reporting of data by schools to the LEAs and by LEAs to the
       state.

KSDE generally agreed with our findings but disagreed with several recommendations. We have
summarized KSDE’s responses at the end of the respective finding to which each relates and
provided the full text of the responses as Attachment B.




                                              Audit Results


KSDE’s management controls do not provide assurance that the data submitted by KSDE to the
Department will meet all of the standards in the Department’s Data Quality Standards.3 The
Data Quality Standards contain six standards for evaluating the quality of reported data:

    •    Validity – Data adequately represent performance.
    •    Accurate Description – Definitions and counts are correct.
    •    Editing – Data are clean.
    •    Calculation – The math is right.
    •    Timeliness – Data are recent.
    •    Reporting – Full disclosure is made.

For each of the Data Quality Standards, the Department provided examples of conditions that
meet or fail to meet the standard. The Department also provided Data Quality Checklists for use
by primary data providers and secondary data managers. For school year 1998-99, KSDE
management controls over the collection and reporting of performance data for exiting did not
provide Department managers adequate assurance to attest to the validity, reliability, and
timeliness of its IDEA, Part B exit data.

All Kansas LEAs use KSDE’s system or have integrated KSDE’s database specifications in their
own systems for collecting, processing, and reporting their special education data. Twice a year
LEAs send their special education student files electronically to KSDE. Once KSDE receives
these files, it sends them to its centralized database system where it performs two tiers of what
KSDE refers to as validity and accuracy tests. The first tier tests data within each LEA data file,
and the second tests the data from each LEA file relative to statewide data within the entire
KSDE database. Once this process is completed, KSDE formats the data to comply with the
Office of Special Education Program’s (OSEP’s) requirements and submits its files to the
Department.
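As a rough illustration of that two-tier pattern, the checks might look something like the sketch below. This is purely hypothetical Python; the field names and the specific rules are our assumptions for illustration, not a description of KSDE's actual system:

```python
# Hypothetical sketch of a two-tier validation pass over LEA files.
# Field names ("student_id", etc.) are illustrative assumptions only.

def tier_one_checks(lea_records):
    """Tier 1: validate data within a single LEA file."""
    errors = []
    seen_ids = set()
    for rec in lea_records:
        sid = rec.get("student_id")
        if not sid:
            errors.append((rec, "missing student identifier"))
        elif sid in seen_ids:
            errors.append((rec, "duplicate identifier within LEA file"))
        seen_ids.add(sid)
    return errors

def tier_two_checks(lea_records, statewide_ids):
    """Tier 2: validate an LEA file against the statewide database."""
    errors = []
    for rec in lea_records:
        # A student already present statewide may be a duplicate exit
        # or an unreported in-state transfer.
        if rec.get("student_id") in statewide_ids:
            errors.append((rec, "identifier already present statewide"))
    return errors
```

The essential design point is the separation of concerns: within-file consistency first, then cross-file consistency against the statewide store.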

Our review of KSDE’s database records for the December 1, 1998, child count demonstrated that
its MIS personnel were diligent in applying data validations and checks in an effort to ensure the
accuracy of the data. KSDE needs to implement additional internal controls to minimize data
errors and irregularities. The integrity of IDEA, Part B state-reported data is of particular
importance because the Department relies on it to provide to Congress an objective and accurate
3
  The Department issued these standards, as part of the 1999 Performance Reports and 2001 Plan, to assist ED
managers as they collect, analyze, and report data about federal education programs. Program managers can use the
standards as a tool when monitoring grantees and evaluating the quality of the reported data and preparing
submissions for the GPRA annual report. The standards are the Department’s attempt to provide criteria against
which to evaluate grantees data quality.

measure of the success of its special education programs, as required under the Government
Performance and Results Act of 1993.

In order to ensure that data are valid, reliable, and timely, KSDE needs to (1) comply with OSEP
exit data reporting rules and strengthen its MIS controls over the IDEA, Part B exit data it reports
to the Department; (2) incorporate in its monitoring activities on-site verification and substantive
testing of its IDEA, Part B data; and (3) strengthen its management controls over its IDEA, Part
B MIS function.




Finding No. 1 – KSDE IDEA, Part B Exit4 Data Reported to the Department Is Not
Reliable

We found numerous errors and irregularities in KSDE exit data reported to the Department.
Specifically, we found that of the 7,382 children KSDE reported as having exited special
education programs during school year 1998-99: (1) 506 were not 14 years of age or older as of
December 1, which is contrary to OSEP requirements; and (2) 456 were erroneously counted
more than once. In addition, KSDE did not comply with OSEP rules in reporting exits
discovered after the June 30 reporting date nor did it perform sufficient database analysis to
identify and resolve numerous instances where the data fields used to determine inclusion on its
exit reports to OSEP contained internally inconsistent data.

The cumulative effect of KSDE’s misreporting was that Kansas’ special education programs
were actually performing better than the reported data indicated. The inflated numbers were
primarily in the dropout and transfer exit classifications rather than in graduation and program
completion.
At the same time, KSDE’s misreporting affected the quality of the Department’s IDEA, Part B
exit data.


Exit data reported to the Department included children age 13, which is contrary
to OSEP’s guidance and resulted in KSDE’s exit data being overstated
KSDE did not comply with OSEP rules5 when reporting on children who had exited special
education programs. OSEP instructs SEAs to report children ages 14 or older who exit special
education programs. For age determination purposes OSEP requires that children counted on the
report be 14 years old or older on December 1 of the school year in question. Compliance by all
states is required for comparability between exit data and child count data, as well as
comparability in the data reported by the various states.

For school year 1998-99, KSDE incorrectly used June 30, 1999, rather than December 1, 1998,
to determine the age of students in reporting its exit data to the Department. This departure from
OSEP rules resulted in 506 children, who were age 13, as of the December 1, 1998, child count,
being included in the statewide exit data.

4
  The Department requires states to report to OSEP an unduplicated count of all children with disabilities who exited
special education during the reporting year. OSEP’s instructions define exited students as children who (a) no
longer receive special education; (b) graduated with a regular high school diploma; (c) received a general
equivalency diploma; (d) reached maximum age; (e) died; (f) moved, but are known to be continuing in another
educational program; (g) moved, and are not known to be continuing in another educational program; or (h) dropped
out. The Department’s IDEA, Part B performance indicator 4.1 “Graduation” measures as its objective that “the
percentage of children with disabilities exiting school with a regular high school diploma will increase, and the
percentage who drop out will decrease.”
5
  OSEP IDEA, Part B Data Dictionary, September 1998.

Table 1.1 illustrates the number of exited students KSDE should have reported to the Department
had it adhered to OSEP’s age requirements.


                                          Table 1.1
                        Exit Count of Students Age 14 Years or Older
                         As of the School Year 1998-99 Child Count

 KSDE special education exits reported to the Department                              7,382
 Less: Exits age 13 years old as of the December 1 child count                          506
 Actual count of KSDE exits age 14 years or older as of the
 December 1 child count                                                               6,876
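The age rule at issue reduces to a date comparison. The sketch below (hypothetical Python, not KSDE code) shows how the same student can be 13 under OSEP's December 1 cutoff yet 14 under the June 30 date KSDE used:

```python
from datetime import date

def age_on(birth_date, as_of):
    """Whole-year age on a given date."""
    years = as_of.year - birth_date.year
    # Subtract one if the birthday hasn't yet occurred by `as_of`.
    if (as_of.month, as_of.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

# Illustrative birth date: a student born in March 1985 was 13 on
# December 1, 1998 (OSEP's cutoff) but 14 by June 30, 1999 (the
# cutoff KSDE incorrectly used).
birth = date(1985, 3, 15)
print(age_on(birth, date(1998, 12, 1)))   # 13 -> excluded under OSEP rules
print(age_on(birth, date(1999, 6, 30)))   # 14 -> incorrectly included by KSDE
```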


Exit data reported to the Department included instances of children who were
erroneously counted as having exited special education more than once
OSEP’s reporting instructions specify that states provide unduplicated exit data. Our database
analysis revealed that 456 of the 6,876 exit records for children age 14 years or older were
duplicates; some children were reported to the Department as exited as many as four times
during the 1998-99 school year. We found that 6,014 students were reported as exited once, 358
students twice, 46 students three times, and two students four times. Most of the multiple
reporting was due to in-state transfers. KSDE also reported two individuals as dropping out of
school twice, three individuals as graduating twice, and one who both dropped out and
graduated.

Table 1.2 illustrates the net effect this duplicate reporting had on the exit data KSDE reported to
the Department.

                                         Table 1.2
                    Unduplicated Exit Count of Children Ages 14 or Older
                        As of the School Year 1998-99 Child Count

 Actual KSDE exit count of students age 14 years or older as of the                  6,876
 December 1 child count
 Less duplicated exit data:
    • 358 students counted as exited twice (358 x 1)                        358
    • 46 students counted as exited three times (46 x 2)                     92
    • 2 students counted as exited four times (2 x 3)                         6
                                                                                       456
 Unduplicated KSDE exit count of children age 14 or older as of the
 December 1 child count date                                                         6,420
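The arithmetic in Table 1.2 follows from counting each student once rather than once per reported exit; a minimal sketch using the audit's own figures:

```python
# Number of students reported as exited 1, 2, 3, and 4 times (audit figures).
times_reported = {1: 6014, 2: 358, 3: 46, 4: 2}

total_records = sum(n * students for n, students in times_reported.items())
unique_students = sum(times_reported.values())
duplicate_records = total_records - unique_students

print(total_records)      # 6876 exit records reported
print(duplicate_records)  # 456 duplicate records
print(unique_students)    # 6420 unduplicated exits
```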



KSDE did not comply with OSEP rules in reporting exits discovered after the June
30 reporting date
Using KSDE’s 1998-99 data dictionary, the KSDE MIS database manager instructed LEAs to
choose between two reporting options for cases in which an exit is discovered after the reporting
period. One option was to report to KSDE corrected exit information for the reporting period
that just ended. The second option was to defer reporting the exit to the following school year, a
practice that the MIS database manager and the LEAs commonly referred to as “archiving.”
This latter option, which was inconsistent with OSEP rules, affected statewide data comparability.

Given the above situation, we were not able to quantify the exact number of deferred records.
However, we found that 253 of the 1998-99 exits were reported to KSDE during the month of
July and 1,091 during August 1998. This compares to an average of 406 exited students per
month from September 1998 through April 1999. The relatively large number of exits during
August might have been due to KSDE’s practice of archiving.


KSDE did not perform sufficient analysis to identify internally inconsistent data
Our database review revealed that KSDE did not perform sufficient analyses to determine
whether exit status students had appropriate status codes, unique student identifiers, and current
individualized educational programs (IEPs).

Comparison of exit and current status students revealed questionable status codes

KSDE reported exit data without performing sufficient analyses to identify possible
inconsistencies between exit status and current status records. We performed a comparison of
the reported exits and current status students and found several inconsistencies that indicated that
the exit status of some students might have been in error. For example:

 •   One hundred ninety-eight of the 4,826 exit status students (none of whom were in-state
     transfers) had current status records, indicating that the student had been misclassified as an
     exit, the student had re-entered the program, or the current status record was in error.

 •   Nine hundred thirty-four of the 1,594 in-state transfer students did not have current status
     records in the database, which indicated incorrect in-state transfer classifications, changes
     in student identifiers, or failures by the new LEAs to report the students.

Correction of these inconsistencies would result in additional students being added or removed
from the reported exits. Comparing current and exit status student information would help



identify and correct additional instances where LEAs had misreported student status information
or reused “pseudo social security numbers.”6
Comparison of student identification data revealed duplications

We found 75 instances of the same social security or “pseudo social security number” being
assigned to more than one student. Seven of the 75 instances were due to the assignment of the
same social security number to different students, and 66 of the remaining 68 were due to one
LEA reusing pseudo social security numbers that had been previously assigned to students who
had exited the program. While these duplications were due to errors at the LEA level, KSDE
could have discovered them had it compared exit and current status information as we did during
our review and had it conducted on-site LEA monitoring.

The practice of recycling pseudo social security numbers can significantly hinder KSDE’s ability
to provide valid, reliable, and timely IDEA, Part B performance data to the Department.
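An edit check for recycled identifiers could be as simple as comparing exit records against current-status records, as we did during our review. The sketch below is hypothetical (the field names are assumptions), not a description of KSDE's verification program:

```python
# Hypothetical edit check: flag any identifier that appears both on an
# exited record and on a current-status record for a different student.
# "pseudo_ssn" and "name" are assumed, illustrative field names.

def find_recycled_ids(exit_records, current_records):
    exited = {r["pseudo_ssn"]: r["name"] for r in exit_records}
    flagged = []
    for rec in current_records:
        ssn = rec["pseudo_ssn"]
        if ssn in exited and exited[ssn] != rec["name"]:
            flagged.append(ssn)
    return flagged
```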

Comparison of current and exit status records would have revealed potential errors in exit dates

IDEA, Part B requires that IEP meetings be held at least annually. This is reflected in KSDE’s
child count reporting, which excludes students whose IEP meeting dates are more than one year
old. This means that all exit dates should either be immediately after the expiration of the IEP,
for students who completed the objectives or withdrew from the program, or less than one year
old for other exits. Students continuing in the IDEA, Part B program should have IEP meeting
dates that are less than one year old.
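A check for stale IEP dates along those lines might be sketched as follows (hypothetical Python; the record layout is an assumption):

```python
from datetime import date, timedelta

ONE_YEAR = timedelta(days=365)

def flag_stale_ieps(records, period_end):
    """Flag current-status records whose last IEP meeting is over a year old.

    Hypothetical sketch: `records` are (student_id, iep_meeting_date) pairs.
    IDEA, Part B requires IEP meetings at least annually, so a current
    student's meeting date should be less than one year old.
    """
    return [sid for sid, meeting in records
            if meeting is None or period_end - meeting > ONE_YEAR]
```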

The Department’s Data Quality Standards provide that data are collected and reported in a
timely manner and that the activities being measured occurred or existed at the time for which
they are reported.

Our calculations of IEP expiration dates revealed several situations that raise questions about the
validity, reliability, and timeliness of the exit data reported to the Department. For example:

    •   We found 127 exit-status students, 14 years of age or older, whose IEP meeting dates were
        blank or were more than a year old as of June 30, 1998, the previous reporting period. In
        some instances, IEPs dated as far back as January 1993. Late reporting of exits can
        compound the data validity and reliability issue even further if any of these students were
        under the age of 14 as of December 1 of the year following their last IEP meeting dates.
        We found that 36, or about 28 percent, of the 127 students would not have been 14 years
        old had the exit been reported in the year that their IEP expired.

    •   Another 136 exit-status students whose IEPs had expired after June 30, 1998, recorded exit
        dates from 30 to 261 days after the IEP expiration. This indicates that the exit dates may be

6
 KSDE instructs its LEAs to assign “pseudo social security numbers” to students when it is not possible for the
LEA to obtain the student’s social security number.

     the dates that the district discovered the exit instead of the actual exit date, which might
     have been in the previous reporting period.

 •   Of the 701 reported graduates, 49 had IEP expiration dates that were from 30 to 205 days
     old, raising questions about whether these students withdrew from special education prior to
     graduation.

 •   Finally, 249 current status students, age 14 or older, had IEPs that expired prior to the end
     of the reporting period, which indicates that these students might have actually exited the
     program, causing the exit data to be understated.


Recommendations
We recommend that the Assistant Secretary for Special Education and Rehabilitative Services:

1.1. Monitor KSDE’s compliance with the OSEP requirement to use students’ ages as of the
     December 1 child count date in reporting exits.

1.2. Require KSDE not to report multiple exits for students who transfer and subsequently exit
     from special education programs in other school districts or agencies.

1.3. Require KSDE to eliminate the “archiving of exits” reporting option from its data
     dictionary.

1.4. Require KSDE to strengthen its data-system controls, such as data verification and
     validation, for its exit performance data reported to the Department, e.g., for detecting
     duplicate exit data, inaccurate student status information, and expired IEPs.

1.5. Request KSDE to revise its edit checks to detect student records containing recycled pseudo
     social security numbers.


KSDE Response and OIG Comments
Recommendation 1.1:

KSDE agreed with our finding but disagreed with our recommendation. KSDE stated that OSEP
and Westat, OSEP’s contractor, are aware of the fact that the state of Kansas, as well as other
states, use the end of the school year age for exit age determination rather than the student’s age
as of the December 1 child-count. KSDE further stated that both OSEP and Westat have
acknowledged that reporting students who exit age 14 or older as of the December 1, child-count
date is flawed. According to KSDE, OSEP is proposing changing the exit age reporting


requirement for students 14 years old and older as of their exit date. KSDE stated that they
would not take corrective action at this time.

We do not agree with KSDE. Using any date other than the one prescribed by OSEP would
result in inconsistent and unreliable national exit data. OSEP’s Table 4 “Report of Children With
Disabilities Exiting Special Education, 1998-99 School Year” instructs states to “Report the
number of students ages 14-21 that exited special education by age-year, disability condition,
and basis of exit.” For age determination purposes, OSEP’s data dictionary defines age as “…a
child’s actual age in years on the date of the child count: December 1 or the last Friday in
October of the current school year….” Until OSEP formally changes this requirement, KSDE
should use OSEP’s current published criteria when reporting exits.

Recommendation 1.2:

KSDE agreed with our finding but disagreed with our recommendation. KSDE acknowledged
that there are duplicate exit counts across school districts but that OSEP has not provided
instructions to states on how to eliminate duplicate exit counts, i.e., students who exit two or
more districts within a school year. KSDE stated that OSEP needs to issue a directive specifying
which exit takes precedence when a student has multiple exit dates within a school year and that,
until the directive occurs, KSDE will make no change.

During our audit it came to our attention that there is no formal definition for the term
“catchment area” for reporting exits in OSEP’s Table 4. We also found that OSEP and Westat
defined the term “catchment area” differently.

OSEP’s reporting instructions clearly specify that states must provide unduplicated exit data.
Counting all exit activity for each individual LEA, rather than for each individual student, does
not provide the Department with an “unduplicated exit data” count. KSDE’s unwillingness to
correct this weakness affects national exit data comparability and makes the Kansas data invalid
and unreliable. KSDE could reduce the likelihood of duplicated exit data by incorporating a
simple edit check into its system.


Recommendation 1.3:

KSDE generally agreed with our finding but not with our recommendation. KSDE stated that
while the practice of archiving is and always has been an option for districts in reporting their
exits, KSDE did not require nor recommend that school districts archive exited students. KSDE
stated that it instructs LEAs to report only those exits that have occurred within the designated
12-month reporting period. KSDE further stated that its data verification process includes an
edit check “e,” which flags exits outside the current reporting period and excludes these exits
from Table 4. KSDE stated that they have revised the section in the report preparation
instructions to clarify how to report unknown exits.


The revised data dictionary section dealing with unknown exits still needs to clarify that
archiving exits for OSEP reporting purposes is not an option and that OSEP allows states to
submit IDEA, Part B data corrections through September of each reporting period.

With respect to KSDE’s data verification process, KSDE’s edit checks cannot detect archived
exits nor can they exclude these exits from being reported in OSEP’s Table 4. An archived exit
does not reflect the actual student exit date but rather an artificially assigned exit date. Because
such artificial exit dates fall within the designated 12-month reporting period, KSDE’s edit
checks would not be able to detect them nor exclude them from the exit count. KSDE’s 1998-99
data dictionary instructed LEAs that “…the student may be archived until the next years EOY
[end-of-year] collection with an exit date of August 10….”

Recommendations 1.4 and 1.5:

KSDE generally agreed with our finding but disagreed with our recommendations. KSDE stated
that it does “…not view a student who exits at one point in the school year and then re-enters
special education at a later point in the school year, who has been indicated as an exit, as an error
in exit status….” KSDE recognized that “…due to possible human error, an entry error could
occur….” KSDE stated that it includes “…in its verification program a search for duplicate
social security numbers that allows the detection of a recycled pseudo social security number
within a district….” KSDE stated that “as all of these circumstances require human analysis and
human checks, no corrective action is needed.”

KSDE stated that “…due to the requirement that parents must be provided notice of a meeting
and are to attend if at all possible, many parents choose to not redo an IEP if they know their
child is graduating, dropping out, or transferring….” KSDE further stated that “…just due to
human circumstance, various IEP meeting dates are beyond the annual review date.” KSDE
stated that it reports exit data “…as soon as the LEAs are aware of it….” KSDE stated that
LEAs schedule annual review meetings within the one-year time frame but may not actually hold
the meeting due to various reasons. KSDE noted that this is “…indeed the case for
approximately 4% of special education students in Kansas….” KSDE stated that decisions made
by parents are not controllable by KSDE or by the LEAs. KSDE stated that “as all of these
circumstances require human analysis and human checks, no corrective action is needed.”

Although we agree with KSDE that some circumstances may require human analysis and human
checks, a simple addition to KSDE’s current edit checks, one that identifies exits with current
status, would minimize the risk of including duplicated exit data in the information KSDE
reports to the Department. With regard to reporting exited students with expired IEPs, KSDE’s
own data dictionary clearly instructs LEAs to report only the active IEP currently in force for the
student.
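An edit check identifying exits with current status could be sketched as follows. This is an illustrative fragment, not KSDE's system; it assumes the exit file and the current child count can both be reduced to lists of student identifiers:

```python
def exits_with_current_status(exit_ids, active_ids):
    """Identify students reported as exits who also appear with current
    (active) status in the same reporting period -- likely duplicates.

    Both arguments are iterables of student identifiers; in practice
    they would be drawn from the exit file and the current child count.
    """
    return sorted(set(exit_ids) & set(active_ids))
```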




Finding No. 2 – Monitoring System Being Developed Lacks Systematic On-Site
Verification of IDEA, Part B Data


KSDE did not incorporate substantive on-site testing of IDEA, Part B data reported to the
Department in its LEA review activities. As previously discussed under Finding 1, during our
LEA review we found numerous errors and irregularities, such as the inclusion of in-state
transfers as exits, and invalid and unreliable reporting of social security numbers, that could
have been corrected had they been detected by substantive on-site testing.

Specifically, we found invalid and unreliable data in the following areas:

       Exit data. At two of the three LEAs reviewed, Salina and Pittsburg, we found that one
       out of seven and three out of nine student files reviewed in our dropout samples,
       respectively, were misclassified. The LEAs had erroneously classified these students as
       dropouts, when in fact they had moved and were known to be continuing special
       education. At all three LEAs reviewed, we found numerous instances of data errors and
       irregularities such as inadequate documentation supporting exit classifications; exits
       reported in the wrong year; and errors in exit dates, dates of birth, and social security
       numbers.

       Placement data. We found that all three of the LEAs we reviewed exhibited instances
       of inconsistent placement data. At the Salina and Hiawatha LEAs, 6 of the 50 and 20 of
       the 50 student files we reviewed, respectively, showed inconsistencies in the amount of
       time students spent receiving special education services. At the Pittsburg LEA, we
       found that 2 of the 50 student files we reviewed showed unreliable information
       regarding special education settings, and 18 of the 50 student files we reviewed showed
       inconsistencies in the amount of time students spent receiving special education
       services. We also noted that placement data were inconsistently reported within one
       LEA.

       Discipline data. Two of the three LEAs reviewed showed inconsistent discipline data.
       For example, at the Pittsburg LEA, KSDE reported only one long term suspension,
while the LEA’s supporting documentation showed three. At the Salina LEA, we also
found that KSDE misreported one of two suspensions and failed to report the other.

We concluded that had KSDE incorporated data accuracy tests of its IDEA, Part B data into its
protocol for LEA on-site reviews, these errors and irregularities could have been detected and
corrected. We also concluded that the presence of these tests may increase LEAs’ perception of
KSDE’s commitment to data validity, reliability, and timeliness and might identify areas where
LEAs need further clarification or training.



KSDE replaced its five-year systematic on-site LEA compliance review process with a control
self-assessment system. This new process, piloted in school year 1999-2000, provided for LEA
on-site compliance reviews on an “as needed basis” rather than on a periodic basis as required
by Kansas State Regulations for Special Education. Section 91-40-51 (b) of the Kansas State
Regulations for Special Education, dated May 19, 2000, states that on-site compliance reviews
“shall be conducted periodically by the special education section of the department.”

KSDE officials stated that the new monitoring system was being developed in response to
OSEP’s new continuous improvement monitoring policies, which required state control self-
assessments of LEAs. We reviewed OSEP’s Continuous Improvement Monitoring Process
2000-2001 Monitoring Manual and we did not find any statement in the manual directing SEAs
to adopt the control self-assessment process or a similar process as their new monitoring
approach. Furthermore, at the inception of our audit, KSDE’s monitoring staff comprised four
members (one of them a recent hire). During the course of our audit, the most senior
compliance staff member retired, leaving the team with fewer experienced personnel. We also
noted that the prior LEA compliance reports were dated beyond the State’s five-year cycle. In
light of these conditions, KSDE needs to assess whether its current staffing structure is adequate
to meet the demands of the five-year compliance review cycle.

KSDE monitoring staff stated that, in addition to their compliance reviews, OSEP performs
compliance reviews of selected SEAs. We reviewed OSEP’s August 13, 1996, monitoring
report of its on-site review of the Kansas State Board of Education’s implementation of IDEA,
Part B. We found that OSEP’s review covered, for the most part, tests of KSDE’s compliance
with laws and regulations but no substantive testing of the special education data reported by
the LEAs.

Although KSDE’s new control self-assessment process may be a proactive step towards
working in partnership with the LEAs to improve their internal control structures, a control self-
assessment is not a replacement for the assurances provided by on-site monitoring reviews.
The lack of a systematic on-site monitoring approach (i.e., reviewing LEAs only on an “as
needed basis”) increases the risk that data errors and irregularities could go undetected.


Recommendations
We recommend that the Assistant Secretary for Special Education and Rehabilitative Services:

2.1. Request KSDE to incorporate in its new LEA monitoring process a systematic on-site
     review of LEAs by KSDE personnel that includes substantive testing of the child count,
     placement, personnel, exit, and discipline data reported to the Department.

2.2. Require KSDE to direct its LEAs to accurately report dropout students and not include
     individuals who cease receiving special education and are later discovered to have moved
     to another school within or outside the district and are known to be continuing special
     education.


KSDE Response
KSDE generally agreed with our finding but disagreed with our recommendations. KSDE’s
response asserted that systematic on-site verification is a major facet of Kansas’ continuous
improvement monitoring process.


OIG Comment
KSDE did not provide support for the above assertion. Its response did not state that the
continuous improvement monitoring process provides for systematic on-site review of LEAs by
KSDE personnel. In our discussion of this finding as well as our recommendation, we refer to
“systematic on-site verification” to mean systematic on-site review of LEAs by KSDE personnel
that includes substantive testing of the child count, placement, personnel, exit, and discipline
data reported to the Department.




Finding No. 3 – KSDE Needs to Strengthen its Management Controls Over its
IDEA, Part B Management Information System Function to Ensure Data Validity,
Reliability, and Timeliness


Our review disclosed that additional management controls are needed for KSDE management to
minimize the risk of data vulnerabilities due to errors and irregularities. These controls will also
help to ensure that the Department will meet its Data Quality Standards. Specifically, we found
that KSDE management lacked an updated MIS procedures manual; a contingency plan for its
MIS function; and an independent review function with respect to the IDEA, Part B data it
reports to the Department. In addition, we found that KSDE needs to strengthen its controls over
its data processing activities and establish policies and procedures that ensure appropriate
segregation of duties.


Need for Updated MIS Operating Policies and Procedures
According to KSDE officials, the IDEA, Part B MIS policy and procedures manual has not been
updated since at least 1995. Because the MIS database has been upgraded since then, the
existing manual is obsolete.

Standards for Internal Control in the Federal Government7 provides that all transactions and
other significant events be clearly documented and readily available for examination. The
Federal Internal Control Standards require that this documentation should appear in management
directives, administrative policies, or operating manuals. Furthermore, the Federal Internal
Control Standards state that transactions should be promptly recorded to maintain their relevance
and value to management in controlling operations and making decisions.

KSDE management will not have reasonable assurance of data validity, reliability, and
timeliness until it updates, formally documents, and clearly communicates its IDEA, Part B data
management objectives, policies, and procedures to the MIS database manager.


Need for Contingency Planning
KSDE officials acknowledged that they did not have a formal, written contingency plan for the
IDEA, Part B MIS function. KSDE officials told us that they believed that one of their data
coordinators, the person responsible for compiling the state’s personnel records, could assume
the MIS database manager responsibilities if the MIS database manager was not available.

7
 The General Accounting Office issued the Standards for Internal Control in the Federal Government
(GAO/AIMD-00-21.3.1) in November 1999, hereinafter referred to as the “Federal Internal Control Standards.”

Furthermore, we noted that the three LEAs we reviewed also lacked formal written contingency
plans.

Federal Internal Control Standards provide that management establish a positive control
environment in which, among other things, it plans and ensures continuity of needed skills and
abilities, and ensures that data centers and client-server operation controls include contingency
and disaster planning.

The absence of formal, written contingency plans, combined with the absence of an updated MIS
procedures manual, increases the risk that erroneous data will go undetected and uncorrected
when changes, including personnel changes, affect program operations.


Need for Independent Review of Data Submitted to the Department
We found that no one other than the MIS database manager reviewed the IDEA, Part B data that
the KSDE submitted to the Department. KSDE’s MIS has computerized edit checks built into
the system to review the format, existence, and reasonableness of the data; but human input
and/or changes to this data, due to edit checks or analytical reviews, can result in data errors.
Furthermore, our review of the KSDE MIS database disclosed that the computerized built-in edit
checks did not detect the significant number of errors and irregularities we discussed in Finding
1 of this report.

Similarly, we found that at the three LEAs the database managers also submitted their IDEA,
Part B data without independent-party review.

Federal Internal Control Standards provide that management perform reviews at each functional
or activity level. Furthermore, the Department’s Data Quality Standards provide that a different
person, who is familiar with the data, systematically review the data.

Without an independent review function, KSDE management and users of the information do not
have reasonable assurance of the validity, reliability, and timeliness of its IDEA, Part B data
submitted to the Department.


Need for Additional Controls Over Data Processing Activities
The KSDE MIS database manager has incorporated into the IDEA, Part B data system a number
of systematic controls that enhance data validity, reliability, and timeliness. Nevertheless, our
review of the database disclosed that its computerized built-in edit checks did not always detect
errors and irregularities, such as recycled pseudo social security numbers, duplicate exit data, and
child counts with expired IEPs. We also noted that KSDE did not require LEAs to use a
standardized IEP data collection form. This practice may have also contributed to some of the
data errors and irregularities we discussed in Finding 1 of this report.
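A statewide extension of KSDE's within-district duplicate search could catch recycled pseudo social security numbers across districts. The sketch below is illustrative; it assumes records from all districts can be pooled as simple SSN/date-of-birth pairs:

```python
from collections import defaultdict

def recycled_ssns(records):
    """Find SSNs attached to more than one distinct date of birth.

    `records` is an iterable of (ssn, dob) pairs pooled from every
    district; a pseudo number reused for a different child surfaces as
    one SSN mapped to two or more birth dates.
    """
    dobs_by_ssn = defaultdict(set)
    for ssn, dob in records:
        dobs_by_ssn[ssn].add(dob)
    return {ssn for ssn, dobs in dobs_by_ssn.items() if len(dobs) > 1}
```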

Need for Policies and Practices Regarding Segregation of Duties
During a KSDE MIS demonstration session, we observed that their MIS database manager made
corrections and changes to student data rejected by the system’s built-in edit checks. We also
noted that the MIS database manager corrected some duplicate data by changing them to exits
without prior LEA consultation or review of LEA supporting documentation. This action
constituted a breach of the internal control principle of segregation of duties. Without proper
segregation of duties and functions, there was no way to distinguish the data the LEA submitted
to KSDE from the changes the database manager made to that data.

Federal Internal Control Standards provide that key duties and responsibilities need to be divided
or segregated among different people to reduce the risk of error or fraud. This should include
separating the responsibilities for authorizing transactions, processing, recording and reviewing
transactions, and handling any related assets. The Standards further state that no one individual
should control all key aspects of a transaction or event. Also, the Department’s Data Quality
Standards provide that data errors be traced back to the original source and that mistakes be
corrected, a responsibility that in KSDE’s case falls to MIS personnel.

Without proper segregation of data inputting and verification duties, there is an increased risk
that data manipulation may go undetected, and as such, KSDE management and users of the
information do not have reasonable assurance that the data reported to the Department are valid
and reliable.
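One common way to preserve the distinction between submitted data and later corrections is a change log that retains the LEA's original value alongside every edit. The sketch below is illustrative only; KSDE's MIS is not built this way, and the class and field names are assumptions:

```python
from datetime import datetime

class AuditedTable:
    """Minimal sketch of a change log preserving LEA-submitted values.

    Every correction records who changed which record, from what value
    to what value, and when, so a reviewer can always distinguish the
    LEA's original submission from later edits by MIS staff.
    """

    def __init__(self):
        self.current = {}  # record_id -> value as it now stands
        self.log = []      # (timestamp, user, record_id, old_value, new_value)

    def submit(self, record_id, value, user="LEA"):
        self._change(record_id, value, user)

    def correct(self, record_id, value, user):
        self._change(record_id, value, user)

    def _change(self, record_id, new_value, user):
        old_value = self.current.get(record_id)
        self.log.append((datetime.now(), user, record_id, old_value, new_value))
        self.current[record_id] = new_value
```

With such a log, a reviewer can reconstruct exactly what each LEA submitted and which values the database manager changed.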


Recommendations
We recommend that the Assistant Secretary for Special Education and Rehabilitative Services:

3.1 Request KSDE to develop and maintain updated written policies and procedures for
    collecting, reviewing, and reporting special education performance data to include: (1) a
    formal contingency plan for the MIS function, (2) procedures requiring independent
    reviews of all data submitted to the Department, and (3) policies and procedures that ensure
    segregation of data inputting and editing functions.

3.2 Request KSDE to require LEAs to develop and maintain updated contingency plans for
    their MIS functions.

3.3 Request KSDE to review the IEP formats in use in the state and ensure that they provide for
    consistent and accurate reporting of data by schools to the LEAs and by LEAs to the state.




KSDE Response
KSDE generally agreed with our recommendations but disagreed with our recommendation that
it require LEAs to use standardized IEP forms. The KSDE response noted that its data
dictionary already provides for consistency in the data submitted by LEAs.


OIG Comment
While KSDE’s data dictionary provides for data consistency, we found that the IEP forms in our
sample were not formatted in ways that would assure consistency in the data reported to
the state. Specifically, some IEP forms only provided for the reporting of special education
service time in terms of minutes and did not indicate the number of days and weeks over which
this service time occurred, as required by the data dictionary. Another problem is that the
schools in some LEAs each had their own IEP formats, which would increase the risk that the
LEA data entry person would make errors. We modified our recommendation to more
specifically address the problems we identified.




                                         Background

The Government Performance and Results Act (GPRA), enacted in 1993, provides for the
establishment of strategic planning and performance measurement in the Federal Government.
Congress intended the Act to:

       Help federal managers improve service delivery by requiring that they plan for meeting
       program objectives and by providing them with information about program results and
       service quality;

       Improve congressional decision making by providing more objective information on
       achieving statutory objectives, and on the relative effectiveness and efficiency of federal
       programs and spending; and

       Improve internal management of the Federal Government.

GPRA requires that the head of each agency submit to the Director of the Office of Management
and Budget and to Congress a five-year strategic plan for program activities and a performance
plan for each fiscal year covered under the plan. The performance plans must establish
performance indicators to be used in measuring or assessing the relevant outputs and outcomes
of each program activity.

The Department of Education published its Strategic Plan 1998-2002 in September 1997. The
Department’s 1999 Performance Reports and 2001 Plans were submitted to Congress in March
2000. The 2001 Annual Plan contained nine performance indicators for the IDEA, Part B –
Special Education Program. The Department relies on state-reported data for measuring the
performance of six of the nine indicators listed in the plan, i.e., inclusive settings, earlier
identification and intervention, regular education settings, suspensions or expulsions, graduation,
and qualified personnel. Attachment A to this report shows the relationship between the IDEA,
Part B program objectives, performance indicators, and performance data.

Performance Indicator 4.7.c of the Department’s 1999 Performance Reports and 2001 Plans
states that all departmental program managers will assert that the data used for their program’s
performance measurement are reliable, valid, and timely, or will have plans for improvement.
Annually, the Assistant Secretaries must provide the Office of the Under Secretary with a signed
formal attestation covering their data. The Department developed Data Quality Standards to
assist departmental managers as they collect, analyze, and report data about federal programs.
For the IDEA, Part B special education programs, the data used for measuring performance
included data reported by the individual states.



KSDE is responsible for collecting, processing, and reporting to the Department the IDEA,
Part B – Special Education data for the State of Kansas. The State of Kansas
has 81 LEAs and received over $44 million of IDEA, Part B funds for the 1999-00 award year.
OSEP reported that on December 1, 1998, the State had a total of 58,425 children receiving
special education services.




                            Objectives, Scope, and Methodology

Our objectives were to: (1) identify the process used by KSDE to accumulate and report
performance data to the Department, (2) determine whether KSDE’s management controls
ensured that performance data are reliable, and (3) identify barriers or obstacles, if any, that may
impact KSDE’s ability to provide quality performance data. The audit was limited to state-
reported data used by OSEP to report on program objectives and outcomes as required by
GPRA.

Our review covered the state-reported 1998-99 school year IDEA, Part B data for six of the nine
Department of Education performance indicators:

         Inclusive settings (Indicator 1.1);
         Earlier identification and intervention (Indicator 2.1);
         Regular education settings (Indicator 3.1);
         Suspensions or expulsions (Indicator 3.3);
         Graduation (Indicator 4.1); and
         Qualified personnel (Indicator 5.1).

As of December 1, 1998, 52 of the 81 LEAs in Kansas were unified school districts and/or
cooperatives serving over 90 percent of the children reported under the IDEA, Part B program.
To ensure we evaluated procedures at LEAs with varying age populations of children with
disabilities, we focused our review on unified school districts and cooperatives, which included
schools covering preschools and grades K through 12. To ensure we evaluated procedures at
LEAs with varying database systems, we grouped the 52 LEAs based on their type of database
system.8 Finally, to ensure we evaluated procedures at LEAs with varying student populations,
we grouped the LEAs by their reported child count.
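The cluster-based selection described above can be sketched as follows (illustrative Python; the 'cluster' and 'name' keys are assumptions about how the LEA list might be represented, and the optional seed simply makes a run reproducible):

```python
import random
from collections import defaultdict

def select_one_per_cluster(leas, seed=None):
    """Randomly select one LEA from each cluster.

    Mirrors the sampling design described above: clusters are defined
    by database type and child-count range.
    """
    rng = random.Random(seed)
    clusters = defaultdict(list)
    for lea in leas:
        clusters[lea["cluster"]].append(lea)
    # One random pick per cluster ensures each database type and
    # child-count range is represented in the sample.
    return {c: rng.choice(members) for c, members in sorted(clusters.items())}
```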

As shown in Table 4.1, we randomly selected an LEA from each group, as follows:

                                        Table 4.1
                     KSDE LEA Population Subject to Our Sample Selection
                            By Database Type and Child Count

     Clusters     Database Type (see footnote 8 below)               Child Count
     1            Custom LEAs (5) (see footnote 9)                   2,029-6,017
     2            File Maker Pro LEAs (18)                           935-2,206
     3            File Maker Pro LEAs (28)                           301-732

8
  Kansas LEAs collect, process, and report IDEA, Part B data using one of two database systems: the State-provided
system, “File Maker Pro,” or “custom” database systems developed by some of the larger LEAs for their own use,
while incorporating State specifications.
9
  We excluded one of six KSDE “custom” database LEAs from our sampling universe because it was not within the
child count parameters of cluster number one.
To identify the process KSDE uses in accumulating and reporting performance data to the
Department, we interviewed KSDE officials and staff responsible for collecting, processing, and
reporting the performance data to OSEP. For the three LEAs selected for our review, we
interviewed LEA officials to gain an understanding of the procedures that the LEAs used in
collecting, processing, and reporting the IDEA, Part B supporting data to KSDE.

To determine whether KSDE management controls ensure performance data are valid, reliable,
and timely, we: (1) reviewed and tested school year 1998-99 IDEA, Part B MIS database
controls; (2) selected a random sample of student and personnel records from KSDE’s IDEA,
Part B database; and (3) verified the validity and reliability of the data reported to the
Department by comparing this data to supporting school and student records.
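The comparison in step (3) amounts to a field-by-field match of each sampled record against its supporting documents, which can be sketched as follows (field names illustrative, not KSDE's actual schema):

```python
def compare_records(reported, source, fields):
    """Compare a sampled database record with its supporting school
    record field by field, returning {field: (reported, source)} for
    every discrepancy found.
    """
    return {f: (reported.get(f), source.get(f))
            for f in fields
            if reported.get(f) != source.get(f)}
```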

Table 4.2 shows the number of students reported to the Department and the number of student
records randomly selected for each of the LEAs we visited.

                                        Table 4.2
       Total Child Count and Number of Student Records Reviewed for Selected LEAs

                    LEA          Total Students Reported    Student Records Reviewed
                  Salina                  2,029                        106
                 Pittsburg                1,430                         98
                 Hiawatha                   306                         63

Table 4.3 shows a breakdown, by population and sample size, of the student and school data sets
reviewed at each of the selected LEAs. These data sets support the six IDEA performance
indicators that rely on state-reported data: inclusive settings (indicator 1.1), earlier identification
and intervention (indicator 2.1), regular education settings (indicator 3.1), suspensions or
expulsions (indicator 3.3), graduation (indicator 4.1), and qualified personnel (indicator 5.1).

                                       Table 4.3
  Total Numbers of Students and Personnel, and Numbers of Student and Personnel Records
                   Reviewed, Broken Down by Performance Indicator

              Child Count &          Graduation and        Discipline          Personnel
              Placement              Dropout               (Indicator 3.3)     (Indicator 5.1)
              (Indicators 1.1,       (Indicator 4.1)
              2.1, 3.1)
  LEA         Reported   Reviewed    Reported   Reviewed   Reported  Reviewed  Reported  Reviewed
  Salina        2,029       50          91         20         36        36       184        40
  Pittsburg     1,430       50          96         20         28        28       135        36
  Hiawatha        306       50          13         13          0         0        32        32

  Note: “Reported” columns show total students reported (for Personnel, total certified
  teachers reported); “Reviewed” columns show the number of student or personnel files
  reviewed.

In addition, we reviewed the state of Kansas’ fiscal years 1997, 1998, and 1999 single audit
reports and KSDE’s most recent on-site compliance review reports for the three selected LEAs.

We performed fieldwork at the KSDE in Topeka, special education offices in Salina, Pittsburg,
and Hiawatha, and at our offices in Kansas City, Missouri. Our fieldwork was conducted from
June 27 to December 15, 2000. Our audit was performed in accordance with generally accepted
government auditing standards appropriate to the scope of review.




                        Statement on Management Controls

For our review, we assessed KSDE’s management controls, policies, procedures, and practices
applicable to KSDE’s process for collecting and reporting performance data for the IDEA, Part B
program as required by GPRA. Our assessment was performed to determine whether the
processes used by KSDE and the reviewed LEAs provided a reasonable level of assurance that
KSDE reported reliable performance data to OSEP.

For the purpose of this report, we classified KSDE’s controls into the following categories:

       Guidance and technical assistance,
       Collection of data from LEAs,
       Data compilation and report preparation, and
       Monitoring LEA data collection and reporting processes.

Because of inherent limitations, a study and evaluation made for the limited purpose described
above would not necessarily disclose all material weaknesses in the management controls.
However, our assessment disclosed management control weaknesses, which could adversely
affect KSDE’s ability to report accurate performance data for GPRA. These weaknesses are
discussed in the Audit Results section of this report.




                                                                                                                                                               Attachment A
                           IDEA, Part B Program Objectives, Performance Indicators and Performance Data
                                                      FY 2001 Annual Plan1

     PROGRAM OBJECTIVE / PERFORMANCE INDICATOR / PERFORMANCE DATA COLLECTED FROM OSEP FORMS

     Objective: All preschool children with disabilities receive services that prepare them to
     enter school ready to learn.
         Indicator 1.1 Inclusive settings. The percentage of preschool children with
         disabilities who are receiving special education and related services in inclusive
         settings will increase.
         Performance data: SEAs report the number of students ages 3-5 by age and educational
         placement.

     Objective: All children who would typically be identified as being eligible for special
     education at age 8 or older and who are experiencing early reading or behavioral
     difficulties receive appropriate services earlier to avoid falling behind their peers.
         Indicator 2.1 Earlier identification and intervention. The percentage of children
         served under IDEA ages 6 or 7, compared to ages 6-21, will increase.
         Performance data: SEAs report the number of disabled children receiving special
         education by disability and age, and by disability and ethnicity.

     Objective: All children with disabilities have access to the general curriculum and
     assessments, with appropriate accommodations, support and services, consistent with high
     standards.
         Indicator 3.1 Regular education settings. The percentage of children with disabilities
         ages 6-21 who are reported by states as being served in the regular education classroom
         at least 80 percent of the day will increase.
         Performance data: SEAs report the number of students ages 6-21, by age category,
         disability and placement.
         Indicator 3.3 Suspensions or expulsions. The percentage of children with disabilities
         who are subject to long-term suspension or expulsion, unilateral change in placement or
         change in placement if their current placement is likely to result in injury to
         someone, will decrease.
         Performance data: SEAs report the number of students suspended or expelled,
         unilaterally removed or removed based on a hearing, by disability and basis of removal,
         and by ethnicity and basis of removal.
    Secondary school students with               4.1 Graduation. The percentage of children with disabilities exiting         SEAs report the number of students ages 14-21
    disabilities receive the support they need       school with a regular diploma will increase and the percentage who       that exited special education by:
    to complete high school and prepare for          drop out will decrease.                                                            age, disability and basis of exit,
    postsecondary education or employment.                                                                                              age and basis of exit and
                                                                                                                                        ethnicity and basis of exit
    States are addressing their needs for        5.1 Qualified personnel. The number of states and outlying areas             SEAs report the number and type of teachers and
    professional development consistent              where at least 90 percent of special education teachers are fully        other personnel to provide special education and
    with their comprehensive system of               certified will increase.                                                 related services for children ages 3-21. SEAs must
    personnel development.                                                                                                    report the number of staff:
                                                                                                                                        fully certified and
                                                                                                                                        not fully certified


1
    Source: ED-OIG/A09-A0001

                                                                                Attachment B


                        KSDE’s Comments to the Report
