
United States General Accounting Office

GAO Report to the Chairman, Subcommittee on Oversight, Committee on Ways and Means, House of Representatives

January 1997

TAX SYSTEMS MODERNIZATION: IRS Needs to Resolve Certain Issues With Its Integrated Case Processing System

GAO/GGD/AIMD-97-31

United States General Accounting Office
Washington, D.C. 20548

General Government Division

B-265969

January 17, 1997

                   The Honorable Nancy L. Johnson
                   Chairman, Subcommittee on Oversight
                   Committee on Ways and Means
                   House of Representatives

                   Dear Chairman Johnson:

                   Improving service to taxpayers is one of the goals the Internal Revenue
                   Service (IRS) hopes to achieve by restructuring its organization through tax
                   systems modernization (TSM). To guide its efforts to improve customer
                   service, IRS developed its “Customer Service Vision,” which is a key part of
                   its overall business vision for its future operations. The Customer Service
                   Vision describes how IRS proposes to meet taxpayers’ needs in the future.
                   IRS’ plans for achieving this vision include a long process of consolidating
                   work units, retraining employees, and developing new information
                   systems.

                   Integral to reaching this vision is IRS having the capabilities to quickly
                   obtain the data needed to answer taxpayer questions and resolve a variety
                   of taxpayer problems. IRS’ Integrated Case Processing (ICP) System is one
                   of the key information systems being developed and deployed to help
                   provide these capabilities. With ICP and other new systems, IRS envisions
                   that by 2001 employees will have the capability to resolve taxpayer issues
during a single telephone conversation 95 percent of the time. This report
                   responds to your request that we review IRS’ ICP systems development
                   effort. Specifically, we (1) evaluated IRS’ assessment of ICP costs and
                   benefits and obtained users’ perceptions on the system’s benefits,
                   (2) analyzed IRS’ testing of ICP, (3) assessed IRS’ ongoing efforts to redesign
                   its customer service work processes to fully use ICP capabilities, and
                   (4) assessed the software development processes being used for ICP.


Results in Brief

Improving service to taxpayers is an important goal that IRS' Customer Service Vision shows promise in addressing. ICP, if successful, could improve upon IRS' existing systems by providing IRS employees access to more information and automated tools. However, the promise anticipated by the vision is unlikely to be fulfilled unless changes are made in the development and deployment of ICP. Our review disclosed that IRS has invested millions of dollars in ICP, but unresolved issues with the costs and benefits of ICP, the testing of ICP, the redesign of work processes, and software development weaknesses raise serious concerns about IRS' capability to successfully develop and deploy ICP.

IRS estimated that about $150 million was spent on ICP from 1993 to 1995
and that an additional $77 million will be spent through 1996. Overall, IRS
plans to spend about $641 million on ICP through fiscal year 2000. Despite
this sizable investment, costs and benefits remain uncertain because
(1) the scheduled rollout of ICP workstations continues to change, (2) the
ICP capabilities have not been finalized, (3) certain benefits are still to be
determined, and (4) the software is still being developed. Although IRS
managers and customer service staffs said that ICP is an improvement over
existing systems, IRS has not taken the steps needed to ensure that systems
solutions it has designed are likely to achieve the goal of improved
customer service or that ICP is a cost-effective modernization effort that
should be supported with continued funding.

IRS planned that certain ICP capabilities being developed would be pilot
tested beginning on September 30, 1996. Yet, in a memorandum dated
July 31, 1996, the Associate Commissioner for Modernization postponed
the pilot test indefinitely. In the meantime, IRS has hired a contractor to
perform a risk assessment of the entire ICP effort. In an interim report,
dated August 21, 1996, the contractor recommended that the pilot test be
delayed at least 3 months, citing problems with software requirements
definition, testing, and security.

IRS developed and initiated a limited deployment of the initial ICP version.
Testing of this initial version was limited to a 4-week nonpeak period at
one site. The test results provided little insight on the potential benefits of
the system, because IRS did not adequately measure ICP’s impact on
business operations. In some areas, IRS had no baseline measures for
comparison of the status quo. In other areas, IRS could not isolate the
impact of ICP from that of other changes in work processes. IRS officials
recognized the limitations of the testing and told us that testing of the next
software release would be more comprehensive.

Also, it is unclear how this and future versions will support new work
processes that are being designed. According to IRS’ Customer Service
Vision, ICP was expected to be the vehicle to provide customer service
representatives (CSR) with access to information that would enable IRS to
combine a phase of the tax collection process with customer service.
However, IRS is now reconsidering the extent to which the collection
process can be combined with customer service and is reconsidering the range of tasks a customer service representative can be expected to perform. Modifications to ICP may be required as the roles and responsibilities of CSRs continue to evolve.

                           The software development processes in place at IRS organizations
                           responsible for developing ICP software are extremely weak, making the
                           likelihood of their producing quality ICP software on time and within
                           budget very low. According to nationally recognized standards for
                           software development, organizations must have defined processes in five
                           key areas. The three IRS organizations developing ICP software failed to
                           fully meet the standards in any of these areas.


Background

As we and others have reported, taxpayers often have problems obtaining the information they need from IRS to file their tax returns and resolve problems with their accounts. Not only do taxpayers have difficulty in reaching IRS by telephone, but once a taxpayer reaches a CSR, that CSR does not always have easy access to the information needed to resolve the taxpayer's problems.[1]

                           One of TSM’s major goals is quick and easy access to the data needed by
                           CSRs and other employees to provide better customer service and improve
                           voluntary compliance. Several systems are being developed or are planned
                           to address IRS’ critical data needs. IRS considers ICP to be one of the most
                           important of these undertakings.


CSRs Need Better Access to Taxpayer Account Data

Information on taxpayers and their accounts is contained in a variety of IRS databases. Until 1995, information on IRS' primary database of taxpayer account data used for assisting taxpayers—the Integrated Data Retrieval System (IDRS)—was stored at the service center where the taxpayers filed their returns and could be accessed by employees at that service center, connected district offices, and customer service sites. If taxpayers called a service center other than the one at which their returns were filed, the CSR would be unable to answer questions about their accounts. Either the taxpayers were told to call a different service center, or their questions would be written down and referred to the appropriate service center for resolution.




[1] Tax Administration: IRS Faces Challenges in Reorganizing for Customer Service (GAO/GGD-96-3, October 10, 1995).




Early in 1995, IRS implemented a networking capability among the service
centers, district offices, and customer service sites, so that employees
could have access to IDRS data nationwide. This networking capability is
referred to as Universal IDRS; however, this is only a partial solution to IRS’
data accessibility problems. Although Universal IDRS gives IRS employees
access to taxpayer account information nationwide, it does not always
provide complete information on a taxpayer’s account. Other information
needed to help the taxpayer may be contained in different systems that are
not linked to IDRS.[2]

Generally, the CSR must access each of the different systems
independently. For example, an IRS employee using IDRS will know that a
taxpayer was sent a notice of underreported income but would not have
access to the actual notice. That notice is contained in IRS’ Automated
Underreporter System (AUR). AUR would provide additional information,
such as the amount of unreported income and information from the tax
return that may indicate, for example, the amount of dividend or interest
reported by financial institutions but not by the taxpayer.

To obtain these data, the IRS employee must be able to access the AUR
database using a different computer terminal. However, the employee may
not have access capability. As a result, the employee would have to either
(1) refer the taxpayer to another office, (2) research the problem and
return the taxpayer’s call, or (3) tell the taxpayer to call back later.

With ICP, IRS envisions that customer service staff would have all relevant
information from a number of important databases available to them to
assist the taxpayer. IRS plans to use ICP to integrate and obtain access to
information from each of the existing IRS functional databases that contain
taxpayer information. The primary databases include IDRS, AUR, Corporate
Files On Line (CFOL), and the Automated Collection System (ACS).[3]

ICP was intended to resolve the data accessibility problems by integrating
the information from various databases used by CSRs and providing a
single computer terminal to do the task. Using a taxpayer’s Social Security
number to obtain case information, the ICP software is expected to
automatically assemble the relevant information on a computer terminal, provide questions and prompts for CSRs, and perform calculations for updating the account.

[2] Telephone Assistance: Adopting Practices Used by Others Would Help IRS Serve More Taxpayers (GAO/GGD-95-86, April 12, 1995).

[3] IDRS stores taxpayer account data collected from filed returns; AUR contains information on taxpayers who underreport their income; CFOL provides up-to-date views of taxpayer account data posted on the master file; and ACS has data on taxpayers whose payments to IRS are overdue.


ICP to Be Developed and Implemented in Stages

ICP, as originally envisioned, was expected to support both IRS' customer service vision and district office compliance operations. It was to be developed and implemented in stages using a multirelease approach. Each release was to build upon the previous release, providing a related set of software, hardware, and telecommunication tools that were to provide incremental improvements in customer service.

                              The first series of ICP releases, commonly referred to as releases 1.0/1.5,
2.0, and 2.5, was intended to meet the needs of IRS' customer service
                              employees. Later releases are expected to support district office
                              compliance operations, but they have been delayed until sometime after
                              2000 due to IRS’ recent rescoping of the TSM program.

                              The first release, 1.0/1.5, primarily provided computer hardware and
                              software that eliminated the need for CSRs to use multiple workstations to
                              access data on various databases. It was designed to allow CSRs to use one
                              computer terminal to access the various databases that contain
                              information on taxpayers’ accounts. For example, using an ICP
                              workstation, CSRs could access information stored on IRS’ three major
databases—IDRS, ACS, and AUR—as well as some smaller databases. It also
                              provided some features that made the existing systems easier to use, such
                              as a summary screen of taxpayer information, menus to look up command
                              codes, and automated forms ordering.

                              The next ICP software release, 2.0, is designed to provide CSRs with a single
                              view of taxpayer data. It is expected to eliminate the need for a CSR to
                              access the separate databases. Instead, information is to be assembled
                              from various databases onto a standard screen. Release 2.0 is expected to
                              also provide CSRs with new tools to enhance their ability to offer taxpayers
                              one-stop service. It is also expected to provide a call-routing feature that
                              would route taxpayers’ calls to the next available representative who
                              would be most skilled at addressing the taxpayer’s question or issue.
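As a rough illustration of the call-routing feature just described, the sketch below routes a call to the first available representative whose skills match the issue; the roster and skill labels are invented for the example.

    # Hypothetical sketch of skill-based call routing: send each call to
    # the next available CSR best matched to the taxpayer's issue.
    # The roster and skill labels are invented for illustration.

    AVAILABLE_CSRS = [
        ("Rep A", {"refunds", "tax law"}),
        ("Rep B", {"collections", "business accounts"}),
    ]

    def route_call(issue):
        """Return the first available CSR whose skills cover the issue,
        falling back to any available CSR if no specialist is free."""
        for name, skills in AVAILABLE_CSRS:
            if issue in skills:
                return name
        return AVAILABLE_CSRS[0][0] if AVAILABLE_CSRS else None

    print(route_call("collections"))   # -> Rep B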

                              Some of the tools expected from ICP 2.0 include

                          •   on-line display of and adjustments to Form 1040 returns and associated
                              schedules including automated tax, interest, and penalty computation;
•   automated installment agreement preparation;
•   automated payment tracer capability;
                           •   automated refund inquiries;
                           •   data directed routing; and
                           •   enhanced history generation.

                               Additionally, ICP 2.0 is expected to eliminate the need for CSRs to
                               remember numerous command codes, which are needed to access and
                               update taxpayer account information. For example, both ACS and IDRS have
                               their own language of command codes, requiring significant training and
                               adequate time to learn. IDRS alone has many codes, requiring two large
                               handbooks of explanation. Not surprisingly, few IRS employees have
                               mastered both systems. ICP 2.0 would eliminate the need for CSRs to know
any ACS command codes and most of the IDRS command codes.[4]

IRS expects ICP 2.0 to provide improved service to taxpayers. Currently, to
                               answer taxpayers’ questions about whether payments have been properly
                               credited to their accounts, CSRs must access up to five separate databases,
                               searching for payment transaction codes or payment offset codes. This
                               procedure is known as a “payment tracer.” CSRs must then locate the
                               missing payment and manually prepare a credit transfer to move the
                               payment to the proper account. Often the CSR is unable to complete the
                               search while the taxpayer is on the telephone.

                               With ICP, CSRs are expected to complete the payment tracer and resolve the
                               taxpayer’s question while the taxpayer is still on the telephone. Instead of
                               entering five separate search commands, CSRs would simply input the
                               amount of the payment. ICP would automatically search the databases and
                               provide CSRs with the information needed to determine whether the
                               taxpayer’s account had been properly credited for the payment. ICP would
                               also provide CSRs with an easier way to transfer payments between
                               accounts.
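The payment tracer lends itself to a short sketch. Under the report's description, the CSR enters only the payment amount and the system searches each database; the database names reused below are real IRS systems, but the record layout and matching rule are assumptions made for illustration.

    # Illustrative sketch of the automated payment tracer described
    # above. The record layout and matching rule are assumptions.

    PAYMENT_RECORDS = {
        # database -> list of (taxpayer id, payment amount) entries
        "IDRS": [("000-00-0001", 500.00)],
        "ACS":  [("000-00-0002", 750.00)],
        "AUR":  [],
        "CFOL": [("000-00-0001", 320.00)],
    }

    def trace_payment(amount):
        """Search every database for a payment of the given amount, so the
        CSR can see where it was credited without five separate searches."""
        hits = []
        for database, records in PAYMENT_RECORDS.items():
            for taxpayer_id, recorded in records:
                if abs(recorded - amount) < 0.01:
                    hits.append((database, taxpayer_id, recorded))
        return hits

    # The CSR inputs only the amount while the taxpayer is on the line.
    print(trace_payment(750.00))   # -> [('ACS', '000-00-0002', 750.0)]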

                               Release 2.5 is expected to provide CSRs this same level of access to
                               information for business taxpayers.


Prior Studies Identified TSM Management and Technical Weaknesses

Over the past decade, we have issued several reports and testified before congressional committees on IRS' costs and difficulties in modernizing its information systems. From 1986 through fiscal year 1995, IRS estimated that it had invested about $2.5 billion in TSM. IRS projects to spend over $8 billion on TSM. By any measure, this is an enormous information systems development effort, much larger than most other organizations have ever undertaken.

[4] IRS officials told us that some IDRS command codes are used so infrequently that the cost to eliminate them would exceed expected benefits.

In September 1993, IRS assessed its software development capability using Carnegie Mellon University's Software Engineering Institute's (SEI) Capability Maturity Model (CMM).[5] This model is the generally accepted standard in both industry and government for assessing an organization's ability to develop software in accordance with modern software engineering methods. The model focuses on the maturity of certain software development processes called "key process areas" (KPA). The five KPAs are requirements management, software project planning, software project tracking and oversight, software quality assurance, and software configuration management. The model ranks organizations on a scale of 1 to 5. IRS' self-assessment placed its software development capability at the lowest level, CMM level 1, because the assessment showed significant weaknesses in all of the KPAs prescribed for an organization to reach a level 2 capability. Each of the CMM levels is described in appendix I.
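The rating logic the report relies on can be stated compactly: an organization is rated at level 2 only if it fully satisfies all five level 2 KPAs, and otherwise remains at level 1. The sketch below encodes just that rule; the empty findings set mirrors IRS' 1993 self-assessment, which found significant weaknesses in every KPA.

    # Sketch of the CMM level 2 test described above: level 2 requires
    # that all five level 2 key process areas be fully satisfied.

    LEVEL_2_KPAS = {
        "requirements management",
        "software project planning",
        "software project tracking and oversight",
        "software quality assurance",
        "software configuration management",
    }

    def cmm_level(satisfied_kpas):
        """Return 2 if every level 2 KPA is satisfied, else 1 (the lowest)."""
        return 2 if LEVEL_2_KPAS <= set(satisfied_kpas) else 1

    # Significant weaknesses in all KPAs leave an organization at level 1.
    print(cmm_level(satisfied_kpas=set()))   # -> 1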

In February 1995, TSM was added to our list of high-risk areas[6] as a critical information systems project that is vulnerable to schedule delays, cost overruns, and potential failure to meet mission goals. In July 1995, we issued a comprehensive report on the effectiveness of IRS' efforts to modernize tax processing.[7] The report discussed pervasive management and technical weaknesses that must be corrected if TSM is to succeed and made over a dozen specific recommendations. In this regard, we reported that unless IRS improved its software development ability, it is unlikely to build TSM in a timely or economical manner, and systems are unlikely to perform as intended.

Reflecting continued congressional concern with TSM, the Treasury, Postal Service, and General Government Appropriations Act of 1996 required the Secretary of the Treasury to provide a report to the House and Senate Appropriations Committees regarding the management and implementation of TSM.[8] This report was provided to the Committees in May 1996.[9]

[5] SEI is a nationally recognized, federally funded research and development center established at Carnegie Mellon University in Pittsburgh, Pennsylvania, to address software development issues. In the late 1980s, SEI, with assistance from the Mitre Corporation, developed a process maturity framework to help organizations improve their software processes. In general, software process maturity serves as an indicator of the likely range of cost, schedule, and quality results to be achieved by projects within a software organization.

[6] High-Risk Series: An Overview (GAO/HR-95-1, Feb. 1995).

[7] Tax Systems Modernization: Management and Technical Weaknesses Must Be Corrected If Modernization Is to Succeed (GAO/AIMD-95-156, July 26, 1995).

As directed by the same legislation that required the report, in June 1996, we reported on our assessment of IRS actions taken to correct its management and technical weaknesses.[10] We found that while IRS had taken some actions, none fully responded to any of our recommendations. As a result, IRS was not in any appreciably better position to assure Congress that the money spent on TSM would deliver the promised capability on time and within budget.

                     Because IRS had not made adequate progress to correct its weaknesses, we
                     suggested that Congress should consider limiting TSM spending to only
                     cost-effective modernization efforts that (1) support ongoing operations
                     and maintenance; (2) correct IRS’ pervasive management and technical
                     weaknesses; (3) are small, represent low technical risk, and can be
                     delivered in a relatively short time frame; and (4) involve deploying
                     already developed systems—only if these systems have been fully tested,
                     are not premature given the lack of a completed architecture, and produce
                     a proven, verifiable business value.


Objectives, Scope, and Methodology

Our objectives were to (1) evaluate IRS' assessment of ICP costs and benefits and obtain users' perceptions on the system's benefits, (2) analyze IRS' testing of ICP, (3) assess IRS' ongoing efforts to redesign its customer-service work processes to fully utilize ICP capabilities, and (4) assess the software development processes being used for ICP.

                     To evaluate IRS’ assessment of ICP costs and benefits, we reviewed two IRS
                     studies that were developed to assess the expected costs and benefits of
                     ICP. The first document, known as the Unified Business Case, was
                     developed by IRS in January 1995. During our review, IRS conducted a
                     second analysis of ICP costs and benefits. This ICP Business Case was
issued in July 1996. We reviewed both documents for completeness and compared them against IRS' criteria for business cases,[11] as detailed in its Business Case Handbook. To obtain user views on ICP benefits, we randomly selected and conducted structured interviews with 193 CSRs, 37 customer service managers, and 11 system administrators at the Nashville, Cincinnati, and Atlanta customer service sites. We chose these three sites because (1) Nashville was the prototype site for testing ICP and new work processes and (2) Atlanta and Cincinnati were two of the initial sites to receive ICP.

[8] P.L. 104-52, Nov. 19, 1995.

[9] Report to House and Senate Appropriations Committees: Progress Report on IRS's Management and Implementation of Tax Systems Modernization, Department of the Treasury, May 6, 1996.

[10] Tax Systems Modernization: Actions Underway But IRS Has Not Yet Corrected Management and Technical Weaknesses (GAO/AIMD-96-106, June 7, 1996).

To analyze IRS’ testing of ICP, we reviewed the results of the initial pilot test
of ICP version 1.5. We met with IRS officials at the National Office, the ICP
program office, and the Customer Service Site Executive’s Office to
discuss the limitations of the test that IRS identified. We also discussed
with IRS officials their plans for a more thorough test of the next ICP
version, 2.0, including visiting the Integrated Test and Control Center
facility where ICP 2.0 was being tested. We were unable to review specific
plans for the pilot test because they had not been completed during our
audit work.

To assess IRS’ ongoing efforts to redesign its customer service work
processes to fully utilize ICP capabilities, we met with IRS officials in charge
of efforts to develop new work processes for CSRs. We reviewed
documents, such as the Customer Service Work System Design document,
that discussed the results of the initial efforts to broaden the scope of
telephone assistors’ work. We also reviewed draft reports on the results of
recent studies that make further recommendations for redesigning work
processes.

To assess IRS’s software development processes used to develop ICP 2.0,
our fourth objective, an SEI-trained team of GAO specialists used SEI’s
Software Capability evaluation (SCE) method. The details of our scope and
methodology for this objective are discussed in appendix I.

We conducted our work from August 1995 through August 1996 in
accordance with generally accepted government auditing standards. We
requested comments on a draft of this report from the Commissioner of IRS
or her designee. On November 21, 1996, IRS officials, including the
Customer Service Site Executive and the National Director, Customer Service Planning and Systems Division, provided us with oral comments. These comments were supplemented by a memorandum from the National Director, Customer Service Planning and Systems Division, and the Deputy Chief Information Officer (Systems Development) on November 26, 1996. Their comments are summarized on pages 22-24 and incorporated elsewhere in the report where appropriate.

[11] According to IRS internal guidance for developing business cases, a business case is a management tool that documents key aspects of an information technology initiative, including (1) justifying the initiative and helping ensure that it provides programmatic benefits, (2) providing a mechanism to aid in tracking and managing initiatives during implementation, and (3) establishing a baseline against which progress of the initiative may be judged.


IRS Invested Millions of Dollars in ICP but Costs and Benefits Remain Uncertain

Through fiscal year 1995, IRS had invested over $150 million in ICP and, according to data provided to us after a May 6, 1996, Treasury report to the House and Senate Appropriations Committees, IRS had plans to invest about $77 million and $112 million in fiscal years 1996 and 1997, respectively.[12] That would bring the total investment to about $340 million, or about 53 percent of the $641.1 million budgeted for ICP through 2000. However, budget cuts have caused IRS to reduce planned expenditures for ICP and to reassess how to move forward to meet the needs of front-line assistors.
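The arithmetic behind the $340 million figure can be checked directly from the amounts quoted in the paragraph above:

    # Checking the investment figures cited above (millions of dollars).
    through_fy1995 = 150    # invested through fiscal year 1995 ("over $150 million")
    fy1996_plan    = 77     # planned investment for fiscal year 1996
    fy1997_plan    = 112    # planned investment for fiscal year 1997
    total_budget   = 641.1  # budgeted for ICP through 2000

    planned_total = through_fy1995 + fy1996_plan + fy1997_plan
    print(planned_total)                              # 339 ("about $340 million")
    print(round(100 * planned_total / total_budget))  # 53 (percent of budget)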

                        Despite this sizable investment, ICP costs and benefits remain uncertain
                        because the scheduled rollout of ICP and its capabilities continue to
                        change. Since ICP began in 1993, the milestone dates for tasks have slipped,
                        and most recently the testing of software release 2.0 has been delayed at
                        least 3 months. Also, the capabilities of software release 2.0 may be less
                        than originally planned. Finally, the original business case on ICP was
                        never accepted. While a more recent business case indicates that IRS will
                        update projections for cost and benefits as necessary, IRS has made no
                        revisions to the business case, even though changes are expected to the
                        rollout date and to the software capabilities for release 2.0. IRS is
                        reassessing its plans for release 2.0 and plans to revise its business case
                        after a proposal is made to and approved by the Investment Review Board.

ICP began in late 1993, and the capability of ICP was to be rolled out
                        incrementally in four phases and was to be completed by 1997. In
                        March 1995, changes in the scheduled rollout date took place. The revised
                        date for ICP being operational was extended to November 1998. As of
                        June 1996, the first increment of ICP was partially deployed at 14 of the 23
                        customer service centers. There were about 2,500 ICP workstations
                        operating at these sites. IRS was expecting to purchase additional
                        workstations in 1996 and 1997.



[12] IRS has concerns about the cost figures we obtained from its records for ICP through 1995 and provided us information that shows costs of about $73 million. The cost issue is addressed in the agency comments section of this report.




The latest IRS schedule calls for ICP to be fully deployed by fiscal year 2000,
but this may be delayed. For example, pilot testing of release 2.0 was
scheduled to begin on September 30, 1996, with initial deployment in
April 1997. However, the development team has been unable to deliver the
software as scheduled. As a result, the pilot test was delayed, and a risk
assessment of the entire ICP project was initiated. The contractor’s interim
report on the risk assessment states that the pilot test on ICP release 2.0
should be delayed at least 3 months. The testing and deployment of release
2.0 may be delayed longer than 3 months because the contractor stated that the number of problems identified during software testing continued to increase and that the problems were "not likely to be fixed in near term." IRS does not know the impact of these delays on costs, but these delays, especially any long delay with release 2.0, will likely increase costs.

IRS has spent about $150 million to date for ICP 1.0/1.5 and to develop ICP
2.0, but IRS officials told us that they never projected any revenue or
productivity gain for the early releases of ICP. IRS officials said that ICP
activities to date have provided the foundation for development of ICP 2.0
and have put in place the hardware, telecommunications, and other
infrastructure components required to implement the customer service
vision; and they noted that the real benefit gains of ICP will come from ICP
release 2.0.

In 1995, IRS’ Information Systems Division developed a “Unified Business
Case” for the systems supporting IRS’ customer service and district office
operations.[13] The costs and benefits were projected to be $3.2 billion and
$5.2 billion, respectively. IRS customer service officials said that this cost
and benefit analysis was never accepted by their office because, by the
time the analysis was completed, the projects being evaluated were not
consistent with their new business vision and no longer represented the
scope of ICP.

In July 1996, IRS completed another business case for Customer
Service/ICP. ICP costs and benefits were estimated to be $774 million and
$2.9 billion, respectively. This business case was intended to justify the
costs of ICP, including the necessary physical infrastructure, such as real
estate, telecommunications, computer equipment, and furniture.




[13] The systems were the Corporate Accounts Processing System, Case Processing System, Workload Management System, Telephone Routing Interactive System, and the Servicewide Electronic Research Project.




Most Users Found Advantages to Using ICP 1.5

Most of the users we interviewed said that ICP 1.5 had provided some advantages. At the time of our review, however, IRS had not taken steps to measure the extent to which ICP has improved service to taxpayers.

More than 91 percent of the employees who responded to this question said ICP improved their ability to serve taxpayers at least to some extent, when compared with what they used before the development of ICP. About 89 percent of those who responded told us that ICP increased their productivity, while 85 percent said it increased their ability to resolve taxpayers' questions on the initial contact at least to some extent. While the results of our survey of CSRs were generally positive, IRS had not attempted to measure the extent to which ICP had affected the services provided to taxpayers. Appendix II shows CSRs' opinions on the extent to which ICP release 1.5 has allowed them to improve customer service and improved their ability to do their jobs.


ICP Has Not Been Thoroughly Tested

The testing of ICP 1.5 was too limited and did not measure ICP's impact on business operations.[14] Also, IRS discounted system downtime when analyzing the results of the test. IRS officials recognized the limitations of the ICP 1.5 testing and told us that testing of ICP 2.0 would be more comprehensive.


Testing Was Limited

IRS conducted its test of ICP 1.5 at the Nashville customer service site during a 4-week period in July and August 1995. Nashville, IRS' prototype customer service site, had been using ICP for approximately 9 months before the test. The test was done during a nonpeak period, when IRS is not typically as busy as during the tax season months of January through April. Testing during a nonpeak period may not stress the system's capacity. IRS officials said that testing was limited because ICP 1.5 was only intended to provide the data access foundation for developing ICP 2.0 and to put in place the hardware, telecommunications, and other infrastructure components required to implement the customer service vision.

The National Research Council also reported that the ICP test was too limited to "yield the analytical results needed to appraise ICP in a full site-production mode." The Council's report also states that ICP "was tested during only one tax season, on a limited basis, before being deployed to other sites."[15]

[14] IRS' certification testing of ICP 1.5 was designed to assess ICP performance in a production environment, measuring system availability, reliability, and performance.


Testing Did Not Measure ICP's Impact on Business Operations

The purpose of a pilot test is to evaluate the performance of a system in one location before deciding whether to implement the system at other locations. IRS uses the pilot test to certify that the system is meeting its program or business objectives. IRS refers to this process as the "Business Certification." During the pilot test, IRS was to collect data on the performance of the system and compare the data against established performance goals to certify that the system was performing as expected.

                           To measure ICP’s impact on business operations, IRS examined six quality
                           indicators—productivity, accuracy, timeliness, revenues, initial contact
                           resolution,16 and customer satisfaction. IRS had difficulty measuring four of
                           these six indicators, and its measure of the remaining two indicators was
                           very limited in scope. Additionally, IRS based its measure of another
                           indicator—quality of the workplace—on focus group discussions. Despite
                           difficulties in measuring the impact on business, IRS officials decided to
                           roll out the system to other sites because comments on the quality of the
                           system from the workplace focus groups had been generally favorable.

                           IRS discounted the results of its testing of accuracy and revenues collected
                           because it could not isolate the impact of ICP from that of other changes in
                           work processes. For example, during the certification test period,
                           accuracy varied from 85 percent for questions on tax law and procedures
                           to 36 percent for account questions. The national standard is 87 percent.
                           The evaluation team concluded that the results of the accuracy and
                           revenue tests were not comparable to national results because Nashville
                           was “blending” certain collection and taxpayer service work and
                           cross-training its employees to work both areas.

IRS used very narrow measures to gauge the system's effect on timeliness
                           and productivity at the Nashville site. Timeliness and productivity
                           measurements were limited to measuring gains made from a more timely
                           process of ordering forms. According to the test results, ICP reduced the
                           amount of time it took to order forms by 1 day and saved $118.68 per day,
                           compared with fiscal year 1994 costs for direct labor and mail. However,


                           15
                            Continued Review of the Tax Systems Modernization of the Internal Revenue Service, Final Report,
                           National Research Council, 1995.
                           16
                             IRS defines initial contact resolution as an instance in which IRS is able to satisfactorily resolve a
                           taxpayer’s issue on the basis of the taxpayer’s initial contact with IRS.



                           Page 13                                                       GAO/GGD/AIMD-97-31 IRS’ ICP System
B-265969




ordering forms is only a small part of customer service. IRS did not
measure the timeliness of handling taxpayers’ calls for other services, such
as refund inquiries or the productivity of CSRs—concerning the number of
calls they were able to answer.

IRS had no baseline measures for customer satisfaction and initial contact
resolution. This prevented IRS from measuring improvements over the
status quo. The certification report gives no results for customer
satisfaction and notes that surveys on customer satisfaction were not
done. IRS reported the results of measures on initial contact
resolution—the percentage of calls that IRS resolved in one contact.
However, the rate—43 percent—is much lower than the goal of 95 percent.
Nonetheless, IRS gave the system a “pass” mark on that indicator stating
that the “increased functionality expected in future releases of ICP should
increase the overall ICR [initial contact resolution] rate.”

Furthermore, IRS’ measurement of how ICP 1.5 affected the quality of work
life was limited to holding focus group discussions. Thirty-one of the 311 employees at the Nashville office participated in the focus groups. The three focus groups were made up of 14 experienced CSRs, 12 inexperienced
CSRs, and 5 managers. According to the certification report, the
experienced CSRs were “excited about the ICP system,” but they expressed
several concerns ranging from technological problems to lack of training.
The report cautioned that “unless their concerns are addressed, the
impression of the ICP system will turn into that of a curse rather than the
now perceived blessing.” Similarly, the managers said the system offered
many promises, but they too were concerned about the technical problems
associated with the system. The inexperienced CSRs were not as
enthusiastic about the system as the experienced CSRs and managers.
While they had concerns similar to the experienced CSRs, they were very
concerned about the amount of system downtime.

In our July 1995 report on TSM,[17] we said that although IRS recognized the
importance of testing, it had not yet developed a complete and
comprehensive testing plan for TSM. We said that individual TSM systems
were developing their own test plans, which IRS described as rudimentary
and inadequate. If systems like ICP are not adequately tested, design and
development errors may go undetected, leading to performance shortfalls.
Similar to ICP, IRS failed to thoroughly test its Service Center Recognition
Image Processing System (SCRIPS). The pilot test of SCRIPS was incomplete
because it (1) did not certify all software applications that were to be used during 1995 and (2) did not test SCRIPS' ability to handle peak processing volumes. Many of the problems IRS experienced with SCRIPS, such as slow processing rates and system failures, might have been anticipated had IRS thoroughly tested the system before placing it into operation.[18]

[17] GAO/AIMD-95-156, July 26, 1995.


IRS Excluded Downtime

IRS' technical certification report stated that the system was available to users more than 98 percent of the time during the 19-day test period. However, it reached that percentage by excluding 3 days on which the system was down. IRS excluded the downtime from the test results because officials said they believed they had corrected the technical problem and that it would not recur. Had the 3 days been included, the system would have been available to users about 95 percent of the time.
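A small calculation shows how the exclusion moves the availability figure. The report gives only the percentages and the day counts, so the hour totals below (19 test days of 8 operational hours, with assumed downtime amounts) are illustrative, chosen to reproduce the reported 98 and 95 percent:

    # Illustrative availability arithmetic; hour figures are assumed.
    HOURS_PER_DAY   = 8
    TEST_DAYS       = 19
    EXCLUDED_DAYS   = 3
    down_hours_kept = 2.5   # assumed downtime on the 16 counted days
    down_hours_excl = 5.0   # assumed downtime on the 3 excluded days

    kept_hours  = (TEST_DAYS - EXCLUDED_DAYS) * HOURS_PER_DAY   # 128
    total_hours = TEST_DAYS * HOURS_PER_DAY                     # 152

    reported = 100 * (kept_hours - down_hours_kept) / kept_hours
    actual   = 100 * (total_hours - down_hours_kept - down_hours_excl) / total_hours
    print(round(reported))   # 98, the figure IRS reported
    print(round(actual))     # 95, with the excluded days counted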

                        Downtime may be caused by problems with system elements related to
                        ICP, but not directly measured in the ICP test. Most of the CSRs we
                        interviewed in December 1995 and March 1996 said downtime was a
                        problem at their site. Of the 185 CSRs who responded to this question,
                        82 percent said downtime had disrupted customer service, at least to some
                        extent. The representatives considered downtime to be those times when
                        they were unable to access information through their workstation,
                        regardless of the cause. They told us that when the system went down they
                        were unable to provide customer service. They said they either called the
                        taxpayer back when the system was back up, or they told the taxpayer to
                        call back later, anticipating that the system would be back in operation
                        when the taxpayer called.

                        IRS officials in Nashville said the downtime stemmed from power outages,
                        telecommunications problems, and connectivity problems with the
                        systems from which ICP pulls data. Both the Cincinnati and Atlanta
                        customer service centers experienced similar problems with connectivity
                        to these old systems.




[18] Tax Systems Modernization: Imaging System's Performance Improving but Still Falls Short of Expectations (GAO/GGD-97-29, January 16, 1997).




More Testing/Benefits Measurement Planned for ICP 2.0

IRS officials told us that they plan to conduct a more thorough pilot test of the next release of ICP. Software acceptance testing[19] began in May 1996 and was scheduled to be completed in September 1996. ICP 2.0 was then to be subjected to 3 months of operational testing,[20] using 40 CSRs at the Fresno customer service center beginning on September 30, 1996. It was to be expanded to about 160 CSRs in January 1997 and then rolled out to other customer service centers beginning in May 1997.

                          However, on July 31, 1996, in a memorandum to the Commissioner and
                          Deputy Commissioner, the Associate Commissioner for Modernization
                          cancelled the September 30, 1996, pilot start date on the recommendation
                          of the Business Site Executive. The Associate Commissioner noted that
                          continued slippage in milestone dates for software programming and
                          testing had jeopardized the pilot start date. To address concerns about the
                          project, IRS hired a contractor to perform a risk assessment of the entire
                          project. We believe that this decision is a positive indication of IRS’ desire
                          to ensure that the system is sound before it is tested in production. The
                          final results of this risk assessment were submitted in October 1996. IRS
has prepared a draft evaluation plan that it believes will enable it to make a sound business decision about further investment in ICP.

                          In an interim report dated August 21, 1996, the contractor recommended
                          that the pilot test be delayed at least 3 months. The contractor cited
                          various reasons why the test should be delayed. For example, during
                          software testing, ICP failed to recognize certain data or taxpayer issues
                          when such data or issues existed, and it failed to shut down when data
                          were entered into certain accounts that were supposed to be protected
                          from additional data entry. The contractor also cited problems with (1) the
                          accuracy of data and with the updating of taxpayers’ IDRS accounts; (2) the
                          definition of the user requirements for ICP 2.0; and (3) hardware
                          differences among development, test, and production sites. The contractor
stated that the number of problems identified during software testing continued to increase and that the problems were not likely to be fixed in the near term.




[19] Software acceptance testing is conducted at a test facility and determines if the system software performs in the manner in which it was designed. For example, it determines whether the system obtains the information expected when predetermined data are input.

[20] Operational or pilot testing determines if the system is performing effectively under normal business operations.




                        According to IRS’ Customer Service Vision, ICP was expected to be the
IRS Is Developing and   vehicle to provide CSRs access to the information they would need to
Deploying ICP Before    answer all types of calls coming from taxpayers. Also, IRS planned to
Work Processes Have     combine a phase of the collection process with customer service.21
                        However, according to IRS officials, after experimentation at the Nashville
Been Determined and     customer service center prototype, IRS is now reconsidering the extent to
Before Desired          which CSRs will be able to answer the broad range of taxpayer questions,
                        which are anticipated if IRS reduces the current level of employee
Capabilities Are        specialization and combines the customer service and some of the
Known                   collection functions. Modifications to ICP and/or subsequent investments in
                        information technology may be required as the roles and responsibilities
                        of a CSR continue to evolve.


IRS Is Examining Work Process Issues

IRS hired a contractor and formed a team in February 1996 to examine the customer-service work processes and duties of CSRs. IRS acknowledges that many questions about the future job scope and structure of the customer service position still need to be resolved. The contractor has been focusing on the redesign of current operations, systems, and organizations and on the design of the CSR's position. The contractor's task is to develop a quality-oriented, workable customer service system that furthers IRS' objectives, enhances employee and customer satisfaction, and maximizes efficient use of resources. The draft design report was issued to IRS for review and comment on August 30, 1996.

IRS has traditionally operated its telephone activities along functional lines,
                        with employees specializing in specific areas. As such, an IRS telephone
                        representative does not handle a broad range of inquiries. For example, if
                        a business taxpayer called IRS regarding a balance due, the taxpayer would
                        be routed to an IRS employee who specialized in handling business
                        accounts and who handled only business account calls.

                        As originally envisioned, ICP was to allow CSRs to perform a wide range of
                        tasks, rather than have specific areas of expertise. ICP was to consolidate
                        data from multiple databases and eliminate the complex command codes
                        that IRS employees are now required to know in order to access and update
                        taxpayer account information.

                        Also, IRS planned to combine its initial efforts to collect taxes owed with
its traditional customer service work. Essentially, with this blending of work, a CSR would be expected to answer all types of taxpayer calls. For example, a representative could receive a call from an individual taxpayer inquiring about a refund, and the next call could be from an income tax preparer asking questions about IRS procedures or tax law. The CSR described in the vision would require a far broader knowledge base and much more extensive training than under the traditional telephone operations.

[21] This phase of the collection process involves IRS staff initiating telephone contact with taxpayers who have not responded to notices. As part of its customer service vision, IRS plans to consolidate these types of collection calls with other taxpayer service related calls.

IRS is reconsidering the extent to which CSRs will be able to answer the
broad range of questions. IRS officials said they tested the blending concept
at the Nashville prototype site and concluded that blending all the duties
into one position was not feasible. The work systems design team is
expected to decide how the work will be performed and define the duties
of CSRs. Senior executives say they are committed to merging the taxpayer
service and compliance functions. They acknowledge that certain issues
must be resolved, such as how much tax knowledge a CSR needs to have,
the proper skill level, and what authority the position should have to make
certain decisions about a taxpayer account.

CSRs that we talked with had mixed views on the extent to which blending
has improved IRS’ ability to serve taxpayers. Twenty-two percent of the
CSRs said blending improved their ability to assist taxpayers to little or no
extent while another 22 percent said it improved their ability to assist
taxpayers to a great extent.

Some CSRs said blending allows them to provide one-stop service to taxpayers without transferring them to other CSRs, while others said that one CSR cannot be responsible for performing multiple jobs. Some CSRs also said blending causes inaccuracy and requires more time per call because CSRs are less proficient when performing multiple jobs.

IRS officials said that ICP 2.0 has been designed to provide CSRs with the
most basic capabilities that both traditional telephone assistors and
collection staff would find useful. Indeed, many of the capabilities
expected from ICP 2.0 would provide clear advantages over IRS’ existing
systems. However, until the role of the CSR is defined, it is unlikely that IRS
will be able to provide information technology solutions that maximize
productivity and customer service. As we have previously reported,
organizations that successfully develop systems and achieve significant
operational improvements do so only after analyzing and redesigning
critical business processes. At the time of our review, the process of designing, testing, and implementing the role and processes surrounding the CSR was still not complete.


System Requirements May Change With Results of Work Systems Design Effort

Until IRS completes its work systems design effort, the information technology requirements to support CSRs will not be fully understood. The information CSRs need and the presentation of data might change from IRS' initial vision and current ICP requirements because of the results of the work systems design effort. Therefore, CSRs may not require the same capabilities from ICP as previously envisioned in order to provide customer service.

                              At the time of our visits, some sites were not using all of ICP’s capabilities
                              because current duties did not require those capabilities. For example,
                              CSRs at the Atlanta and Cincinnati customer service centers were using ICP
                              to access IDRS when responding to inquiries from taxpayers who had
                              received collection notices from IRS. They did not use ICP’s capabilities to
                              access additional databases such as ACS and AUR. Customer service center
                              officials reported that they were not servicing the kinds of calls that
                              require access to either ACS or AUR.


IRS Lacks the Software Development Capabilities Needed to Attempt a Project Like ICP

None of the ICP software development projects reviewed fully satisfies any of the KPAs that SEI's CMM requires for a CMM level 2 rating, or "a repeatable software development process." In this regard, we found that three IRS organizations developing ICP software are extremely weak in the following KPAs: requirements management, software project planning, software project tracking and oversight, software quality assurance, and software configuration management. As a result, successful delivery of ICP 2.0 software is unlikely.

Each of the five KPAs, along with examples of how the software development organizations compare to the KPA goals, is summarized below.22 Appendix III details how well each of the three organizations performed against the KPA goals.

22 CMM level 2 is achieved by satisfying all of the KPAs required for that level. In order to satisfy a KPA, all of its goals must be satisfied. The KPAs and the goals within those areas are described in appendix III.

• Requirements Management - The purpose of requirements management is to establish a common understanding and agreement between the customer and the software project management on the customer's requirements that are to be addressed through the software. One of the two goals of this KPA states that "software plans, products, and activities are kept consistent with the system requirements allocated to software." While IRS produces a number of documents that contain varying levels of detail on customer requirements (for example, the configuration item list, the administrative request for information services, the system architectural description, and the concept of operations), IRS does not update these documents as requirements change to ensure that they are complete, consistent, and current. As a result, IRS has no assurance that the code being written and tested is traceable to customer requirements. (A hypothetical sketch of such a traceability check appears after this list.)
•   Software Project Planning - The purpose of software project planning is to
    establish reasonable plans for performing the software engineering and for
    managing the software project. One of the three goals within this KPA
states that "software project activities and commitments are planned and
    documented.” IRS does not have a defined process governing software
    project planning. Moreover, the ICP software projects do not have
    documented software plans. Without these plans, IRS cannot effectively
    measure and monitor software development progress and take
    appropriate action when needed.
•   Software Project Tracking and Oversight - The software project tracking
    and oversight process provides insight into actual project progress so that
    management can take effective actions when the software project’s
    performance deviates significantly from the software plans. One of the
    three goals within this KPA is that “actual results and performances are
    tracked against the software plans.” As noted above, IRS does not have ICP
    software development plans, and while it tracks the software project
    against schedules, these schedules are not derived using generally
    accepted government or industry software engineering methods. As a
    result, management cannot tell when actual progress warrants corrective
    action.
•   Software Quality Assurance - The purpose of software quality assurance is
    to enable management to assess the quality of the process being used by
    the software project and of the products being built. Two of the four goals
    within this KPA emphasize that (1) “software quality assurance activities
    are planned” and (2) “adherence of software products and activities to
    applicable standards, procedures, and requirements is verified
    objectively.” The ICP software projects do not have software quality
    assurance plans. In addition, a software quality assurance group does not
    participate in certain required software quality assurance functions, such
    as the preparation, review, and audit of projects’ software development
plans, standards, and procedures. As a result, IRS has no assurance that the ICP software is being developed in a quality fashion and will perform as intended.
• Software Configuration Management - The purpose of software configuration management is to establish and maintain the integrity of the products of the software project throughout the project's software life cycle. Two of the four goals of the configuration management KPA require that (1) "software configuration management activities be planned" and (2) "software work products be identified, controlled, and available." The ICP software projects reviewed do not have software configuration management plans. In addition, although IRS controls changes to source code using a tool called the Source Code Control System, this KPA requires change control over all software products created throughout the software life cycle. Specifically, IRS has not identified the software work products other than source code (such as requirements documentation, design specifications, and test plans and results) that need to be placed under configuration management. As a result, IRS does not know whether all its software products are complete, consistent, and current.
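To illustrate the kind of mechanism these KPAs call for, the sketch below shows a minimal requirements traceability check, similar in spirit to the requirements traceability matrix IRS later cited as an improvement initiative. It is a hypothetical illustration only: the requirement identifiers, artifact names, and link categories are invented, and nothing here represents IRS' actual tooling or data.

    # Hypothetical sketch of a requirements traceability matrix check.
    # Requirement IDs and artifact names are invented for illustration.

    # Each requirement maps to the work products that implement and verify it.
    TRACEABILITY_MATRIX = {
        "REQ-001": {"design": "screen_spec_v2.doc", "code": "refund_inquiry.c", "test": "TP-12"},
        "REQ-002": {"design": "acs_interface.doc", "code": None, "test": None},
    }

    REQUIRED_LINKS = ("design", "code", "test")

    def untraced(matrix):
        """Return, per requirement, any missing design, code, or test links."""
        gaps = {}
        for req_id, links in matrix.items():
            missing = [link for link in REQUIRED_LINKS if not links.get(link)]
            if missing:
                gaps[req_id] = missing
        return gaps

    for req_id, missing in untraced(TRACEABILITY_MATRIX).items():
        print(f"{req_id} is not traceable: no {', '.join(missing)} link")

Run against a real requirements baseline, a check of this kind would flag exactly the gap described above: code being written and tested that cannot be traced back to a customer requirement.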


Conclusions

Modernizing IRS' systems is critical to IRS reaching its Customer Service Vision. As envisioned, ICP is planned to offer some clear advantages over IRS' existing information systems and could improve taxpayer services. However, the success of ICP may be at risk because IRS has made substantial investments in the system without having (1) validated the costs and benefits by thoroughly testing the ICP system, (2) finalized the redesign of work processes that ICP will support, and (3) achieved the software development maturity needed to successfully build the envisioned capabilities within planned cost and milestones. Some of these problems are evident in recent slippage in milestone dates for software programming and testing that forced the cancellation of the September 30, 1996, pilot start date for ICP release 2.0. The contractor's interim report assessing the risks associated with ICP development also supports delaying the pilot test.

                  IRS has invested millions of dollars in ICP without having the cost and
                  benefit data needed to fully assess the program, including analyzing
                  program risks and making the most appropriate investment decisions.
                  Furthermore, IRS’ testing of ICP 1.5 was limited and lacked baseline
                  measures to gauge the success of this and future releases. Until IRS settles
                  outstanding issues with its work processes, such as the scope of the duties
of CSRs, it will not be in a position to adequately project whether ICP will provide the necessary capabilities or be the best system for customer service.

The views of CSRs were generally supportive of an early version of ICP. However, continuing with plans to develop and deploy ICP in pursuit of benefits that cannot be measured is risky. In this regard, until IRS implements a way to measure benefits, the extent to which ICP is likely to improve customer service and provide a positive return on investment cannot be determined.

IRS is unnecessarily risking hundreds of millions of dollars by attempting to develop ICP software without having the requisite processes for doing so.


Recommendations

Concurrent with the risk assessment being performed by the contractor, we recommend that the IRS Commissioner immediately limit deployment of ICP workstations to those already purchased until (1) projected costs and benefits are better known and can be validated by testing the system in a realistic operational environment, using baseline performance measures, and (2) decisions are made on work processes, including the blending of collection and service work and the specific duties of CSRs.

                         We also recommend that expedient steps be taken to better position IRS to
                         develop its software successfully and to protect its software investments.
                         Specifically, we recommend that the IRS Commissioner take the following
                         actions:

                     •   Develop and implement an action plan to ensure that ICP software is
                         developed by an organization(s) with at least a level 2 CMM rating.
                     •   Delay any major investment in ICP software until the action plan is
                         implemented.


Agency Comments and Our Evaluation

We requested comments on a draft of this report from the Commissioner of Internal Revenue or her designated representative. Responsible IRS officials, including the Customer Service Site Executive and the National Director, Customer Service Planning and Systems Division, provided IRS' comments in a November 21, 1996, meeting. These comments were supplemented by a November 26, 1996, memorandum from the National Director, Customer Service Planning and Systems Division, and the Deputy Chief Information Officer (Systems Development) that addressed our recommendations and clarified remarks made during our discussion. We considered IRS' comments and modified this report where appropriate.







IRS officials agreed with our recommendation to limit further deployment
of ICP workstations to those already purchased. They are currently
considering several alternatives for reevaluating ICP. They said these
alternatives, along with a recommendation, will be presented to IRS’
Investment Review Board in the near future.

Additionally, IRS officials generally agreed with our assessment of ICP
software development processes and agreed with our recommendation
that they need at least a CMM level 2 capability to develop ICP software. The
officials added that future ICP development is to be done using CMM level 2
processes and that, as we recommended, major investments in ICP will be
delayed until this level of capability is achieved. They also added that their
plan for achieving this level of capability involved two options—software
development process improvements and heavy reliance on software
development contractors. With respect to the former, the officials cited
examples of improvement initiatives under way and planned, such as the use of a requirements traceability matrix and a software quality assurance program.

We believe IRS' two proposed actions to improve software development capabilities are not totally responsive to our recommendation. First, while the software process improvements cited are a step in the right direction, these actions should be part of the complete and comprehensive action plan for process improvement that we recommended, which is rooted in SEI's CMM level 2 KPA requirements. Second, to effectively acquire software using development contractors, IRS must have at least SEI-defined CMM level 2 software acquisition processes. Moreover, it must ensure that its development contractors have at least level 2 development capabilities. Accordingly, IRS' action plan for ICP should specify how this goal will be accomplished before it relies on contractors to develop ICP.

IRS officials also stated that some ICP software had been developed using
nationally recognized standards. For example, they cited software for
computer screens, developed by IRS for use on multiple systems, including
ICP. However, as stated in the objectives, scope, and methodology section
of this report, our software capability assessment addressed those IRS
organizations responsible for developing ICP applications software.

IRS officials raised concerns about the amount of money cited in our report
as spent on ICP through fiscal year 1995. Rather than $150 million, they
now believe the investment in ICP through 1995 is about $73 million.
Throughout our review, we had difficulty determining the amount spent on ICP. At one point, IRS officials told us that $171.5 million had been spent on
ICP through fiscal year 1995, as reported in the May 6, 1996, Treasury
report to the House and Senate Appropriations Committees. They later
told us they found errors in that estimate, and the actual investment in ICP
through 1995 was about $150 million. Now, they believe the $150 million
cost estimate was overstated because it included costs for the Aspect
Automated Call Distributor System, which are not directly attributable to
ICP. While we agree that some equipment costs are included in the
$150 million figure, we are uncertain how much is attributable to the
Aspect system because we did not validate the accuracy of IRS’ estimates.
Accordingly, the $150 million was retained in this report.

We are sending copies of this report to the Ranking Minority Member of
your Subcommittee, the Chairman and Ranking Minority Member of the
Senate Committee on Finance and other appropriate congressional
committees, the Secretary of the Treasury, the Commissioner of Internal
Revenue, and other interested parties.


Major contributors to this report are listed in appendix IV. If you or your
staff have any questions concerning this report, please call me on
(202) 512-8633.

Sincerely yours,




Lynda D. Willis
Director, Tax Policy and
  Administration Issues




Contents



Letter                                                                                            1


Appendix I: Description of Methodology                                       28
Appendix II: Customer Service Representatives' Views on Integrated
  Case Processing System                                                     31
Appendix III: Detailed Software Capability Evaluation Results for the
  Fresno, Dallas, and Austin Development Centers                             32
Appendix IV: Major Contributors to This Report                               44

Tables
Table I.1: Capability Maturity Model Levels and Descriptions                 28
Table I.2: Capability Maturity Model Level 2 "Repeatable" KPA Descriptions   30
Table II.1: CSRs' Views on the Extent That ICP Has Allowed Them to
  Improve Customer Service in Selected Areas                                 31
Table II.2: CSRs' Views on the Extent That Various ICP Capabilities
  Have Improved Their Ability to Do Their Jobs                               31








Abbreviations

ACS        Automated Collection System
AUR        Automated Underreporter System
CFLO       Corporate Files On Line
CMM        Capability Maturity Model
CSR        customer service representative
ICP        Integrated Case Processing
IDRS       Integrated Data Retrieval System
IRS        Internal Revenue Service
KPA        key process area
SCE        Software Capability Evaluation
SCRIPS     Service Center Recognition Image Processing System
SEI        Software Engineering Institute
TSM        tax systems modernization


Appendix I

Description of Methodology


                                       This section describes the methodology we used to evaluate the software
                                       development capabilities of the organizations that are developing ICP
                                       software. The Software Capability Evaluation (SCE) is a method for
                                       evaluating agencies’ and contractors’ software development processes
against the Software Engineering Institute's (SEI) five-level software
                                       Capability Maturity Model (CMM), as shown in table I.1. These levels, the
                                       key process areas (KPA) described within each level, and the goals within
                                       each KPA, define an organization’s ability to develop software and can be
                                       used to guide software development process improvement activities. The
                                       findings generated from an SCE identify (1) process strengths that mitigate
                                       risks, (2) process weaknesses that increase risks, and (3) improvement
                                       activities that indicate potential mitigation of risks.

Table I.1: Capability Maturity Model
Levels and Descriptions                Level             Name                    Description
                                       5                 Optimizing              Continuous process improvement is enabled by
                                                                                 quantitative feedback from the process and from
                                                                                 piloting innovative ideas and technologies.
                                       4                 Managed                 Detailed measures of the software process and
                                                                                 product quality are collected. Both the software
                                                                                 process and products are quantitatively understood
                                                                                 and controlled.
                                       3                 Defined                 The software process for both management and
                                                                                 engineering activities is documented, standardized,
                                                                                 and integrated into a standard software process for
                                                                                 the organization. All projects use an approved,
                                                                                 tailored version of the organization’s standard
                                                                                 software process for developing and maintaining
                                                                                 software.
                                       2                 Repeatable              Basic project management processes are
                                                                                 established to track cost, schedule, and
                                                                                 functionality. The necessary process discipline is in
                                                                                 place to repeat earlier successes on projects with
                                                                                 similar applications.
                                       1                 Initial                 The software process is characterized as ad hoc,
                                                                                 and occasionally even chaotic. Few processes are
                                                                                 defined, and success depends on individual effort.
Note: According to an SEI study (Moving on Up: Data and Experience Doing CMM-Based Process Improvement, Technical Report CMU/SEI-95-TR-008, Aug. 1995) of 48 organizations that implemented software process improvement programs, the time required to increase process maturity from level 1 to level 2 averaged 30 months, with a range of 11 months to 58 months.

Source: Capability Maturity Model for Software, Version 1.1 (Technical Report CMU/SEI-93-TR-24, Feb. 1993).








In our July 1995 report, we reported that IRS was a CMM level 1 software
development organization and that unless IRS improved its software
development capability, it was unlikely to build Tax Systems
Modernization (TSM) systems timely or economically. In June 1996, we reported that IRS had begun to act on our recommendations in this area; however, none of the actions was complete or institutionalized. At that time, IRS' Chief Information Officer agreed that IRS was not yet institutionally a CMM level 2 organization but stated that some CMM level 2 processes were being used to develop Integrated Case Processing (ICP). Therefore, we evaluated the ICP software development organizations that were said to be using CMM level 2 requirements.

Specifically, we evaluated two ICP version 2 subsystems that are being
developed in three locations—Dallas, Texas; Austin, Texas; and Fresno,
California. We evaluated the software development processes used on
these projects, focusing on KPAs necessary to achieve a “repeatable”
capability or CMM level 2. According to SEI, organizations that have a
repeatable software development process have been able to significantly
improve their productivity and return on investment. In contrast,
organizations that have not developed the process discipline necessary to
better manage and control their projects at the repeatable level incur
greater risk of schedule delay, cost overruns, and poor quality software.23
These organizations rely solely upon the variable capabilities of
individuals, rather than on institutionalized processes considered basic to
software development.

According to SEI,24 KPAs for a repeatable capability are considered the most
basic in establishing discipline and control in software development and
are crucial steps for any project to mitigate risks associated with cost,
schedule, and quality. These KPAs are identified and described in table I.2.




23 Capability Maturity Model for Software, Version 1.1 (Technical Report CMU/SEI-93-TR-24, Feb. 1993).

24 Software Capability Evaluation, Version 2.0, Method Description (CMU/SEI-94-TR-06, June 1994).







Table I.2: Capability Maturity Model Level 2 "Repeatable" KPA Descriptions

Requirements management: Defining, validating, and prioritizing requirements, such as functions, performance, and delivery dates.

Software project planning: Developing estimates for the work to be performed, establishing the necessary commitments, and defining the plan to perform the work.

Software project tracking and oversight: Tracking and reviewing software accomplishments and results against documented estimates, commitments, and plans and adjusting these based on the actual accomplishments and results.

Software subcontract management: Selecting qualified contractors and managing them effectively.

Software quality assurance: Reviewing and auditing the software products and activities to ensure that they comply with the applicable processes, standards, and procedures and providing the staff and managers with the results of their reviews and audits.

Software configuration management: Selecting project baseline items, such as specifications; systematically controlling these items and changes to them; and recording and reporting status and change activity for these items.

Source: Software Capability Evaluation: VA's Software Development Process Is Immature (GAO/AIMD-96-90, June 19, 1996).




Appendix II

Customer Service Representatives’ Views on
Integrated Case Processing System


Table II.1: CSRs' Views on the Extent That ICP Has Allowed Them to Improve Customer Service in Selected Areas (as a Percentage of All Comments)

Columns show the extent to which ICP has improved customer service, in this order: To a very great extent / To a great extent / To a moderate extent / To some extent / To little or no extent / Uncertain.

Reducing response time to taxpayers                        25 / 33 / 18 / 10 / 10 /  4
Increasing customer satisfaction                           14 / 43 / 21 /  7 /  9 /  7
Increasing initial contact resolution                      16 / 38 / 23 /  8 / 10 /  4
Decreasing Case Inventory Delivery System cycle time (a)   26 / 30 / 15 /  4 /  4 / 20
Increasing productivity                                    19 / 45 / 17 /  8 /  7 /  5
Increasing accuracy                                        18 / 41 / 19 /  9 /  9 /  4
Increasing revenue collection                              12 / 28 / 22 /  8 / 10 / 20
Increasing taxpayer compliance                             12 / 28 / 22 / 12 / 11 / 14
Reducing taxpayer burden                                   10 / 29 / 26 /  9 / 14 / 12

(a) Centralized Inventory and Distribution System (CIDs) is IRS' forms-ordering system.

Source: Interviews with CSRs.




Table II.2: CSRs' Views on the Extent That Various ICP Capabilities Have Improved Their Ability to Do Their Jobs (as a Percentage of All Comments)

Columns show the extent to which ICP has improved CSRs' ability to do their jobs, in this order: To a very great extent / To a great extent / To a moderate extent / To some extent / To little or no extent / Uncertain.

Requesting forms                       35 / 29 / 12 /  7 / 12 /  5
Accessing external systems             49 / 28 /  9 /  3 /  7 /  4
Using history                          11 / 15 /  9 /  3 / 39 / 24
Using calculator                        4 /  6 / 11 /  6 / 49 / 23
Using clipboard                         8 / 10 / 11 /  7 / 36 / 28
Using calendar                         16 / 23 / 13 / 10 / 23 / 15
Using report trouble                    7 / 10 /  5 /  3 / 34 / 42
Using IDRS tutorial                     5 /  6 /  5 /  4 / 46 / 34
Using reasonable cause assistant        2 /  7 /  5 /  3 / 44 / 39
Using SERP (a)                          7 / 13 /  9 /  5 / 31 / 35
Using CSR newsletter                   22 / 17 / 11 /  8 / 13 / 29
Using levy source listing              20 / 17 / 11 /  1 / 19 / 32

(a) Servicewide Electronic Research Project (SERP) is IRS' automated system for researching IRS publications.

Source: Interviews with CSRs.




Appendix III

Detailed Software Capability Evaluation
Results for the Fresno, Dallas, and Austin
Development Centers
               Table III.1 summarizes our detailed findings from our software capability
               evaluation at three of IRS’ ICP development centers. As mentioned in
               appendix I, we evaluated the software development processes used on ICP
               software development projects at three centers, focusing on the key
               process areas (KPA) necessary to achieve a Capability Maturity Model
               (CMM) level 2 rating. CMM level 2 is achieved by satisfying all of the five KPAs
               under this level. To satisfy a given KPA, all of that area’s goals must be
               satisfied. Satisfying a goal, in turn, requires effectively meeting all of the
               activities associated with that goal. Table III.1 identifies whether each of
               the IRS development centers satisfied the KPAs, the associated goals, and
               activities.

               In accordance with the Software Engineering Institute’s (SEI) CMM
               assessment methodology, the activities within the respective goals are
               characterized as (1) a “strength” if IRS’ implementation of the activity was
               effective, (2) a “weakness” if IRS’ implementation of the CMM activity was
               ineffective, or IRS failed to implement an acceptable alternative, and
               (3) “not applicable” if the activity does not apply to the center’s software
               development environment. Therefore, in table III.1, a goal is classified as
               “not satisfied” when any associated activity is classified as a “weakness”
               and a KPA is classified as “not satisfied” when any associated goal is
               classified as “not satisfied.”
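The classification rule just described is mechanical, and a short sketch may make the rollup concrete. The following is a minimal illustration of the scoring logic only; the example ratings are invented and are not data from table III.1.

    # Hypothetical sketch of the CMM rating rollup described above.
    # An activity is rated "strength", "weakness", or "not applicable";
    # a goal is satisfied only if no associated activity is a weakness;
    # a KPA is satisfied only if all of its goals are satisfied; and
    # CMM level 2 requires every level 2 KPA to be satisfied.

    def goal_satisfied(activity_ratings):
        return all(rating != "weakness" for rating in activity_ratings)

    def kpa_satisfied(goals):
        return all(goal_satisfied(acts) for acts in goals.values())

    def level2_achieved(kpas):
        return all(kpa_satisfied(goals) for goals in kpas.values())

    # Invented example: one weakness anywhere is enough to lose the rating.
    example = {
        "requirements management": {
            "goal 1": ["weakness"],
            "goal 2": ["strength", "weakness"],
        },
        "software project planning": {
            "goal 2": ["strength", "not applicable"],
        },
    }

    print(level2_achieved(example))  # False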








Table III.1: Detailed Software Capability Evaluation Results, by KPA, Goal, and Activity (ratings are listed in the order Fresno / Dallas / Austin)

Requirements management: to establish a common understanding between the customer and the software project of the customer's requirements that will be addressed by the software project. KPA rating: Not satisfied / Not satisfied / Not satisfied.

Goal 1: System requirements allocated to software are controlled to establish a baseline for software engineering and management use. (Not satisfied / Not satisfied / Not satisfied)
- The software engineering group reviews the allocated requirements before they are incorporated into the software project. (Weakness / Weakness / Weakness)

Goal 2: Software plans, products, and activities are kept consistent with the system requirements allocated to software. (Not satisfied / Not satisfied / Not satisfied)
- The software engineering group uses the allocated requirements as a basis for software plans, work products, and activities. (Weakness / Weakness / Weakness)
- Changes to the allocated requirements are reviewed and incorporated into the software project. (Not applicable / Weakness / Weakness)

Software configuration management: to establish and maintain the integrity of products of the software project throughout the project's software life cycle. KPA rating: Not satisfied / Not satisfied / Not satisfied.

Goal 1: Software configuration management activities are planned. (Not satisfied / Not satisfied / Not satisfied)
- A software configuration management plan is prepared for each software project according to a documented procedure. (Weakness / Weakness / Weakness)
- A documented and approved software configuration management plan is used as a basis for performing software configuration management activities. (Weakness / Weakness / Weakness)

Goal 2: Selected software work products are identified, controlled, and available. (Not satisfied / Not satisfied / Not satisfied)
- A documented and approved software configuration management plan is used as a basis for performing software configuration management activities. (Weakness / Weakness / Weakness)
- A configuration management library system is established as a repository for the software baselines. (Weakness / Weakness / Weakness)
- The software work products to be placed under configuration management are identified. (Weakness / Weakness / Weakness)
- Products from the software baseline library are created and their release is controlled according to a documented procedure. (Weakness / Weakness / Weakness)
Goal 3: Changes to identified software work products are controlled. (Not satisfied / Not satisfied / Not satisfied)
- Change requests and problem reports for all configuration items/units are initiated, recorded, approved, and tracked according to a documented procedure. (Weakness / Weakness / Weakness)
- Changes to baselines are controlled according to a documented procedure. (Weakness / Weakness / Weakness)

Goal 4: Affected groups and individuals are informed of the status and content of software baselines. (Not satisfied / Not satisfied / Not satisfied)
- The status of configuration items/units is recorded according to a documented procedure. (Weakness / Weakness / Weakness)
- Standard reports documenting the Software Configuration Management activities and the contents of the software baseline are developed and made available to affected groups and individuals. (Weakness / Weakness / Weakness)
- Software baseline audits are conducted according to documented procedures. (Weakness / Weakness / Weakness)

Software quality assurance: to provide management with appropriate visibility into the process being used by the software project and of the products being built. KPA rating: Not satisfied / Not satisfied / Not satisfied.

Goal 1: Software quality assurance activities are planned. (Not satisfied / Not satisfied / Not satisfied)
- A software quality assurance plan is prepared for the software project according to a documented procedure. (Weakness / Weakness / Weakness)
- Software quality assurance group's activities are performed in accordance with the software quality assurance plan. (Weakness / Weakness / Weakness)

Goal 2: Adherence of software products and activities to the applicable standards, procedures, and requirements is verified objectively. (Not satisfied / Not satisfied / Not satisfied)
- Software quality assurance group's activities are performed in accordance with the software quality assurance plan. (Weakness / Weakness / Weakness)
- Software quality assurance group participates in the preparation and review of the project's software development plan, standards, and procedures. (Weakness / Weakness / Weakness)
- Software quality assurance group reviews the software engineering activities to verify compliance. (Weakness / Weakness / Weakness)
- Software quality assurance group audits designated software work products to verify compliance. (Weakness / Weakness / Weakness)
Goal 3: Affected groups and individuals are informed of software quality assurance activities and results. (Not satisfied / Not satisfied / Not satisfied)
- Software quality assurance group periodically reports the results of its activities to the software engineering group. (Strength / Weakness / Weakness)
- Deviations identified in the software activities and software work products are documented and handled according to a documented procedure. (Weakness / Weakness / Weakness)
- Software quality assurance group conducts periodic reviews of its activities and findings with the customer's software quality assurance personnel, as appropriate. (Not applicable / Weakness / Weakness)

Goal 4: Noncompliance issues that cannot be resolved within the software project are addressed by senior management. (Not satisfied / Not satisfied / Not satisfied)
- Deviations identified in the software activities and software work products are documented and handled according to a documented procedure. (Weakness / Weakness / Weakness)

Software project planning: to establish reasonable plans for performing the software engineering and for managing the software project. KPA rating: Not satisfied / Not satisfied / Not satisfied.

Goal 1: Software estimates are documented for use in planning and tracking the software project. (Not satisfied / Not satisfied / Not satisfied)
- Estimates for the size of software work products (or changes to the size of the software work products) are derived according to a documented procedure. (Weakness / Weakness / Weakness)
- Estimates for the software project's effort and cost are derived according to a documented procedure. (Weakness / Weakness / Weakness)
- Estimates for the project's critical computer resources are derived according to a documented procedure. (Weakness / Weakness / Weakness)
- The project's software schedule is derived according to a documented procedure. (Strength / Weakness / Weakness)
- Software planning data are recorded. (Weakness / Weakness / Weakness)

Goal 2: Software project activities and commitments are planned and documented. (Not satisfied / Not satisfied / Not satisfied)
- Software project planning is initiated in the early stages of, and in parallel with, the overall project planning. (Strength / Weakness / Weakness)
- A software life cycle with predefined stages of manageable size is identified or defined. (Strength / Strength / Strength)

(continued)
Appendix III
Detailed Software Capability Evaluation
Results for the Fresno, Dallas, and Austin
Development Centers




Level 2 KPA/Purpose                             Goals                                       A
                                                                                            T
                                                                                            a
                                                                                            T
                                                                                            S
                                                                                            m
                                                                                            T
                                                                                            a
                                                                                            a
                                                                                            P
                                                                                            s
                                                Goal 3
                                                Affected groups and individuals agree to
                                                their commitments related to the software
                                                project.
                                                                                            T
                                                                                            p
                                                                                            T
                                                                                            g
                                                                                            c
                                                                                            S
                                                                                            e
                                                                                            m
Software project tracking and oversight:
to provide adequate visibility into actual
progress so that management can take
effective actions when the software project’s
performance deviates significantly from the
software plans.
                                                Goal 1
                                                Actual results and performances are
                                                tracked against the software plans.
                                                                                            A
                                                                                            t
                                                                                            T
                                                                                            t
                                                                                            a
                                                                                            T
                                                                                            c
                                                                                            T
                                                                                            c
                                                                                            T
                                                                                            a
                                                                                            T
                                                                                            c
                                                                                            T
                                                                                            a




Page 40                                              GAO/GGD/AIMD-97-31 IRS’ ICP System
                                                Appendix III
                                                Detailed Software Capability Evaluation
                                                Results for the Fresno, Dallas, and Austin
                                                Development Centers




Activity                                                                   Fresno            Dallas            Austin
The project’s software development plan is developed                       Weakness          Weakness          Weakness
according to a documented procedure.
The plan for the software project is documented.                           Not applicable    Weakness          Weakness
Software work products that are needed to establish and                    Not applicable    Not applicable    Weakness
maintain control of the software project are identified.
The software risks associated with the cost, resource, schedule,           Strength          Weakness          Weakness
and technical aspects of the project are identified, assessed,
and documented.
Plans for the project’s software engineering facilities and                Not applicable    Weakness          Weakness
support tools are prepared.
                                                                           Not satisfied     Not satisfied     Not satisfied



The software engineering group participates on the project                 Strength          Strength          Strength
proposal team.
The software engineering group participates with other affected            Strength          Weakness          Weakness
groups in the overall project planning throughout the project life
cycle.
Software project commitments made to individuals and groups                Weakness          Not applicable    Weakness
external to the organization are reviewed with senior
management according to a documented procedure.
                                                                           Not satisfied     Not satisfied     Not satisfied




                                                                           Not satisfied     Not satisfied     Not satisfied


A documented software development plan is used for tracking                Weakness          Weakness          Weakness
the software activities and communicating status.
The size of the software work products(or size of the changes to           Weakness          Weakness          Weakness
the software work products) are tracked, and corrective actions
are taken as necessary.
The project’s software effort and costs are tracked, and                   Strength          Strength          Strength
corrective actions are taken as necessary.
The project’s critical computer resources are tracked, and                 Weakness          Weakness          Weakness
corrective actions are taken as necessary.
The project’s software schedule is tracked, and corrective                 Weakness          Weakness          Weakness
actions are taken as necessary.
The software engineering technical activities are tracked, and             Weakness          Weakness          Weakness
corrective actions are taken as necessary.
The software risks associated with cost, resource, schedule,               Weakness          Weakness          Weakness
and technical aspects of the project are tracked.
                                                                                                                        (continued)



                                                Page 41                                        GAO/GGD/AIMD-97-31 IRS’ ICP System
Appendix III
Detailed Software Capability Evaluation
Results for the Fresno, Dallas, and Austin
Development Centers




Level 2 KPA/Purpose                          Goals                                        A
                                                                                          A
                                                                                          p
                                                                                          T
                                                                                          r
                                                                                          i
                                                                                          F
                                                                                          t
                                                                                          m
                                             Goal 2
                                             Corrective actions are taken and managed
                                             to closure when actual results and
                                             performance deviate significantly from the
                                             software plans.
                                                                                          T
                                                                                          t
                                                                                          T
                                                                                          c
                                                                                          T
                                                                                          c
                                                                                          T
                                                                                          c
                                                                                          T
                                                                                          a
                                                                                          T
                                                                                          c
                                                                                          A
                                                                                          p
                                             Goal 3
                                             Changes to software commitments are
                                             agreed to by the affected groups and
                                             individuals.
                                                                                          S
                                                                                          m
                                                                                          r
                                                                                          p
                                                                                          A
                                                                                          p
                                                                                          e




Page 42                                           GAO/GGD/AIMD-97-31 IRS’ ICP System
                                              Appendix III
                                              Detailed Software Capability Evaluation
                                              Results for the Fresno, Dallas, and Austin
                                              Development Centers




Activity                                                                 Fresno            Dallas            Austin
Actual measurement and replanning data for the software                  Weakness          Weakness          Weakness
project are recorded.
The software engineering group conducts periodic internal                Weakness          Weakness          Strength
reviews to track technical progress, plans, performance, and
issues.
Formal reviews to address the accomplishments and results of             Weakness          Weakness          Weakness
the software project are conducted at selected project
milestones according to a documented procedure.
                                                                         Not satisfied     Not satisfied     Not satisfied




The project’s software development plan is revised according             Weakness          Weakness          Weakness
to a documented procedure.
The size of the software work products are tracked, and                  Weakness          Weakness          Weakness
corrective actions are taken as necessary.
The project’s software effort and costs are tracked, and                 Strength          Strength          Strength
corrective actions are taken as necessary.
The project’s critical computer resources are tracked, and               Weakness          Weakness          Weakness
corrective actions are taken as necessary.
The project’s software schedule is tracked, and corrective               Weakness          Strength          Strength
actions are taken as necessary.
The software engineering technical activities are tracked, and           Weakness          Weakness          Weakness
corrective actions are taken as necessary.
Actual measurement and replanning data for the software                  Weakness          Weakness          Weakness
project are recorded.
                                                                         Not satisfied     Not satisfied     Not satisfied



Software project commitments and changes to commitments                  Weakness          Not applicable    Not applicable
made to individuals and groups external to the organization are
reviewed with senior management according to a documented
procedure.
Approved changes to commitments that affect the software                 Strength          Strength          Weakness
project are communicated to the members of the software
engineering group and other software related groups.




                                              Page 43                                        GAO/GGD/AIMD-97-31 IRS’ ICP System
Appendix IV

Major Contributors to This Report


General Government Division, Washington, D.C.
  Robert L. Giusti, Assignment Manager
  Christopher Hess, Evaluator

Atlanta Field Office
  A. Carl Harris, Issue Area Manager
  David Schechter, Evaluator
  Karen B. Thompson, Evaluator
  Sara Bingham, Reports Analyst

Accounting and Information Management Division, Washington, D.C.
  Leonard Baptiste, Jr., Senior Assistant Director
  Kelly A. Wolslayer, Senior Information Systems Analyst
  Madhav S. Panwar, SCE Team Leader
  David Chao, SCE Team Member
  Nancy M. Donnellan, Information Systems Analyst
  Leonard J. Latham, SCE Team Member
  K. Alan Merrill, SCE Team Member
  Paul Silverman, SCE Team Member
Ordering Information

The first copy of each GAO report and testimony is free.
Additional copies are $2 each. Orders should be sent to the
following address, accompanied by a check or money order
made out to the Superintendent of Documents, when
necessary. VISA and MasterCard are also accepted.
Orders for 100 or more copies to be mailed to a single address
are discounted 25 percent.

Orders by mail:

U.S. General Accounting Office
P.O. Box 6015
Gaithersburg, MD 20884-6015

or visit:

Room 1100
700 4th St. NW (corner of 4th and G Sts. NW)
U.S. General Accounting Office
Washington, DC

Orders may also be placed by calling (202) 512-6000
or by using fax number (301) 258-4066, or TDD (301) 413-0006.

Each day, GAO issues a list of newly available reports and
testimony. To receive facsimile copies of the daily list or any
list from the past 30 days, please call (202) 512-6000 using a
touchtone phone. A recorded menu will provide information on
how to obtain these lists.

For information on how to access GAO reports on the INTERNET,
send an e-mail message with "info" in the body to:

info@www.gao.gov

or visit GAO’s World Wide Web Home Page at:

http://www.gao.gov



