
Battlefield Automation: Software Problems Hinder Development of the Army's Maneuver Control System


United States General Accounting Office

GAO Report to the Secretary of Defense

October 1997

BATTLEFIELD AUTOMATION: Software Problems Hinder Development of the Army’s Maneuver Control System

GAO/NSIAD-98-15

United States General Accounting Office
Washington, D.C. 20548

National Security and International Affairs Division

             B-276830

             October 16, 1997

             The Honorable William S. Cohen
             The Secretary of Defense

             Dear Mr. Secretary:

The Army has spent over $765 million of the $1 billion estimated total cost for the Maneuver Control System (MCS), which is to provide battlefield information to maneuver commanders. Since 1980, the MCS program has experienced numerous problems, such as fielding inadequate computer software and canceling the development of one software version due to design flaws, cost growth, and schedule slips. Given the program’s past difficulties and the important role of MCS in the Army’s battlefield automation efforts, we reviewed the Army’s development and acquisition plans for MCS. Specifically, our objectives were to determine whether (1) the current MCS software development strategy is appropriate to overcome prior development problems and (2) 207 new computers for MCS-related training should be procured as planned.


Background

The goal of the Army’s MCS program is to develop and field a computer system that provides automated critical battlefield assistance to maneuver commanders and their battle staff at the corps-to-battalion level. MCS is intended to enable the command staff to collect, store, process, display, and disseminate critical data to produce and communicate battle plans, orders, and enemy and friendly situational reports. It is a key component of the Army Tactical Command and Control System, which is also intended to enhance the coordination and control of combat forces through automated management of five key battlefield areas, including maneuver control.[1] Given its role to communicate battle plans, orders, and enemy and friendly situation reports, MCS is also a key component of the Army’s ongoing efforts to digitize (automate) its battlefield operations.

[1] The other battlefield functional areas are air defense, fire support, intelligence and electronic warfare, and combat service support.

In 1980, the Army fielded the first MCS system—with limited command, control, and communications capabilities—to VII Corps in Europe. In 1982, the Army awarded a 5-year contract to continue MCS development, and by 1986 MCS software had evolved to version 9, also fielded in Europe. In 1987, the Army performed post-deployment tests on version 9 in Germany. The results of those tests led the Army Materiel Systems Analysis Activity to conclude that MCS did not exhibit adequate readiness for field use and recommend that further fielding not occur until the system’s problems were resolved.[2] However, the Army awarded a second 5-year contract that resulted in version 10, which was fielded by April 1989 and remains in the field today. In November 1989, the Army Materiel Systems Analysis Activity reported that MCS had met only 30 percent of its required operational capabilities and again recommended that the system not be released for field use. In May 1990, operational testers again questioned the system’s functional ability and effectiveness because it could not produce timely, accurate, and useful information in a battle environment.

[2] In April 1984, the Army Materiel Command designated the Army Materiel Systems Analysis Activity as its independent evaluator for materiel releases of major and other high-visibility systems.

                   While earlier versions of MCS were being fielded and withdrawn, the
                   development of software continued. In 1988, the Army awarded a contract
                   for the development of version 11. By February 1993, the Army stopped
                   development of version 11 software due to multiple program slips, serious
                   design flaws, and cost growth concerns. The program was then
                   reorganized with a plan approved by the Office of the Secretary of Defense
                   in April 1993. Under the reorganized program, a group of contractors and
                   government software experts have been working to develop the next
                   version of MCS software—version 12.01—utilizing software segments that
                   could be salvaged from the failed version 11 effort.

                   In addition to software, the MCS system consists of computers procured
                   under the Army’s Common Hardware and Software (CHS) effort, which was
                   undertaken to reverse the proliferation of program-unique computers and
                   software. The Army planned to acquire 288 of the CHS computers in fiscal
                   years 1997 and 1998 to support the MCS training base, and has already
                   acquired 81. Those computers were used in a training base assessment to
                   support a decision to acquire the remaining 207 computers.


Results in Brief

Since its 1993 reorganization, the Maneuver Control System has continued to experience development problems. The initial operational test and evaluation of version 12.01 software has slipped 28 months, from November 1995 to March 1998, and interim tests have shown that significant software problems continue. Despite these problems, the Army awarded a contract in September 1996 for the concurrent development of the next software versions—12.1, 12.2, and 12.3—which are being developed by a new contractor and may involve substantially different software. If the Army’s current development strategy for the Maneuver Control System is not strengthened, development problems may continue to occur. Currently, the Army’s strategy allows (1) less than full operational testing of version 12.1 and (2) development of follow-on versions 12.2 and 12.3 to start about 18 months before the operational testing of each version’s predecessor.

Despite the fact that the Maneuver Control System has yet to undergo an initial operational test and evaluation or be approved for production, the Army plans to acquire 207 computers in fiscal years 1997 and 1998 to increase the number of computers available for system training. Program officials stated that they need to acquire the computers before operational testing to provide not only MCS-specific training but also training for the larger Army Battle Command System, of which the Army Tactical Command and Control System and the Maneuver Control System are major components. The 207 computers, however, are not needed to satisfy any of the three legislated reasons for low-rate initial production before an initial operational test and evaluation.[3]

[3] Title 10 U.S.C. 2400 provides that low-rate initial production of systems, except for ships and satellites, is to produce the minimum quantity necessary to (1) provide production-configured or representative articles for operational test and evaluation, (2) establish an initial production base for the system, and (3) permit an orderly increase in the production rate for the system sufficient to lead to full-rate production upon the successful completion of operational test and evaluation.


Development Problems Continue

Since its reorganization in 1993, MCS program experience indicates continuing problems in the system’s development. Specifically, (1) the MCS initial operational test and evaluation of version 12.01 has slipped twice, (2) interim developmental level tests and a customer test done to support a decision to award a contract to develop follow-on software show that significant problems continue, and (3) development of follow-on version 12.1 was begun despite the results of the customer test and prior program history.


Operational Testing Schedules Slip

After the 1993 program reorganization, version 12.01 was scheduled to undergo initial operational testing and evaluation in November 1995. The test slipped to November 1996 and is now scheduled for March 1998. Program officials stated that the test date slipped initially because the CHS computers to be used were not yet available.

During August and September 1996, version 12.01 underwent a system confidence demonstration to determine whether it was ready for the November 1996 initial operational test and evaluation. Because the software was not ready, further work and two additional system confidence demonstrations followed in August and September 1996. Both demonstrations indicated that the system was not ready for operational testing. Additionally, the software still had an open priority one software deficiency and priority three and four deficiencies that would have negatively impacted the conduct of the operational test.[4]

[4] Software deficiencies are rated in a priority system, from priority one—the most critical—to priority five—the least critical. An open software deficiency is a deficiency identified through testing that is not considered to be resolved.

Both the Army’s Operational Test and Evaluation Command and the Department of Defense’s (DOD) Director of Operational Test and Evaluation (DOT&E) had stated that there could be no open priority one or two software deficiencies before the operational test. They had also stated that there could not be any open priority three and four deficiencies that, in combination, were likely to have a detrimental effect on the system’s performance. DOT&E staff told us that there were a number of open priority three and four software deficiencies that they believe would have had a detrimental effect. When MCS program officials realized that these deficiencies would not be resolved in time for the initial operational test, they downgraded the test, 3 weeks before it was to occur, to a limited user test,[5] utilizing $8.5 million appropriated for the MCS operational test in fiscal years 1996 and 1997.[6] That test was conducted in November 1996. While the test report has not been finalized, a draft version states that MCS—in the tested configuration—is not operationally effective or suitable.

[5] The limited user test involved the same testing planned for the initial operational test and evaluation. It was limited in that it had no pass/fail criteria and was not to be used to support a full-rate production decision. It served as a learning experience, providing information on the current maturity of the MCS software and a baseline of performance by which to judge future development efforts.

[6] MCS program officials stated that at the time it became apparent that MCS was not ready for its initial operational test and evaluation, it made sense to go forward with the test as a limited user test because the user had been trained; the equipment was instrumented for the test; and the users, equipment, and testers were all in place. In providing technical comments on a draft of this report, DOD stated that, given the sunk costs, the Army’s decision to go forward with the test made sense because an operational test could provide invaluable feedback to the MCS developers that could not be obtained through technical testing.



Interim Development Level Tests Indicate Continuing Problems

Throughout the development of version 12.01, interim software builds have undergone numerous performance tests to determine the current state of software development, and build 4 was subjected to a customer test.[7] The results of those tests identified continuing problems as the number of builds proceeded. For example, a December 1995 performance test report on build 3.0 stated that, if the problems found during the test were not quickly corrected in build 3.1, then the risk to the program might be unmanageable. The follow-on April 1996 performance test report of build 3.1 stated that significant problems in system stability prevented proper testing of several requirements. The report further stated that messaging between battlefield functional areas was extremely difficult and problematic and that the system had other stability problems.

                          A September 1996 performance test report stated that of 568 previously
                          open deficiency reports from builds 5.1 through 5.2c, 165, almost
                          29 percent, still remained open. This report, the last published on an MCS
                          performance test, reflected the state of the MCS software shortly before the
                          downgraded limited user test, in which MCS failed to demonstrate either
                          operational effectiveness or suitability. More recent performance tests of
                          later builds have been done; however, separate reports on those test
                          events have not been issued. Rather, the program office plans to prepare
                          an integrated test report in October or November 1997.


Concurrent Contract Was Awarded for Follow-on Software Development

In April 1994, the MCS program office released a plan to begin follow-on software development while version 12.01 was still in development. In a May 1995 memorandum, the Deputy DOT&E expressed concern regarding this plan. He stated that, because version 12.01 was being

“developed by a confederation of contractors who have built this current version of MCS on the salvaged ‘good’ portions of the abruptly terminated development of MCS Version 11, it needs to stand the rigor of an Independent Operational Test and Evaluation . . . before a MCS Block IV [post version 12.01 development] contract is awarded.”



To help determine the level of risk in proceeding under the Army’s development strategy, DOT&E stated in a June 1995 memorandum that an operational test of version 12.01 should be conducted to measure the software’s maturity before the award of a contract for the development of follow-on versions. As a result, an operational assessment—called the MCS customer test—was conducted on version 12.01 in April 1996 to support the award of a $63.1 million contract for the development of MCS Block IV software—MCS versions 12.1, 12.2, and 12.3.


[7] A software build involves additions or changes to the software to add new functions or correct deficiencies in the prior build software. Development of a new software version under the evolutionary software development philosophy is accomplished by multiple intraversion software builds.
No pass/fail criteria were set for the customer test. However, DOT&E
directed that four operational issues be tested. Those issues related to
(1) the capacity of the system to store and process required types and
amounts of data, including the ability of the staff users to frequently
update the information database; (2) the capabilities of the MCS network to
process and distribute current and accurate data using the existing
communications systems; (3) the impact of computer server outages on
continuity of operations; and (4) the system administration and control
capabilities to initialize the system, become fully operational, and sustain
operations.

In its report on the customer test, the Army’s Test and Experimentation
Command stated that, at the time of the test, MCS was evolving from a
prototype system to one ready for initial operational test and evaluation
and, as such, possessed known limitations that were described to the
system users during training. The Command reported that the test’s major
limitations included (1) software that did not contain the full functional
capability planned for the initial operational test and evaluation; (2) a need
to reboot the system after crashes caused by the use of the computer’s
alternate function key; (3) two changes in software versions during
training; and (4) the fact that 65 percent of the system manager functions
had not been implemented or trained. Table 1 provides more detail on the
customer test results.








Table 1: Customer Test Operational Issues and Associated Army Test and Experimentation Command Comments

Operational issue 1: Capacity of the system to store and process the required types and amounts of data, including the ability of the staff users to update the required database frequently.

Army Test and Experimentation Command comments: “The system consistently locked up and had to be rebooted while the staff user was attempting to process . . . data. This resulted in the loss of all data in working files and any data in the queues awaiting distribution or processing to a database.”

“Staff users rated the systems capability to process and provide . . . data, and to assist the staff in the performance of their duties as marginal.”

“Storing and processing . . . data was adequately demonstrated by [MCS] for only two functions: editing specified reports and processing specified messages. . . . application software for the other functions . . . performed inconsistently and rendered the system unreliable.”

Operational issue 2: The capabilities of the MCS network to process and distribute current and accurate data using the existing communications systems.

Comments: “The [system to distribute data] and the distributed computing environment did not work as required. The systems locked up and the message handler backed up (sometimes with thousands of messages). The test officer noted . . . that . . . the dedicated server, had a message queue backlog of 19,000 messages. This situation, combined with the necessity to reboot the system . . . throughout the test, caused backlogged messages to be lost. The staff users were often unable to initiate or complete tasks and transmit data between the nodes.”

Operational issue 3: Impact of computer server outages on continuity of operations.

Comments: The Army Test and Experimentation Command’s report indicates that the third operational issue was met, stating that the success rate for continuity of operations was 100 percent.

Operational issue 4: System administration and control capabilities to initialize the system, become fully operational, and sustain operations.

Comments: “Sixty five percent of the system manager functions are not yet implemented, and were not trained.”

“The results indicate that system administration and control capabilities functions are incomplete for this build of software. Additionally, poor system performance, and an immature training program hampered the user’s ability to sustain operations.”

Source: Maneuver Control System/Phoenix: Customer Test Report, Command, Control, and Communications Test Directorate; Army Test and Experimentation Command, 1996-CT-1302, June 1996.



In addition to these findings, the MCS test officer stated the following:

•   “System performance degraded over time causing message backlogs, loss of data, and numerous system reboots. Over a period of 12 operational hours the [data distributing system] slowed down and created message backlogs of up to 4 hours. To remain functional, the entire network of [MCS] systems must be shut down and reinitialized in proper sequence.”

•   “The staff users had great difficulty using ... [multiple] applications.”

•   “The software pertaining to system management functions was immature, incomplete and lacked documentation. This capability is critical to the effective use and operation of the [MCS] system.”

                                     Even though the customer test did not involve pass/fail criteria, based on
                                     our review of the test report and the test officer’s comments, we believe
                                     that only the third operational issue—impact of computer server outages
                                     on continuity of operations—was met. Despite the results of the customer
                                     test and the program’s prior history, the Under Secretary of Defense for
                                     Acquisition and Technology approved the Army’s plan to award a
                                     concurrent contract for MCS Block IV software development—MCS versions
                                      12.1, 12.2, and 12.3.

                                     In September 1996, the Army awarded a contract for the development of
                                     MCS software versions 12.1, 12.2, and 12.3 to a different contractor than the
                                     developers of MCS version 12.01. At that time, version 12.01 was still
                                     scheduled to undergo its initial operational testing in November 1996. The
                                     start of the follow-on development could have been timed to occur after
                                     version 12.01 had completed that operational testing. At most, this action
                                     would have delayed the contract award 2 months, assuming that the initial
                                     operational test had occurred in November 1996 as scheduled. However,
                                     the contract was awarded before the initial operational test, and the
                                      planned 5-month concurrency in the development of versions 12.01 and
                                     12.1 became 18 months when the operational test slipped to March 1998.

                                     The current program schedule indicates that (1) version 12.1 is expected
                                     to undergo its operational assessment/test about 1 year after the fielding of
                                     version 12.01 is started and (2) version 12.1 fielding is to be done 5 months
                                     after initial operational capability of version 12.01 is achieved. If the
                                     scheduled version 12.01 operational test and evaluation slips again and the
                                     version 12.1 contractor is able to maintain its development schedule,
                                     version 12.1 could become available before version 12.01.

Army Requested Flexibility for Operational Testing of Follow-on Software

By May 1997, the Army requested DOD approval of a revised acquisition program baseline that changes the planned follow-on operational test and evaluation of versions 12.1, 12.2, and 12.3 to operational assessments/operational tests. Program officials said that, although the name of the tests had changed, the planned scope of the tests had not. However, the officials said that the name change complies with guidance from DOT&E, which lists multiple levels of operational test and evaluation (from an abbreviated assessment to full operational test) and outlines a risk assessment methodology to be used to determine the level of testing to be performed. The officials further stated that the use of the generic term operational test/operational assessment permits possible changes to the level of testing for version 12.1 and follow-on software increments based on the risk assessment process.

The contractors competing for the MCS Block IV (MCS versions 12.1, 12.2, and 12.3) development were given access to the government’s 12.01 code and allowed to reuse as much of it as they chose. The Block IV developer is not required to reuse any of version 12.01. Rather, the Block IV contract requires the development of software to provide specific functions. Given that (1) version 12.01 software has not passed or even undergone an initial operational test and evaluation and (2) the MCS Block IV contractor building version 12.1 is not the contractor that is building version 12.01 and is only required to develop version 12.1 to provide specified functions, we believe that the version 12.1 development effort should not be viewed as building upon a proven baseline. Instead, it should be viewed as a new effort.

Continuation of Current Development Strategy Could Be Costly

The Army’s current development plan for version 12.1 and beyond, as shown in figure 1, continues an approach of building a follow-on version of software on an incomplete and unstable baseline—the uncompleted preceding version of software.




Figure 1: Future MCS Software Development Schedule

[Figure 1 is a timeline covering fiscal years 1997 through 2001, by quarter, showing the overlapping schedules for MCS software versions 12.01, 12.1, 12.2, and 12.3. The legend distinguishes the initial operational test and evaluation, the full-rate production decision, development, operational assessment/operational test, and fielding.]

Source: Army.




                                                       Additionally, according to an official in the DOD’s Office of the Director of
                                                       Test, Systems Engineering, and Evaluation, the Army’s development
                                                       process allows requirements that are planned for one software version,
                                                       which cannot be accomplished in that version’s development as planned,
                                                       to be deferred to a later version’s development. As a result, this process
                                                       makes judging program risk and total cost very difficult.

The MCS program has previously demonstrated the problem of deferring requirements. For example, during MCS version 11 development, we reported that the Army had deferred seven MCS functions that were to have been developed by June 1992 and included in the software version to undergo operational testing.[8] Even though the version 11 operational test had slipped twice, from May 1992 to September 1992 and then to May 1993, the Army continued to defer those functions, and the operational test was planned for less than the complete software package originally scheduled to be tested.

[8] Battlefield Automation: Planned Production Decision for Army Control System is Premature (GAO/NSIAD-92-151, August 10, 1992).

In commenting on a draft of this report, DOD said that it had made progress not reflected in that draft. Specifically, it noted that there were no priority one or two, and only 22 priority three, software deficiencies open as of September 11, 1997, as compared with 10 priority one, 47 priority two, and 67 priority three deficiencies open on August 16, 1996. While we agree these results indicate that some known problems have been fixed, they provide no indication of the number or severity of still unknown problems. For example, MCS version 12.01 development showed enough progress entering the scheduled November 1996 initial operational test and evaluation to reach a commitment of resources and personnel. However, that test was later downgraded to a limited user test because of software immaturity. Successful completion of an initial operational test and evaluation should provide a more definitive indication of the MCS program’s progress.


Buying Training Base Computers Before Operational Testing Is Questionable

Before the slip of the MCS initial operational test and evaluation from November 1996 to March 1998, the Army planned to acquire 288 computers—150 in fiscal year 1997 and 138 in fiscal year 1998—for the MCS training base. These computers were to be acquired after a full-rate production decision at a total cost of about $34.8 million—$19.1 million in fiscal year 1997 and $15.7 million in fiscal year 1998.

                         After the initial operational test and evaluation slipped, DOD approved the
                         Army’s acquisition of a low-rate initial production of 81 computers in fiscal
                         year 1997 for a training base operational assessment. The purpose of the
                         assessment, which was performed from February to May 1997, was to
                         judge the merits of allowing the Army to procure the remaining computers
                         prior to successful completion of the slipped operational test. On the basis
                         of the results of that assessment, the Acting Under Secretary of Defense
                         for Acquisition and Technology authorized the Army in July 1997 to
                         proceed with its acquisition plans. The Acting Under Secretary noted that
                         the DOT&E had reviewed the assessment and agreed that version 12.01 was
                         adequate for use in the training base.

The Acting Under Secretary also authorized the Army to move the training base computer funds from the MCS budget to the Army’s automated data processing equipment program budget line. This action was necessary because, according to both Army and DOD officials, it was determined that the computers to be acquired do not meet the legislated reasons in 10 U.S.C. 2400 for low-rate initial production. That legislation allows the early acquisition of systems to (1) establish an initial production base, (2) permit an orderly increase in the production rate for the system that is sufficient to lead to full-rate production upon successful completion of operational test and evaluation, and (3) provide production-representative items for operational test and evaluation. Even though the Army now plans to acquire the computers under a different budget line, the intended use of the computers remains unchanged.

MCS program officials said that the computers are needed in the MCS
training base before operational testing to adequately support future
fielding of MCS and the larger Army Battle Command System, of which the
Army Tactical Command and Control System and MCS are key
components. This rationale is the same one the Acting Under Secretary
cited in his July 1997 memorandum. In that memorandum, he stated that
the “requirement to train Army-wide on commercial equipment is a
recognized requirement not only for MCS but for a host of other digital . . .
systems.” The Acting Under Secretary further noted that the funds to be
moved were for equipment needed to support integrated training of
multiple systems throughout the Army and concluded that “training on a
digital system, even if it is not the system that is ultimately fielded, is
important to the Army in order to assist in making the cultural change
from current maneuver control practice to a digitized approach.”

MCS program officials stated that the MCS course curriculum needs to be
developed and that equipping the training base before the completion of
operational testing avoids a 2-year lag between the completion of
operational testing and the graduation of trained students. The officials
also commented that the computers could be used elsewhere, since they
would be compatible with other Army programs.

The legislated requirement[9] that major systems, such as MCS, undergo initial operational test and evaluation before full-rate production serves to limit or avoid premature acquisitions. The Army has had previous experience acquiring ineffective MCS equipment, which is indicative of the need for adequate testing before systems are fielded. In July 1990, the Army began withdrawing over $100 million of militarized MCS hardware from the field due to both hardware and software deficiencies. Additionally, the Army subsequently decided not to deploy other MCS equipment it had procured for light divisions at a cost of about $29 million because the equipment was too bulky and heavy.

[9] Title 10 U.S.C. 2399.


Conclusions and Recommendations

The MCS program’s troubled development and acquisition history has continued since the program’s 1993 reorganization. However, the Army awarded a new contract to develop future software versions and plans to procure computers without fully resolving the problems of earlier versions. This strategy does not minimize the possibility of future development problems or ensure that the Army will ultimately field a capable system. Also, since MCS software version 12.1 is being developed concurrently by a different contractor to functional specifications, it would be prudent to subject the version 12.1 software to the level of operational testing required to support a full-rate production decision, as planned for version 12.01. Accordingly, we believe a more appropriate strategy would require that future software versions be developed using only fully tested baselines, and that each version be judged against specific pre-established criteria.

                         We recommend that you direct the Secretary of the Army to

                     •   set specific required capabilities for each software version beyond version
                         12.01, test those versions against specific pass/fail criteria for those
                         capabilities, and only award further development contracts once problems
                         highlighted in that testing are resolved;
                     •   perform a full operational test and evaluation of MCS software version 12.1
                         to ensure that it provides the full capabilities of version 12.01; and
                     •   procure additional MCS computers only after an initial operational test and
                         evaluation and a full-rate production decision have been completed.


Agency Comments and Our Evaluation

In commenting on a draft of this report, DOD agreed with our recommendation that specific required capabilities for each MCS software version beyond version 12.01 are needed, that those versions should be tested against specific pass/fail criteria for those capabilities, and that the Army should not award further development contracts until problems highlighted in prior tests are resolved. DOD noted that the Army has already set specific required capabilities for those software versions and will test those versions against specific pass/fail criteria to ensure system maturity and determine that the system remains operationally effective and suitable. DOD further stated that it will not support the award of further development contracts until the Army has successfully resolved any problems identified during the testing of related, preceding versions.

DOD partially agreed with our recommendation that the Army be directed to perform a full operational test and evaluation of MCS software version 12.1 to ensure that it provides the full capabilities of version 12.01. DOD stated that the Army will comply with DOD regulation 5000.2R and will follow guidance from the Director of Operational Test and Evaluation, which lists multiple levels of operational test and evaluation (from an abbreviated assessment to full operational test) and outlines a risk assessment methodology to be used to determine the level of testing to be performed. DOD did not, however, indicate whether it would require the Army to conduct a full operational test. We continue to believe that the version 12.1 development effort should not be viewed as building upon a proven baseline. Instead, version 12.1 development should be viewed as a new effort. As a result, we still believe that the prudent action is to require that version 12.1 be subjected to the same level of operational test and evaluation as version 12.01, the level required to support a full-rate production decision.

DOD agreed with our recommendation that it direct the Army to not
procure more MCS computers until the completion of an initial operational
test and evaluation and a full-rate production decision. It stated, however,
that no further direction to the Army is needed as it had already provided
direction to the Army on this issue. Specifically, the Department stated
that it has directed the Army to extract the training base computers from
the MCS program and to not procure or field more MCS hardware to
operational units until successfully completing an initial operational test
and evaluation. Our recommendation, however, is not limited to the
hardware for operational units, but also encompasses the computers the
Army plans to buy for the training base. Given the program’s prior history
and the fact that the training base computers are not needed to satisfy any
of the legislated reasons for low-rate initial production, we continue to
believe that the Army should not be allowed to buy those computers until
MCS has successfully completed its initial operational test and
evaluation—the original plan prior to the MCS initial operational test and
evaluation’s multiple schedule slips.

DOD’s comments are reprinted in their entirety in appendix I, along with
our evaluation. In addition to those comments, we have revised our report
where appropriate to reflect the technical changes that DOD provided in a
separate letter.







Scope and Methodology

To determine whether the current MCS software development strategy is appropriate to overcome prior problems and to determine whether the Army should procure 207 new computers for the expansion of the MCS training base, we interviewed responsible officials and analyzed pertinent documents in the following DOD offices, all in Washington, D.C.: Director of Operational Test and Evaluation; Director of Test, Systems Engineering, and Evaluation; Assistant Secretary of Defense for Command, Control, Communications, and Intelligence; Under Secretary of Defense (Comptroller); and Defense Procurement. In addition, we interviewed responsible officials and analyzed test reports from the office of the Army’s Project Manager, Operations Tactical Data Systems, Fort Monmouth, New Jersey; and the Army’s Operational Test and Evaluation Command, Alexandria, Virginia. To meet our second objective, we also interviewed responsible officials and analyzed pertinent documents from the Army’s Combined Arms Center, Fort Leavenworth, Kansas.

              We conducted our review from March to September 1997 in accordance
              with generally accepted government auditing standards.


              We are sending copies of this report to the Chairman and Ranking
              Minority Members, Senate and House Committees on Appropriations,
              Senate Committee on Armed Services, and House Committee on National
              Security; the Director, Office of Management and Budget; and the
              Secretary of the Army. We will also make copies available to others on
              request.

As you know, the head of a federal agency is required by 31 U.S.C. 720 to submit a written statement on actions taken on our recommendations to the Senate Committee on Governmental Affairs and the House Committee on Government Reform and Oversight not later than 60 days after the date of this report. A written statement must also be submitted to the Senate and House Committees on Appropriations with the agency’s first request for appropriations made more than 60 days after the date of the report.








Please contact me at (202) 512-4841 if you or your staff have any questions
concerning this report. Major contributors to this report were
Charles F. Rey, Bruce H. Thomas, and Gregory K. Harmon.

Sincerely yours,




Allen Li
Associate Director
Defense Acquisitions Issues




Appendix I

Comments From the Department of Defense

Note: GAO comments supplementing those in the report text appear at the end of this appendix.

[In the printed report, the Department of Defense’s letter is reproduced as page images on pages 18 through 20. GAO margin annotations on those pages refer the reader to comments 1 and 2 below and note that material cited in the letter now appears on page 13 of this report.]
The following are GAO’s comments on the Department of Defense’s (DOD) letter dated October 2, 1997.

GAO Comments

1. In partially agreeing with this recommendation, DOD states that the Army will comply with DOD regulation 5000.2R and will follow guidance from the Director of Operational Test and Evaluation—guidance which lists multiple levels of operational test and evaluation (from an abbreviated assessment to full operational test) and outlines a risk assessment methodology to be used to determine the level of testing to be performed. DOD does not, however, indicate how it agrees or disagrees with our recommendation or state whether it will implement the recommendation. As we stated in the body of this report, given that a different contractor is building version 12.1 under a requirement to provide specific functionality, we believe that this development effort should not be viewed as building upon a proven baseline. Instead, version 12.1 development should be considered a new effort. As a result, we continue to believe that it is prudent to require that version 12.1 be subjected to the level of operational test and evaluation required to support a full-rate production decision.

               2. DOD’s direction to the Army only partially implements our
               recommendation. Our recommendation is not limited to the hardware for
               operational units, but also encompasses the computers the Army plans to
               buy for the training base. We continue to believe that the Army should not
               be allowed to buy the planned training base computers until MCS has
               successfully completed its initial operational test and evaluation—the
               original plan prior to the MCS initial operational test and evaluation’s
               schedule slips. The training base computers are not required to satisfy any
               of the three purposes the law indicates for low-rate initial production—to
               (1) establish an initial production base, (2) permit an orderly increase in
               the production rate for the system sufficient to lead to full-rate production
               upon successful completion of operational test and evaluation, and
               (3) provide production-representative items for operational test and
               evaluation. Since the training base computers are not needed to satisfy
               one of the above legislated conditions, we continue to believe that the
               Army should refrain from buying any additional MCS computers prior to a
               full-rate production decision.




Ordering Information

The first copy of each GAO report and testimony is free.
Additional copies are $2 each. Orders should be sent to the
following address, accompanied by a check or money order
made out to the Superintendent of Documents, when
necessary. VISA and MasterCard credit cards are accepted, also.
Orders for 100 or more copies to be mailed to a single address
are discounted 25 percent.

Orders by mail:

U.S. General Accounting Office
P.O. Box 37050
Washington, DC 20013

or visit:

Room 1100
700 4th St. NW (corner of 4th and G Sts. NW)
U.S. General Accounting Office
Washington, DC

Orders may also be placed by calling (202) 512-6000
or by using fax number (202) 512-6061, or TDD (202) 512-2537.

Each day, GAO issues a list of newly available reports and
testimony. To receive facsimile copies of the daily list or any
list from the past 30 days, please call (202) 512-6000 using a
touchtone phone. A recorded menu will provide information on
how to obtain these lists.

For information on how to access GAO reports on the INTERNET,
send an e-mail message with "info" in the body to:

info@www.gao.gov

or visit GAO’s World Wide Web Home Page at:

http://www.gao.gov



