Military Readiness: Improvements Still Needed in Assessing Military Readiness

Published by the General Accounting Office on March 11, 1997.


United States General Accounting Office

GAO

Testimony
Before the Subcommittee on Military Readiness,
Committee on National Security, House of Representatives

For Release on Delivery
Expected at 10:00 a.m., EDT
Tuesday, March 11, 1997

MILITARY READINESS

Improvements Still Needed in Assessing Military Readiness

Statement of Mark E. Gebicke, Director, Military Operations and
Capabilities Issues, National Security and International Affairs Division

GAO/T-NSIAD-97-107
                 Mr. Chairman and Members of the Subcommittee:

                 Today’s American military forces have earned the reputation of being
                 among the best, if not the best, trained forces in the world. That reputation
                 stands in stark contrast to the so-called “hollow forces” of the 1970s. Yet,
                 as we have proceeded through nearly a decade of military downsizing,
                 periodic concerns or questions have surfaced about the potential for a new
                 “hollowing” of our forces. Concerns voiced by military personnel to
                 congressional staff during field visits are quite different from official unit
                 readiness assessment reports forwarded through service headquarters to
                 the Joint Chiefs of Staff (JCS) and the Office of the Secretary of Defense
                 (OSD). This difference has resulted in questions in recent years about the
                 true measure of readiness of our military forces.

                 Today, I would like to provide a broad overview of the readiness
                 assessment process and frame my comments around three questions.

             •   What disconnects are associated with readiness reporting, and why do
                 they exist?
             •   What corrective actions have been proposed and taken to measure
                 readiness?
             •   What further actions are needed?


Background

                 Historically, readiness of U.S. military forces at the unit level has been
                 measured using the Status of Resources and Training System (SORTS),
                 under the sponsorship of the JCS. Under SORTS, units report their overall
                 readiness status as well as the status of four resource areas (personnel,
                 equipment and supplies on hand, equipment condition, and training). The
                 readiness status of a unit is reported by assigning capability, or “C,” ratings
                 as follows:

                 C-1—Unit can undertake the full wartime missions for which it is
                 organized or designed.

                 C-2—Unit can undertake the bulk of its wartime missions.

                 C-3—Unit can undertake major portions of its wartime missions.

                 C-4—Unit requires additional resources and/or training to undertake its
                 wartime missions, but if the situation dictates, it may be required to
                 undertake portions of the missions with resources on hand.



                           C-5—Unit is undergoing a service-directed resource change and is not
                           prepared to undertake its wartime missions.
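
To make the reporting mechanics concrete, the following minimal sketch (in Python) models a unit status report as this statement describes it. The type names, the default rule that the overall rating is the worst of the four resource areas, and the override field are illustrative assumptions, not an official SORTS specification.

# Minimal sketch of a SORTS-style unit status report. Field names and the
# "overall defaults to the worst resource area" rule are assumptions drawn
# from this statement, not an official SORTS specification.
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional

class CRating(IntEnum):
    C1 = 1  # can undertake the full wartime missions
    C2 = 2  # can undertake the bulk of its wartime missions
    C3 = 3  # can undertake major portions of its wartime missions
    C4 = 4  # requires additional resources and/or training
    C5 = 5  # undergoing a service-directed resource change

@dataclass
class UnitStatusReport:
    personnel: CRating
    equipment_on_hand: CRating
    equipment_condition: CRating
    training: CRating
    # Commanders may subjectively adjust the overall rating, as noted in
    # this statement; None means "derive it from the resource areas."
    commander_override: Optional[CRating] = None

    @property
    def overall(self) -> CRating:
        if self.commander_override is not None:
            return self.commander_override
        # Assumed default: the worst (highest-numbered) resource-area rating.
        return max(self.personnel, self.equipment_on_hand,
                   self.equipment_condition, self.training)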

                           While SORTS still provides the basic underpinning to readiness assessments,
                           both OSD and JCS have established senior oversight groups in recent years
                           to focus on readiness issues at a higher level and provide a more
                           comprehensive assessment of readiness.


Summary

                           Formal readiness reports provided by SORTS have sometimes indicated a
                           higher state of readiness than appears warranted based on other
                           information coming from military personnel in the field. The implications
                           are that the formal reporting system is overly optimistic in its readiness
                           assessments, and questions can be legitimately raised about its credibility.
                           As we and others have reported, there are many shortcomings in SORTS
                           that need to be addressed, including the

                       •   lack of emphasis on readiness on a long-term basis, contrasted with the
                           snapshot in time currently provided;
                       •   use of insufficient indicators to ensure a comprehensive assessment of
                           readiness; and
                       •   inability to measure integrated readiness of joint operating forces.

                           Our recommendations have been targeted toward helping DOD identify
                           indicators most relevant to developing a more comprehensive readiness
                           assessment and ensuring that comparable data are maintained by all
                           services to allow the development of trends on the selected indicators.


What Disconnects Are Associated With Readiness Reporting, and Why Do They Exist?

Several types of disconnects have historically existed between SORTS
formal readiness reports and other information obtained from military
personnel in the field, and those disconnects exist for various reasons. In
recent years, either in reports or testimony before the Congress, we
discussed the Department of Defense’s (DOD) system for measuring
readiness and reported on the need for improvements.1 We previously
reported instances where, during fieldwork on our assignments, SORTS data
appeared to paint a rosier picture of readiness than did various military
officials, who expressed concerns about readiness in their discussions
with us, or even in correspondence with higher headquarters. These
concerns were centered on high operating tempo (OPTEMPO), frequent
deployments of personnel away from home (known as PERSTEMPO),
personnel shortfalls and turnovers, and the shifting of funds from key
readiness accounts to meet other needs, each of which could degrade
readiness. Many of these concerns addressed current conditions and, more
importantly, the future if existing conditions persisted.

1 A list of relevant GAO reports and testimonies is included at the end of this statement.

We have continued to report on these issues in conjunction with more
recent work. Our April 1996 report on PERSTEMPO issues noted that DOD
could not precisely measure the increase in deployments because until
1994 only the Navy had systems to track PERSTEMPO. Clear and consistent
definitions and uniform data collection across the services were still
missing. Further, during our visits to high-deploying units,
military personnel at major commands expressed grave concerns about
the adverse effects on readiness resulting from high operating tempo and
frequent deployments away from home. However, SORTS C-ratings
examined in conjunction with these assignments have continued to show a
fairly stable level of overall unit readiness. Less than one-third of the
high-deploying units we reviewed dropped below planned readiness levels
due to deployments. During our most recent examination of this issue in
conjunction with Special Operations Forces, we found that a negative
impact on readiness due to increased OPTEMPO was not readily apparent in
the SORTS reports.

In 1995 we reported that participation in peace operations could enhance
or reduce a unit’s combat capability, depending on the type of unit, skills
used or not used, length of participation, and in-theater training
opportunities. We noted that ground combat forces, such as mechanized
infantry and armored units, and units that are heavily dependent on
equipment (such as artillery), face the greatest combat skill erosion when they deploy
for peace operations without their equipment. Also, while they are
deployed, they may do tasks that are significantly different from the
combat tasks for which they normally train.

Senior defense officials have stated that it is difficult to estimate the
amount of time required to restore a unit’s combat effectiveness for all its
missions after a unit participates in a peace operation; however, Army
commanders generally estimate a range of 3 to 6 months. Yet when
examining SORTS reports, we have seen little to indicate significant
reductions in C-ratings for units participating in peacekeeping operations.
While I cannot say conclusively that downgraded readiness should have
been reported, I will note that a special study entitled The Effects of Peace
Operations on Unit Readiness, published in February 1996 by the Army’s
Center for Army Lessons Learned, recommended that the Department of
the Army consider having units report “C-5” on their unit status reports for
a period of 4 months after return from peace operations.

Our 1996 report on chemical and biological defense pointed out that many
of the types of problems encountered during the Gulf War remain
uncorrected, and U.S. forces continue to experience serious
training-related weaknesses in their chemical and biological proficiency.
At the same time, we found that the effectiveness of SORTS for evaluating
units’ chemical and biological readiness was limited. This was the case
despite a DOD requirement imposed in 1993 for all the services to assess
their equipment and training status for operations in a contaminated
environment and to report this data as a distinct part of SORTS. DOD’s
requirement also allows commanders to subjectively upgrade their overall
SORTS status, regardless of their chemical and biological status. For
example, one early deploying active Army division was rated as C-1—the
highest SORTS category—despite rating itself C-4 for chemical and
biological equipment readiness.
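
Expressed in terms of the hypothetical report structure sketched earlier, that division’s reporting could look like the following; the ratings shown for the four resource areas are invented for illustration, since the statement gives only the overall and chemical/biological figures.

# Hypothetical illustration of the subjective upgrade described above.
# Resource-area ratings are invented; only the C-1 overall and C-4
# chemical/biological figures come from the statement.
division = UnitStatusReport(
    personnel=CRating.C1,
    equipment_on_hand=CRating.C1,
    equipment_condition=CRating.C1,
    training=CRating.C1,
)
chem_bio_equipment = CRating.C4        # reported as a distinct part of SORTS
assert division.overall == CRating.C1  # the C-4 never touches the overall rating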

I also want to touch on the effect that manning levels have on readiness,
which we reported on in 1995. This continues to be an important issue.
Existing SORTS data often reflects a high readiness level for manning
because in the aggregate, through substitution, units may numerically have
most of their assigned personnel. However, aggregating data can mask
underlying personnel problems that can be detrimental to readiness, such
as shortages by skill level and rank or grade. Compounding these
problems can be high levels of personnel turnover. When considered
collectively, these factors create situations where commanders may have
difficulty developing and maintaining unit cohesion and accomplishing
training objectives. Judging by our recent review of selected commanders’
comments submitted with their SORTS reports, and other available data, the
problems I have just noted are real, although not well reflected in the
overall C-ratings.
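
A small numerical illustration of how aggregation can hide these problems follows; the figures are invented, not drawn from SORTS data.

# Invented figures showing how an aggregate fill rate can mask a critical
# shortage: overfilling one skill through substitution hides another.
authorized = {"riflemen": 90, "mechanics": 20, "radio_operators": 10}
assigned   = {"riflemen": 95, "mechanics": 18, "radio_operators": 2}

aggregate_fill = sum(assigned.values()) / sum(authorized.values())
print(f"aggregate fill: {aggregate_fill:.0%}")   # 96% -- looks fully manned

for skill, auth in authorized.items():
    print(f"{skill:>15}: {assigned[skill] / auth:.0%}")   # radio_operators: 20%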

Several factors play a part in the apparent disconnects I have noted here
today. We have noted that formal readiness assessments in SORTS contain
both objective and subjective elements. Gunnery scores, for example, can
be more objectively measured than can the broad impact of personnel
shortfalls and turnovers. The C-rating for training is based on a
commander’s subjective assessment of how well a unit is trained, drawing
on his personal observation and various internal and external evaluations. A
commander may subjectively change his unit’s overall C-rating, based on
experience, to reflect a broader perspective of the unit’s ability to perform
its wartime missions. Thus, concerns about degradation in readiness in
one area may diminish in relation to the commander’s confidence about
the overall state of readiness.

                         It may be that a commander’s informal statements of concern over
                         readiness, apart from SORTS, are a signal of an impending change that may
                         eventually show up in SORTS reports. However, we have been told by a
                         variety of military leaders that some commanders may view the SORTS
                         reports they prepare as scorecards on their capabilities and performance
                         that could affect their prospects for promotion. Thus, they are
                         reluctant to report degraded readiness. We have also been told that the
                         reluctance to cite degraded readiness is indicative of a “can do” spirit of
                         optimism. Whatever the cause, the fact is that significant differences can
                         and do exist between official SORTS reports, other data, and professional
                         military judgments.


What Corrective Actions Have Been Proposed and Taken to Measure Readiness?

                         In 1994 we reported that C-ratings represent a snapshot in time; they are
                         not predictive and do not address long-term readiness or signal impending
                         changes in the status of resources. Neither do they assess joint readiness,
                         that is, the preparedness of unified commands and joint task forces to
                         effectively integrate individual service combat and support units into a
                         joint operating force.

                         We identified and reported on a number of indicators that (1) service
                         officials told us were either critical or important to a more comprehensive
                         assessment of readiness and (2) had some predictive value. These
                         indicators included projected personnel trends, crew manning, recruiting
                         shortfalls, personnel stability, PERSTEMPO, borrowed manpower, morale,
                         operating tempo, funding, accidents, and unit readiness and proficiency.
                         At that time, we recommended that the Under Secretary of Defense for
                         Personnel and Readiness be directed to

                     •   review the indicators we had identified as being critical to predicting
                         readiness and select the specific indicators most relevant to a more
                         comprehensive readiness assessment,
                     •   develop criteria to evaluate the selected indicators and prescribe how
                         often the indicators should be reported to supplement SORTS data, and
                     •   ensure that comparable data is maintained by all services to allow the
                         development of trends on the selected indicators.




In the 1994 time frame and later, OSD and JCS began a number of initiatives
that have heightened the emphasis on readiness within their respective
offices, including some initial emphasis on joint readiness. Additionally,
some of the services have initiated actions to strengthen their assessments
of readiness.

In the fall of 1993, OSD created a Senior Readiness Oversight Council
composed of high-level military and civilian officials and co-chaired by the
Deputy Secretary of Defense and the Vice Chairman of JCS. The Council
meets monthly to review the status of readiness based on briefings given
by each service chief of staff and an overall assessment by the Vice
Chairman. Also, results of the JCS joint reviews are briefed to the Council.
The Council also examines topical readiness issues, such as various
aspects of combat support, on both a short- and long-term basis. In
addition, the Council is responsible for providing quarterly readiness
reports to the
Congress.2 The most recent unclassified quarterly readiness report
submitted to the Congress for the period October to December 1996 stated
that “first to fight” forces were at a high level of readiness, while overall
unit readiness was stable at historic levels. At the same time, it noted that
careful management was required for some segments of the force that
were critical to current operations and to major regional contingencies.

OSD is in the early stages of developing a readiness
baseline, which when completed will contain additional indicators with
information on personnel, equipment, training, and joint readiness that is
not available from existing DOD databases. This baseline is now more
oriented to examining functional issues such as accessions, retention,
manning levels, and training on an aggregate basis than it is to developing
a more comprehensive readiness assessment system from a unit
perspective. Over time, as system development continues, the baseline is
expected to facilitate assessments of joint readiness, provide a basis for
resource allocation, and support DOD’s budgeting process. OSD’s efforts
over the past 3 years have focused on identifying indicators that would be
useful to this system. OSD officials told us they hope to have baseline data
available to assist in joint readiness assessments within 3 or 4 years but
that a comprehensive system with predictive capabilities will evolve over
several years.




2 Section 361 of the 1996 defense authorization act added a new section to chapter 22 of 10 U.S.C.,
section 452, requiring the Secretary of Defense to submit quarterly reports on military readiness to the
Congress.



To direct more attention to readiness, JCS established the Chairman’s
Readiness System, which became operational in December 1994. A major
component of this system is the Joint Monthly Readiness Review process,
which provides an assessment of readiness to execute the National
Military Strategy through current assessments of unit and joint readiness
at the tactical, operational, and strategic levels. The process requires each
commander in chief (CINC), service, and combat support agency to assess
and report on the current and projected readiness status of major combat
and critical strategic forces, given specific scenarios. A foundation for
these assessments is provided by SORTS, supplemented by other data
available to the CINCs. JCS staff told us that the joint reviews generally focus
on relatively near-term readiness issues (from the present out to 2 years), often
dealing with combat support in such functional areas as lift, intelligence,
logistics, and sustainment. These reviews help to identify readiness
deficiencies that can be prioritized for possible remedy or workarounds.

Also, JCS is attempting to develop the capability to combine multiple DOD
databases to assess readiness at tactical, operational, and strategic levels.
In doing so, JCS recognized that SORTS is oriented more toward
assessing readiness at the tactical or unit level. This capability could be
used to automate and expedite analyses now completed as part of the JCS
joint reviews. A JCS official told us that funding has just been approved to
implement this project. It is important to note that as JCS develops the
planned software programs, the system would still incorporate SORTS, with
its problems, as well as multiple other data systems to provide a broader
assessment of readiness issues at multiple levels.

At the service level, only the Army has taken significant actions on its own
to identify and collect data to provide a more comprehensive assessment
of readiness. The Army Readiness Management System (ARMS), which
began during the past year, is a 4-year effort to combine SORTS data with
Army installation status reports and the Training and Doctrine Command’s
training status report to develop a comprehensive assessment of unit,
operational, and training readiness.3 The Army’s focus in developing ARMS
has been on improving or enhancing information provided by SORTS
reports. As part of this effort, unit commanders are now required to report
data in a number of additional categories. While this supplemental data is
not used by reporting units to set C-ratings, the data is used by the Army to
do supplemental analyses and make some projections of future impacts on
readiness. For example, data now collected pertains to crew proficiency,
the percentage of specialty training completed, and PERSTEMPO. Moreover,
the software program used in ARMS allows Army officials at all levels to
quickly develop and portray current, historical, trend, and near-term
predictive readiness information.

3 An installation status report provides an assessment of mission support, strategic mobility, housing,
community, utility, and environmental infrastructure assets on an Army installation. The training
status report provides an assessment of the Training and Doctrine Command’s current and future
capability to provide skilled soldiers, training and equipment criteria, and sound doctrine for Army
units.
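
To illustrate the kind of near-term trend projection such software might portray, here is a minimal sketch that fits a straight line to a hypothetical monthly indicator history; the statement does not describe ARMS’s actual algorithms, so both the method and the numbers are assumptions.

# Minimal sketch of near-term trend projection for a readiness indicator:
# an ordinary least-squares line extrapolated a few months ahead. This is
# an assumed approach, not ARMS's documented method.
def project(history, months_ahead=3):
    n = len(history)
    t_mean = (n - 1) / 2
    y_mean = sum(history) / n
    slope = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(history)) \
            / sum((t - t_mean) ** 2 for t in range(n))
    intercept = y_mean - slope * t_mean
    return intercept + slope * (n - 1 + months_ahead)

# e.g., percentage of specialty training completed over six monthly reports
history = [88, 86, 85, 83, 82, 80]
print(f"projected in 3 months: {project(history):.1f}%")  # about 75.5, a continuing decline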

                       The Navy, Marine Corps, and Air Force continue to rely on SORTS to
                       provide readiness information and have not required commanders to
                       provide information on additional indicators since our 1994 report. The
                        Navy has started analyzing existing SORTS data at finer levels of detail to
                        enhance its usefulness. This effort includes analyses by fleet, type of ship,
                       type of aircraft, and deployed status. Within the next year, Navy officials
                       hope to include information outside of SORTS, for example, maintenance
                       data and equipment cannibalization data, as part of its analyses in an effort
                       to develop some short-term readiness forecast capability. The Marine
                       Corps has continued to collect and report only those indicators required
                       by JCS regulation as a part of SORTS, and officials told us they have no plans
                       to systematically obtain other readiness information.

                       Air Force officials told us they see no need to use readiness indicators
                       other than those provided by SORTS. Instead, they believe that SORTS
                       reporting needs to be improved, and they are exploring ways to make the
                       SORTS data more sensitive to readiness changes by narrowing the
                       percentage range for reporting at a certain C-rating. However, Air Force
                       officials told us that changes to SORTS reporting would not be proposed for
                       at least a year.


What Further Actions Are Needed?

                       We continue to have concerns that not enough attention is being devoted
                       to ensuring the accuracy and completeness of SORTS—the beginning point
                       for higher-level assessments at the operational and strategic levels.
                       Continuing shortcomings in SORTS, both in terms of its inherent limitations
                       and seeming disconnects, need to be addressed if DOD is to have a credible
                       foundation upon which a more comprehensive readiness reporting system
                       can be built. In addressing these deficiencies, DOD should develop
                       additional readiness indicators, and ensure that they are integrated into
                       assessments of readiness on a unit-level basis within each of the services.

We commend JCS and OSD efforts to develop broader capabilities for
measuring readiness. However, some of those efforts involve making use
of existing databases, apart from SORTS, to develop a more comprehensive
assessment of readiness. We believe that efforts will be required to ensure
the accuracy and completeness of those databases. For example, while the
potential for PERSTEMPO to adversely affect retention raises concerns, OSD’s
primary database dealing with reasons for separating from military service
has historically captured limited information on why separations occur.
An OSD official expressed hope that as data systems used in OSD’s baseline
project come into increased use, senior leaders will exert pressure to
enhance the quality of these data systems. We believe that actions to
identify database requirements, limitations, and needed improvements
should occur concurrently with the baseline development.


This concludes my prepared statement. I would be happy to respond to
any questions that you or Members of the Subcommittee may have.




Appendix I

Recent GAO Reports and Testimonies Dealing With Readiness

              Army Ranger Training: Safety Improvements Need to Be Institutionalized
              (GAO/NSIAD-97-29, Jan. 2, 1997).

              Military Readiness: Data and Trends for April 1995 to March 1996
              (GAO/NSIAD-96-194, Aug. 2, 1996).

              Operation and Maintenance Funding: Trends in Army and Air Force Use of
              Funds for Combat Forces and Infrastructure (GAO/NSIAD-96-141, June 4,
              1996).

              Chemical and Biological Defense: Emphasis Remains Insufficient to
              Resolve Continuing Problems (GAO/T-NSIAD-96-154, May 1, 1996).

              Civilian Downsizing: Unit Readiness Not Adversely Affected, but Future
              Reductions a Concern (GAO/NSIAD-96-143BR, Apr. 22, 1996).

              Military Readiness: A Clear Policy Is Needed to Guide Management of
              Frequently Deployed Units (GAO/NSIAD-96-105, Apr. 8, 1996).

              Chemical and Biological Defense: Emphasis Remains Insufficient to
              Resolve Continuing Problems (GAO/NSIAD-96-103, Mar. 29, 1996).

              DOD Reserve Components: Issues Pertaining to Readiness
              (GAO/T-NSIAD-96-130, Mar. 21, 1996).

              Military Readiness: Data and Trends for January 1990 to March 1995
              (GAO/NSIAD-96-111BR, Mar. 4, 1996).

              Peace Operations: Effect of Training, Equipment, and Other Factors on
              Unit Capability (GAO/NSIAD-96-14, Oct. 18, 1995).

              Military Personnel: High Aggregate Personnel Levels Maintained
              Throughout Drawdown (GAO/NSIAD-95-97, June 2, 1995).

              Military Readiness: Improved Assessment Measures Are Evolving
              (GAO/T-NSIAD-95-117, Mar. 16, 1995).

              Military Readiness: DOD Needs to Develop a More Comprehensive
              Measurement System (GAO/NSIAD-95-29, Oct. 27, 1994).

              Military Readiness: Current Indicators Need to Be Expanded for a More
              Comprehensive Assessment (GAO/T-NSIAD-94-160, Apr. 21, 1994).


