
Federal Research: Evaluation of Small Business Innovation Research Can Be Strengthened

Published by the United States General Accounting Office on June 4, 1999.


United States General Accounting Office

Report to the Committee on Science, House of Representatives

June 1999

FEDERAL RESEARCH: Evaluation of Small Business Innovation Research Can Be Strengthened

GAO/RCED-99-114


United States General Accounting Office
Washington, D.C. 20548
Resources, Community, and Economic Development Division

      B-281081

      June 4, 1999

      The Honorable James F. Sensenbrenner
      Chairman
      The Honorable George E. Brown, Jr.
      Ranking Minority Member
      Committee on Science
      House of Representatives

      This report responds to your request for information on the Small Business Innovation
      Research program. It discusses the distribution of awards, with special emphasis on the 25
      companies that have won the most awards. It also discusses commercial potential as a factor
      taken into consideration by federal agencies when evaluating companies’ proposals. The report
      includes a matter for congressional consideration that may help to clarify the relative emphasis
      that agencies, in evaluating proposals, should give to a company’s commercialization record as
      part of the goal of commercialization and to the program’s other goals. It also contains a
      recommendation to the Administrator of the Small Business Administration that may help to
      strengthen the evaluation of the program’s commercial outcomes in response to the
      Government Performance and Results Act.

      We are sending copies of this report to the Honorable Aida Alvarez, Administrator, Small
      Business Administration, and to the heads of the other federal agencies participating in the
      Small Business Innovation Research program. If you have any questions, I can be reached at
      (202) 512-3841. Major contributors to this report are listed in appendix XIII.




      Susan D. Kladiva
      Associate Director, Energy, Resources,
        and Science Issues
Executive Summary


Purpose

As a nation competing in a global economy, the United States depends heavily on innovation
through research and development. The Small Business Innovation Development Act of 1982, which
authorized the Small Business Innovation Research (SBIR) program, emphasizes the benefits of
technological innovation and the ability of small businesses to transform the results of
research into new products. In its 16 years, the program has provided over 45,000 awards worth
$8.4 billion in 1998 dollars to thousands of small high-technology companies. As the program has
matured in the 1990s, congressional concern has focused on the companies’ ability to
commercialize the results of their research and on the concentration of awards in certain states
and companies, commonly known as “frequent winners.” SBIR awards, like total federal research
and development expenditures, are heavily concentrated in certain states. Concern about frequent
winners has arisen, in part, because studies conducted by GAO and the Department of Defense
indicate that frequent winners achieve lower levels of commercialization than companies winning
fewer awards.

To facilitate the discussion of these issues, the Subcommittee on Technology, House Committee on
Science, asked GAO to review (1) the distribution of awards by company and geographic area, with
special emphasis on the share of awards received by the 25 most frequent winners; (2) the extent
to which federal agencies are considering commercial potential and the program’s other goals in
making their awards; and (3) previous evaluations of the SBIR program to identify opportunities
to improve measurements of the program’s outcomes.


Background

The act establishing the SBIR program identified four goals for the program: technological
innovation, commercialization, the use of small businesses to meet agencies’ research and
development needs, and participation by minorities and disadvantaged persons. Funding for the
program in fiscal year 1997 (the last year for which funding data are available) amounted to
$1.1 billion.

Federal agencies that have external research and development budgets of more than $100 million
are currently required to use at least 2.5 percent of this budget for the program. Ten federal
agencies participate in the program. The Department of Defense accounts for about 45 percent of
the awards, while the National Aeronautics and Space Administration, the Department of Health
and Human Services, the Department of Energy, and the National Science Foundation together
account for close to 48 percent.

Each agency makes awards and manages its own program, while the Small Business Administration
(SBA) plays a key administrative role that includes the issuance of policy directives and the
maintenance of a central database on awards.

In reauthorizing the program in 1992, the Congress stated its intention to expand and improve
the program, emphasize the program’s goal of increasing the private sector’s commercialization
of technology developed through federal research and development, increase small businesses’
participation in federal research and development, and improve the federal government’s
dissemination of information on the program. One new provision requires agencies, when
evaluating proposals at an intermediate stage (phase II), to consider each proposal’s commercial
potential, which includes a company’s commercialization record, commitments accompanying the
proposal for developmental funding from sources other than the SBIR program, and other factors.
The commercialization record, which indicates how successful the company has been in developing
commercial applications of SBIR or other research, generally includes the company’s sales,
additional developmental funding, and other results of its SBIR awards. Another provision
requires agencies to collect information on frequent winners. This greater emphasis on the
results of the program mirrors the intention of the Government Performance and Results Act of
1993, which requires federal agencies to report on the outcomes of federal programs. The program
is scheduled to terminate on October 1, 2000.


Results in Brief

From fiscal year 1983 through fiscal year 1997, the 25 most frequent winners received over
$900 million, or about 11 percent of the program’s awards. These companies represent fewer than
1 percent of all the companies that have received awards. The program has a high number of
first-time participants. One-third of the companies receiving awards from fiscal year 1993
through fiscal year 1997 were first-time winners, indicating that the program is attracting
hundreds of new companies annually. SBIR awards are concentrated in certain states. From fiscal
year 1993 through fiscal year 1996, companies in one-third of the states received 85 percent of
the program’s awards, largely because companies in these states submitted the most proposals.
Companies from California and Massachusetts won the highest number of awards. To broaden the
geographic distribution of awards, agencies have made efforts to encourage the submission of
proposals from companies in states with fewer awards. For example, the National Science
Foundation has used a program that supports research in states receiving relatively little
federal research funding to increase participation in the SBIR program.

In response to the 1992 reauthorization, agencies are considering commercial potential as an
explicit criterion when evaluating proposals. At the same time, the emphasis on commercial
potential has raised questions for the agencies. The reauthorization does not clarify how a
company’s commercialization record, as part of the goal of commercialization, and the program’s
other goals should be used in evaluations of proposals. This lack of clarity has led to
differences across agencies in how they evaluate proposals. For example, using an approach
shared by none of the other agencies, the Department of Defense planned to give significantly
lower scores to companies perceived as poor commercializers. Early tests of Defense’s plan
indicated that some of the most frequent winners that have been relatively unsuccessful in
commercializing their research results would not have been penalized if only a few of their
awards had resulted in sales. At the same time, companies with far fewer awards and no previous
sales might have been subject to penalties. Although the Department of Defense has revised its
plan to avoid this problem, the lack of clarity in the legislation remains a concern. This report
raises as a matter for congressional consideration how much emphasis the commercialization
record, as part of the goal of commercialization, should receive relative to the program’s other
goals in evaluations of proposals.

Federal agencies and others have used various methods to evaluate the program’s commercial
outcomes. These methods have used “snapshots” of sales, data on additional developmental funding
for the projects, “success stories,” and other indications of commercial success. However, they
become quickly outdated and do not provide an ongoing, consistent, and programwide record. The
use of a single method with uniform criteria for success, focusing on commercial outcomes and
other indicators of success, would help to satisfy the requirements of the Results Act.1 The
Small Business Administration is currently developing a new database called Tech-Net, which is
scheduled for implementation in 1999. Tech-Net affords an opportunity to maintain current,
consistent information about commercial and other outcomes and to respond to the Results Act.
This report recommends that it be used for this purpose.

1. In December 1997, the Congress specified that information relating to the SBIR program must
be included by each federal agency in updates or revisions to its strategic plan. 15 U.S.C.
638(t).







Principal Findings

Awards Go to Both Frequent Winners and New Applicants, and Agencies Are Trying to Broaden the
Geographic Distribution of Awards

Both frequent and first-time winners receive significant funds under the program. From fiscal
year 1983 through fiscal year 1997, the 25 most frequent winners accounted for a total of 4,629
awards and received over $900 million of the $8.4 billion awarded, with $108 million going to
one company alone. These awards contributed substantially to the companies’ annual revenue for
fiscal year 1998, averaging about 43 percent of total annual revenue and ranging from a low of
6 percent to a high of 80 percent. First-time winners also received a significant portion of the
awards. From fiscal year 1993 through fiscal year 1997, one-third of the companies receiving an
award (over 750 companies each year, on average) were first-time winners.

The concentration of awards in certain states tends to reflect the concentration of federal
research resources in general. A 1998 SBA study reported that the number of small
high-technology firms in a state, its research resources, and the availability of venture
capital are important factors in explaining the distribution of SBIR awards.2 Agencies’ efforts
to broaden the geographic distribution of awards have included outreach conferences in states
with fewer awards and a program at the National Science Foundation to support research in 18
states that have received relatively little federal research funding. Since 1994, this program
has awarded 82 SBIR grants valued at over $7 million to numerous small businesses in these
states.

2. An Analysis of the Distribution of SBIR Awards by States, 1983-1996, SBA, Office of Advocacy
(Jan. 1998).


The Emphasis on Commercialization May Have Unintended Consequences and Raises Questions About
the Program’s Other Goals

One of the purposes of the 1992 reauthorization was to emphasize the goal of commercialization.
As required by the act, agencies are weighing the commercial potential of all proposals and are
collecting data on commercialization by frequent winners. The emphasis on commercialization has
created problems for some agencies in evaluating proposals. Measuring the commercial success of
companies is difficult for several reasons. First, the role of the commercialization record in
judging a company’s current commercial potential remains unclear. In general, program managers
reported that the commercialization record has played a limited role so far because it is only
one of several factors considered as a part of commercial potential; however, the Department of
Defense planned to make the record increasingly important in its evaluations of proposals.
Second, despite the greater emphasis on the goal of commercialization, according to some of the
program managers, the other goals remain important. In their view, limited commercialization by
itself may not signal failure if a company has achieved other goals.

Because the 1992 act and a 1993 SBA policy directive on implementing the act do not indicate how
the commercialization record should be used, differences among the agencies have emerged. The
Department of Defense, for example, developed an approach shared by none of the other agencies.
It collected information on commercialization by companies, which it planned to use in
evaluating proposals from companies that had won multiple SBIR awards. However, early tests of
the plan showed that companies with relatively few awards and no sales might receive
comparatively low scores, whereas the most frequent winners with only modest sales might not be
penalized at all. The Department has revised its plan to avoid this problem by taking into
account the concept of statistical significance as it relates to companies with widely varying
numbers of awards.
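
The report does not give the scoring formula DOD tested, but the sampling problem it describes
can be illustrated with hypothetical numbers. The short sketch below, written in Python, uses an
assumed program-wide commercialization rate and two invented company profiles (none of these
figures come from the report or from DOD’s plan). It shows why zero sales from a handful of
awards says little about a company, while only a few sales from a very large number of awards is
strong evidence of below-average commercialization, which is the distinction a test keyed to the
number of awards can capture.

    from math import comb

    def prob_at_most(k, n, p):
        # P(X <= k) for X ~ Binomial(n, p): the chance that a company with n awards
        # would report k or fewer commercialized projects if it performed at the
        # assumed program-wide rate p.
        return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k + 1))

    # Hypothetical figures only; not taken from the GAO report or from DOD's plan.
    PROGRAM_RATE = 0.30  # assumed share of phase II awards that eventually lead to sales

    # A newer participant: 3 phase II awards, no sales yet.
    print(prob_at_most(0, 3, PROGRAM_RATE))    # about 0.34 -- zero sales is quite likely by chance

    # A frequent winner: 100 phase II awards, only 2 of them with sales.
    print(prob_at_most(2, 100, PROGRAM_RATE))  # about 3e-13 -- far below what the assumed rate predicts

A rule that simply flags companies reporting no sales would reach the opposite conclusions in
these two cases, which is the unintended consequence described above; weighing the number of
awards behind each commercialization record, in the general spirit of DOD’s revised plan, avoids
it.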


A Method Exists to Improve the Measurement of Program Results

In the 1990s, studies by GAO, individual agencies, and others have focused on the commercial
outcomes of awards, but the program itself has lacked the ability to measure its accomplishments
as the Results Act directs. Two of the major methods used so far to survey the outcomes of SBIR
research include GAO’s approach3 and case studies of companies that have been successful in
commercializing the results of their research. In response to a congressional mandate, GAO
surveyed companies that had won awards from fiscal year 1984 through fiscal year 1987 and
received information on the outcomes of 1,457 projects. One of the key questions GAO asked was,
“Has the technology associated with this project led to additional developmental funding and/or
sales, and is further work on this technology under way?” This question has been used in
subsequent surveys by other agencies, including the Department of Defense in 1996 and SBA in
1998. In addition, several agencies have presented success stories stemming from their SBIR
awards. Although the methods have differed, many of the key criteria for success focus on common
concerns about the level of sales, developmental funding, and job creation.

3. Federal Research: Small Business Innovation Research Shows Success but Can Be Strengthened
(GAO/RCED-92-37, Mar. 30, 1992).

An opportunity exists to improve the measurement of outcomes and respond to the Results Act.
This opportunity involves the use of uniform, outcome-related criteria and the expansion of
SBA’s new Tech-Net database to provide information about these outcomes. SBA’s previous central
database was developed long before the passage of the Results Act and focused on inputs, such as
company names and funding, while including virtually no information on results. Tech-Net,
however, is an Internet database that will enable agencies to update information on their SBIR
awards and companies to update information on their activities. Thus, a key feature of the
system will be its ability to show changes in the program over time. Standard criteria for
measuring commercial success and other outcomes, such as savings to agencies resulting from SBIR
projects, could be identified and included in the new database. This approach, if implemented,
would make available, for the first time, a central database that can be used to produce
effective, consistent, ongoing evaluations of the program’s commercial and other outcomes.
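
The report does not specify Tech-Net’s data structure, and the field names below are
hypothetical. As a minimal sketch of what a uniform outcome record reflecting the criteria
discussed above might contain, each award could carry the measures the earlier evaluations have
in common: sales, additional developmental funding, whether further work is under way, and other
indicators such as jobs created or savings to the awarding agency, along with a timestamp so
changes can be tracked over time.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SbirOutcomeRecord:
        # Hypothetical per-award outcome record; illustrative only, not Tech-Net's actual schema.
        award_id: str
        company: str
        agency: str
        phase: int                                   # 1, 2, or 3
        sales_to_date: float = 0.0                   # cumulative sales attributed to the project
        additional_dev_funding: float = 0.0          # non-SBIR developmental funding attracted
        further_work_under_way: bool = False         # is work on the technology continuing?
        jobs_created: Optional[int] = None           # other indicators of success
        agency_cost_savings: Optional[float] = None  # e.g., savings to the awarding agency
        last_updated: str = ""                       # agencies and companies update this over time

Uniform fields of this kind would let later evaluations query the same measures across agencies
rather than rebuilding a one-time snapshot.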


Matter for Congressional Consideration

When the Congress considers the reauthorization of this program, it may wish to clarify the
relative emphasis that agencies, in evaluating companies’ proposals, should give to a company’s
commercialization record as part of the goal of commercialization and to the program’s other
goals. This clarification would help ensure uniformity in the program and a clear set of
standards for determining whether, and to what extent, commercialization and the program’s other
goals should be considered in evaluations of proposals.


Recommendation to the Administrator, Small Business Administration

To respond to the Government Performance and Results Act, GAO recommends that the Administrator
develop standard criteria for measuring the commercial and other outcomes of the SBIR program
and incorporate these criteria into the new Tech-Net database. The criteria should include
uniform measures of sales, developmental funding, and other indicators of success.


Agency Comments and GAO’s Evaluation

GAO provided a draft of this report to the Small Business Administration and the 10 program
agencies and obtained their comments. GAO has discussed the specific issues raised by the
agencies at relevant places in the report and incorporated the additional information and
technical corrections from the agencies where appropriate.

The Small Business Administration generally agreed with the report. The Department of Defense,
the Department of Education, the Department of Agriculture, the Department of Transportation,
the Department of Commerce, the National Science Foundation, and the Environmental Protection
Agency generally agreed with the report while offering additional observations or suggesting
specific technical corrections. The Department of Defense agreed with GAO’s concern about the
unintended consequences of its plan to use a company’s commercialization record in evaluating
proposals. It has revised its plan to avoid these unintended consequences. GAO has updated the
report to reflect this revision of the Department’s plan.

Several agencies expressed concerns regarding specific issues and
suggested technical corrections. The Department of Energy disagreed with
GAO’s description of the Department’s use of information on companies’
commercialization records in evaluating SBIR proposals. GAO has deleted
the specific points identified by the Department. Only one agency, the
National Institutes of Health, commented on the matter for congressional
consideration. The Institutes expressed concern about the matter’s focus
on uniformity, noting that it misses the fact that different relative
emphases may be appropriate to agencies’ different missions. The
Institutes also questioned what they considered to be the draft report’s
close association between success and commercialization. In general, GAO
does not believe that an effort to clarify the relative emphasis that should
be given to commercialization and to the program’s other goals will lead to
insensitivity to agencies’ different missions. Moreover, the report did not
equate success and commercialization.

The Small Business Administration concurred with the recommendation
but said that, for it to be effective, federal agencies must agree to provide
the information, and the Congress must require the agencies to provide the
information through the Tech-Net database. GAO notes that the agencies
are already required to report information on awards to the Small Business
Administration and believes that they could include the additional
information on outcomes in responding to this reporting requirement. The
Environmental Protection Agency, the National Aeronautics and Space
Administration, and the National Institutes of Health expressed concern
about issues related to the entry, maintenance, safeguards, reliability, and
commercial emphasis of this information. In general, while GAO recognized
that the implementation of this recommendation would raise these types
of issues, it continues to believe that the recommendation’s
implementation will provide a useful opportunity to capture the results of
the program and that these issues can be addressed effectively by
coordination among the agencies. In response to concerns expressed by the National Institutes of
Health that commercial potential cannot be
based solely on potential dollars in sales, GAO added a reference to other
indicators of success in its recommendation, reflecting its recognition of
the need for flexibility. GAO further discusses the agencies’ comments on
the matter for congressional consideration and the recommendation at the
end of chapters 3 and 4, respectively. The agencies’ comments appear in
appendixes II through XII, together with GAO’s responses.




Contents



Executive Summary (page 2)

Chapter 1: Introduction (page 14)
    The Administration of the Program (page 14)
    “Frequent Winners” and the Geographic Distribution of Awards Have Become Important Issues (page 16)
    The Government Performance and Results Act Has Increased the Emphasis on Evaluating the Results of Federal R&D (page 17)
    Objectives, Scope, and Methodology (page 18)

Chapter 2: Frequent and Infrequent Winners Are Major SBIR Players, but Frequent Winners and
Certain States Have Won a Large Share of the Program’s Resources (page 21)
    An Overview of Awards Made by the SBIR Program (page 21)
    Frequent Winners’ Share of the Program’s Resources (page 23)
    Infrequent Winners’ Share of the Program’s Resources (page 25)
    SBIR Awards and the Program’s Resources Are Concentrated in Several States (page 26)
    Federal Agencies’ Efforts to Expand the Geographic Distribution of Awards Include Special Funding and Outreach (page 29)

Chapter 3: Agencies Are Considering Commercial Potential in Making Awards, but the Emphasis on
Commercialization Raises Questions (page 34)
    Agencies Are Considering Commercial Potential to Varying Degrees in Making Awards (page 35)
    Penalties for Poor Commercialization Records May Have Unintended Consequences (page 40)
    The Emphasis on Commercialization Raises Questions About the Role of Other Goals in Evaluating Companies’ Performance (page 44)
    Conclusions (page 46)
    Matter for Congressional Consideration (page 47)
    Agency Comments and Our Evaluation (page 47)

Chapter 4: SBA Has an Opportunity to Standardize Evaluations of the Program’s Outcomes (page 49)
    Various Methods With Similar Criteria for Success Have Been Used in Attempting to Measure Outcomes (page 49)
    A Standard Approach Involves the Use of Uniform Criteria for Success and Improvements in SBA’s New Database (page 55)
    Conclusions (page 57)
    Recommendation to the Administrator, SBA (page 57)
    Agency Comments and Our Evaluation (page 57)

Appendixes
    Appendix I: SBIR Phase I Award/Proposal Ratios in Fiscal Year 1998, by Agency (page 60)
    Appendix II: Comments From the Small Business Administration (page 61)
    Appendix III: Comments From the Department of Defense (page 63)
    Appendix IV: Comments From the Department of Commerce (page 66)
    Appendix V: Comments From the Department of Education (page 67)
    Appendix VI: Comments From the Department of Transportation (page 68)
    Appendix VII: Comments From the Department of Agriculture (page 69)
    Appendix VIII: Comments From the National Aeronautics and Space Administration (page 72)
    Appendix IX: Comments From the Environmental Protection Agency (page 74)
    Appendix X: Comments From the Department of Energy (page 77)
    Appendix XI: Comments From the National Institutes of Health (page 81)
    Appendix XII: Comments From the National Science Foundation (page 89)
    Appendix XIII: Major Contributors to This Report (page 90)

Table
    Table 2.1: An Overview of the Top 25 Frequent Winners, Fiscal Years 1983-97 (page 24)

Figures
    Figure 2.1: Percentage of Phase II Awards Won by Various Program Participants, Fiscal Years 1984-97 (page 22)
    Figure 2.2: Percentage of Companies Winning First-Time Phase I Awards, Fiscal Years 1993-97 (page 26)
    Figure 2.3: Distribution of SBIR Phase II Awards by State, Fiscal Year 1997 (page 28)








Abbreviations

DOD        Department of Defense
EPSCoR     Experimental Program to Stimulate Competitive Research
R&D        research and development
SBA        Small Business Administration
SBIR       Small Business Innovation Research Program


Chapter 1

Introduction


                            As a nation competing in a global economy, the United States depends
                            heavily on innovation through research and development (R&D). The Small
                            Business Innovation Development Act of 1982, which authorized the Small
                            Business Innovation Research (SBIR) program, emphasizes the benefits of
                            technological innovation and the ability of small businesses to transform
                            the results of R&D into new products. The act designated four major goals
                            for the program:

                        •   stimulating technological innovation,
                        •   using small businesses to meet federal R&D needs,
                        •   fostering and encouraging participation by minorities and disadvantaged
                            persons in technological innovation, and
                        •   increasing the private sector’s commercialization of innovations derived
                            from federal R&D.

                            The Small Business Research and Development Enhancement Act of 1992
                            stated the congressional intention to

                        •   expand and improve the program,
                        •   emphasize the program’s goal of increasing the private sector’s
                            commercialization of technology developed through federal R&D,
                        •   increase small businesses’ participation in federal R&D, and
                        •   improve the federal government’s dissemination of information about the
                            program.


The Administration of the Program

In addition to establishing goals, the original legislation determined federal agencies’
participation in and funding for the program. By 1986, agencies spending more than $100 million
annually for external R&D were required to set aside not less than 1.25 percent of their total
external R&D funds for the program. The 1992 reauthorization directed agencies to increase the
set-aside to not less than 1.5 percent in fiscal years 1993 and 1994, not less than 2 percent in
fiscal years 1995 and 1996, and not less than 2.5 percent in fiscal year 1997 and thereafter.
This requirement has increased the annual funding to about $1 billion. At present, 10 agencies
participate in the program. The five agencies with larger SBIR programs, accounting for over
90 percent of all awards, include the Department of Defense (DOD); the Department of Energy; the
Department of Health and Human Services and its National Institutes of Health, in particular;
the National Aeronautics and Space Administration; and the National Science Foundation. The five
agencies with smaller SBIR programs include the Department of Commerce, the Department of
Education, the Department of Transportation, the Department of Agriculture, and the
Environmental Protection Agency.

    Agencies are required to issue a solicitation for proposals that sets the
    process in motion. The solicitation, a formal document issued by each
    agency, lists and describes the topics to be addressed by each company’s
    proposals and invites companies to submit proposals for consideration.
    Each agency with a program is responsible for targeting research areas
    and administering its own funding agreements.

    The Small Business Administration (SBA) is responsible for issuing policy
    directives for the general conduct of the SBIR programs within the federal
    government. The directives were to provide for simplified, standardized,
    and timely solicitations and a simplified, standardized funding process. In
    addition, they were to minimize the regulatory burden for small businesses
    participating in the program. Issued in January 1993, the current policy
    directive incorporated changes made by the 1992 legislation. Federal
    agencies were also required to report key data to SBA, which in turn has
    published annual reports on the program.

    SBA’s policy directive states that, to be eligible for an award, a small
    business must be

•   independently owned and operated,
•   other than the dominant firm in the field in which it is proposing to carry
    out an SBIR project,
•   organized and operated for profit,
•   an employer of 500 or fewer employees (including employees of
    subsidiaries and affiliates),
•   the primary source of employment for the project’s principal investigator
    at the time of the award and during the period when the research is
    conducted, and
•   at least 51-percent owned by U.S. citizens or lawfully admitted permanent
    resident aliens.

The original law established a three-phase structure for the program. The first phase, not to
exceed 6 months, was designed to determine the scientific and technical merit and the
feasibility of a proposed idea. The second phase, not to exceed 2 years, was designed to further
develop the idea. The SBA policy directive established $50,000 and $500,000 as the general
limits for phase I and II awards, respectively. The 1992 reauthorization directed SBA to raise
these figures to $100,000 and $750,000, respectively, with an adjustment every 5 years to
reflect economic and programmatic considerations. When selecting phase I proposals for awards,
an agency is now required under the reauthorization and SBA’s directive to consider the
scientific and technical merit and feasibility of ideas that appear to have commercial
potential. The funding for phase II shall be based on the results of phase I and the scientific
and technical merit and feasibility of the proposal, including, among other things, a
consideration of its commercial potential. The third phase is somewhat more flexible and
difficult to define. In general, it is expected to result in commercialization or further
research and development. Unlike phases I and II, phase III has no general limits in time or
dollar amounts. In addition, a phase III project must obtain funds from non-SBIR sources in the
federal government or in the private sector.


“Frequent Winners” and the Geographic Distribution of Awards Have Become Important Issues

In the SBIR program, the same companies have often received multiple awards, creating concerns
about the concentration of awards. According to one expert, the program was established, in
part, to enable small businesses to compete with large companies for a portion of federal R&D
funding, but the program has generated its own internal “corporate giants” against which even
smaller businesses must now compete. While these “frequent winners” have received a significant
share of the program’s resources, they have generally demonstrated less commercial activity in
phase III than companies with fewer awards.

We discussed this concern about frequent winners in our 1992 report on phase III
commercialization.1 At that time, as we pointed out, the five most frequent winners had received
a total of almost $100 million from fiscal year 1983 through fiscal year 1990. Collectively,
these five small businesses had received over 700 phase I and II awards from the program. For
the purpose of further analysis in our report, we defined a frequent winner as a company that
had won five or more phase II awards. We compared these companies with infrequent winners and
found that frequent winners were achieving lower levels of phase III sales and less additional
developmental funding from non-SBIR sources. The frequent winners were achieving about $117,000
less in sales and about $86,000 less in additional developmental funding per phase II award. In
1998, we also reported that frequent winners achieved lower levels of sales and less additional
developmental funding.2

1. Federal Research: Small Business Innovation Research Shows Success but Can Be Strengthened
(GAO/RCED-92-37, Mar. 30, 1992).
2. Federal Research: Observations on the Small Business Innovation Research Program
(GAO/RCED-98-132, Apr. 17, 1998).

The legislation that reauthorized the program in 1992 required agencies to begin collecting data
on the commercialization activity of companies that were submitting phase I proposals and had
won 15 or more phase II awards in a 5-year period. Analyses of these data have shown that
companies with numerous awards continue to commercialize at somewhat lower levels than other
companies. For example, in a 1996 survey following up on our 1992 survey, a DOD contractor found
that DOD recipients with nine or more phase II awards achieved less than half the sales per
project of phase II award recipients in general.

The geographic distribution of awards has become a more prominent issue since both the funding
for the program and the number of awards per year have increased under the 1992 reauthorization.
For example, a recent SBA study reported that one-third of the states received 85 percent of all
SBIR awards and funds from fiscal year 1983 through fiscal year 1996 but also found that the
distribution of SBIR awards tends to mirror the distribution of R&D funds in general.

The Government Performance and Results Act Has Increased the Emphasis on Evaluating the Results
of Federal R&D

In setting forth its findings and reasons for enacting the Results Act, the Congress stated that
congressional policy-making, spending decisions, and oversight are seriously handicapped by
insufficient attention to programs’ performance and results. One purpose of the act was to
improve federal programs’ effectiveness and public accountability by promoting a new focus on
results. Echoing this concern about focusing on programs’ results, Representative George Brown
stated in September 1997 that information was not available to answer the most basic question
about the effectiveness of the SBIR program. In a specific reference to the Results Act, he also
recommended that agencies develop performance measures for their SBIR programs, collect
information on the performance of grantees, and analyze the data in light of the program’s
goals. In December 1997, the Congress specified that information on the SBIR program must be
included in the updates or revisions of agencies’ strategic plans that are required under the
Results Act.




                     This emphasis on results raises a question about the availability and
                     reliability of key data to answer questions about the extent to which
                     awardees have achieved commercialization and the program’s other goals.
                     In measuring results, GAO, individual agencies, and others have developed
                     a variety of evaluation approaches and criteria for the program’s success.
                     In each instance, the efforts have required the construction of new
                     databases that permit a “snapshot” of the program and have become
                     outdated in a relatively short time. Because of the growing attention being
                     given to results, we tried to identify a more convenient and effective way
                     of obtaining data and evaluating the program.


Objectives, Scope, and Methodology

As agreed with the Committee, we focused our review on three objectives. First, we provided a
statistical overview of the distribution of awards by company and geographic area; we also
identified outreach efforts by federal agencies and other organizations to broaden this
distribution of awards. Second, we determined whether federal agencies are considering
proposals’ commercial potential in making their awards and what, if any, actions they have taken
in response to concerns about the level of commercialization by frequent winners. Third, we
reviewed previous evaluations of the SBIR program to identify opportunities to improve
measurements of the program’s outcomes.

To respond to the first objective, we obtained data on awards from the start of the program in
1983 through 1997, the most recent year for which data were available. Our main source of data
was SBA, which maintains the most complete database on the program. Because of our concerns
about the reliability of the information, we worked closely with SBA officials to review and
correct the data. The main source of errors was the lack of a unique identifier for individual
companies: slight variations in the way a company’s name was entered caused the same company to
appear in the data as several separate companies. We reviewed the records for all companies to
eliminate these variations and arrive at a more accurate list of participants.
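
The report does not describe the matching rules used to reconcile company names, so the
following Python sketch only illustrates the kind of normalization such a cleanup involves; the
suffix list and the sample spellings are illustrative assumptions.

    import re
    from collections import defaultdict

    # Common corporate suffixes to ignore when comparing names (illustrative list).
    SUFFIXES = {"inc", "incorporated", "corp", "corporation", "co", "llc", "ltd"}

    def normalize(name):
        # Reduce a company name to a comparison key so trivial entry differences
        # (case, punctuation, "Inc." vs. "Incorporated") collapse to one spelling.
        words = re.sub(r"[^a-z0-9 ]", " ", name.lower()).split()
        return " ".join(w for w in words if w not in SUFFIXES)

    def group_awards(records):
        # records: iterable of (company_name, award_id) pairs; returns awards grouped
        # under a single normalized company key.
        groups = defaultdict(list)
        for name, award in records:
            groups[normalize(name)].append(award)
        return groups

    sample = [("Foster-Miller, Inc.", "A1"), ("Foster Miller Inc", "A2"), ("FOSTER MILLER", "A3")]
    print(group_awards(sample))  # all three entries fall under the key 'foster miller'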

                     Once the data were corrected, we prepared a reliable database that
                     showed the number of awards to each company since the start of the
                     program. We used these data to develop a statistical profile for three
                     groups of companies—the 25 companies with the most phase II awards,
                     the infrequent winners with one to four phase II awards, and the
                     companies with an intermediate number of awards. We chose the top 25
companies at the request of the Committee and used the number of phase II awards as the
criterion for identifying them because the dollar value of
these awards substantially outweighs the dollar value of phase I awards.
However, we included information on the number of phase I awards to
provide a complete picture of the number of awards and the total funding
received by the most frequent winners.

In analyzing the geographic distribution of awards, we obtained data from
SBA and a consultant who has studied this issue for several years. We used
these data to determine the distribution of awards among individual states.
We interviewed program officials and the consultant to gain insight into
agencies’ outreach efforts.

To respond to the second objective, we briefly reviewed the program’s
major goals in relation to the growing focus on commercialization. We
reviewed the 1992 reauthorization act and SBA’s directives. Among other
things, the legislation required agencies to consider the commercial
potential of each proposal when making phase II awards and, when
reviewing proposals for phase I awards submitted by companies that had
received 15 or more phase II awards in the last 5 years, to collect data
demonstrating how much previous phase III funding the companies had
received. We analyzed agencies’ efforts as they related to proposals from
all companies; we then analyzed their efforts as they related to proposals
from companies with larger numbers of awards. We interviewed officials
at all of the SBIR agencies to learn about their implementation of the
legislation. We gave particular attention to DOD’s efforts because DOD
makes about half of all awards under the program and planned to
implement a significant new policy in May 1999.

In our discussions of the first two objectives, the definition of “frequent
winners” may vary with the context. We use the term in three different
ways in our report. First, it may refer to the top 25 frequent winners, that
is, the group of companies that have won the most phase II awards since
the start of the program. This is the group that the Committee asked us to
analyze. Second, it may refer to the group of companies as specified in the
legislation that reauthorized the program in 1992. This group consists of
companies that have won 15 or more phase II awards during the last 5
years of the program. Third, it may refer to companies that have won 5 or
more phase II awards since the start of the program. We defined these
companies as frequent winners in our 1992 report, and DOD is using the
same criterion in its plans for evaluating the commercial potential of
frequent winners. To avoid confusion, we have noted the context for our
use of this term wherever necessary.







To respond to the third objective, we reviewed the main evaluations of the
program’s results (primarily commercialization) performed by federal
agencies and others. We examined their methods and criteria for success
to identify common themes and criteria that could be applied to future
evaluations. We interviewed SBA officials about their existing SBIR database
and development of a new database, called Tech-Net, scheduled to replace
the existing database in 1999. We explored the opportunity to enhance the
new database by including “data fields” on the commercial and other
outcomes of the program. We attended an SBA-sponsored meeting of
database managers in December 1998 to discuss this and other ideas
relating to Tech-Net. We presented our proposal at this meeting and
followed up with a second presentation on the same issue at a meeting of
program managers in January 1999.

Our work was performed in accordance with generally accepted government auditing standards from
July 1998 through March 1999 and focused on federal agencies in the Washington, D.C., area. We
requested and received comments on our draft of this report from SBA and the 10 program
agencies.




Chapter 2

Frequent and Infrequent Winners Are Major
SBIR Players, but Frequent Winners and
Certain States Have Won a Large Share of
the Program’s Resources
                     The 25 companies with the most phase II awards, which represent fewer
                     than 1 percent of the companies participating in the program, have won
                     about 11 percent of these awards over the life of the program. They have
                     accumulated over $900 million in total phase I and II awards; the leading
                     frequent winner has received $108 million. However, thousands of
                     companies have received between one and four phase II awards since the
                     start of the program; in addition, first-time winners accounted for about
                     one-third of the participants from fiscal year 1993 through fiscal year 1997.
                     This percentage of first-time participants amounts to about 750 companies
                     annually. Concern about the concentration of awards has also focused on
                     their geographic distribution. Companies in a small number of states,
                     especially California and Massachusetts, have submitted the most
                     proposals and won the majority of awards, although the distribution of
                     awards generally follows the pattern of distribution of non-SBIR
                     expenditures for R&D, venture capital investments, and academic research
                     funds. In response to congressional concerns about this concentration,
                     agencies have undertaken efforts to broaden the geographic distribution of
                     awards. The National Science Foundation’s use of a special program to
                     support research in states that have historically received lesser amounts of
                     federal R&D funding has increased the number of SBIR awards to these
                     states. Other agencies also have such programs but have not used them to
                     assist their SBIR participants. Several agencies are considering such an
                     initiative to increase their outreach efforts in the SBIR program.


An Overview of Awards Made by the SBIR Program

We divided participants into three distinct groups of phase II award winners in order to examine
the distribution of awards over the life of the program. These groups are (1) the 25 companies
with the most awards, (2) companies with between 1 and 4 awards, and (3) a middle group with
between 5 and 27 awards. Figure 2.1 highlights the distribution of phase II awards to these
three categories of companies from fiscal year 1984, when the first phase II awards were made,
through fiscal year 1997, the latest year for which complete data are available.








Figure 2.1: Percentage of Phase II Awards Won by Various Program Participants, Fiscal Years 1984-97
[Figure not reproduced in this text rendition.]
Source: GAO’s analysis of data from SBA’s SBIR database.




The companies in the top group have been the focus of concern because of their large number of awards. As the figure shows, the 25 most frequent winners, representing fewer than 1 percent of the participants, account for about 11 percent of the phase II awards. The concentration of awards is also evident when this group is combined with the intermediate group, that is, when looking at all companies with 5 or more phase II awards. Together, these two groups represent 11 percent of the program’s participants and have received almost half of all phase II awards. The third group, the infrequent winners, constitutes almost 90 percent of the program’s participants and has received slightly more than 50 percent of the phase II awards. Thus, while a relatively small percentage of companies has received a large share of the phase II awards, thousands of companies participating in the program have each won a few awards.




Frequent Winners’ Share of the Program’s Resources

Table 2.1 provides a detailed view of the top 25 winners, including the number and total dollar value of their awards over the life of the program (fiscal years 1983-97), and, when available, the percentage of their revenue derived from the program in fiscal year 1998.




Table 2.1: An Overview of the Top 25 Frequent Winners, Fiscal Years 1983-97

Company                               Phase II awards   Total awards   Dollar value     Percentage of revenue
                                                                       ($ millions)     from SBIR (1998)
Foster Miller                                     147            573          108.2                        20
Physical Optics                                    96            377           71.2                        68
Creare                                             87            281           61.4                        64
Physical Sciences                                  76            290           57.2                        42
Spire                                              75            351           59.4                        26
Radiation Monitoring Devices                       59            187           43.3                        38
Bend Research                                      58            166           34.3                        23
EIC Laboratories                                   53            188           38.1                        33
Mission Research                                   50            196           39.8                         8
Science Research Laboratory                        49            147           33.4                        76
Advanced Technology Materials                      48            208           38.4                        10
Advanced Fuel Research                             42            154           27.8                        52
Ultramet                                           38            140           28.4                        37
Aerodyne Research                                  35            134           27.5                        36
CFD Research                                       35            107           24.7                        52
Sparta                                             35            162           28.3                         a
TDA Research                                       35            127           19.5                        70
Thermacore                                         35            102           25.8                         a
American Research Corp. of Virginia                34            102           19.3                        80
Waterjet Technology                                34            102           21.5                         a
Scientific Research Associates                     33            113           24.0                         a
Giner                                              30            110           22.1                        70
Schwartz Electro-optics                            30            104           20.1                         6
Bio-Metric Systems                                 29             89           18.5                         a
Satcon Technology                                  28            119           22.2                        44


a Information was not available.

                      Source: GAO’s analysis of data from SBA’s SBIR database.



                      The 25 companies that have received the most phase II awards account for
                      a total of 4,629 phase I and II awards worth over $900 million. The most
                      frequent winner, Foster Miller, has received $108 million. All of these
                      companies have participated in the program for at least 10 years.

                      As table 2.1 shows, we also obtained information on the percentage of
                      total annual revenue that these companies attributed to their SBIR awards.
                      These data, provided by a DOD contractor, indicate that the awards
                      contributed about 43 percent of their total annual revenue, on average, for
                      fiscal year 1998.1 However, this figure varied enormously by company,
                      from a low of 6 percent to a high of 80 percent.


Infrequent Winners’ Share of the Program’s Resources

We examined two groups of infrequent winners, including (1) companies with between one and four phase II awards and (2) first-time winners of phase I awards. We found that companies with between one and four phase II awards have also played a major role in the program. These 4,048 companies constitute almost 90 percent of the phase II award winners. They have received over one-half of the program’s total resources (about $4.5 billion out of a total $8.4 billion). They are relative newcomers when compared with the 25 most frequent winners. Slightly over half of them received their first phase II award in fiscal year 1992 or later.

                      First-time winners have also been successful in obtaining awards, winning
                      about one-third of the phase I awards in recent years. Figure 2.2 shows the
                      percentage of first-time winners in the program from fiscal year 1993
                      through fiscal year 1997—the last 5 years for which complete data were
                      available. On average, 750 companies won an award for the first time in
                      each of these years.




                      1
                       This figure is based on information provided by the 20 companies for which information on annual
                      revenue from the SBIR program was available.



Figure 2.2: Percentage of Companies Winning First-Time Phase I Awards, Fiscal Years 1993-97

[Figure not reproduced in this text rendering.]

Source: GAO’s analysis of data from SBA’s SBIR database.




                                      As the figure shows, the percentage of new participants has remained
                                      steady. In our view, this level of participation by first-time winners shows
                                      the program’s substantial capacity to attract new participants each year.2


SBIR Awards and the Program’s Resources Are Concentrated in Several States

SBIR awards, like total U.S. R&D expenditures, are heavily concentrated in several states. A recent SBA study reported that companies in one-third of the states received 85 percent of all SBIR awards and funds from fiscal year 1983 through fiscal year 1996.3 Companies in two states—California and Massachusetts—received by far the highest number of awards. According to the study, the 17 states with companies that won the most awards also have the bulk of the federal R&D expenditures, venture capital investments, and academic research funds.4 Hence, the study observes that the number of small high-technology firms in a state, its R&D resources, and its access to venture capital are important factors in the distribution of SBIR awards and that the distribution of these awards tends to mirror the distribution of R&D funds in general.

2 In commenting on our draft report, the National Aeronautics and Space Administration noted that about 46 percent of the companies that received phase II awards from it in award years 1993-97 were first-time winners.

3 An Analysis of the Distribution of SBIR Awards by States, 1983-1996, SBA, Office of Advocacy (Jan. 1998).

In fiscal year 1997, the geographic distribution of awards was similar to
their distribution over the life of the program. California and
Massachusetts had the highest concentrations of phase II awards, with
California companies receiving 326 and Massachusetts companies
receiving 202. In five states (Virginia, New York, Maryland, Pennsylvania,
and Colorado), companies won between 55 and 80 awards. At the bottom
of the list were 19 states where companies received three or fewer awards.5


For fiscal year 1998, data on the proposal-to-award ratios show that
proposals from companies in states with historically lesser amounts of
federal research funding won awards at almost the same rate as proposals
from companies in other states. However, these data showed some
variation among the individual program agencies. Appendix I provides a
snapshot of the proposal-to-award ratios among the agencies in fiscal year
1998. The geographic distribution of phase II awards by state in fiscal year
1997 is presented in figure 2.3.




4
The 17 states, listed in descending order by number of awards, are California, Massachusetts, Virginia,
Maryland, New York, Pennsylvania, Colorado, Connecticut, Texas, Ohio, New Jersey, Washington,
New Mexico, Florida, Michigan, Alabama, and Illinois.
5
 The 19 states are Alaska, Arkansas, Hawaii, Kansas, Kentucky, Nebraska, Nevada, Oklahoma, Iowa,
Louisiana, Maine, Mississippi, Montana, West Virginia, Idaho, North Dakota, South Carolina, South
Dakota, and Vermont.



Figure 2.3: Distribution of SBIR Phase II Awards by State, Fiscal Year 1997

[U.S. map, not reproduced in this text rendering, showing the number of phase II awards won by companies in each state in fiscal year 1997. Shaded states are those designated by the National Science Foundation for its experimental program to stimulate competitive research.]

Source: GAO’s analysis of data from SBA’s SBIR database and from the National Science Foundation.




Federal Agencies’ Efforts to Expand the Geographic Distribution of Awards Include Special Funding and Outreach

To encourage greater participation by companies in the states with fewer awards, the National Science Foundation has used a program it established about 20 years ago to support research in states with historically lesser amounts of federal research funding. Eighteen states and the Commonwealth of Puerto Rico participate in the program.6 The Foundation and other agencies have also conducted outreach conferences in such states and used the Internet to increase access to the program. Constraints on the amount of funding available to administer the program, according to program managers, have limited the agencies’ efforts to reach out to the states with fewer awards.


National Science Foundation’s Efforts Have Been Effective

One effort that has been effective in increasing the number of awards to small businesses in states with fewer awards is the Foundation’s Experimental Program to Stimulate Competitive Research (EPSCoR), which began in 1981 and was funded at about $49 million in fiscal year 1999. For nearly two decades, the Foundation has used this program to support federally funded research in states that have received relatively little federal research funding and have demonstrated a commitment to develop their research bases and improve science and engineering research and education programs at their universities and colleges. Since 1994, the Foundation’s SBIR program has used EPSCoR to increase its assistance to potential SBIR participants in EPSCoR states. The Foundation assists these small businesses in two ways. First, through EPSCoR, the Foundation’s SBIR program offers a “phase zero” award to help small businesses put together a competitive phase I proposal.7 Second, phase I proposals from EPSCoR states that were ranked in the “highly recommended” or “recommended” category in the proposal review process but were not selected because of funding constraints receive a second review and an opportunity to be funded through EPSCoR.

                            Since 1994, EPSCoR has awarded 82 phase I SBIR grants valued at over
                            $7 million. In the fiscal year 1999 solicitation, EPSCoR awarded 17 phase I
                            grants.8 According to April 1998 testimony by the director of the

                            6
                             The states are Alabama, Arkansas, Idaho, Kansas, Kentucky, Louisiana, Maine, Mississippi, Montana,
                            Nebraska, Nevada, North Dakota, Oklahoma, South Carolina, South Dakota, Vermont, West Virginia,
                            and Wyoming.
                            7
                             Under the phase zero initiative, small businesses may receive about $5,000 to prepare themselves for
                            the phase I competition. Companies use these funds for such things as preliminary data acquisition,
                            analyses, or visits to SBIR agency personnel.
                            8
                             Two of the proposals were cofunded using both SBIR and EPSCoR funds. In addition, the Foundation
                            funded eight phase I proposals in EPSCoR states through the normal proposal review process in the
                            fiscal year 1999 SBIR phase I competition.



Foundation’s Industrial Innovation Program before the House Committee
on Small Business, EPSCoR has enabled a steady increase in participation in
the SBIR program in many of the rural states. In June 1998 testimony before
the Senate Committee on Small Business, a consultant who conducts SBIR
outreach in states in the northern Rocky Mountains and Great Plains
stated that the Foundation’s approach is highly effective and should be
considered for implementation at other agencies.

DOD, the Department of Energy, the National Institutes of Health, and the
National Aeronautics and Space Administration also have programs to
support federally funded research in states with lesser amounts of federal
research funding. However, there was no linkage between these agencies’
SBIR programs and their programs to support research in these states.
Their programs were established in the early to mid-1990s, have smaller
budgets than the Foundation’s program, and generally direct their funding
toward researchers in academic institutions, not small businesses.
Nonetheless, the executive director of the National Aeronautics and Space
Administration’s SBIR program is currently evaluating how EPSCoR might
enable the agency to expand outreach to potential SBIR participants in
states with fewer awards. The program manager for the Ballistic Missile Defense Organization, a DOD component with an SBIR program, told us that it would be appropriate for DOD’s SBIR program and DOD’s program supporting research in states with lesser amounts of federal research funding to work together to assist such states, but no decision has been made to take this step.

Three of the agencies with smaller SBIR programs—the Department of
Agriculture, the Department of Commerce, and the Environmental
Protection Agency—also have programs to support research in states with
lesser amounts of federal research funding. As with the agencies with
larger SBIR programs, there was no linkage between the agencies’ SBIR
programs and their programs to support research in these states. The
manager of Agriculture’s SBIR program stated that he maintains a list of
states with the fewest awards from the Department; he is prepared on a
case-by-case basis to skip the strict numerical ranking of proposals and
make an award to a company to fill a geographic gap, provided the
company was ranked in the “should fund” category (approximately the
top 30 percent) during the regular review process. The Department of
Commerce, through its Experimental Program to Stimulate Competitive
Technology, recently awarded $300,000 to the University of Mississippi to
increase the state’s competitiveness for the SBIR program. At the
Environmental Protection Agency, the SBIR program manager is




                           considering whether and how to link his program with the agency’s
                           program that assists states with lesser amounts of research funding.


Despite Administrative Funding Constraints, Agencies Have Undertaken Other Outreach Efforts

The program’s statute prohibits agency officials from using SBIR funds to pay for the administrative costs of the program, such as the costs of salaries, support services, and outreach efforts. Despite this constraint, agencies have tried to encourage participation by small businesses in states with fewer awards. For example, they have held outreach conferences, offered small businesses help with the proposal preparation and review processes, and used the Internet to increase access to the program. According to a consultant who has specialized in helping small businesses in states in the northern Rocky Mountains and Great Plains win SBIR awards, the agencies have been working effectively to broaden the distribution of awards but could use additional administrative funds to increase their outreach if the restrictions were lifted. In his view, additional outreach could increase the submission of high-quality proposals from small businesses in these states, a key to improving the geographic distribution of awards.

                           Several participating agencies described outreach trips they have made to
                           states or regions of the country that have won a relatively small share of
                           awards. For example, DOD’s program director told us that in 1998 DOD
                           program managers went to Alaska, Maine, and Oregon to discuss the SBIR
                           program. DOD is also cosponsoring regional conferences. It has scheduled
                           conferences in Iowa, Kansas, and Missouri in 1999. If the regional
                           conferences are successful, according to the program director, DOD will
                           conduct more of them. In addition, DOD plans to use about $20,000 to help
                           companies in such states prepare effective proposals. The National
                           Institutes of Health’s former program director told us that in 1998 he
                           traveled to Alaska, Arizona, Hawaii, Idaho, Kentucky, Oklahoma, Oregon,
                           Missouri, North Carolina, and Wyoming to discuss the Institutes’ program.

                           Officials from the departments of Education, Transportation, and
                           Agriculture told us that maintaining an equitable geographic distribution of
                           awards is generally not a problem for their agencies’ programs. However,
                           they also described their special efforts to reach out to small businesses in
                           states with fewer awards. For example, Transportation’s program manager
                           has explored ways in which states can work with small businesses to
                           develop a manufacturing capability for the results of SBIR research.
                           Agriculture’s program manager gives small businesses from states that




                         have won the fewest awards from the Department special consideration
                         for a phase I award.

                         Each participating agency has established a Web page on the Internet to
                         provide up-to-date information on its SBIR program, including agency
                         contacts, information on preparing a proposal, and upcoming events. In
                         addition, some of the agencies have developed state outreach notebooks
                         used by small businesses and agency officials. For example, the Ballistic
                         Missile Defense Organization publishes on the Internet a comprehensive
                         state outreach notebook that provides key agency and state contacts. The
                         notebook is used by several agencies, including those within DOD as well as
                         other program agencies. For example, the manager of Education’s
                         program told us that he placed calls to each of the state officials listed in
                         the Ballistic Missile Defense Organization’s book to inform them about the
                         program. The Environmental Protection Agency also publishes a state
                         outreach notebook with key state and agency contacts.

                         Several program directors told us that the prohibition on using the
                         program’s funds for administrative expenses has limited their ability to
                         conduct outreach to states with fewer awards. Some of the agencies with
                         smaller SBIR programs, in particular, have travel funds that provide for only
                         a few long-distance trips each year. Several agency officials told us that if
                         additional administrative funds were available, they would use the money,
                         in part, to reach out to the states with fewer awards. For example, the
                         Department of Commerce’s program manager told us that with a moderate
                         increase in administrative funds, Commerce could initiate an outreach
                         program that would focus on broadening the distribution of awards.


SBA Is Developing an Outreach Program for States With the Fewest Awards

In 1998, the Congress made available $1 million for SBA to provide technical assistance to the states that receive the fewest SBIR awards. The Congress directed SBA to use the funding for awards to states that received less than $5 million in awards in fiscal year 1995. The eligible states may receive up to $100,000 with a $50,000 state match for efforts such as outreach to small businesses and assistance in applying for awards. Twenty-three states, the District of Columbia, and Puerto Rico qualify for the assistance and may submit proposals.9 SBA published the program announcement in March 1999 and plans to make the first awards in the spring of 1999.




9
 The states are Alaska, Arkansas, Delaware, Hawaii, Idaho, Indiana, Iowa, Kentucky, Louisiana, Maine,
Mississippi, Missouri, Montana, Nebraska, Nevada, North Dakota, Oklahoma, Rhode Island, South
Carolina, South Dakota, Vermont, West Virginia, and Wyoming. Sixteen of the states on SBA’s list and
Puerto Rico are also on the Foundation’s list of EPSCoR states. However, two states—Alabama and
Kansas—are not on SBA’s list but do receive special assistance in the SBIR competition from the
Foundation because they are EPSCoR states. In fiscal year 1997, Alabama was ranked 12th among the
states in the number of phase II awards it received, and Kansas was ranked 34th. In addition, SBA’s list
includes several non-EPSCoR states, such as Alaska, Delaware, Hawaii, Indiana, Iowa, Missouri, Rhode
Island, and the District of Columbia.



Chapter 3

Agencies Are Considering Commercial Potential in Making Awards, but the Emphasis on Commercialization Raises Questions
              In reauthorizing the program in 1992, the Congress emphasized
              commercialization. The act required agencies to consider commercial
              potential in making awards and to collect data on companies that have
              received more than 15 phase II awards during the preceding 5 years. These
              requirements reflected a concern on the part of some Members of
              Congress that certain companies, especially frequent winners, were poor
              commercializers. In response, agencies are weighing the commercial
              potential of all proposals and have collected data on frequent winners. At
              the same time, the emphasis on the goal of commercialization raises
              questions about the role of companies’ commercialization records and the
              program’s other goals in evaluating proposals. First, the role of the
              commercialization record in evaluating the commercial potential of new
              proposals remains unclear. In addition, agencies have made little use of
              their data on commercialization by frequent winners, in part because of
              uncertainty about how to use the information appropriately. Second,
              despite the greater emphasis on commercialization, the program’s other
              goals remain important to the agencies. By itself, according to some of the
              program managers, limited commercialization may not signal “failure”
              because a company may have achieved other goals, such as innovation or
              responsiveness to an agency’s research needs. Because the 1992
              reauthorization and SBA’s 1993 policy directive do not define the role of the
              commercialization record in determining commercial potential and the
              relative importance of the program’s goals, different approaches have
emerged in agencies’ evaluations of proposals. For example, DOD was preparing plans that would have greatly increased the importance of the commercialization record: companies perceived as poor commercializers would have received significantly lower scores on their proposals, making it harder for them to win awards. None of the other agencies was
              taking such an approach. Early tests of DOD’s approach indicated that it
              would have had the unintended effect of lowering the scores of companies
              with relatively few awards and no sales while having no adverse impact on
              winners with many awards and only modest sales. DOD has revised its
              approach to avoid these unintended consequences.




Agencies Are Considering Commercial Potential to Varying Degrees in Making Awards

As required by the 1992 reauthorization act, agencies are taking into account four indicators of the commercial potential of all proposals identified in the legislation.1 Taken together, these four indicators, or pieces of evidence, account for a substantial portion of a proposal’s rating, amounting to as much as one-third of the total score. However, as just one of the four indicators, a company’s commercialization record plays a limited role in the evaluation of commercial potential and an even more limited role when viewed along with the other, noncommercial factors, such as technical merit, that are also considered in an evaluation. At DOD, for example, the commercialization record currently accounts for about one-fourth of the commercial score and about one-twelfth of the total score for a proposal; at the Department of Energy it accounts for about one-eighteenth of the total score. Hence, even a poor commercialization record has thus far exercised only a limited influence on the evaluation process. The following section discusses SBA and the participating agencies individually.


SBA’s Role in Implementing the Legislation

Beyond the 1992 reauthorization’s emphasis on commercialization, SBA’s policy directive provides little or no guidance for participating agencies when considering a proposal’s commercial potential. The directive states that SBA may monitor whether follow-on nonfederal funding commitments obtained by phase II awardees for phase III were considered in the evaluation of phase II proposals as required by the law. As of March 1998, according to the Assistant Administrator for Technology, SBA had taken no steps to monitor this aspect of the program.


DOD’s Evaluation of Commercial Potential

DOD evaluates proposals according to (1) their scientific and technical merit and degree of innovation, (2) the qualifications of key investigators, and (3) the proposals’ commercial potential. According to the program director, the commercial potential typically accounts for about one-third of the total score, although its weight varies somewhat across DOD agencies. One part of the commercial potential is the commercialization record, whose weight also varies somewhat from one DOD agency to another. The main tool for ascertaining the record is a form, Appendix E, contained in DOD’s solicitation. Appendix E requires information from all companies on their commercialization record for each of their phase II awards.

1 Under the 1992 reauthorization legislation, commercial potential is evidenced by “(i) the small business concern’s record of successfully commercializing SBIR or other research; (ii) the existence of second phase funding commitments from private sector or non-SBIR funding sources; (iii) the existence of third phase, follow-on commitments for the subject of the research; and (iv) the presence of other indicators of the commercial potential of the idea.” 15 U.S.C. 638(e)(4)(B).

                             In evaluating the commercial potential of a current proposal, DOD reviews
                             the proposal’s funding commitments, if any, and the company’s
                             commercialization strategy (a 1- or 2-page document that must accompany
                             the proposal). DOD also affords a special opportunity to companies that
                             obtain a cash investment linked with their phase II proposal; in such cases,
                             companies qualify for a “fast-track” review that greatly boosts their
                             chances of winning an award. The percentage of phase II proposals
                             receiving an award rises from 40 percent without a fast-track review to
                             90 percent with such a review. The threshold for the additional investment
                             that qualifies a proposal for a fast-track review is one-fourth of every DOD
                             dollar if the company has never won a phase II award and one-for-one
                             matching dollars if the company has previously won a phase II award.
                             Discussing the relative weight that DOD gives to a company’s
                             commercialization record and the commercial potential of the current
                             proposal, the program director said that the commercial potential
                             accounts, on average, for about one-third of the total score, as noted
                             above, and past performance accounts for about one-fourth of the total
commercial potential. Thus, the commercialization record carries only about one-twelfth of the total score.
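
To make these two rules concrete, the following sketch restates them as simple calculations. It is only an illustration of the thresholds and fractions described above, not DOD’s actual scoring tool; the function names, the example award amount, and the Python rendering are ours.

    # Illustrative sketch (not DOD's scoring tool) of the two rules described
    # above: the cash match needed for a fast-track review and the approximate
    # share of the total score driven by the commercialization record.

    def fast_track_match_required(dod_dollars, has_prior_phase_ii):
        """Cash investment needed to qualify a proposal for a fast-track review.

        Per the thresholds described in the text: 25 cents for every DOD dollar
        if the company has never won a phase II award, and one-for-one matching
        dollars if it has.
        """
        ratio = 1.0 if has_prior_phase_ii else 0.25
        return dod_dollars * ratio

    def record_share_of_total(commercial_weight=1/3, record_share_of_commercial=1/4):
        """Approximate weight of the commercialization record in the total score.

        Commercial potential is about one-third of the total score, and past
        performance is about one-fourth of the commercial score, so the record
        works out to roughly one-twelfth of the total.
        """
        return commercial_weight * record_share_of_commercial

    if __name__ == "__main__":
        # 500,000 is a hypothetical phase II award amount used only for illustration.
        print(fast_track_match_required(500_000, has_prior_phase_ii=False))  # 125000.0
        print(fast_track_match_required(500_000, has_prior_phase_ii=True))   # 500000.0
        print(round(record_share_of_total(), 3))                             # 0.083, about 1/12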

                             The program director told us that deliberations at higher policy-making
                             levels indicate a trend toward greater emphasis on commercialization. He
                             said that the Under Secretary of Defense for Acquisition and Technology
                             intends to focus the program more directly on phase III sales to DOD and
                             the private sector. In response to this effort to enhance commercialization,
                             the program director briefed the Under Secretary in September 1998 and
                             presented three approaches: (1) making more effective use of data on past
                             commercial performance, (2) establishing measures of success for phase
                             III (which will rely on our 1992 report), and (3) increasing the involvement
                             of DOD’s acquisition programs. The Under Secretary has approved these
                             plans. Their implementation is scheduled for May 1999. As part of their
                             implementation, the program director will report semiannually to the
                             Under Secretary on commercialization results in phase III.


The National Science Foundation’s Evaluation of Commercial Potential

The Foundation uses only two criteria, the quality and impact of research, in evaluating proposals. The latter criterion includes commercial potential. No specific percentage is assigned to either criterion. As part of its evaluation of commercial potential, according to the program director, the Foundation reviews the history of phase II awards to companies that submitted proposals. Attachment N of its solicitation, which was modeled on DOD’s Appendix E and introduced in 1997, is one of the means of obtaining this information. Technical reviewers do not see it, but program managers use it when factors besides the strict technical review of quality are taken into account. The 8 to 10 program managers are able to provide special expertise in the selection process because of their detailed knowledge of the proposals and the companies.

                             The program director said that, for the selection of phase I awards, the
                             information in Attachment N functions mainly as a tiebreaker if the
                             technical merit of two proposals has been judged as equal. He noted that
                             the agency’s reliance on a broad nonnumerical rating system gives the
                             program managers more flexibility to use Attachment N as a tiebreaker.
                             He could not say how often the commercialization record plays a
                             tie-breaking role, but he told us that it could do so for about half of the
                             Foundation’s awards. He added that, if two companies submit proposals of
                             equal technical merit and one company has a poor commercialization
                             record while the other company is a newcomer with no record, he will
                             choose the newcomer.
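
A minimal sketch of this tie-breaking logic appears below. The data shape, the definition of a “poor” record, and the function name are our assumptions for illustration; in practice the choice is a judgment made by the Foundation’s program managers, not an automated comparison.

    # Minimal sketch of the tie-breaking rule described above; the names, data
    # shape, and the definition of a "poor" record are assumptions, and the
    # Foundation's actual selection is a program manager's judgment.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Proposal:
        company: str
        technical_rating: str         # e.g., "highly recommended" or "recommended"
        prior_phase_ii_awards: int    # history reported on Attachment N
        sales_from_prior_awards: float

    def prefer_on_tie(a: Proposal, b: Proposal) -> Optional[Proposal]:
        """Return the proposal favored when technical merit is judged equal."""
        if a.technical_rating != b.technical_rating:
            return None  # not a tie; the technical review decides

        def poor_record(p):
            # Assumed definition: prior awards but no sales to show for them.
            return p.prior_phase_ii_awards > 0 and p.sales_from_prior_awards == 0

        def newcomer(p):
            return p.prior_phase_ii_awards == 0

        if newcomer(a) and poor_record(b):
            return a  # the newcomer with no record is preferred
        if newcomer(b) and poor_record(a):
            return b
        return None  # the commercialization record does not settle the tie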


The National Aeronautics and Space Administration’s Evaluation of Commercial Potential

The National Aeronautics and Space Administration’s program manager stated that commercial potential accounts for about 25 percent of the total possible score, although the solicitation does not specifically indicate this percentage. In evaluating this area, consideration is given to (1) the commercial potential of the technology, (2) the demonstrated commercial intent of the company, and (3) the capability of the company to bring successfully developed technology to commercial application. In evaluating the company’s commercialization record, the program manager told us that he applies a subjective sense of a company’s record in general. He added that the agency has conducted an extensive survey of phase II commercial outcomes that may enable it to take a more structured approach in evaluating the record. The results of the survey, which covered companies that won phase II awards from 1984 through 1994, are currently being tabulated and analyzed. To date, the survey data have been used in an aggregate way to answer questions about commercialization. The program manager told us that although little emphasis has been placed so far on the records of individual companies, the data could provide this information. The agency is now weighing what influence the survey data should have on future awards.




The Department of Energy’s Evaluation of Commercial Potential

Energy’s scoring system for selecting proposals focuses on three criteria: the strength of the technical approach, the company’s ability to carry out the project, and the project’s impact (which includes commercial potential). Each criterion counts as one-third of the overall rating. The program manager commented that a company with poor commercial potential needs to be judged as almost perfect under the other two criteria to receive an award. He added that Energy does not require a commercialization plan in connection with phase II proposals, in part because he believes that (1) it is too early for such a plan to be meaningful, (2) agency personnel are not qualified to review it, and (3) important information concerning possible commercialization is included with the proposal in a section called “Anticipated Benefits”; this section discusses the expected product or process, the likelihood that it could lead to a marketable product, and the significance of the market.

                             In evaluating commercial potential, the program manager told us that he
                             interprets the four points in the 1992 legislation literally. He noted that
                             only one of the four refers to a company’s commercialization record and
                             this record, in turn, is weighted proportionally in the evaluation of
                             commercial potential. (The impact criterion is divided into two parts. The
                             commercialization record accounts for one-third of one of these parts, or
                             one-sixth of the impact criterion. This impact score then accounts for
                             one-third of the total score. Thus, the commercialization record accounts
                             for about one-eighteenth overall.) He added that although the
                             commercialization record accounts for only a small percentage of the total
                             score, it could make an important difference in a tight competition.
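
In arithmetic terms, the weighting the program manager described works out as follows (this is simply a restatement of the fractions given above):

\[
\frac{1}{3} \times \frac{1}{2} \times \frac{1}{3} \;=\; \frac{1}{18} \approx 5.6\%\ \text{of the total score,}
\]

where the first factor is the record’s share of one part of the impact criterion, the second reflects that the impact criterion has two parts, and the third is the impact criterion’s share of the overall rating.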

                             In obtaining data on commercialization, Energy requires companies, as a
                             condition of their phase II grant, to provide the program manager with an
                             annual report on phase III funding at the end of phase II and for 3 years
                             after their project’s completion. This report is to detail the sources and
                             amounts of the nonfederal funding used to continue support for, or
                             commercialize the research funded by, the award. The program manager
                             said that 1994 was the first year that Energy began to formally use
                             commercial potential as a criterion in evaluating proposals. Prior to 1994,
                             information on commercialization was used only as a tiebreaker in
                             specific instances.


The National Institutes of Health’s Evaluation of Commercial Potential

                             The Institutes’ current solicitation includes commercial potential as one of
                             its criteria, but neither commercial potential nor any of the Institutes’
                             other six criteria is assigned a definite weight in the grants program, which
                            provides about 95 percent of the Institutes’ SBIR awards. For the contract
                            awards that constitute the remaining 5 percent, the program manager told
                            us, commercial potential accounts for 10 percent of a proposal’s score. All
                            proposals receive a peer review in which an average of three reviewers
                            represent the small business community and 10 to 15 doctors and
                            scientists represent the biomedical community. The small business
                            representatives are included for scientific balance and special
                            business-related knowledge, according to the program manager, but the
                            other peer reviewers (research scientists and physicians) can also
                            comment on the proposals’ commercial potential. Each peer review leads
                            to a summary statement that incorporates the major comments, including
                            those relating to commercial considerations. The program manager said
                            the scoring of commercial potential was subjective. He was unaware of
                            any instances in which a company’s commercialization record had
                            influenced the choice of proposals.2


The Five Smaller Programs and Their Evaluation of Commercial Potential

                             The five smaller programs are emphasizing commercial potential, but only
                             the Environmental Protection Agency indicated that the
                             commercialization record plays a potentially significant role in making
                             awards.

                            At the Environmental Protection Agency, the director of the
                            Environmental Engineering Research Division told us that the biggest
                            single change in the agency’s program since the 1992 reauthorization has
                            been the increased emphasis on commercialization. He said that
                            commercialization used to be one of six criteria used in judging proposals;
                            now, it is one of five. For phase I proposals, the agency requires a 2- to
                            3-page commercialization plan. For phase II proposals, it requires a fully
                            developed plan. In a peer review of phase II proposals, a
                            commercialization reviewer is responsible for rating the quality of the
                            complete plan. One effect of this increased emphasis is that a company’s
                            commercialization plan and record play a greater role in the peer review’s
                            final rating of a proposal. Specifically, according to the director, if a
                            company has a poor plan and record, its proposal will have much more
                            difficulty obtaining a “very good” or “excellent” rating (required for the
                            proposal to be eligible for funding), particularly if, technically, the
                            proposal is in the borderline area between “good” and “very good.”



                            2
                             In commenting on our draft report, the National Institutes of Health stated that some of their staff
                            take past commercialization success into account. However, those data have not been tracked by a
                            central office in the Institutes, which may be why the program manager was unaware of any instances.



                     Agriculture sent a questionnaire to all phase II winners from the start of
                     the program through 1995 and found that more than 50 percent reported
                     some commercial sales. The Department’s SBIR program manager said that
                     it would be rare for a company with a poor commercialization record to be
                     penalized on a phase I proposal; instead, he said, a commercially
                     successful company might receive a boost from its previous success. At
                     phase II, more attention is given to commercial potential, but the two most
                     important review criteria are (1) the degree to which phase I objectives
                     were met and technical feasibility was demonstrated and (2) the technical
                     merit of the phase II proposal.

                     Transportation’s program manager stated that Transportation reviewers
                     consider technical merit and commercial potential when reviewing
                     proposals. He added that a company’s commercialization record has little,
                     if any, bearing on the selection of proposals and that the record has never
                     been used to make or break a proposal. Commerce has developed
                     guidelines for a commercialization plan to be included in phase II
                     proposals. The program manager said that this plan, which documents
                     how the company will convert its research into a commercial product, is
                     critical to winning a phase II award. Commerce has not found the
                     commercialization record to be a significant factor in its selections.
                     Education’s criteria for phase I awards include the potential commercial
                     applications of the research. Past commercialization success is among the
                     criteria for phase II awards.


Penalties for Poor Commercialization Records May Have Unintended Consequences

                      Agencies have collected data on commercialization by companies,
                      including frequent winners. According to SBA’s Assistant Administrator for
                      Technology, the 1992 reauthorization directs agencies to collect
                      information on commercialization by companies with 15 or more phase II
                      awards but does not clarify how they are supposed to use it.3 Without such
                      clarification, agencies may establish different sets of rules that will be
                      confusing to companies, many of which have received SBIR awards from
                      more than one agency. For example, as discussed later in this chapter, DOD
                      planned to implement an approach that would have greatly increased the


                     3
                      The act requires agencies in their annual reports to include an accounting of the phase I awards made
                     during the reporting period to entities that have received more than 15 phase II awards during the
                     preceding 5 fiscal years. 15 U.S.C. 638(l)(2). The act also required SBA to modify the SBIR program
                     directive to provide for procedures to ensure that these companies, when they submit phase I
                     proposals, are able to demonstrate the extent to which they have been able to secure phase III funding
                     for their previous phase II awards. 15 U.S.C. 638(j)(2)(H). The policy directive requires companies to
                     document the extent to which they have secured phase III funding to develop concepts resulting from
                     their phase II awards and requires agencies to collect and retain such information. SBIR Policy Directive
                     para. 15c.



                           importance of the commercialization record. In addition, DOD’s approach
                           would have led to unintended consequences, as early tests of its plan
                           indicated. None of the other agencies developed such an approach.


SBA’s Role in Addressing Frequent Winners

                           In response to a requirement in the 1992 legislation, SBA included a section
                           in its 1993 policy directive requiring agencies to collect and retain
                           documentation on companies receiving 15 or more phase II awards in the
                           previous 5 years. The section restated the law while furnishing no
                           additional details. The Assistant Administrator for Technology told us that,
                           in his view, the legislation requires agencies to collect information on
                           commercialization but provides no guidance on what should be done with
                           it.


DOD’s Response to Frequent Winners

                           DOD’s program director told us that all companies’ proposals are given
                           equal scrutiny when being ranked for commercial potential. The only way
                           “frequent winners” will be given somewhat greater scrutiny is connected
                           with the development of a past performance index. This index applies to
                           companies that won five or more phase II awards from fiscal year 1984
                           through fiscal year 1995. DOD’s focus on companies with five or more
                           awards during this period is broader than the focus on multiple winners
                           specified in the 1992 act. DOD’s approach potentially includes hundreds of
                           companies, whereas, according to an SBA official, the law leads to a list of
                           only 24 frequent winners for fiscal years 1993-97, the latest period for
                           which data were available.

                           DOD has had difficulty making effective use of the commercialization
                           records obtained from frequent winners. The problem has arisen because
                           of the large number of phase II awards and the volume of information. For
                           example, the program director told us that the company with the most
                           phase II awards over the life of the SBIR program has submitted
                           information on 94 completed phase II awards in a 19-page document.4
                           According to the program director, many of the technical reviewers have
                           little familiarity with the program and therefore lack the background to
                           grapple with so much information and reach a “bottom line” about the
                           company’s commercialization record.

                           To alleviate this problem, the program director plans to create a past
                           performance index for the program’s frequent winners. Each of these

                           4
                            He noted that the company should also have provided information on its 54 ongoing phase II awards
                           but did not do so.



companies will be required to submit an electronic file of the
commercialization results of its phase II awards that will be used to
calculate how the company’s sales and additional developmental funding
compare with the DOD-wide average per award. The output will be a
number showing a company’s commercialization record as a percentage of
the DOD-wide average. The company will be asked to include this figure in
each new proposal so that the technical evaluators will see for the first
time a snapshot of the company’s past commercial performance level. In
discussing the weight that will be given to the index in evaluating
proposals, the program director said that, in general, DOD would not
prescribe any particular use for the data.
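
The calculation behind the index, as described by the program director, can be illustrated with a short sketch. The record layout and dollar figures below are hypothetical; they are not DOD's actual electronic file format or data.

# Illustrative sketch of the past performance index: a company's sales plus
# additional developmental funding per phase II award, expressed as a
# percentage of the DOD-wide average per award. Figures are hypothetical.

def past_performance_index(company_awards, dod_wide_average_per_award):
    """Return the company's commercialization per phase II award as a
    percentage of the DOD-wide average per award."""
    total = sum(a["sales"] + a["additional_dev_funding"] for a in company_awards)
    per_award = total / len(company_awards)
    return 100.0 * per_award / dod_wide_average_per_award

# Hypothetical company with three completed phase II awards.
awards = [
    {"sales": 250_000, "additional_dev_funding": 100_000},
    {"sales": 0,       "additional_dev_funding": 50_000},
    {"sales": 900_000, "additional_dev_funding": 0},
]
print(round(past_performance_index(awards, dod_wide_average_per_award=650_000)))
# Prints 67, that is, about 67 percent of the DOD-wide average per award.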

According to the program director, the only requirement that will govern
all of DOD’s SBIR agencies applies to companies that have received five or
more phase II awards since the start of the program and have achieved
only 5 percent or less of the DOD-wide average for sales and additional
developmental funding per award.5 For these companies, at the agency’s
discretion, the rating on commercial potential may be “capped” at half of
the total possible score. This cap will increase the weight for the
commercial record from the current one-twelfth of the total score to
one-sixth, a change that could reduce the number of awards to this group
of companies. (On a 100-point scale, DOD’s “cap” would decrease a
proposal’s score by about 16 points, a substantial penalty.) This policy,
however, permits an exception if the program manager recommends that
the company be exempted from this requirement and the contracting
officer approves the exception.
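
The approximately 16-point penalty cited above is consistent with the one-sixth weight applied to a 100-point scale:

\[
\frac{1}{6} \times 100\ \text{points} \approx 16.7\ \text{points.}
\]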

In November 1998, the DOD support contractor implementing the index
pointed out some difficulties in making the index work effectively and set
the stage for the revised approach. The contractor noted that two-thirds of
all phase II awards in DOD show no sales. Against this backdrop, even
frequent winners with relatively low sales could turn out to be “above
average.” The leading frequent winner, for example, came out above
average simply because it had achieved limited sales with its numerous
phase II awards. The support contractor pointed out that the index did not
allow for important factors, such as recent awards to companies that have
not had time to commercialize them. In addition, it did not distinguish
between technologies such as software, which may be commercialized
quickly, and hardware, which may require a manufacturing step that takes
longer to commercialize. The support contractor concluded that much

5
 Initially set at 25 percent, this figure was lowered to 5 percent under DOD’s revised approach. As a
result, significantly fewer companies are potentially affected by DOD’s plan, and their level of
commercialization is significantly lower than under the earlier plan.



                            more testing needed to be done before the index could become an
                            effective tool. The support contractor expressed concern about the
                            number of companies that might be affected and said that if it was too
                            high, the threshold would need to be adjusted. Further work by the
                            support contractor in April 1999 increased DOD’s concern about the
                            unintended consequences of the Department’s plan and led to important
                            revisions, including the lowering of the threshold.6


The Department of Energy’s Response to Frequent Winners

                            Energy has collected data on commercialization by its awardees, including
                            frequent winners. The program manager told us that these data do not
                            indicate a significantly lower level of commercialization by frequent
                            winners in general. For example, awardees with five or fewer phase II
                            awards from fiscal year 1984 through fiscal year 1996 averaged $1 million
                            in sales per project. Companies with nine or more phase II awards during
                            the same period averaged $854,000 per project. If the frequent winner with
                            the poorest record among the 10 companies with nine or more phase II
                            awards is removed from the calculation, the average rises to $939,000. Of
                            the two companies that received the most phase II awards from Energy
                            from fiscal year 1984 through fiscal year 1996, the company with 16
                            awards averaged $1.3 million and the company with 17 awards averaged
                            $1.7 million in sales per project.

                            Because Energy has developed detailed commercialization data on its
                            frequent winners, we asked the program manager whether this
                            information might have led to penalties during evaluations of proposals
                            from frequent winners with poor commercialization results. He told us
                            that he has not used this information to penalize any company beyond
                            considering commercial potential in phase II, as discussed previously. He
                            said that the law instructs the agencies to collect the data but, in his view,
                            does not tell them how to use it effectively in dealing with frequent
                            winners, even those that are clearly poor performers. He concluded that if
                            the Congress wants the agencies to monitor frequent winners and have the
                            data make a difference in the award process, then the law itself may have
                            to be clarified.


Other Agencies’ Responses to Frequent Winners

                            Other agencies have given only limited attention to the concern about
                            frequent winners. For example, program managers at the National Science

                            6
                             In commenting on the draft report, DOD’s SBIR program director stressed that the past performance
                            index is one among many informational tools that DOD will use in evaluating proposals. He further
                            noted that, for a company with a strong commercialization record, the index offers an opportunity for
                            a favorable rating that may lead to a higher score on the company’s proposals.



                    Foundation and the National Aeronautics and Space Administration were
                    uncertain whether the legislation defined a frequent winner as a company
                    with 15 awards in a 5-year period from a single agency or from all agencies
                    combined. They interpreted the law to mean awards from a single agency
                    and found virtually no companies that belonged in this category at their
                    agencies, so they did not focus further on the issue. The program manager
                    at the National Institutes of Health told us that his agency collected the
                    information, as required, but that officials were uncertain how to use the
                    information effectively and said that it played a minimal, if any, role in the
                    evaluation of proposals. In general, the five agencies with smaller
                    programs have taken no special steps to focus on frequent winners.


The Emphasis on Commercialization Raises Questions About the Role of Other Goals in Evaluating Companies’ Performance

                    Despite the greater emphasis on commercialization, the program’s other
                    goals remain important to the agencies when evaluating companies’
                    accomplishments and subsequent proposals. According to some of the
                    program managers, a relatively low level of commercialization may not
                    signal failure because a company may have achieved other goals. The
                    difficulty, for agencies, of using any particular goal as a key criterion for
                    selecting future proposals for funding stems from their not having (1) a
                    clear definition of the program’s goals, (2) information on the relative
                    weight that should be given to these potential goals, and (3) criteria for
                    judging whether these goals have been achieved.
                    Finding practical ways to define and measure the SBIR program’s goals in
                    order to evaluate proposals has been difficult. For example, efforts to
                    define and measure technological innovation, which was one of the
                    program’s original goals, have posed a challenge. Although definitions
                    vary, there is widespread agreement that technological innovation is a
                    complex process, particularly in the development of sophisticated modern
                    technologies. Technological innovation can involve many steps, including
                    research, engineering, prototype testing, and product development.
                    Because technological innovation occurs in many different ways, no one
                    indicator is an accurate measure of it. Differences among firms’ operating
                    styles can also create measurement problems. Some innovative firms will
                    file many patent applications (which are sometimes used as measures of
                    innovation), while others will prefer to retain trade secrets. Similarly,
                    according to SBA’s Assistant Administrator for Technology, the 1992
                    reauthorization lacks a clear definition of “commercialization,” and he has
                    sometimes differed with agencies on its meaning. This absence of a
                    definition makes it more difficult, in his view, to determine when a
                    frequent winner is “failing” to achieve a sufficient level of



commercialization and how to include this information in an agency’s
review of the company’s proposal.

The relative weight that should be given to the goals when evaluating
proposals remains unclear. Innovation and responsiveness to an agency’s
needs, for example, may compete with the achievement of
commercialization. In the view of many program managers, innovation
involves a willingness to undertake R&D with a higher element of risk and a
greater chance that it may not lead to a commercial product;
responsiveness to an agency’s needs involves R&D that may be aimed at
special niches with likewise limited commercial potential. Striking the
right balance between encouraging new, unproven technologies and
achieving commercial sales is, according to the program managers, one of
the key ingredients in the overall success of the program. A former
director of the Ballistic Missile Defense Organization’s program told us
that commercialization could be significantly boosted. He added, however,
that he would oppose the use of commercial success as an exclusive
measure for the program because innovation and support for higher-risk
projects would then be virtually eliminated as goals.

Agencies have also not agreed on criteria for what constitutes “success” in
relation to these goals. The former program manager of the Ballistic
Missile Defense Organization put the problem clearly: How much
                    commercialization is “enough”? If an exclusive focus on commercial
success might signal that the program was “picking winners” and
sacrificing innovation, then what is the appropriate mix of higher-risk
projects that lead less frequently to commercial outcomes and lower-risk
projects that lead more frequently to successful products?

The difficulty caused by this lack of criteria is compounded by other
factors, such as the high concentration of commercial success in only a
handful of projects in the program. For example, as shown by our 1992
report, 1.5 percent of the projects accounted for almost half of all the sales
at that time. A 1996 survey by the DOD support contractor of DOD projects
from 1984 to 1992 also found that 1.5 percent of these projects accounted
for 50 percent of the sales and 4 percent accounted for 75 percent of the
sales. For a program in which the great majority of projects achieve no
sales or only very limited sales, the evaluation of subsequent proposals
from individual companies becomes more difficult if commercialization is
considered the primary goal. As the SBA contractor stated in a presentation
at the National Academy of Sciences in November 1998, this high
concentration of success necessitates large-scale surveys of the program



              because the outcomes achieved by smaller subsets of winners or
              individual companies may be significantly influenced by the presence or
              absence of just a few major successes. In a separate discussion, the SBA
              contractor noted that Creare, one of the most frequent winners mentioned
              in chapter 2, generated $110 million in actual and $90 million in anticipated
              sales through a single phase II award entitled “Numerical Modeling for
              Chemical Vapor Deposition.”

              As the emphasis on commercialization has grown, so have concerns that
              noncommercial successes may not be captured at all. For example, the
              president of the Innovation Development Institute in Massachusetts
              expressed concern that the growing emphasis on commercialization was
              occurring at the expense of innovation and agencies’ R&D needs. She
              believed that some of the higher-risk projects that received awards in the
              mid-1980s and led to “technology leaps” would not now be seriously
              considered for awards because they would be judged too time-consuming
              and too risky. She was also disturbed by the suggestion that firms doing
              high-quality work and meeting the needs of a federal agency in an
              innovative manner are somehow deficient. There was no suggestion,
              however, of a valid methodology for assessing success in meeting the
              program’s other goals.

              Program managers also expressed concern that noncommercial
              accomplishments may not be adequately recognized. For example, the
              Navy program manager described a software project for a special military
              need with limited sales potential; he said it was very helpful in reducing
              the agency’s expenditures but believed that the savings would not be
              captured in typical measurements of commercialization. Likewise, the
              program manager at the National Institutes of Health cited instances of
              special medical equipment, such as pediatric heart devices, with limited
              markets. He pointed out that emphasizing commercialization as the
              primary goal would discount achievements in these areas. In general, we
              found that program managers valued both noncommercial and
              commercial successes and feared that the former might be ignored in
              emphasizing the latter.


Conclusions

                     The existing legislation has generally increased participating agencies’
                     consideration of the commercial potential associated with new phase II
              proposals. It directs the agencies to consider the commercial potential
              (including the company’s commercialization record as one of four types of
              evidence) of each phase II proposal but does not clarify the extent to



                     which this potential should be a factor in making awards. This lack of
                     clarity about the role of commercialization is further evident in the
                     provision dealing with frequent winners. It directs the agencies to collect
                     information from companies submitting phase I proposals that have
                     received more than 15 phase II awards during the preceding 5 years to
                     demonstrate the extent to which they have been able to secure phase III
                     funding for these awards. However, the law provides no guidance on how
                     this information should be used. In turn, the emphasis on
                     commercialization has raised questions about the role of the program’s
                     other goals in the evaluation of companies’ proposals. Program managers
                     and others have expressed concern that the other goals and
                     accomplishments may not be sufficiently recognized.

                     Lacking guidance on these issues, agencies must determine their own
                     responses, and differences among agencies have emerged. In particular,
                     DOD developed a unique approach that would have led to lower scores on
                     proposals from companies with 5 or more phase II awards if they were
                     perceived as poor commercializers. DOD has revised its approach to
                     account for differences in the number of awards to specific companies and
                     to avoid the unintended consequences of its plan. Despite this
                     improvement, the lack of clarity in the legislation remains a concern.


Matter for Congressional Consideration

                     When the Congress considers the reauthorization of this program, it may
                     wish to clarify the relative emphasis that agencies, in evaluating
                     companies’ proposals, should give to a company’s commercialization
                     record as part of the goal of commercialization and to the program’s other
                     goals. This clarification would help ensure uniformity in the program and a
                     clear set of standards by which to determine whether, and to what extent,
                     commercialization and the program’s other goals should be considered in
                     evaluations of proposals.


Agency Comments and Our Evaluation

                     Only the National Institutes of Health expressed concern about the matter
                     for congressional consideration. The Institutes believed that the matter’s
                     focus on uniformity would miss the fact that different relative emphases
                     on the commercialization record may be appropriate to agencies’ different
                     missions. The Institutes also questioned what they considered to be the
                     report’s close association between success and commercialization. In
                     general, we do not believe that an effort to clarify the relative emphasis on
                     commercialization and the program’s other goals will lead to a focus on
                     uniformity or insensitivity to the agencies’ divergent missions. Moreover,



the report does not equate success and commercialization. This chapter
discussed the way in which the emphasis on commercialization raises
questions about the role of the program’s other goals and stated that
despite the greater emphasis on commercialization, these other goals
remain important to the agencies when evaluating a company’s
accomplishments and subsequent proposals. We made no changes to the
matter as a result of the comments provided by the National Institutes of
Health.




Chapter 4

SBA Has an Opportunity to Standardize
Evaluations of the Program’s Outcomes

                          Commercialization is only one of the program’s objectives but has become
                          the main outcome for measuring its effectiveness. Studies of
                          commercialization have proliferated as agencies have tried to obtain data
                          on commercial activity. In the 1990s, studies by GAO, individual agencies,
                          and academic specialists have focused on sales, developmental funding,
                          “success stories,” and a variety of other measures. A review of these
                          studies shows that although they rely on different approaches, they
                          contain some common criteria for success, which suggests a further
                          opportunity for standardizing the measurement of commercialization. Two
                          of the main steps toward establishing a standard approach would involve
                          the development of uniform criteria for success and an improved SBIR
                          database at SBA that captures information on commercial outcomes.
                          Established before the passage of the Results Act, SBA’s database contains
                          information emphasizing input data (such as company names and awards)
                          while giving virtually no attention to results. As SBA develops a new
                          database, called Tech-Net, which is scheduled for full implementation in
                          1999, it has an opportunity to include outcome-related measures that can
                          be used to track commercialization and other indicators of success.


Various Methods With Similar Criteria for Success Have Been Used in Attempting to Measure Outcomes

                          Various methods have been used to quantify commercialization and
                          related outcomes of the program. Some of the major methods include the
                          approach in our 1992 report on commercialization, the Department of
                          Energy’s emphasis on a company’s products and services (derived from
                          SBIR technology) rather than on individual SBIR projects (an approach that
                          sometimes has the effect of “clustering” awards), reliance on “success
                          stories,” and an academic approach. A frequent winner has also developed
                          a method of its own. Although the methods have differed, many of the key
                          criteria for success focus on common concerns, such as levels of sales and
                          developmental funding. The following section gives an overview of these
                          methods but is not intended to include every study of commercialization in
                          the program.


Survey Criteria for the 1992 Report Identified Outcomes

                          Our 1992 report responded to a congressional mandate that we report on
                          the commercial outcomes of the program. We surveyed companies that
                          had won phase II awards from 1984 through 1987 and received information
                          on the outcomes of 1,457 projects. The survey instrument contained about
                          40 questions. One of the key questions was the following: “Has the
                          technology associated with this project led to additional developmental
                          funding and/or sales, and is further work on this technology under way?”
                          This question was intended to divide projects into four major categories



                             according to their phase III outcomes. It identified projects that (1) had
                             achieved funding and/or sales and had further work under way, (2) had not
                             yet achieved funding and/or sales and had further work under way, (3) had
                             achieved funding and/or sales and had no further work under way, or
                             (4) had achieved no funding and/or sales and were discontinued. The
                             remainder of the questionnaire focused mainly on obtaining further
                             information about projects falling into each of these categories. For
                             example, for projects that remained active in phase III, we asked detailed
                             questions about the amounts of additional developmental funds and sales,
                             the sources of their funds and the markets for their sales, and the levels of
                             financial activity expected in the future.
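
The key question above implies a simple four-way classification. The sketch below restates it; the category labels and example response are ours for illustration and are not the survey's actual coding scheme.

# Four-way classification of phase III outcomes implied by the key survey
# question. Labels and the example response are illustrative only.

def classify_project(achieved_funding_or_sales, further_work_under_way):
    if achieved_funding_or_sales and further_work_under_way:
        return "(1) funding/sales achieved, further work under way"
    if not achieved_funding_or_sales and further_work_under_way:
        return "(2) no funding/sales yet, further work under way"
    if achieved_funding_or_sales and not further_work_under_way:
        return "(3) funding/sales achieved, no further work under way"
    return "(4) no funding/sales, project discontinued"

# Hypothetical response to the key question.
print(classify_project(achieved_funding_or_sales=True, further_work_under_way=False))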

                             This approach has also been used in later surveys of the program. In 1996,
                             a DOD support contractor used it to survey all of DOD’s phase II awards. The
                             contract manager kept the basic structure intact but streamlined it by
                             eliminating certain questions that had not led to findings in our own 1992
                             review. In 1997, SBA asked the same DOD support contractor to conduct a
                             governmentwide survey of SBIR commercialization using the same
                             questionnaire. Additional use of this approach is being made by individual
                             agencies. In 1998, USDA sent a questionnaire to its phase II awardees that
                             uses similar outcome-related criteria. The National Aeronautics and Space
                             Administration has also focused on the outcomes associated with
                             individual phase II awards. Its survey asks for information on sales to
                             government agencies and the private sector, additional developmental
                             funding, the number of spin-off firms and patents, and other measures of
                             outcomes. As mentioned in chapter 3, the agency has sent the
                             questionnaire to its phase II awardees.1


Energy’s Approach Differs, Permitting the Clustering of Awards to Measure Outcomes

                             The Department of Energy’s approach differs from GAO’s 1992 approach in
                             that a company is asked to report on products and services derived from
                             SBIR technology instead of on individual awards. (As noted in ch. 3, Energy
                             has required its phase II awardees to provide annual reports on phase III
                             funding—i.e., on sales and further developmental funding—at the end of
                             phase II and for 3 years.) The data summary that Energy sends to
                             companies includes a list of all of their previous phase II awards. For each
                             product or service identified, the companies are instructed to identify
                             which phase II projects (as many as appropriate) contributed to that
                             product or service. As a result, Energy found, multiple SBIR projects

                             1
                              In commenting on our draft report, the National Aeronautics and Space Administration stated that its
                             survey of phase II projects provides relatively current information on commercial activities. It stated
                             that its survey is being implemented as an ongoing effort rather than as a single effort that would
                             quickly become outdated.



                             sometimes contributed to the same product or service, and, conversely,
                             multiple products and services were sometimes derived from the same
                             SBIR project. The program manager believes that companies find it easier
                             and more reliable to trace their commercial results to their own products
                             and services rather than to a single award. He is concerned that the
                             attempt to capture the results of each award individually may lead to
                             “double counting,” since more than one award sometimes leads to the
                             same product and, thus, to the same commercial result.
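
One way to picture the reporting structure just described is as a many-to-many mapping between phase II awards and the products or services derived from them. The identifiers and dollar amounts below are invented for illustration and do not come from Energy's data.

# Sketch of Energy's clustering approach: results are reported by product or
# service, and each product lists the phase II awards that contributed to it,
# so the mapping can be many-to-many. All identifiers and figures are invented.

products = {
    "cryogenic pump": {"contributing_awards": ["DE-91-001", "DE-93-014"], "sales": 2_400_000},
    "sensor package": {"contributing_awards": ["DE-93-014"],              "sales": 600_000},
}

# Summing sales by product, rather than attributing a product's sales to each
# contributing award and then summing by award, avoids counting the pump's
# $2,400,000 twice.
total_sales = sum(p["sales"] for p in products.values())
print(total_sales)  # Prints 3000000.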

                             Despite the difference in methodology, Energy’s approach relies on
                             outcome-related criteria for success. For example, it asks about products
                             or services, sales and developmental funding, partners, and abandoned
                             projects. In addition, the program manager noted that, in most cases,
                             companies responding to the form have ascribed their products and other
                             commercial outcomes to an individual award, which further reduces the
                             apparent difference between the two approaches. At our request, he
                             reviewed the responses from a sample of 143 companies (about half of all
                             companies with phase II awards in the first 10 years of the program) and
                             found that about four-fifths of the companies responded in terms of
                             individual awards.


The “Success Stories” Approach Has Similar Basic Criteria for Success but Also Has Important Limitations

                             The National Science Foundation, DOD, the National Aeronautics and
                             Space Administration, and other agencies have presented success stories
                             stemming from their awards. The purpose of these stories varies from
                             agency to agency. For example, the National Science Foundation has used
                             this approach to document the most significant results of its awards. By
                             contrast, the National Aeronautics and Space Administration places little
                             value on the approach as a measure of the program’s results and uses it
                             primarily to help companies market their technologies.

                             The National Science Foundation has used this approach three times. Its
                             first review of success stories was completed in September 1996 and was
                             entitled “50 Examples of SBIR Commercialization.” In carrying out the
                             study, the Foundation’s contractor obtained the information through
                             telephone and personal interviews, usually with the company president at
                             the time of the original award and through the early growth period. The
                             key questions included the following: (1) Did any of the Foundation’s SBIR
                             research awards make a significant difference to the performance and
                             growth of your company? (2) Did the project result in commercial sales?
                             As a follow-up question for discussion, the person being interviewed was
                             asked to include results that probably would not have occurred without



the SBIR program or the Foundation’s SBIR award, or sales, investment, and
other actions that would not have taken place in the same period. The
survey used these criteria for success to explore the results in an
“open-ended” way rather than relying on a more detailed and structured
set of questions.

This approach led to summaries of 50 of what the Foundation considered
its major successes showing a wide variety of commercial outcomes.
Overall, as the program director testified in April 1998, the success stories
approach led the Foundation to find that, with respect to the private
sector’s commercialization of technology, the top 50 successful small
business grantees (representing about 10 percent of the Foundation’s
phase II grantees) have grown until they account for direct sales of $2.7
billion and 10,000 jobs created. Given that the Foundation’s total
investment in the SBIR program throughout its history is $350 million, the
program manager concluded that the Foundation had received a 7-to-1
return on its investment.
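
As a rough check of the figures cited in the testimony, dividing the reported sales by the reported investment gives

\[
\frac{\$2.7\ \text{billion}}{\$350\ \text{million}} \approx 7.7,
\]

which is broadly consistent with the roughly 7-to-1 return the program manager cited; his own calculation may have used different figures.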

The program director told us that a second contractor is resurveying the
same 50 companies to verify the original information and gain more insight
into these companies. The Foundation has also let a third contract to study
20 additional companies. The program director said that these studies of
70 companies would capture a significant percentage of the success
achieved with the Foundation’s awards.

DOD’s  program director told us that all of DOD’s major SBIR agencies have
used the success stories approach. He cited problems with this approach,
including the lack of a consistent method among DOD’s separate agencies
and the vagueness of the resulting information. He said that each
component, including the SBIR headquarters office in the Office of the
Secretary of Defense, goes its own way in asking questions of companies
and that no systematic approach or evaluation has been attempted.
Moreover, he said that the resulting information, when companies are
asked to describe their outcomes, frequently leads to vague phrases such
as “advancing the state of the art.” He commented that, if the success
stories approach is to prove valuable, companies should be asked a better
set of questions.

The National Aeronautics and Space Administration has relied on its
questionnaire survey to obtain information on commercial outcomes; by
contrast, according to the SBIR program manager, the agency’s use of
success stories has served mainly to market companies’ technologies



                              rather than to measure results. The agency lets the companies prepare and
                              publish their success stories in such publications as its Tech Briefs
                              magazine, which reaches an audience of about 220,000 readers. The
                              agency’s only role in this effort has been to provide a common format for
                              the stories. The format requires the companies to present their stories in a
                              four-step series: (1) a description of the innovation, (2) the
                              accomplishments, (3) commercialization, and (4) government/science
                              applications.

                              In general, our review of the “success stories” approach indicated that it is
                              being used extensively but that its purpose varies. The National Science
                              Foundation’s approach is intended to provide a comprehensive survey of
                              commercialization results, whereas the National Aeronautics and Space
                              Administration uses “success stories” mainly to help its winning
                              companies market their technologies. In addition, the success stories
                              approach has not led to the development of carefully structured questions.
                              The approach is “open-ended,” meaning that it can be used to develop a
                              detailed story for individual companies but does not lend itself to greater
                              systemization. A further shortcoming is its omission of less successful
                              projects, which tends to bias the results of this approach.


An Academic Approach Illustrates a Different Set of Methods but Contains Similar Criteria About Sales and Job Creation

                              Academic studies have also focused on the program’s commercial
                              outcomes. One of the leading specialists in this area stated in a paper
                              presented in October 1998 at a National Research Council workshop on
                              SBIR that, as the number of public venture capital programs such as SBIR
                              has grown, policymakers and economists are increasingly grappling with
                              the question of how to assess these programs. The paper pointed out that
                              one of the main academic approaches is to examine the long-run impact of
                              participation in public venture capital programs on the growth of the firms
                              themselves, relative to a matched set of firms.

                              This approach is directly related to the discussion of commercialization in
                              this chapter. The specialist at the workshop provided an example of it in
                              another paper.2 The paper analyzed a sample of firms that had received
                              SBIR awards and compared them with a closely matching set of firms that
                              had not received awards during the same time period. The comparison
                              focused on the impact of participation in the program on sales and
                              employment. The analysis found that the mean increase in both
                              employment and sales from the end of 1985 to the end of 1995 was higher

                              2
                              Josh Lerner, The Government as Venture Capitalist: The Long-Run Impact of the SBIR Program,
                              Working Paper 5753, National Bureau of Economic Research (Sept. 1996).



                         for SBIR firms (a boost of 26 versus 5 employees and $5 million versus
                         $2 million in sales). The specialist pointed out several limitations of this
                         approach, including the fact that it does not measure the increase in a
                         firm’s value. Because over 98 percent of the firms were privately held,
                         assessing the valuation and profitability of the awards was very difficult.


A Frequent Winner Has Developed a Method for Evaluating Commercialization

                         The chief executive officer of a frequent winner developed a new
                         approach in a March 1998 paper.3 The paper states that although numerous
                         methods could be used to gauge the SBIR program’s success, his approach
                         focuses on the follow-on funding, or sales, achieved. It also states that SBIR
                         awards span a wide range of commercial potential, from those awards
                         aimed at highly commercializable technologies to those that address
                         narrow, mission-specific requirements with little or no follow-on potential.
                         It adds that, for awards in the latter category, it is important not to
                         penalize the contractor who successfully responds to such solicitations. It
                         then separates projects into agency-specific, commercially viable, and
                         dual-use categories and contends that the ability to accurately apply such
                         classifications was confirmed by relatively little deviation among various
                         observers, including an advisory board representing six venture capital
                         firms. Subsequently, in calculating the company’s return on the SBIR
                         investment, it eliminates the “agency-specific projects” that are judged at
                         the outset to have virtually no commercial potential. It concludes that
                         when these projects are deducted, the company shows a successful rate of
                         return on the SBIR investment. A more detailed analysis, according to the
                         paper, reveals that only three or four of the phase II awards in the
                         “commercially viable” category accounted for more than two-thirds of all
                         follow-on funding over a 10-year period and that, in each of these cases,
                         the follow-on business could not have been reliably predicted. At the end,
                         the paper strongly recommends that an analysis of other multiple-award
                         winners be carried out in this manner.




3. Robert F. Weiss, “Analysis of Follow-on Funding Generated by Major SBIR Award Winners—The Case of Physical Sciences Inc.”, Physical Sciences, Inc. (Mar. 1998).
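
A minimal sketch of the classification-and-return calculation outlined in the paper follows. The project categories mirror those described above, but the project list, dollar amounts, and the exact return formula are hypothetical assumptions, not figures from the paper.

    # Sketch: exclude "agency-specific" awards before comparing follow-on
    # funding with the SBIR investment. All amounts are invented.
    projects = [
        # (category, sbir_award_dollars, follow_on_funding_dollars)
        ("commercially viable", 750_000, 6_200_000),
        ("dual-use",            600_000, 1_100_000),
        ("agency-specific",     500_000,         0),
        ("agency-specific",     450_000,         0),
    ]

    def return_on_sbir(projects, exclude=("agency-specific",)):
        # Follow-on funding per dollar of SBIR award, over the counted projects.
        counted = [p for p in projects if p[0] not in exclude]
        invested = sum(p[1] for p in counted)
        follow_on = sum(p[2] for p in counted)
        return follow_on / invested if invested else 0.0

    print("All projects:             ", round(return_on_sbir(projects, exclude=()), 2))
    print("Excluding agency-specific:", round(return_on_sbir(projects), 2))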







A Standard Approach Involves the Use of Uniform Criteria for Success and Improvements in SBA’s New Database

The methods we identified do not provide consistent information across agencies on the program’s results. The use of a single method with uniform criteria for success focusing on outcomes would produce such information, enabling SBA and the agencies to satisfy the requirements of the Results Act. The expansion of SBA’s SBIR database affords an opportunity to standardize the reporting of results. The previous SBA database contained two general data fields for the results of SBIR awards, but they were vague, optional, and seldom used. To overcome this limitation, standard criteria can be identified and turned into specific data fields, capturing a variety of commercial and other measures. This approach, if implemented, will make available, for the first time, a central database for the program that allows for the effective evaluation of its commercial outcomes and other measures of success.
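
As an illustration of how standard criteria could become specific data fields, the following sketch shows one possible record layout. The field names and example values are assumptions made for illustration; they are not SBA’s actual Tech-Net schema.

    # Hypothetical outcome-oriented data fields of the kind the report suggests.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class AwardOutcome:
        company_id: str              # unique identifier, not the company name
        award_number: str
        phase: int                   # 1 or 2
        sales_to_date: float         # cumulative sales attributed to the award
        follow_on_funding: float     # non-SBIR developmental funding received
        other_outcome: Optional[str] = None  # e.g., agency use of the technology

    # Uniform fields like these would let SBA report the same outcome
    # measures for every agency's awards.
    example = AwardOutcome("C-000123", "A-98-0456", 2, 1_250_000.0, 400_000.0)
    print(example)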


An Opportunity Exists to Respond to the Results Act by Using Standardized Criteria for Success and Capturing Outcomes in SBA’s New Tech-Net Database

The Government Performance and Results Act of 1993 was intended, among other purposes, to improve the effectiveness of federal programs and enhance public accountability by promoting a new focus on results. In 1997, the Congress specified that information on the SBIR program must be included by each federal agency in the updates or revisions to its strategic plan required by the Results Act (15 U.S.C. 638(t)). As the central administrative agency for the program, SBA has maintained a governmentwide database that brings together the data submitted by the individual agencies participating in the program. According to the Assistant Administrator for Technology who oversees the program, SBA has used the database primarily to develop its annual reports on the program and to accomplish other purposes as required by the SBIR legislation. Currently, however, SBA is developing a new database called Tech-Net. This effort provides a unique opportunity to address the shortcomings of the previous database. It may also help agencies respond to the Results Act by using standardized criteria for success and capturing the commercial and other outcomes of SBIR activities.

                             For the purpose of measuring these outcomes, the original SBA database
                             has had two major shortcomings. First, because it was developed long
                             before the Results Act emphasized the measurement of outcomes, the
                             database reflects the earlier attention given to inputs. It consists of 62
                             “fields,” or specific pieces of information, such as the name of each
                             company and the amount of funding that it received. Although the
                             database includes two fields for information on the results of awards,
                             according to the database manager, these fields capture only the
companies’ general expectations of benefits (such as cost savings or more efficient service) at the time of receiving a phase I or phase II award. In
addition, the use of these data fields was optional, so some of the
companies did not fill them out. In general, companies have not provided
information on the actual (as opposed to the anticipated) results of their
research. Second, the database contains unreliable information. One key
reason for unreliable data is the lack of a unique identifying code for each
company in SBA’s current database. Identification has depended simply on
the company’s name. Slight variations in spelling, however, have created
difficulty because the database is not able to recognize these differences
and thus counts each separate spelling as a separate company.
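
The following sketch illustrates the counting problem described above. The company names are invented, and the unique identifier field is an assumption about how such a code might be stored.

    # Counting distinct name strings overstates the number of companies,
    # while a unique identifier does not. All records are invented.
    awards = [
        {"company_id": "C-0001", "company_name": "Acme Research, Inc."},
        {"company_id": "C-0001", "company_name": "Acme Research Inc"},
        {"company_id": "C-0001", "company_name": "ACME RESEARCH, INC."},
        {"company_id": "C-0002", "company_name": "Beta Optics"},
    ]

    by_name = {a["company_name"] for a in awards}
    by_id = {a["company_id"] for a in awards}

    print("Companies counted by name string:", len(by_name))  # 4 (overstated)
    print("Companies counted by unique ID:  ", len(by_id))    # 2 (correct)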

In June 1998, SBA announced the introduction of a new database called
Tech-Net at a meeting of program managers. This new system is an
Internet-based database containing SBIR awards, as well as awards and
information associated with other technology programs. SBA describes
Tech-Net as an electronic gateway of technology information and
resources for and about small high-technology businesses. It provides a
search engine for researchers, scientists, and government officials; a
marketing tool for small firms; and a potential link to investment
opportunities for investors and other sources of capital. It will enable
agencies to update their information on SBIR awards and companies to
update key information on their activities. The previous information will
be preserved in a special archive. The entire abstract of each award will be
a source of keywords, allowing searches not only of the current
information but also of the data saved in the archives. Thus, a key feature
of the system will be its ability to show changes in the program over time.
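
A minimal sketch of the kind of keyword search over current and archived abstracts described above follows. The records and the search function are hypothetical illustrations, not the actual Tech-Net system.

    # Searching both current and archived award abstracts for a keyword.
    # All records are invented.
    current = [
        {"award": "A-99-001", "abstract": "Compact diode laser for remote sensing"},
    ]
    archive = [
        {"award": "A-92-044", "abstract": "Laser diode arrays for materials processing"},
    ]

    def search(keyword, *record_sets):
        keyword = keyword.lower()
        return [r["award"]
                for records in record_sets
                for r in records
                if keyword in r["abstract"].lower()]

    # Searching the current data and the archive together shows how the
    # program's research topics have changed over time.
    print(search("laser", current, archive))  # ['A-99-001', 'A-92-044']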

SBA is taking steps to implement Tech-Net and to ensure that it keeps a
more accurate record of company names than the previous database. SBA’s
Assistant Administrator for Technology emphasized that implementing
Tech-Net by the spring of 1999 was a priority. As part of this effort, he
plans to send a letter to every company that has received an award since
the start of the program. The letter will contain a unique
user-identification number for each company to prevent confusion over
the identity of participants. In December 1998, SBA sponsored a technical
meeting of SBIR database managers representing numerous agencies to
determine how much difficulty, if any, they would have in submitting the
data required by Tech-Net in a common electronic format. The managers,
whose agencies are required by law to submit data on the program to SBA,
were optimistic about their ability to provide whatever data SBA requested.
In talking with us about the inclusion of outcome-related data fields, the database managers at SBA were also optimistic about their ability to expand Tech-Net to capture this information.


Conclusions

The commercial outcomes of the SBIR program have been the subject of numerous evaluations that have not followed the same approach but have focused on many of the same criteria for measuring the program’s success. An opportunity exists to identify the most useful and uniform criteria for success and to build data fields that capture results against those criteria into the new Tech-Net database at SBA.


Recommendation to the Administrator, SBA

To respond to the Government Performance and Results Act, we recommend that the Administrator develop standard criteria for measuring the commercial and other outcomes of the SBIR program and incorporate these criteria into the new Tech-Net database. The criteria should include uniform measures of sales, developmental funding, and other indicators of success.


Agency Comments and Our Evaluation

SBA said it concurred with the recommendation, adding that for the recommendation to work, the participating federal agencies must agree to provide SBA with information on the outcomes of their projects. It also stated that any action by the Congress must include a provision that will require the participating federal agencies to provide this critical information to SBA through the new Tech-Net database system. However, agencies are already required to report information on their SBIR awards to SBA. Additional information on the outcomes of projects could be included with this submission. Our recommendation would simply provide for consolidating the information in a uniform format in the Tech-Net database.

                     The National Aeronautics and Space Administration, the Environmental
                     Protection Agency, and the National Institutes of Health commented on
                     this recommendation. In general, their concerns focused on the entry,
                     maintenance, safeguards, reliability, and commercial emphasis of the data
                     to be captured in the Tech-Net database. The National Aeronautics and
                     Space Administration expressed concerns about data safeguards, data
                     reliability, and incentives to firms to provide the data. It also asked us to
                     furnish specific measurements and details for implementation. The
                     Environmental Protection Agency questioned its ability to require
information from the companies. The National Institutes of Health expressed concern that the Tech-Net database is assumed to be the
correct and single approach even though agencies have widely varying
missions and preferences for evaluating their own programs. The
Institutes raised questions about the commercial emphasis of the data to
be entered, who is responsible for entering and validating the data, what
level of compliance is to be expected, what incentives exist for grantees to
submit data, and how reliable the data are likely to be.

In making this recommendation, we recognized that implementation issues such as those the agencies have identified would arise. We did
not include additional detail because we believe that SBA and the program
agencies are in the best position to identify and resolve these issues. The
effective implementation of this recommendation will require close
cooperation among the participating companies, the program agencies,
and SBA.

Our recommendation may be helpful in addressing concerns about the
reliability of the data to be submitted. Previous approaches, such as
questionnaire surveys, were labor-intensive and the results were difficult
to verify. The information in a current, centralized database could be
sampled more easily in a systematic way to verify its accuracy. In response to the concern expressed by the National Institutes of Health about the widely differing missions of the agencies, we added a reference to other indicators of success in our recommendation, reflecting our recognition of the need for flexibility in identifying successful outcomes.
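
The following sketch illustrates how records in a centralized database might be sampled for verification, as suggested above. The records, sample size, and verification steps are hypothetical assumptions.

    # Drawing a reproducible random sample of award records for verification.
    # All records are invented.
    import random

    award_records = [{"award": f"A-98-{n:04d}", "reported_sales": n * 1_000}
                     for n in range(1, 501)]

    random.seed(0)  # reproducible sample for illustration
    sample = random.sample(award_records, k=25)  # e.g., verify 5 percent

    # Each sampled record could then be checked against the company's own
    # documentation to estimate the accuracy of the database as a whole.
    print(len(sample), "records selected for verification")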




Appendix I

SBIR Phase I Award/Proposal Ratios in
Fiscal Year 1998, by Agency


                                              Non-EPSCoR states                          EPSCoR states
                                     Proposals   Awards   Award/proposal     Proposals   Awards   Award/proposal
Agency                                received     made   ratio (percent)     received     made   ratio (percent)
Department of Defense                    8,543    1,200             14.0           557       59             10.6
National Institutes of Health
  (grants only)                          2,311      667             28.9           129       25             19.4
National Aeronautics and Space
  Administration                         2,183      318             14.6           152       27             17.8
National Science Foundation              1,439      212             14.7            95       22             23.2
Department of Energy                     1,120      191             17.1            71       13             18.3
Department of Commerce                     351       39             11.1            23        6             26.1
Department of Agriculture                  324       57             17.6            96       20             20.8
Environmental Protection Agency            294       35             11.9            27        2              7.4
Department of Transportation               232       19              8.2            14        2             14.3
Department of Education                    218       39             17.9            13        2             15.4
Total                                   17,016    2,777             16.3         1,176      178             15.1
                                           Source: GAO’s analysis of data from agencies participating in the SBIR program.




Appendix II

Comments From the Small Business
Administration

Note: GAO comments
supplementing those in the
report text appear at the
end of this appendix.




See comment 1.








GAO Comment

The following is GAO’s comment on the Small Business Administration’s letter dated May 11, 1999.

              1. This concern is addressed in the discussion of agency comments at the
              end of the executive summary and of chapter 4.




Appendix III

Comments From the Department of Defense


Note: GAO comments
supplementing those in the
report text appear at the
end of this appendix.




See comment 1.












GAO Comment

The following is GAO’s comment on the Department of Defense’s letter dated April 20, 1999.

              1. We agree with the Department’s revision of its plan and believe that the
              new approach will help avoid the unintended consequences that we
              discussed in our report. We have updated our report to reflect the
              Department’s revision.




Appendix IV

Comments From the Department of
Commerce




Appendix V

Comments From the Department of
Education




Appendix VI

Comments From the Department of
Transportation




Appendix VII

Comments From the Department of
Agriculture

Note: GAO comments
supplementing those in the
report text appear at the
end of this appendix.








See comment 1.








GAO Comment

The following is GAO’s comment on the Department of Agriculture’s letter dated April 26, 1999.

              1. While we recognize the commercial breadth exhibited by the
              Department’s reported results, governmentwide surveys performed in 1996
              and 1998 by a support contractor for the Department of Defense and SBA
              showed that only 39 percent of the projects responding to the surveys
              reported sales. The concern about using commercialization as the primary
              goal for evaluating SBIR proposals remains valid in view of the great
              concentration of commercial success in a very small percentage of
              projects.




Appendix VIII

Comments From the National Aeronautics
and Space Administration

Note: GAO comments
supplementing those in the
report text appear at the
end of this appendix.




See comment 1.








GAO Comment

The following is GAO’s comment on the National Aeronautics and Space Administration’s letter dated April 22, 1999.

              1. We agree with this point about the need for close cooperation and have
              made additional comments at the end of the executive summary and of
              chapter 4.




Appendix IX

Comments From the Environmental
Protection Agency

Note: GAO comments
supplementing those in the
report text appear at the
end of this appendix.








See comment 1.








GAO Comment

The following is GAO’s comment on the Environmental Protection Agency’s letter dated April 21, 1999.

              1. We have noted the Environmental Protection Agency’s concern at the
              end of chapter 4. SBA and the program agencies will have to coordinate
              their efforts to resolve this and other issues.




Appendix X

Comments From the Department of Energy


Note: GAO comments
supplementing those in the
report text appear at the
end of this appendix.




See comment 1.








See comment 2.




See comment 3.












GAO Comments

The following are GAO’s comments on the Department of Energy’s letter dated April 22, 1999.

               1. We revised the report to delete these references.

               2. We revised the report to reflect the Department’s specific suggestions.

               3. The Department commented that our use of the term
               “commercialization record” to describe information required for
               evaluating commercial potential and information required from companies
               with 15 or more phase II awards may be confusing. To avoid any
               confusion, we continue to use the term in connection with the evaluation
               of commercial potential and revised the report to avoid the use of the term
               in connection with frequent winners. The Department notes that the policy
               directive does not define what is meant by the potential for
               commercialization with regard to phase I proposals, nor does it suggest
               how this potential should be evaluated. However, the reauthorization act
               specifies that phase I ideas “appear to have commercial potential” as
               described in the law under phase II.




Appendix XI

Comments From the National Institutes of
Health

Note: GAO comments
supplementing those in the
report text appear at the
end of this appendix.








See comment 1.




See comment 2.












See comment 3.












See comment 4.












GAO Comments

The following are GAO’s comments on the National Institutes of Health’s letter dated April 27, 1999.

               1. While we recognize the inherent differences between grants and
               contracts, these differences do not eliminate the need to clarify the
               relative emphasis on commercialization and the program’s other goals. We
               made no changes in response to the Institutes’ comment. We also
               recognize that agencies differ in their evaluations of proposals but believe
               that, without such clarification, these differences may lead to unintended
               consequences, such as those that would have resulted from DOD’s
               emphasis on the commercialization record.

               2. We addressed this issue in our evaluation of agency comments at the
               end of the executive summary and of chapter 3.

               3. The Institutes express concern about developing standard criteria to
               measure commercial outcomes while at the same time acknowledging that
               the database appears to have some merit as a useful tool. Certain criteria,
               such as sales and additional funding, that agencies might agree upon
               would increase the ability of Congress to evaluate the program across
               agencies. Nevertheless, we do not envision this database being used to
               circumvent the judgments of individual agencies in making awards.

               4. The draft report reviewed by the National Institutes of Health also noted
               that the concentration of SBIR awards in certain states tends to reflect the
               concentration of federal research resources in general. The report
               acknowledges the Institutes’ and other agencies’ efforts to reach out to
               businesses in states with comparatively few SBIR awards.




Appendix XII

Comments From the National Science
Foundation




Appendix XIII

Major Contributors to This Report


Resources, Community, and Economic Development Division, Washington, D.C.: Dennis Carroll, Curtis Groves, Kathy Hale, Brad Hathaway, Victor Rezendes

Office of General Counsel: Mindi Weisenbloom




Ordering Information

The first copy of each GAO report and testimony is free.
Additional copies are $2 each. Orders should be sent to the
following address, accompanied by a check or money order
made out to the Superintendent of Documents, when
necessary. VISA and MasterCard credit cards are also accepted.
Orders for 100 or more copies to be mailed to a single address
are discounted 25 percent.

Orders by mail:

U.S. General Accounting Office
P.O. Box 37050
Washington, DC 20013

or visit:

Room 1100
700 4th St. NW (corner of 4th and G Sts. NW)
U.S. General Accounting Office
Washington, DC

Orders may also be placed by calling (202) 512-6000
or by using fax number (202) 512-6061, or TDD (202) 512-2537.

Each day, GAO issues a list of newly available reports and
testimony. To receive facsimile copies of the daily list or any
list from the past 30 days, please call (202) 512-6000 using a
touchtone phone. A recorded menu will provide information on
how to obtain these lists.

For information on how to access GAO reports on the INTERNET,
send an e-mail message with "info" in the body to:

info@www.gao.gov

or visit GAO’s World Wide Web Home Page at:

http://www.gao.gov



