A Toolkit
for Evaluating Public R&D Investment: Models,
Methods, and Findings from ATP's First Decade
CHAPTER 3: ATP's Evaluation
Program
Background:
Evaluation Drivers
In large measure,
the development of a strong evaluation program by ATP was internally
driven. ATP’s standing as an experimental undertaking, established
by the Omnibus Trade and Competitiveness Act of 1988, together with
the perspective of its first director, 57 contributed
to an environment of curiosity and learning. Evaluation was seen
as a tool of learning. A small amount of ATP’s initial 1990
budget was set aside by program officials to fund rudimentary evaluation
activities. This interest in evaluation continued to grow over time.
A time line of events and developments critical to ATP and its evaluation
program during its first decade is provided below:
- 1988 Omnibus Trade
and Competitiveness Act (P.L. 100–418) authorizes ATP.
- 1990 ATP receives
its first budget of $10 million.
- 1992 American Technology
Preeminence Act (P.L. 102–245) amends P.L. 100–418.
- 1993 Government Performance
and Results Act (GPRA) requires evaluation.
- 1993 Scale-up of
ATP under consideration.
- 1996 Progress report
on ATP’s impacts due to Congress.
- 1997 Secretary of
Commerce Daley orders a 60-day review of ATP.
- 1998 Senate Report
105–234 requests independent assessment of ATP.
The American Technology
Preeminence Act of 1991, enacted in 1992, amended the legislation establishing
ATP, and directed that:
The Secretary [of
the Department of Commerce] shall, not later than 4 years after the
date of enactment of this Act, submit to each House of the Congress
and the President a comprehensive report on the results of the Advanced
Technology Program ... including any activities in the areas of high-resolution
information systems, advanced manufacturing technology, and advanced
materials.
The knowledge that it would have to report on results no later than 1996
gave ATP another reason to strive to build its evaluation capabilities.
Because of these and other
forces at work, ATP was able to take passage of the GPRA in stride. The
GPRA’s requirement for the reporting of particular types of performance
metrics influenced ATP to produce those measures. Measures that ATP considered
important, such as spillover measures and collaboration results, seemed
to fit GPRA requirements less well, while numbers of projects funded
and completed, and other accomplishments that lent themselves to counting
and trend lines, seemed to fit better. Therefore, ATP established the
tracking of a set of performance metrics acceptable for GPRA reporting,
while continuing its in-depth, more complex economic and sociological
evaluation studies that were highly informative of the workings and long-term
performance of the program, though impossible to grasp in a single number
or trend line.
With a scale-up of ATP
under consideration in 1993 and a change in the makeup of Congress in
1994, the program became the object of considerable debate and intense
scrutiny. The General Accounting Office and the Office of the Inspector
General carried out a number of studies of the program, and members of
Congress asked the program to address many questions. In this environment,
the importance of evaluation increased for ATP.
Preparation of the mandated
report to Congress in 1996 also had a stimulating effect on further evaluation,
providing a good opportunity to assess progress and identify shortcomings.
Former Secretary of Commerce Daley’s direction to the program the
next year to review certain aspects of its operation gave further impetus
to evaluation.
More recently, in 1998,
the U.S. Senate directed ATP to arrange for a well-regarded organization
with significant business and economic experience to conduct a comprehensive
assessment of the program, analyzing how well it has performed against
the goals established in the authorizing statute. This directive led
to an extensive review of ATP from 1999 through 2001, by the National
Academy of Sciences/National Research Council’s Board on Science,
Technology, and Economic Policy, drawing on papers presented at academy-organized
workshops and roundtables and the body of work on ATP.
ATP’s Evaluation
Logic Model
In-depth knowledge
of ATP’s structure, mission, operational mechanisms, program
features, and intended impacts was essential to developing its evaluation
program. Fleshing out the skeletal, generic logic model of Figure
2–1 to provide a specific logic model for ATP provides a useful
framework for understanding how its evaluation program was formed.
Figure 3–1, supplemented by Table 3–1, depicts an evaluation
logic model for ATP.
Reading Figure 3–1
from top down, ATP began with a congressional goal and approach, which,
stated in broadest terms, was to increase national prosperity and quality
of life by providing funding for the development of new technologies. Faced
with a suite of alternative public policy strategies for supporting science
and technology, Congress adopted a public-private partnership program
as one element of an overall strategy to meet its goal. Congress authorized
establishment of ATP, defined its mission, and provided direction to
the formulation of the new program’s operational mechanisms, features,
and intended impacts, which were further elaborated by the U.S. Department
of Commerce through the federal rulemaking process. Table 3–1 lists
major goals specified in ATP’s mission and highlights some of the
program’s more important operational mechanisms and features.
ATP was directed to increase
the nation’s scientific and technical knowledge base, expand and
accelerate development and commercialization of generic (referred to
synonymously as “enabling”) technologies, promote collaborative
R&D, refine manufacturing processes, and increase the competitiveness
of U.S. firms. It was directed to generate broad-based benefits for the
nation; that is, benefits extending beyond the relatively narrow population
of award-recipient organizations. A constraint was included that the
program should ensure the “appropriate” participation of
small businesses.
Rather than using an outright,
no-strings-attached grant as the award mechanism, ATP uses cooperative
agreements to enter into cost-sharing arrangements with award recipients.
Awarded funds can be applied only to approved costs of research. Projects
are selected from proposals submitted to ATP that are peer reviewed against
published selection criteria. Each award is for a specific project with
both an R&D and a business/economics plan, and with well-defined
goals and a limited duration.
There are two routes by
which the program is designed to deliver impacts and achieve broad-based
benefits: a direct route by which ATP award recipients and their collaborators
accelerate development and commercialization of technologies that lead
directly to private returns and market spillovers, and an indirect route
by which publications, presentations, patents and other means of knowledge
generation and dissemination lead to knowledge spillovers. Market and
knowledge spillovers from the program are looked to as a primary means
for broadening the impact of funded projects substantially beyond the
direct award recipients.
Table
3–1. ATP’s Mission, Mechanisms, and Features
MISSION
SPECIFICATION
- Add
to the nation’s scientific and technical knowledge
base
- Foster
expanded/accelerated technology development and commercialization
by U.S. firms
- Promote
collaborative R&D
- Refine
manufacturing processes
- Ensure
appropriate small-business participation
- Increase
competitiveness of U.S. firms
- Generate
broadly based benefits
OPERATIONAL
MECHANISMS AND FEATURES
- Cooperative
agreements with industry for industry-led, cost-shared
research
- Focus
on high-risk research to develop enabling technologies
- Competitive
selection of projects using peer review and published
criteria
- Sunset
provisions for all funded projects
- Requirement
that all projects have well-defined goals and identified
pathways to technical and economic impacts
Figure
3–1. ATP’s Evaluation Logic Model
The middle tier of Figure
3–1 shows the program’s inputs, outputs, outcomes, and impacts.
Program inputs derive from congressional appropriations that provide
budgets for making awards, convening staff to administer the process,
and providing for equipment, facilities, and other administrative costs.
Principal outputs include the funded projects, collaborative relationships
formed as a result of the program, publications, patents, models and
algorithms, and prototype products and processes. Principal outcomes
include sales of new and improved products, processes, and related services;
productivity effects on firms; changes in firm size and industry size;
a change in the propensity of firms and other organizations to collaborate;
the spread of resulting knowledge through citations of publications and
patents and by other means; and knowledge and market spillovers as others
adopt the funded innovations. Longer-term impacts relate back to the
broad societal goal that drove the program’s creation, including
increased GDP, employment gains, improvements in the quality of life
through improvements in the nation’s health, safety, and environment,
and improved international competitiveness of U.S. industry. Impacts
may also include an effect on the nation’s capacity to innovate.
Figure 3–1 also indicates “process dynamics,” which
refers to the transformations through which program inputs, outputs,
outcomes, and impacts are linked. These transformations are complex,
and there is much to learn about them.
The lower tier of Figure
3–1 ties the evaluation strategies and objectives to the program.
Evaluation focuses on the inputs, outputs, outcomes, impacts, and process
dynamics of a program. Evaluation objectives include tracking progress
of funded projects, using, for example, indicator metrics; understanding
process dynamics; estimating benefits and costs of projects and of the
program overall; identifying the more difficult to measure effects; relating
findings back to the program’s mission; and applying tests of success
(discussed in the following section). Additional objectives include disseminating
evaluation results and feeding results back to program administrators
to improve the program and to policy makers to inform them and to meet
reporting requirements. Evaluation methods and tools used to achieve
these objectives include those presented in Chapter
2.
Conceptual Tests of ATP’s
Success
One of ATP’s
central missions is to produce broad-based economic benefits. This
suggests a test of success stated primarily in economic terms, and
was a major factor in ATP’s decision to press the use of economic
methods of evaluation. However, ATP’s mission is complex and
multidimensional, and a single test of success is inadequate.
The following four tests
can help define ATP’s accomplishment of its central mission, provided
they are applied after sufficient time has passed to offer a fair test: 58
- Test
1: Has the portfolio of ATP-funded projects
produced large net social benefits for the nation?
- Test
2: Has the portfolio of ATP-funded projects
contributed to enhanced United States economic and technological
competitiveness?
- Test
3: If Test 1 is met, is a large share of
the benefits attributable to ATP?
- Test
4: Regarding the distribution of net benefits,
do they extend well beyond the direct ATP award recipients?
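The arithmetic implied by Tests 1, 3, and 4 can be sketched in a few lines. The figures below are entirely hypothetical and are used only to show how a benefit-cost study might operationalize the tests; a real assessment would estimate each quantity with the economic methods discussed in Chapter 2.

```python
# Hypothetical sketch of the arithmetic behind Tests 1, 3, and 4.
# All dollar figures are invented for illustration.

def net_social_benefit(social_benefits, total_costs):
    """Test 1: net benefits to the nation from the project portfolio."""
    return social_benefits - total_costs

def attributable_share(benefits_with_atp, benefits_without_atp):
    """Test 3: fraction of benefits attributable to ATP, i.e., benefits
    beyond what would have occurred absent the program."""
    return (benefits_with_atp - benefits_without_atp) / benefits_with_atp

def spillover_share(social_benefits, private_benefits_to_recipients):
    """Test 4: fraction of benefits accruing beyond direct award recipients."""
    return (social_benefits - private_benefits_to_recipients) / social_benefits

# Hypothetical portfolio estimates (millions of dollars)
social = 500.0          # estimated total benefits to the nation
costs = 200.0           # public plus private project costs
private = 150.0         # benefits captured by award recipients
counterfactual = 100.0  # benefits expected without ATP funding

print(net_social_benefit(social, costs))           # 300.0
print(attributable_share(social, counterfactual))  # 0.8
print(spillover_share(social, private))            # 0.7
```

Test 2, by contrast, concerns competitiveness and does not reduce to a single ratio of this kind.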
Additional criteria are
needed to test for achievement of other supporting objectives while holding
to program constraints. These other supporting objectives and constraints
include building the scientific and technical knowledge base, fostering
collaborative research, refining manufacturing, and ensuring appropriate
participation of small businesses.
ATP’s Approach to
Evaluation
Who Evaluates?
ATP’s approach
has been to try to capture the advantages and avoid the disadvantages
of relying solely on either in-house evaluations or outside contractors.
The ATP formed a core in-house group responsible for planning, guiding,
and monitoring its evaluation efforts, for establishing and maintaining
certain databases needed both for evaluation and program management,
and for carrying out studies that could best be performed in-house. 59
ATP has also used a variety
of outside contractors to carry out evaluation studies. It formed an
early association with the National Bureau of Economic Research (NBER)
to ensure that it obtained the services of leading evaluators, as well
as to provide another level of independent review of the studies. In
addition, it formed a panel to review proposed evaluation studies. ATP
periodically held a roundtable of notable evaluators, who were invited
to hear and comment on presentations of planned and completed studies.
ATP obtained additional reviews of evaluation studies from outside reviewers.
ATP cooperated with other program assessors, including assessors from
the General Accounting Office (GAO), the Office of Inspector General
(OIG), and the National Academy of Sciences (NAS), to provide evaluation
materials and results from surveys.
This strategy of combining
in-house capability with substantial support from outside evaluators
was designed to keep ATP’s evaluations focused, relevant, and efficient
while ensuring credibility and gaining multiple perspectives, talents,
and experience.
A Variety of Methods
Used in a Portfolio Approach
One strength of ATP’s
evaluation program has been its strategy to use a variety of methods
to evaluate program effects, choosing the best method for the task rather
than focusing on a single method. Taking this multi-faceted approach,
ATP has given more attention to some evaluation methods than to others
during its first decade. Differential emphasis on the methods resulted
from programmatic considerations including mission relevancy, the time
delay before certain outputs and outcomes could result, and the need
to respond to specific executive or legislative requests for information.
Figure 3–2 suggests the timing and relative intensity with which
the various evaluation methods have been applied to ATP.
Figure
3–2. Intensity of ATP’s Use of Evaluation Methods
*These 81 methods are employed
in the 45 ATP studies commissioned between 1990 and 2000 that are examined
in this report.
Building a Portfolio
of Evaluation Studies
The sequence of evaluation
studies commissioned by ATP over the past 10 years reveals how evaluation
of the program evolved. Tables 3–2, 3–3, and 3–4 show
the sequence of studies and indicate the principal method(s) used by
each study. Two features are particularly notable: the increasing volume
of studies as ATP ended its first decade and the increasing sophistication
and greater focus of the studies funded. Starting with conceptual studies
and surveys, ATP added case studies and econometric/statistical studies,
and more recently undertook patent citation-tracing studies and network
analysis.
Table
3–2. Select ATP Studies Commissioned and Completed, 1991–1995
| STUDY
COUNT |
STUDY
NAME |
PUBLICATION
YEAR AND DOCUMENT NUMBER |
AUTHOR
AND AFFILIATION |
SUBJECT |
METHOD
USED (principal method listed first) |
| 1 |
Measuring the
Economic Impact of the Advanced Technology Program: A Planning
Study |
1992 (Unpublished) |
Albert Link, Univ.
of N.C.-Greensboro |
Program performance
metrics |
Informing underlying
program theory |
| 2 |
Advanced Technology
Program: An Assessment of Short-Term Impacts—First Competition
Participants |
1993 |
Samantha Solomon,
Solomon Associates |
Indicators of
progress toward goals |
Survey |
| 3 |
Estimating Social
and Private Returns from Innovations Based on the Advanced Technology
Program: Problems and Opportunities |
1999 NIST GCR
99–780 |
Edwin Mansfield
(deceased), Univ. of Penn. |
Method of measuring
private returns and market spillovers |
Modeling underlying
program theory |
| 4 |
Economic Analysis
of Research Spillovers: Implications for the Advanced Technology
Program |
1997 NIST GCR
97–708 |
Adam Jaffe, Brandeis
Univ. |
Market, knowledge,
and network spillovers: what they are, how they arise, and how
they may be deliberately pursued |
Modeling underlying
program theory |
| 5 |
Survey of Advanced
Technology Program; 1990–1992 Awardees: Company Opinion
About the ATP and Its Early Effects |
1996 |
Bohne Silber,
Silber & Associates |
Indicators of
progress toward goals and customer feedback |
Survey |
| 6 |
The ATP’s
Business Reporting System: A Tool for Economic Evaluation |
1996 |
Jeanne Powell,
ATP |
Use of electronic
survey to compile progress data from ATP participants |
Survey plan |
| 7 |
Advanced Technology
Program Case Study: The Development of Advanced Technologies
and Systems for Controlling Dimensional Variation in Automobile
Body Manufacturing |
1997 NIST GCR
97–709 |
CONSAD Research
Corporation |
Economic impacts
of improved dimensional control in assembling vehicles resulting
from a joint venture project led by the Auto Body Consortium |
Economic case
study; expert judgment |
| 8 |
Advanced Technology
Program; Early Stage Impacts of the Printed Wiring Board Research
Joint Venture, Assessed at Project End |
1997 NIST GCR
97–722 |
Albert Link, Univ.
of N.C.- Greensboro |
Economic impacts
(mainly in terms of cost saving) of improved process technology
for the Printed Wiring Board industry resulting from a joint
venture project led by National Center for Manufacturing Sciences |
Economic case
study; survey |
| Table
3–3. Select ATP Studies Commissioned
and Completed, 1996–1999 |
| 9 |
Acceleration of
Technology Development by the Advanced Technology Program |
1997 NISTIR 6047 |
Frances Laidlaw,
ATP and G.W. Univ. |
Impact of ATP
on R&D cycle time |
Survey |
| 10 |
Development, Commercialization,
and Diffusion of Enabling Technologies |
1997 NISTIR 6098 |
Jeanne Powell,
ATP |
Assessment using
1995 Business Reporting System (BRS) data of progress of 480
companies and 210 projects funded 1993–1995 |
Survey; indicator
metrics; bibliometrics |
| 11 |
Small-Firm Experience
in the Advanced Technology Program |
1996 |
Jeanne Powell,
ATP |
Comparison of
performance of small-firm awardees with all-firm awardees |
Survey |
| 12 |
A New Lexicon
and Framework for Analyzing the Internal Structures of the U.S.
Advanced Technology Program and Its Analogues Around the World |
1998 Journal of
Technology Transfer 23 (2):5–10 |
Connie Chang,
ATP |
Comparison of
ATP and similar programs abroad in terms of their key features |
Modeling underlying
program theory |
| 13 |
Advanced Technology
Program’s Approach to Technology Diffusion |
1999 NISTIR 6385 |
Rosalie Ruegg,
ATP |
How ATP promotes
early adoption/diffusion of technologies it funds by influencing
project structure and firm behavior |
Modeling underlying
program theory |
| 14 |
Business Planning
and Progress of Small Firms Engaged in Technology Development
through the Advanced Technology Program |
1996 NISTIR 6375 |
Jeanne Powell,
ATP |
Comparison of
small-firm performance with that of medium and large firms |
Survey |
| 15 |
Publicly Supported
Non-Defense R&D: The U.S.A.’s Advanced Technology Program |
1997 Science and
Public Policy 24(1): Feb. issue* |
J-C Spender, New
York Inst. of Tech. |
Theoretical justification
of ATP as promoting trajectories through the U.S. innovation
space |
Modeling underlying
program theory |
| 16 |
Framework for
Estimating National Economic Benefits of ATP Funding of Medical
Technologies |
1998 NIST GCR
97–737 |
Sheila Martin
et al., RTI |
Model for estimating
social, private, and public returns and application to seven
tissue engineering projects |
Economic case
study; expert judgment |
| 17 |
Papers and Proceedings
of the Advanced Technology Program’s International Conference
on the Economic Evaluation of Technological Change: |
1998 (conference);
2001 NIST SP 952 |
Richard Spivack,
ATP |
Conference themes
included public policy issues, policy goals and program design,
evaluation of programs, and evaluation metrics |
Modeling underlying
program theory |
| 18 |
Performance of
Completed Projects, Status Report 1 |
1999 NIST SP 950–1 |
William Long,
Business Performance Research Associates |
Collection of
mini-case studies of first 38 completed projects with output
and outcome data compiled according to a common template |
Case study; indicator
data; bibliometrics; informing underlying program theory |
| 19 |
Economic Impacts
of Flow-Control Machining Technology: Early Applications in the
Automobile Industry |
1999 NISTIR 6373 |
Mark Ehlen, NIST |
Economic impacts
of adopting flow-control machining technology in vehicle production |
Economic case
study |
| 20 |
Capital Formation
and Investment in Venture Markets: Implications for the Advanced
Technology Program |
1999 NIST GCR
99–784 |
Paul Gompers and
Josh Lerner, Harvard Univ. |
Availability of
private-sector funding for startup company R&D |
Informing underlying
program theory; descriptive case study |
| 21 |
The Advanced Technology
Program: Challenges and Opportunities |
1999 NAS Press |
Charles Wessner,
NRC |
First of 2 reports
on ATP; this one summarizing deliberations of a symposium on
ATP |
Expert judgment
informed by other methods |
| Table
3–4. Select ATP Studies Commissioned or Completed in
2000 |
| 22 |
Advanced Technology
Program; Information Infrastructure for Healthcare Focused Program:
A Brief History |
2000 NISTIR 6477 |
Bettijoyce Lide
and Richard Spivack, ATP |
Genesis of ATP’s
Information Infrastructure for Healthcare Focused Program |
Descriptive case
study |
| 23 |
Reinforcing Interactions
between the Advanced Technology Program and State Technology
Programs; vol. 1: A Guide to State Business Assistance Programs
for New Technology Creation and Commercialization |
2000 NIST GCR
00–788 |
Marsha Schachtel
and Maryann Feldman, Johns Hopkins Univ. |
How state programs
work in combination with ATP to assist new technology creation
and commercialization |
Modeling underlying
program theory |
| 24 |
Managing Technical
Risk: Understanding Private Sector Decision Making on Early Stage,
Technology-Based Projects |
2000 NIST GCR
00–787 |
Lewis Branscomb,
Harvard Univ.; Kenneth Morse, MIT; Michael Roberts, Harvard Univ. |
Funding gap for
high-risk research |
Modeling underlying
program theory; expert judgment |
| 25 |
Estimating Future
Consumer Benefits from ATP-Funded Innovation: The Case of Digital
Data Storage |
2000 NIST GCR
00–790 |
David Austin and
Molly Macauley, Resources for the Future |
A quality-adjusted
cost index method to estimate expected returns to investments
in new technologies |
Emerging method:
cost-index method; econometric/statistical; economic case study |
| 26 |
Reinforcing Interactions
between the Advanced Technology Program and State Technology
Programs; vol. 2: Case Studies of Technology Pioneering Startup
Companies and Their Use of State and Federal Programs |
2000 NISTIR 6523 |
Maryann Feldman,
Johns Hopkins Univ.; Maryellen Kelley, ATP; Joshua Schaff, New
York City Democracy Network; Gabriel Farkas, Dartmouth College |
Complementary
use by companies of ATP, state, and other federal programs to
assist them in developing technologies, and relationships among
state and federal programs |
Descriptive case
study; informing underlying program theory |
| 27 |
Advanced Technology
Program’s Commercialization and Business Planning Guide
in the Post-Award Period |
2000 NIST GCR
99–779 |
Jenny Servo, Dawnbreaker
Press |
Business planning
guide to increase the likelihood of commercial success of ATP
awardees in the post-award period |
Modeling underlying
program theory; descriptive case study |
| 28 |
Development, Commercialization,
and Diffusion of Enabling Technologies: Progress Report |
2000 NISTIR 6491 |
Jeanne Powell
and Karen Lellock, ATP |
Assessment using
1997 BRS data of progress of 539 companies and 261 projects funded
1993–1997 |
Survey; indicator
metrics; bibliometrics |
| 29 |
Winning an Award
from the Advanced Technology Program: Pursuing R&D Strategies
in the Public Interest and Benefiting from a Halo Effect |
2001 NISTIR 6577 |
Maryann Feldman,
Johns Hopkins Univ.; Maryellen Kelley, ATP |
Behavior of ATP
award winners versus nonwinners |
Survey; econometrics/
statistical |
| 30 |
Performance of
50 Completed ATP Projects, Status Report 2 |
2001 NIST SP 950–2 |
ATP |
Collection of 50 mini case studies, with aggregate output and outcome
statistics, and project and portfolio performance scores |
Case study; indicator
data; bibliometrics |
| 31 |
The Advanced Technology
Program: Assessing Outcomes |
2001 NAS Press |
Charles Wessner,
NRC |
Report of a symposium,
a collection of condensed study reports, findings, and recommendations
about ATP |
Expert judgment
informed by other methods |
| 32 |
Temporary Organizations
for Collaborative R&D: Analyzing Deployment Prospects |
2000 draft |
Stanley Przybylinski,
ERIM; Sean McAlinden, Univ. of Michigan; Dan Lura, Mich. Manufacturing
Tech. Center |
Estimating the
propensity of a technology to diffuse |
Descriptive case
study; modeling underlying program theory; sociometric |
| 33 |
Measuring the
Impact of ATP-Funded Research Consortia on Research Productivity
of Participating Firms |
2002 NIST GCR
02–830 |
Mariko Sakakibara,
UCLA; Lee Branstetter, Columbia Business School |
Assessing impact
of research consortia on research productivity of firms |
Econometric/ statistical;
informing underlying program theory |
| 34 |
ATP and the U.S.
Innovation System: A Methodology for Identifying Enabling R&D
Spillover Networks |
2000 draft |
Michael Fogarty,
Case Western Reserve Univ.; Amit Sinha, Case Western Reserve Univ.; Adam
Jaffe, Brandeis Univ. |
Identifying projects
with above average spillovers |
Emerging method:
econometric/ social network analysis using fuzzy logic; case
study |
| 35 |
The Role of Knowledge
Spillovers in ATP Consortia |
2000 draft |
David Mowery,
Univ. of California; Joanne Oxley, Univ. of Mich.; Brian Silverman,
Univ. of Toronto |
Internalization
of knowledge spillovers among consortia members |
Econometric/ statistical |
| 36 |
R&D Policy
in Israel: An Overview and Reassessment |
2000 draft |
Z. Griliches,
Harvard Univ., NBER; M. Trajtenberg, Tel Aviv Univ., NBER, CIAR;
H. Regev, Israel Central Bureau of Statistics |
Using data from
a counterpart program in Israel that had a longer operational
history than ATP to demonstrate how government support of technology
development fostered strong growth rates in Israel’s high-tech
sector |
Econometric/ statistical;
informing underlying program theory |
| 37 |
Universities as
Research Partners |
2002 NIST GCR
02–829 |
Bronwyn Hall,
Univ. of Calif.-Berkeley, NBER; Albert Link, Univ. of N.C.-Greensboro;
John Scott, Dartmouth College |
Contributions
of universities to ATP-funded projects |
Survey; econometrics/
statistical; informing underlying program theory |
| 38 |
R&D Spillovers,
Appropriability and R&D Intensity: A Survey-Based Approach |
2000 draft |
Wesley Cohen,
Carnegie Mellon Univ.; John Walsh, Univ. of Illinois-Chicago |
Offsetting relationship
between profitability and information sharing |
Econometrics/
statistical |
| 39 |
Public-Private
Partnering and Innovation Performance Among U.S. Biotechnology
Firms |
2000 draft |
Bruce Kogut, Wharton
School, Univ. of Penn.; Michelle Gittelman, NYU |
Increased innovation
in biotech from university-firm partnerships |
Econometrics/
statistical; informing underlying program theory |
| 40 |
Program Design
and Firm Success in the Advanced Technology Program: Project
Structure and Innovation Outcomes |
2002 NISTIR 6943 |
Michael Darby and Lynne Zucker, Univ. of Calif.-LA, NBER;
Andrew Wang, ATP |
How ATP promotes
innovation and success of firms by encouraging collaboration
and building institutional networks for cooperation |
Econometrics/
statistical; social network analysis; informing underlying program
theory |
| 41 |
Closed Cycle Air
Refrigeration Technology for Cross- Cutting Applications in Food
Processing, Volatile Organic Compound Recovery and Liquefied
Natural Gas Industries |
2001 NIST GCR
01–819 |
Tom Pelsoci, Delta
Research Co. |
Benefit-cost analysis
of a new environmentally benign industrial refrigeration for
ultra-cold applications such as food processing |
Economic case
study |
| 42 |
Study of the Management
of Intellectual Property in ATP-Grantee Firms |
2000 draft |
Julia Liebeskind,
Univ. of Southern Calif. |
How conflicts
over IP may inhibit success of ATP projects |
Descriptive case
study; informing underlying program theory |
| 43 |
Determinants of
Success in ATP-Sponsored R&D Joint Ventures; A Preliminary
Analysis Based on 18 Automobile Manufacturing Projects |
2002 NIST GCR
00–803 |
Jeffrey Dyer,
Brigham Young Univ.; Benjamin Powell, Univ. of Penn. |
Factors believed
important by joint venture members to joint venture success |
Descriptive case
study using semi-structured interviews; informing underlying
program theory |
| 44 |
A Composite Performance
Rating System for ATP-Funded Completed Projects |
2003 NIST GCR
03–851 |
Rosalie Ruegg,
TIA Consulting, Inc. |
0–4 star
rating system computed using output and outcome data from status
reports to provide an overall performance measure against multiple
mission goals |
Emerging method:
composite scoring using indicator metrics and expert judgment |
| 45 |
Between Invention
and Innovation: An Analysis of the Funding for Early- Stage Technology
Development |
2002 NIST GCR
02–841 |
Lewis Branscomb
and Philip Auerswald, Harvard Univ. |
Investigation
of sources of investments into early stage technology development
projects |
Modeling underlying
program theory; expert judgment |
*The paper was published
independently of the ATP.
____________________
57 ATP’s first director, George Uriano, with his background combining
scientific, administrative, and business expertise, played a key role in
shaping ATP and its early evaluation effort. (For more information on Mr.
Uriano’s background, see NIST Press
Release 94–36, dated September 13, 1994.)
58 These are expanded from
suggested tests put forth by R. Ruegg, “Assessment of the ATP,” in
Charles W. Wessner, ed., The
Advanced Technology Program: Challenges and Opportunities (Washington,
DC: National Academy Press, 1999), p. 80.
59 ATP’s in-house
evaluation staff is the Economic Assessment
Office (EAO), whose primary focus is evaluation but which also has other program
responsibilities, such as serving on technology proposal selection boards
and helping to oversee funded technology projects from an economics and
business perspective.
Date created: July 13,
2004 Last updated:
August 2, 2005