NIST Advanced Technology Program
NISTIR 05-7174
Evaluation Best Practices and Results: The Advanced Technology Program

3.  ATP’s Evaluation Best Practices

ATP’s experience in funding early-stage technologies and evaluating the impact of its awarded projects has resulted in many best practices. These best practices may prove useful to similar government programs in the early stages of their operations or to government programs that must meet external performance reporting requirements.

COMMITTING TO PERFORMANCE EVALUATION

One of the most important best practices is to establish evaluation as a regular activity and to sustain it despite budgetary pressures. ATP allocates funding for a staff dedicated to evaluation activities, the Economic Assessment Office, and for carrying out evaluation using internal and external resources. It is important for public research and development programs to treat evaluation as a core activity and to pursue it within a framework that measures the program against its stated objectives. A dedicated staff with appropriate backgrounds, capabilities, and experience is essential; a dedicated budget for evaluation activities is equally critical.

USING A MULTIFACETED APPROACH TO EVALUATION

Evaluating early-stage technologies and their longer-run impacts requires a multifaceted approach because developing new technologies is complex and can take several years. Our evaluation tools assess commercialization as well as knowledge creation and dissemination. These methods must accommodate measuring inputs, outputs, outcomes, and impacts over the life cycle of a project: research and development take place in the short to mid term, commercialization in the mid to longer term, and widespread diffusion of the technology over a longer horizon. The time frame varies by technology area, shorter for information technology projects and much longer for biotechnology projects (Powell and Moris, 2002). This variation is also why multiple evaluation approaches are needed to capture the status of projects at different stages of their life cycles.
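To make this framework concrete, the sketch below maps life-cycle stages to the kinds of measures collected at each stage. It is illustrative only: the stage labels and example metrics are paraphrased from this section, and the structure is not an actual ATP instrument.

    # Illustrative mapping of project life-cycle stages to the kinds of
    # measures discussed above. Not an actual ATP data structure.
    life_cycle_measures = {
        "short to mid term (R&D)": [
            "inputs: funding, staffing",
            "outputs: patents, publications",
        ],
        "mid to longer term (commercialization)": [
            "outcomes: products, revenues, external funding attracted",
        ],
        "longer term (diffusion)": [
            "impacts: spillovers, social returns",
        ],
    }
    for stage, measures in life_cycle_measures.items():
        print(f"{stage}: {'; '.join(measures)}")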

COMMISSIONING EXTERNAL STUDIES BY EXPERTS

ATP contracts with experts to conduct economic analysis of individual projects, clusters of projects, or concepts underlying the economic principles of the program. ATP's Economic Assessment Office works with well-known researchers to shape, manage, and produce many of our reports. In the early years, the office worked with economists affiliated with the National Bureau of Economic Research to help lay a strong foundation for evaluating the program. Zvi Griliches, Edwin Mansfield, Adam Jaffe, Bronwyn Hall, and others collaborated with ATP on important research exploring how to measure and track key economic concepts that apply to government support for the development of high-risk, enabling technologies carried out by the private sector. They studied concepts such as spillovers (knowledge, network, and market spillovers; see Jaffe, 1997), return on investment (social, private, and public rates of return; see Mansfield, 1996), and research productivity (see Darby et al., 2002; Sakakibara and Branstetter, 2003).

EVALUATING UNSUCCESSFUL PROJECTS

Another best practice is evaluating unsuccessful projects along with successful ones. There is a great deal to learn from projects that failed to meet their goals or to deliver promised benefits. ATP has analyzed the reasons projects terminate early (see ATP, 2001, Appendix B). Examining why projects fail can improve both project selection and project management.
Almost 10 percent of projects terminate early. A project can end early, or never start, for participant-initiated reasons, such as a change in goals, financial distress, lack of technical progress, or the inability of joint-venture partners to agree on intellectual property rights. A project can also end for ATP-initiated reasons, such as no longer meeting ATP project selection criteria or shifting away from high-risk research. In a very few cases, early success was the reason for early termination.

STRATEGICALLY PRESENTING RESULTS

Results have greater effect when they are presented so that a nontechnical reader can understand both the science and the commercialization. Results are presented in multiple formats: a brief abstract for quickly grasping the key findings, an executive summary for readers who want an overview of the highlights, and the full report. Quantitative findings are presented in tables and graphics with accompanying qualitative analyses. We release many of our findings as fact sheets on ATP's website, and we have published three special-topic brochures highlighting projects in the health care, energy, and manufacturing sectors (see ATP, 2003b, 2003c, 2005).

Results and data are also summarized in a "statistical abstract," an idea borrowed from the U.S. Census Bureau's annual Statistical Abstract. Plans are to publish the ATP statistical abstract every other year; the first was released in September 2004 as Measuring ATP Impact: 2004 Report on Economic Progress. The report describes ATP using findings and data from recent reports and statistics. It also provides summaries of recent studies and ten detailed statistical tables on the number and distribution of awards by award type, technology area, geographic region, university participation, patents, commercialization, and post-award attraction of external funding. The data are presented for all projects and by project characteristic.

DEVELOPING INNOVATIVE METHODS TO EVALUATE ATP’S EFFECTIVENESS

Evaluation of emerging technologies is a relatively new field. While traditional economic and social science methods can be used to assess program success, existing tools are often insufficient to capture the nuances and impact of public-private investments. It is appropriate to modify existing tools, develop new ones, or combine existing methods in ways not previously explored.

For example, one of the more difficult concepts to measure is the social return resulting from an ATP project. Social return includes private returns to the participating company and public returns: knowledge, network, and market spillover benefits to that company's customers or to other firms, and a variety of indirect benefits to other companies and their customers as the knowledge created by the project diffuses. (See Jaffe, 1997, and Chang, Shipp, and Wang, 2002, for a historical description of this issue.)
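As a stylized illustration of this decomposition (our notation, not a formula drawn from the studies cited above), the social benefits of a project in year t can be written as the sum of private returns and spillover benefits, and the social rate of return r_s is the discount rate at which the present value of social net benefits equals zero:

    \[
    B^{\mathrm{social}}_t = B^{\mathrm{private}}_t + B^{\mathrm{spillover}}_t,
    \qquad
    \sum_{t=0}^{T} \frac{B^{\mathrm{social}}_t - C_t}{(1 + r_s)^t} = 0,
    \]

where C_t is the project cost in year t and T is the evaluation horizon.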

Despite the difficulties in measuring social return, ATP has pursued a greater understanding of this concept by collaborating with consultants, professional economists, and academics. Together, we carry out prospective benefit-cost studies of a range of technologies and projects to test and stretch various methodological approaches. These include case studies of projects that developed closed-cycle air refrigeration technology (Pelsoci, 2001), flow-control machining technology (Ehlen, 1999), and technologies that reduced the dimensional variation of U.S. motor vehicles (Polenske et al., 2004). By supplementing core in-house evaluation capability with expertise from outside contractors, ATP has pursued a balanced approach to evaluation and has welcomed new ideas and approaches, itself another best practice.
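The arithmetic at the core of such a benefit-cost study can be sketched briefly. In the Python fragment below, all dollar figures and the 7 percent discount rate are hypothetical; they are not drawn from any of the studies cited above.

    # Illustrative benefit-cost arithmetic for a prospective study.
    # All figures are hypothetical, not from any ATP report.

    def npv(cash_flows, rate):
        """Net present value of annual cash flows; cash_flows[t] is year t."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    # Hypothetical project: R&D costs up front, then growing social
    # benefits (private returns plus estimated spillovers), in $ millions.
    costs    = [4.0, 1.0, 0.0, 0.0, 0.0, 0.0]
    benefits = [0.0, 0.0, 1.5, 3.0, 4.5, 6.0]
    net      = [b - c for b, c in zip(benefits, costs)]

    rate = 0.07  # hypothetical real discount rate
    print(f"Net present value: ${npv(net, rate):.2f} million")
    print(f"Benefit-cost ratio: {npv(benefits, rate) / npv(costs, rate):.2f}")

A positive net present value and a benefit-cost ratio above one would indicate, under these assumed flows, that estimated social benefits exceed costs.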

In measuring spillovers, for example, we have used a variety of approaches and means of illustration. To capture knowledge spillovers in the status reports of completed projects (a portfolio-wide, mini-case-study tool), we developed patent trees, which illustrate the multi-tiered citations of patents issued for ATP-funded technologies. In addition, we commissioned a study that examines knowledge spillovers using social network analysis, an emerging method that applies fuzzy logic and systems analysis to knowledge spillovers from research and development projects within networks of participating organizations. (See the discussion in Ruegg and Feller, 2003, pp. 271–75; Fogarty et al., 2005.)
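A patent tree is, at bottom, a recursive walk of forward citations. The fragment below sketches the idea; the patent identifiers and citation links are hypothetical, whereas the actual status reports draw on real patent citation records.

    # Minimal sketch of a patent tree: multi-tiered forward citations of a
    # patent issued for a funded technology. Citation data are hypothetical.
    from collections import defaultdict

    # cited_by[p] lists the patents that cite patent p.
    cited_by = defaultdict(list, {
        "P0": ["P1", "P2"],   # P0 is the ATP-funded patent
        "P1": ["P3"],
        "P2": ["P4", "P5"],
    })

    def print_tree(patent, depth=0, max_depth=3):
        """Print forward citations tier by tier, up to max_depth tiers."""
        print("  " * depth + patent)
        if depth < max_depth:
            for citing in cited_by[patent]:
                print_tree(citing, depth + 1, max_depth)

    print_tree("P0")  # each indentation level is one citation tier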

To study market spillovers, we have recently begun exploring the use of input-output tables from the U.S. Department of Commerce's Bureau of Economic Analysis. Specifically, we have taken the first 50 completed ATP projects and mapped their make and use industries to trace where the new technologies began and where they have since ended up (Popkin, 2003). We are also exploring other emerging methods to measure spillovers and the impact of ATP funding, including coding the potential commercial applications identified by ATP project participants with NAICS (North American Industry Classification System) codes to identify the make and use industries that illuminate the spillover path (Nail and Brown, 2005, forthcoming).
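The coding step can be pictured as pairing each project's producing (make) industry with its consuming (use) industries. The sketch below is illustrative only: the NAICS codes and the project-to-industry assignments are hypothetical, not taken from Popkin (2003) or Nail and Brown (2005).

    # Hypothetical mapping of a project's commercial applications to NAICS
    # make and use industries, outlining a possible spillover path.
    projects = {
        "flow-control machining": {
            "make": "333517",             # machine tool manufacturing (assumed)
            "use": ["336111", "336411"],  # automobiles, aircraft (assumed)
        },
    }

    for name, io in projects.items():
        for user in io["use"]:
            print(f"{name}: NAICS {io['make']} -> NAICS {user}")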

USING SYSTEMATIC DATA COLLECTION

Perhaps the cornerstone of ATP's evaluation program is its comprehensive survey and data collection system. Our survey efforts are structured to align with our overall evaluation goals, which in turn are crafted to optimize the performance of ATP. As part of an ongoing survey and database assessment effort, we have identified five broad goals that form the conceptual basis of our surveys: (1) opportunities for national economic benefits, (2) acceleration of R&D, (3) increased investment in high-risk, long-term technology, (4) stimulation of collaboration, and (5) progress in commercialization of technology. These five goals define how ATP projects affect the economy and society.
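A minimal sketch of how a survey record might be organized around these five goals follows; the field names are hypothetical and do not reproduce ATP's actual survey instrument.

    # Illustrative survey record aligned with the five evaluation goals.
    from dataclasses import dataclass, field

    @dataclass
    class ProjectSurveyRecord:
        project_id: str
        national_benefit_opportunities: list = field(default_factory=list)  # goal 1
        rd_months_accelerated: int = 0        # goal 2
        high_risk_rd_investment: float = 0.0  # goal 3, $ millions
        collaboration_partners: int = 0       # goal 4
        products_commercialized: int = 0      # goal 5

    record = ProjectSurveyRecord(project_id="ATP-0001", rd_months_accelerated=18)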
