NISTIR 05–7174
Evaluation Best Practices and Results: The Advanced Technology Program

4. ATP Surveys

As mentioned above, systematic data collection is one component of ATP's best practices. This section provides an in-depth explanation of how ATP surveys have evolved and continue to improve.

CONCEPTUAL BASIS OF ATP SURVEYS

Initiated in ATP's early years, the survey program has continually incorporated refinements in structure, content, and form to ensure that the surveys remain effective and informative tools for assessing ATP's progress toward its goals. From their inception in 1991 as telephone surveys, ATP's surveys have evolved through several distinct stages of development. This evolution has occurred as we (1) gained a deeper understanding of the economic, social, business, and technological processes in which ATP participates; (2) incorporated feedback from prior evaluation efforts, including surveys, economic and policy research studies, and benefit-cost analyses; and (3) integrated state-of-the-art survey methodologies and techniques into our analytical and administrative structures.

ATP surveys can be viewed as a microcosm of our overall evaluation program. The survey system is a multifaceted effort designed to meet multiple (but complementary) program goals while balancing efficiency in operation with excellence in results. This is achieved by identifying and leveraging both internal and external resources, harnessing the benefits of collaboration with survey experts (a tactic learned through our evaluation of ATP), and relying on continual self-assessment and feedback. We do not lose sight of our goal to measure against mission in the short, medium, and long term. To design survey content, we identify the economic and organizational changes we would expect to observe in our award recipient population if progress were being made toward those goals. We then craft survey questions to capture those behaviors and outcomes.
The themes and topics defined by the goals are reflected in multiple lines of questions that vary in a logical progression over the survey lifetime. Baseline information is collected on the initial survey, and follow-up questions in each area are included at the appropriate anniversary, closeout, or post-project surveys. Several variants of the surveys are used for different types of organizations in the population. For example, participating nonprofit organizations and universities receive a slightly different survey than companies, reflecting their unique roles in the project and their different organizational structures.

Diffusion of knowledge is captured by tracking patents, papers, presentations, and other information about intellectual property. Measures of social and environmental effects, or spillovers, of the technologies are also captured. Acceleration of R&D is measured in terms of reduced time to achieve technical progress. Instances in which the technology would not have been developed at all, or perhaps only on a lesser scale or scope, without ATP funding are identified. This captures the key counterfactual question: what is the difference made by ATP? Reductions in time-to-market are also tracked. Increased investment in high-risk, long-term technologies is an indicator of the extent to which the work of ATP results in sustained technological development. The level of R&D risk is gauged through estimators of the probability of success, ambitiousness of goals, technical difficulty, degree of innovation, and project duration. The "halo effect," the extent to which the initial ATP award has stimulated additional funding from within the firm or from external sources, is also captured (Survey of ATP Applicants, 2000, 2003a; Feldman and Kelley, 2001; Solomon Associates, 1993).
The right mix of collaboration can lead to positive outcomes for technology creation, development, and commercialization, and for the national benefits that arise from knowledge diffusion and new commercial products and processes. To determine whether and how collaboration is a factor in creating such impacts among our award recipients, we survey to learn about the structure and nature of collaborative arrangements. And because our primary goal in the Economic Assessment Office is to measure the impact of ATP, we are interested in whether the collaboration would have occurred at all without ATP funding.

The centerpiece of our surveys is the Business Reporting System, or BRS, which superseded the original telephone surveys in the mid-1990s (Anderson et al., 2003). As did its predecessor, the first BRS addressed the dual objectives of ATP project management and program evaluation. It also incorporated lessons learned, most notably the use of the counterfactual approach to measure the overall impact of ATP. The BRS is not a single survey instrument, but rather a system of surveys administered to all ATP participants throughout the project's duration and beyond. Then, as now, the BRS consisted of a series of surveys conducted at baseline; annually upon each project anniversary (up to five years, depending on the type of project award); and two, four, and six years after project closeout. The first BRS was sent to participants on a diskette. Following developments in survey design and methodology, we transitioned to a web-based application and revised the survey design in 1999.

National economic benefits are measured by gathering information about business growth, the development of business relationships and networks, and the diversity of commercial applications arising from the technology that ATP has funded. Finally, ATP surveys capture commercialization progress, results, and expectations.
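The BRS cadence described above (a baseline survey, anniversary surveys for up to five years, and post-project surveys two, four, and six years after closeout) can be sketched as a simple schedule generator. This is an illustrative sketch only, not ATP's actual administrative system; the function and field names are invented for the example.

```python
from datetime import date

def add_years(d: date, n: int) -> date:
    """Shift a date by n years (Feb 29 falls back to Feb 28)."""
    try:
        return d.replace(year=d.year + n)
    except ValueError:
        return d.replace(year=d.year + n, day=28)

def brs_schedule(award: date, duration_years: int) -> list[tuple[str, date]]:
    """Return (survey name, due date) pairs for one hypothetical project.

    Mirrors the cadence described in the text: baseline at award,
    anniversary surveys while the project is active (at most five),
    then post-project surveys 2, 4, and 6 years after closeout.
    """
    schedule = [("baseline", award)]
    for yr in range(1, min(duration_years, 5) + 1):
        schedule.append((f"anniversary-{yr}", add_years(award, yr)))
    closeout = add_years(award, duration_years)
    schedule.append(("closeout", closeout))
    for yr in (2, 4, 6):
        schedule.append((f"post-project+{yr}", add_years(closeout, yr)))
    return schedule

# Example: a three-year project awarded in May 2004 would be surveyed
# eight times, with the final post-project survey due in 2013.
for name, due in brs_schedule(date(2004, 5, 1), 3):
    print(name, due)
```

One consequence the sketch makes visible: a single project keeps generating survey obligations for six years after closeout, which is part of why the survey cycle structure described later in this section becomes complex as the portfolio grows.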
Although ATP does not fund the commercialization phase of projects, our mission is not simply to fund high-risk technologies, but to fund high-risk technologies that have a strong potential to enhance economic growth. Economic growth can be achieved only when the technology enters the marketplace. To measure this impact, information is collected from firms on the current and expected economic value achieved through revenues from commercial applications of the technology, licensing, and cost savings.

SURVEY AND DATABASE REFINEMENTS AND INTEGRATION

Today, ATP is in the midst of a conversion to another iteration of our surveys. The current transition involves three components: (1) refinements to the content and structure of the survey instruments; (2) collaboration with an external professional survey services firm for programming, administration, and support for survey design and data collection; and (3) integration into the BRS of special surveys that we have designed and conducted in recent years.

Refinements to the survey content and structure are being made by incorporating feedback from prior survey and evaluation efforts, then assessing proposed survey revisions against the five conceptual goals outlined above. An internal survey design team, working in conjunction with an external survey design firm, made the initial revisions. The in-house team provides expertise in subject matter content, past practice, program goals, and overall direction, and the external contractor provides professional guidance in survey methodology. The contractor will ultimately program and conduct the revamped system of surveys, lending efficiency and consistency to the long-term administration of the surveys and enabling us to focus on analysis and results. The survey organization will follow up with respondents who do not answer the survey questions or answer only some of them. It will also edit the data for consistency and to reduce item non-response.
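The follow-up and editing step just described can be made concrete with a small sketch: flag unanswered items for respondent follow-up, and apply a simple consistency edit. This is a hypothetical illustration, not ATP's or the contractor's actual procedure; all field names and the editing rule are invented for the example.

```python
# Hypothetical economic-value items, echoing the measures described above.
QUESTIONS = ["revenue_from_technology", "licensing_income", "cost_savings"]

def find_item_nonresponse(response: dict) -> list[str]:
    """Return the survey items a respondent left unanswered (None)."""
    return [q for q in QUESTIONS if response.get(q) is None]

def consistency_flags(response: dict) -> list[str]:
    """Flag internally inconsistent answers (illustrative rule only)."""
    flags = []
    # Example edit: a respondent reporting commercialization should not
    # report zero (or nothing) across every economic-value item.
    if response.get("commercialized") and all(
        (response.get(q) or 0) == 0 for q in QUESTIONS
    ):
        flags.append("commercialized but no economic value reported")
    return flags

resp = {
    "commercialized": True,
    "revenue_from_technology": None,  # unanswered -> follow up
    "licensing_income": 0,
    "cost_savings": 0,
}
print(find_item_nonresponse(resp))  # ['revenue_from_technology']
print(consistency_flags(resp))
```

In practice such rules would be far more numerous and would feed a follow-up queue rather than a print statement, but the two functions capture the distinction the text draws between item non-response and consistency editing.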
Assessment of the revised BRS and other surveys against evaluation program goals is a long-term process conducted by the full economic assessment staff, led by the survey design team. As the survey design team completes a draft of each phase of the revised survey instrument, it submits the draft to the staff for review against program goals. Staff members provide input and comments, focusing on their assigned areas of expertise, enabling the survey team to further refine the draft. Because the BRS is a unified system, changes to later stages of the survey, such as the project closeout or post-project surveys, can affect the baseline or anniversary surveys, necessitating further modifications to earlier drafts. Although we expect the bulk of the survey redesign work and the transition to the contractor for administration to be completed by late 2005, we expect the survey evaluation and revision mechanism to be an ongoing process.

Though not a large survey in terms of population size, the BRS has a complex structure. Including 32 ATP awards made in September 2004, ATP has now funded 768 projects involving more than 1,500 participants. The population of the survey is diverse. About one-third of the projects are joint ventures, and over half include universities. All projects involve one or more for-profit firms, and these firms vary in size from tiny start-ups to large Fortune 500 companies. Furthermore, although the survey is administered to firms (or other business or research organizations), it is the project rather than the organization that is the focal point. We are interested in following the technology rather than the firm per se, although we are also interested in the economic and institutional effects and interactions between the technology and the firm. Thus, this is not strictly an establishment survey. A further layer of complexity stems from the survey time frame.
ATP funding is awarded through competitions, but the competitions do not occur at the same time each year and, especially in recent years, may consist of several batches. This raises the question of how to define a cohort and complicates survey administration and analysis. For example, the 27 projects awarded funding in May 2004 were actually the last batch of award recipients from our 2002 competition. They were the first group to begin with the newly redesigned baseline survey in October 2004. Furthermore, project duration may vary from about two to five years, and individual projects may be terminated early or suspended temporarily, requiring adjustments to survey cycle times.

A component of the current survey conversion effort is the integration of special surveys into the BRS. In particular, ATP conducted two separate surveys in recent years: a Survey of ATP Joint Ventures and a Survey of ATP Applicants. The Survey of ATP Joint Ventures addresses the structure, nature, and impact of collaborative relationships. The Survey of ATP Applicants covers the full population of firms that applied for ATP funding in a given competition, enabling analysis of the question of additionality (the difference that the ATP award made) by studying both firms that receive ATP funding and firms that do not. Incorporating concepts and questions from both surveys into the BRS will improve efficiency in administration, facilitate analysis, and reduce respondent burden. Nonawardees will continue to receive a separate survey, since the BRS is designed to collect data from awardees.

Finally, we are embarking on a redesign and integration of the databases that underlie our multifaceted surveys. We also hope to integrate data from external sources, such as Compustat and Dun & Bradstreet, into our internal databases. The database integration effort complements our survey redesign work, and it is also necessitated by the growth of ATP.
Database methods and structures created in the early years are no longer adequate for an expanding survey population, especially in light of the complex structure of the survey cycles. Because our ultimate goal is the collection and production of data to serve the multiple functions of program evaluation, the analysis of R&D, and the identification and development of new research and evaluation methodologies, survey and database development will remain on the agenda for continuous process improvement.

Date created: July 20, 2005
ATP website comments: webmaster-atp@nist.gov / Technical ATP inquiries: InfoCoord.ATP@nist.gov. NIST is an agency of the U.S. Commerce Department.