[Table: projected impact on enrollment by grade (K-8, K-12, 9-12), the number of high school graduates, and the number of classroom teachers. Source: Adapted from National Center for Education Statistics, Projections of Education Statistics to 1992-1993: Methodological Report with Detailed Projection Tables (Washington, D.C.: U.S. Government Printing Office, 1985), p. 31.]

CCD provides most of the basic data on elementary and secondary education for the Projections of Education Statistics; it is featured in The Condition of Education and the Digest of Education Statistics (about two thirds of all tables reporting on elementary and secondary education in recent editions have involved data from CCD); and it appears to be used extensively by the department's statistical information office in responding to questions from a wide variety of federal, state, and local policymakers, teachers, and other constituents. However, in serving the needs of education policymakers, CCD has not had the impact it could have had if its problems of technical adequacy, timeliness, and relevance had been corrected.

Summary and Conclusions

Although CCD has some strengths with regard to its relevance, timeliness,

The Quality of Information

Fast Response Survey

Purpose and Background

Relevance

The Fast Response Survey System (FRSS) was established by NCES in the mid-1970's to furnish data quickly when timely national estimates were needed on important educational issues. FRSS was designed to (1) minimize respondent burden (typically three to five questions are asked in a sample survey), (2) keep the time between fielding a survey and reporting its results to a minimum through a network of data coordinators, and (3) collect narrowly limited information that was not available from other sources.
FRSS was designed to gather information as needed through a contractual arrangement. Whereas all state education agencies were included in the system (making those surveys a census), stratified random samples (with the number of respondents ranging from 500 to 1,000) were designed to yield reliable national estimates for schools, local education agencies, and other institutions. Twenty-four reports or bulletins were issued between 1976, when the first FRSS study was reported, and September 1986. In interviews with present and former FRSS project monitors, we were told that surveys of state education agencies usually cost about $25,000 each and that surveys based on nationally representative samples cost $80,000 to $100,000. The FRSS system is currently funded by a 5-year contract and has an annual budget of about $200,000 to $350,000.

FRSS differs from other information collection systems in that it is an information service, available only to department officials, rather than an existing information source. Practices differed across FRSS surveys, but their contents were generally specified collaboratively by the requester, center staff, the Committee on Evaluation and Information Systems, and the contractor, all jointly involved in refining the policy questions, developing the survey, and determining the nature and scope of the analyses. The system was designed to tailor data collection to the needs of the requester. That is, relevance is built into the system, provided it is fast enough to deliver results before changes overtake the requester and the questions.

Timeliness

FRSS differs from other information-gathering activities in terms of timeliness as well. It was developed to address, in part, concerns for which existing data were not available, not current, or not national in scope. Although evidence on the timeliness of all fast response surveys is incomplete, it appears that delays have grown longer.
Figure 3.1 displays the available data on the elapsed time between the completion of a survey protocol and its publication or release date. For the early years of the system, 1976-79, the publication date could be determined for only five of the nine surveys; for these, the elapsed time varied between 6 and 14 months. The data for later years, 1981 to 1986, are more complete; the elapsed time was variable, ranging from 6 to 34 months. Timeliness appears to have declined, on the average, since the late 1970's. Interviews with center staff suggest that this indicator overestimates