
Impact

[Table: projected elementary and secondary education statistics. Row headings: enrollment by grade (K-8, K-12, 9-12), number of high school graduates, and number of classroom teachers. The projected values are not legible in the scanned source.]

Source: Adapted from National Center for Education Statistics, Projections of Education Statistics to 1992-1993: Methodological Report with Detailed Projection Tables (Washington, D.C.: U.S. Government Printing Office, 1985), p. 31.

CCD provides most of the basic data for elementary and secondary education for the Projections of Education Statistics; it is featured in The Condition of Education and the Digest of Education Statistics (about two thirds of all tables reporting on elementary and secondary education in recent editions have involved data from CCD); and it appears to be used extensively by the department's statistical information office in responding to questions from a wide variety of federal, state, and local policymakers, teachers, and other constituents. However, in serving the needs of education policymakers, CCD has not had the kind of impact it could have had if problems of technical adequacy, timeliness, and relevance had been corrected.
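
The grade-level counts and teacher totals in the table above are the kind of CCD inputs that feed short-term enrollment projections. As a hedged illustration only, the sketch below applies the grade-progression (cohort-survival) method, a standard technique for such projections rather than NCES's documented procedure, to hypothetical enrollment figures:

```python
# Minimal sketch of a grade-progression (cohort-survival) projection.
# A generic illustration, not NCES's documented methodology; all
# enrollment figures are hypothetical (thousands of pupils).

def progression_ratios(prev_year, this_year):
    """Ratio of this year's grade g+1 enrollment to last year's grade g."""
    return [this_year[g + 1] / prev_year[g] for g in range(len(prev_year) - 1)]

def project_next_year(this_year, ratios, entering_first_grade):
    """Advance each cohort one grade using the observed progression ratios."""
    return [entering_first_grade] + [
        this_year[g] * ratios[g] for g in range(len(ratios))
    ]

# Hypothetical grade 1-4 enrollments for two consecutive school years.
year_1 = [3200, 3150, 3100, 3050]
year_2 = [3250, 3180, 3140, 3090]

ratios = progression_ratios(year_1, year_2)
year_3 = project_next_year(year_2, ratios, entering_first_grade=3300)
print([round(e) for e in year_3])   # projected grade 1-4 enrollments
```

The consistency of such projections depends directly on the cross-state comparability of the underlying counts, which is the technical-adequacy issue raised in the summary below.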

Summary and Conclusions

Although CCD has some strengths with regard to its relevance, timeliness, and technical adequacy, the balance of evidence suggests that it contains numerous inadequacies. With few exceptions, data were not uniformly available from all states. When data were available, they were not on the same level of aggregation; some states had data available at the school level, while others maintained them at the school district level. Further, definitions and procedures for reporting data elements differed across states. With respect to CCD's impact, there is little direct evidence of its use in policy decisions other than its role in supporting center publications and education projections and as a resource for answering inquiries. Despite complaints dating back 30 years, recent reviews indicate that few of the problems have been solved. However, some CCD data were technically sound enough to yield consistent short-term projections, and some data elements were reasonably consistent across states, suggesting that it is possible to obtain some usable data from administrative records.


Fast Response Survey System

Purpose and Background


The Fast Response Survey System was established by NCES in the mid-1970's to furnish data quickly when timely national estimates were needed on important educational issues. FRSS was designed to (1) minimize the respondents' burden (typically three to five questions are asked in a sample survey), (2) keep to a minimum, through a network of data coordinators, the time between fielding a survey and reporting its results, and (3) collect narrowly limited information that was not available from other sources.

FRSS was designed to gather information as needed, through a contractual arrangement with a private survey research firm, from one or more of the following six educational sectors: (1) state education agencies, (2) local education agencies, (3) public elementary and secondary schools, (4) nonpublic elementary and secondary schools, (5) institutions of higher education, and (6) noncollegiate postsecondary schools with occupational programs. A data collection network was developed for each sector. Coordinators assisted in collecting data by maintaining liaison with sampled institutions and agencies. Representatives of each institution or agency were identified and made responsible for completing the questionnaires. Data collection was intended to take 6 to 10 weeks.

Whereas all state education agencies were included in the system (making it a census), stratified random samples (with numbers of respondents ranging from 500 to 1,000) were designed to yield reliable national estimates for schools, local education agencies, and other institutions. Twenty-four reports or bulletins were issued between 1976, when the first FRSS study was reported, and September 1986. In interviews with present and former FRSS project monitors, we were told that surveys of state education agencies usually cost about $25,000 each and surveys based on nationally representative samples cost $80,000 to $100,000. The FRSS system is currently funded by a 5-year contract and has an annual budget of about $200,000 to $350,000.
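
Because the state-agency surveys are censuses while the other sectors rely on stratified random samples, national figures for those sectors are weighted estimates rather than simple tallies. The sketch below is a minimal, hypothetical illustration of a standard stratified estimator; the strata, population counts, and sample means are invented and do not reflect the contractor's actual design or weights:

```python
# Minimal sketch of a design-based national estimate from a stratified
# random sample, the kind of estimator FRSS-style samples of 500-1,000
# respondents support. All strata, counts, and means are hypothetical.

# For each stratum: number of institutions in the population and the
# mean of some surveyed quantity among sampled respondents.
strata = {
    # name: (population_size, sample_mean)
    "large districts":  (1_000, 42.0),
    "medium districts": (4_000, 17.5),
    "small districts": (10_000,  6.2),
}

# Weight each stratum mean by its share of the population (a standard
# stratified estimator of the national mean and total).
total_pop = sum(n for n, _ in strata.values())
national_mean = sum(n * mean for n, mean in strata.values()) / total_pop
national_total = sum(n * mean for n, mean in strata.values())

print(f"estimated national mean:  {national_mean:.2f}")
print(f"estimated national total: {national_total:,.0f}")
```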

Relevance

FRSS is different from other information collection systems in that it is an information service, available only to department officials, rather than an existing information source. Practices differed across FRSS surveys, but the contents were generally specified collaboratively by the requester, center staff, the Committee on Evaluation and Information Systems (CEIS), and the contractor, all jointly involved in refining the policy questions, developing the survey, and determining the nature and scope of the analyses. The system was designed to tailor data collection to the needs of the requester. That is, relevance is built into the system, if it is fast enough to deliver results before changes overtake the requester and the questions.

FRSS also differs from other information-gathering activities in terms of whom it attempted to serve in each survey. As a matter of policy, the system is limited to officials within the department who have a high-priority need for quick information. Our analysis of the initiation of 23 requests showed that three broad groups have relied on FRSS. In 10 instances, requests came from officials within the Department of Education (the assistant secretary, undersecretary, or a program officer within the department). In 4 other cases, a department official used the system as a way of fulfilling parts of congressional mandates or requests. Six studies were initiated by special commissions, members of advisory groups such as NCER and the National Commission on Excellence in Education, or leaders of special initiatives established by the secretary or the president. It was not possible to determine who initiated the remaining 3 FRSS reports.

Timeliness

FRSS was developed to address, in part, concerns for which existing data were not available, not current, or not national in scope. While evidence of the timeliness of all fast response surveys is incomplete, it appears that delays have gotten longer. Figure 3.1 displays the available data on the elapsed time between the completion of a survey protocol and its publication or release date. For the early years of the system, 1976-79, the publication date could be determined for only five of the nine surveys. For these, the elapsed time varied between 6 and 14 months. The data for later years, 1981 to 1986, are more complete; the elapsed time was variable, ranging from 6 to 34 months. Timeliness appears to have declined, on the average, since the late 1970's.

Interviews with center staff suggest that this indicator both overestimates and underestimates the timeliness of the FRSS products. On the one hand, since no documentation is available on when requests were made, starting the clock at the point when the survey protocol had been developed and cleared through the review process of the Federal Education Data Acquisition Council (FEDAC), CEIS, and the Office of Management and Budget (OMB) may underestimate the elapsed time by 4 to 8 months.
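
To make the indicator concrete, the following sketch computes elapsed months between a protocol-clearance date and a publication date and applies the 4 to 8 month adjustment described by center staff. The dates are hypothetical, and this is an illustration of the measure rather than the computation actually used for Figure 3.1.

```python
# Minimal sketch of the timeliness indicator discussed above: months
# elapsed between survey-protocol clearance and report publication.
# Dates are hypothetical; the 4-8 month adjustment reflects the staff
# estimate that request dates predate protocol clearance.

from datetime import date

def elapsed_months(start: date, end: date) -> int:
    """Whole months between two dates, as in the Figure 3.1 indicator."""
    return (end.year - start.year) * 12 + (end.month - start.month)

protocol_cleared = date(1982, 3, 1)   # hypothetical clearance date
report_released = date(1983, 9, 1)    # hypothetical publication date

measured = elapsed_months(protocol_cleared, report_released)
print(f"measured elapsed time: {measured} months")  # 18 months
print(f"with request-to-clearance lag: {measured + 4} to {measured + 8} months")
```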

[Figure 3.1: Elapsed time between completion of survey protocols and publication or release of FRSS reports, 1976-86. The plotted values are not legible in the scanned source.]