The Survey of Labour and Income Dynamics died this morning.
The notice was given quietly by Statistics Canada: “This is the last release of longitudinal data from the Survey of Labour and Income Dynamics. Effective with next year’s release of 2011 data, only cross-sectional estimates will be available.”
A short, simple, and slightly obtuse statement of a profound change for the user community and Canadians in general.
When I recently described the loss of a similar survey to a co-author over the telephone, she paused and said with sadness, “Ahhh…,” as if a friend had died.
There is no doubt that Statistics Canada also recognizes the value of this survey, and others like it. But there are important challenges in managing the information derived from so-called “longitudinal surveys”, and Canadians might be wondering whether they are being sold short.
The Statistics Canada website describes the value of this recent casualty of budget cutting in this way:
The Survey of Labour and Income Dynamics (SLID) complements traditional survey data on labour market activity and income with an additional dimension: the changes experienced by individuals over time. At the heart of the survey’s objectives is the understanding of the economic well-being of Canadians: what economic shifts do individuals and families live through, and how does it vary with changes in their paid work, family make-up, receipt of government transfers or other factors? The survey’s longitudinal dimension makes it possible to see such concurrent and often related events.
By “traditional survey data” the Agency is referring to what is commonly called “cross-sectional data”, information on individuals at a particular point in time.
The Labour Force Survey is a case in point, a monthly household survey addressing employment-related activities. Cross-sectional surveys may be regularly repeated. Since the main purpose of the Labour Force Survey is to track overall developments in the jobs market, the same set of questions is asked each month of a different, yet representative, group of Canadians.
The individual responses are added up to estimate the total number of Canadians employed and unemployed in a particular month, as well as the employment and unemployment rates.
This yields another type of data that we are familiar with: “time series data,” a consistent series of statistics that map the evolution of an aspect of the economy or society as a whole. The monthly unemployment rate is an example, but there are many others: the inflation rate, retail sales, and even estimates of the Gross Domestic Product are all based upon regularly repeated cross-sectional surveys of individuals or establishments.
Time series data are the bread-and-butter of national statistical agencies, and the principal reason they undertake individual-level surveys. The managerial structure of these agencies has historically been optimized for the design, production, and dissemination of these types of data. This is the stuff of GDP, GNP, inflation, and unemployment: the “National Accounts” charting the macro-economics of a country.
But over the years it became clear to outside researchers that cross-sectional surveys are valuable not just as a way of calculating these macro-economic data; they have value in their own right.
The responses individuals give to a series of questions about their background characteristics allow researchers to examine the underlying correlates of important outcomes.
How much of the observed difference in male-female wage rates is due to underlying differences in education or work experience, as opposed to the same characteristics being valued differently for men and women? Knowing the answer gives us a sense of the degree to which there is wage discrimination.
Or similarly, what are the reasons recent immigrants have lower wages than their counterparts decades ago: is it language skills, education levels, or intangibles associated with the changing mix in countries of origin?
Statistics Canada slowly accommodated these needs. In the mid 1980s it started giving researchers access to the individual responses to cross-sectional data, always in a way that respected confidentiality.
But as valuable as this information proved to be, it was immediately apparent that it could not fully explain how Canadians live their lives.
Observing someone at a point in time cannot tell us how long they will be unemployed, or poor, or rich; what caused the spell to start, and what caused it to end.
Knowing that the average monthly unemployment rate during a particular year is, for example, 10% does not tell us whether the same individuals are unemployed each and every month of the year, experiencing very long spells of unemployment, or whether a new group of unemployed starts a very short spell each month. The same is true for poverty, or, for that matter, high income.
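The point can be made concrete with a small numerical sketch (entirely hypothetical numbers, not SLID or Labour Force Survey data): two imaginary economies of ten people, each with a 10% unemployment rate every month, but with completely different individual experiences that only a longitudinal view can distinguish.

```python
# Hypothetical illustration: two economies with identical monthly
# unemployment rates but very different individual dynamics.
# Each row is a person, each column a month; 1 = unemployed.

N_PEOPLE, N_MONTHS = 10, 10

# Economy A: person 0 is unemployed every month (one 10-month spell).
econ_a = [[1] * N_MONTHS] + [[0] * N_MONTHS for _ in range(N_PEOPLE - 1)]

# Economy B: person m is unemployed only in month m (ten 1-month spells).
econ_b = [[1 if m == p else 0 for m in range(N_MONTHS)]
          for p in range(N_PEOPLE)]

def monthly_rates(panel):
    """Cross-sectional view: the share unemployed in each month."""
    return [sum(row[m] for row in panel) / len(panel)
            for m in range(N_MONTHS)]

def spell_lengths(panel):
    """Longitudinal view: the length of each unemployment spell."""
    spells = []
    for row in panel:
        run = 0
        for x in row + [0]:  # trailing 0 closes any open spell
            if x:
                run += 1
            elif run:
                spells.append(run)
                run = 0
    return spells

# The cross-sectional pictures are identical: 10% every month...
print(monthly_rates(econ_a) == monthly_rates(econ_b))  # True
# ...but only by following individuals do the dynamics appear.
print(spell_lengths(econ_a))  # [10]: one person, one long spell
print(spell_lengths(econ_b))  # [1, 1, ..., 1]: ten short spells
```

A repeated cross-sectional survey can only produce the first comparison; a panel that tracks the same respondents is needed for the second.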
This is the “additional dimension” that Statistics Canada is suggesting is valuable, the need to follow the same individuals through time.
The SLID is an example of what is called a “longitudinal survey”, a series of questionnaires posed to the same individuals over successive periods of time. These types of surveys allow analysts to examine social and economic dynamics over time: over weeks, over months, years, and even over generations depending upon their design and purpose.
The National Longitudinal Survey of Children and Youth, which followed successive cohorts of children beginning in 1994, is another example. But this is also another death in the family. The fact that the last year of available information from this survey is for 2009 brought on my co-author’s lament.
Statistics Canada is proposing to stop the longitudinal dimension of SLID, but to continue using parts of the questionnaire as a cross-sectional survey. As such it will retain the capacity to calculate poverty rates, but only a much more limited capacity to explain how long poverty spells last, what causes them to start or end, and how many people experience income losses or gains.
Innovative as longitudinal surveys sound, Statistics Canada was actually late to this party, and in a sense always early to leave.
It was late because it is a very cautious and careful statistical agency, incrementally improving its portfolio rather than drastically changing course. In part that is because of history, but also because of an ethic and culture of quality.
Indeed, when the SLID began in the mid 1990s it was only after more than a decade of experience with successively more complex surveys. A series of surveys called the Survey of Annual Work Patterns in the late 1970s and early 1980s followed the labour market activities of individuals over the course of a particular year; then the Labour Market Activity Survey followed individuals for periods of two or three years; and then finally the Survey of Labour and Income Dynamics. But even this survey had a limited horizon, following individuals for a maximum of six years.
Though the delay in launching truly longitudinal surveys did not serve the research community and Canadians well, it should be understandable as it reflected the need to develop skills and change an institutional culture.
But unfortunately this same caution in starting surveys is not matched in discontinuing them. This clearly does not serve the community well since longitudinal surveys increase exponentially in value the longer they are in the field, and many will likely find it to be less understandable, particularly in light of the experience elsewhere.
Consider the United Kingdom.
About 17,500 babies were born during a particular week in March of 1958 in England, Scotland, and Wales. They were part of a health survey designed to examine the factors associated with stillbirth and death in early infancy. But we know a good deal more about them.
They were surveyed again at the age of seven in 1965, and again in 1969, again in 1974 and 1981, and yet again in 1991 and in 2000, 2004, and 2008, and even now there are plans to survey them in 2013 when they turn 55 years old. These surveys are known as the National Child Development Study, or simply the NCDS.
They have inspired the popular documentaries of the Up Series, but even more importantly they have inspired successive waves of surveys on other cohorts of children: a group born in a particular week in 1970 called the British Cohort Study; and another group born in 2000-01 called the Millennium Cohort Study.
The German Socio-Economic Panel began in 1984 with a representative sample of about 12,000 individuals. They have been surveyed every year, for what is now more than 25 consecutive years.
Consider the United States.
The granddaddy of all longitudinal surveys is the Panel Study of Income Dynamics—only one of many, many US longitudinal surveys—but whose website proudly proclaims it as “the longest running longitudinal household survey in the world.”
The PSID, as it is affectionately called, began in 1968 as an outgrowth of Lyndon Johnson’s “War on Poverty”. It was based on about 18,000 individuals who have been followed on an annual or biennial basis ever since. Not only that, the children, and even now the grandchildren, of the original cohort are followed once they become old enough to leave home and form their own households.
Alas the NLSCY is dead, and now the longitudinal part of SLID is also dead. But true to form Statistics Canada has for years been planning the next longitudinal labour market survey, which has already been tested, and put into the field. It appears that it will fill some of the gaps left by the re-design of the SLID.
It is also claimed that this Longitudinal and International Study of Adults (why “international”? Who knows, but the acronym, LISA, sounds great) will be truly longitudinal, extending beyond the limited six-year horizon of the SLID.
The Statistics Canada website describes it, and its value, in a familiar tone:
The Longitudinal and International Study of Adults aims to improve our understanding of what is happening in the lives of Canadians so we can see what services are suitable for them, and what kinds of information they need to support their decision making about today and the future… [The questions this survey will answer include, among others:] How do people’s standards of living change, as they move into and out of work, relationships and parenthood, or retirement?
It is a major feat of management and organization to put a survey of this sort into the field, and sustain it over decades. The capacity to run longitudinal surveys is now part of Statistics Canada’s skill set. It can start these surveys, but can it keep them going?
The managers and mathematicians at Statistics Canada are surely as capable and energetic as their British, German, and American counterparts, but what do these countries have in common that has led to such longevity, and that Canada may lack?
Here are some hints culled from the respective websites:
- “The NCDS is managed by Centre for Longitudinal Studies and funded by the Economic and Social Research Council.”
- “From 1990 to 2002, [the German Socio-Economic Panel] was funded through the German National Science Foundation, … [It] now receives continued funding through the Joint Science Conference …”
- “The PSID is directed by faculty at the University of Michigan, and the data are available on this website without cost to researchers and analysts.”
The surveys are housed and managed by independent research institutes like the German Institute for Economic Research (DIW), or by research institutes affiliated with universities like the University of London, or the University of Michigan.
The financing comes not from particular government departments, but from established agencies responsible for the social sciences, the equivalent of Canada’s Social Sciences and Humanities Research Council.
In all cases the research community has independent control over not just developing content, but managing the surveys and disseminating the data. Government departments and national statistical agencies are not directly involved, or, if they are, their involvement is hands-off and once removed.
Further, the data are disseminated easily and widely, creating a broad constituency of users across disciplines, and between public and private sectors.
These agencies, institutes, and the user community all have an interest in the long-term.
These examples seem to suggest that to be successful, longitudinal surveys need not only competent management in a statistical sense, but also a managerial structure that matches their long-term horizon.
At Statistics Canada funding is annual, subject to the trade-offs in managing a whole portfolio of statistical products. It is also dependent upon financial support and direction from particular government departments whose interests and priorities ebb and flow, and are tied to broader government objectives.
Statistics Canada is responsible for managing the data, as well as its dissemination. This implies that surveys are released to researchers with considerable red tape and bureaucracy, since any slip in confidentiality could bring other parts of the statistical system into disrepute.
In a recent interview the current Chief Statistician of Australia, Brian Pink, made a revealing and important comment: “Neither the Treasurer nor Prime Minister can tell me how to go about my business. They can tell me what information to collect, but they can’t tell me how to do it, when to do it or how often to do it.”
But it is telling that the Australian longitudinal labour market survey—The Household, Income and Labour Dynamics in Australia Survey—which was started in 2001 and has guaranteed funding for 12 years, is not being run by the Australian Bureau of Statistics but rather by an institute at the University of Melbourne.
The current Chief Statistician of Canada is in a more challenging position. He also has the responsibility to manage surveys that form no part of Mr. Pink’s mandate, surveys whose value is in the long-term, much longer than a fiscal year, and even longer than an electoral cycle.
As Canadians embark on another experiment in longitudinal survey taking they should have confidence that Statistics Canada will design and manage the technical details in an efficient, effective, and indeed innovative way; but past experience, both here and abroad, may also make them wonder whether the managerial structure and financial responsibility are designed to match the long-term horizon these data require.
[Update, June 22nd 2012: I just had an informal discussion with a senior official from the University of Melbourne who deals with the HILDA, and was told that the financing of this survey is not as firm as their website suggests. The inference I made in this post is likely not accurate. Multi-year agreements with the government must be negotiated, but it is the case that support and commitment of the University has also been necessary.]