"My guess is that there’s considerably more slack [in the US labour market] than the unemployment number might lead you to suspect, but the truth is that I don’t know."
Timothy Lane, deputy governor of the Bank of Canada, has an estimate that agrees with Paul Krugman's guess. He says (in a talk at Carleton University):
"Chart 11: There's more labour market slack than the unemployment rate suggests in Canada and U.S."
[Now how do I copy and paste Chart 11 from a PDF into Typepad? Stephen? Please help!]
"To consolidate the information contained in the various labour market measures shown in the preceding section, we construct a labour market indicator (LMI) for both countries using a statistical technique known as principal component analysis. This technique extracts the common movement across the eight labour variables to create a simple summary measure of labour market activity. The LMI is scaled to be comparable with the unemployment rate, and thus provides a simple benchmark against which to judge whether the unemployment rate is evolving in a manner consistent with broader labour market conditions."
Now, the Bank of Canada's LMI doesn't necessarily tell us what we need to know either. But it's probably better than looking at the unemployment rate alone.
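For readers curious what that construction looks like in practice, here is a minimal sketch in Python of an LMI-style summary: standardize several labour market series and extract their first principal component. The series here are made up for illustration; the Bank's actual eight variables and weights are not reproduced.

```python
import numpy as np

# Made-up data: four "labour market" series driven by one latent
# slack factor plus noise. The Bank's real inputs differ.
rng = np.random.default_rng(0)
common = rng.normal(size=100)                      # latent "slack" factor
series = np.column_stack([common * w + rng.normal(scale=0.5, size=100)
                          for w in (1.0, 0.8, 1.2, 0.9)])

# Standardize each series to unit variance (PCA is scale-sensitive).
z = (series - series.mean(axis=0)) / series.std(axis=0)

# First principal component = eigenvector of the correlation matrix
# with the largest eigenvalue; np.linalg.eigh sorts eigenvalues ascending.
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
first_pc = eigvecs[:, -1]                          # largest eigenvalue is last
lmi = z @ first_pc                                 # the summary indicator

print(abs(np.corrcoef(lmi, common)[0, 1]) > 0.9)   # prints True: the summary
                                                   # tracks the latent factor
```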
That's all folks.
Do Zmitrowicz and Khan argue that their LMI helps to forecast inflation? I only glanced at their article, but didn't see the claim made .....
My point is that the Fed has a Dual Mandate; by law, they are supposed to worry about both inflation and unemployment. The BoC does not have such a mandate.
Posted by: SvN | September 28, 2014 at 12:08 PM
If you read the Krugman piece that you linked to, his point (which I agree strongly with) is that we need to appreciate just how uncertain our measures of "slack" (or "gaps" or "cycles" or "put your favourite term here") are. That means monetary policy needs to balance the risks of policy being too loose or too tight.
I think a key difference driving the debate these days is that people on both sides of the argument feel that the risks are strongly asymmetric. Those urging tighter policy often point to the cost of rebuilding credibility once inflation expectations rise. Those urging looser policy worry about low-demand equilibria and labour market hysteresis.
So where is the assessment of the *precision* of this new measure of slack? How far off do the authors reasonably think it could be?
Posted by: SvN | September 28, 2014 at 12:20 PM
Simon: Yep. I think the next question for the BoC to ask is: "Is LMI a better indicator of inflationary pressure than simple unemployment?" AFAIK they haven't tested it.
But there's a difference between an indicator of inflationary pressure and an unconditional forecaster of inflation. If the BoC is responding to the indicators correctly, nothing should forecast inflation at a 2 year horizon. (I did an old post once on how to get around this problem.)
I wish I understood principal component analysis better (I see PC analysis in other subjects a lot, but it is less common in economics). The way I interpret it, there is some "latent" or "implicit" variable that "explains" the co-movements in several labour market variables, and PC analysis attempts to uncover/estimate that latent variable. It is very unlikely that the unemployment rate alone would capture *all* the information about labour market slack that an inflation-targeting central bank could use.
Posted by: Nick Rowe | September 28, 2014 at 07:09 PM
I found this article from July very informative: Maclean's: Mike Moffatt: The closer you look, the weaker Canada’s job market appears.
It would seem to suggest a deeper problem than the LMI indicates.
Posted by: Ron Waller | September 28, 2014 at 08:07 PM
Ron: that's a good article by Mike. I see Mike as doing/saying the same sort of thing as the Bank of Canada, except the Bank is doing/saying it a little more formally. Mike is looking at lots of different numbers. The Bank is looking at lots of different numbers, and trying to boil them down into one number.
Posted by: Nick Rowe | September 29, 2014 at 08:39 AM
Nick,
Use the convert tool from ImageMagick: http://www.imagemagick.org/script/binary-releases.php
It can be installed on Mac, Linux, and Windows.
Please do *not* display the PDF and then take a screenshot. It will look horrible after resizing.
Posted by: Chris J | September 29, 2014 at 12:32 PM
> The way I interpret it, there is some "latent" or "implicit" variable that "explains" the co-movements in several labour market variables, and PC analysis attempts to uncover/estimate that latent variable. It is very unlikely that the unemployment rate alone would capture *all* the information about labour market slack that an inflation-targeting central bank could use.
Close, but be careful not to read "explains" as implying causation.
Principal component analysis is a formal way of taking a bunch of "messy" data and finding the linear combinations that capture the most variance. The resulting components are ordered: the first principal component captures the most variance, the second captures the second-most, and so on. The components are also orthogonal to each other, so the data can vary along any single component without affecting its value with respect to the other components.
Imagine a simple (x,y) point: the x-axis and y-axis are components. A point can move along x or along y without changing its value along the other axis. PCA takes a mess of data points and *finds* appropriate axes, so that the axes carry some meaning (ordered from biggest variation to smallest) rather than being arbitrary.
The biggest limitation of PCA is that there's no guarantee that any particular component has a physical interpretation. This is related to its being a linear analysis: it won't describe processes like the zero lower bound very well, where the relationships between variables start changing.
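A tiny numerical illustration of those two properties, variance ordering and orthogonality, using made-up two-dimensional data:

```python
import numpy as np

# Made-up correlated 2-D data for illustration.
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.7 * x + rng.normal(scale=0.3, size=200)
data = np.column_stack([x, y])
data = data - data.mean(axis=0)

# Eigen-decomposition of the covariance matrix gives the components;
# np.linalg.eigh returns eigenvalues in ascending order.
cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

pc1 = eigvecs[:, 1]   # direction of largest variance
pc2 = eigvecs[:, 0]   # second component

print(eigvals[1] > eigvals[0])    # prints True: first captures more variance
print(abs(pc1 @ pc2) < 1e-10)     # prints True: components are orthogonal
```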
Posted by: Majromax | September 29, 2014 at 02:39 PM
Majromax: thanks for that.
So if we have just 2 variables, X and Y, we construct a third variable Z, as a weighted average of X and Y, and choose the weights to minimise the sum of squared deviations of X from Z plus Y from Z? (Or am I still not quite there yet?)
Posted by: Nick Rowe | September 29, 2014 at 05:48 PM
Nick, almost there. That is the first component. But PCA maps an N-dimensional space to an N-dimensional space, so what you left out is that the second component Z' is orthogonal to your Z.
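One way to pin down what the first component Z is doing: it is the single direction that minimizes the total squared perpendicular distance from the (X, Y) points to the line. A brute-force sketch with made-up data, checked against the eigenvector answer:

```python
import numpy as np

# Made-up correlated data for illustration.
rng = np.random.default_rng(2)
X = rng.normal(size=300)
Y = 0.6 * X + rng.normal(scale=0.4, size=300)
data = np.column_stack([X, Y])
data = data - data.mean(axis=0)

def residual(theta):
    """Total squared distance from the points to the line at angle theta."""
    d = np.array([np.cos(theta), np.sin(theta)])  # unit direction
    proj = np.outer(data @ d, d)                  # projections onto the line
    return np.sum((data - proj) ** 2)

# Brute-force search over directions, versus the eigenvector answer.
thetas = np.linspace(0.0, np.pi, 1000)
best = thetas[np.argmin([residual(t) for t in thetas])]
eigvals, eigvecs = np.linalg.eigh(np.cov(data, rowvar=False))
pc1 = eigvecs[:, -1]                              # largest-eigenvalue direction
pc1_angle = np.arctan2(pc1[1], pc1[0]) % np.pi

print(abs(best - pc1_angle) < 0.01)               # prints True: they agree
```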
Posted by: Chris J | September 29, 2014 at 10:54 PM
http://en.wikipedia.org/wiki/Principal_component_analysis
The figure there shows the full PCA: the first component, as you described, as well as the second, orthogonal to the first.
Posted by: Chris J | September 29, 2014 at 10:57 PM
There is no private labor market slack in the US. As a matter of fact, if you compare white private sector unemployment, it is better than at the end of 1994, during that cycle, when national unemployment hit 5.5%.
The problem is that they have not restored public sector jobs at the state/local level, which hires hundreds of thousands of negros/hispanics in cities. This is what Krugman misses when the whole picture is put together. Overall US demand and debt to GDP is still VERY high relative to actual income. Krugman completely misses this. "Demand" isn't the problem with unemployment in the historical sense of the American Republic. The problem is the public sector's downshifted size relative to GDP, compared with its historical share.
I see US unemployment getting down to 5.6-5.7 fairly soon. But the private sector is maxed out, and any gains after that will have to be publicly created. Anything more is just another needless bubble. I think a lot of people don't realize how shale oil/fracking has brought stability to the US economy (much less Canada's) for now (including rising currencies).
Posted by: Vedicculture | September 30, 2014 at 03:23 AM
Negros? Did I wake up in the 1950s?
Posted by: Bob Smith | September 30, 2014 at 08:33 AM
Chris J: "So what you left out is the second component Z' is orthogonal to your Z."
Yep. I was still trying to get my head around the first component.
From Wiki: "It is important to note that this procedure is sensitive to the scaling of the data, and that there is no consensus as to how to best scale the data to obtain optimal results."
Aha! That's the bit I wasn't getting. Because it is a genuine problem. If we measure X in centimetres rather than metres, and/or measure Y in kilograms rather than grams, and minimise the sum of the two sums of squared deviations, that puts more weight on "explaining" the variance in X and less weight on "explaining" the variance in Y? Is that right? So presumably they adopt some scaling to convert both X and Y to a standard distribution?
Vedicculture: OK. If there is a shift in the composition of the demand for labour (whether by race or by location) I could see that causing an increase in structural unemployment.
Posted by: Nick Rowe | September 30, 2014 at 08:47 AM
From Wiki: "But if we multiply all values of the first variable by 100, then the first principal component will be almost the same as that variable, with a small contribution from the other variable, whereas the second component will be almost aligned with the second original variable. .....
One way of making the PCA less arbitrary is to use variables scaled so as to have unit variance, by standardizing the data and hence use the autocorrelation matrix instead of the autocovariance matrix as a basis for PCA."
Aha! Which is what I was saying! I am feeling chuffed!
Posted by: Nick Rowe | September 30, 2014 at 09:20 AM
> If we measure X in centimetres rather than metres, and/or measure Y in kilograms rather than grams, and minimise the sum of the two sums of squared deviations, that puts more weight on "explaining" the variance in X and less weight on "explaining" the variance in Y? Is that right? So presumably they adopt some scaling to convert both X and Y to a standard distribution?
Yes, that's very much a problem. What's a "natural" scaling for distance plus temperature?
A naïve answer would probably be to replace each datapoint with its z-score, such that the single-variable variance is already scaled out. Of course, this presumes that each variable *should* be equally important, which may not necessarily be the case.
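A quick sketch of the scaling issue, with made-up data: blowing one variable up by a factor of 100 drags the first component almost entirely toward it, while z-scoring restores the balance.

```python
import numpy as np

# Two made-up correlated variables for illustration.
rng = np.random.default_rng(3)
a = rng.normal(size=500)
b = a + rng.normal(scale=0.8, size=500)

def first_pc(data):
    """First principal component of the (centred) data's covariance."""
    data = data - data.mean(axis=0)
    _, vecs = np.linalg.eigh(np.cov(data, rowvar=False))
    return vecs[:, -1]   # eigenvector with the largest eigenvalue

raw = np.column_stack([a * 100, b])      # as if a were in "centimetres"
pc_raw = first_pc(raw)

z = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # z-score each variable
pc_std = first_pc(z)

# Unscaled: the first component is almost entirely the big variable.
print(abs(pc_raw[0]) > 0.99)                         # prints True
# Standardized: both variables get comparable weight.
print(abs(abs(pc_std[0]) - abs(pc_std[1])) < 0.05)   # prints True
```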
Posted by: Majromax | September 30, 2014 at 10:12 AM
Isn't that LMI for Canada a little scary? After five years, we're flatlining at best.
I realize that economists tend to be pretty conservative with their great power (monetary policy), but in hindsight, I hope some of you would agree that we should have driven much harder for full employment, even at the expense of higher inflation and public debt.
(In the next post, Professor Woolley refers to the cohort who reaped the benefit of having their mortgage debt wiped away by the inflation of the 1970s. Wasn't that also exactly the right medicine for an overleveraged private sector circa 2009?)
And if you're worried about an overheating housing market like the BoC seems to be, then use rhetoric and regulation to cool it (which the authorities in Canada have done, to their great credit) or turn to--wait for it--fiscal policy!
Posted by: Senator-Elect | September 30, 2014 at 06:35 PM
@Veddicculture:
> There is no private labor market slack in the US. As a matter of fact, if you compare white private sector unemployment, it is better than at the end of 1994, during that cycle, when national unemployment hit 5.5%.
I believe you're wrong here. Looking at the private sector employment ratio, current levels are well below their peaks, which were before the *2001* recession rather than the most recent one.
Below: ratios of (private - blue, public - red) sector employment to the working-age population for the US, normalized so that 100 corresponds to circa 2000. Total public sector employment is about 20% of private sector employment, so don't directly compare the differences between the two lines.
This shows evidence supporting the idea of 2002-2008 as a "jobless recovery". Additionally, we're seeing an interesting post-recession divergence between public and private employment, although the latter remains *far* below both its pre-2007 values and its peak.
Posted by: Majromax | October 01, 2014 at 12:44 PM
Nick Rowe,
http://ideas.repec.org/p/cdf/wpaper/2009-21.html
I don't know if you've seen this paper by Patrick Minford, but if not it might be interesting for you. It engages with the New Keynesianism and money/no money in macro issue that you've covered so many times on here.
Posted by: W. Peden | October 02, 2014 at 05:55 AM
WP: thanks. Yep, it looks very much related to the line of argument I've been following.
Posted by: Nick Rowe | October 02, 2014 at 07:14 AM