Cookbook econometrics has few fans.

I am one of them.

Cookbook econometrics provides clear algorithms for solving econometrics problems, without providing detailed explanations of why these algorithms work, or why specific steps in that algorithm are required.

For example, cookbook econometrics says "if you are trying to explain a variable that can take on just two possible values, for example, smoker/non-smoker, use probit," but skips the formula for the inverse cumulative distribution function of the standard normal distribution, any discussion of how probit estimates are actually calculated, and proofs of the properties of the probit estimator.
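To make the contrast concrete, here is a rough sketch - with an invented model and made-up numbers - of the machinery the cookbook skips: probit estimates are just the coefficients that maximize a likelihood built from the standard normal CDF.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
# Invented "true" latent model: smoker if 0.5 + 1.0*x + e > 0, e ~ N(0,1)
y = (0.5 + 1.0 * x + rng.normal(size=n) > 0).astype(float)
X = np.column_stack([np.ones(n), x])

def neg_loglik(beta):
    # Probit likelihood: P(y=1|x) = Phi(X @ beta), Phi = standard normal CDF
    p = np.clip(norm.cdf(X @ beta), 1e-10, 1 - 1e-10)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()

result = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print(result.x)  # estimates land near the true (0.5, 1.0)
```

A cookbook user never sees any of this; they type the probit command and read off the coefficients.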

David Giles sets out the case against cookbook methods in his blog:

*My contention is that if you've been taken through the proof, and seen the assumptions "in action", you're more likely to pay proper attention to those assumptions being satisfied when you use the result, day to day, in your empirical work.*

My position is the exact opposite: When you have struggled with empirical work, and seen econometrics "in action", you're more likely to pay proper attention to the proof, and understand the underlying assumptions.

The difference stems from contrasting views about how people learn. His position appears to be that it is possible -- indeed desirable -- to grasp and understand abstract concepts before applying them and working with them.

Yet it is impossible to know what other people are thinking. Sure, I could flash this on PowerPoint:

I could make students copy it down, memorize it and reproduce it. But how many learn more about cumulative distribution functions than they would have if I'd just drawn a picture on the blackboard and explained it with words? When students are taken through proofs, how much do they actually understand? My experience suggests that only a few students gain a deep understanding of econometrics through abstract, theoretical instruction - though only they know what they know.

Moreover, time constraints mean that only a fraction of the econometric techniques in common use (for example, calculation of marginal effects in probit or tobit, quantile regression, interval regression, survival and hazard models, panel data techniques) can be taught with any degree of rigor at the undergraduate or even the master's level.

A possible response is that people who don't understand the proofs shouldn't go out and estimate things. Economists can take this position; people in other disciplines, whose adherence to rational self-interest is more practical than theoretical, will not. After all, "*Whether you are a beginner or an experienced analyst or statistician, IBM SPSS Statistics puts the power of advanced statistical analysis in your hands.*" If economists don't provide their students with cookbooks, people will go out and hire chefs trained elsewhere.

Moreover, some recent thinking in mathematics pedagogy suggests that the concepts first approach may be misguided. John Mighton argues in "The Myth of Ability" that "The idea that it is always harmful to teach rules before concepts is not supported by the actual practice of mathematics." For example, practicing fractions - even if it feels like rote drill - builds the foundation for basic algebra.

I wonder: if "cookbook econometrics" had some other name - such as "basic training" or "data exploration" or "methods testing" or "inductive learning" - would it get a better rap? Because it is, in a sense, all of those things. A basic training in applied methods does not preclude theoretical instruction. Rather, cookbook econometrics teaches econometrics in an inductive way - do it, get some experience, and then generalize from that experience. Observations first, theory second.

But cookbooks are girly and feminine, whereas serious econometrics is "penetrated by constructive and rigorous thinking." Yet gender stereotypes blinker our vision, and stop us from seeing reality accurately.

Cookbooks power social change, by telling people how to do new things, giving people new skills and techniques.

So if you don't like the way that things are being done, go out and write a better cookbook.

Hi Frances,

We introduced a 2nd year course a few years ago that allows students to 'get their hands dirty' with statistics before they take the abstract stuff.

https://courses.students.ubc.ca/cs/main?pname=subjarea&tname=subjareas&req=5&dept=ECON&course=226&section=001

We use MS Excel.

Students have to collect and analyze some basic data.

I haven't taught this before, but my sense is that something like the Central Limit Theorem really becomes real to students if they get their hands dirty and try it out themselves. Much more so than proving it abstractly.
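Something like this minimal simulation - made-up numbers, just a sketch - is the kind of hands-dirty exercise I have in mind: draw many samples from a badly skewed distribution and watch the sample means behave normally anyway.

```python
import numpy as np

rng = np.random.default_rng(42)
# 10,000 samples of size 50 from a heavily skewed distribution (exponential, mean 1)
sample_means = rng.exponential(scale=1.0, size=(10_000, 50)).mean(axis=1)

# The CLT predicts the means are approximately N(1, 1/50),
# even though the underlying data are nothing like normal.
print(sample_means.mean())  # close to 1
print(sample_means.std())   # close to (1/50) ** 0.5, about 0.141
```

A histogram of those means looks bell-shaped even though the raw draws are wildly skewed.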

Posted by: Kevin Milligan | October 06, 2011 at 11:45 AM

In the mid-to-late 70s, we got our introduction to econometrics with Johnston's Econometric Methods, 2nd ed., after going through Blum and Rosenblatt's Probability and Statistics.

It was joked that you got an A if you could repeat the first paragraph, a B if you didn't fall asleep, and a C for showing your face in the door.

I had survived and enjoyed mathematical physics, wave functions and all the stuff in Kreyszig's Advanced Engineering Mathematics, as well as most of the Berkeley Physics series and the Feynman Lectures on Physics.

For the life of me, the only things that remain are that beta hat is a better estimator of god-knows-what (and why not gamma tugue?), and that the critical value of the Durbin-Watson is 2.16 (but must it be above or below?). I am still unsure if heteroscedastic marriage is legal in Canada.

Most drivers, even excellent ones, do not need to understand the laws of thermodynamics.

The longer I teach Introductory Macro and Micro, the more I understand my students' dejected faces. They need to see how the economy works. Not some abstract concepts, some of which we have never observed and never computed. The only demand curves I ever saw were as exercises in econometrics classes. I never saw a supply curve. Never heard of anyone computing one. Estimates of potential oil production are not done using a "supply curve".

Cookbooks are for people who want to start cooking. I enjoy cooking and cookbooks. One day, I found that I needed books like Harold McGee's

http://www.amazon.com/Food-Cooking-Science-Lore-Kitchen/dp/0684800012

But only after I cooked.

Posted by: Jacques René Giguère | October 06, 2011 at 12:24 PM

"But cookbooks are girly and feminine, whereas serious econometrics is "penetrated by constructive and rigorous thinking.""

So we're all getting screwed by econometrics?

Skilled cooks don't need a PhD in organic chemistry to make wonderful food. I don't know really anything about quantum mechanics, yet I manage to make computers do useful stuff. Besides, it's fun to see that math does indeed 'work'; that tables of numbers really can tell you something interesting about the world.

Posted by: Patrick | October 06, 2011 at 12:32 PM

I guess I was wrong, cookbook econometrics does have fans! Thanks for the comments. That's interesting about the UBC course Kevin.

Posted by: Frances Woolley | October 06, 2011 at 03:54 PM

Gotta say I'm on David Giles' side on this one. It's important to make sure that the assumptions you're willing to make about the economics don't conflict with the assumptions you're making about the econometrics.

Posted by: Stephen Gordon | October 06, 2011 at 04:39 PM

Why is there necessarily a conflict between these two approaches to teaching applied statistics? Why can't you (for instance) teach general concepts in lecture and use lab sections to give the students practice applying those concepts to data?

I also think it's important to distinguish between teaching abstract concepts, and the rigor or formality with which you teach those concepts. When I teach introductory biostatistics to biology students, I don't prove the central limit theorem, or show equations for cumulative distribution functions, or derive properties of estimators, or anything like that. But I do teach the ideas that the abstract mathematics expresses. For instance, I teach about the relative power of parametric vs. nonparametric methods as a function of whether parametric assumptions are satisfied using an analogy about searching for a lost wallet.

Posted by: Jeremy Fox | October 06, 2011 at 05:24 PM

I replied thusly to Giles' original post, here it is again with a few edits:

Proofs are nice (it was nice to see (XtX)^-1(XtY) derived as the OLS estimator), but the epistemological foundations of econometrics (particularly questions of model selection and specification searching) are far more important, and usually neglected.

Consider the first assumption: "The model is correctly specified - that is, there are no omitted or extraneous regressors; and the functional form (with respect to the variables) is correct."

The problem here is that the very notion of model X having "missing" regressors or the "correct" functional form is meaningless unless there is a "true model" with the "right" regressors and the "right" functional form to which X can be compared.

What would "the true model" be? If the data of interest were generated by a model, then that's the "true model". The problem of course is that the data of interest never come from models, they come from the world.

Therefore, when dealing with the real world, there are, a priori, no "correctly specified models" (and there aren't any 'true coefficients' either). What then, does our first assumption even mean?

I found all econometrics and econometric theory completely baffling until I realized that all theoretical discussions of the properties of estimators were in the context of known DGPs, and that in the real world there ain't no such animal. The specification search (methods for which were NEVER ONCE discussed in any econometrics class I ever took, at either the undergrad or grad level), is the crude and imperfect fitting of a complex world into an oversimplified model... and there are many possible oversimplified models to choose from.

That this point is not generally made clear from the outset is an intellectual scandal IMHO. Get your head around what you are actually doing when you do econometrics (hopefully: intellectually honest inferences from data)... step through the proofs later.

I am a cheerful and enthusiastic proponent of cookbook econometrics.

Posted by: Darren | October 06, 2011 at 05:35 PM

"all theoretical discussions of the properties of estimators were in the context of known DGPs, and that in the real world there ain't no such animal"

Important clarification: all *classical* discussions do that. Bayesian methods are much, much better suited for studying economic data.

Posted by: Stephen Gordon | October 06, 2011 at 05:44 PM

Of course, that's why I agree with David Giles' point. Are you performing repeated experiments with sample sizes that approach infinity? No?

Then why pretend you are?

Posted by: Stephen Gordon | October 06, 2011 at 05:47 PM

Stephen - the question is not "Does a complete and thorough understanding of theory help one do better econometric work?"

The question is: "Does introducing students to theoretical concepts before teaching them how to do econometric procedures produce better applied economists?"

The answer to that question is far less obvious.

Let's go back to the cookbook analogy. The theory of yeast bread is basic biology: the yeast micro-organisms feed on water and sugars and emit carbon dioxide, which causes the bread to rise.

Millions of biology students know this.

But I would bet that people who have tried to make bread, and found themselves with a glutenous leaden mass because they added water or milk straight from the fridge, are far more likely to remember that yeast like to reproduce in a nice warm environment - and are far more likely to be motivated to learn about the life cycle of yeast.

Or here's another example - people sometimes pool panel data and estimate it like a cross-section. This requires the use of clustered standard errors. I've seen papers where this wasn't done. Now people who have read Mostly Harmless may remember Keisuke Hirano's haiku:

T-stat looks too good.

Use robust standard errors--

significance gone.
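The haiku's point shows up in a small simulated example (invented data, just a sketch): when much of the error is shared within a cluster, the naive standard error badly overstates precision, and the Liang-Zeger cluster-robust correction takes the shine off the t-stat.

```python
import numpy as np

rng = np.random.default_rng(1)
G, m = 100, 20                        # invented panel: 100 clusters, 20 obs each
g = np.repeat(np.arange(G), m)
x = np.repeat(rng.normal(size=G), m)  # regressor constant within cluster
e = np.repeat(rng.normal(size=G), m) + 0.1 * rng.normal(size=G * m)
y = 1.0 * x + e                       # true slope = 1

X = np.column_stack([np.ones(G * m), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
u = y - X @ beta

# Naive OLS standard error, which pretends all 2000 observations are independent
se_naive = np.sqrt(u @ u / (len(y) - 2) * XtX_inv[1, 1])

# Cluster-robust (Liang-Zeger) standard error: sum score outer products by cluster
scores = [X[g == j].T @ u[g == j] for j in range(G)]
meat = sum(s[:, None] @ s[None, :] for s in scores)
se_cluster = np.sqrt((XtX_inv @ meat @ XtX_inv)[1, 1])

print(se_naive, se_cluster)  # the clustered SE is several times larger
```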

We agree on the ends. It's the means we differ on.

Posted by: Frances Woolley | October 06, 2011 at 06:24 PM

Frances, you seem to be very good at stirring the pot. Great post, as always. I generally am in the school that there is a time and place for both.

I received most of my econometric training from the man himself, David Giles. I loved every minute of it. I discovered econometrics after my poli sci degree (I was likely desperate for something to have an actual right answer). I loved the math, the proofs, the theory, all of it. I will also say that David teaches theory and always follows it up with applications. I better understood both the theory AND the applications when they were combined in this manner.

Now, however, I teach in a program where math seems to be verboten. I teach economics with no math, only intuition, including a public finance course. I am about to teach a quantitative course in program evaluation with no math. Much like in the economics courses I teach, I will focus on intuition (how do we determine causality). This is because they need to be able to understand the quant methods used in evaluation and why. We can do this with words. I feel a little dirty doing it, but adding in the math and proofs for this bunch would do more harm than good.

So I think it very much depends on the audience. There is a time and a place.

Posted by: Lindsay | October 06, 2011 at 06:29 PM

And on a more positive note, I really enjoyed this post by David on micronumerosity

Posted by: Frances Woolley | October 06, 2011 at 07:04 PM

"The question is: 'Does introducing students to theoretical concepts before teaching them how to do econometric procedures produce better applied economists? The answer to that question is far less obvious.'"

Here is a messy experiment: in the medical sciences and in the social sciences other than economics quantitative methods are commonly taught with much less rigor than in economics. Explanations are verbal, not mathematical. It's a hardcore cookbook approach. Which students come out of their coursework better statisticians? There is a selection bias problem here but I think that even if we could correct for that, the answer would still be very clear.

Posted by: Chris Auld | October 06, 2011 at 08:11 PM

Frances, the example of "cookbook econometrics" that you give illustrates perfectly just why it is so harmful. By far the single most important thing to get across to students when introducing probit models is why OLS is *not* appropriate when the dependent variable is binary. Sure, you don't *need* to know all of the details about maximum likelihood estimation of probit models in order to competently "use" them, but, more generally, without some serious foundations in probability and mathematical statistics, why would students have any reason to think they should bother to use anything other than OLS for every application they ever run into?

As a side note, the more practical side of me also realizes that most students earning a BA or MA (and even some earning a PhD) will literally never "do" any econometrics for the rest of their lives anyway (I find it hard to believe that people are sitting around in their community centres using econometrics to push for "social change"). Accordingly, teaching "methods" is entirely wasteful - at least with teaching "theory", you can hope to offer some "intellectual exercise" (if you care for such things). As I tell my students all the time, even monkeys can be trained to push the right buttons on a keyboard.

On another side note, I'm not sure where you got the idea to suggest that cookbooks are "girly and feminine": I generally find that my female students actually prefer the theory, while my male students just want to know the methods. Perhaps I'm being sexist here, but I would think that males are more likely to prefer getting their hands "dirty", while females are more likely to prefer keeping their hands "neat and clean".

Posted by: econometrician | October 06, 2011 at 08:29 PM

Econometrician

"without some serious foundations in probability and mathematical statistics, why would students have any reason to think they should bother to use anything other than OLS for every application they ever run into?"

Because they trust that the person who wrote the cookbook has superior knowledge to them, and that they're likely to do the right thing if clearly written instructions are available to them. No one wants to do lousy, sloppy econometrics if there are better techniques readily available.

You could give students a rigorous proof of why probit is necessary. I could draw a bunch of dots, a linear probability model, a cdf, and say "Look, the linear probability model will sometimes predict negative probabilities or probabilities greater than one. This is clearly stupid, so you're better off using probit." (Actually, you probably draw that picture too.)

I'm betting that in five years time the picture is all that they will remember.
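That picture, in a small simulated sketch (numbers invented for illustration): fit plain OLS to a 0/1 outcome and count how many fitted "probabilities" escape the unit interval.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 2000
x = rng.normal(size=n)
# Invented binary outcome: the true P(y=1|x) follows a normal CDF in x
y = (rng.uniform(size=n) < norm.cdf(2 * x)).astype(float)

# Linear probability model: plain OLS of the 0/1 outcome on x
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
fitted = X @ beta

print((fitted < 0).sum(), (fitted > 1).sum())  # both counts are positive
```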

"at least with teaching "theory", you can hope to offer some "intellectual exercise"" - thinking about how economic models can be applied and tested is a pretty good intellectual exercise - it actually gets people thinking about economic theory, what counts as an economic explanation, the nature of causality - a whole load of deep and abstract things. And because time is limited it is sometimes simply impossible to explain the why behind everything.

On cookbooks being girly and feminine - Can we agree that cooking and kitchens are traditionally associated with women? And that a way to insult a man is to depreciate his masculinity e.g. call him an f_g? Hence "cookbook econometrics" is insulting by nature of its association with femininity, lack of toughness, strength and rigor?

Posted by: Frances Woolley | October 06, 2011 at 09:52 PM

At my department, most PhD students already do something similar. We have to take two courses in econometrics (half of Greene's book per course), so we do – but then we go to the biostatistics department to really learn (and then, afterwards, you might go back to Greene and actually enjoy it).

Posted by: Nemi | October 07, 2011 at 01:54 AM

Sorry - I was supposed to start the last comment by quoting:

"If economists don't provide their students with cookbooks, people will go out and hire chefs trained elsewhere."

Posted by: Nemi | October 07, 2011 at 01:56 AM

Stephen Gordon: "Bayesian methods are much, much better suited for studying economic data."

In that case, it's rather unfortunate that these methods are usually left completely out of the economics curriculum.

----------

econometrician: "As a side note, the more practical side of me also realizes that most students earning a BA or MA (and even some earning a PhD) will literally never "do" any econometrics for the rest of their lives anyway (I find it hard to believe that people are sitting around in their community centres using econometrics to push for "social change"). Accordingly, teaching "methods" is entirely wasteful - at least with teaching "theory", you can hope to offer some "intellectual exercise" "

I find this bizarre. "It doesn't matter, since our students will never do any econometrics anyway" is not exactly a spirited defense of the current curriculum.

I would suggest that if "intellectual exercise" is the goal, there are many, many, many subjects that are far more worthy of a person's precious time on this earth than econometric theory (a subject, which, divorced from practice, is likely the most tedious and sterile field of intellectual inquiry ever devised).

"Teaching method is entirely wasteful"? What? You want to teach econometrics without discussing problem articulation, data cleaning, and specification searching?

Wow. Just.... wow.

Posted by: Darren | October 07, 2011 at 12:28 PM

To be effective, cookbook econometrics should use well-prepared data sets with known properties. There is an equivalent in physics - laboratory experiments. Before starting actual measurements, students need to learn some basic statistics (i.e. econometrics) and then apply it to the measured data. The trick is that all experiments are extremely well controlled. As a rule, any deviation from the original conditions can be easily found by statistics.

In econometrics, actual data lack simplicity, and thus a cookbook might not give the desired results. (One may use synthetic data, however.) Moreover, physical experiments have to confirm known facts, with the measurements merging with all previous sets of similar data. Statistics in economics, dirty or not, is usually uncertain. This reduces the practicality of (cookbook) econometrics.
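For instance, a minimal synthetic-data exercise (parameters invented for illustration): generate data from a known DGP and verify that the cookbook OLS recipe recovers the known coefficients.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=n)  # known DGP: intercept 2, slope 3

# The cookbook recipe: regress y on x and a constant
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta)  # recovers roughly (2.0, 3.0)
```

With real data there is no such known benchmark to check against, which is exactly the limitation described above.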

Posted by: kio | October 07, 2011 at 04:46 PM

Except that almost all of our data are non-experimental.

Posted by: Stephen Gordon | October 07, 2011 at 05:16 PM

kio: "In econometrics, actual data lack simplicity and thus cookbook might not give desired results."

Actual data don't follow the assumptions of the standard OLS model, and thus a straightforward application of theoretical models to actual data might not give desired results either.

A good cookbook will always specify under which conditions a given recipe will work. E.g. my pesto recipe says "begin with 2 cups fresh basil." My roast chicken recipe specifies that the chicken be given certain types of preparation, innards removed, cavity rinsed, etc.

Now as an experienced chef, I know that I can substitute parsley for up to 1/2 cup of the basil, and the result will still be palatable. I know I can skip warming the chicken to room temperature if I add a bit to the cooking time.

Indeed, an absolutely crucial part of econometrics is diagnosis - figuring out which solution to use when. Some general theoretical principles help with diagnosis e.g. what are you assuming about the error term? Where does the error in the data come from? What is the source of the variance? What are your identifying assumptions? What is your population? But a lot of the time it comes down to "O.k., this is my problem, what is a good solution - tell me what I need to do, don't bother to explain precisely the mechanics behind why it works."

I wonder if econometrics "cookbooks" would have more prestige if they were called "diagnostic manuals"?

Posted by: Frances Woolley | October 07, 2011 at 05:21 PM

Frances:

> thinking about how economic models can be applied and tested is a pretty good intellectual exercise

I totally agree. I guess that I am interpreting "cookbook econometrics" as something akin to "load Stata, click button x, if number y is bigger than z, reject".

Darren:

I said only that, if students never "do" econometrics later in life, then teaching methods (as in learning how to follow recipes like the one I give above), is entirely wasteful. Teaching students how to "think like" an econometrician/statistician is perhaps one of the most useful intellectual exercises I could think of. To paraphrase Joan Robinson, "the purpose of studying econometrics is not to acquire a set of ready-made answers to econometric questions, but to learn how to avoid being deceived by econometricians (or other people using econometrics)."

Posted by: econometrician | October 07, 2011 at 08:37 PM

econometrician - "I guess that I am interpreting "cookbook econometrics", as something akin to "load Stata, click button x, if number y is bigger z, reject""

That doesn't sound like the work of someone with a cookbook, that sounds like someone who's just making things up as they go along without the benefit of a decent recipe.

Posted by: Frances Woolley | October 07, 2011 at 10:17 PM

Frances:

You'd be surprised! The students I get (who are at the upper-level undergraduate and graduate levels) tell me that this is more-or-less exactly the kind of thing they get from their "statistics for economics"/"regression for business" courses. In a nutshell, they just memorize some equations and learn how to push buttons on their calculators. They have, for example, no idea what a p-value means, but they know that, when it is less than 0.05, it's time to "reject H0".

Posted by: econometrician | October 07, 2011 at 11:47 PM

econometrician:

"They have, for example, no idea what a p-value means, but they know that, when it is less than 0.05, it's time to "reject H0"."

There is a world of difference between what a professor has attempted to impart, and what ends up in students' brains six months after completing the course.

Just because the students have no idea what a p-value means does not mean that they were not provided with an explanation, perhaps even a lucid, correct, or perhaps even a brilliant explanation of the concept. It could be that, even provided with a brilliant explanation, they did not understand it, or that the concept was not deemed worthy of storage space in long-term memory.

After all, doing applied econometrics is very much a post-modern type of exercise: there is no truth, only truths.

Suppose, for example, I am attempting to explain the quantity of leafy green vegetables demanded. Theoretically, both income and education should be related to consumption of leafy greens. But income and education are strongly correlated. Is it possible to actually know how much income, as opposed to education, affects leafy green consumption? Perhaps you could give me some method of separating out the two, which would depend upon making a whole bunch of other assumptions, that may or may not be valid given the data at hand. I'm frequently sceptical of these things ("The effect is positive, but when we instrument education with grandmother's eye colour it becomes negative.")

All you can do, really, is be careful, explain your methods, and let your results speak for themselves.

And while doing that, you don't actually need to explain what a p-value is. It is what it is - the reader can make up his or her own mind whether or not to believe your results.

At this point, I very much doubt that I could come up with an explanation of a p-value that would satisfy your exacting standards. When I'm writing about data, I don't ever need to - I just say "the coefficient is statistically significant".

Indeed, one of the challenges I came across when I was taking students through the impact-of-height-on-income results (see my post on 9 steps to cleaner data) was trying to convey to them "look, everything is statistically significant, but don't get too excited about that; when you have 80,000 observations just about anything is going to be statistically significant." They were also a bit puzzled about how the choice of base case affects the significance of dummy variables: "The marital status dummies aren't significant, but that's just because of my choice of base case - if I'd chosen maritalstatus2 as my base case, maritalstatus3 would be statistically significant."
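The first of those points can be shown in a small simulated sketch (numbers invented): with 80,000 observations, even an economically trivial slope produces a t-statistic comfortably past any conventional critical value.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 80_000
x = rng.normal(size=n)
y = 0.02 * x + rng.normal(size=n)  # invented: economically trivial slope of 0.02

# OLS slope and its conventional standard error
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
u = y - X @ beta
se = np.sqrt(u @ u / (n - 2) * XtX_inv[1, 1])
print(beta[1] / se)  # t-statistic well past 1.96, despite the tiny effect
```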

Posted by: Frances Woolley | October 08, 2011 at 09:39 AM

Frances: Nice post, as always. Some fascinating comments, too.

For the record:

at no point did I suggest that the econometric theory should precede getting one's hands dirty with real data.

I don't believe that's desirable, and I've never suggested that it is - ever. I guess some of the comments were based on not reading my post. Empirical work is essential to the learning of econometrics. Personally, I'd refuse to teach an econometrics course that did not include hands-on computing lab. classes with decent data-sets.

In my own case, I learned some basic econometrics - from an excellent teacher; spent some time getting my hands extremely dirty working on the construction of "large" econometric models for a central bank; and by then I was highly motivated to get into econometrics seriously.

Motivation and intuition are essential to the whole learning process, and I personally teach my own courses that way - whether "classical" or Bayesian - I teach both approaches - my PhD was in Bayesian econometrics.

The one thing I always tell my students in the first class of a new course is that part of my job is to teach them to be VERY skeptical of the econometrics they'll read about in their textbooks (which are NEVER of the cook book variety ;-)). If I don't succeed, then I've failed the course! That's precisely why they need to know the theory - if not, they'll end up thinking that it's actually O.K. to use the techniques they've learned about without questioning if the conditions necessary for their correct application are actually satisfied.

Re-reading that post, I think my main point is pretty clear. If you're going to teach a course that purports to be "Econometrics", then do it properly. If you want to teach a course that's a non-technical introduction to some quantitative methods for economists, then just call it that - unashamedly - but please don't call it something it's not. I'd make the same point about courses in any other area of economics, for that matter.

Frankly, when we dispense with the myth that I think that the theory should be taught before one is "let loose" on the data, I don't think we're really too far apart on most of this.

Posted by: Dave Giles | October 10, 2011 at 11:53 PM

Dave, thanks for taking the time to respond - b.t.w., I just told one of my econometrician colleagues about your blog, and Chris Auld's, and suggested that they point their grad students to them. When I understand your posts, I really enjoy them!

I guess we differ in a couple of respects.

The first is the nature of knowledge, and the possibility of knowing theory. A person with good spatial sense will be able to "see" orthogonality in three dimensions, a person without good spatial sense won't. Econometrics is advancing so rapidly that sooner or later most people come to a point where either (a) they start running into the limits of their conceptual abilities, i.e. to visualize multi-dimensional spaces or (b) there's not enough time to thoroughly learn the theory underlying the methods that they are using.

After all, what does it mean to know the theory? E.g. one can know the equation that is being estimated in a logit model, or one can know the algorithm that Stata uses to estimate that model, or one can know how Stata is programmed, or one can know how the computer operates. (Your post on the problems with spreadsheets pointed out the importance of knowing program algorithms). At some point one ends up doing stuff without really knowing what is going on underneath.

Does not knowing theory matter?

You can't know everything. The post I'm working on right now will make the argument that sample selection and choice of explanatory variables often matter much more than such things as logit v. probit v. linear probability models. So one has to ask: where is the value of the latest hour spent in research design the greatest - in learning the theory behind the econometrics, or reading the questionnaire, and figuring out exactly how each question was framed?

My view is that given the limits to knowledge, at some point most applied economists end up just asking someone more knowledgeable "what's the best thing to do in this situation."

And this brings me to the second point where, I think, we differ: how we perceive cookbooks.

Cookbooks like those written by Mark Bittman are all about technique - here is a method for stuffing vegetables, these are steps or procedures that you should follow, this is when the method can and cannot be applied.

Not "non-technical introductions." Just a different emphasis: read the questionnaire, look at your data, figure out how the questions were framed, how to treat missing observations/outliers, refine your research question, think about what your sample should be, etc. Very technical. Just not econometric theory-type technical.

But as important as, if not more important than, econometric theory techniques.

Hopefully you saw Lindsay's nice comments about your econometrics course and defence of your approach?

Posted by: Frances Woolley | October 11, 2011 at 08:12 AM

Frances: Fair enough. Actually, I don't think we really differ on the second point at all. I have no problem with "....read the questionnaire, look at your data, figure out how the questions were framed, how to treat missing observations/outliers, refine your research question, think about what your sample should be, etc. Very technical. Just not econometric theory-type technical."

My gripe is with students whose only prior exposure to "econometrics" when they enter a grad. program is some badly conceived undergrad. course in which they've basically been told, "this is the formula - go off and use it"; or "if p < 5% then reject the null hypothesis"; etc.

That's what I mean by a "cookbook course". And believe me, there are lots of these courses and such students around. Grrrrrr!!!

What you're describing in the above quote is just great! I wish all of my students had been exposed to something like that before taking a theory course - at the undergrad. level. In this respect, I think our wishes and objectives are fully complementary.

Looking forward to your next post.

Posted by: David Giles | October 11, 2011 at 11:45 AM