Intro. [Recording date: February 1st, 2024.]
Russ Roberts: Today is February 1st, 2024, and my guest is economist and author Jeremy Weber. He is the author of Statistics for Public Policy: A Practical Guide to Being Mostly Right or At Least Respectably Wrong, which is the topic of our conversation. Jeremy, welcome to EconTalk.
Jeremy Weber: Thanks so much for having me. It’s a privilege.
Russ Roberts: How did you come to write this book?
Jeremy Weber: The book was in development in my head for probably more than a decade. It began after I spent four years working in the federal government, in a federal statistical agency, the Economic Research Service. And, that was a great place to be as a recent econ Ph.D. grad. And, it was a mix of more academic research, very policy-oriented research, and generating real official federal government statistics, interacting with policy people.
Then I went into academia to teach statistics to policy students. And, with the book I was using and the course that I inherited, very quickly I had the feeling I was more or less wasting students’ time, or at the very least that there were huge gaps such that when they left my class, they weren’t going to be prepared to use any of this to help anyone in a practical setting.
And, from that point on, I started to accumulate notes on things that, if I were to write a book, I would want to include and things that I was now using to complement the statistics textbook to give my students more.
And then, in 2019, I spent a year and a half at the Council of Economic Advisers [CEA], and that was like an accelerator for this whole idea. Because being engrossed in that environment gave me many examples, many ideas. And then when I came back to the University of Pittsburgh and had a sabbatical, I said: I’ve got to write this.
Russ Roberts: What is its purpose and who is the audience?
Jeremy Weber: Yeah. I’ll start with the audience.
The audience is broad, because frankly, whether it’s your first statistics class or your fifth, many of the issues are the same and neither the intro nor the advanced tends to do some things well. In particular, the communication of statistics to a non-academic audience, the integration of context and purpose of the moment or of the organization or of the audience into what you’re presenting–its significance for the situation at hand–we tend to not do that well, I think, at the undergraduate level. Or for Ph.D.s who are in their fifth year of econometrics. So it’s–the audience is broad.
Russ Roberts: So, it’s a very short book. There are a couple of equations, but they’re there as kind of illustrations. And, what is spectacular about the book, I would say–and I would recommend it to non-technical readers–what is very powerfully and well done about the book is giving the reader who is not an econometrics grad student a very clear, basic understanding of terms that you’ve heard all the time out in the world from journalists and occasionally a website you might visit that highlights academic research.
So, you’ll learn what a standard error is, you’ll learn what a confidence interval is. But, it’s not a statistics textbook in that sense.
However, those–that jargon–and other concepts that are used widely in statistics are very intimidating, I think, for non-academics.
And your book does an excellent job of making them accessible.
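[As an illustration of the two terms just mentioned–this is not an example from the book–here is a minimal sketch of computing a standard error and a rough 95% confidence interval for a sample mean, using only Python’s standard library and made-up numbers:]

```python
# Illustrative sketch with made-up data: the standard error of a sample mean,
# and a 95% confidence interval built from it.
import math
import statistics

sample = [12.1, 9.8, 11.4, 10.2, 13.0, 10.9, 11.7, 9.5]

mean = statistics.mean(sample)

# Standard error: the sample standard deviation divided by sqrt(n).
# It estimates how much the sample mean would bounce around across
# repeated samples of the same size.
se = statistics.stdev(sample) / math.sqrt(len(sample))

# 95% confidence interval using the normal approximation (1.96 standard
# errors on either side of the mean; a t critical value would be more
# precise for a sample this small).
ci_low, ci_high = mean - 1.96 * se, mean + 1.96 * se

print(f"mean={mean:.2f}, se={se:.2f}, 95% CI=({ci_low:.2f}, {ci_high:.2f})")
```

[The interval is the range of plausible values for the true mean given the sample; the standard error is what sets its width.]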
And then, of course, it goes well beyond that. You’re trying to give people the flavor of how to use these concepts, to use data that’s produced in a wide range of applications, from the calculation of means and correlations up through more sophisticated regression results and statistical analysis. You’re going to give people insights into how to use them thoughtfully.
And, as you point out, no one teaches you how to do that in graduate school, or in undergraduate if you take statistics. It’s taught more as, I would say, a cooking class. You learn to add certain ingredients together. If you want to make a cake, you need flour and you need eggs and you need this and a certain amount of heat. Whether it’s going to be a good cake or not is a different question. Whether that cake belongs with a certain kind of meal or a different meal–those are the things that practitioners learn if they’re lucky. But, you’re not taught those things.
And certainly people who don’t go to graduate school or don’t take a number of statistics classes in college will never, ever have any idea about it. So, I just want to recommend the book. If those kinds of ideas appeal to you, you’ll enjoy this book and it will be useful to you. Is that a fair assessment?
Jeremy Weber: That’s a very fair assessment. You use the cooking example. I allude to kind of a vocational example in the book, where our statistics education, I would say, shows you: Here is the saw. And: Here are the parts of the saw. And, maybe we even start it. And then we put it down and we move on to another tool. Or, maybe we work with 10 different types of souped-up chainsaws, really sophisticated chainsaws. But again, we’re just saying: these are the features and parts of the chainsaw.
Actually going out and cutting down trees–we don’t do that. We know people do that, but we’re not doing it.
And, that’s a bit of the gap I’m trying to fill.
Russ Roberts: And, the more standard metaphor you also use is the hammer. And, we may come back to this, but of course the standard, clichéd condemnation of mindless statistical education is: Once you have a hammer, everything looks like a nail. And, it’s really fun to run regressions and do statistical analyses once you understand how basic statistical packages work, without wondering whether it’s a good idea, what’s the implication of the analysis, how reliable is it, and does it answer questions as opposed to just providing ammunition for various armies in the policy battle?
And I think for me, that’s one of my concerns. We’ll come back and talk about it later, I hope, in terms of how we should think about education in the practice of statistics. But, it’s such a fun tool. It’s a lot more fun than a hammer. It is more like a chainsaw. It’s noisy and attracts attention and people like to cut down trees. So, there is a certain danger to it that your book highlights–in a very polite way–but, I think there’s a danger to it. You can respond to that.
Jeremy Weber: Yeah. It is fun until it’s not.
And, when it’s not is when you are using this regression tool and you’ve maybe used it with the academic crowd; and that was fun. But then, you go to another crowd–the City Council crowd or some sort of more non-academic crowd–and you present it; and suddenly it’s not fun because nobody knows what you’re talking about and the conversation quickly moves on and you feel, like, out of place. Fish out of water. You’ve miscommunicated. People are confused. And now they’re ignoring you.
Russ Roberts: But of course, the flip side also occurs, right? The scientist in the white coat. And, in this case it’s the economist or policy analyst armed with Greek letters in their appendix. At least in their paper if not their physical one.
And, there’s an awe of these kinds of people: ‘And, obviously they’re smarter than I am and obviously they’re experts. Maybe I’m overly pessimistic here.’
A lot of times I feel like in those settings outside of academic life, there’s a lot of trust in the reliability of numbers produced with what I would call standard practice. And, once you follow the rules of standard practice–which means statistical significance, confidence intervals, and so on–and you frame your work with those footnotes, then you’re credible.
And just simply because you’re in the arena and you’ve been trained accordingly, you’re a bit of a shaman. And, I think that’s a little bit dangerous.
As is the opposite: ‘Well, they’re obviously wrong. They are a bunch of academic eggheads and they don’t know what they’re talking about.’ So, I think there’s an interesting challenge there, I think, when we go out into the world.
Jeremy Weber: Yeah. You’re right. In certain environments there’s that deference, that credibility conferred because of the mathiness, because of the training, the aura. I agree: That is a case that does happen in certain environments.
Russ Roberts: Now, I argued in a recent episode that statistical analysis is used more for weaponry than truth-seeking in the political process. And, I think it was misunderstood by some listeners. I think it’s very useful to politicians and policy players to have data and numbers. But I don’t think they’re so interested in the truth, and I wonder how your book would be perceived by them.
Jeremy Weber: Yeah. I agree with your assessment. Primarily weaponry, especially in the D.C. [Washington, D.C.] area.
But, if the weapons being picked up are real, well-understood measurements that accurately reflect an issue–they may not reflect its full scope; they’re being used selectively–but if there are good measurements out there and competing parties fighting, each party is going to pick up the most effective weapon, the one that most appeals to the audience out there.
And so, if there are, in a way, better weapons out there that can be picked up, I think you have a greater tendency for some major problems to be avoided or opportunities to be pursued.
And I’ll give you a concrete example. When I was in the White House, the Commerce Department was petitioned by some uranium mining producers for protection. They didn’t want uranium imported into the United States. The Commerce Department conducted an investigation and did a report on the issue. They did their own survey. They presented some statistics in this report, which went to the President, recommending restrictions on imports. Okay?
So, Commerce Department, they’ve got their weapon. All right? And, CEA got involved–
Russ Roberts: The Council of Economic Advisers–
Jeremy Weber: That’s right. Council of Economic Advisers got involved.
I grabbed some other data. I did some analysis. I generated, you could say, another weapon that I thought was actually a better depiction or reflection of the economic reality and what was likely to happen under the Commerce [Department of Commerce] proposal.
All right. So, we got together–Commerce, other agencies–in the room, and in a way we had our battle. We picked up our weapons. And I think, at the end of the day, we ended up at a better place because I was able to pick up a weapon and there was this back-and-forth with the data. But had CEA not been there, or had the reports I relied on from the Energy Information Administration not been there, everybody would have bowed down to Commerce; they would have rolled right through, and the President would have said: we’ve got to restrict uranium imports so we can prop up these several producers out in Utah–at the expense of the nuclear power industry and electricity consumers. [More to come, 14:10]