Intro. [Recording date: February 20, 2024.]
Russ Roberts: Today is February 20th, 2024, and my guest is author and technology expert Azeem Azhar. He is the author of Exponential: Order and Chaos in an Age of Accelerating Technology, which is our general topic for today along with what is coming next. His Substack is Exponential View.
Azeem, welcome to EconTalk.
Azeem Azhar: Russ, it’s really great to be on the show. Thank you.
Russ Roberts: What is your background? What have you done with yourself besides write a book and a very interesting Substack?
Azeem Azhar: I’m just a really lucky creature of time, because I was born just as the microprocessor revolution took off in 1972. So, as a child I had a computer. We had a couple of computers in the home by 1981. I always had them. But, my parents were economists, and I ended up doing a social science degree, which included economics, while never losing sight of my love of computing.
And, my career has been a bridge between those two worlds for the last 30 years. And so, I’ve worked in the tech industry. I know a little bit about economics. Not as much as you. Not as much as some of your guests. And, I try to bring them together in my daily life.
Russ Roberts: I hope your parents are okay with the fact that you’ve slipped into a more practical realm of life.
Azeem Azhar: I think they were quite happy when the book came out and it wasn’t about building products: it was about presenting ideas to the world.
Russ Roberts: So, let’s talk a little bit about your book to get started. It’s called Exponential. Why?
Azeem Azhar: What I had noticed was that there were a load of technologies–several technologies that were improving at these double-digit exponential rates other than computers. We’d known about computers improving at this 50, 60% per annum rate because of this articulation of Moore’s Law. But, it became clear to me by about 2014 or 2015 that we were seeing exponentials in other domains. So, in the cost of lithium ion batteries or the falling cost of solar panels. And, as I started to look around, I found more and more of these relationships that looked like Moore’s Law relationships, and I wanted to understand them. And so, I started to dig a bit deeper.
Now, of course, you are an academic and I was writing a trade book, so there is always a little bit of artistry in connecting those ideas for a general audience. But, I think the idea that we’re in an age of exponential technologies where things get cheaper by 10, 20, 40, 50% per annum on a compounded basis, and therefore they get deployed in our economies at very, very high rates is reasonably robust empirical observation. We can see it across a lot of different technologies.
Russ Roberts: And, a dramatic example of that, which you use and illustrate, is the speed of adoption of various technologies: how long it takes a technology to reach a particular threshold of market penetration. And, it’s faster today.
Azeem Azhar: It is so much faster, and it’s perpetually out of date. Because, when I submitted the written manuscript to the publishers, TikTok wasn’t a thing. And, while I was writing the second draft, TikTok had gone past a billion users faster than Facebook. And then, of course, since then we’ve seen ChatGPT get to a hundred million users within a couple of months.
There are some obvious reasons for that. The first is that you don’t need to build out the infrastructure. We all have smartphones, we all have internet access, and that wasn’t the case for Yahoo or Amazon back in the mid-1990s.
But there’s a second reason, which really has to do, I think, with our stance and willingness to explore–maybe not the entirely novel, but the incrementally new. There are just mechanisms: social media lets me find out about something far faster than I could previously. And the feeling that you might psychologically have fallen off the latest cool trend drives people to experiment with these things in ways that I didn’t really see people dipping into the Internet in the late 1990s.
And, yeah. So, this idea of time compression for adoption is very true of digital technologies.
But Russ, I also think that it is true in physical technologies. Because if we just look at this as an Internet phenomenon, we ignore the fact that there’s a lot of stuff going on in the back end in supply chains, in logistics, in marketing that makes it much more efficient for markets to let a customer know about a product and then physically get that in their hands.
And so, we can look at something as big and clunky as an electric vehicle. It weighs 3,000, 4,000, 5,000 pounds. And, these things are flying off the shelves far faster than anyone had forecast. And, the question is: Why is that? And, part of it is to do with the same phenomenon that makes us all learn about ChatGPT very quickly. Right? It’s social networks. The information spreads faster, firms are much more efficient.
At the heart of that, of course, remains information technology–IT. But, we are at this moment where it’s not just the digital products: It is the big heavy physical ones that are also being deployed in our economies at rates that we haven’t really seen before.
Russ Roberts: And, as you point out, this is driven–well, let’s talk about–you mentioned Moore’s Law in passing. For listeners who don’t know what it is, explain what it is. And then talk about Wright’s Law. W-R-I-G-H-T, Wright’s Law. Which actually is more interesting. So, talk about both of those and what drives them.
Azeem Azhar: Yeah. Moore’s Law is the thing that has made the computer industry the big, successful thing that it is today. It was an observation by one of the founders of Intel that we would be able to put more transistors on a single silicon chip at an increasing rate–roughly twice the density every couple of years. And, if you did that, you would get performance improvements.
Russ Roberts: And, as you point out though, it’s not a law like gravity. So, what’s causing that phenomenon? It has slowed down a little bit in recent years and it’s caused some people to wonder whether it, quote, “no longer holds.” But, it held quite remarkably with quite a bit of reliability for a very, very long time. Why?
Azeem Azhar: So, it was reliable for six decades. And, I think the beauty of it is that it was about collaboration and it was about incentives. You’ll discover in our discussion that I’m a wishy-washy centrist: I think liberal approaches and collaborative approaches, market approaches and communal approaches all have a part to play. And, I think Moore’s Law was exactly that. Moore’s Law articulated, effectively, a social contract across the very big and increasingly complex semiconductor industry, where people felt that they had to hit this clock speed of the doubling. And, it required a lot of alignment: individually developed R&D [Research and Development] plans had to dovetail to deliver the results that we saw for six decades.
But at the heart of that was the economic incentive of a growing market and being able to sell more products at better margins.
And, at the very top of that pile was the relationship between Andy Grove and Bill Gates, which was: What Andy giveth–Andy Grove from Intel–Bill taketh away. In other words, every time Andy came up with a new processor with more processor cycles, Bill Gates would figure out how to use them for a new application, forcing Intel to do that again.
But that process echoes all the way down the supply chain, and that micro-economy that was the semiconductor industry.
And so, in a way, some of the best analyses I’ve seen of it say that this was as much about a social belief that emerged among the participants of this economy; and these individual agents–these firms–worked to deliver on it in a way that can only work in a market economy.
But it isn’t a law of gravity. And that, I think, is the important observation.
Russ Roberts: Yeah. I don’t know if you’re right. I suspect you are. But what’s fascinating about it is the idea that it was a cultural norm–almost like a religious belief–that people strove to fulfill. That people made an effort. Partly because they were afraid they’d be left behind if the pace was sustained. And, of course, that fear helps sustain that pace.
Now, talk about Wright’s Law.
Azeem Azhar: Yeah. So, Wright’s Law, I think, has got more of the attributes of a law that can be predicted. Wright’s Law emerges in 1936, when Theodore Wright, an aircraft engineer, is looking at how the unit cost of making an airplane–an airframe–declines as the engineers acquire more knowledge. Other economists–Marshall, I think–had said this 50 years earlier, but hadn’t got the empirical data to back it up.
And essentially, what Wright said was that for every doubling in cumulative production, the per-unit cost would decline–in this case by about 15%–as a result of learning rates. Right? So, the compounding knowledge of the engineers–figuring out which screws weren’t needed, shaving a little bit off the airframe here, being a bit more efficient with a process, reordering things–delivered this learning benefit. And then, it was revived in the 1960s by the Boston Consulting Group as the Learning Curve, or sometimes the Experience Curve.
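The arithmetic of Wright’s Law is simple enough to sketch in a few lines of code. The 15% learning rate is the illustrative airframe figure from the discussion above; the function itself is a generic sketch, not anything from the book:

```python
import math

def wright_cost(first_unit_cost, cumulative_units, learning_rate=0.15):
    # Per-unit cost after `cumulative_units` have been produced, assuming
    # cost falls by `learning_rate` with every doubling of cumulative
    # production (Wright's Law).
    doublings = math.log2(cumulative_units)
    return first_unit_cost * (1 - learning_rate) ** doublings

# After three doublings (1 -> 2 -> 4 -> 8 units), cost is 0.85^3 of the first:
print(wright_cost(100.0, 8))  # ~61.4
```

Note that the decline depends on cumulative production, not on time: a technology whose installed base doubles quickly rides down the curve quickly.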
And, when we look at engineered products with many components, there should be a learning element to them. In other words, they’re big and complex and clunky when we first build them. And, as we get better and better, we are able to optimize that.
Now the thing about Wright’s Law is that Wright’s Law can be applied to the cost declines that we see in the semiconductor industry, and it ends up being more predictive than Moore’s Law. But, it also can be applied to other technologies. So, solar panels, lithium ion batteries, various types of other mechanical processes.
And the question is: Why does it come about?
And I think that it’s easy to tell by way of a story. During the COVID lockdowns, I started to bake. And the first loaf of bread I baked was really expensive. I mean, I just wasted lots of flour and the ingredients.
By the time I got to my eighth loaf of bread, it was so much better value for money, because I got better at what I was doing. My processes were better. And, that is at the heart of Wright’s Law.
Russ Roberts: And of course, the biggest cost of baking is your time. And, I’m sure you get better at that. Even though it’s not out of pocket, it’s an expenditure you have to make.
You know, it actually goes back before Marshall: It goes back to a guy named Adam Smith–
Azeem Azhar: Well, of course, yeah–
Russ Roberts: Smith writes that the division of labor is limited by the extent of the market. And, it’s a nice phrase. Economists learn it at some point–some do–and they can roll it off their tongue.
But, it’s as much about learning by doing. In other words, it’s not just the extent of the market: it’s the extent of the process that the individuals in the firm are using. And, of course, he writes very eloquently, in his very simplified and perhaps inaccurate but still valuable story of the pin factory, of how you get better. It’s about learning by doing.
And it’s phenomenal, that process, that improvement, that better understanding that–Smith has all these examples of the kid who is working on this process, figuring out how to do it a little bit better. And, the idea is that if you’re specializing in it, you get focused on improving that process.
And, it doesn’t have to work that way. It could be that if you’re specializing, you’re bored and get driven insane. But, in a modern, complex engineering problem, the opportunity for those improvements, as you say, is almost always there. And, they come about through experience. It’s really an amazing thing.
And, the other part of it that is so powerful, as you point out–it’s also related to Smith–is globalization.
So, at the same time that firms are expanding and price is falling–which is increasing the quantity demanded of the product–in a world of globalization, that opportunity to expand the scope of market penetration and learn by doing as you expand, and to drive the price down further via competition as other firms do the same–getting better, finding those improvements–is a really beautiful feedback loop that’s, I think, not well understood by economists or laypeople, because it’s dynamic. It’s not easily described in, say, a supply-and-demand picture. But it really is a beautiful thing.
Azeem Azhar: It’s really dynamic.
And, I think there’s something else that we can pick up on from learning-by-doing, which is that the idea of learning means that there is some knowledge which is likely to be intangible. And, the ability for us to share that knowledge can expand the number of firms who are applying that knowledge and can contribute back into the rate of learning.
And, a simple model would be that–you see this in Silicon Valley in California–where people leave firms regularly and they go from one to another and they take with them that tacit knowledge. And, while it’s harder to work out the learning rate for a software product, you can see, analogously, that there is an increased rate of learning because of that revolving door.
And you can also see, historically, moments where we started to pool knowledge and were able to drive exceptional social outcomes in terms of driving prices down.
And one of my favorite examples is around the steel-making process, in that period in the late 19th century–after Bessemer’s invention–when the demand for steel for the railroads and for industrialization was so great that some of the steel manufacturers pooled their patents–their know-how–in order to share in a much larger market.
And, I think information technology plays a role in accelerating learning rates, because we are much better at codifying that knowledge and therefore using it within and across firms that are getting bigger and bigger.
And, I think one of the things that I touch on in the book is how this started in computer science but has moved into other disciplines: academics now shortcut the very long peer-review process and pre-print their work on something called arXiv.
Now, in computer science, theory and practice are quite closely related, right? Because you just put the code in.
But, what’s really interesting to me is that the period of time from an innovation being identified by academics to making its way into working code has really collapsed.
So, some mathematicians–Ron Rivest and his colleagues–came up with an encryption algorithm called RSA [Rivest-Shamir-Adleman]. It was first published academically in the late 1970s. But it didn’t make its way into mainstream consumer products for two decades.
And, today, what happens is this: just before I spoke to you, I spoke with one of the authors of the transformer paper from Google–the transformer being the architecture that gives rise to large language models. That paper was written and published in 2017. We had the first products–productizable products–within a year. And seven years on, there are hundreds of millions of users.
And even a year now seems like a long timeframe, given how quickly knowledge spreads.
That’s not always learning by doing, Russ, but my observation is that learning by doing happens in a distributed way. It happens within organizations. And, because of IT, they’re able to share that knowledge much more rapidly.
And, they’re now moving into the next phase of this, which is to simulate the learning by doing. So, there are many companies in physical engineering and manufacturing who, instead of building 10,000 prototypes, each one better than the previous one, model 10 million in a computer simulation, and they get to a point of efficiency much, much more quickly. So, their starting point is better.
Now, what I don’t know–and I would love to find research on this–is how that affects the ongoing learning rates if you already start at a good place. One of the reasons I’m excited about where we stand is that once we see the value of economies of learning, not just economies of scale–once we start to acknowledge something that I think economists have known for such a long time, which is that technology is compounded knowledge–we can find ourselves at a point where we can drive these social outcomes, which I mean in the pure economic sense–right? welfare, prosperity–in ways that are not left to the vagaries of arbitrary decisions.
And, this insight–which came to me about a decade ago, though your peers had known it for a long time–has really electrified me about how I feel about the next 20 or 30 years and what it might mean for the state of humans and humanity.
Russ Roberts: Well, let’s talk about that a little bit. Your book is, in a sense, out of date. It was written in 1921–
Russ Roberts: 2021.
Azeem Azhar: Feels like 1921 now.
Russ Roberts: Yeah. Yeah. It was written two and a half or so years ago, or published two and a half years ago.
For better or for worse, most of it hasn’t changed at all. It is, in that sense, a tragically timeless book. There are two themes to the book. The first theme is that these technologies, both in the world of silicon and in the world of physical processes, are speeding up. So, you focus on computing, energy, biology, and manufacturing. I assume–I’ll let you talk in a sec–but I assume all those trends have just continued.
The second part of the book is that our ability to cope with this change hasn’t kept up. You call it the exponential gap. You talk about regulation; you talk about norms; legal systems like copyright. And, you suggest lots of interesting ways that we might respond to this changing world, and how the world we have–of regulation, copyright, intellectual property, and norms and institutions and so on–isn’t keeping up. But, not much has changed there. It seems to me that we’ve not made much progress at all in how to cope with this change.
So, let’s start with the first: have the trends that you wrote about continued to accelerate in those four areas? And then, we’ll talk about our lack of progress in coping with that.
Azeem Azhar: Yeah. They’ve definitely–we’ve seen an acceleration. Within computing and the world of AI [artificial intelligence], it’s just hard to put words on what we’ve seen. One thing to look at is that the firms that make the largest capital investments every year now are the big tech firms–TSMC [Taiwan Semiconductor Manufacturing Company] in chips, and Google and Amazon and Microsoft. It’s no longer the oil industry; the oil industry is a distant second place as industries go. And, we’re also seeing it, of course, in terms of the way in which companies are spending money in that area.
But, I think one area that I spent a bit of time on in the book, and that deserves more attention, is what’s happening in energy. And, what’s happening in energy is that the price of solar panels is coming down really, really dramatically. In fact, I had tracked a 15-19% compound decline since 1970. There is a James Bond film, The Man with the Golden Gun, which was all about stealing a piece of solar power technology. You wouldn’t do that now, because it is so dirt cheap.
But, in the last year, Chinese manufacturers halved the price of solar panels or one of the components within solar panels. And so, that’s continuing.
And, I think it’s worth thinking about how dramatic and radical that is, in an economic context, for our economies. What you do when you are running off solar power rather than off fossil fuels is get away from commodity volatility. Your price of energy is not dependent on what a regional autocrat feels like on a given day. You can make 20-year forecasts of what your price will be, and every subsequent installation will be much, much cheaper. So, you trade away the uncertainty and volatility that carry all of these frictional costs we have to live with and contend with, as the energy crisis of the last few years has shown.
The other thing is that solar panels are a modular technology; and modularity is a key part of taking advantage of Wrightean economics. Because, in modularity, your number of units produced is much, much larger than with these monolithic systems. So, you have more iterations of the learning rate because cumulative capacity is doubling faster.
But modularity also hugely expands markets. Because, 25 years ago, to become an energy producer, I would have needed a billion dollars, maybe $2 billion. Today I need $5,000, and I can stick some panels on my roof and connect them up to the grid. And, markets will then really expand rapidly. And, we’ve already seen that. So, if you consider them as an economy, China’s domestic rooftops are the second-largest provider of solar electricity anywhere in the world–right?–compared to all the utility-scale installations in other parts of the world.
And so, I think understanding what’s happening in solar really is critical.
And, I’ll just share with you a couple of data points. So, the amount of new solar that we’ve added globally has increased by effectively 61% compounded since 2010. That is: net new adds each year. And in 2022, global electricity generating capacity was about nine terawatts across coal, and nuclear, and wind, and solar, and so on. Bloomberg has just forecast today that they think over the next seven years to the end of the decade, solar will add seven terawatts of new generating capacity. And, Bloomberg’s forecasts are always far short of what actually happens.
And that’s remarkable. Because, energy is wealth. The thing that has transformed humanity from 9,000 B.C. has been our ability to harness energy. And, the fact that we can have an energy system that is affordable, predictable, and in a sense, almost abundant–I mean, not literally abundant–has really significant implications.
And I’m excited. I’ll give you two economic implications. One is: It means that energy independence is affordable for many more nations. It’s not just the United States and Saudi Arabia and Qatar who can achieve this.
But, the second is that it enables local economic agency–local economic production. There’s a fascinating battle going on in South Africa at the moment, which is full of brownouts because there’s not enough generating capacity. But, Cape Town has substantial renewable resources because of wind power; and Eskom, the national utility, has been really reluctant to allow Cape Town to access its own electricity resources, because it wants to spread that energy nationally. And, the regulatory framework doesn’t make sense, right? Because you’ve got these local investments taking place.
And, I think this idea that decentralized, low-cost solar power can create much more local economic agency–create more economic principals–is a really, really exciting one. And, what it means, especially in underdeveloped markets, I think is yet to be fully thought through.
Russ Roberts: Have we gotten better at storage?
The big question is that wind and solar have two problems. They’re not on a hundred percent: there are cloudy days, and windless days or days with much less wind. And then it’s hard to store. Have we gotten better at that?
Azeem Azhar: We’re getting better at storage. For short duration, battery prices have come down really substantially over the last 20 years. They were over a thousand dollars per kilowatt-hour a decade or so ago, and they’re now approaching a hundred dollars per kilowatt-hour. So, it’s becoming more affordable.
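Those two price points imply a steady, learning-curve-style decline. A quick back-of-the-envelope check, assuming (my assumption, not Azhar’s) a constant annual rate over the decade:

```python
def implied_annual_rate(start_price, end_price, years):
    # Constant annual growth rate that takes start_price to end_price
    # over `years` years (negative means a decline).
    return (end_price / start_price) ** (1.0 / years) - 1.0

# Roughly $1,000/kWh a decade ago to roughly $100/kWh now:
rate = implied_annual_rate(1000.0, 100.0, 10)
print(f"{rate:.1%}")  # about -20.6% per year
```

A tenfold drop in ten years works out to a roughly 20% compound decline per year, in the same territory as the learning rates discussed earlier.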
There is still an enormous gap in terms of medium-duration storage and longer-duration storage, both in terms of proven technologies, but also in physical capacity that exists and investment that’s going in there.
But, the thing that I would say is that it’s natural for storage to follow generation, because the need for storage doesn’t emerge until the generation exists. And so, I would expect storage to follow on quite quickly.
And, how that creates a patchwork of solutions is going to vary economy by economy. In a country like the United Kingdom, where 25% of cars sold are electric vehicles, each with 10 days’ worth of household storage in the battery, you might be able to solve part of the storage problem through a decentralized solution like that. In other, incredibly energy-poor markets–like Tanzania or Kenya–the amount of storage you need to keep a fridge running–which transforms outcomes–and to keep an irrigation system running is a few cheap lead-acid batteries.
And so, we’re able to move into this space of transforming people’s lives when we think of it from the bottom up rather than the top down. Gosplan [the state planning commission of the former Soviet Union] would not be able to make sense of how to plan for storage. But, I think that the market can do that if the incentives are allowed to flow through to the innovators and to the entrepreneurs and to the business people. I mean, I think it can.
Russ Roberts: Do you have an idea of what portion of solar energy is coming from rooftops versus solar farms? You mentioned the Chinese rooftops. After a while, you’ll have a solar panel on every roof–possibly, potentially. There’s sort of a limit. Or maybe there’s not a limit to how much can be absorbed; I don’t know. But is that what’s driving it? Is it rooftops getting solar panels, or is it also solar farms?
Azeem Azhar: The beauty of solar is that it’s both.
And I think the analogy to go back to is the microchip. Prior to the arrival of the microchip, computers were very, very big. They were only bought by large companies; and they were in rooms. And then, as we started to miniaturize the computer, it gave access to a whole new segment: corporates buying computers for their employees’ desks. And alongside, individuals could go off and buy the same.
And, today, effectively no one buys mainframes. If you’re going to spend a hundred million dollars on computers for a data center, they’re not too dissimilar from our laptops without a screen. But, what you’re able to do is address a very, very broad market. And so, which part of the computing industry is the most important when it comes to selling chips? Well, there’s a bunch of quite large segments.
And so, I think the beauty of that is that a given economy can put in the set of incentives it feels is appropriate, given its natural wind and solar resources and whatever hydropower and nuclear it’s got kicking around. And, if it makes sense to incentivize homeowners to fill the gap, you can do that. And, if you want to incentivize building on brownfields–old industrial land–for solar farms, you can do that. But, you have the choice in a way that you didn’t have when it was really about building big nuclear power stations, where it was all about where you site them and who’s going to be willing to have one in their backyard. And, I think that creates a much better starting position for the marvels of economics and incentives to play their role.
Russ Roberts: Let’s turn to computing. You have an essay on your Substack about just how much computing we’re going to need in the next 10, 20, 30 years. It’s unimaginably large. So, talk about why that’s the case, first of all; and then I want to turn to AI. So, first talk about just the demands for computing power that are coming.
Azeem Azhar: Yeah. Today’s demands for computing are visibly coming from AI systems that need huge numbers of these GPUs [graphics processing units]. In terms of processing, I think we’re talking about 10-to-the-25th floating-point operations to train the big state-of-the-art models. It’s a number that doesn’t really appear in economics, even in the worst cases of hyperinflation.
And, that’s why you are seeing $50-billion-a-year-plus CapExes [Capital Expenditures–Econlib Ed.] in servers by the big cloud providers.
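The scale of that training number is easier to feel with a rough calculation. The chip count and per-chip throughput below are illustrative assumptions for the sake of arithmetic, not figures from the conversation:

```python
def training_days(total_flops, num_chips, flops_per_chip_per_sec):
    # Idealized wall-clock days to perform `total_flops`, ignoring
    # utilization losses and communication overhead.
    seconds = total_flops / (num_chips * flops_per_chip_per_sec)
    return seconds / 86_400

# Illustrative: 1e25 FLOPs spread over 10,000 accelerators, each
# sustaining 1e14 FLOP/s:
print(round(training_days(1e25, 10_000, 1e14)))  # ~116 days
```

Even with tens of thousands of chips, a 10^25-operation run is a months-long undertaking, which is one way to see why the capital expenditure is so large.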
Let me just zoom back and ask: What have we actually seen with the economy’s willingness to use computing? There were fewer than a hundred computers in the world in 1945. There are more than 25 billion today. So, the economy, pre-large language models and pre-AI, had a really insatiable desire to put information processing throughout the economy–in the center, at the edge. Because what you are doing with information is playing a game of efficiency: information makes processes efficient. It’s sort of analogous to learning by doing.
Think of the issues that we used to run into in the 1970s, when my dad was working and you’d have to look at stock levels. Stock levels had to be really high, because you just didn’t know what demand was going to be and you didn’t know when your supplier was going to supply. Well, with the arrival of IT and computers, we could better forecast demand and better predict supply. And, the amount of inventory that companies hold as dead capital has declined significantly.
So, the economy has shown an enormous appetite for the ability to process information. And, we can really go back to the khipu in Peru, in South America, and tally sticks before that, to understand that.
So, then the question is: Well, let’s be a bit more discriminative and reductionist about where the sources of demand will come from.
So, AI is one. Another is each of us individually. You know, whether we like it or not, we upgrade our phones. A billion-plus people don’t have smartphones. Two billion don’t have modern smartphones. All of those will require upgrades. And, there are places where we don’t yet have intelligence that will make really meaningful differences to how well people can live their lives.
So, we don’t necessarily have small edge-based computers in fields across farms in India and Africa and the United States, all of which will help in precision agriculture to improve agricultural yields, to reduce the use of pesticides and herbicides and the like.
So, it’s not clear that there is a point at which we have satisficed our need for compute, or the benefits that we get from compute. There will be particular applications where we don’t need any more compute.
I mean, I think a good example is 8K resolution on monitors, which is already above what the human eye can discern. We might not ever need to go above that because we’ve satisficed that.
But, in other parts of the economy, I don’t see there being a decline in demand.
And, in that essay on the Substack, I did something very simple. I asked: How much has compute grown globally since 1971? I chose 1971 because the Intel 4004 was released in that year. And, the answer was roughly 65% per annum on a compounding basis, which of course gets you to a really big number. By that, I mean I tried to count the number of computers and their rough processing power and multiply them together.
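Compounding at 65% per annum for five decades really does get you to an almost absurd multiple. A one-line check (the 1971 start year is from the essay; the arithmetic is just compound growth):

```python
def compound_multiple(annual_rate, years):
    # Total growth multiple from compounding `annual_rate` each year.
    return (1.0 + annual_rate) ** years

# 65% per annum from 1971 to 2024:
print(f"{compound_multiple(0.65, 2024 - 1971):.2e}")  # on the order of 10^11
```

That is a several-hundred-billion-fold increase in total compute over the period, which is why even a large error in the growth rate still leaves an enormous number.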
It’s a crude estimate–I’m not even sure–yeah, let’s make fun of economists: I’m not even sure economists would be happy with that estimate. I know physicists certainly wouldn’t be. But it’s an estimate. And, all I did was draw that out.
I said, it’s probably safer for me to extrapolate this at 65% than it is to say the regime that has held for 60 years is going to change.
And, that took me to a particular number. But, I’m really mindful of the fact that five years ago, six years ago, I had conversations with people in AI companies and semiconductor companies, and they were saying to me they were expecting the demand for compute over the next decade–so that’s five years ago–to increase by a factor of hundreds of thousands or millions of times.
And so, I’m trying to put together both the history, the theory, some working hypothesis, and what people on the ground in industry tell me. And, it’s a sort of embarrassingly simple curve that points upwards. And, even if I’m wrong by two orders of magnitude, we’re still talking about huge demand for computing in 30 years.
Russ Roberts: So, let’s turn to AI. In our recent survey–and I’ll let listeners know that I hope to tell you the favorite episodes of 2023 in a week or so–but, in that survey I asked listeners to give me feedback. And, one of the things many listeners said was they were sick of hearing about AI on my program. Which shocked me. I thought it was so interesting. And, whether it was going to save our souls or destroy them was an important question, and I thought we should spend some time on it. But I think for some listeners it was a little too much time.
But, I want to ask a different question of you–not this one of whether it’s going to destroy us. So, as you point out, the number of users of AI–ChatGPT or others–has crossed the hundred-million threshold remarkably quickly: a couple of months.
Azeem Azhar: Right. Something like that.
Russ Roberts: Something absurd.
But, I’m one of those users, and I don’t use it. I use it as a novelty item occasionally. I don’t think to use it. It’s not part of my daily workflow. When I’m trying to write, I don’t think to start there. When I’m editing, I don’t think to end there and get feedback from it. Maybe that day will come. But, it has had virtually no impact on my life, except as the President of a college, we’ve had a number of conversations about what does this mean for our students in submitting papers? As they read books, will they be tempted to use it as a crutch? Should we regulate it? monitor it? and so on?
I have a feeling I’m missing something. I have a feeling that, below the surface, there’s a lot of usage of it that I’m unaware of, either as a user or as a consumer of products it’s built into. So, tell me where you think AI is going–not as a destroyer of worlds or the builder of paperclip factories, but as a changer of our lives, in good and bad ways.
Azeem Azhar: I hear and recognize every word you’ve just said, Russ. You are not uncommon at all on this question of, ‘Well, what does it do?’ It’s a sort of moment of, ‘Well, now what?’ [More to come, 40:45]