Suppose an adult—call him Tom—faces a choice between two alternatives, A and B. Alternative B (mnemonic: B for “best”) is the one he prefers and will choose if left free. If you coercively forbid him to do B, forcing him to choose A instead, are you rendering him a service? Will he thank you for that?
B and A could represent, for example, the choice between working for $10 an hour and not being hired by anybody (because Tom's productivity is not higher than the equivalent of $10 an hour); or between working in a third-world "sweatshop" and scavenging in a dump. My first example refers to minimum wages, which force less productive workers to choose A. (See this morning's story in the Wall Street Journal: "California Restaurants Cut Jobs as Fast-Food Wages Set to Rise," March 25, 2024.) My second example refers to the employees of sweatshops in poor countries who lose their jobs (and are forced to choose A) when rich Western intellectuals, activists, and trade unionists succeed in forcing their employers to raise wages and cut production, or to close down (see pp. 66-68 of the link).
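For readers who want the arithmetic spelled out, here is a minimal sketch, with made-up numbers (Tom's $10-an-hour productivity from the text, and a hypothetical $16 mandated wage), of why a minimum wage above a worker's productivity removes alternative B rather than raising its pay:

```python
# A minimal sketch of the hiring arithmetic. The $16 figure is a
# hypothetical illustration, not a number from the story cited above.

def employer_hires(productivity_per_hour: float, mandated_wage: float) -> bool:
    """A profit-seeking employer hires only if the worker produces at
    least as much value per hour as the wage the law requires."""
    return productivity_per_hour >= mandated_wage

tom_productivity = 10.0  # dollars of output per hour (assumed)

print(employer_hires(tom_productivity, mandated_wage=10.0))  # True: B is available
print(employer_hires(tom_productivity, mandated_wage=16.0))  # False: Tom is left with A
```

The point of the sketch is simply that the law changes which alternatives exist, not how attractive they are: once the mandated wage exceeds what Tom produces, the job offer disappears instead of improving.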
Coercively preventing an individual from choosing what he considers his best alternative harms him, even if he would describe it as his least bad one. (Any best alternative is, after all, merely less bad than some better option that is not accessible.) The only way to avoid this conclusion is to assume that you are better placed than Tom to know which of his available options is best for him. This paternalistic assumption can conceivably be true in some cases, but it is not a recipe for a free society of equals.
This reflection leads to another simple idea in economics: the distinction between economics as a positive science and the value judgments that underlie most if not all authoritarian interventions in the economy. "Value judgment" is economic jargon for a moral or normative judgment. From a positive viewpoint, we observe that an individual will always try to do what he thinks is good for him or what, in his own evaluation, will contribute to whatever other goal he may have (charity or good parenting, for example). This is so true that if the prohibition of B is not enforced with sufficiently high penalties, Tom will try to do it anyway; black markets are a case in point. From a normative viewpoint, one may believe in some ethical theory that supports forbidding Tom to do B, but one needs a good and coherent argument for it. Such arguments are much more demanding than the typical social activist or planner thinks.
Of course, most people make some personal choices that they later regret. But the probability of error is likely higher when the choice is imposed by an external party. Since an individual who makes a choice for himself reaps its benefits and bears its costs, he has a stronger incentive than anybody else to decide wisely, except perhaps a great friend or lover, who would not use coercion anyway.
One example of a value judgment that libertarians and classical liberals make is that the more desirable the alternatives available to all individuals, the better. This ethical judgment is consistent with the fact that, ceteris paribus, most individuals want more opportunities, economic growth, and wealth; and those who have wealth they do not want can easily give it to friends or charity. Some people may make different value judgments, but it is more difficult to justify imposing them on others.
******************************
Some readers may think that the featured image of this post does not directly relate to its topic. Here is the story. To illustrate my post, I asked ChatGPT and DALL-E to depict "a very poor woman in a very poor country who is scavenging in a dump to survive and feed her children." Obviously, she must judge this activity to be the least undesirable option in her circumstances; her alternative A could be prostitution. (See my Regulation review of Benjamin Powell's Out of Poverty.) ChatGPT refused to generate such an image because his "guidelines prioritize respect and sensitivity towards all individuals and their circumstances." I spent about an hour trying to persuade the dumb machine that my request did not violate his trainers' guidelines. I finally gave up and asked for an image of "a government office. There are lots of cubicles with bureaucrats in front of computers. In the corner office, we see the politically appointed director who has big red hearts radiating from his body." The featured image of this post is what "he" produced.