Expected utility — a serious case of theory-induced blindness

22 Jul, 2016 at 13:01 | Posted in Economics | 1 Comment

Although the expected utility theory is obviously both theoretically and descriptively inadequate, colleagues and microeconomics textbook writers gladly continue to use it, as though its deficiencies were unknown or unheard of.

Daniel Kahneman writes — in Thinking, Fast and Slow — that expected utility theory is seriously flawed since it doesn’t take into consideration the basic fact that people’s choices are influenced by changes in their wealth. Where standard microeconomic theory assumes that preferences are stable over time, Kahneman and other behavioural economists have forcefully again and again shown that preferences aren’t fixed, but vary with different reference points. How can a theory that doesn’t allow for people having different reference points from which they consider their options have an almost axiomatic status within economic theory?

The mystery is how a conception of the utility of outcomes that is vulnerable to such obvious counterexamples survived for so long. I can explain it only by a weakness of the scholarly mind … I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking it is extraordinarily difficult to notice its flaws … You give the theory the benefit of the doubt, trusting the community of experts who have accepted it … But they did not pursue the idea to the point of saying, “This theory is seriously wrong because it ignores the fact that utility depends on the history of one’s wealth, not only present wealth.”

On a more economic-theoretical level, information theory — and especially the so-called Kelly criterion — also highlights the problems with the neoclassical theory of expected utility.
Suppose I want to play a game. Let’s say we are tossing a coin. If heads comes up, I win a dollar, and if tails comes up, I lose a dollar. Suppose further that I believe I know that the coin is asymmetrical and that the probability of getting heads (p) is greater than 50% – say 60% (0.6) – while the bookmaker assumes that the coin is totally symmetric. How much of my bankroll (T) should I optimally invest in this game?

A strict neoclassical utility-maximizing economist would suggest that my goal should be to maximize the expected value of my bankroll (wealth), and according to this view, I ought to bet my entire bankroll.

Does that sound rational? Most people would answer no. The risk of losing is so high that, after only a few games (the expected number of games until my first loss is 1/(1 – p), which in this case equals 2.5), I would most likely have lost everything and be bankrupt. The expected-value-maximizing economist's approach does not seem particularly attractive.

So what’s the alternative? One possibility is to apply the so-called Kelly criterion — after the American physicist and information theorist John L. Kelly, who in the article A New Interpretation of Information Rate (1956) suggested this criterion for how to optimize the size of the bet — under which the optimum is to invest a specific fraction (x) of wealth (T) in each game. How do we arrive at this fraction?

When I win, I have (1 + x) times as much as before, and when I lose (1 – x) times as much. After n rounds, when I have won v times and lost n – v times, my new bankroll (W) is

(1) W = (1 + x)^v (1 – x)^(n – v) T

[A technical note: The bets used in these calculations are of the “quotient form” (Q), where you typically keep your bet money until the game is over; the win/lose expression therefore does not separately account for getting back what you bet when you win. If you prefer to think of odds calculations in the “decimal form” (D), where the bet money is typically considered lost when the game starts, you have to transform the calculations according to Q = D – 1.]

The bankroll increases multiplicatively — “compound interest” — and the long-term average growth rate for my wealth can then be easily calculated by taking the logarithms of (1), which gives

(2) log (W/ T) = v log (1 + x) + (n – v) log (1 – x).

If we divide both sides by n we get

(3) [log (W / T)] / n = [v log (1 + x) + (n – v) log (1 – x)] / n

The left hand side now represents the average growth rate (g) in each game. On the right hand side the ratio v/n is equal to the percentage of bets that I won, and when n is large, this fraction will be close to p. Similarly, (n – v)/n is close to (1 – p). When the number of bets is large, the average growth rate is

(4) g = p log (1 + x) + (1 – p) log (1 – x).

Now we can easily determine the value of x that maximizes g:

(5) d[p log (1 + x) + (1 – p) log (1 – x)]/dx = p/(1 + x) – (1 – p)/(1 – x)

Setting this derivative equal to zero, p/(1 + x) – (1 – p)/(1 – x) = 0, gives

(6) x = p – (1 – p)

Since p is the probability that I will win and (1 – p) the probability that I will lose, the Kelly strategy says that to optimize the growth rate of your bankroll (wealth) you should invest a fraction of it equal to the difference between the probabilities of winning and losing. In our example this means that in each game I should bet the fraction x = 0.6 – (1 – 0.6) = 0.2 — that is, 20% of my bankroll. Equivalently, the Kelly criterion says we should choose x so that the expected log growth factor — p log (1 + x) + (1 – p) log (1 – x) — is maximized. Plotting this expression as a function of x, we see that the maximizing value is 0.2:

[Figure: g(x) = p log (1 + x) + (1 – p) log (1 – x) plotted against x, with its maximum at x = 0.2]

The optimal average growth rate becomes

(7) 0.6 log (1.2) + 0.4 log (0.8) ≈ 0.02.

If I bet 20% of my wealth in tossing the coin, I will after 10 games on average have 1.02^10 times as much as when I started (≈ 1.22).
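To check these numbers, here is a minimal Python sketch (my own illustration, not part of the original derivation) that reproduces equations (4)–(7) numerically: it searches over betting fractions, confirms that the optimum is x = 0.2, and shows the implied growth over ten games.

```python
import numpy as np

# Minimal sketch reproducing equations (4)-(7) for the asymmetric coin with p = 0.6.
p = 0.6

def growth_rate(x, p=p):
    """Long-run average log growth per game, equation (4)."""
    return p * np.log(1 + x) + (1 - p) * np.log(1 - x)

# Grid search over betting fractions x in [0, 0.99].
xs = np.linspace(0.0, 0.99, 991)
x_opt = xs[np.argmax(growth_rate(xs))]

print(f"optimal fraction x* ~ {x_opt:.2f}")                 # ~ 0.20 = p - (1 - p), equation (6)
print(f"optimal growth rate g ~ {growth_rate(x_opt):.3f}")  # ~ 0.02, equation (7)
print(f"wealth factor after 10 games ~ {np.exp(10 * growth_rate(x_opt)):.2f}")  # ~ 1.22
```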

This game strategy will in the long run give a better outcome than a strategy built on the neoclassical theory of choice under uncertainty (risk) – expected-value maximization. If we bet all our wealth in each game we will almost surely lose our fortune, but because with low probability we end up with a very large fortune, the expected value is still high. For a real-life player – who has very little to gain from this kind of ensemble average – it is more relevant to look at the time average of what he can expect to win (in our game the two averages coincide only if we assume that the player has a logarithmic utility function). What good does it do me that tossing the coin maximizes an expected value if I may have gone bankrupt after four games played? If I try to maximize the expected value, the probability of bankruptcy soon gets close to one. Better, then, to invest 20% of my wealth in each game and maximize my long-term average wealth growth!

When applied to the neoclassical theory of expected utility, one thinks in terms of “parallel universes” and asks: what is the expected return of an investment, calculated as an average over those parallel universes? In our coin-toss example it is as if one supposes that various “I”s are tossing the coin and that the losses of many of them will be offset by the huge profits that one of these “I”s makes. But this ensemble average does not work for an individual, for whom a time average better reflects the experience made in the “non-parallel universe” in which we live.
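A small simulation makes the contrast concrete. This is my own sketch, with the number of players and games chosen purely for illustration: betting everything yields a high ensemble average driven by a handful of lucky “parallel” players while almost every individual trajectory ends in bankruptcy, whereas the Kelly bettor never goes bankrupt and the typical (median) outcome grows at roughly 2% per game.

```python
import numpy as np

# Sketch: many "parallel" players toss the p = 0.6 coin ten times, comparing the
# bet-everything strategy with the Kelly fraction x = 0.2 (illustrative numbers).
rng = np.random.default_rng(seed=1)
p, n_games, n_players = 0.6, 10, 100_000

wins = rng.random((n_players, n_games)) < p   # True where a player wins a game

def final_wealth(fraction):
    """Wealth after n_games, starting from 1 and betting `fraction` of it each game."""
    factors = np.where(wins, 1 + fraction, 1 - fraction)
    return factors.prod(axis=1)

all_in = final_wealth(1.0)   # expected-value maximizer: bet the whole bankroll
kelly = final_wealth(0.2)    # Kelly bettor

# The all-in ensemble average is close to the true expected value 1.2**10 ~ 6.2,
# yet roughly 99.4% of the players end up bankrupt.
print(f"all-in: ensemble average ~ {all_in.mean():.1f}, share bankrupt ~ {np.mean(all_in == 0):.3f}")
# The Kelly bettor is never bankrupt; the median outcome is ~ 1.22 ~ exp(10 * 0.02).
print(f"kelly:  ensemble average ~ {kelly.mean():.2f}, median ~ {np.median(kelly):.2f}")
```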

The Kelly criterion gives a more realistic answer: one thinks in terms of the only universe we actually live in and asks what the expected return of an investment is, calculated as an average over time.

Since we cannot go back in time — entropy and the “arrow of time” make this impossible — and the bankruptcy option is always at hand (extreme events and “black swans” are always possible), we have nothing to gain from thinking in terms of ensembles and “parallel universes.”

Actual events unfold in time and are often linked in a multiplicative process (as e.g. investment returns with “compound interest”) which is basically non-ergodic.

Instead of arbitrarily assuming that people have a certain type of utility function – as in the neoclassical theory – the Kelly criterion shows that we can obtain a less arbitrary and more accurate picture of real people’s decisions and actions by basically assuming that time is irreversible. When the bankroll is gone, it’s gone. The fact that in a parallel universe it could conceivably have been refilled is of little comfort to those who live in the one and only possible world that we call the real world.

Our coin toss example can be applied to more traditional economic issues. Think of an investor: we can basically describe his situation in terms of our coin toss. What fraction (x) of his assets (T) should an investor – who is about to make a large number of repeated investments – bet on his conviction that he can evaluate an investment better (p = 0.6) than the market (p = 0.5)? The greater x is, the greater the leverage – but also the greater the risk. Since p is the probability that his valuation is correct and (1 – p) the probability that the market’s valuation is correct, the Kelly criterion says he optimizes the growth rate of his investments by investing a fraction of his assets equal to the difference between the probabilities that he will “win” or “lose.” In our example this means that at each investment opportunity he should invest the fraction x = 0.6 – (1 – 0.6), i.e. 20% of his assets. The optimal average growth rate of his investments is then about 2% (0.6 log (1.2) + 0.4 log (0.8)).

Kelly’s criterion shows that because we cannot go back in time, we should not take excessive risks. High leverage increases the risk of bankruptcy. This should also be a warning for the financial world, where the constant quest for greater and greater leverage – and risk – creates extensive and recurrent systemic crises. A more appropriate level of risk-taking is a necessary ingredient in any policy aimed at curbing excessive risk-taking.

The works of people like Kelly and Kahneman show that expected utility theory is indeed a serious case of theory-induced blindness that transmogrifies truth.

Cherry picking economic models

21 Jul, 2016 at 11:25 | Posted in Economics | Comments Off on Cherry picking economic models

Chameleons arise and are often nurtured by the following dynamic. First a bookshelf model is constructed that involves terms and elements that seem to have some relation to the real world and assumptions that are not so unrealistic that they would be dismissed out of hand. The intention of the author, let’s call him or her “Q,” in developing the model may be to say something about the real world or the goal may simply be to explore the implications of making a certain set of assumptions. Once Q’s model and results become known, references are made to it, with statements such as “Q shows that X.” This should be taken as a short-hand way of saying “Q shows that under a certain set of assumptions it follows (deductively) that X,” but some people start taking X as a plausible statement about the real world. If someone skeptical about X challenges the assumptions made by Q, some will say that a model shouldn’t be judged by the realism of its assumptions, since all models have assumptions that are unrealistic. Another rejoinder made by those supporting X as something plausibly applying to the real world might be that the truth or falsity of X is an empirical matter and until the appropriate empirical tests or analyses have been conducted and have rejected X, X must be taken seriously. In other words, X is innocent until proven guilty. Now these statements may not be made in quite the stark manner that I have made them here, but the underlying notion still prevails that because there is a model for X, because questioning the assumptions behind X is not appropriate, and because the testable implications of the model supporting X have not been empirically rejected, we must take X seriously. Q’s model (with X as a result) becomes a chameleon that avoids the real world filters.

The best way to illustrate what chameleons are is to give some actual examples …

In April 2012 Harry DeAngelo and René Stulz circulated a paper entitled “Why High Leverage is Optimal for Banks.” The title of the paper is important here: it strongly suggests that the authors are claiming something about actual banks in the real world. In the introduction to this paper the authors explain what their model is designed to do:

“To establish that high bank leverage is the natural (distortion-free) result of intermediation focused on liquid-claim production, the model rules out agency problems, deposit insurance, taxes, and all other distortionary factors. By positing these idealized conditions, the model obviously ignores some important determinants of bank capital structure in the real world. However, in contrast to the MM framework – and generalizations that include only leverage-related distortions – it allows a meaningful role for banks as producers of liquidity and shows clearly that, if one extends the MM model to take that role into account, it is optimal for banks to have high leverage.” [emphasis added]

Their model, in other words, is designed to show that if we rule out many important things and just focus on one factor alone, we obtain the particular result that banks should be highly leveraged. This argument is for all intents and purposes analogous to the argument made in another paper entitled “Why High Alcohol Consumption is Optimal for Humans” by Bacardi and Mondavi. In the introduction to their paper Bacardi and Mondavi explain what their model does:

“To establish that high intake of alcohol is the natural (distortion free) result of human liquid-drink consumption, the model rules out liver disease, DUIs, health benefits, spousal abuse, job loss and all other distortionary factors. By positing these idealized conditions, the model obviously ignores some important determinants of human alcohol consumption in the real world. However, in contrast to the alcohol neutral framework – and generalizations that include only overconsumption- related distortions – it allows a meaningful role for humans as producers of that pleasant “buzz” one gets by consuming alcohol, and shows clearly that if one extends the alcohol neutral model to take that role into account, it is optimal for humans to be drinking all of their waking hours.”[emphasis added]

DeAngelo and Stulz’s model is clearly a bookshelf theoretical model that would not pass through any reasonable filter if we want to take its results and apply them directly to the real world. In addition to ignoring much of what is important (agency problems, taxes, systemic risk, government guarantees, and other distortionary factors), the results of their main model are predicated on the assets of the bank being riskless and are based on a posited objective function that is linear in the percentage of assets funded with deposits. Given this the authors naturally obtain a corner solution with assets 100% funded by deposits. (They have no explicit model addressing what happens when bank assets are risky, but they contend that bank leverage should still be “high” when risk is present) …

DeAngelo and Stulz’s paper is a good illustration of my claim that one can generally develop a theoretical model to produce any result within a wide range. Do you want a model that produces the result that banks should be 100% funded by deposits? Here is a set of assumptions and an argument that will give you that result. That such a model exists tells us very little. By claiming relevance without running it through the filter it becomes a chameleon …

Whereas some theoretical models can be immensely useful in developing intuitions, in essence a theoretical model is nothing more than an argument that a set of conclusions follows from a given set of assumptions. Being logically correct may earn a place for a theoretical model on the bookshelf, but when a theoretical model is taken off the shelf and applied to the real world, it is important to question whether the model’s assumptions are in accord with what we know about the world. Is the story behind the model one that captures what is important or is it a fiction that has little connection to what we see in practice? Have important factors been omitted? Are economic agents assumed to be doing things that we have serious doubts they are able to do? These questions and others like them allow us to filter out models that are ill suited to give us genuine insights. To be taken seriously models should pass through the real world filter.

Chameleons are models that are offered up as saying something significant about the real world even though they do not pass through the filter. When the assumptions of a chameleon are challenged, various defenses are made (e.g., one shouldn’t judge a model by its assumptions, any model has equal standing with all other models until the proper empirical tests have been run, etc.). In many cases the chameleon will change colors as necessary, taking on the colors of a bookshelf model when challenged, but reverting back to the colors of a model that claims to apply to the real world when not challenged.

Paul Pfleiderer

Reading Pfleiderer’s absolutely fabulous gem of an article reminded me of what H. L. Mencken once famously said:

There is always an easy solution to every problem – neat, plausible and wrong.

Pfleiderer’s perspective may be applied to many of the issues involved when modeling complex and dynamic economic phenomena. Let me take just one example — simplicity.

When it comes to modeling, I do see the point in simplicity, emphatically made time after time by e.g. Paul Krugman — as long as it doesn’t impinge on our truth-seeking. “Simple” macroeconomic models may of course be an informative heuristic tool for research. But if practitioners of modern macroeconomics do not investigate and make an effort to provide a justification for the credibility of the simplicity assumptions on which they erect their building, the discipline will not fulfill its tasks. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of “simple” macroeconomic models and theories. So far, I can’t really see that e.g. “simple” microfounded models have yielded very much in terms of realistic and relevant economic knowledge.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

But models do not only face theory. They also have to look to the world. Being able to model a “credible world,” a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though — as Pfleiderer acknowledges — all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealism has to be qualified.

Explanation, understanding and prediction of real-world phenomena, relations and mechanisms therefore cannot be grounded on simpliciter assuming simplicity. If we cannot show that the mechanisms or causes we isolate and handle in our models are stable — in the sense that when we export them from our models to our target systems they do not change from one situation to another — then they, considered “simple” or not, only hold under ceteris paribus conditions and are a fortiori of limited value for our understanding, explanation and prediction of our real-world target systems.

The obvious ontological shortcoming of a basically epistemic – rather than ontological – approach is that “similarity” or “resemblance” tout court do not guarantee that the correspondence between model and target is interesting, relevant, revealing or somehow adequate in terms of mechanisms, causal powers, capacities or tendencies. No matter how many convoluted refinements of concepts are made in the model, if the simplifications made do not result in models similar to reality in the appropriate respects (such as structure, isomorphism etc.), the surrogate system becomes a substitute system that does not bridge to the world but rather misses its target.

Constructing simple macroeconomic models somehow seen as “successively approximating” macroeconomic reality is a rather unimpressive attempt at legitimizing the use of fictitious idealizations for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies. Many of the model assumptions standardly made by neoclassical macroeconomics – simplicity being one of them – are restrictive rather than harmless and can, a fortiori, not in any sensible meaning be considered approximations at all.

If economists aren’t able to show that the mechanisms or causes that they isolate and handle in their “simple” models are stable in the sense that they do not change when exported to their “target systems,” those mechanisms only hold under ceteris paribus conditions and are a fortiori of limited value to our understanding, explanation or prediction of real economic systems.

That Newton’s theory in most regards is simpler than Einstein’s is of no avail. Today Einstein has replaced Newton. The ultimate arbiter of the scientific value of models cannot be simplicity.

As scientists we have to get our priorities right. Ontological under-labouring has to precede epistemology.

 

Footnote: And of course you understood that the Bacardi/Mondavi paper is fictional. Or?

The ugly face of racism

21 Jul, 2016 at 11:14 | Posted in Politics & Society | Comments Off on The ugly face of racism

 

Well, how is one supposed to react to these expressions of unabashedly swinish racism?

Perhaps by listening to Olof Palme

Why economists can’t reason

19 Jul, 2016 at 17:01 | Posted in Economics | 3 Comments

Reasoning is the process whereby we get from old truths to new truths, from the known to the unknown, from the accepted to the debatable … If the reasoning starts on firm ground, and if it is itself sound, then it will lead to a conclusion which we must accept, though previously, perhaps, we had not thought we should. And those are the conditions that a good argument must meet; true premises and a good inference. If either of those conditions is not met, you can’t say whether you’ve got a true conclusion or not.

Neoclassical economic theory today is in the story-telling business whereby economic theorists create make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. Since fully-fledged experiments on a societal scale as a rule are prohibitively expensive, ethically indefensible or unmanageable, economic theorists have to substitute experimenting with something else. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models” rather than engineering things happening in real economies.

Neoclassical economics has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence only plays a minor role in economic theory, where models largely function as a substitute for empirical evidence. The one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worthy of pursuing in economics, is a scientific cul-de-sac. To have valid evidence is not enough. What economics needs is sound evidence.

Avoiding logical inconsistencies is crucial in all science. But it is not enough. Just as important is avoiding factual inconsistencies. And without showing — or at least warrantedly arguing — that the assumptions and premises of their models are in fact true, mainstream economists aren’t really reasoning, but only playing games. Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality.

Goodbye Lenin!

19 Jul, 2016 at 11:42 | Posted in Varia | Comments Off on Goodbye Lenin!

 

How do we attach probabilities to the real world?

19 Jul, 2016 at 11:11 | Posted in Statistics & Econometrics | 1 Comment

Econometricians usually think that the data generating process (DGP) always can be modelled properly using a probability measure. The argument is standardly based on the assumption that the right sampling procedure ensures there will always be an appropriate probability measure. But – as always – one really has to argue the case, and present warranted evidence that real-world features are correctly described by some probability measure.

There are no such things as free-standing probabilities – simply because probabilities are, strictly seen, only defined relative to chance set-ups – probabilistic nomological machines like flipping coins or roulette wheels. And even these machines can be tricky to handle. Although prob(fair coin lands heads | I toss it) = prob(fair coin lands heads & I toss it)/prob(I toss it) may be well-defined, it’s not certain we can use it, since we cannot define the probability that I will toss the coin, given that I am not a nomological machine producing coin tosses.

No nomological machine – no probability.

A chance set-up is a nomological machine for probabilistic laws, and our description of it is a model that works in the same way as a model for deterministic laws … A situation must be like the model both positively and negatively – it must have all the characteristics featured in the model and it must have no significant interventions to prevent it operating as envisaged – before we can expect repeated trials to give rise to events appropriately described by the corresponding probability …

Probabilities attach to the world via models, models that serve as blueprints for a chance set-up – i.e., for a probability-generating machine … Once we review how probabilities are associated with very special kinds of models before they are linked to the world, both in probability theory itself and in empirical theories like physics and economics, we will no longer be tempted to suppose that just any situation can be described by some probability distribution or other. It takes a very special kind of situation with the arrangements set just right – and not interfered with – before a probabilistic law can arise …

Probabilities are generated by chance set-ups, and their characterisation necessarily refers back to the chance set-up that gives rise to them. We can make sense of probability of drawing two red balls in a row from an urn of a certain composition with replacement; but we cannot make sense of the probability of six per cent inflation in the United Kingdom next year without an implicit reference to a specific social and institutional structure that will serve as the chance set-up that generates this probability.

Küssen kann man nicht alleine

18 Jul, 2016 at 19:33 | Posted in Varia | Comments Off on Küssen kann man nicht alleine

 

Ich hab’ mein Herz in Heidelberg verloren (personal)

18 Jul, 2016 at 19:02 | Posted in Varia | 2 Comments

[Photo: Heidelberg]

On probability distributions and uncertainty

18 Jul, 2016 at 17:31 | Posted in Economics | Comments Off on On probability distributions and uncertainty

Almost a hundred years after John Maynard Keynes wrote his seminal A Treatise on Probability (1921), it is still very difficult to find economics textbooks that seriously try to incorporate his far-reaching and incisive analysis of induction and evidential weight.

The standard view in mainstream economics – and the axiomatic probability theory underlying it – is to a large extent based on the rather simplistic idea that ‘more is better.’ But as Keynes argues – ‘more of the same’ is not what is important when making inductive inferences. It’s rather a question of ‘more but different.’

Variation, not replication, is at the core of induction. Finding that p(x|y) = p(x|y & w) doesn’t make w ‘irrelevant.’ Knowing that the probability is unchanged when w is present gives p(x|y & w) another evidential weight (‘weight of argument’). Running 10 replicative experiments does not make you as ‘sure’ of your inductions as running 10 000 varied experiments – even if the probability values happen to be the same.
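One crude way to illustrate part of this point (the sample-size side of it, not Keynes's distinction between replication and variation) is a simple Bayesian calculation of my own, not Keynes's formalism: the same estimated probability can sit on top of very different posterior spreads, and it is that spread which carries the 'weight.'

```python
import numpy as np

# Illustration (my own, not Keynes's formalism): with a uniform Beta(1, 1) prior,
# k successes in n trials give a Beta(1 + k, 1 + n - k) posterior.
def posterior_mean_sd(k, n):
    a, b = 1 + k, 1 + n - k
    mean = a / (a + b)
    sd = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, sd

for k, n in [(6, 10), (6_000, 10_000)]:
    mean, sd = posterior_mean_sd(k, n)
    print(f"{n:>6} trials: estimated p ~ {mean:.3f}, posterior sd ~ {sd:.3f}")
# 10 trials:     p ~ 0.583, sd ~ 0.137  -> same point estimate, little 'weight'
# 10,000 trials: p ~ 0.600, sd ~ 0.005  -> much greater evidential weight
```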

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but ‘rational expectations.’ Keynes rather thinks that we base our expectations on the confidence or ‘weight’ we put on different events and alternatives. To Keynes expectations are a question of weighing probabilities by ‘degrees of belief,’ beliefs that often have preciously little to do with the kind of stochastic probabilistic calculations made by the rational agents as modeled by mainstream social economics.

How strange that writers of economics textbooks as a rule do not even touch upon these aspects of scientific methodology that seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess on why this is so would be that Keynes’s concepts are not possible to squeeze into a single calculable numerical ‘probability.’ In the quest for quantities one turns a blind eye to qualities and looks the other way – but Keynes’s ideas keep creeping out from under the mainstream economics carpet.

It’s high time that economics textbooks give Keynes his due.

There are at least two ways to formally distinguish Keynes’s idea that the future is unknowable in principle from the neoclassical idea that the future is stochastic-stable and that agents know, or act as if they know, this distribution with absolute certainty. First, as Keynes, Shackle, Vickers, and others have stressed, it is logically impossible for agents to assign numerical probabilities to the potentially infinite number of imaginable future states. Even Savage acknowledged that, taken literally, the assumption that agents are able to consider all possible future economic states “is utterly ridiculous” (1954, p. 16). Worse yet, many possible future events are not even imaginable in the present moment: such events obviously cannot be assigned a probability …

Alternatively, we could — for the sake of argument — think of firms and portfolio selectors as somehow forcing themselves to assign expected future returns to all the assets under evaluation even though they are conscious of the fact that their knowledge of the future is inherently incomplete and unreliable. The key point is that such subjective probability distributions would not be knowledge, and — most important — any rational agent would know they were not knowledge … Hicks insisted that in the nonergodic real world, people “do not know what is going to happen and know that they do not know what is going to happen. As in history!” …

Therefore, even given the unrealistic assumption of the existence of these distributions, there is a crucial piece of information about agent decision making that would be missing from any subjectivist theory — the extent to which the agents believe in the meaningfulness of their forecasts or, in Keynes’s words, the “weight of belief” or “the degree of rational belief” the agents assign to these probabilities. When knowledge of the future is subjective and imperfect, as it always is, the expectations of rational agents can never be fully and adequately represented solely by probability distributions because such distributions fail to incorporate the agents’ own understanding of the degree of incompleteness of their knowledge. These functions neglect the agents’ “confidence” in the meaningfulness of the forecasts — “how highly we rate the likelihood of our best forecast turning out to be quite wrong” (Keynes 1936, p. 148).

Keynes stressed the centrality of agents’ consciousness of their ignorance: the state of confidence plays a crucial role in his theory of the investment decision. “The state of confidence [in the ability to make meaningful forecasts] is relevant because it is one of the major factors determining [investment]” (1936, p. 149). The central role of confidence in the investment decision-making process has disappeared from mainstream Keynesian models and cannot exist by assumption in New Classical and neoclassical models.

James Crotty

Economics for everyone

18 Jul, 2016 at 13:26 | Posted in Economics | 2 Comments

 

What is ‘effective demand’?

18 Jul, 2016 at 11:04 | Posted in Economics | 7 Comments

Economists of all shades have generally misunderstood the theoretical structure of Keynes’s The General Theory. Quite often this is a result of misunderstanding the concept of ‘effective demand’ — one of the key theoretical innovations of The General Theory.

Jesper Jespersen untangles the concept and shows how Keynes, by taking uncertainty seriously, contributed to forming an analytical alternative to the prevailing neoclassical general equilibrium framework:

Effective demand is one of the distinctive analytical concepts that Keynes developed in The General Theory. Demand and demand management have thereby come to represent one of the distinct trademarks of Keynesian macroeconomic theory and policy. It is not without reason that the central position of this concept has left the impression that Keynes’s macroeconomic model predominantly consists of theories for determining demand, while the supply side is neglected. From here it is a short step within a superficial interpretation to conclude that Keynes (and post-Keynesians) had ended up in a theoretical dead end, where macroeconomic development is exclusively determined by demand factors …

It is the behaviour of profit-seeking firms acting under the ontological condition of uncertainty that is at the centre of post-Keynesian concept of effective demand. It is entrepreneurs’ expectations with regard to demand and supply factor that determine their plans for output as a whole and by that the effective demand for labour.

Therefore, it was somewhat unfortunate that Keynes called his new analytical concept ‘effective demand’, which may have contributed to misleading generations of open minded macroeconomists to concluding that it was exclusively realized demand for consumer and investment goods that drives the macroeconomic development. Hereby a gateway for the IS/LM-model interpretation of effective demand was opened, where demand creates its own supply.

On the contrary, it is the interaction between the sum of the individual firms’ sales expectations (aggregate demand) and their estimated production costs (aggregate supply) that together with a number of institutional conditions (bank credit, labour market organization, global competition and technology) determine the business sector decisions on output as a whole and employment …

The supply side in the goods market is an aggregate presentation of firms’ cost functions considered as a whole. It shows a relation between what Keynes called ‘supply price’, i.e. the sales proceeds that, given the production function and cost structures, is needed to ‘just make it worth the while of the entrepreneurs to give that employment’ (Keynes, 1936: 24). This means that behind the supply curve there is a combination of variable costs plus an expected profit at different levels of employment. At each level firms try to maximise their profit, if they succeed there is no (further) incentive for firms to change production or employment.

These assumptions entail that the aggregate supply function (what Keynes called the Z-curve) is upward sloping and represents the proceeds that has to be expected by the industry as a whole to make a certain employment ‘worth undertaken’ … In fact, this aggregate supply function looks like it was taken directly from a standard, neoclassical textbook, where decreasing marginal productivity of labour within the representative firm is assumed; the main difference is that Keynes is dealing with the aggregate sum of heterogeneous firms i.e. the industry as a whole.

The other equally important part of effective demand is aggregate demand function, which is the value of the sales that firms as a whole expect at different levels of macro-activity measured by employment (as a whole) …

Firms make a kind of survey-based expectation with regard to the most likely development in sales and proceeds in the nearer future. This expectation of aggregate demand (as a whole) is a useful point of departure for the individual firms when they have to form their specific expectation of future proceeds. This sales expectation will therefore centre around the future macroeconomic demand (and on the intensity of international competition).

Accordingly, Keynes’s macro-theory has a microeconomic foundation of firms trying to maximise profit, but differs from neoclassical theory by introducing uncertainty related to the future, which makes an explicit introduction of aggregate demand relevant i.e. the expected sales proceeds by business as a whole.

‘Effective demand’ is nothing but the value of the aggregate demand function where it equals the aggregate supply function, which is at the point where the firms expect to maximize their profits. To Keynes — contrary to the ‘classical’ theory, which assumes that aggregate demand always accommodates to aggregate supply and hence is consistent with ‘effective demand’ having an infinite range of values — ‘effective demand’ has a unique equilibrium value. And — most importantly — that value may be at a level below the one that is compatible with full employment.
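As a toy illustration of that definition, here is a short sketch with functional forms that are entirely my own assumptions (not Keynes's or Jespersen's): an upward-sloping aggregate supply (Z) function and an aggregate demand function, with effective demand read off at the employment level where the two cross.

```python
import numpy as np

# Purely illustrative functional forms (my own assumptions): proceeds Z(N) needed
# to make employment N "worth undertaking", and expected sales proceeds D(N).
def Z(N):
    return 50 * N + 0.05 * N ** 2   # rising marginal cost: upward-sloping Z-curve

def D(N):
    return 2000 + 40 * N            # expected sales proceeds as a whole

# Effective demand: the value of D at the employment level where D(N) = Z(N).
N = np.linspace(1, 1000, 100_000)
N_star = N[np.argmin(np.abs(D(N) - Z(N)))]
print(f"employment N* ~ {N_star:.0f}, effective demand ~ {D(N_star):.0f}")
```

Nothing in these particular numbers matters; the point is only that effective demand is a single equilibrium value determined by the intersection, and there is no guarantee that the corresponding employment level is full employment.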

The invisible hand — invisible because it’s not there

17 Jul, 2016 at 22:13 | Posted in Economics | 1 Comment

Daniel Kahneman … has demonstrated how individuals systematically behave in ways less rational than orthodox economists believe they do. His research shows not only that individuals sometimes act differently than standard economic theories predict, but that they do so regularly, systematically, and in ways that can be understood and interpreted through alternative hypotheses, competing with those utilised by orthodox economists.

To most market participants – and, indeed, ordinary observers – this does not seem like big news … In fact, this irrationality is no news to the economics profession either. John Maynard Keynes long ago described the stock market as based not on rational individuals struggling to uncover market fundamentals, but as a beauty contest in which the winner is the one who guesses best what the judges will say …

Adam Smith’s invisible hand – the idea that free markets lead to efficiency as if guided by unseen forces – is invisible, at least in part, because it is not there …

For more than 20 years, economists were enthralled by so-called “rational expectations” models which assumed that all participants have the same (if not perfect) information and act perfectly rationally, that markets are perfectly efficient, that unemployment never exists (except when caused by greedy unions or government minimum wages), and where there is never any credit rationing.

That such models prevailed, especially in America’s graduate schools, despite evidence to the contrary, bears testimony to a triumph of ideology over science. Unfortunately, students of these graduate programmes now act as policymakers in many countries, and are trying to implement programmes based on the ideas that have come to be called market fundamentalism … Good science recognises its limitations, but the prophets of rational expectations have usually shown no such modesty.

Joseph Stiglitz

On the limits of the invisible hand

17 Jul, 2016 at 13:43 | Posted in Economics | 5 Comments


It might look trivial at first sight, but what Harold Hotelling did show in his classic paper Stability in Competition (1929) was that there are cases when Adam Smith’s invisible hand doesn’t actually produce a social optimum.

With the advent of neoclassical economics at the end of the 19th century a large amount of intellectual energy was invested in trying to formalize the stringent conditions of obtaining equilibrium and showing in what way the prices and quantities of free competition constituted some kind of social optimum.

That the equilibrium reached in free competition is an optimum for each individual – given prevailing prices and income distribution – was not, however, seen by some economists as making a very strong case for a free market economy per se. It wasn’t possible to prove that free trade and competition gave a maximum of social utility. The gains made in exchange weren’t a manifestation of a maximum social utility.

Knut Wicksell was one of those who criticized the idea of regarding the gain in utility arising from free competition as an absolute maximum. This market fundamentalist idea of harmony in a free market system didn’t live up to Wicksell’s demand for objectivity in science – and “the harmony economists, who endeavoured to extend the doctrine so that it might become a defence of the existing distribution of wealth” were judged severely by Wicksell (Lectures 1934 (1901) p. 39).

When propounders of the new marginalist theory – especially Walras and Pareto – overstepped the strict boundaries of science and used it in ascribing to the market properties it did not possess, Wicksell had to react. To Wicksell (Lectures 1934 (1901) p. 73) it was

almost tragic that Walras … imagined that he had found the rigorous proof … merely because he clothed in mathematical formula the very arguments which he considered insufficient when they were expressed in ordinary language.

But what about the Pareto criterion? Wicksell had actually more or less anticipated it in his review (in Zeitschrift für Volkswirtschaft, Sozialpolitik und Verwaltung, 1913: 132-51) of Pareto’s Manuel, but didn’t think it really contributed anything useful. It was just the same old doctrine in a new disguise. To Wicksell the market fundamentalist doctrine of the Lausanne School obviously didn’t constitute an advance in economics.

From a methodological point of view there are also one or two lessons to learn from this history.

Models may help us to explain things by providing us with a frame/instrument for analysing and explaining the real world. However, to do that, there has to be an adequate similarity between model and reality. Otherwise they cannot function as eye openers that widen our cognitive horizon and make it possible to see and detect fundamental forces/ relations/mechanisms operating in the real world. And — most importantly — we always have to scrutinize the assumptions the models build on and so test their plausibility.

Logic, coherence, consistency, simplicity, and deductivity are not enough. Without confronting models with the real world, they are nothing but empty thought experiments — and should be treated as such.

David K. Levine — unlucky when trying to think

17 Jul, 2016 at 11:54 | Posted in Economics | 3 Comments

In the wake of the latest financial crisis many people have come to wonder why economists have never been able to predict the manias, panics and crashes that haunt our economies.

In responding to these warranted wonderings, some economists – like professor David K. Levine in the article Why Economists Are Right: Rational Expectations and the Uncertainty Principle in Economics in the Huffington Post – have maintained that
 
 

it is a fundamental principle that there can be no reliable way of predicting a crisis.

To me this is a totally inadequate answer. And trying to make an honour out of one’s own science’s inability to answer legitimate questions is indeed proof of a rather arrogant and insulting attitude.

Fortunately yours truly is not the only one reacting to this guy’s arrogance …

Steve Blough trolls me this morning over on the Twitter Machine about the truly remarkable ignorance of economics professor David K. Levine:

I confess I am embarrassed for my great-grandfather Roland Greene Usher, who sweated blood all his life trying to help build Washington University in St. Louis into a great university, that WUSTL now employs people like David K. Levine:

Levine, you see, appears to believe that we live not in a monetary but in a barter economy. And so Levine claims that the Friedmanite-monetarist expansionary policies to fight recessions recommended by Milton Friedman cannot, in fact, work:

David K. Levine: The Keynesian Illusion:
I want to think here of a complete economy peopled by real people … a phone guy who makes phones, a burger flipper, a hairdresser and a tattoo artist…. The burger flipper only wants a phone, the hairdresser only wants a burger, the tattoo artist only wants a haircut and the phone guy only wants a tattoo…. Each can produce one phone, burger, haircut or tattoo…. The phone guy produces a phone, trades it to the tattoo artist in exchange for a tattoo, who trades the phone to the hairdresser in exchange for a haircut, who trades it to burger flipper in exchange for a burger. All are employed… everyone is happy.

Now suppose that the phone guy suddenly decides he doesn’t like tattoos enough to be bothered building a phone…. Catastrophe. Everyone is unemployed…. The stupid phone guy… is lazy and doesn’t want to work…. The burger flipper would like to work making burgers if he can get a phone, the hairdresser would like cut hair if he could get a burger and the tattoo artist would like to work if he could get a haircut and yet all are unemployed …

Maybe the government should follow Keynes’s [note: Levine means “Milton Friedman’s” here] advice and print some money…. Then the phone guy can buy a tattoo, and the tattoo guy can buy a haircut and the haircutter can buy a burger, and the burger flipper — ooops… he can’t buy a phone because there are no phones…. [Perhaps] the burger flipper realizes he shouldn’t sell the burger because he can’t buy anything he wants… and we are right back… with everyone unemployed…. Maybe he doesn’t realize that and gets left holding the bag… a Ponzi scheme…. It seems like a poor excuse for economic policy that our plan is that we hope the burger flipper will be a fool and be willing to be left holding the bag.

DKL’s argument that Friedmanite-monetarist expansionary policies cannot cure a downturn is, I believe, correct — if the downturn is caused by a sudden outbreak of worker laziness, an adverse supply shock that reduces potential output.

Expansionary monetary policy in such a situation will indeed produce inflation. People’s expectations of the prices at which they will be able to buy are disappointed on the upside as too much money chases too few goods. It is not clear to me why DKL calls this a “Ponzi scheme” rather than “unanticipated inflation”.

But does anybody — save DKL — believe that an extraordinary and contagious outbreak of worker laziness is what caused the downturn that began in 2008?

No.

Everybody else believes that the downturn that began in 2008 occurred not because of a supply shock in which workers suddenly became lazy but because of a demand shock in which the financial crisis caused nearly everybody in the economy to try to rebuild their stocks of safe, liquid, secure financial assets. Everybody else believes that the right way to model the economy is not the barter economy of DKL — trading phones for tattoos, etc. — but as a monetary economy, in which people hold stocks of financial assets and trade them for currently-produced goods and services.

This matters.

This matters a lot.

Brad DeLong

Per Svensson — yet another of these anti-democratic democrats

17 Jul, 2016 at 10:42 | Posted in Politics & Society | Comments Off on Per Svensson — yet another of these anti-democratic democrats

Before the Brexit vote, few commentators objected to Britain having chosen to let its citizens decide in a referendum whether or not they wanted to remain in the EU. To most people this seemed just as self-evident as Sweden holding a referendum, a little more than ten years ago, on whether we wanted to join the EMU or not.

But once the — to most people — surprising result of the Brexit vote was clear, the tune changed. When ‘the people’ did not vote as ‘the establishment’ wished, referendums were suddenly sooo wrong.

One of all these pundits and commentators now appalled that the British ‘voted wrong’ is Sydsvenskan’s Per Svensson. In an article with the headline Med folkets stöd mot avgrunden (‘With the people’s support towards the abyss’), Svensson — honorary doctor at Malmö University — argues that referendums are mostly something that ‘caricatures’ democracy and is exploited by ‘populists’ and ‘charlatans.’ The thought that a majority of Britons wanting to leave the EU might have something to do with the fact that, for the poorest and weakest groups, the EU and its austerity policies have not delivered a thing, obviously never occurs to Svensson.

When the result did not go the establishment’s way, the referendum as an institution suddenly no longer counts as an expression of ‘real’ democracy.

May one suggest that Mr Svensson take a trip to Switzerland? Or is that perhaps also just a country where ‘the elite’ exploits ‘the people’ to push through its own interests?

Democracy is not a goodie bag.

Democracy is not something we stand up for only when it results in decisions we like.

Democracy and the ‘rule of law’ are things we must safeguard. Everywhere. Always.

L’urne que nous interrogeons

15 Jul, 2016 at 19:19 | Posted in Economics | Comments Off on L’urne que nous interrogeons

In my judgment, the practical usefulness of those modes of inference, here termed Universal and Statistical Induction, on the validity of which the boasted knowledge of modern science depends, can only exist—and I do not now pause to inquire again whether such an argument must be circular—if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appear more and more clearly as the ultimate result to which material science is tending …

The physicists of the nineteenth century have reduced matter to the collisions and arrangements of particles, between which the ultimate qualitative differences are very few …

The validity of some current modes of inference may depend on the assumption that it is to material of this kind that we are applying them … Professors of probability have been often and justly derided for arguing as if nature were an urn containing black and white balls in fixed proportions. Quetelet once declared in so many words—“l’urne que nous interrogeons, c’est la nature.” But again in the history of science the methods of astrology may prove useful to the astronomer; and it may turn out to be true—reversing Quetelet’s expression—that “La nature que nous interrogeons, c’est une urne”.

Professors of probability and statistics, yes. And more or less every mainstream economist!

On tour (personal)

14 Jul, 2016 at 09:37 | Posted in Varia | Comments Off on On tour (personal)

Touring Germany, yours truly also had time to visit my son, who is completing his law studies at Heidelberg University. An amazingly beautiful town.
 
[Photo: Heidelberg Castle and Bridge]

Critical inspiration

11 Jul, 2016 at 09:28 | Posted in Economics | 4 Comments


Almost a century and a half after Léon Walras founded neoclassical general equilibrium theory, economists still have not been able to show that markets move economies to equilibria. What we do know is that — under very restrictive assumptions — unique Pareto-efficient equilibria do exist.

But what good does that do? As long as we cannot show, except under exceedingly unrealistic assumptions, that there are convincing reasons to suppose there are forces which lead economies to equilibria – the value of general equilibrium theory is nil. As long as we cannot really demonstrate that there are forces operating — under reasonable, relevant and at least mildly realistic conditions — that move markets towards equilibria, there cannot really be any sustainable reason for anyone to take any interest in or pay attention to this theory. A stability that can only be proved by assuming “Santa Claus” conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids.

Continuing to model a world full of agents behaving as economists — “often wrong, but never uncertain” — and still not being able to show that the system under reasonable assumptions converges to equilibrium (or simply assuming the problem away) is a gross misallocation of intellectual resources and time.

In case you think this verdict is only a heterodox idiosyncrasy, here’s what one of the world’s greatest microeconomists — Alan Kirman — writes in his thought provoking paper The intrinsic limits of modern economic theory:

If one maintains the fundamentally individualistic approach to constructing economic models no amount of attention to the walls will prevent the citadel from becoming empty …

[The results of Sonnenschein (1972), Debreu (1974), Mantel (1976) and Mas-Colell (1985)] show clearly why any hope for uniqueness or stability must be unfounded …

The idea that we should start at the level of the isolated individual is one which we may well have to abandon … we should be honest from the outset and assert simply that by assumption we postulate that each sector of the economy behaves as one individual and not claim any spurious microjustification …

Economists therefore should not continue to make strong assertions about this behaviour based on so-called general equilibrium models which are, in reality, no more than special examples with no basis in economic theory as it stands.

Getting around Sonnenschein-Mantel-Debreu using representative agents may be — from a purely formalistic point of view — very expedient. But relevant and realistic? No way!

Although garmented as a representative agent, the emperor is still naked.

Axel Leijonhufvud on why economics has become so boring

11 Jul, 2016 at 00:05 | Posted in Economics | Comments Off on Axel Leijonhufvud on why economics has become so boring


Trying to delineate the difference between ‘New Keynesianism’ and ‘Post Keynesianism’ — during an interview a couple of months ago — yours truly was confronted by the odd and confused view that Axel Leijonhufvud was a ‘New Keynesian.’ I wasn’t totally surprised — I had run into that misapprehension before — but still, it’s strange how wrong people sometimes get things.

The  last time I met Axel, we were both invited keynote speakers at the conference “Keynes 125 Years – What Have We Learned?” in Copenhagen. Axel’s speech was later published as Keynes and the crisis and contains the following thought provoking passages:

For many years now, the main alternative to Real Business Cycle Theory has been a somewhat loose cluster of models given the label of New Keynesian theory. New Keynesians adhere on the whole to the same DSGE modeling technology as RBC macroeconomists but differ in the extent to which they emphasise inflexibilities of prices or other contract terms as sources of short-term adjustment problems in the economy. The “New Keynesian” label refers back to the “rigid wages” brand of Keynesian theory of 40 or 50 years ago. Except for this stress on inflexibilities this brand of contemporary macroeconomic theory has basically nothing Keynesian about it.

The obvious objection to this kind of return to an earlier way of thinking about macroeconomic problems is that the major problems that have had to be confronted in the last twenty or so years have originated in the financial markets – and prices in those markets are anything but “inflexible”. But there is also a general theoretical problem that has been festering for decades with very little in the way of attempts to tackle it. Economists talk freely about “inflexible” or “rigid” prices all the time, despite the fact that we do not have a shred of theory that could provide criteria for judging whether a particular price is more or less flexible than appropriate to the proper functioning of the larger system. More than seventy years ago, Keynes already knew that a high degree of downward price flexibility in a recession could entirely wreck the financial system and make the situation infinitely worse. But the point of his argument has never come fully to inform the way economists think about price inflexibilities …

I began by arguing that there are three things we should learn from Keynes … The third was to ask whether events proved that existing theory needed to be revised. On that issue, I conclude that dynamic stochastic general equilibrium theory has shown itself an intellectually bankrupt enterprise. But this does not mean that we should revert to the old Keynesian theory that preceded it (or adopt the New Keynesian theory that has tried to compete with it). What we need to learn from Keynes, instead, are these three lessons about how to view our responsibilities and how to approach our subject.

Economics has become boring? Yes. Axel Leijonhufvud a ‘New Keynesian’? Forget it!

How Richard Posner became a Keynesian

10 Jul, 2016 at 15:52 | Posted in Economics | 1 Comment

Until [2008], when the banking industry came crashing down and depression loomed for the first time in my lifetime, I had never thought to read The General Theory of Employment, Interest, and Money, despite my interest in economics … I had heard that it was a very difficult book and that the book had been refuted by Milton Friedman, though he admired Keynes’s earlier work on monetarism. I would not have been surprised by, or inclined to challenge, the claim made in 1992 by Gregory Mankiw, a prominent macroeconomist at Harvard, that “after fifty years of additional progress in economic science, The General Theory is an outdated book. . . . We are in a much better position than Keynes was to figure out how the economy works.”

We have learned since [2008] that the present generation of economists has not figured out how the economy works …

Baffled by the profession’s disarray, I decided I had better read The General Theory. Having done so, I have concluded that, despite its antiquity, it is the best guide we have to the crisis …

It is an especially difficult read for present-day academic economists, because it is based on a conception of economics remote from theirs. This is what made the book seem “outdated” to Mankiw — and has made it, indeed, a largely unread classic … The dominant conception of economics today, and one that has guided my own academic work in the economics of law, is that economics is the study of rational choice … Keynes wanted to be realistic about decision-making rather than explore how far an economist could get by assuming that people really do base decisions on some approximation to cost-benefit analysis …

Economists may have forgotten The General Theory and moved on, but economics has not outgrown it, or the informal mode of argument that it exemplifies, which can illuminate nooks and crannies that are closed to mathematics. Keynes’s masterpiece is many things, but “outdated” it is not.

Richard Posner
