Krugman’s gadget interpretation of economics

31 August, 2016 at 19:26 | Posted in Economics | Comments Off on Krugman’s gadget interpretation of economics

Paul Krugman has often been criticized by people like yours truly for getting the economics of John Maynard Keynes pretty wrong.

When Krugman has responded to the critique — which he rather gratuitously portrays as being about “What Keynes Really Meant” — the overall conclusion has been “Krugman Doesn’t Care.”

Responding to a post up here on Krugman not being a real Keynesian, Krugman writes:

Surely we don’t want to do economics via textual analysis of the masters. The questions one should ask about any economic approach are whether it helps us understand what’s going on, and whether it provides useful guidance for decisions.

So I don’t care whether Hicksian IS-LM is Keynesian in the sense that Keynes himself would have approved of it, and neither should you.

The reason for this rather debonair attitude seems to be that the history of economic thought may be OK, but what really counts is whether reading Keynes gives birth to new and interesting insights and ideas.

No serious economist would question that explaining and understanding “what’s going on” in our economies is the most important task economists can set themselves — but it is not the only task. And comparing one’s favourite economic gadget model to what “austerians” and other madmen from Chicago have conjured up — well, that’s like playing tennis with the net down. We have to have higher aspirations as scientists.

Although I have a lot of sympathy for Krugman’s view on authority, there is also a somewhat disturbing and unbecoming coquetry in his attitude towards the great forerunners he is discussing. Krugman is a great economist, but it smacks not a little of hubris to simply say “if where you take the idea is very different from what the great man said somewhere else in his book, so what?” Physicists arguing like that about Newton, Einstein, Bohr or Feynman would not be taken seriously.

Krugman’s comment on this issue is interesting, however, because it sheds light on a kind of inconsistency in his own art of argumentation. Over the last couple of years Krugman has in more than one article criticized mainstream economics for using too much (bad) mathematics and axiomatics in its model-building endeavours. But when it comes to defending his own position on various issues, he himself usually falls back on the same kind of models — models that, when it comes to methodology and assumptions, actually have a lot in common with the kind of model-building he otherwise criticizes. And although Krugman repeatedly says that he is a strong believer in “simple models,” those models are far from simple (at least in any interesting sense of the word).

But I think the absolute all-time low in Krugman’s response is this remarkable passage:

Has declaring uncertainty to be unquantifiable, and mathematical modeling in any form foolish, been productive? Remember, that’s what the Austrians say too.

I won’t comment on the shameful guilt-by-association part of the quote, but regarding uncertainty it’s absolutely gobsmacking how Krugman manages to mix up the ontological question — is the economy permeated by calculable risk or by genuine and often incalculable uncertainty? — with the epistemological question — how do we manage to analyze/understand/explain/model such an economy? Here Krugman seems to say — much in the spirit of Robert Lucas — that if reality is uncertain and non-ergodic, well then let’s just pretend it’s ergodic and susceptible to standard probabilistic analysis, so that we can go on with our FORTRAN programs and mathematical models! In other areas of science that would rightfully be considered fraud, but in “modern” neoclassical mainstream economics it is obviously thought of as an unproblematic and justified procedure.

And then, of course, not really trying to clinch the deep theoretical issue at stake, Krugman for the nth time puts forward his IS-LM gadget interpretation of economics.

Being able to model a “gadget world” — a world that somehow could be considered real or similar to the real world — is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still serve our pursuit of truth. But then they cannot be unrealistic or false in just any way — the falsehood or lack of realism has to be qualified.

No matter how many convoluted refinements of concepts are made in the gadget model, if the “successive approximations” do not result in models similar to reality in the appropriate respects (such as structure or isomorphism), the surrogate gadget system becomes a substitute system that does not bridge to the world but rather misses its target.

So — constructing gadgets like IS-LM macroeconomic models as “stylized facts” that somehow “successively approximate” macroeconomic reality is a rather unimpressive attempt at legitimizing the use of fictitious idealizations for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies. Many of the model assumptions made in IS-LM models and “New Keynesian” DSGE models are restrictive rather than harmless, and can a fortiori not in any sensible sense be considered approximations at all.

Where does all this leave us? Well, I, for one, am not the least impressed by Krugman’s gadget interpretation of economics. And if labels are as uninteresting as he says — well, then I suggest Krugman and other “New Keynesians” stop calling themselves Keynesians at all. I’m pretty sure Keynes would have appreciated not having his theories and thoughts referred to by people who have precious little to do with them.

Studying great forerunners like Keynes may help us to construct better and more relevant economic models – models that really help us to explain and understand reality. So when Krugman writes

Second — and this plays a surprisingly big role in my own pedagogical thinking — we do want, somewhere along the way, to get across the notion of the self-correcting economy, the notion that in the long run, we may all be dead, but that we also have a tendency to return to full employment via price flexibility

I would certainly recommend that he compare his own statement with what Keynes himself wrote:

Though we all started out in the same direction, we soon parted company into two main groups. What made the cleavage that thus divided us?

On the one side were those who believed that the existing economic system is in the long run self-adjusting, though with creaks and groans and jerks, and interrupted by time-lags, outside interference and mistakes … These economists did not, of course, believe that the system is automatic or immediately self-adjusting, but they did maintain that it has an inherent tendency towards self-adjustment, if it is not interfered with, and if the action of change and chance is not too rapid.

Those on the other side of the gulf, however, rejected the idea that the existing economic system is, in any significant sense, self-adjusting. They believed that the failure of effective demand to reach the full potentialities of supply, in spite of human psychological demand being immensely far from satisfied for the vast majority of individuals, is due to much more fundamental causes …

The gulf between these two schools of thought is deeper, I believe, than most of those on either side of it realize. On which side does the essential truth lie?

The strength of the self-adjusting school depends on its having behind it almost the whole body of organized economic thinking and doctrine of the last hundred years. This is a formidable power. It is the product of acute minds and has persuaded and convinced the great majority of the intelligent and disinterested persons who have studied it. It has vast prestige and a more far-reaching influence than is obvious. For it lies behind the education and the habitual modes of thought, not only of economists but of bankers and business men and civil servants and politicians of all parties …

Thus, if the heretics on the other side of the gulf are to demolish the forces of nineteenth-century orthodoxy … they must attack them in their citadel … Now I range myself with the heretics. I believe their flair and their instinct move them towards the right conclusion. But I was brought up in the citadel and I recognize its power and might … For me, therefore, it is impossible to rest satisfied until I can put my finger on the flaw in the part of the orthodox reasoning that leads to the conclusions that for various reasons seem to me to be inacceptable. I believe that I am on my way to do so. There is, I am convinced, a fatal flaw in that part of the orthodox reasoning that deals with the theory of what determines the level of effective demand and the volume of aggregate employment …

John Maynard Keynes (1934)

The reality of how money is created

27 August, 2016 at 15:32 | Posted in Economics | 4 Comments

Everything we know is not just wrong – it’s backwards. When banks make loans, they create money. This is because money is really just an IOU. The role of the central bank is to preside over a legal order that effectively grants banks the exclusive right to create IOUs of a certain kind, ones that the government will recognise as legal tender by its willingness to accept them in payment of taxes.

There’s really no limit on how much banks could create, provided they can find someone willing to borrow it. They will never get caught short, for the simple reason that borrowers do not, generally speaking, take the cash and put it under their mattresses; ultimately, any money a bank loans out will just end up back in some bank again. So for the banking system as a whole, every loan just becomes another deposit. What’s more, insofar as banks do need to acquire funds from the central bank, they can borrow as much as they like; all the latter really does is set the rate of interest, the cost of money, not its quantity. Since the beginning of the recession, the US and British central banks have reduced that cost to almost nothing. In fact, with “quantitative easing” they’ve been effectively pumping as much money as they can into the banks, without producing any inflationary effects.

What this means is that the real limit on the amount of money in circulation is not how much the central bank is willing to lend, but how much government, firms, and ordinary citizens, are willing to borrow. Government spending is the main driver in all this … So there’s no question of public spending “crowding out” private investment. It’s exactly the opposite.

David Graeber

Sounds odd, doesn’t it?

This guy must surely be one of those strange and dangerous heterodox cranks?

Well, maybe you should reconsider …

The reality of how money is created today differs from the description found in some economics textbooks:
• Rather than banks receiving deposits when households save and then lending them out, bank lending creates deposits.
• In normal times, the central bank does not fix the amount of money in circulation, nor is central bank money ‘multiplied up’ into more loans and deposits …
Most of the money in circulation is created, not by the printing presses of the Bank of England, but by the commercial banks themselves: banks create money whenever they lend to someone in the economy or buy an asset from consumers. And in contrast to descriptions found in some textbooks, the Bank of England does not directly control the quantity of either base or broad money. The Bank of England is nevertheless still able to influence the amount of money in the economy. It does so in normal times by setting monetary policy — through the interest rate that it pays on reserves held by commercial banks with the Bank of England. More recently, though, with Bank Rate constrained by the effective lower bound, the Bank of England’s asset purchase programme has sought to raise the quantity of broad money in circulation. This in turn affects the prices and quantities of a range of assets in the economy, including money.

Michael McLeay, Amar Radia and Ryland Thomas
Bank of England’s Monetary Analysis Directorate
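The mechanics the Bank of England economists describe — lending creating deposits — can be made concrete with a toy balance sheet. The sketch below is purely my own illustration (no reserves, capital requirements or regulation are modelled): granting a loan marks up the borrower’s deposit account, expanding both sides of the bank’s balance sheet at once.

```python
# Toy balance sheet: a bank granting a loan creates a matching deposit.
# Illustrative sketch only -- no reserves, capital or regulation modelled.

class ToyBank:
    def __init__(self):
        self.loans = 0.0     # assets: claims on borrowers
        self.deposits = 0.0  # liabilities: IOUs held by the public

    def grant_loan(self, amount):
        # The loan (asset) and the deposit (liability) are created
        # simultaneously by one bookkeeping entry.
        self.loans += amount
        self.deposits += amount

    def repay_loan(self, amount):
        # Repayment extinguishes the deposit: money is destroyed again.
        self.loans -= amount
        self.deposits -= amount

bank = ToyBank()
bank.grant_loan(1000)
print(bank.loans, bank.deposits)   # both 1000.0: the lending created the money
bank.repay_loan(1000)
print(bank.deposits)               # 0.0: the repayment destroyed it
```

The point of the sketch is simply that no prior deposit is needed before the loan is made — the causality runs from lending to deposits, not the other way round.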

Poupée de cire, poupée de son

27 August, 2016 at 15:14 | Posted in Varia | 1 Comment


France Gall. A marvellous girl-woman. Magnificent.


‘It’s like the whole history of the black people, swinging between love and despair.’

When we die / När vi dör

27 August, 2016 at 13:12 | Posted in Varia | Comments Off on When we die / När vi dör

 

The real debt problem

26 August, 2016 at 19:31 | Posted in Economics | 8 Comments

One of the most effective ways of clearing up this most serious of all semantic confusions is to point out that private debt differs from national debt in being external. It is owed by one person to others. That is what makes it burdensome. Because it is interpersonal the proper analogy is not to national debt but to international debt…. But this does not hold for national debt which is owed by the nation to citizens of the same nation. There is no external creditor. We owe it to ourselves.

A variant of the false analogy is the declaration that national debt puts an unfair burden on our children, who are thereby made to pay for our extravagances. Very few economists need to be reminded that if our children or grandchildren repay some of the national debt these payments will be made to our children or grandchildren and to nobody else. Taking them altogether they will no more be impoverished by making the repayments than they will be enriched by receiving them.

Abba Lerner, The Burden of the National Debt (1948)

Few issues in politics and economics are nowadays more discussed – and less understood – than public debt. Many raise their voices to urge a reduction of the debt, but few explain why or in what way reducing the debt would be conducive to a better economy or a fairer society. And there are no limits to all the – especially macroeconomic – calamities and evils a large public debt is supposed to result in: unemployment, inflation, higher interest rates, lower productivity growth, increased burdens for subsequent generations, etc.

People usually care a lot about public sector budget deficits and debts, and are as a rule worried and negative. Drawing analogies from their own household’s economy, they see debt as a sign of an imminent risk of default and hence a source of reprobation. But although no one can doubt the political and economic significance of public debt, there is no unanimity whatsoever among economists as to whether debt matters, and if so, why and in what way. Still less does anyone know what the “optimal” size of public debt is.

Throughout history public debts have gone up and down, often expanding in periods of war or of large changes in basic infrastructure and technologies, and then declining in periods when things have settled down.

The pros and cons of public debt have been debated for as long as the phenomenon itself has existed, but it has, notwithstanding that, not been possible to reach anything close to consensus on the issue — at least not in a long time-horizon perspective. As a rule, one has not even been able to agree on whether public debt is a problem, and if so, when it is one and how best to tackle it. Some of the more prominent reasons for this non-consensus are the complexity of the issue, the mingling of vested interests, ideology, psychological fears, and the uncertainty of calculating and estimating inter-generational effects.

In the mercantilist era public debt was as a rule considered positive (cf. Berkeley, Melon, de Pinto), a view that was later repeated in the 19th century by, e.g., the economists Adolf Wagner, Lorenz von Stein and Carl Dietzel. The state’s main aim was to control and distribute the resources of the nation, often through regulations and forceful state interventions. As a result of increased public debt, the circulation of money and credit would increase the amount of capital and contribute to the wealth of nations. Public debt was basically considered something that was moved from “the right hand to the left hand.” The economy simply needed a state that was prepared to borrow substantial amounts of money, issue financial paper, and incur indebtedness in the process.

There was also a clear political dimension to the issue, and some authors were clearly aware that government loan/debt activities could have a politically stabilizing effect. Investors had a vested interest in stable governments (low interest rates and low risk premia) and so were instinctively loyal to the government.

In classical economics — following in the footsteps of David Hume — especially Adam Smith, David Ricardo, and Jean-Baptiste Say put forward views on public debt that were more negative. The good budget was a balanced budget. If government borrowed money to finance its activities, it would only “crowd out” private enterprise and investment. The state was generally considered incapable of paying its debts, and the real burden would therefore essentially fall on the taxpayers, who ultimately had to pay for the irresponsibility of government. The moral character of the argumentation was a salient feature: “either the nation must destroy public credit, or the public credit will destroy the nation” (Hume 1752).

Later on, in the 20th century, economists like John Maynard Keynes, Abba Lerner and Alvin Hansen would again hold a more positive view of public debt. Public debt was normally nothing to fear, especially if it was financed within the country itself (though even foreign loans could be beneficial for the economy if invested in the right way). Some members of society would hold bonds and earn interest on them, while others would have to pay the taxes that ultimately paid the interest on the debt. But the debt was not considered a net burden for society as a whole, since the debt cancelled itself out between the two groups. If the state could issue bonds at a low interest rate, unemployment could be reduced without necessarily resulting in strong inflationary pressure. And the inter-generational burden was no real burden according to this group of economists, since — if the debt was used in a suitable way — future generations would, through its effects on investments and employment, actually be net winners. There could, of course, be unwanted negative distributional side effects for future generations, but that was mostly considered a minor problem, since (Lerner 1948) “if our children or grandchildren repay some of the national debt these payments will be made to our children and grandchildren and to nobody else.”

Central to the Keynesian influenced view is the fundamental difference between private and public debt. Conflating the one with the other is an example of the atomistic fallacy, which is basically a variation on Keynes’ savings paradox. If an individual tries to save and cut down on debts, that may be fine and rational, but if everyone tries to do it, the result would be lower aggregate demand and increasing unemployment for the economy as a whole.

An individual always has to pay his debts. But a government can always pay back old debts with new ones, through the issue of new bonds. The state is not like an individual. Public debt is not like private debt. Government debt is essentially a debt to itself, to its citizens. Interest paid on the debt is paid by the taxpayers on the one hand, but on the other hand, interest on the bonds that finance the debt goes to those who lend out the money.

Abba Lerner’s essay Functional Finance and the Federal Debt set out guiding principles for governments to adopt in their efforts to use economic – especially fiscal – policies to maintain full employment and prosperity in economies struggling with chronic problems of insufficient aggregate demand.

According to Lerner’s Functional Finance principles, the private sector has a tendency not to generate enough demand on its own, and because of this inherent deficiency modern states tend to have structural and long-lasting problems maintaining full employment. The government therefore has to take on the responsibility of making sure that full employment is attained. The main instrument for doing this is open market operations – especially selling and buying interest-bearing government bonds.

Although Lerner seems to have held the view that the ideas embedded in Functional Finance were in principle applicable to all kinds of economies, he also recognized the importance of institutional arrangements in shaping its feasibility and practical implementation.

Functional Finance critically depends on nation states being able to tax their citizens and on their having a currency — and bonds — of their own. As has become transparently clear during the Great Recession, the EMU has not been able to impose those structures, since, as Hayek noted already back in 1939, “government by agreement is only possible provided that we do not require the government to act in fields other than those in which we can obtain true agreement.” The monetary institutional structure of the EMU makes it highly unlikely — not to say impossible — that it will ever become a “system” in which Functional Finance is adopted.

To Functional Finance, the choices governments make in financing public deficits — and the concomitant debts — are important, since bond-based financing was considered more expansionary than financing through taxes. According to Lerner, the purpose of public debt is to achieve a rate of interest that results in investments making full employment feasible. In the short run this could result in deficits, but he firmly maintained that there was no reason to assume that applying Functional Finance to maintain full employment implied that the government always had to borrow money and increase the public debt. An application of Functional Finance would have a tendency to balance the budget in the long run, since the guarantee of permanent full employment would make private investment much more attractive, and a fortiori the greater private investment would diminish the need for deficit spending.

To both Keynes and Lerner it was evident that the state had the ability to promote full employment and a stable price level – and that it should use its powers to do so. If that meant that it had to take on debt and (more or less temporarily) underbalance its budget – so be it! Public debt is neither good nor bad. It is a means to achieving two over-arching macroeconomic goals – full employment and price stability. A balanced budget or a reduction of public debt is not sacred per se, irrespective of the effects on the macroeconomic goals. If “sound finance,” austerity and balanced budgets mean increased unemployment and destabilized prices, they have to be abandoned.

Against this reasoning, exponents of the thesis of Ricardian equivalence have maintained that whether the public sector finances its expenditures through taxes or by issuing bonds is inconsequential, since bonds must sooner or later be repaid by raising taxes in the future.

Robert Barro (1974) attempted to give the proposition a firm theoretical foundation, arguing that the substitution of a budget deficit for current taxes has no impact on aggregate demand and so budget deficits and taxation have equivalent effects on the economy.

If the public sector runs extra spending through deficits, taxpayers will, according to the hypothesis, anticipate that they will have to pay higher taxes in the future — and will therefore increase their savings and reduce their current consumption to be able to do so. The consequence is that aggregate demand would be no different from what it would be if taxes were raised today.

Ricardian equivalence basically means that financing government expenditures through taxes or debts is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would only increase savings in order to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.
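The equivalence claim boils down to a piece of present-value arithmetic that is easy to make explicit. The following is a stylized two-period sketch of my own (the numbers and variable names are hypothetical illustrations, not Barro’s actual model): under full anticipation, the extra saving exactly offsets the deficit, so consumption — and hence aggregate demand — is the same under either financing mode.

```python
# Stylized two-period Ricardian-equivalence arithmetic (illustrative only).
# The government spends G extra today, financed either by a tax now or by
# a bond repaid with interest out of taxes next period.

r = 0.03          # interest rate (hypothetical)
G = 100.0         # extra government spending today
income = 1000.0   # household income in the current period

# Tax financing: households pay G today.
c_tax_today = income - G

# Bond financing: households anticipate a future tax bill of G*(1+r).
# Its present value is G*(1+r)/(1+r) = G, so rational households save
# exactly G today to meet it.
anticipated_tax_pv = G * (1 + r) / (1 + r)
c_bond_today = income - anticipated_tax_pv

# Under full anticipation, current consumption is identical in both cases.
print(c_tax_today, c_bond_today)  # 900.0 900.0
```

The critiques that follow are aimed precisely at the assumptions this arithmetic needs: perfect capital markets, rational expectations, and households that care about arbitrarily distant tax bills.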

The Ricardo-Barro hypothesis, with its view of public debt as incurring a burden on future generations, is the dominant view among mainstream economists and politicians today. The rational people making up the actors in the model are assumed to know that today’s debts are tomorrow’s taxes. But one of the main problems with this standard neoclassical theory is that it doesn’t fit the facts.

From a more theoretical point of view, one may also strongly criticize the Ricardo-Barro model and its concomitant crowding-out assumption: perfect capital markets do not exist, repayments of public debt can take place far into the future, and it is dubious whether we really care about generations 300 years from now.

At times when economic theories have been in favour of public debt, one gets the feeling that the more or less explicit assumption is that public expenditures are useful and good for the economy, since they work as an important — and often necessary — injection into the economy, creating wealth and employment. At times when economic theories have been against public debt, the basic assumption seems to be that public expenditures are useless, only crowd out private initiatives, and have no positive net effect on the economy.

Wolfgang Streeck argues in Buying Time: The Delayed Crisis of Democratic Capitalism (2014) for an interpretation of the more or less steady increase in public debt since the 1970s as a sign of a transformation of the tax state (Schumpeter) into a debt state. In his perspective, public debt is both an indicator of and a causal factor in the relationship between political and economic systems. The ultimate cause behind the increased public debt is the long-run decline in economic growth, which has resulted in a doubling of the average public debt in OECD countries over the last 40 years. This has put strong pressure on modern capitalist states, and in parallel, income inequality has increased in most countries. This is, according to Streeck, one manifestation of a neoliberal revolution — with its emphasis on supply-side politics, austerity policies and financial deregulation — in which democratic-redistributive intervention has become ineffectual.

Today there seems to be a rather widespread consensus that public debt is acceptable as long as it doesn’t increase too much or too fast: if the public debt-to-GDP ratio rises above X %, the likelihood of a debt crisis and/or lower growth increases.

But in discussing the margins within which public debt is feasible, the focus is solely on the upper limit of indebtedness, and very few ask whether there might also be a problem if public debt becomes too low.

The government’s ability to conduct an “optimal” public debt policy may be negatively affected if public debt becomes too small. To guarantee a well-functioning secondary market in bonds it is essential that the government has access to a functioning market. If turnover and liquidity in the secondary market become too small, increased volatility and uncertainty will in the long run lead to an increase in borrowing costs. Ultimately there is even a risk that market makers would disappear, leaving bond market trading to be operated solely through brokered deals. As a kind of precautionary measure against this eventuality, it may be argued — especially in times of financial turmoil and crises — that it is necessary to increase government borrowing and debt to ensure, in the longer run, good borrowing preparedness and a sustained (government) bond market.

The failure of successive administrations in most developed countries to embark on any vigorous policy aimed at bringing down unconscionably high levels of unemployment has been due in no small measure to a ‘viewing with alarm’ of the size of the national debts, often alleged to be already excessive, or at least threatening to become so, and by ideologically urged striving toward ‘balanced’ government budgets without any consideration of whether such debts and deficits are or threaten to become excessive in terms of some determinable impact on the real general welfare. If they are examined in the light of their impact on welfare, however, they can usually be shown to be well below their optimum levels, let alone at levels that could have dire consequences.

To view government debts in terms of the ‘functional finance’ concept introduced by Abba Lerner, is to consider their role in the macroeconomic balance of the economy. In simple, bare bones terms, the function of government debts that is significant for the macroeconomic health of an economy is that they provide the assets into which individuals can put whatever accumulated savings they attempt to set aside in excess of what can be wisely invested in privately owned real assets. A debt that is smaller than this will cause the attempted excess savings, by being reflected in a reduced level of consumption outlays, to be lost in reduced real income and increased unemployment.

William Vickrey

Lucas’ Copernican revolution — nonsense on stilts

24 August, 2016 at 19:18 | Posted in Economics | 3 Comments

In Michel De Vroey’s version of the history of macroeconomics, Robert Lucas’ declaration of the need for macroeconomics to be pursued only within ‘equilibrium discipline,’ with equilibrium declared to exist as a postulate, is hailed as a ‘Copernican revolution.’ Equilibrium is not to be considered something that characterises real economies, but rather ‘a property of the way we look at things.’ De Vroey — approvingly — notes that this — as well as Lucas’ banning of disequilibrium as referring to ‘unintelligible behaviour’ — ‘amounts to shrinking the pretence of equilibrium theory.’

Mirabile dictu!

Is it really a feasible methodology for economists to make a sharp divide between theory and reality, and then — like De Vroey and Lucas — treat the divide as something recommendable and good? I think not.

Fortunately there are other economists with a less devoted hagiographic attitude towards Lucas and his nonsense on stilts.

Alessandro Vercelli is one:

The equilibria analysed by Lucas are conceived as stationary stochastic processes. The fact that they are stationary imposes a long series of restrictive hypotheses on the range of applicability of the heuristic model, and these considerably reduce the empirical usefulness of Lucas’s equilibrium method …

For such a method to make sense … the stationary ‘equilibrium’ stochastic process must also be ‘dynamically stable,’ or ‘ergodic,’ in the terminology of stochastic processes …

What is worse, if one adopts Lucas’s method of pure equilibrium implying the non-intelligibility of disequilibrium positions, there is no way to argue about the robustness of the alternative equilibria under consideration. In other words, Lucas’s heuristic model, not to mention the analytical models built according to his instructions, prove to be useless for the very purpose for which they were primarily constructed — the evaluation of alternative economic policies.
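The stationarity/ergodicity point can be illustrated numerically. The following is my own sketch, not anything from Vercelli: for a stationary, ergodic process the time average along a single path converges to the ensemble mean, while for a non-stationary process such as a random walk the time average remains path-dependent and converges to no such mean.

```python
# Time averages for a stationary vs a non-stationary process
# (illustrative sketch of the ergodicity distinction).
import random

random.seed(0)
T = 100_000

# Stationary, ergodic: i.i.d. Gaussian noise with ensemble mean 0.
# The time average along one path converges to that ensemble mean.
noise = [random.gauss(0, 1) for _ in range(T)]
time_avg_stationary = sum(noise) / T

# Non-stationary: a random walk built from the same noise. Its time
# average depends on the particular realized path and does not
# converge to any ensemble mean.
walk, x = [], 0.0
for e in noise:
    x += e
    walk.append(x)
time_avg_walk = sum(walk) / T

print(round(time_avg_stationary, 3))  # close to 0
print(round(time_avg_walk, 3))        # path-dependent, typically far from 0
```

This is precisely why treating a non-ergodic economy as if it were ergodic matters: probabilistic statements calibrated on one historical path carry no warrant for the ensemble of possible paths.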

Another is Roman Frydman, Professor of Economics at New York University and a long-time critic of the rational expectations hypothesis. In his seminal 1982 American Economic Review article Towards an Understanding of Market Processes: Individual Expectations, Learning, and Convergence to Rational Expectations Equilibrium — an absolute must-read for anyone with a serious interest in understanding the issues in the present discussion on rational expectations as a modeling assumption — he showed that the kind of models Lucas recommends — models founded on ‘equilibrium discipline’ and the rational expectations hypothesis — are inadequate as representations of economic agents’ decision making.

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As yours truly has tried to show in On the use and misuse of theories and models in economics, there is really no support for this conviction at all. For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic model-building is little more than hand-waving that gives us little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

In one of their latest books on rational expectations, Roman Frydman and his colleague Michael Goldberg write:

The belief in the scientific stature of fully predetermined models, and in the adequacy of the Rational Expectations Hypothesis to portray how rational individuals think about the future, extends well beyond asset markets. Some economists go as far as to argue that the logical consistency that obtains when this hypothesis is imposed in fully predetermined models is a precondition of the ability of economic analysis to portray rationality and truth.

For example, in a well-known article published in The New York Times Magazine in September 2009, Paul Krugman (2009, p. 36) argued that Chicago-school free-market theorists “mistook beauty . . . for truth.” One of the leading Chicago economists, John Cochrane (2009, p. 4), responded that “logical consistency and plausible foundations are indeed ‘beautiful’ but to me they are also basic preconditions for ‘truth.’” Of course, what Cochrane meant by plausible foundations were fully predetermined Rational Expectations models. But, given the fundamental flaws of fully predetermined models, focusing on their logical consistency or inconsistency, let alone that of the Rational Expectations Hypothesis itself, can hardly be considered relevant to a discussion of the basic preconditions for truth in economic analysis, whatever “truth” might mean.

There is an irony in the debate between Krugman and Cochrane. Although the New Keynesian and behavioral models, which Krugman favors, differ in terms of their specific assumptions, they are every bit as mechanical as those of the Chicago orthodoxy. Moreover, these approaches presume that the Rational Expectations Hypothesis provides the standard by which to define rationality and irrationality.

In fact, the Rational Expectations Hypothesis requires no assumptions about the intelligence of market participants whatsoever … Rather than imputing superhuman cognitive and computational abilities to individuals, the hypothesis presumes just the opposite: market participants forgo using whatever cognitive abilities they do have. The Rational Expectations Hypothesis supposes that individuals do not engage actively and creatively in revising the way they think about the future. Instead, they are presumed to adhere steadfastly to a single mechanical forecasting strategy at all times and in all circumstances. Thus, contrary to widespread belief, in the context of real-world markets, the Rational Expectations Hypothesis has no connection to how even minimally reasonable profit-seeking individuals forecast the future in real-world markets. When new relationships begin driving asset prices, they supposedly look the other way, and thus either abjure profit-seeking behavior altogether or forgo profit opportunities that are in plain sight.

Beyond Mechanical Markets
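Frydman and Goldberg’s point — that the hypothesis ties agents to “a single mechanical forecasting strategy at all times and in all circumstances” — can be illustrated with a toy simulation (my own construction, with made-up numbers, not the authors’):

```python
import random

random.seed(1)

def realized_inflation(t):
    # The 'true' process shifts its mean at t = 100 -- a regime change
    # of the kind real-world market participants must cope with.
    mean = 2.0 if t < 100 else 5.0
    return random.gauss(mean, 0.5)

# The mechanical rule: always forecast the pre-change mean,
# never revising the forecasting strategy itself.
fixed_forecast = 2.0

errors_before = [realized_inflation(t) - fixed_forecast for t in range(100)]
errors_after = [realized_inflation(t) - fixed_forecast for t in range(100, 200)]

print(sum(errors_before) / len(errors_before))  # roughly 0: the rule looks fine
print(sum(errors_after) / len(errors_after))    # roughly 3: a persistent bias
```

Once the regime changes, the agent’s forecast errors are systematically biased in plain sight — precisely the forgone profit opportunities the quote describes.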

And in a recent article the same authors write:

Contemporary economists’ reliance on mechanical rules to understand – and influence – economic outcomes extends to macroeconomic policy as well, and often draws on an authority, John Maynard Keynes, who would have rejected their approach. Keynes understood early on the fallacy of applying such mechanical rules. “We have involved ourselves in a colossal muddle,” he warned, “having blundered in the control of a delicate machine, the working of which we do not understand.”

In The General Theory of Employment, Interest, and Money, Keynes sought to provide the missing rationale for relying on expansionary fiscal policy to steer advanced capitalist economies out of the Great Depression. But, following World War II, his successors developed a much more ambitious agenda. Instead of pursuing measures to counter excessive fluctuations in economic activity, such as the deep contraction of the 1930’s, so-called stabilization policies focused on measures that aimed to maintain full employment. “New Keynesian” models underpinning these policies assumed that an economy’s “true” potential – and thus the so-called output gap that expansionary policy is supposed to fill to attain full employment – can be precisely measured.

But, to put it bluntly, the belief that an economist can fully specify in advance how aggregate outcomes – and thus the potential level of economic activity – unfold over time is bogus …

Roman Frydman & Michael Goldberg

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away à la Lucas by assuming equilibrium and rational expectations, and treating uncertainty as if it were reducible to stochastic risk. That is scientific cheating. And it has been going on for too long now.

Microfoundations — contestable incoherence

24 August, 2016 at 15:46 | Posted in Economics | 2 Comments


Defenders of microfoundations — with its rational-expectations-equipped representative agent’s intertemporal optimization — often argue as if sticking with simple representative-agent macroeconomic models doesn’t impart a bias to the analysis. I unequivocally reject that unsubstantiated view, and have given my reasons why here.

These defenders often also maintain that there are no methodologically coherent alternatives to microfoundations modeling. That allegation is of course difficult to evaluate, since it substantially hinges on how coherence is defined. But one thing I do know is that the kind of microfoundationalist macroeconomics that New Classical economists and “New Keynesian” economists are pursuing is not methodologically coherent according to the standard definition of coherence (see e. g. here). And that ought to be rather embarrassing for those macroeconomists to whom axiomatics and deductivity are the hallmark of science tout court.

The fact that Lucas introduced rational expectations as a consistency axiom is not really an argument for why we should accept it as an acceptable assumption in a theory or model purporting to explain real macroeconomic processes (see e. g. here). And although virtually any macroeconomic empirical claim is contestable, so is any claim in micro (see e. g. here).

Blue in green

24 August, 2016 at 13:39 | Posted in Varia | 3 Comments

 

Robert Lucas’ warped methodology

23 August, 2016 at 19:43 | Posted in Economics | Comments Off on Robert Lucas’ warped methodology

Economic theory, like anthropology, ‘works’ by studying societies which are in some relevant sense simpler or more primitive than our own, in the hope either that relations that are important but hidden in our society will be laid bare in simpler ones, or that concrete evidence can be discovered for possibilities which are open to us which are without precedent in our own history. Unlike anthropologists, however, economists simply invent the primitive societies we study, a practice which frees us from limiting ourselves to societies which can be physically visited as sparing us the discomforts of long stays among savages. This method of society-invention is the source of the utopian character of economics; and of the mix of distrust and envy with which we are viewed by our fellow social scientists. The point of studying wholly fictional, rather than actual societies, is that it is relatively inexpensive to subject them to external forces of various types and observe the way they react. If, subjected to forces similar to those acting on actual societies, the artificial society reacts in a similar way, we gain confidence that there are useable connections between the invented society and the one we really care about.

Robert Lucas

Although neither yours truly nor (I guess) anthropologists will recognise anything in this description even remotely reminiscent of practices actually used in real sciences, this quote still gives a very good picture of Lucas’ warped methodology.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

The implications that follow from the kind of models that people like Robert Lucas — according to Ed Prescott, ‘the master of methodology’ — construct are always conditional on the simplifying assumptions used — assumptions predominantly of a rather far-reaching and non-empirical character with little resemblance to features of the real world. From a descriptive point of view there is a fortiori usually very little resemblance between the models used and the empirical world. ‘As if’ explanations building on such foundations are not really explanations at all, since they always build conditionally on hypothesized law-like theorems and situation-specific restrictive assumptions. The empirical-descriptive inaccuracy of the models makes it more or less miraculous if they could — in any substantive way — be considered explanatory at all. If the assumptions made are known to be descriptively totally unrealistic (think of e.g. ‘rational expectations’), they are of course likewise totally worthless for making empirical inductions. Assuming — as Lucas does — that people behave ‘as if’ they were rational FORTRAN-programmed computers doesn’t take us far when we know that the ‘if’ is false.

The obvious shortcoming of a basically epistemic — rather than ontological — approach such as ‘successive approximations’ and ‘as if’ modeling assumptions is that ‘similarity’, ‘analogy’ or ‘resemblance’ tout court do not guarantee that the correspondence between model and target is interesting, relevant, revealing or somehow adequate in terms of mechanisms, causal powers, capacities or tendencies. No matter how many convoluted refinements of concepts are made in the model, if the successive ‘as if’ approximations do not result in models similar to reality in the appropriate respects (such as structure, isomorphism, etc.), they are nothing more than ‘substitute systems’ that do not bridge to the world but rather miss their target.

Economics built on the kind of modeling strategy that Lucas represents does not produce science.

It’s nothing but pseudo-scientific cheating.

The thrust of this realist rhetoric is the same both at the scientific and at the meta-scientific levels. It is that explanatory virtues need not be evidential virtues. It is that you should feel cheated by “The world is as if T were true”, in the same way as you should feel cheated by “The stars move as if they were fixed on a rotating sphere”. Realists do feel cheated in both cases.

Alan Musgrave

Contrary to what some überimpressed macroeconomists seem to argue, I would say that the recent economic crisis — and the fact that Chicago economics has had next to nothing to contribute to understanding it — shows that Lucas and his New Classical economics is, in Lakatosian terms, a degenerative research program in dire need of replacement.

Mainstream economic theory has long been in the story-telling business, whereby economic theorists create make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. Since fully-fledged experiments on a societal scale are as a rule prohibitively expensive, ethically indefensible or unmanageable, economic theorists have to substitute something else for experiments. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models” rather than engineering things happening in real economies.

In business cycle theory these models are constructed with the purpose of showing that changes in the supply of money “have the capacity to induce depressions or booms” [Lucas 1988:3] not just in these models, but also in real economies. To do so economists are supposed to imagine subjecting their models to some kind of “operational experiment” and “a variety of reactions”. “In general, I believe that one who claims to understand the principles of flight can reasonably be expected to be able to make a flying machine, and that understanding business cycles means the ability to make them too, in roughly the same sense” [Lucas 1981:8]. To Lucas models are the laboratories of economic theories, and after having made a simulacrum-depression Lucas hopes we find it “convincing on its own terms – that what I said would happen in the [model] as a result of my manipulation would in fact happen” [Lucas 1988:4]. The clarity with which the effects are seen is considered “the key advantage of operating in simplified, fictional worlds” [Lucas 1988:5].

On the flipside lies the fact that “we are not really interested in understanding and preventing depressions in hypothetical [models]. We are interested in our own, vastly more complicated society” [Lucas 1988:5]. But how do we bridge the gulf between model and “target system”? According to Lucas we have to be willing to “argue by analogy from what we know about one situation to what we would like to know about another, quite different situation” [Lucas 1988:5]. Progress lies in the pursuit of the ambition to “tell better and better stories” [Lucas 1988:5], simply because that is what economists do.

We are storytellers, operating much of the time in worlds of make believe. We do not find that the realm of imagination and ideas is an alternative to, or retreat from, practical reality. On the contrary, it is the only way we have found to think seriously about reality. In a way, there is nothing more to this method than maintaining the conviction … that imagination and ideas matter … there is no practical alternative [Lucas 1988:6].

Lucas has applied this mode of theorizing by constructing “make-believe economic systems” to the age-old question of what causes and constitutes business cycles. According to Lucas the standard for what that means is that one “exhibits understanding of business cycles by constructing a model in the most literal sense: a fully articulated artificial economy, which behaves through time so as to imitate closely the time series behavior of actual economies” [Lucas 1981:219].

To Lucas business cycles are an inherently systemic phenomenon, basically characterized by conditional co-variations of different time series. The vision is “the possibility of a unified explanation of business cycles, grounded in the general laws governing market economies, rather than in political or institutional characteristics specific to particular countries or periods” [Lucas 1981:218]. To be able to sustain this view and adopt his “equilibrium approach” he has to define the object of study in a very constrained way. Lucas asserts, e.g., that if one wants to get numerical answers “one needs an explicit, equilibrium account of the business cycles” [Lucas 1981:222]. But his arguments for why it necessarily has to be an equilibrium account are not very convincing. The main restriction is that Lucas only deals with purportedly invariable regularities “common to all decentralized market economies” [Lucas 1981:218]. Adopting this definition he can treat business cycles as all alike “with respect to the qualitative behavior of the co-movements among series” [Lucas 1981:218].

Postulating invariance paves the way for treating various economic entities as stationary stochastic processes (a standard assumption in most modern probabilistic econometric approaches) and for the possible application of “economic equilibrium theory.” The result is that Lucas’s business cycle is a rather watered-down version of what is usually connoted when speaking of business cycles.

Based on the postulates of “self-interest” and “market clearing,” Lucas has repeatedly stated that a pure equilibrium method is a necessary intelligibility condition and that disequilibria are somehow “arbitrary” and “unintelligible” [Lucas 1981:225]. Although these might (arguably) be requirements put on models, they are irrelevant and totally without justification vis-à-vis the real-world target system. Why should involuntary unemployment, for example, be considered an unintelligible disequilibrium concept? Given the lack of success of these models when empirically applied, what is unintelligible is rather to persist in this reinterpretation of the ups and downs in business cycles and labour markets as equilibria. To Keynes involuntary unemployment is not equatable to actors on the labour market becoming irrational non-optimizers. It is basically a reduction in the range of working-options open to workers, regardless of any volitional optimality choices made on their part. Involuntary unemployment is excess supply of labour. That the unemployed in Lucas’s business cycle models can only be conceived of as having chosen leisure over work is not a substantive argument about real-world unemployment. Sometimes workers are not employed. That is a real phenomenon and not a “theoretical construct … the task of modern theoretical economics to ‘explain’” [Lucas 1981:243].

All economic theories have to somehow deal with the daunting question of uncertainty and risk. It is “absolutely crucial for understanding business cycles” [Lucas 1981:223]. To be able to practice economics at all, “we need some way … of understanding which decision problem agents are solving” [Lucas 1981:223]. Lucas – in search of a “technical model-building principle” [Lucas 1981:1] – adopts the rational expectations view, according to which agents’ subjective probabilities are identified “with observed frequencies of the events to be forecast” and thus coincide with “true” probabilities. This hypothesis [Lucas 1981:224]

will most likely be useful in situations in which the probabilities of interest concern a fairly well defined recurrent event, situations of ‘risk’ [where] behavior may be explainable in terms of economic theory … In cases of uncertainty, economic reasoning will be of no value … Insofar as business cycles can be viewed as repeated instances of essentially similar events, it will be reasonable to treat agents as reacting to cyclical changes as ‘risk’, or to assume their expectations are rational, that they have fairly stable arrangements for collecting and processing information, and that they utilize this information in forecasting the future in a stable way, free of systemic and easily correctable biases.

To me this seems much like putting the cart before the horse. Instead of adapting the model to the object – which from both ontological and epistemological considerations seems the natural thing to do – Lucas proceeds in the opposite way and chooses to define his object and construct a model solely to suit his own methodological and theoretical preferences. All those – interesting and important – features of business cycles that have anything to do with model-theoretical openness, and which a fortiori are not possible to squeeze into the closure of the model, are excluded. One might rightly ask what is left of what we in a common-sense meaning refer to as business cycles. Einstein’s dictum – “everything should be made as simple as possible but not simpler” – comes to mind. Lucas – and neoclassical economics at large – does not heed the implied apt warning.

The development of macro-econometrics has, according to Lucas, supplied economists with “detailed, quantitatively accurate replicas of the actual economy,” thereby enabling us to treat policy recommendations “as though they had been experimentally tested” [Lucas 1981:220]. But if the goal of theory is to be able to make accurate forecasts, this “ability of a model to imitate actual behavior” does not give much leverage. What is required is “invariance of the structure of the model under policy variations”. Parametric invariance in an economic model cannot be taken for granted, “but it seems reasonable to hope that neither tastes nor technology vary systematically” [Lucas 1981:220].

The model should enable us to pose counterfactual questions about what would happen if some variable were to change in a specific way. Hence the assumption of structural invariance, which purportedly enables the theoretical economist to do just that. But does it? Lucas appeals to “reasonable hope” — a rather weak justification for a modeler applying such a far-reaching assumption. To warrant it one would expect an argument that this assumption – whether we conceive of it as part of a strategy of “isolation,” “idealization” or “successive approximation” – really establishes a useful relation that we can export or bridge to the target system, the “actual economy.” That argument is found neither in Lucas nor – to my knowledge – in the succeeding neoclassical refinements of his “necessarily artificial, abstract, patently ‘unreal’” analogue economies [Lucas 1981:271]. At most we get what Lucas himself calls “inappropriately maligned” casual empiricism in the form of “the method of keeping one’s eyes open.” That is far from sufficient to warrant any credibility in a model pretending to explain the complex and difficult recurrent phenomena we call business cycles. To provide an empirical “illustration” or a “story” to back up your model does not suffice. There are simply too many competing illustrations and stories that could be exhibited or told.

As Lucas has to admit – complaining about the less than ideal contact between theoretical economics and econometrics – even though the “stories” are (purportedly) getting better and better, “the necessary interaction between theory and fact tends not to take place” [Lucas 1981:11].

The basic assumption of this “precise and rigorous” model therefore cannot be considered anything other than an unsubstantiated conjecture as long as it is not supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence has been presented. This is the more tantalizing since Lucas himself stresses that the presumption “seems a sound one to me, but it must be defended on empirical, not logical grounds” [Lucas 1981:12].

And applying a “Lucas critique” to Lucas’s own model, it is obvious that it too fails. Changing “policy rules” cannot just be presumed not to influence investment and consumption behavior and a fortiori technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy. Lucas’s hope of being able to model the economy as “a FORTRAN program” and “gain some confidence that the component parts of the program are in some sense reliable prior to running it” [Lucas 1981:288] therefore seems – from an ontological point of view – totally misdirected. The failure of the attempt to anchor the analysis in the allegedly stable deep parameters “tastes” and “technology” shows that if you neglect ontological considerations pertaining to the target system, reality ultimately kicks back when at last questions of bridging and exportation of model exercises are laid on the table. No matter how precise and rigorous the analysis is, and no matter how hard one tries to cast the argument in “modern mathematical form” [Lucas 1981:7], such model exercises do not push science forward a single millimeter if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about external validity.
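The “Lucas critique” reasoning applied above — that estimated relations are not invariant under policy-regime changes — can itself be illustrated with a small simulation (my own sketch, using a permanent-income consumer and made-up numbers):

```python
import random

random.seed(7)

# Permanent-income consumers: consumption tracks expected permanent
# income, so the observed response of consumption to income changes
# depends on the *regime* generating those changes.

def simulate(regime, n=5_000):
    """Return (income change, consumption change) pairs under a regime."""
    perm = 100.0
    y_prev = c_prev = perm
    data = []
    for _ in range(n):
        shock = random.gauss(0.0, 1.0)
        if regime == "transitory":
            y = perm + shock          # shocks die out: permanent income fixed
        else:
            perm += shock             # shocks persist: permanent income moves
            y = perm
        c = perm                      # consume permanent income
        data.append((y - y_prev, c - c_prev))
        y_prev, c_prev = y, c
    return data

def fitted_mpc(data):
    # OLS slope (no intercept) of consumption changes on income changes
    num = sum(dy * dc for dy, dc in data)
    den = sum(dy * dy for dy, dc in data)
    return num / den

mpc_a = fitted_mpc(simulate("transitory"))
mpc_b = fitted_mpc(simulate("permanent"))
print(mpc_a)  # 0: consumption ignores transitory income changes
print(mpc_b)  # 1: the 'estimated parameter' is not policy-invariant
```

A modeler who estimated the marginal propensity to consume under the first regime and used it to predict behavior under the second would be badly wrong — the reduced-form parameter is not a structurally stable “deep” parameter.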

 

References

Lucas, Robert (1981), Studies in Business-Cycle Theory. Oxford: Basil Blackwell.

– (1986), Adaptive Behavior and Economic Theory. In Hogarth, Robin & Reder, Melvin (eds) Rational Choice (pp. 217-242). Chicago: The University of Chicago Press.

– (1988), What Economists Do.

Syll, Lars (2016), On the use and misuse of theories and models in economics.

What was Robert Barro smoking when he came up with ‘Ricardian equivalence’?

23 August, 2016 at 15:12 | Posted in Economics | 1 Comment

 

Ricardian equivalence basically means that financing government expenditures through taxes or debt is equivalent, since debt financing must eventually be repaid with interest, and agents — equipped with rational expectations — would simply increase their savings in order to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.


Why?

In the standard neoclassical consumption model — used in DSGE macroeconomic modeling — people are basically portrayed as treating time as a dichotomous phenomenon — today and the future — when contemplating making decisions and acting. How much should one consume today and how much in the future? Facing an intertemporal budget constraint of the form

ct + cf/(1+r) = ft + yt + yf/(1+r),

where ct is consumption today, cf is consumption in the future, ft is holdings of financial assets today, yt is labour incomes today, yf is labour incomes in the future, and r is the real interest rate, and having a lifetime utility function of the form

U = u(ct) + au(cf),

where a is the time discounting parameter, the representative agent (consumer) maximizes his utility when

u'(ct) = a(1+r)u'(cf).

This expression – the Euler equation – implies that the representative agent (consumer) is indifferent between consuming one more unit today and consuming it tomorrow. Typically using a logarithmic functional form – u(c) = log c – which gives u'(c) = 1/c, the Euler equation can be rewritten as

1/ct = a(1+r)(1/cf),

or

cf/ct = a(1+r).

This importantly implies that, according to the neoclassical consumption model, changes in the (real) interest rate and consumption move in the same direction. It also follows that consumption is invariant to the timing of taxes, since wealth — ft + yt + yf/(1+r) — has to be interpreted as the present discounted value net of taxes. And so, according to the assumption of Ricardian equivalence, the timing of taxes does not affect consumption, simply because the maximization problem as specified in the model is unchanged.
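To see the tax-timing invariance in numbers, here is a minimal sketch of the maximization problem above (my own illustration, with made-up figures): with log utility the optimum depends only on lifetime wealth net of taxes, so a tax levied today and the same tax postponed — grossed up by the interest rate — produce identical consumption plans.

```python
def optimal_consumption(ft, yt, yf, tax_t, tax_f, r, a):
    # Lifetime wealth net of taxes, discounted to today:
    # W = ft + (yt - tax_t) + (yf - tax_f)/(1+r)
    W = ft + (yt - tax_t) + (yf - tax_f) / (1 + r)
    # Maximizing u(ct) + a*u(cf) with u(c) = log c subject to
    # ct + cf/(1+r) = W yields the closed-form solution below,
    # which satisfies the Euler equation cf/ct = a(1+r).
    ct = W / (1 + a)
    cf = a * (1 + r) * W / (1 + a)
    return ct, cf

r, a = 0.05, 0.95

# Finance government spending of 100 with a tax today ...
plan_tax_now = optimal_consumption(ft=50, yt=100, yf=110, tax_t=100, tax_f=0, r=r, a=a)

# ... or with debt repaid (with interest) out of a future tax:
plan_tax_later = optimal_consumption(ft=50, yt=100, yf=110, tax_t=0, tax_f=100 * (1 + r), r=r, a=a)

print(plan_tax_now)
print(plan_tax_later)  # the same consumption plan, up to floating point
```

This is of course exactly the point of the critique: the invariance is baked into the model’s specification of wealth, not discovered about the world.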

That the theory doesn’t fit the facts we already knew.

And a couple of months ago, on Voxeu, Jonathan A. Parker summarized a series of studies empirically testing the theory, reconfirming how out of line with reality Ricardian equivalence is.

This only, again, underlines that there is, of course, no reason for us to believe in that fairy-tale. Ricardo himself — mirabile dictu — didn’t believe in Ricardian equivalence. In Essay on the Funding System (1820) he wrote:

But the people who paid the taxes never so estimate them, and therefore do not manage their private affairs accordingly. We are too apt to think that the war is burdensome only in proportion to what we are at the moment called to pay for it in taxes, without reflecting on the probable duration of such taxes. It would be difficult to convince a man possessed of £20,000, or any other sum, that a perpetual payment of £50 per annum was equally burdensome with a single tax of £1000.

And as one Nobel laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, twitter

Bank stress testing — not particularly helpful

23 August, 2016 at 12:31 | Posted in Economics | Comments Off on Bank stress testing — not particularly helpful

 

Keynes-Hicks macrotheory — a ‘New Keynesian’ unicorn fantasy

22 August, 2016 at 17:08 | Posted in Economics | 20 Comments

Paul Krugman has in numerous posts on his blog tried to defend “the whole enterprise of Keynes/Hicks macroeconomic theory” and especially his own somewhat idiosyncratic version of IS-LM.

The main problem is simpliciter that there is no such thing as a Keynes-Hicks macroeconomic theory!

So, let us get some things straight.

There is nothing in the post-General Theory writings of Keynes that suggests he considered Hicks’s IS-LM anywhere near a faithful rendering of his thought. In Keynes’s canonical statement of the essence of his theory, in the 1937 QJE article, there is nothing to even suggest that Keynes would have regarded a Keynes-Hicks IS-LM theory as anything but pure nonsense. So of course there can’t be any “vindication for the whole enterprise of Keynes/Hicks macroeconomic theory” – simply because “Keynes/Hicks” never existed.

And it gets even worse!

John Hicks, the man who invented IS-LM in his 1937 Econometrica review of Keynes’ General Theory – ‘Mr. Keynes and the ‘Classics’. A Suggested Interpretation’ – returned to it in an article in 1980 – ‘IS-LM: an explanation’ – in Journal of Post Keynesian Economics. Self-critically he wrote:

I accordingly conclude that the only way in which IS-LM analysis usefully survives — as anything more than a classroom gadget, to be superseded, later on, by something better – is in application to a particular kind of causal analysis, where the use of equilibrium methods, even a drastic use of equilibrium methods, is not inappropriate. I have deliberately interpreted the equilibrium concept, to be used in such analysis, in a very stringent manner (some would say a pedantic manner) not because I want to tell the applied economist, who uses such methods, that he is in fact committing himself to anything which must appear to him to be so ridiculous, but because I want to ask him to try to assure himself that the divergences between reality and the theoretical model, which he is using to explain it, are no more than divergences which he is entitled to overlook. I am quite prepared to believe that there are cases where he is entitled to overlook them. But the issue is one which needs to be faced in each case.

When one turns to questions of policy, looking toward the future instead of the past, the use of equilibrium methods is still more suspect. For one cannot prescribe policy without considering at least the possibility that policy may be changed. There can be no change of policy if everything is to go on as expected-if the economy is to remain in what (however approximately) may be regarded as its existing equilibrium. It may be hoped that, after the change in policy, the economy will somehow, at some time in the future, settle into what may be regarded, in the same sense, as a new equilibrium; but there must necessarily be a stage before that equilibrium is reached …

I have paid no attention, in this article, to another weakness of IS-LM analysis, of which I am fully aware; for it is a weakness which it shares with General Theory itself. It is well known that in later developments of Keynesian theory, the long-term rate of interest (which does figure, excessively, in Keynes’ own presentation and is presumably represented by the r of the diagram) has been taken down a peg from the position it appeared to occupy in Keynes. We now know that it is not enough to think of the rate of interest as the single link between the financial and industrial sectors of the economy; for that really implies that a borrower can borrow as much as he likes at the rate of interest charged, no attention being paid to the security offered. As soon as one attends to questions of security, and to the financial intermediation that arises out of them, it becomes apparent that the dichotomy between the two curves of the IS-LM diagram must not be pressed too hard.

The editor of JPKE, Paul Davidson, gives the background to Hicks’s article:

I originally published an article about Keynes’s finance motive — which in 1937 Keynes added to his other liquidity preference motives (transactions, precautionary, speculative motives). I showed that adding this finance motive required Hicks’s IS and LM curves to be interdependent — and thus when the IS curve shifted so would the LM curve.
Hicks and I then discussed this when we met several times.
When I first started to think about the ergodic vs. nonergodic dichotomy, I sent Hicks some preliminary drafts of articles I would be writing about nonergodic processes. Then John and I met several times to discuss this matter further and I finally convinced him to write the article — which I published in the Journal of Post Keynesian Economics — in which he renounces the IS-LM apparatus. Hicks then wrote me a letter in which he said he thought the word nonergodic was wonderful and that he wanted to label his approach to macroeconomics as nonergodic!

So – back in 1937 John Hicks said that he was building a model of John Maynard Keynes’ General Theory. In 1980 he openly admitted he wasn’t.

What Hicks acknowledges in 1980 is basically that his original review totally ignored the very core of Keynes’ theory – uncertainty. In doing so he actually turned the train of macroeconomics onto the wrong tracks for decades. It is about time that neoclassical economists – Krugman, Mankiw, or what have you – set the record straight and stop promoting something that its creator himself admitted was a total failure. Why not study the real thing itself – the General Theory – in full, without looking the other way when it comes to non-ergodicity and uncertainty?

Paul Krugman persists in talking about a Keynes-Hicks-IS-LM-model that really never existed. It’s deeply disappointing. You would expect more from a Nobel prize winner.

In his 1937 paper Hicks actually elaborates four different models (where Hicks uses I to denote Total Income and Ix to denote Investment):

1) “Classical”: M = kI   Ix = C(i)   Ix = S(i, I)

2) Keynes’ “special” theory: M = L(i)   Ix = C(i)   Ix = S(I)

3) Keynes’ “general” theory: M = L(I, i)   Ix = C(i)   Ix = S(I)

4) The “generalized general” theory: M = L(I, i)   Ix = C(I, i)   Ix = S(I, i)

It is obvious from the way Krugman draws his IS-LM curves that he is thinking in terms of model number 4 – a model that not even Hicks considered a Keynes model (those are models 2 and 3)! It’s basically a loanable funds model that belongs in the “classical” camp and that you find reproduced in most mainstream textbooks. Hicksian IS-LM? Maybe. Keynes? No way!
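For readers who want to see concretely how such a system pins down income and the interest rate, here is a minimal numerical sketch of model 4. The linear functional forms and all parameter values are my own illustrative assumptions, not anything Hicks specified.

```python
# Hicks's model 4 (the "generalized general" theory) with simple linear
# functional forms -- every parameter value is an illustrative assumption.
#   IS block:  investment demand C(I, i) = saving S(I, i)
#   LM block:  money supply M = liquidity demand L(I, i)

def solve_is_lm(M, c0=200.0, c1=1500.0, c2=0.1,  # C(I, i) = c0 - c1*i + c2*I
                s=0.3,                            # S(I, i) = s*I
                k=0.25, h=2000.0):                # L(I, i) = k*I - h*i
    # IS: s*I = c0 - c1*i + c2*I ;  LM: i = (k*I - M) / h.
    # Substituting LM into IS gives one linear equation in income I:
    #   I * (s - c2 + c1*k/h) = c0 + c1*M/h
    I = (c0 + c1 * M / h) / (s - c2 + c1 * k / h)
    i = (k * I - M) / h
    return I, i

I, i = solve_is_lm(M=150.0)
print(f"equilibrium income I = {I:.1f}, interest rate i = {i:.2%}")
```

A larger money supply M shifts the LM curve and, in this linear sketch, raises equilibrium income and lowers the interest rate – exactly the comparative-statics exercise the diagram is usually put to.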

Take Five

22 August, 2016 at 10:25 | Posted in Varia | 2 Comments

 

Steve Keen, Noah Smith and heterodox ‘anti-math’ economics

21 August, 2016 at 15:56 | Posted in Economics | 5 Comments

Responding to the critique of his Bloomberg View post on heterodox economics and its alleged anti-math position, Noah Smith approvingly cites Steve Keen telling us there is

a wing of heterodox economics that is anti-mathematical. Known as “Critical Realism” and centred on the work of Tony Lawson at Cambridge UK, it attributes the failings of economics to the use of mathematics itself…

Although yours truly appreciates much of Steve Keen’s debunking of mainstream economics, on this issue he is, however, just plain wrong! For a more truthful characterization of Tony Lawson’s position, here is what Axel Leijonhufvud has to say:

For a good many years, Tony Lawson has been urging economists to pay attention to their ontological presuppositions. Economists have not paid much attention, perhaps because few of us know what “ontology” means. This branch of philosophy stresses the need to “grasp the nature of the reality” that is the object of study – and to adapt one’s methods of inquiry to it.
Economics, it might be argued, has gotten this backwards. We have imposed our pre-conceived methods on economic reality in such manner as to distort our understanding of it. We start from optimal choice and fashion an image of reality to fit it. We transmit this distorted picture of what the world is like to our students by insisting that they learn to perceive the subject matter through the lenses of our method.

The central message of Lawson’s critique of modern economics is that an economy is an “open system” but economists insist on dealing with it as if it were “closed.” Controlled experiments in the natural sciences create closure and in so doing make possible the unambiguous association of “cause” and “effects”. Macroeconomists, in particular, never have the privilege of dealing with systems that are closed in this controlled experiment sense.

Our mathematical representations of both individual and system behaviour require the assumption of closure for the models to have determinate solutions. Lawson, consequently, is critical of mathematical economics and, more generally, of the role of deductivism in our field. Even those of us untutored in ontology may reflect that it is not necessarily a reasonable ambition to try to deduce the properties of very large complex systems from a small set of axioms. Our axioms are, after all, a good deal shakier than Euclid’s.

The impetus to “closure” in modern macroeconomics stems from the commitment to optimising behaviour as the “microfoundations” of the enterprise. Models of “optimal choice” render agents as automatons lacking “free will” and thus deprived of choice in any genuine sense. Macrosystems composed of such automatons exclude the possibility of solutions that could be “disequilibria” in any meaningful sense. Whatever happens, they are always in equilibrium.

Axel Leijonhufvud

Modern economics has become increasingly irrelevant to the understanding of the real world. In his seminal book Economics and Reality (1997) Tony Lawson traced this irrelevance to the failure of economists to match their deductive-axiomatic methods with their subject.

It is — sad to say — as relevant today as it was nearly twenty years ago.

It is still a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models is beyond my imagination. As long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism!

Studying mathematics and logic is interesting and fun. It sharpens the mind. In pure mathematics and logic we do not have to worry about external validity. But economics is not pure mathematics or logic. It’s about society. The real world. If economics forgets that, it is really in dire straits.

To many mainstream economists, Tony Lawson is synonymous with anti-mathematics. But I think reading what Tony Lawson or yours truly have written on the subject shows how unfounded and ridiculous the mainstream idea is that, because heterodox people often criticize the application of mathematics in mainstream economics, we are critical of math per se.

Indeed.

No, there is nothing wrong with mathematics per se.

No, there is nothing wrong with applying mathematics to economics.

Mathematics is one valuable tool among other valuable tools for understanding and explaining things in economics.

What is totally wrong, however, are the utterly simplistic beliefs that

• “math is the only valid tool”

• “math is always and everywhere self-evidently applicable”

• “math is all that really counts”

• “if it’s not in math, it’s not really economics”

• “almost everything can be adequately understood and analyzed with math”

What is wrong with these beliefs is that they do not — as forcefully argued by Tony Lawson — rest on an ontological reflection on what can rightfully be expected from using mathematical methods in different contexts. Or as Knut Wicksell put it a century ago:

One must, of course, beware of expecting from this method more than it can give. Out of the crucible of calculation comes not an atom more truth than was put in. The assumptions being hypothetical, the results obviously cannot claim more than a very limited validity. The mathematical expression ought to facilitate the argument, clarify the results, and so guard against possible faults of reasoning — that is all.

It is, by the way, evident that the economic aspects must be the determining ones everywhere: economic truth must never be sacrificed to the desire for mathematical elegance.
 

De Vroey’s Chicago style History of Macroeconomics

21 August, 2016 at 14:32 | Posted in Economics | Comments Off on De Vroey’s Chicago style History of Macroeconomics

A couple of years ago Michel De Vroey felt the urge to write a defense of Robert Lucas’ denial of involuntary unemployment:

What explains the difficulty of constructing a theory of involuntary unemployment? Is it, as argued by Lucas, that the “thing” to be explained doesn’t exist, or is it due to some deeply embedded premise of economic theory? My own view tilts towards the latter. Economic theory is concerned with fictitious parables. The premises upon which it is based have the advantage of allowing tractable, rigorous theorising, but the price of this is that important facts of life are excluded from the theoretical universe. Non-chosen outcomes is one of them. The underlying reason lies in the trade technology and information assumptions upon which both the Walrasian and the Marshallian (and the neo-Walrasian and neo-Marshallian) approaches are based. This is a central conclusion of my inquiry: the stumbling block to the introduction of involuntary unemployment lies in the assumptions about trade technology that are usually adopted in economic theory.

Foregoing the involuntary unemployment claim may look like a high price to pay, particularly if it is admitted that good reasons exist for believing in its real world relevance. But would its abandonment really be so dramatic? …

First of all, the elimination of this concept would only affect the theoretical sphere. Drawing conclusions from this sphere about the real world would be a mistake. No jumps should be made from the world of theory to the real world, or vice-versa … The fact that solid arguments can be put forward as to its real world existence is not a sufficient condition to give involuntary unemployment theoretical legitimacy.

Michel De Vroey


I have to admit to being totally unimpressed by this rather defeatist methodological stance. Is it really a feasible methodology for economists to make a sharp divide between theory and reality, and then treat the divide as something recommendable and good? I think not.

Models and theories — if they are to be of any real interest — have to look to the world. Being able to construct “fictitious parables” or build models of a “credible world” is not enough. No matter how many convoluted refinements of concepts are made in the theory or model, if they do not result in “things” similar to reality in the appropriate respects, such as structure or isomorphism, the surrogate system becomes a substitute system — and why should we care about that? Science has to have higher aspirations.

Mainstream economic theory today is in the story-telling business whereby economic theorists create mathematical make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the theory or model and lose sight of reality. Insisting — like De Vroey — that “no jumps should be made from the world of theory to the real world, or vice-versa” is an untenable methodological position.

In his new book — A History of Macroeconomics from Keynes to Lucas and Beyond (CUP 2016) — De Vroey basically tells the same misleading and wrong-headed story as in the article mentioned above. He explicitly acknowledges that his assessment of The General Theory ‘follows from this analysis.’ Citing Mankiw and Patinkin, the author criticises Keynes for not having built his analysis on an ‘explicit and complete model’ and therefore was unable to translate his views ‘into a rigorous demonstration.’

Where did Keynes’ unemployment analysis go wrong? To De Vroey the answer seems to be that ‘the notion of involuntary unemployment was not questioned’ and that ‘the fact that unemployment was massive was taken as an indication that it could not be voluntary.’

Does it sound familiar? Well, it should! This is the standard Chicago New Classical Economics view, which has never accepted Keynes’s distinction between voluntary and involuntary unemployment. According to New Classical übereconomist Robert Lucas, an unemployed worker can always instantaneously find some job; no matter how miserable the work options are, “one can always choose to accept them”:

KLAMER: My taxi driver here is driving a taxi, even though he is an accountant, because he can’t find a job …

LUCAS: I would describe him as a taxi driver [laughing], if what he is doing is driving a taxi.

KLAMER: But a frustrated taxi driver.

LUCAS: Well, we draw these things out of urns, and sometimes we get good draws, sometimes we get bad draws.

Arjo Klamer

In New Classical Economics unemployment is seen as a kind of leisure that workers optimally select. In the basic DSGE models used by these economists, the labour market always clears – responding to a changing interest rate, expected lifetime income, or real wages, the representative agent maximizes her utility function by varying her labour supply, money holdings and consumption over time. Most importantly – if the real wage somehow deviates from its “equilibrium value,” the representative agent adjusts her labour supply, so that when the real wage is higher than its “equilibrium value,” labour supply is increased, and when the real wage is below its “equilibrium value,” labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.
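The labour-supply mechanism just described can be sketched in a few lines. This is a deliberately stripped-down intratemporal choice problem; the GHH-style utility function and parameter values are my own illustrative assumptions, not those of any particular DSGE model.

```python
# Representative agent chooses hours n to maximize
#   u = log(c - chi * n**(1 + phi) / (1 + phi))   with consumption c = w * n,
# where w is the real wage (GHH-style preferences, illustrative only).
# The first-order condition is  w = chi * n**phi,  so labour supply is
#   n(w) = (w / chi)**(1 / phi).

def labour_supply(w, chi=1.0, phi=2.0):
    """Hours supplied at real wage w; phi is the inverse Frisch elasticity."""
    return (w / chi) ** (1.0 / phi)

for w in (0.8, 1.0, 1.2):
    print(f"real wage {w:.1f} -> hours supplied {labour_supply(w):.3f}")
```

Since n(w) is increasing, a real wage below its “equilibrium value” makes the agent supply fewer hours: the model records lower employment as chosen leisure, never as involuntary unemployment.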

It is extremely important to pose the question why mainstream economists choose to work with these kinds of models. It is not a harmless choice based solely on ‘internal’ scientific considerations. It is in fact also, and not to a trivial extent, a conscious choice motivated by ideology.

By employing these models one is actually to a significant degree absolving the structure of market economies from any responsibility in creating unemployment. Focussing on the choices of individuals, the unemployment ‘problem’ is reduced to being an individual ‘problem’, and not something that essentially has to do with the workings of market economies. A conscious methodological choice in this way comes to work as an apologetic device for not addressing or challenging given structures.

Not being able to explain unemployment, these models can’t help us to change the structures and institutions that produce the arguably greatest problem of our society.

 

Added GMT 1800: Although De Vroey has a whole chapter on Hicks and the IS-LM model, he does not mention that Hicks returned to his 1937 IS-LM article in an article in 1980 – ‘IS-LM: an explanation’ – in Journal of Post Keynesian Economics — and self-critically wrote:

I accordingly conclude that the only way in which IS-LM analysis usefully survives — as anything more than a classroom gadget, to be superseded, later on, by something better – is in application to a particular kind of causal analysis, where the use of equilibrium methods, even a drastic use of equilibrium methods, is not inappropriate. I have deliberately interpreted the equilibrium concept, to be used in such analysis, in a very stringent manner (some would say a pedantic manner) not because I want to tell the applied economist, who uses such methods, that he is in fact committing himself to anything which must appear to him to be so ridiculous, but because I want to ask him to try to assure himself that the divergences between reality and the theoretical model, which he is using to explain it, are no more than divergences which he is entitled to overlook. I am quite prepared to believe that there are cases where he is entitled to overlook them. But the issue is one which needs to be faced in each case.

When one turns to questions of policy, looking toward the future instead of the past, the use of equilibrium methods is still more suspect. For one cannot prescribe policy without considering at least the possibility that policy may be changed. There can be no change of policy if everything is to go on as expected-if the economy is to remain in what (however approximately) may be regarded as its existing equilibrium. It may be hoped that, after the change in policy, the economy will somehow, at some time in the future, settle into what may be regarded, in the same sense, as a new equilibrium; but there must necessarily be a stage before that equilibrium is reached …

I have paid no attention, in this article, to another weakness of IS-LM analysis, of which I am fully aware; for it is a weakness which it shares with General Theory itself. It is well known that in later developments of Keynesian theory, the long-term rate of interest (which does figure, excessively, in Keynes’ own presentation and is presumably represented by the r of the diagram) has been taken down a peg from the position it appeared to occupy in Keynes. We now know that it is not enough to think of the rate of interest as the single link between the financial and industrial sectors of the economy; for that really implies that a borrower can borrow as much as he likes at the rate of interest charged, no attention being paid to the security offered. As soon as one attends to questions of security, and to the financial intermediation that arises out of them, it becomes apparent that the dichotomy between the two curves of the IS-LM diagram must not be pressed too hard.

So – back in 1937 John Hicks said that he was building a model of John Maynard Keynes’ General Theory. In 1980 he openly admitted he wasn’t. Not mentioning that, not even in a footnote, in a book on the history of macroeconomics would, I guess, be considered by many an example of Chicago-inspired intellectual dishonesty.

 

Where did the Greek bailout money go?

20 August, 2016 at 20:19 | Posted in Economics | Comments Off on Where did the Greek bailout money go?

This paper provides a descriptive analysis of where the Greek bailout money went since 2010 and finds that, contrary to widely held beliefs, less than €10 billion or a fraction of less than 5% of the overall programme went to the Greek fiscal budget. In contrast, the vast majority of the money went to existing creditors in the form of debt repayments and interest payments. The resulting risk transfer from the private to the public sector and the subsequent risk transfer within the public sector from international organizations such as the ECB and the IMF to European rescue mechanisms such as the ESM still constitute the most important challenge for the goal to achieve a sustainable fiscal situation in Greece.

Jörg Rocholl & Axel Stahmer

Bayesian rationality — nothing but a probabilistic version of irrationalism

19 August, 2016 at 09:26 | Posted in Economics, Theory of Science & Methodology | 9 Comments

The initial choice of a prior probability distribution is not regulated in any way. The probabilities, called subjective or personal probabilities, reflect personal degrees of belief. From a Bayesian philosopher’s point of view, any prior distribution is as good as any other. Of course, from a Bayesian decision maker’s point of view, his own beliefs, as expressed in his prior distribution, may be better than any other beliefs, but Bayesianism provides no means of justifying this position. Bayesian rationality rests in the recipe alone, and the choice of the prior probability distribution is arbitrary as far as the issue of rationality is concerned. Thus, two rational persons with the same goals may adopt prior distributions that are wildly different …

Bayesian learning is completely inflexible after the initial choice of probabilities: all beliefs that result from new observations have been fixed in advance. This holds because the new probabilities are just equal to certain old conditional probabilities …

According to the Bayesian recipe, the initial choice of a prior probability distribution is arbitrary. But the probability calculus might still rule out some sequences of beliefs and thus prevent complete arbitrariness.

Actually, however, this is not the case: nothing is ruled out by the probability calculus …

Thus, anything goes … By adopting a suitable prior probability distribution, we can fix the consequences of any observations for our beliefs in any way we want. This result, which will be referred to as the anything-goes theorem, holds for arbitrarily complicated cases and any number of observations. It implies, among other consequences, that two rational persons with the same goals and experiences can, in all eternity, differ arbitrarily in their beliefs about future events …

From a Bayesian point of view, any beliefs and, consequently, any decisions are as rational or irrational as any other, no matter what our goals and experiences are. Bayesian rationality is just a probabilistic version of irrationalism. Bayesians might say that somebody is rational only if he actually rationalizes his actions in the Bayesian way. However, given that such a rationalization always exists, it seems a bit pedantic to insist that a decision maker should actually provide it.

Max Albert
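Albert’s point, that the prior is arbitrary and that all later beliefs are fixed in advance by it, can be made concrete with a toy Beta-Binomial calculation; the priors and the data below are my own illustrative choices.

```python
# Conjugate Beta-Binomial updating: after observing h heads and t tails,
# a Beta(a, b) prior becomes a Beta(a + h, b + t) posterior, whose mean is
# (a + h) / (a + b + h + t).  Two "rational" Bayesians see the SAME data
# but start from different -- equally admissible -- priors.

def posterior_mean(a, b, heads, tails):
    return (a + heads) / (a + b + heads + tails)

heads, tails = 10, 10                              # an even split of 20 tosses
open_minded = posterior_mean(1, 1, heads, tails)   # uniform prior
dogmatic = posterior_mean(998, 2, heads, tails)    # near-certain of heads

print(f"open-minded posterior mean: {open_minded:.3f}")  # 0.500
print(f"dogmatic posterior mean:    {dogmatic:.3f}")     # 0.988
```

Both agents have updated by the book, yet after identical evidence they still disagree wildly, and nothing in the Bayesian recipe says which prior was the rational one to start from.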

The Keynes-Ramsey-Savage debate on probability

18 August, 2016 at 16:23 | Posted in Economics | 3 Comments

Neoclassical economics nowadays usually assumes that agents who have to make choices under conditions of uncertainty behave according to Bayesian rules, axiomatized by Ramsey (1931) and Savage (1954) – that is, they maximize expected utility with respect to some subjective probability measure that is continually updated according to Bayes’ theorem. If not, they are supposed to be irrational, and ultimately – via some “Dutch book” or “money pump” argument – susceptible to being ruined by some clever “bookie”.

Bayesianism reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but – even granted this questionable reductionism – do rational agents really have to be Bayesian? As I have argued elsewhere (e.g. here, here and here), there is no strong warrant for believing so.

In many of the situations that are relevant to economics, one could argue that there is simply not enough adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not really possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.

Say you have come to learn (based on your own experience and tons of data) that the probability of you becoming unemployed in Sweden is 10%. Having moved to another country (where you have no experience and no data), you have no information on unemployment and a fortiori nothing on which to ground any probability estimate. A Bayesian would, however, argue that you would have to assign probabilities to the mutually exclusive alternative outcomes, and that these have to add up to 1 if you are rational. That is, in this case – and based on symmetry – a rational individual would have to assign a probability of 50% to becoming unemployed and 50% to becoming employed.

That feels intuitively wrong, though, and I guess most people would agree. Bayesianism cannot distinguish between symmetry-based probabilities derived from information and symmetry-based probabilities derived from an absence of information. In these kinds of situations most of us would rather say that it is simply irrational to be a Bayesian, and better instead to admit that we “simply do not know” or that we feel ambiguous and undecided. Arbitrary and ungrounded probability claims are more irrational than being undecided in the face of genuine uncertainty, so if there is not sufficient information to ground a probability distribution, it is better to acknowledge that simpliciter, rather than pretending to possess a certitude that we simply do not possess.
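The point about symmetry can be made concrete with a small sketch (the Beta parameters are my own illustrative choices): an agent whose 50% is grounded in thousands of observations and an agent whose 50% reflects sheer ignorance report the same point probability, and only the spread of the belief distribution, which a single probability number discards, records the difference.

```python
# Mean and variance of a Beta(a, b) belief distribution over a proportion.
# Both agents below report the same point probability (the mean, 0.5),
# but the variances reveal how much evidence stands behind that number.

def beta_mean_var(a, b):
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

informed = beta_mean_var(1000, 1000)  # e.g. belief after ~2000 observations
ignorant = beta_mean_var(1, 1)        # uniform, "know-nothing" belief

print(f"informed: mean {informed[0]:.2f}, variance {informed[1]:.6f}")
print(f"ignorant: mean {ignorant[0]:.2f}, variance {ignorant[1]:.6f}")
```

A Bayesian decision rule consumes only the point probability, so it treats these two epistemic situations as identical, which is exactly what Keynes’s notion of the “weight” of an argument objects to.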

I think this critique of Bayesianism is in accordance with the views John Maynard Keynes presented in A Treatise on Probability (1921) and in his 1937 article ‘The General Theory of Employment’. According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but rational expectations. Sometimes we “simply do not know.” Keynes would not have accepted the view of Bayesian economists, according to whom expectations “tend to be distributed, for the same information set, about the prediction of the theory.” Keynes rather thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by “degrees of belief”, beliefs that have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modeled by Bayesian economists.

Stressing the importance of Keynes’ view on uncertainty, John Kay writes in the Financial Times:

Keynes believed that the financial and business environment was characterised by “radical uncertainty”. The only reasonable response to the question “what will interest rates be in 20 years’ time?” is “we simply do not know” …

For Keynes, probability was about believability, not frequency. He denied that our thinking could be described by a probability distribution over all possible future events, a statistical distribution that could be teased out by shrewd questioning – or discovered by presenting a menu of trading opportunities. In the 1920s he became engaged in an intellectual battle on this issue, in which the leading protagonists on one side were Keynes and the Chicago economist Frank Knight, opposed by a Cambridge philosopher, Frank Ramsey, and later by Jimmie Savage, another Chicagoan.

Keynes and Knight lost that debate, and Ramsey and Savage won, and the probabilistic approach has maintained academic primacy ever since. A principal reason was Ramsey’s demonstration that anyone who did not follow his precepts – anyone who did not act on the basis of a subjective assessment of probabilities of future events – would be “Dutch booked” … A Dutch book is a set of choices such that a seemingly attractive selection from it is certain to lose money for the person who makes the selection.

I used to tell students who queried the premise of “rational” behaviour in financial markets – where rational means are based on Bayesian subjective probabilities – that people had to behave in this way because if they did not, people would devise schemes that made money at their expense. I now believe that observation is correct but does not have the implication I sought. People do not behave in line with this theory, with the result that others in financial markets do devise schemes that make money at their expense.

Although this on the whole gives a succinct and correct picture of Keynes’s view on probability, I think it’s necessary to somewhat qualify in what way and to what extent Keynes “lost” the debate with the Bayesians Frank Ramsey and Jim Savage.

In economics it’s an indubitable fact that few mainstream neoclassical economists work within the Keynesian paradigm. All more or less subscribe to some variant of Bayesianism. And some even say that Keynes acknowledged he was wrong when presented with Ramsey’s theory. This is a view that has unfortunately also been promulgated by Robert Skidelsky in his otherwise masterly biography of Keynes. But I think it’s fundamentally wrong. Let me elaborate on this point (the argumentation is more fully presented in my book John Maynard Keynes (SNS, 2007)).

It’s a debated issue in newer research on Keynes whether he, as some researchers maintain, fundamentally changed his view on probability after the critique levelled against his A Treatise on Probability by Frank Ramsey. It has been exceedingly difficult to present evidence for this being the case.

Ramsey’s critique was mainly that the kind of probability relations that Keynes was speaking of in the Treatise actually didn’t exist, and that Ramsey’s own procedure (betting) made it much easier to find out the “degrees of belief” people actually held. I question this both from a descriptive and a normative point of view.

What Keynes is saying in his response to Ramsey is only that Ramsey “is right” in that people’s “degrees of belief” basically emanate from human nature rather than from formal logic.

Patrick Maher, former professor of philosophy at the University of Illinois, even suggests that Ramsey’s critique of Keynes’s probability theory in some regards is invalid:

Keynes’s book was sharply criticized by Ramsey. In a passage that continues to be quoted approvingly, Ramsey wrote:

“But let us now return to a more fundamental criticism of Mr. Keynes’ views, which is the obvious one that there really do not seem to be any such things as the probability relations he describes. He supposes that, at any rate in certain cases, they can be perceived; but speaking for myself I feel confident that this is not true. I do not perceive them, and if I am to be persuaded that they exist it must be by argument; moreover, I shrewdly suspect that others do not perceive them either, because they are able to come to so very little agreement as to which of them relates any two given propositions.” (Ramsey 1926, 161)

I agree with Keynes that inductive probabilities exist and we sometimes know their values. The passage I have just quoted from Ramsey suggests the following argument against the existence of inductive probabilities. (Here P is a premise and C is the conclusion.)

P: People are able to come to very little agreement about inductive probabilities.
C: Inductive probabilities do not exist.

P is vague (what counts as “very little agreement”?) but its truth is still questionable. Ramsey himself acknowledged that “about some particular cases there is agreement” (28) … In any case, whether complicated or not, there is more agreement about inductive probabilities than P suggests.

Ramsey continued:

“If … we take the simplest possible pairs of propositions such as “This is red” and “That is blue” or “This is red” and “That is red,” whose logical relations should surely be easiest to see, no one, I think, pretends to be sure what is the probability relation which connects them.” (162)

I agree that nobody would pretend to be sure of a numeric value for these probabilities, but there are inequalities that most people on reflection would agree with. For example, the probability of “This is red” given “That is red” is greater than the probability of “This is red” given “That is blue.” This illustrates the point that inductive probabilities often lack numeric values. It doesn’t show disagreement; it rather shows agreement, since nobody pretends to know numeric values here and practically everyone will agree on the inequalities.

Ramsey continued:

“Or, perhaps, they may claim to see the relation but they will not be able to say anything about it with certainty, to state if it is more or less than 1/3, or so on. They may, of course, say that it is incomparable with any numerical relation, but a relation about which so little can be truly said will be of little scientific use and it will be hard to convince a sceptic of its existence.” (162)

Although the probabilities that Ramsey is discussing lack numeric values, they are not “incomparable with any numerical relation.” Since there are more than three different colors, the a priori probability of “This is red” must be less than 1/3 and so its probability given “That is blue” must likewise be less than 1/3. In any case, the “scientific use” of something is not relevant to whether it exists. And the question is not whether it is “hard to convince a sceptic of its existence” but whether the sceptic has any good argument to support his position …

Ramsey concluded the paragraph I have been quoting as follows:

“Besides this view is really rather paradoxical; for any believer in induction must admit that between “This is red” as conclusion and “This is round” together with a billion propositions of the form “a is round and red” as evidence, there is a finite probability relation; and it is hard to suppose that as we accumulate instances there is suddenly a point, say after 233 instances, at which the probability relation becomes finite and so comparable with some numerical relations.” (162)

Ramsey is here attacking the view that the probability of “This is red” given “This is round” cannot be compared with any number, but Keynes didn’t say that and it isn’t my view either. The probability of “This is red” given only “This is round” is the same as the a priori probability of “This is red” and hence less than 1/3. Given the additional billion propositions that Ramsey mentions, the probability of “This is red” is high (greater than 1/2, for example) but it still lacks a precise numeric value. Thus the probability is always both comparable with some numbers and lacking a precise numeric value; there is no paradox here.
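The idea of probabilities that are comparable with some numbers yet lack precise numeric values can be sketched with interval-valued probabilities. This is my own toy illustration, not anything found in Keynes or Ramsey, and the particular bounds (`0.5`, `0.3`, and so on) are hypothetical choices made only to exhibit the inequalities discussed above:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IntervalProb:
    """A probability known only up to lower and upper bounds."""
    lo: float
    hi: float

    def __post_init__(self):
        assert 0.0 <= self.lo <= self.hi <= 1.0

    def surely_less_than(self, x: float) -> bool:
        # True only when the whole interval lies below x.
        return self.hi < x

    def surely_greater_than(self, other: "IntervalProb") -> bool:
        # True only when the two intervals do not overlap.
        return self.lo > other.hi

# Hypothetical bounds, chosen purely for illustration:
p_red_given_red = IntervalProb(0.5, 1.0)   # "This is red" given "That is red"
p_red_given_blue = IntervalProb(0.0, 0.3)  # "This is red" given "That is blue"

# The inequality practically everyone agrees on holds, even though
# neither probability has a precise numeric value:
print(p_red_given_red.surely_greater_than(p_red_given_blue))  # True
print(p_red_given_blue.surely_less_than(1 / 3))               # True
```

The point of the sketch is only that agreement on comparisons does not require agreement on point values, which is exactly the structure of the replies to Ramsey above.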

I have been evaluating Ramsey’s apparent argument from P to C. So far I have been arguing that P is false and responding to Ramsey’s objections to unmeasurable probabilities. Now I want to note that the argument is also invalid. Even if P were true, it could be that inductive probabilities exist in the (few) cases that people generally agree about. It could also be that the disagreement is due to some people misapplying the concept of inductive probability in cases where inductive probabilities do exist. Hence it is possible for P to be true and C false …

I conclude that Ramsey gave no good reason to doubt that inductive probabilities exist.

Ramsey’s critique led Keynes to put more emphasis on individuals’ own views as the basis for probability assessments, and less on the claim that those beliefs are rational. But Keynes’s theory doesn’t stand or fall with his view that the basis of our “degrees of belief” is logical. The core of his theory – when and how we are able to measure and compare different probabilities – he never changed. Unlike Ramsey, he was not at all sure that probabilities are always one-dimensional, measurable, quantifiable or even comparable entities.

Keynes’s analysis of the practical relevance of probability and weight to decision-making provides the basis for a theory of decision under uncertainty that, in its critique of mathematical expectations in the TP, constitutes the grounds on which Benthamite calculation is deemed to be ill-suited to deal with uncertainty in the GT. In his last letter to Townshend, this aspect clearly emerges … As already noted, Keynes reminds Townshend that he is inclined to associate “risk premium with probability strictly speaking, and liquidity premium with what in my Treatise on Probability I called ‘weight’”. Also, the correspondence shows that significant technical aspects of the TP survived Ramsey’s critique and Keynes did not endorse the subjective probability viewpoint suggested by Ramsey … Had he yielded to Ramsey’s view on the possibility of deriving point probabilities from action in every instance, Keynes would not have referred to non-numerical probabilities as so strong an objection to the received analysis of decision-making under uncertainty … Keynes’s analogy in his last letter to Townshend, associating the liquidity premium with “an increased sense of comfort and confidence”, cannot be accommodated within Ramsey’s subjectivist perspective, in which there is no room for a measure representing the degree of reliance on a probability assessment. So he may have been perplexed, in the assessment of his early beliefs, about the significance of defending the epistemological underpinnings of his theory of probability … But the correspondence shows that Keynes never stopped thinking of possible uses of his theory of probability.

Carlo Zappia

Uskali Mäki and Tony Lawson — different varieties of realism

18 August, 2016 at 11:05 | Posted in Theory of Science & Methodology | Comments Off on Uskali Mäki and Tony Lawson — different varieties of realism

We are all realists and we all—Mäki, Cartwright, and I—self-consciously present ourselves as such. The most obvious research-guiding commonality, perhaps, is that we do all look at the ontological presuppositions of economics or economists.

Where we part company, I believe, is that I want to go much further. I guess I would see their work as primarily analytical and my own as more critically constructive or dialectical. My goal is less the clarification of what economists are doing and presupposing than seeking to change the orientation of modern economics … Specifically, I have been much more prepared than the other two to criticise the ontological presuppositions of economists—at least publicly. I think Mäki is probably the most guarded. I think too he is the least critical, at least of the state of modern economics …

One feature of Mäki’s work that I am not overly convinced by, but which he seems to value, is his method of theoretical isolation (Mäki 1992). If he is advocating it as a method for social scientific research, I doubt it will be found to have much relevance—for reasons I discuss in Economics and Reality (Lawson 1997). But if he is just saying that the most charitable way of interpreting mainstream economists is that they are acting on this method, then fine. Sometimes, though, he seems to imply more …

I cannot get enthused by Mäki’s concern to see what can be justified in contemporary formalistic modelling endeavours. The insights, where they exist, seem so obvious, circumscribed, and tagged on anyway …

As I view things, anyway, a real difference between Mäki and me is that he is far less, or less openly, critical of the state and practices of modern economics … Mäki seems more inclined to accept mainstream economic contributions as largely successful, or anyway uncritically. I certainly do not think we can accept mainstream contributions as successful, and so I proceed somewhat differently …

So if there is a difference here it is that Mäki more often starts out from mainstream academic economic analyses accepted rather uncritically, whilst I prefer to start from those everyday practices widely regarded as successful.

Tony Lawson

On the irrelevance of Milton Friedman

17 August, 2016 at 17:16 | Posted in Economics | 5 Comments

In producing theories couched in terms of isolated atoms that are quite at odds with social reality, modellers are actually compelled to make substantive claims that are wildly unrealistic. And because social reality does not conform to systems of isolated atoms, there is no guarantee that event regularities of the sort pursued will occur. Indeed, they are found not to …

Friedman enters this scene arguing that all we need to do is predict successfully, that this can be done even without realistic theories, and that unrealistic theories are to be preferred to realistic ones, essentially because they can usually be more parsimonious.

The first thing to note about this response is that Friedman is attempting to turn inevitable failure into a virtue. In the context of economic modelling, the need to produce formulations in terms of systems of isolated atoms, where these are not characteristic of social reality, means that unrealistic formulations are more or less unavoidable. Arguing that they are to be preferred to realistic ones in this context belies the fact that there is not a choice.

What amazed me about the initial responses to Friedman by numerous philosophers and others is that they mostly took the form: prediction is not enough, we need explanation too. Rarely, if ever, was it pointed out that because the social world is open, we cannot have successful prediction anyway.

So my own response to Friedman’s intervention is that it was mostly an irrelevancy, but one that has been opportunistically grasped by some as a supposed defence of the profusion of unrealistic assumptions in economics. This would work if successful prediction were possible. But usually it is not.

Tony Lawson

If scientific progress in economics – as Robert Lucas and other latter-day followers of Milton Friedman seem to think – lies in our ability to tell ‘better and better stories,’ one would of course expect economics journals to be filled with articles supporting the stories with empirical evidence confirming the predictions. However, I would argue that the journals still show a striking and embarrassing paucity of empirical studies that (try to) substantiate these predictive claims. Equally amazing is how little mainstream economists have to say about the relationship between their models and real-world target systems. It is as though explicit discussion, argumentation and justification on the subject are not considered necessary.

If the ultimate criterion of success for a model is the extent to which it predicts and coheres with (parts of) reality, modern mainstream economics seems to be a hopeless misallocation of scientific resources. To focus scientific endeavours on proving things in models is a gross misapprehension of what an economic theory ought to be about. Deductivist models and methods disconnected from reality are not relevant for predicting, explaining or understanding real-world economies.

A scientific theory is, in fact, the embodiment of its assumptions. There can be no theory without assumptions since it is the assumptions embodied in a theory that provide, by way of reason and logic, the implications by which the subject matter of a scientific discipline can be understood and explained. These same assumptions provide, again, by way of reason and logic, the predictions that can be compared with empirical evidence to test the validity of a theory. It is a theory’s assumptions that are the premises in the logical arguments that give a theory’s explanations meaning, and to the extent those assumptions are false, the explanations the theory provides are meaningless no matter how logically powerful or mathematically sophisticated those explanations based on false assumptions may seem to be.

George Blackford

Robert Lucas’ umbrella

16 August, 2016 at 13:03 | Posted in Economics | 2 Comments

To understand New Classical thinking about this crucial issue, consider Lucas’s response to the following question: If people know the true distribution of future outcomes, why are autocorrelated mistakes such a common occurrence?

“If you were studying the demand for umbrellas as an economist, you’d get rainfall data by cities, and you wouldn’t hesitate for two seconds to assume that everyone living in London knows how much it rains there. That would be assumption number one. And no one would argue with you either. [But] in macroeconomics, people argue about things like that.” (in Klamer 1983, p. 43)

What Lucas clearly has in mind is a model in which the distribution of outcomes (like the distribution of rainfall in London) is pregiven and independent of agent decisions (about whether or not to carry umbrellas) and agent errors. Future equilibrium states exist prior to and independent of the agent choice process that is supposed to generate them.

James Crotty

Conclusion: umbrellas are not economies. And I guess most people — at least outside the University of Chicago Department of Economics — know that …

Reasons to dislike DSGE models

15 August, 2016 at 19:16 | Posted in Economics | 1 Comment

There are many reasons to dislike current DSGE models.


First: They are based on unappealing assumptions. Not just simplifying assumptions, as any model must, but assumptions profoundly at odds with what we know about consumers and firms …

Second: Their standard method of estimation, which is a mix of calibration and Bayesian estimation, is unconvincing …

Third: While the models can formally be used for normative purposes, normative implications are not convincing …

Fourth: DSGE models are bad communication devices …

Olivier Blanchard

And still Blanchard and other mainstream economists seem to be impressed by the ‘rigour’ brought to macroeconomics by New-Classical-New-Keynesian DSGE models, with their rational expectations and microfoundations — Blanchard even hopes that although current DSGE models are ‘flawed,’ in the future they can ‘fulfil an important need in macroeconomics, that of offering a core structure around which to build and organise discussions.’

It is difficult to see how.

Take the rational expectations assumption for example. Rational expectations in the mainstream economists’ world implies that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realistic depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.
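The single-realization point can be made concrete with the standard multiplicative-growth example familiar from the ergodicity literature — my own illustration, not anything from the post, with all numbers (`1.5`, `0.6`, the horizon) chosen purely for demonstration. The average across imagined parallel economy-worlds grows, while the one path actually unfolding over time decays:

```python
import math
import random

# A toy multiplicative economy: each period wealth is multiplied by 1.5
# or 0.6 with equal probability.
up, down, p = 1.5, 0.6, 0.5

# Ensemble view: averaging over imagined parallel worlds, the expected
# one-period growth factor exceeds 1 -- apparent growth.
ensemble_growth = p * up + (1 - p) * down  # 1.05

# Time view: a single realization compounds at the geometric mean,
# which is below 1 -- actual decay.
time_growth = math.exp(p * math.log(up) + (1 - p) * math.log(down))  # ~0.949

# One simulated path -- the only "economy" anyone ever observes --
# shrinks towards nothing despite the favourable ensemble average.
random.seed(0)
wealth = 1.0
for _ in range(10_000):
    wealth *= up if random.random() < p else down

print(ensemble_growth, time_growth, wealth)
```

The process is non-ergodic: the time average and the ensemble average of the growth rate differ, so treating the economy as one draw from a known stationary distribution misdescribes it in exactly the way the paragraph above objects to.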

‘Rigorous’ and ‘precise’ DSGE models cannot be considered anything else than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence whatsoever has been presented.


No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, such analyses do not push economic science forward one single millimetre if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about real-world economies.

Proving things ‘rigorously’ in DSGE models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Blanchard thinks there is a gain from the DSGE style of modeling in its capacity to offer ‘a core structure around which to build and organise discussions.’ To me that sounds more like a religious theoretical-methodological dogma, where one paradigm rules in divine hegemony. That’s not progress. That’s the death of economics as a science.

Keynesian economics in a nutshell

15 August, 2016 at 16:27 | Posted in Economics | Comments Off on Keynesian economics in a nutshell


On the importance of pluralism

15 August, 2016 at 16:05 | Posted in Economics | Comments Off on On the importance of pluralism

 

A good illustration of what makes mainstream economics go astray — lack of pluralism (diversity) and like-minded people who are all wrong …

Truth and validity

15 August, 2016 at 14:01 | Posted in Theory of Science & Methodology | 1 Comment

 

Mainstream economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity and truth nothing. Why anyone should be interested in those kinds of theories and models — as long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live — is beyond comprehension. Stupid models are of little or no help in understanding the real world.

Noah Smith — ill-informed and misleading

14 August, 2016 at 11:51 | Posted in Economics | 3 Comments

Yours truly is far from being alone in criticising Noah Smith’s article on heterodox economics and mathematics (on which I commented yesterday). Tom Palley writes:

(1) Pretty much everything Smith charges heterodox economics with can be said about orthodox economics. That’s OK, but in that case we should open the classroom and op-ed pages to a variety of points of view and abandon the neoclassical monopoly.

(2) Smith’s views on mathematical models come close to fetishism. Models have use value but they do not define economics (think of a paper with just math and no words vs. a paper with just words), and models are easily pushed into the realm of “negative” marginal returns.

Furthermore, Smith appears ignorant of the fact that mathematical modelling is very widespread in heterodox economics.

(3) Smith’s comments about predicting the crisis are facile. It’s not about predicting “dates”, but about having a sense of imminent developments and a sense of the deep-seated nature of the problems (i.e. demand shortage, income inequality, financial fragility, and tendency to stagnation). If orthodoxy had anticipated a fraction of what heterodoxy has, it would be trumpeting its achievements.

(4) In sum, this is an ill-informed article that aims to defend the economics status quo with unwarranted claims about the weaknesses of heterodox economics and strengths of orthodox economics.

Noah Smith — confusing mathematical masturbation with intercourse between research and reality

13 August, 2016 at 14:15 | Posted in Economics | 11 Comments

There’s no question that mainstream academic macroeconomics failed pretty spectacularly in 2008 …

Many among the heterodox would have us believe that their paradigm worked perfectly well in 2008 and after … This is dramatically overselling the product. First, heterodox models didn’t “predict” the crisis in the sense of an actual quantitative forecast.

This is because much of heterodox theory is non-quantitative. Basically, people write down English words explaining their conceptual ideas about how the economy works. This describes the ideas of mid-20th-century economist Hyman Minsky, who wrote books and essays about the instability of the financial system. Minsky, though trained in math, chose not to use equations to model the economy — instead, he sketched broad ideas in plain English …

At the end of the day, policymakers and investors need to make quantitative decisions — how much to raise or lower interest rates, how big of a deficit to run, or how much wealth to allocate to Treasury bonds.

Noah Smith

Noah Smith — like so many other mainstream economists — obviously has the unfounded and ridiculous idea that because heterodox people like yours truly, Hyman Minsky, Steve Keen, or Tony Lawson often criticize the application of mathematics in mainstream economics, we are critical of math per se.

I don’t know how many times I’ve been asked to answer this straw-man objection to heterodox economics — but here we go again.

No, there is nothing wrong with mathematics per se.

No, there is nothing wrong with applying mathematics to economics.

Mathematics is one valuable tool among other valuable tools for understanding and explaining things in economics.

What is, however, totally wrong, are the utterly simplistic beliefs that

• “math is the only valid tool”

• “math is always and everywhere self-evidently applicable”

• “math is all that really counts”

• “if it’s not in math, it’s not really economics”

• “almost everything can be adequately understood and analyzed with math”

So — please — let’s have no more of this feeble-minded pseudo debate where heterodox economics is described as simply anti-math!

No real problem worth solving can be solved without some basic research. Therefore the engagement of faculty and students on real problems yields basic research problems whose solutions are of practical significance. Furthermore, the validity of these solutions can be tested in the most effective way known: in application. This avoids one’s confusing mathematical masturbation with intercourse between research and reality.

Russell L. Ackoff

 

Serious academics are full of shit

11 August, 2016 at 18:53 | Posted in Education & School | 2 Comments

These fun-hating, highfalutin’ smarties have fought to maintain an exclusive and exclusionary scholastic environment since the first Ivory Towers were built …

I may be a blogger/reporter/writer/occasional internet loudmouth today, but I also identify as a scientist, and to some degree as a serious academic. What that doesn’t make me is smarter, or more important, or more deserving of respect, than you. What that doesn’t make me is free from the responsibility to participate in society, or to explain how I am using your taxpayer dollars. And it sure as hell doesn’t absolve me of the obligation to treat my fellow human beings as equals.

liz-lemon-eye-rollAnd if you’re wondering what any of this has to do with academics rolling their eyes at other academics who take selfies with gators, I’ll tell you: scientists who engage with the public are put down, forced on the defensive, and labeled a “waste of time” by their Smarter, More Serious, and Definitely More Anonymous academic peers, pretty much all the time.

Here’s the thing about scientists: we’re all just a bunch of nerds. Like many of my peers, I was inspired to walk the path of science thanks to an excessive dose of science fiction as a child …

Of course, I got older and discovered that real science is not about fighting alien monsters or building interdimensional laser beams. It is about The Slog …

But scientists aren’t drawn to their profession because they love mind-numbing repetition and enormous spreadsheets full of meaningless numbers. They are drawn to science because they are inspired by big ideas, or because they are obsessed with a burning question, or because they can’t stop dreaming of an awesome technology. In other words, because they are nerds who love science. So why are scientists giving other scientists shit for live-tweeting conferences, writing op-eds, doing public outreach, and finding creative ways to share their passion with the world and make that day-to-day slog more bearable?

The answer is complex. It involves enduring cultural norms, weird academic incentive structures, and institutionalized fears about the democratization of scholarship.

Maddie Stone

On tour again (personal)

7 August, 2016 at 16:50 | Posted in Varia | Comments Off on On tour again (personal)

Guest appearance in Hamburg again. ‘The Venice of the North’ is today as hip and hot as Berlin. Regular blogging will be resumed next weekend.

On the persistence of science-fiction economics

6 August, 2016 at 11:49 | Posted in Economics | 3 Comments

Obscurantism is sustained by the self-interest of non-obscurantist scholars. To be effective, an attack on obscurantism has to be well documented and well argued. Mere diatribes are pointless and sometimes counterproductive. Yet scholars have a greater personal interest in achieving positive results than in exposing the flaws of others, not only because of the reward system of science, but also because achieving positive results is intrinsically more satisfying. On grounds of self-interest, therefore, many scholars will hesitate to take time off from their main work and hope that someone else will do the cleaning … The highly regarded economist Ariel Rubinstein has offered rare insider criticism of mainstream economics, commenting, for example, on ‘as-if rationality’ that “it ultimately became clear that the phrase ‘as if’ is a way to avoid taking responsibility for the strong assumptions upon which economic models are founded” …

When Joseph Stiglitz was asked at a private dinner party how economists can make repeated falsified claims without having their careers terminated, he reportedly answered: “I agree with you, but I don’t understand why you are so puzzled. What you should be assuming is that — as is done by most economists — economics is really a religion. So why should you be puzzled by the fact that they cling to and never give up their views despite frequent falsification?”

Confronted with the critique that they do not solve real problems, mainstream economists often react as Saint-Exupéry‘s Great Geographer, who, in response to the questions posed by The Little Prince, says that he is too occupied with his scientific work to be able to say anything about reality. Confronting economic theory’s lack of relevance and inability to tackle real problems, one retreats into the wonderful world of economic models. One goes into the shack of tools and stays there playing with the ‘toy-box’. While the economic problems in the world around us steadily increase, one is rather happily playing along with the latest toys in the mathematical toolbox.

Modern mainstream economics sure is very rigorous — but if it’s rigorously wrong, who cares?

Instead of making formal logical argumentation based on deductive-axiomatic models the message, we are better served by economists who more than anything else try to contribute to solving real problems.
