What is Post Keynesian Economics?

30 May, 2017 at 12:41 | Posted in Economics | 6 Comments

John Maynard Keynes’s 1936 book The General Theory of Employment, Interest, and Money attempted to overthrow classical theory and revolutionize how economists think about the economy. Economists who build upon Keynes’s General Theory to analyze the economic problems of the twenty-first-century global economy are called Post Keynesians. Keynes’s “principle of effective demand” (1936, chap. 2) declared that the axioms underlying classical theory were not applicable to a money-using, entrepreneurial economic system. Consequently, the mainstream theory’s “teaching is misleading and disastrous if we attempt to apply it to the facts of experience” (Keynes 1936, p. 3). To develop an economic theory applicable to a monetary economy, Keynes suggested rejecting three basic axioms of classical economics (1936, p. 16).

Unfortunately, the axioms that Keynes suggested for rejection are still part of the foundation of twenty-first-century mainstream economic theory. Post Keynesians have thrown out the three axioms that Keynes suggested rejecting in The General Theory. The rejected axioms are the ergodic axiom, the gross-substitution axiom, and the neutral-money axiom … Only if these axioms are rejected can a model be developed that has the following characteristics:

•Money matters in the long and short run, that is, changes in the money supply can affect decisions that determine the level of employment and real economic output.

•As the economic system moves from an irrevocable past to an uncertain future, decision makers recognize that they make important, costly decisions in uncertain conditions where reliable, rational calculations regarding the future are impossible.

•People and organizations enter into monetary contracts. These money contracts are a human institution developed to efficiently organize time-consuming production and exchange processes. The money-wage contract is the most ubiquitous of these contracts.

•Unemployment, rather than full employment, is a common laissez-faire situation in a market-oriented, monetary production economy.

•The ergodic axiom postulates that all future events are actuarially certain, that is, that the future can be accurately forecasted from an analysis of existing market data. Consequently, this axiom implies that income earned at any employment level is entirely spent either on produced goods for today’s consumption or on buying investment goods that will be used to produce goods for the (known) future consumption of today’s savers. In other words, orthodox theory assumes that all income is always immediately spent on producibles, so there is never a lack of effective demand for things that industry can produce at full employment … Post Keynesian theory rejects the ergodic axiom.

In Post Keynesian theory … people recognize that the future is uncertain (nonergodic) and cannot be reliably predicted.

Paul Davidson

The financial crisis of 2007-08 took most laypeople and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously neither foresaw the collapse nor even made it conceivable?

There are many who have ventured to answer that question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics to irrational and corrupt politicians.

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known “data-generating process” that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course it has to. I mean, who honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring along your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
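To make the contrast concrete, here is a minimal sketch in Python (all utilities and probabilities are made-up numbers, chosen purely for illustration) of the difference between a risk calculation and Keynesian uncertainty: with one known probability, expected utility singles out a unique ‘rational’ choice; with two equally good models, the ranking of the actions flips and no calculation settles the matter.

```python
# Toy decision problem. All numbers are hypothetical.
UTILS = {
    "bring shades": {"sun": 10, "rain": 0},
    "leave them":   {"sun": 2,  "rain": 5},
}

def expected_utility(action, p_sun):
    u = UTILS[action]
    return p_sun * u["sun"] + (1 - p_sun) * u["rain"]

# Risk: the chance of sun is known to be 30%.
print(max(UTILS, key=lambda a: expected_utility(a, 0.30)))  # 'leave them'

# Uncertainty: two equally good models say 10% and 40%, and there
# is no given distribution over the models themselves.
for p in (0.10, 0.40):
    best = max(UTILS, key=lambda a: expected_utility(a, p))
    print(p, best)  # 0.10 -> 'leave them', 0.40 -> 'bring shades'
```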

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only ‘rational’ one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be – and how much greater the chance that we would not lull ourselves into the comforting belief that we know everything, that everything is measurable, and that we have everything under control – if we could instead simply admit that we often do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing — economic disaster.


Speech by the British Prime Minister to the American President

29 May, 2017 at 17:49 | Posted in Varia | Comments Off on Speech by the British Prime Minister to the American President

 

Looking forward to hearing Theresa May deliver something similar to Donald Trump …

Chicago economics — a dangerous pseudo-scientific zombie

29 May, 2017 at 14:58 | Posted in Economics | 5 Comments

Every dollar of increased government spending must correspond to one less dollar of private spending. Jobs created by stimulus spending are offset by jobs lost from the decline in private spending. We can build roads instead of factories, but fiscal stimulus can’t help us to build more of both. This form of “crowding out” is just accounting, and doesn’t rest on any perceptions or behavioral assumptions.

John Cochrane

What Cochrane is reiterating here is nothing but Say’s law, basically saying that savings equal investments and that if the state increases investment, private investment has to come down (‘crowding out’). As an accounting identity there is of course nothing to say against it, but as such it is also totally uninteresting from an economic point of view. As some of my Swedish forerunners — Gunnar Myrdal and Erik Lindahl — stressed more than 80 years ago, it is really a question of ex ante and ex post adjustments. And as further stressed by a famous English economist at about the same time, what happens when ex ante savings and investments differ is that we basically get output adjustments: GDP changes and thereby makes saving and investment equal ex post. And this, nota bene, says nothing at all about the success or failure of fiscal policies!
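The output-adjustment mechanism is easy to see in a minimal Keynesian-cross sketch (all parameters hypothetical): when government spending rises, GDP rises, and ex post saving rises with it to match investment plus the deficit. The identity holds at every level of output, which is exactly why it settles nothing about crowding out.

```python
# Keynesian-cross sketch with made-up parameters: consumption
# C = a + b*Y; planned investment I and government spending G are
# given ex ante; taxes are zero for simplicity. Output Y adjusts
# until plans are realized, so ex post saving S = Y - C equals I + G.
a, b = 20.0, 0.8      # hypothetical consumption function
I = 30.0              # planned private investment

def equilibrium_output(G):
    # Y = C + I + G  =>  Y = (a + I + G) / (1 - b)
    return (a + I + G) / (1 - b)

for G in (0.0, 10.0):
    Y = equilibrium_output(G)
    S = Y - (a + b * Y)   # ex post private saving
    print(f"G={G:4.1f}  Y={Y:6.1f}  S={S:5.1f}  I+G={I+G:5.1f}")
# Raising G raises Y; saving rises ex post to match I + G, so the
# accounting identity never forced private spending down one for one.
```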

Government borrowing is supposed to “crowd out” private investment.

The current reality is that on the contrary, the expenditure of the borrowed funds (unlike the expenditure of tax revenues) will generate added disposable income, enhance the demand for the products of private industry, and make private investment more profitable. As long as there are plenty of idle resources lying around, and monetary authorities behave sensibly (instead of trying to counter the supposedly inflationary effect of the deficit), those with a prospect for profitable investment can be enabled to obtain financing. Under these circumstances, each additional dollar of deficit will in the medium long run induce two or more additional dollars of private investment. The capital created is an increment to someone’s wealth and ipso facto someone’s saving. “Supply creates its own demand” fails as soon as some of the income generated by the supply is saved, but investment does create its own saving, and more. Any crowding out that may occur is the result, not of underlying economic reality, but of inappropriate restrictive reactions on the part of a monetary authority in response to the deficit.

William Vickrey Fifteen Fatal Fallacies of Financial Fundamentalism

A couple of years ago, in a lecture on the US recession, Robert Lucas gave an outline of what the new classical school of macroeconomics today thinks on the latest downturns in the US economy and its future prospects.

Lucas starts by showing that real US GDP has grown at an average yearly rate of 3 per cent since 1870, with one big dip during the Depression of the 1930s and a big – but smaller – dip in the recent recession.

After stating his view that the US recession that started in 2008 was basically caused by a run for liquidity, Lucas then goes on to discuss the prospect of recovery from where the US economy is today, maintaining that past experience would suggest an “automatic” recovery, if the free market system is left to repair itself to equilibrium unimpeded by social welfare activities of the government.

As could be expected, there is no room for any Keynesian-type considerations on possible shortages of aggregate demand discouraging the recovery of the economy. No, as usual in the new classical macroeconomic school’s explanations and prescriptions, the blame game points to the government and its lack of supply-side policies.

Lucas is convinced that what might arrest the recovery are higher taxes on the rich, greater government involvement in the medical sector and tougher regulation of the financial sector. But – if left to run its course unimpeded by European-type welfare state activities – the free market will fix it all.

In a rather cavalier manner – without a hint of argument or presentation of empirical facts – Lucas dismisses even the possibility of a shortfall of demand. For someone who already 30 years ago proclaimed Keynesianism dead – “people don’t take Keynesian theorizing seriously anymore; the audience starts to whisper and giggle to one another” – this is of course only what could be expected. Demand considerations are simply ruled out on whimsical theoretical-ideological grounds, much as we have seen other neo-liberal economists do over and over again in their attempts to explain away the fact that the latest economic crisis shows how markets have failed to deliver. If there is a problem with the economy, the true cause has to be government.

Chicago economics is a dangerous pseudo-scientific zombie ideology that ultimately relies on the poor having to pay for the mistakes of the rich. Trying to explain business cycles in terms of rational expectations has failed blatantly. Maybe it would be asking too much of freshwater economists like Lucas and Cochrane to concede that, but it’s still a fact that ought to be embarrassing. My rational expectation is that 30 years from now, no one will know who Robert Lucas or John Cochrane was. John Maynard Keynes, on the other hand, will still be known as one of the masters of economics.

If at some time my skeleton should come to be used by a teacher of osteology to illustrate his lectures, will his students seek to infer my capacities for thinking, feeling, and deciding from a study of my bones? If they do, and any report of their proceedings should reach the Elysian Fields, I shall be much distressed, for they will be using a model which entirely ignores the greater number of relevant variables, and all of the important ones. Yet this is what ‘rational expectations’ does to economics.

G. L. S. Shackle

Nothing compares (personal)

28 May, 2017 at 20:44 | Posted in Economics | Comments Off on Nothing compares (personal)


Today is Mother’s Day in Sweden. This one is in loving memory of my mother Lisbeth, and of Kristina, beloved wife and mother of David and Tora.

Those whom the gods love die young.

But in dreams,
I can hear your name.
And in dreams,
We will meet again.

When the seas and mountains fall
And we come to end of days,
In the dark I hear a call
Calling me there
I will go there
And back again.

White Flag

28 May, 2017 at 20:05 | Posted in Varia | Comments Off on White Flag

 

Economic modeling — a realist perspective

28 May, 2017 at 13:37 | Posted in Theory of Science & Methodology | Comments Off on Economic modeling — a realist perspective

To his credit Keynes was not, in contrast to Samuelson, a formalist who was committed to mathematical economics. Keynes wanted models, but for him, building them required ‘a vigilant observation of the actual working of our system.’ Indeed, ‘to convert a model into a quantitative formula is to destroy its usefulness as an instrument of thought.’ That conclusion can be strongly endorsed!

Modern economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

In mainstream neoclassical economics internal validity is almost everything and external validity next to nothing. Why anyone should be interested in those kinds of theories and models is beyond yours truly’s imagination. As long as mainstream economists do not come up with export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism.

Studying mathematics and logic is interesting and fun. It sharpens the mind. In pure mathematics and logic we do not have to worry about external validity. But economics is not pure mathematics or logic. It’s about society. The real world. Forgetting that, economics is really in danger of becoming — as John Maynard Keynes put it in a letter to Ragnar Frisch in 1935 — “nothing better than a contraption proceeding from premises which are not stated with precision to conclusions which have no clear application.”

The fundamental econometric dilemma

27 May, 2017 at 10:20 | Posted in Statistics & Econometrics | Comments Off on The fundamental econometric dilemma


Many thanks for sending me your article. I enjoyed it very much. I am sure these matters need discussing in that sort of way. There is one point, to which in practice I attach a great importance, you do not allude to. In many of these statistical researches, in order to get enough observations they have to be scattered over a lengthy period of time; and for a lengthy period of time it very seldom remains true that the environment is sufficiently stable. That is the dilemma of many of these enquiries, which they do not seem to me to face. Either they are dependent on too few observations, or they cannot rely on the stability of the environment. It is only rarely that this dilemma can be avoided.

Letter from J. M. Keynes to T. Koopmans, May 29, 1941

 

Econometric patterns should never be seen as anything other than possible clues to follow. Behind the observable data there are real structures and mechanisms operating, and — if we really want to understand, explain and (possibly) predict things in the real world — getting hold of those is more important than simply correlating and regressing observable variables.
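Keynes’s dilemma is easy to reproduce. Here is a small simulation (entirely made-up data) in which the ‘environment’ shifts halfway through the sample: a short sample gives a noisy estimate, while the long sample delivers a precise-looking coefficient that was never true in either regime.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope(x, y):
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / (xc * xc).sum()

# Hypothetical 'economy' whose structure shifts halfway through:
# the true slope is 2 in the first regime, -1 in the second.
x = rng.normal(size=100)
true_slope = np.where(np.arange(100) < 50, 2.0, -1.0)
y = true_slope * x + rng.normal(scale=0.5, size=100)

print(ols_slope(x[:15], y[:15]))  # few observations: a noisy estimate
print(ols_slope(x, y))            # long sample: roughly 0.5, a 'slope'
                                  # that held in neither regime
```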

Math cannot establish the truth value of a fact. Never has. Never will.

Paul Romer

Dido

26 May, 2017 at 21:22 | Posted in Varia | Comments Off on Dido

 

Just face it — austerity policies do not work!

26 May, 2017 at 10:29 | Posted in Economics | 1 Comment

If failing to understand some basic Keynesian relations is a part of the explanation of what happened, there was also another, and more subtle, story behind the confounded economics of austerity. There was an odd confusion in policy thinking between the real need for institutional reform in Europe and the imagined need for austerity – two quite different things …

An analogy can help to make the point clearer: it is as if a person had asked for an antibiotic for his fever, and been given a mixed tablet with antibiotic and rat poison. You cannot have the antibiotic without also having the rat poison. We were in effect being told that if you want economic reform then you must also have, along with it, economic austerity, although there is absolutely no reason whatsoever why the two must be put together as a chemical compound.

Amartya Sen

We are not going to get out of the present economic doldrums as long as we continue to be obsessed with the insane idea that austerity is the universal medicine. When an economy is already hanging on the ropes, you can’t just cut government spending. Cutting government expenditure reduces aggregate demand. Lower aggregate demand means lower tax revenues. Lower tax revenues mean increased deficits — and calls for even more austerity. And so on, and so on.

Kid in suit

26 May, 2017 at 10:06 | Posted in Varia | 1 Comment

 

I used to laugh at my kids when they behaved like this in kindergarten.
But I guess most people expect something else from a president …

I can’t but grieve for a nation that has given us presidents like George Washington, Thomas Jefferson, Abraham Lincoln, and Franklin D. Roosevelt, and now is run by a witless clown. An absolute disgrace.

Expansionary austerity? You gotta be kidding!

25 May, 2017 at 18:19 | Posted in Economics | Comments Off on Expansionary austerity? You gotta be kidding!


[h/t Gabriel Uriarte]

Modern economics — pseudo-science based on FWUTV

25 May, 2017 at 14:39 | Posted in Statistics & Econometrics | Comments Off on Modern economics — pseudo-science based on FWUTV

The use of FWUTV — facts with unknown truth values — is, as Paul Romer noticed in last year’s perhaps most interesting insider critique of mainstream economics, all too often standard practice in macroeconomic modelling. But other parts of ‘modern’ economics than New Classical RBC economics have also succumbed to this questionable practice:

Statistical significance is not the same as real-world significance — all it offers is an indication of whether you’re seeing an effect where there is none. Even this narrow technical meaning, though, depends on where you set the threshold at which you are willing to discard the ‘null hypothesis’ — that is, in the above case, the possibility that there is no effect. I would argue that there’s no good reason to always set it at 5 percent. Rather, it should depend on what is being studied, and on the risks involved in acting — or failing to act — on the conclusions …

This example illustrates three lessons. First, researchers shouldn’t blindly follow convention in picking an appropriate p-value cutoff. Second, in order to choose the right p-value threshold, they need to know how the threshold affects the probability of a Type II error. Finally, they should consider, as best they can, the costs associated with the two kinds of errors.

Statistics is a powerful tool. But, like any powerful tool, it can’t be used the same way in all situations.

Narayana Kocherlakota

Good lessons indeed — underlining how important it is not to equate science with statistical calculation. All science entails human judgement, and using statistical models doesn’t relieve us of that necessity. When we work with misspecified models, the scientific value of significance testing is actually zero — even though we may be making valid statistical inferences! Statistical models and concomitant significance tests are no substitutes for doing science.
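Kocherlakota’s point about thresholds is easy to make concrete. A minimal sketch, assuming a one-sided z-test on a normal mean with known variance and purely hypothetical numbers, shows how tightening the significance level drives up the Type II error:

```python
from scipy.stats import norm

# One-sided z-test of H0: mu = 0 against a true effect mu = 0.3,
# with n = 50 observations and sigma = 1 (all numbers hypothetical).
n, mu, sigma = 50, 0.3, 1.0
se = sigma / n ** 0.5

for alpha in (0.10, 0.05, 0.01):
    crit = norm.ppf(1 - alpha) * se      # rejection threshold for the mean
    type2 = norm.cdf((crit - mu) / se)   # P(fail to reject | effect real)
    print(f"alpha = {alpha:.2f}  ->  Type II error = {type2:.2f}")
# Tightening alpha from 0.10 to 0.01 roughly triples the Type II
# error here. Which threshold is 'right' depends on the relative
# costs of the two kinds of mistakes, not on convention.
```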

In its standard form, a significance test is not the kind of ‘severe test’ that we are looking for in our search for being able to confirm or disconfirm empirical scientific hypotheses. This is problematic for many reasons, one being that there is a strong tendency to accept null hypotheses simply because they can’t be rejected at the standard 5% significance level. In their standard form, significance tests bias against new hypotheses by making it hard to disconfirm the null hypothesis.

And as shown over and over again when it is applied, people have a tendency to read “not disconfirmed” as ‘probably confirmed.’ Standard scientific methodology tells us that when there is only say a 10 % probability that pure sampling error could account for the observed difference between the data and the null hypothesis, it would be more ‘reasonable’ to conclude that we have a case of disconfirmation. Especially if we perform many independent tests of our hypothesis and they all give about the same 10 % result as our reported one, I guess most researchers would count the hypothesis as even more disconfirmed.

We should never forget that the underlying parameters we use when performing significance tests are model constructions. Our p-values mean next to nothing if the model is wrong. And most importantly — statistical significance tests DO NOT validate models!

In journal articles a typical regression equation will have an intercept and several explanatory variables. The regression output will usually include an F-test, with p – 1 degrees of freedom in the numerator and n – p in the denominator. The null hypothesis will not be stated. The missing null hypothesis is that all the coefficients vanish, except the intercept.

If F is significant, that is often thought to validate the model. Mistake. The F-test takes the model as given. Significance only means this: if the model is right and the coefficients are 0, it is very unlikely to get such a big F-statistic. Logically, there are three possibilities on the table:
i) An unlikely event occurred.
ii) Or the model is right and some of the coefficients differ from 0.
iii) Or the model is wrong.

Yes, indeed. Forgetting — or at least pretending to forget — that third possibility turns much of ‘modern’ economics and econometrics into post-real blah blah blah pseudo-science.
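Freedman’s third possibility is easy to simulate. In this sketch (a made-up data-generating process) the fitted model is simply wrong, yet the F-test comes out overwhelmingly ‘significant’:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.uniform(0, 3, size=200)
y = x ** 2 + rng.normal(scale=0.3, size=200)   # true relation: quadratic

fit = sm.OLS(y, sm.add_constant(x)).fit()      # fitted model: a straight line
print(f"F = {fit.fvalue:.0f}, p = {fit.f_pvalue:.2g}")   # hugely 'significant'

# The F-test takes the linear model as given, so the big F-statistic
# does nothing to rule out possibility (iii). The residuals betray it:
print(fit.resid[x < 0.5].mean(), fit.resid[x > 2.5].mean())  # both positive
print(fit.resid[(x > 1) & (x < 2)].mean())                   # negative
```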

Financial crises — no big deal

23 May, 2017 at 17:51 | Posted in Economics | 1 Comment

Many say or think that there were problems in the financial system that gave rise to the Great Depression. We’ve looked at that in a systematic way using modern theory. And we found that businesses had all kinds of money to invest, and they didn’t. They increased distributions to owners. Why? The answer is that businesses did not perceive they had profitable investment opportunities.

I don’t think financial crises are a big deal.

Edward Prescott

And this blah blah blah guy got a “Nobel prize” …

‘Modern’ economics — blah blah blah

23 May, 2017 at 16:37 | Posted in Statistics & Econometrics | 2 Comments

A key part of the solution to the identification problem that Lucas and Sargent (1979) seemed to offer was that mathematical deduction could pin down some parameters in a simultaneous system. But solving the identification problem means feeding facts with truth values that can be assessed, yet math cannot establish the truth value of a fact. Never has. Never will.

In practice, what math does is let macro-economists locate the FWUTVs [facts with unknown truth values] farther away from the discussion of identification … Relying on a micro-foundation lets an author say, “Assume A, assume B, … blah blah blah …. And so we have proven that P is true. Then the model is identified.” …

Distributional assumptions about error terms are a good place to bury things because hardly anyone pays attention to them. Moreover, if a critic does see that this is the identifying assumption, how can she win an argument about the true expected value of the level of aether? If the author can make up an imaginary variable, “because I say so” seems like a pretty convincing answer to any question about its properties.

Paul Romer

Yes, indeed, modern mainstream economics — and especially its mathematical-statistical operationalization in the form of econometrics — fails miserably over and over again. One reason why it does is that the error term in the regression models used is thought of as representing the effect of the variables that were omitted from the models. The error term is somehow thought to be a ‘cover-all’ term representing omitted content in the model, necessary to include to ‘save’ the assumed deterministic relation between the other random variables included in the model. Error terms are usually assumed to be orthogonal to (uncorrelated with) the explanatory variables. But since they are unobservable, they are also impossible to test empirically. And without justification of the orthogonality assumption, there is as a rule nothing to ensure identifiability.

In mainstream econometrics the error term is usually portrayed as representing the combined effect of the variables that are omitted from the model. What one does not say — in a way bordering on intellectual dishonesty — is that this assumption only works when (1) the combined effect is independent of each and every variable included in the model, and (2) the expectational value of the combined effect equals zero. And that is something almost never fulfilled in real world settings!
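What failure of the orthogonality assumption does to estimates can be shown in a few lines. In this sketch (hypothetical numbers), a variable correlated with the included regressor is omitted and swept into the error term, and the estimated coefficient is biased no matter how much data we collect:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

z = rng.normal(size=n)             # variable omitted from the model
x = 0.8 * z + rng.normal(size=n)   # included regressor, correlated with z
y = 1.0 * x + 1.0 * z + rng.normal(size=n)   # true effect of x is 1.0

# Regressing y on x alone sweeps z into the 'error term', which is
# then correlated with x: the orthogonality assumption fails.
beta_hat = np.cov(x, y)[0, 1] / x.var()
print(beta_hat)   # about 1.49 rather than 1.0: biased and inconsistent
```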

‘Modern’ mainstream economics is based on the belief that deductive-axiomatic modelling is a sufficient guide to truth. That belief is, however, totally unfounded as long as no warrant is supplied for the assumptions on which the model-based deductions and conclusions build. ‘Mathiness’ masquerading as science is often used by mainstream economists to hide the problematic character of the assumptions used in their theories and models. But without showing the model assumptions to be realistic and relevant, that kind of economics indeed, as Romer puts it, produces nothing but “blah blah blah.”

Building a science of economics for the real world

22 May, 2017 at 16:51 | Posted in Economics | 1 Comment

Following the greatest economic depression since the 1930s, Robert Solow in 2010 gave a prepared statement on “Building a Science of Economics for the Real World” at a hearing in the U.S. Congress. According to Solow, modern macroeconomics has not only failed at solving present economic and financial problems, but is “bound” to fail. Building microfounded macromodels on “assuming the economy populated by a representative agent” — consisting of “one single combination worker-owner-consumer-everything-else who plans ahead carefully and lives forever” — does not pass the smell test: does this really make sense? Solow surmised that a thoughtful person, “faced with the thought that economic policy was being pursued on this basis, might reasonably wonder what planet he or she is on.”

Conclusion: an economic theory or model that doesn’t pass the real world smell-test is just silly nonsense that doesn’t deserve our attention and therefore belongs in the dustbin.

Rational expectations — not to mention the efficient market hypothesis, NAIRU, and DSGE models — immediately come to mind.

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As I tried to show in my paper Rational expectations — a fallacious foundation for macroeconomics in a non-ergodic world, there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, it is high time to place macroeconomic models built on representative actors and rational-expectations microfoundations where they belong — in the dustbin.

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

Neoliberalism — an oversold ideology

22 May, 2017 at 14:37 | Posted in Economics | 3 Comments

So what’s wrong with the economy? …

A 2002 study of United States fiscal policy by the economists Olivier Blanchard and Roberto Perotti found that ‘both increases in taxes and increases in government spending have a strong negative effect on private investment spending.’ They noted that this finding is ‘difficult to reconcile with Keynesian theory.’

Consistent with this, a more recent study of international data by the economists Alberto Alesina and Silvia Ardagna found that ‘fiscal stimuli based on tax cuts are more likely to increase growth than those based on spending increases.’

Greg Mankiw

From Mankiw’s perspective ‘the Alesina work suggests a still plausible hypothesis.’

Hmm …

Austerity policies not only generate substantial welfare costs due to supply-side channels, they also hurt demand — and thus worsen employment and unemployment. The notion that fiscal consolidations can be expansionary (that is, raise output and employment), in part by raising private sector confidence and investment, has been championed by, among others, Harvard economist Alberto Alesina in the academic world and by former European Central Bank President Jean-Claude Trichet in the policy arena. However, in practice, episodes of fiscal consolidation have been followed, on average, by drops rather than by expansions in output. On average, a consolidation of 1 percent of GDP increases the long-term unemployment rate by 0.6 percentage point and raises by 1.5 percent within five years the Gini measure of income inequality …

The evidence of the economic damage from inequality suggests that policymakers should be more open to redistribution than they are. Of course, apart from redistribution, policies could be designed to mitigate some of the impacts in advance—for instance, through increased spending on education and training, which expands equality of opportunity (so-called predistribution policies). And fiscal consolidation strategies—when they are needed—could be designed to minimize the adverse impact on low-income groups. But in some cases, the untoward distributional consequences will have to be remedied after they occur by using taxes and government spending to redistribute income. Fortunately, the fear that such policies will themselves necessarily hurt growth is unfounded.

Jonathan Ostry, Prakash Loungani, and David Furceri

Economists have a tendency to get enthralled by their theories and models, and forget that behind the figures and abstractions there is a real world with real people. Real people who have to pay dearly for fundamentally flawed doctrines and recommendations.

Let’s make sure the consequences will rest on the conscience of those economists.

Tio trasiga teorier om ekonomi

22 May, 2017 at 14:13 | Posted in Economics | Comments Off on Tio trasiga teorier om ekonomi

For a couple of decades I have been examining the business world and the many different expressions and aberrations of the economy at large. But all along something has chafed …

What if there was something wrong with the map itself? With the very theory and model … It quickly became obvious to me that many of the theories and models behind the dominant economic policies were not at all as objective, scientifically robust and uncontroversial as they are often presented to be. Nor does economics seem to be as detached from politics as one might be led to believe.

A critique well worth reading (with the exception of the chapter on Piketty’s “miss”) of much that is wrong in economics today.

The most beautiful identity in mathematics

21 May, 2017 at 20:09 | Posted in Statistics & Econometrics | Comments Off on The most beautiful identity in mathematics

 
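The video embed has not survived in this text-only archive. The identity conventionally crowned the most beautiful in mathematics, and presumably the one shown in the clip, is Euler’s, tying together the constants e, i, π, 1 and 0:

$$e^{i\pi} + 1 = 0$$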

It’s time to tax the Wall Street casino!

20 May, 2017 at 15:18 | Posted in Economics | 2 Comments

Speculators may do no harm as bubbles on a steady stream of enterprise. But the position is serious when enterprise becomes the bubble on a whirlpool of speculation. When the capital development of a country becomes a by-product of the activities of a casino, the job is likely to be ill-done. The measure of success attained by Wall Street, regarded as an institution of which the proper social purpose is to direct new investment into the most profitable channels in terms of future yield, cannot be claimed as one of the outstanding triumphs of laissez-faire capitalism — which is not surprising, if I am right in thinking that the best brains of Wall Street have been in fact directed towards a different object.

These tendencies are a scarcely avoidable outcome of our having successfully organised “liquid” investment markets. It is usually agreed that casinos should, in the public interest, be inaccessible and expensive. And perhaps the same is true of Stock Exchanges … The introduction of a substantial Government transfer tax on all transactions might prove the most serviceable reform available, with a view to mitigating the predominance of speculation over enterprise in the United States.

On the fundamental difference between ergodic and non-ergodic processes in economics

19 May, 2017 at 19:43 | Posted in Statistics & Econometrics | 2 Comments

Yours truly has tried to explain the fundamental difference between time averages and ensemble averages repeatedly on this blog. Still people obviously seem to have problems grasping it. Maybe this video will help …

Non-ergodic stationarity (wonkish)

19 May, 2017 at 08:21 | Posted in Economics | 4 Comments

Let’s say we have a stationary process. That does not guarantee that it is also ergodic. The long-run time average of a single output function of the stationary process may not converge to the expectation of the corresponding variables — and so the long-run time average may not equal the probabilistic (expectational) average. Say we have two coins, where coin A has a probability of 1/2 of coming up heads, and coin B has a probability of 1/4 of coming up heads. We pick either of these coins with a probability of 1/2 and then toss the chosen coin over and over again. Now let H1, H2, … be either one or zero as the coin comes up heads or tails. This process is obviously stationary, but the time average — [H1 + … + Hn]/n — converges to 1/2 if coin A is chosen, and to 1/4 if coin B is chosen. Each of these time averages occurs with probability 1/2, so their expectational average is 1/2 x 1/2 + 1/2 x 1/4 = 3/8, which obviously is equal to neither 1/2 nor 1/4. The time average depends on which coin you happen to choose, while the probabilistic (expectational) average is calculated for the whole “system” consisting of both coin A and coin B.
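A short simulation (numbers as in the example above) makes the gap between the two averages visible:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

def time_average(p_heads):
    # Long-run average of one realisation with the chosen coin.
    return (rng.random(n) < p_heads).mean()

print(time_average(0.5))    # ~0.50: the time average if coin A was picked
print(time_average(0.25))   # ~0.25: the time average if coin B was picked

# Ensemble (expectational) average: many parallel 'systems', each
# picking a coin at random before tossing.
coins = rng.choice([0.5, 0.25], size=n)
print((rng.random(n) < coins).mean())   # ~0.375 = 3/8, equal to neither
                                        # time average: stationary, but
                                        # not ergodic
```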

The laws of mathematics and economics

17 May, 2017 at 19:55 | Posted in Economics | 5 Comments

Some commentators on this blog — and elsewhere — seem to have problems with yours truly’s critique of the overly debonair attitude with which mathematics is applied to economics. In case you think the critique is some odd outcome of heterodox idiosyncrasy, well, maybe you should think twice …

[Image: Einstein]

Taylor series (student stuff)

17 May, 2017 at 14:26 | Posted in Economics | Comments Off on Taylor series (student stuff)

 
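The embedded clip is gone from this archive; for reference, the expansion it presumably covers is the Taylor series of a function f around a point a:

$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots$$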

Mainstream economics — an emperor turned out to be naked

15 May, 2017 at 23:41 | Posted in Economics | 3 Comments

The main reason why the teaching of microeconomics (or of “microfoundations” of macroeconomics) has been called “autistic” is because it is increasingly impossible to discuss real-world economic questions with microeconomists – and with almost all neoclassical theorists. They are trapped in their system, and don’t in fact care about the outside world any more. If you consult any microeconomic textbook, it is full of maths (e.g. Kreps or Mas-Colell, Whinston and Green) or of “tales” (e.g. Varian or Schotter), without real data (occasionally you find “examples”, or “applications”, with numerical examples – but they are purely fictitious, invented by the authors).

At first, French students got quite a lot of support from teachers and professors: hundreds of teachers signed petitions backing their movement – especially pleading for “pluralism” in teaching the different ways of approaching economics. But when the students proposed a precise program of studies … almost all teachers refused, considering that it was “too much” because “students must learn all these things, even with some mathematical details”. When you ask them “why?”, the answer usually goes something like this: “Well, even if we, personally, never use the kind of ‘theory’ or ‘tools’ taught in microeconomics courses … surely there are people who do ‘use’ and ‘apply’ them, even if it is in an ‘unrealistic’, or ‘excessive’ way”.

But when you ask those scholars who do “use these tools”, especially those who do a lot of econometrics with “representative agent” models, they answer (if you insist quite a bit): “OK, I agree with you that it is nonsense to represent the whole economy by the (intertemporal) choice of one agent — consumer and producer — or by a unique household that owns a unique firm; but if you don’t do that, you don’t do anything!”

Bernard Guerrien

Yes indeed — “you don’t do anything!”

Twenty years ago Phil Mirowski was invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the mainstream neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely reminiscent of a defense. Nonplussed, one of them finally asked in total desperation: “But what shall we do then?”

Yes indeed — what shall they do? The emperor turned out to be naked.

How to live your life (personal)

15 May, 2017 at 17:20 | Posted in Varia | 1 Comment

 

Among documentaries this is my absolute favourite.
Whenever my soul gets tired, watching this wonderful video of simple — good — life gives me new energy and hope.

You’re done Tommy boy!

15 May, 2017 at 12:12 | Posted in Economics | 1 Comment


I really love this guy. He immediately goes for the essentials. He has no time for bullshit — and neither should we!

Formal mathematical modeling in economics — a dead-end

14 May, 2017 at 21:07 | Posted in Economics | Comments Off on Formal mathematical modeling in economics — a dead-end

Using formal mathematical modeling, mainstream economists sure can guarantee that the conclusions hold given the assumptions. However, the validity we get in abstract model worlds does not automatically transfer to real-world economies. Validity may be good, but it isn’t enough. From a realist perspective, both relevance and soundness are sine qua non.

In their search for validity, rigour and precision, mainstream macro modellers of various ilks construct microfounded DSGE models that standardly assume rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative household/consumer/producer agents with homothetic and identical preferences, etc., etc. At the same time the models standardly ignore complexity, diversity, uncertainty, coordination problems, non-market clearing prices, real aggregation problems, emergence, expectations formation, etc., etc.

Behavioural and experimental economics — not to speak of psychology — show beyond any doubt that “deep parameters” — people’s preferences, choices and forecasts — are regularly influenced by those of other participants in the economy. And how about the homogeneity assumption? If all actors are the same, why and with whom do they transact? And why does economics have to be exclusively teleological (concerned with intentional states of individuals)? Where are the arguments for that ontological reductionism? And what about collective intentionality and constitutive background rules?

These are all justified questions – so, in what way can one maintain that these models give workable microfoundations for macroeconomics? Science philosopher Nancy Cartwright gives a good hint at how to answer that question:

Our assessment of the probability of effectiveness is only as secure as the weakest link in our reasoning to arrive at that probability. We may have to ignore some issues or make heroic assumptions about them. But that should dramatically weaken our degree of confidence in our final assessment. Rigor isn’t contagious from link to link. If you want a relatively secure conclusion coming out, you’d better be careful that each premise is secure going in.

On a deep level one could argue that the one-eyed focus on validity makes mainstream economics irrelevant, since its insistence on deductive-axiomatic foundations doesn’t earnestly consider the fact that its formal logical reasoning, inferences and arguments show an amazingly weak relationship to their everyday real-world equivalents. Although the formal-logic focus may deepen our insights into the notion of validity, the rigour and precision come with a devastatingly important trade-off: the higher the level of rigour and precision, the smaller the range of real-world application. So the more mainstream economists insist on formal logical validity, the less they have to say about the real world.

Structural econometrics

14 May, 2017 at 18:43 | Posted in Statistics & Econometrics | 2 Comments

In a blog post the other day, Noah Smith returned again to the discussion about the ’empirical revolution’ in economics and how to evaluate it — if it really does exist. Countering those who think quasi-experiments and RCTs are the true solutions to finding causal parameters, Noah argues that without structural models

empirical results are only locally valid. And you don’t really know how local “local” is. If you find that raising the minimum wage from $10 to $12 doesn’t reduce employment much in Seattle, what does that really tell you about what would happen if you raised it from $10 to $15 in Baltimore?

That’s a good reason to want a good structural model. With a good structural model, you can predict the effects of policies far away from the current state of the world.

If only that were true! But it’s not.

Structural econometrics — essentially going back to the Cowles programme — more or less takes for granted the possibility of a priori postulating relations that describe economic behaviours as invariant within a Walrasian general equilibrium system. In practice that means the structural model is based on a straitjacket delivered by economic theory. Causal inferences in those models are — by assumption — made possible since the econometrician is supposed to know the true structure of the economy. And, of course, those exact assumptions are the crux of the matter. If the assumptions don’t hold, there is no reason whatsoever to have any faith in the conclusions drawn, since they do not follow from the statistical machinery used!

Structural econometrics aims to infer causes from probabilities, inferred from sample data generated in non-experimental settings. Arguably, it is the most ambitious part of econometrics. It aims to identify economic structures, robust parts of the economy to which interventions can be made to bring about desirable events. This part of econometrics is distinguished from forecasting econometrics in its attempt to capture something of the ‘real’ economy in the hope of allowing policy makers to act on and control events …

By making many strong background assumptions, the deductivist [the conventional logic of structural econometrics] reading of the regression model allows one — in principle — to support a structural reading of the equations and to support many rich causal claims as a result. Here, however, the difficulty is that of finding good evidence for many of the assumptions on which the approach rests. It seems difficult to believe, even in cases where we have good background economic knowledge, that the background information will be sufficient to do the job that the deductivist asks of it. As a result, the deductivist approach may be difficult to sustain, at least in economics.

The difficulties in providing an evidence base for the deductive approach show just how difficult it is to warrant such strong causal claims. In short, as might be expected there is a trade-off between the strength of causal claims we would like to make from non-experimental data and the possibility of grounding these in evidence. If this conclusion is correct — and an appropriate elaboration were done to take into account the greater sophistication of actual structural econometric methods — then it suggests that if we want to do evidence-based structural econometrics, then we may need to be more modest in the causal knowledge we aim for. Or failing this, we should not act as if our causal claims — those that result from structural econometrics — are fully warranted by the evidence and we should acknowledge that they rest on contingent, conditional assumptions about the economy and the nature of causality.

Damien Fennell

Maintaining that economics is a science in the ‘true knowledge’ business, yours truly remains a skeptic of the pretences and aspirations of econometrics, both structural and non-structural. So far, I cannot see that it has yielded much in terms of relevant, interesting economic knowledge. Overall, the results have been bleak indeed.

Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that other variables — of vital importance and, though perhaps unobservable and non-additive, not necessarily epistemologically inaccessible — were left out of the econometric model.

Most econometricians still concentrate on fixed parameter models and the structuralist belief/hope that parameter-values estimated in specific spatio-temporal contexts are exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

Most of the assumptions that econometric modeling presupposes are not only unrealistic — they are plainly wrong.

If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made ‘nomological machines’ they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of both structural and non-structural econometric forecasting and ‘causal explanation’ rather useless.

Invariance assumptions need to be made in order to draw causal conclusions from non-experimental data: parameters are invariant to interventions, and so are errors or their distributions. Exogeneity is another concern. In a real example, as opposed to a hypothetical, real questions would have to be asked about these assumptions. Why are the equations “structural,” in the sense that the required invariance assumptions hold true? Applied papers seldom address such assumptions, or the narrower statistical assumptions: for instance, why are errors IID?

The tension here is worth considering. We want to use regression to draw causal inferences from non-experimental data. To do that, we need to know that certain parameters and certain distributions would remain invariant if we were to intervene. Invariance can seldom be demonstrated experimentally. If it could, we probably wouldn’t be discussing invariance assumptions. What then is the source of the knowledge?

“Economic theory” seems like a natural answer, but an incomplete one. Theory has to be anchored in reality. Sooner or later, invariance needs empirical demonstration, which is easier said than done.

Truth and economics (II)

13 May, 2017 at 12:00 | Posted in Economics | Comments Off on Truth and economics (II)

Reading some of the comments on my earlier post on the status of truth in ‘modern’ economics, yours truly came to think of Robert Solow’s assessment of ludicrously ‘post-real’ model assumptions …

Suppose someone sits down where you are sitting right now and announces to me that he is Napoleon Bonaparte. The last thing I want to do with him is to get involved in a technical discussion of cavalry tactics at the battle of Austerlitz. If I do that, I’m getting tacitly drawn into the game that he is Napoleon. Now, Bob Lucas and Tom Sargent like nothing better than to get drawn into technical discussions, because then you have tacitly gone along with their fundamental assumptions; your attention is attracted away from the basic weakness of the whole story. Since I find that fundamental framework ludicrous, I respond by treating it as ludicrous – that is, by laughing at it – so as not to fall into the trap of taking it seriously and passing on to matters of technique.

Robert Solow

So much for the ’empirical’ revolution in economics

13 May, 2017 at 10:43 | Posted in Economics | 1 Comment

Sometimes a picture is worth a thousand words …

[Chart missing from archive]
Source: Merijn Knibbe

